# NVIDIA GTX 590 Owners Club



## Alatar

*NVIDIA GTX 590 Owners Club*


















http://uk.geforce.com/hardware/desktop-gpus/geforce-gtx-590
Geforce.com
Quote:


> The NVIDIA GeForce GTX 590 is the most powerful DirectX 11 graphics card ever built. With a sleek exterior that houses dual 512-core GTX 500 GPUs and a custom vapor chamber cooler, the GeForce GTX 590 delivers high-end performance with quiet operation.


*To be added, please include in your post:*
* Some PROOF that you own the card (photo, screenshot etc., preferably with your OCN name)
* The BRAND of your card
* The CLOCKS you are running your card at, with proof

*Useful links*
*How to Build a Quad SLI Gaming Rig*
*EK Waterblocks for the GTX 590*
*ManuelG on the OCP/OCing/voltages*
*The GTX 590 disassembled - by Juggalo23451*
*Gtx 590 waterblock prep/install - by Juggalo23451*
*The GTX 590 Flashing & Overclocking Resource Thread - by RagingCain*

*Drivers*

*MEMBERS* list:
*Question mark = more proof needed*

* whosjohnny : EVGA GeForce GTX 590 @ 800MHz / Proof
* Cociotfp : Gainward GeForce GTX 590 @ 775MHz / Proof
* sarojz : EVGA GeForce GTX 590 @ 772MHz / Proof
* whiz882 : Zotac GeForce GTX 590 Quad SLI @ 750MHz / Proof
* helifax : ASUS GeForce GTX 590 @ 723MHz / Proof
* Timmaigh! : Gainward GeForce GTX 590 @ 723MHz / Proof
* [email protected] fr : Gainward GeForce GTX 590 @ 720MHz / Proof
* kazukun : Zotac GeForce GTX 590 Quad SLI @ 700MHz / Proof
* CSHawkeye : EVGA GeForce GTX 590 @ 700MHz / Proof
* Strider_2001 : EVGA GeForce GTX 590 @ 700MHz / Proof
* anand00x : EVGA GeForce GTX 590 @ 700MHz / Proof
* Juggalo23451 : ASUS GeForce GTX 590 Quad SLI @ 700MHz / Proof
* remer : ASUS GeForce GTX 590 @ 690MHz / Proof
* vwmikeyouhoo : ASUS GeForce GTX 590 @ 690MHz / Proof
* soilentblue : ASUS GeForce GTX 590 @ 685MHz / Proof
* DyWaN : Gigabyte GeForce GTX 590 @ 675MHz [?] / Proof
* jcde7ago : EVGA GeForce GTX 590 @ 670MHz [?] / Proof
* ilukeberry : PoV GeForce GTX 590 Charged @ 668MHz / Proof
* rush2049 : ASUS GeForce GTX 590 @ 660MHz / Proof
* Jeppzer : Gigabyte GeForce GTX 590 Quad SLI @ 650MHz / Proof
* Wogga : Palit GeForce GTX 590 @ 650MHz / Proof
* RagingCain : EVGA GeForce GTX 590 HC Quad SLI @ Stock / Proof
* ExTrEmE_PeRfOrMaNcE : EVGA GeForce GTX 590 HC Quad SLI @ Stock / Proof
* manidu : EVGA GeForce GTX 590 HC Quad SLI @ Stock / Proof
* xjonathanvx : EVGA GeForce GTX 590 HC Quad SLI @ Stock / Proof
* Opp47 : EVGA GeForce GTX 590 HC Quad SLI @ Stock / Proof
* Special_K : EVGA GeForce GTX 590 Quad SLI @ Stock / Proof
* Baasha : EVGA GeForce GTX 590 Quad SLI @ Stock / Proof
* vertex : ASUS GeForce GTX 590 Quad SLI @ Stock / Proof
* Cyanotical : EVGA GeForce GTX 590 Quad SLI @ Stock / Proof
* evilmustang66 : EVGA GeForce GTX 590 Quad SLI @ Stock / Proof
* Canis-X : ASUS GeForce GTX 590 Quad SLI @ Stock / Proof
* zerounleashednl : ASUS GeForce GTX 590 Quad SLI @ Stock / Proof
* kevink82 : Zotac GeForce GTX 590 Quad SLI @ Stock / Proof
* tlr3715 : EVGA GeForce GTX 590 Quad SLI @ Stock / Proof
* Shinobi Jedi : EVGA GeForce GTX 590 Quad SLI @ Stock / Proof
* PannonOC : Gigabyte GeForce GTX 590 Quad SLI @ Stock / Proof
* OverK1LL : EVGA GeForce GTX 590 HC @ Stock / Proof
* sublimejhn : EVGA GeForce GTX 590 HC @ Stock / Proof
* Robitussin : EVGA GeForce GTX 590 HC @ Stock / Proof
* Juzam : Zotac GeForce GTX 590 @ Stock / Proof
* Abiosis : EVGA GeForce GTX 590 @ Stock / Proof
* Alatar : ASUS GeForce GTX 590 @ Stock / Proof
* sl00tje : EVGA GeForce GTX 590 @ Stock / Proof
* ReignsOfPower : EVGA GeForce GTX 590 @ Stock / Proof
* Twilex : EVGA GeForce GTX 590 @ Stock / Proof
* toX0rz : MSI GeForce GTX 590 @ Stock / Proof
* manu97416 : ASUS GeForce GTX 590 @ Stock / Proof
* GTDanny : Gigabyte GeForce GTX 590 @ Stock / Proof
* exlink : EVGA GeForce GTX 590 @ Stock / Proof
* dvanderslice : EVGA GeForce GTX 590 @ Stock / Proof
* Smo : EVGA GeForce GTX 590 @ Stock / Proof
* Semedar : ASUS GeForce GTX 590 @ Stock / Proof
* L1eutenant : Palit GeForce GTX 590 @ Stock / Proof
* Fallendreams : EVGA GeForce GTX 590 @ Stock / Proof
* hammer32261 : EVGA GeForce GTX 590 @ Stock / Proof
* HOOPAJOO : EVGA GeForce GTX 590 @ Stock / Proof
* Recipe7 : ASUS GeForce GTX 590 @ Stock / Proof
__________________

[NVIDIA GTX 590 Owners club](http://spreadsheets.google.com/pub?key=0AplylPaljD8adGtWUzRWZUZJUjhlNW50dS1aeEVrNGc&w=100&h=500&gid=0&single=true)

*Feel free to edit your info in the spreadsheet, but also remember to post your proof the old-fashioned way as well. Please edit with care; I don't want to have to fix too much. (Feedback is welcome, and I'd like to know how well the editing works for you guys.)

The sheet updates every ~5 minutes, so be patient.*

*Benchmarking settings & instructions*
(For the spreadsheet)

*To get your benchies on the spreadsheet:*
- Post your scores with links and proof in this thread
- Edit in your results yourself (note that you need permission to edit)
- Post a direct link to your score if possible

Benchmarks: settings

3DMark 11:
*Performance Preset
*Extreme Preset
+ Remember to post a link to your score!

Unigine Heaven 2.5
*1680x1050x8AA Very High + Extreme Tessellation
*1920x1080x8AA Very High + Extreme Tessellation
*2560x1600x0AA Very High + Extreme Tessellation

Metro 2033
*1680x1050xAAA: Max Settings + DoF On + Tess On + PhysX Off
*1680x1050x4AA: Max Settings + DoF On + Tess On + PhysX Off
*1920x1080xAAA: Max Settings + DoF On + Tess On + PhysX Off
*1920x1080x4AA: Max Settings + DoF On + Tess On + PhysX Off
*2560x1600xAAA: Max Settings + DoF On + Tess On + PhysX Off
*2560x1600x4AA: Max Settings + DoF On + Tess On + PhysX Off

Feel free to suggest more benchies, different settings etc.


*Signature code*


If you guys have any ideas on what to improve or add to the OP, please share your thoughts. Any useful links that you know of? Post them.

* Google Spreadsheet? (reviews, drivers, member benchies etc.? )


----------



## Alatar

currently stock


----------



## remer

Does my thread count as proof?


----------



## Alatar

Quote:


> Originally Posted by *remer;12877197*
> Does my thread count as proof?


of course







Welcome!

E: if you want some special information added like WC, or your clocks then please post more pics and a gpu-z screen etc.


----------



## remer

^^Sure thing. I'm already pressing the limit of my PSU; I'm maxing out at 559W on a 620HX and can hear some coil whine. Until I get a new PSU I'm going to hold off on OCing. For now it's just a plain old *cough* stock ASUS GTX 590.
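For anyone wondering how tight that is, here's a quick back-of-the-envelope sketch in Python. The 559W draw and 620W rating are the figures from my setup above; everything else is just illustrative arithmetic, not a real power model:

```python
# Quick PSU headroom check (559W measured draw on a 620W-rated unit).
def headroom(draw_w, rating_w):
    """Return spare capacity in watts and as a fraction of the PSU rating."""
    spare = rating_w - draw_w
    return spare, spare / rating_w

spare_w, frac = headroom(559, 620)
print(f"{spare_w}W spare ({frac:.0%} of rating)")  # 61W spare (10% of rating)
```

Roughly 10% spare is why I'm not touching the clocks until the new unit arrives.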


----------



## Alatar

Quote:


> Originally Posted by *remer;12877548*
> ^^Sure thing. I'm already pressing the limit of my psu. I'm maxing out at 559W on a 620HX and can hear some coil whine. Until I get a new psu I'm going to hold off on OCing. For now its just a plain old *cough* stock asus gtx 590.


Yeah, sure thing; I wouldn't OC with a 620W unit either, whatever the unit is. I'm planning on a PSU upgrade in the near future as well, preparing for socket 2011 (the 1366 replacement) when it comes out. And it probably wouldn't hurt my 590 either.


----------



## remer

Yeah, my next major upgrade will be the 1366 replacement.

I think I may order a new PSU today. I can't stand not having enough headroom to push the limit, even though my box is already a space heater. I need to keep my window cracked when it's 30-40°F outside. Overclocking the 590 will turn this place into a scorched wasteland.

Seeing the price of the hydro copper it looks like it might make sense to get an aftermarket waterblock if I end up going down the water cooled path.


----------



## CSHawkeye

Count me in, it's running like a champ for me.


----------



## Norlig

There is a Geforce GTX 5xx club already


----------



## Alatar

Quote:


> Originally Posted by *CSHawkeye;12877677*
> Count me in, its running like a champ for me.


any pics/screenies for proof?
Quote:


> Originally Posted by *pangeltveit;12877746*
> There is a Geforce GTX 5xx club already


this is a club for the 590 owners specifically


----------



## Juzam

http://www.techpowerup.com/gpuz/vv4z2/


----------



## Alatar

updated.


----------



## kcuestag

I know most of you guys probably don't speak Spanish, but:

http://foro.noticias3d.com/vbulletin/showpost.php?p=3960849&postcount=352

One of my friends has killed his GTX 590 (it burned completely...) with *no OC at all.*

This doesn't look good... He went for lunch, left the PC off, came back, turned it on... KABOOM! The GPU made a cracking sound and burned.

Doesn't look good







He just wasted 650€ on it... and it burned his motherboard too... pfff


----------



## Alatar

Quote:


> Originally Posted by *kcuestag;12878054*
> I know you guys don't speak Spanish most probably, but:
> 
> http://foro.noticias3d.com/vbulletin/showpost.php?p=3960849&postcount=352
> 
> One of my friends has killed his GTX590 (It burned completely...) with *no OC at all.*
> 
> This doesn't look good... He went for lunch, left PC off, he came back, turned it on... KABOOM! GPU made a cracking sound, and burned.
> 
> Doesn't look good
> 
> 
> 
> 
> 
> 
> 
> He just wasted 650€ on it...And it burned his motherboard too... pfff


Hoping for the best here; your buddy should have warranty coverage, so that shouldn't be too much of a problem. Just call in and ask for a new card.

And it took his motherboard with it? Sounds more like a PSU/mobo issue to me, but still, best of luck to him.

Ooookay, enough of that. Who wants to crank their card to the max without the OCP?


----------



## kcuestag

Sucks for him, he just bought that whole new i7 2600K + GTX 590 system...

The card only lasted him for 3 hours...

And he did not OC it, and his PSU is a Strider 1500W, which is one of the best. I am starting to think all the GTX 590s have an internal problem; it's not right that they're burning even without OC, and this ain't the first case.









So yeah, he'll have to RMA it, but I don't know if they'll accept the RMA for his P67 motherboard...


----------



## Alatar

Quote:


> Originally Posted by *kcuestag;12878199*
> Sucks for him, he just bought that whole new i7 2600K + GTX 590 system...
> 
> The card only lasted him for 3 hours...
> 
> And he did not OC it, and his PSU is a Strider 1500W, which is one of the best. I am starting to think all the GTX 590s have an internal problem; it's not right that they're burning even without OC, and this ain't the first case.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So yeah, he'll have to RMA it, but I don't know if they'll accept the RMA for his P67 motherboard...


If the system was completely new, the problem might have been a defective component. Sure, the 590 might have somewhat fragile VRMs, but I think I'll reserve my judgement for a while so we can see how many of the cards fail. Not many people have their cards yet.

And why wouldn't the motherboard be covered?


----------



## saulin

Does anyone have overclocked benchmarks yet? How far can these cards go without added voltage, just setting the fan manually?


----------



## Alatar

Quote:


> Originally Posted by *saulin;12879001*
> Does anyone have overclocked benchmarks yet? How far can these cards go without added voltage, just setting the fan manually?


Not yet, sadly; I think most of us are still waiting for our cards. There are a few guys who have got theirs but haven't posted anything about OCs, not with proof anyway.


----------



## Masked

Had all mine at 9am release day









1 of them:










Card is OC'd to 840MHz at 1.05V atm.


----------



## Alatar

updated!

E. wait you were OC'd, a sec

2nd E: I'm gonna give you a question mark since I have no proof of the OC, but it's listed anyway.


----------



## RagingCain

Have 2 that will be here on Wednesday, I suspect, although I am very hesitant about opening them now with rumors flooding the internet of random card death.

It's hard to tell fact from fiction as to what's going on. The lifetime warranty from EVGA should cover it, though; I'd still prefer not to go through that hassle.

Alatar, you should do a members list with a Google spreadsheet, and then we can keep track of driver performance, benchmarks and other things. This might help, like it would have for those who had the 295 issues. Some drivers being better than others, ya know?


----------



## Alatar

Quote:


> Originally Posted by *RagingCain;12879546*
> 
> Alatar you should do a members list with google spreadsheet, and then we can keep track of driver performance, benchmarks and other things. This might help like those that had the 295 issues. Some drivers being better than others, ya know?


yeah I'll look into it.

Maybe keep the old one but also add in a spreadsheet list. Any examples or ideas on how to lay out the spreadsheet?


----------



## RagingCain

Yeah sure, I will help with that just busy now. I will have something for ya later.


----------



## saulin

Quote:


> Originally Posted by *Masked;12879137*
> Had all mine at 9am release day
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1 of them:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Card is OC'd to 840MHz at 1.05V atm.


Post some benchies, dude...

Looking forward to seeing what this card is capable of


----------



## Alatar

Quote:



Originally Posted by *RagingCain*


Yeah sure, I will help with that just busy now. I will have something for ya later.


thanks mate, all ideas and suggestions are well appreciated


----------



## soilentblue

reserved for when mine comes in

also if anyone is looking for a waterblock for the card, here's some current choices. http://www.overclock.net/hardware-news/973503-updated-ekwb-swiftech-dangerden-gtx-590-a.html


----------



## RagingCain

Quote:


> Originally Posted by *Alatar;12881065*
> thanks mate, all ideas and suggestions are well appreciated


http://www.overclock.net/overclock-net-related-news-information/502580-google-spreadsheets-your-post.html

That link is how to embed spreadsheets; if you look at my post "Lucid HydraLogix Guide" you can see an example of how I did it.

It just requires you to use Google Documents.

You could start collecting the hardware reviews, although most of them are in the news section, and we could keep a list of them.
http://www.overclock.net/hardware-news/973239-en-nvidias-dual-gpu-geforce-gtx.html

Also, until we get some more information, maybe add a comment on the danger of overclocking/overvolting, and a note to avoid driver set 267.71; it isn't the newest, but still.


----------



## OverK1LL

EVGA 590 Hydro

http://www.overclock.net/showthread.php?t=974491

Stock clocks until I finish my loop...


----------



## Juggalo23451

I have 2 of them


----------



## RagingCain

Quote:



Originally Posted by *Juggalo23451*


I have 2 of them


Do some benches!!! Wednesday cannot get here fast enough!


----------



## Juggalo23451

Quote:



Originally Posted by *RagingCain*


Do some benches!!! Wednesday cannot get here fast enough!


I still need GPU blocks, rads and tubing.


----------



## CSHawkeye

Here is my validation:

http://www.techpowerup.com/gpuz/cnsby/


----------



## RagingCain

Quote:



Originally Posted by *Juggalo23451*


I still need GPU blocks, rads and tubing.










YOU sir are a fail sonar tech. You should have all that stuff ready









How much rad do you think would be needed for cooling them/everything?

I am only using a 140.1 and 120.3 for everything... I was at least thinking of replacing the 140.1 with another 120.3 for 2 x 120.3... I am not sure though.


----------



## Juzam

Finally some pictures:


----------



## delavan

Mine is in the "mail". It will be delivered on Monday or Tuesday (better be Monday)









It's an EVGA GTX 590 Classified... hopefully it's gonna be rock stable (no intent to OC just yet)...

Juggalo: Crazy stuff!!! two of those!









I'll post a proof when I have it...


----------



## BeerPowered

@kcuestag The GTX 590 is fine. There was a driver issue where the .72 version didn't activate the fail-safe mechanism, which is why his card shorted out.

It's under warranty and will be replaced, so he has nothing to worry about.


----------



## Alatar

okay all updated









Good to see some epic pics and more guys getting cards


----------



## rush2049

Alright everyone, I got mine the morning after but was too busy playing with it to post anything anywhere. Here is my proof; I took these comparing it with my XFX GeForce GTX 275.


















































I don't know if I want to overclock it just yet, maybe I will wait for a week or two till I trust the drivers......


----------



## Alatar

holy picsize batman! Updated though, nice card









And about the spreadsheet: some ideas on how to lay it out would be appreciated. What info do people want it to include? Do we want members listed in the sheet as well? It's not hard to make, but it would be good to have some material for it.


----------



## rush2049

Yea, sorry, I didn't feel like spending the time to downsize the pictures my 7D takes...


----------



## Alatar

Whoops CSHawkeye didn't notice you had an overclock according to GPU-Z, corrected.


----------



## Juggalo23451

Quote:


> Originally Posted by *RagingCain;12882744*
> YOU sir are a fail sonar tech. You should have all that stuff ready
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How much rad do you think would be needed for cooling them/everything?
> 
> I am only using a 140.1 and 120.3 for everything... I was at least thinking of replacing the 140.1 with another 120.3 for 2 x 120.3... I am not sure though.


I am going for overkill, 'cause on OCN there is no such thing. I ordered a rad, compression fittings, a new PSU, and tubing too. The block I want for the GPUs won't be available till later this week. Koolance is the best for NVIDIA GPUs.
Oh, the rads I'll be using are two RX480s.
Check out my log in my sig (upgrade time).


----------



## ruderthanyou

http://www.techpowerup.com/gpuz/m2zqw/

Another club member.


----------



## Alatar

updated

also:

Quote:


> Originally Posted by *ruderthanyou;12886442*
> http://www.techpowerup.com/gpuz/m2zqw/
> 
> Another club member.


Welcome to OCN!


----------



## Arizonian

Here is a quote from another thread I'd like the GTX 590 owners to look at; please feel free to comment in that particular thread regarding this.
Quote:


> Originally Posted by *Levesque;12881754*
> The cheap VRMs on the 590 will limit any OC potential.
> 
> The 6990 with a waterblock will be a monster. And without any risk of black smoke coming out when raising the voltage a tiny tiny little bit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Buy the 6990, put a waterblock on it, OC it like crazy. WIN.
> 
> Buy the 590, put a waterblock on it, ''try'' to OC it, black smoke coming out, Nvidia says it's your fault. FAIL.


Very incorrect, please feel free to respond. GTX590 vs. HD6990


----------



## Abiosis

Quote:


> Originally Posted by *Arizonian;12886743*
> Here is a quote from another thread I'd like the GTX 590 owners to look at; please feel free to comment in that particular thread regarding this.
> 
> Very incorrect, please feel free to respond. GTX590 vs. HD6990


_Don't let those haters get into your head...it's not worth it ~







_


----------



## Abiosis

_Hi, Alatar

I'm in~







_


----------



## Alatar

updated


----------



## KingT

Just for the laugh







:










CHEERS..


----------



## blues man

Quote:


> Originally Posted by *KingT;12887324*
> Just for the laugh
> 
> 
> 
> 
> 
> 
> 
> :
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> CHEERS..


Whatever


----------



## Yukss

Quote:


> Originally Posted by *KingT;12887324*
> Just for the laugh
> 
> 
> 
> 
> 
> 
> 
> :
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> CHEERS..


***.......


----------



## Alatar

Really, guys? I think all of us in this club have seen those already. There is absolutely no reason to post them if you're only going to be trolling.

I agree that the VRM issue might be a real problem with the cards, but if you have nothing to add and nothing you want to discuss, then please refrain from trolling.

Thank you.


----------



## CSHawkeye

Quote:


> Originally Posted by *Alatar;12885892*
> Whoops CSHawkeye didn't notice you had an overclock according to GPU-Z, corrected.


Yeah, I'm messing with 700/1800 right now. Idle it's at 34/36°C for both cores, and 87/87 at full load now.


----------



## Alatar

Quote:


> Originally Posted by *CSHawkeye;12888256*
> Yeah, I'm messing with 700/1800 right now. Idle it's at 34/36°C for both cores, and 87/87 at full load now.


If you want that 700MHz listed, post a GPU-Z link, screenie etc.









But anyway, have you raised the volts at all? And have you benchmarked the card to see if the OCP is kicking in?


----------



## Masked

Quote:


> Originally Posted by *Alatar;12888289*
> if you want that 700mhz listed, post a gpu-z link, screenie etc.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But anyways, have you raised the volts at all? and have you benchmarked the card to see if the OCP is kicking in?


From what I've fooled around with, 800MHz+ is possible at 1.00V ~ I can't really test it formally until tomorrow, though.


----------



## CSHawkeye

I have not adjusted any voltages, here is my new gpuz confirmation:

http://www.techpowerup.com/gpuz/w8ckf/


----------



## kcuestag

I have seen in a Spanish forum that NVIDIA has made a statement about the 590 burning issue.

They've said they will actually limit the GTX 590's OC, so it won't OC at all...

Who would spend 700€ on a high-end enthusiast card that will be capped and allow no OC?

This is a joke... They've also said that the people who burnt their cards OC'ed them with heavy voltages of up to 1.2V, and that they will not take care of those cards... Seriously?...

My friend's GTX 590, which died yesterday, had NO OC AT ALL!!! And it still died while the only thing he did was play Crysis 1...

Will they replace the card for him? If they don't, I'm going to be quite disappointed in NVIDIA; he's already quite mad he didn't pick up the HD 6990 or 2x 570 instead...


----------



## trivium nate

I don't have a 590 but I am wondering how well they run!


----------



## 2010rig

There should be a rule to keep AMD fanboys out of Nvidia club threads.

Anyone who has bought the 590 is already aware of the pros and cons. They act like they're sharing revolutionary news that no one is aware of. Go spread your negativity in the AMD forums.


----------



## remer

I don't know about you guys but I have been thrilled with my 590. I've been waiting for a single card solution for a while now because I have only one pcie slot available.

Could we see voltages listed with the overclocked cards? I think it would help everyone get an idea of what is achievable with 5 VRMs per GPU.

Also, if I'm not mistaken, the EVGA Classified cards come overclocked at 630MHz, so OverK1LL should have that on the list. I can't wait to see that thing in a loop. I've been drooling over the EK blocks. Definitely going to consider water for my next build.


----------



## hokeyplyr48

Ordered mine on Friday, should be here on Tuesday or Wednesday


----------



## kcuestag

Quote:


> Originally Posted by *2010rig;12889083*
> There should be a rule to keep AMD fanboys out of Nvidia club threads.
> 
> Anyone who has bought the 590 is already aware of the pro's and cons. They act like they're sharing revolutionary news that no one is aware of. Go spread your negativity in the AMD forums.


I'm not a fanboy... Not at all; I have another rig with a GTX 570 and another with a GTX 285...

I'm just saying it's sad to see this happen; my friend is quite sad because of his dead GTX 590.









And again, I'm not a fanboy of ANY brand. Intel or AMD, I have both; NVIDIA or AMD, I have both. Don't get the wrong view of me, I'm not a fanboy.


----------



## Alatar

Quote:


> Originally Posted by *kcuestag;12888890*
> snip


if you want to have a discussion about that then please post some proof first.
Quote:


> Originally Posted by *2010rig;12889083*
> There should be a rule to keep AMD fanboys out of Nvidia club threads.
> 
> Anyone who has bought the 590 is already aware of the pro's and cons. They act like they're sharing revolutionary news that no one is aware of. Go spread your negativity in the AMD forums.


Exactly. There was no reason to post the SweClockers vid etc. We know what we're getting into, and I'm quite sure all of us are going to be happy with our purchases.
Quote:


> Originally Posted by *remer;12889084*
> I don't know about you guys but I have been thrilled with my 590. I've been waiting for a single card solution for a while now because I have only one pcie slot available.
> 
> Could we see voltages listed with the overclocked cards? I think it would help everyone get an idea of what is achievable with 5 vrm's per gpu.
> 
> Also, if I'm not mistaken the EVGA Classified cards come overclocked at 630MHz so OverK1LL should have that on the list. I cant wait to see that thing in a loop. I've been drooling over the EK blocks. Definitely going to consider water for my next build.


Glad to see you enjoying your card







I can only hope that mine comes in tomorrow. Oh, and I didn't know about the EVGA ones being clocked higher. So the stock setting on those is 630MHz, but I guess we can still mark them as OCs.


----------



## 2010rig

Something cool to add to the club.





http://www.youtube.com/watch?v=2pOnI4jRQsY


----------



## 2010rig

Quote:


> Originally Posted by *kcuestag;12889149*
> I'm not a fanboy.... Not at all, I have another rig with a GTX570 and another with a GTX285...
> 
> I'm just saying, It's sad to see this happen, my friend is quite sad because of his dead GTX590
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And again, I'm not a fanboy of ANY brand, Intel or AMD, have both, Nvidia or AMD, I have both. Don't get the wrong view of me, Im not a fanboy.


Funny, but I wasn't talking about you in particular, yet you took it to heart? I'm talking in general; check out the other posts.

It sucks that your buddy's card died; surely the warranty covers that, so he's not out the money for it.

I haven't been following that story, but what drivers was he using?


----------



## Alatar

Quote:


> Originally Posted by *Alatar;12889160*
> Oh, and I didn't know about the EVGA ones being clocked higher. So the stock setting on those is 630MHz, but I guess we can still mark them as OCs.


Scratch that, I'm gonna list any EVGA 630MHz cards as stock ones.


----------



## coolhandluke41

Quote:


> Originally Posted by *2010rig;12889169*
> Something cool to add to the club.


Good post, thanks for sharing this :thumb:


----------



## delavan

There is a lot of drama generated by this launch!

I'll get my card on Monday and enjoy every bit of it. I won't try to bypass OCP or any of those things, 'cause it doesn't make sense to me.

W1ZZARD from TPU has been working on bypassing OCP since the beginning of the GTX 5** launch...
He helps people fry their cards; I guess they get a stupid grin on their faces from doing so... must be a "drug" to them... sniffing burnt VRM vapors...









Anyway, to be honest, I'm glad I got the EVGA, for the little overclock and the lifetime warranty... just to stay on the safe side.

IF my card burns at stock clocks (factory OC), then we'll talk... if it lasts me years, I'll be laughing... I needed a new card drawing less than SLI 570s, and this is what I bought...

To me, it's not an offense that this 590 doesn't beat SLI 580s... I don't think that was the goal anyway...

Some people need to get real...


----------



## kcuestag

Quote:


> Originally Posted by *2010rig;12889180*
> Funny, but I wasn't talking about you in particular, yet you took it to heart? I'm talking about in general, check out the other posts.
> 
> It sucks that your buddy's card died, surely the warranty covers that. So, he's not out the money for it.
> 
> I haven't been following that story, but, what drivers was he using?


He used the latest NVIDIA drivers; he knew about the CD driver issue, so he didn't even touch the CD.

Let's hope this gets sorted out soon. I really like the GTX 590, and I was about to buy one... I will wait 1-2 weeks to see how things go. I really need a dual-GPU card on a single PCB for my future build in about a month, and I think the GTX 590 will be the candidate.









If I end up buying it, what brand should I be looking at here in Europe?


----------



## 2010rig

Quote:


> Originally Posted by *kcuestag;12889466*
> He used the latest nvidia drivers, he knew about the CD drivers issue so he didn't even touch the CD.
> 
> Let's hope this gets sorted out soon, I really like the GTX590, and I was about to buy one... I will wait 1-2 weeks to see how things go, really need a dual card on single pcb for my future build in about a month, I think GTX590 will be the candidate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I end up buying it, what brand should I be looking at here in Europe?


It's definitely wise to wait, and keep an eye on this thread to see if more cards die at stock.

What are your choices and prices?


----------



## Alatar

Quote:


> Originally Posted by *2010rig;12889169*
> Something cool to add to the club.


That is indeed a really cool setup. I would love to try it out, but I'm afraid I don't have the space to even think about using projectors anywhere. I also wonder how the reviewers always manage to find such bad PC gamers.


----------



## Twilex

Quote:


> Originally Posted by *kcuestag;12889466*
> He used the latest nvidia drivers, he knew about the CD drivers issue so he didn't even touch the CD.
> 
> Let's hope this gets sorted out soon, I really like the GTX590, and I was about to buy one... I will wait 1-2 weeks to see how things go, really need a dual card on single pcb for my future build in about a month, I think GTX590 will be the candidate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I end up buying it, what brand should I be looking at here in Europe?


Off topic: where do you work on Ramstein, homey?


----------



## kcuestag

Quote:



Originally Posted by *Twilex*


Off topic: Where do you work on ramstein homey?



LOL!!!!!!!!

You're also in the Air Base? Are you kidding?


----------



## RagingCain

More information for the main post:
Reviews
Water Blocks

This message from ManuelG.

A link to the latest drivers of course:
x64 http://www.nvidia.com/object/win7-wi...85-driver.html
x86 http://www.nvidia.com/object/win7-wi...85-driver.html

For the spreadsheet, the members list should probably go on it, with core voltage, core speed, shader speed, and memory speed.

A different tab for benchmarks, isolating each benchmark into its own tab: 3DMark 11 / 3DMark Vantage / Unigine / StoneGiant, plus any other demos such as Crysis, Far Cry 2, AvP, or Metro 2033.

We could run leaderboards, and we could also keep notes on bugs discovered in certain games with certain drivers.


----------



## rush2049

Just letting everyone know:

I got a bluescreen last night from running a 3DMark Vantage Extreme run. (This was at stock.)

It didn't bluescreen on the 3DMark Vantage Performance run right before that one, though, so it might have been a heat issue... I am not sure at this point because I wasn't running any kind of logging.

Here is my run: http://3dmark.com/3dmv/3023525


----------



## Strider_2001

Mine is on the way...I will have it Tuesday and I will update with pictures and OC ability.


----------



## soilentblue

Is anyone seeing throttling with their overclocks? Masked said he isn't seeing it, and he is above the 700MHz where Pedro claims it begins.

It could be the waterblock helping avoid it.


----------



## rush2049

I just ran a 3DMark 11 Performance run and something isn't quite right! (This is still at stock clocks.)

Here is the link: http://3dmark.com/3dm11/915012

That is a horribly low score, and the fps in the tests was slowly increasing as the test progressed. I was seeing increments of .1 fps every half second; the fps started low and ended slightly higher...

Is that normal? Is this what the throttling does?

edit: Just looked over the logs. Yeah, the card is getting clocked down and the voltage is decreased in the middle of the run; the temperature increases to 70°C and then drops down to 45°C in the middle of the tests... that's when the card is getting throttled...


----------



## Behemoth777

Quote:



Originally Posted by *soilentblue*


Is anyone seeing throttling with their overclocks? Masked said he isn't seeing it, and he's above the 700MHz mark where Pedros claims it begins.

It could be the waterblock helping him avoid it.


He's full of it. I can't wait until he posts his overclocking tests. I want to see the difference between a 700mhz oc and an 800mhz oc. Because I bet you there is little/no difference.


----------



## 2010rig

Quote:



Originally Posted by *Behemoth777*


He's full of it. I can't wait until he posts his overclocking tests. I want to see the difference between a 700mhz oc and an 800mhz oc. Because I bet you there is little/no difference.


And you would know because you have a 590, you have tested this, and have come to this conclusion, right?

Please post proof of your findings to support your claims.


----------



## soilentblue

Quote:


> Originally Posted by *rush2049;12892624*
> I just ran a 3Dmark 11 performance run and something isn't quite right! (This is still at stock clocks)
> 
> Here is the link: http://3dmark.com/3dm11/915012
> 
> That is a horribly low score and the fps in the tests was slowly increasing as the test progressed. I was seeing increments of .1 fps every half second. The fps started low and ended slightly higher.....
> 
> Is that normal? Is this what the throttling does?
> 
> edit: Just looked over the logs, yea the card is getting clocked down and the voltage is decreased in the middle of the run, the temperature increases to 70 C and then drops down to 45 C in the middle of the tests...... thats when the card is getting throttled.....


Sounds like the card got too hot and then got throttled down, from what you said. That makes sense, as Masked has said he hasn't seen any throttling on his watercooled card so far. Have you tried keeping everything stock, raising the fan to max, and running the test again to see what happens?
Quote:


> Originally Posted by *Behemoth777;12892698*
> He's full of it. I can't wait until he posts his overclocking tests. I want to see the difference between a 700mhz oc and an 800mhz oc. Because I bet you there is little/no difference.


He has the card and neither you nor I do. I can't say he's full of it and wouldn't want to; I hope he's correct. Only a fool calls someone out with no proof, and neither of us has any that he's lying.


----------



## Pedros

I'm interested in a WC'ed 590 too...
From testing the Asus 590 with a friend: we went to 840MHz core at 1.05v ... and we were getting worse Vantage runs than we had at 670.

After an hour of testing voltages ... what Nvidia says is true ... the sweet spot for the 590 is under 1.00v ... anything above that results in OCP kicking in and throttling the card down ... the best we got was at 700MHz ... we tried 725MHz and OCP kicked in ... 715 ... the same ... 710 ... the same ... so we kept it at 700 and got 38,350 marks in Vantage ...

If this is an OCP thing, I don't see the waterblock working miracles ... yes, the card can go to 840MHz, but OCP will laugh in your face









I may be mistaken, which is why I want to see what happens with a waterblock ...
If the OCP "algorithm" in the drivers uses the thermal readings along with the voltages and core speeds ... maybe low thermal readings will let OCP kick in later in the game, allowing more aggressive overclocks
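That speculation can be written down as a toy rule. To be clear, this is a guess at the behaviour described in this thread, not NVIDIA's actual OCP algorithm; every threshold below is an assumption taken from the numbers reported in these posts:

```python
# Toy model of the OCP behaviour speculated above. NOT NVIDIA's actual
# algorithm; the thresholds are guesses based on reports in this thread.
def ocp_throttles(core_mhz: float, vcore: float, temp_c: float) -> bool:
    """Guess whether OCP would clock the card down for a given state."""
    VOLTAGE_LIMIT = 1.00  # volts; OCP reportedly kicks in above ~1.00v
    CLOCK_LIMIT = 700.0   # MHz; the practical ceiling reported at ~1.0v
    TEMP_LIMIT = 70.0     # deg C; throttling was observed starting near 70C
    over_volt = vcore > VOLTAGE_LIMIT
    over_clock = core_mhz > CLOCK_LIMIT
    hot = temp_c >= TEMP_LIMIT
    # The speculation: the thermal reading gates the voltage/clock check,
    # so a watercooled (cool) card would trip OCP later, if at all.
    return hot and (over_volt or over_clock)

print(ocp_throttles(725, 0.975, 72))  # True: matches the failed 725MHz runs
print(ocp_throttles(700, 0.963, 60))  # False: matches the stable 700MHz runs
```

If the guess is right, the same clocks that trip OCP on air would survive under water simply because the temperature never crosses the gate.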


----------



## 2010rig

@Pedros, thanks for the update.









Now, we'll have to see how it behaves under water.

I'm surprised Nvidia opted for 4 VRM phases and didn't keep the 6+2 layout from the 580. They dropped the ball there, unfortunately.

What a difference that would've made.
http://hothardware.com/articleimages/Item1585/geforce-gtx-580.jpg
vs
http://www.crunchgear.com/wp-content/uploads/2011/03/gtx590.jpg

I doubt people would've cared if the card were slightly longer to make the extra VRMs fit.


----------



## Pedros

It was due to PCB size ...
To use this cooling solution and keep it silent, they really couldn't use a bigger PCB. I think Nvidia is pushing the wrong priorities with this card ... they are clearly marketing the size, silence and cooling capability ...

That would be great ... but ...
A bigger PCB would give more space for better VRMs;
Cooling would have to be more aggressive ... hence, noisier ...

But it is what it is, and Nvidia won't do a v2 of the 590 at this stage of the championship ... so that's why I want to see the water results







I'm delaying my purchase to see how this turns out ...


----------



## remer

Well, I couldn't stand not OC'ing it a little. I got a 3DMark 11 score of P8958 with these clocks:
Core: 650MHz
Shaders: 1300MHz
Memory: 1750MHz
Voltage: Stock
Peak power consumption: 575W
Peak temp w/ custom fan profile: 82C


I got this 3dmark 11 score at stock gpu settings with my i7 at 3.2 ghz. Max power consumption was 559W.

















For now I'm back to stock but who knows what the future holds.


----------



## jellis142

Quote:


> Originally Posted by *rush2049;12885662*
> Alright everyone got mine the morning after, but was too busy playing with it to post anything anywhere. Here is my proof, I took these comparing it with my XFX Geforce GTX 275.
> 
> (lots of UBER resolution pics)
> 
> I don't know if I want to overclock it just yet, maybe I will wait for a week or two till I trust the drivers......


They may be huge, but a few of them are wallpaper-worthy







Take pictures when they're installed at that resolution too!


----------



## soilentblue

A guy on another site got this after 3hrs 15min while using auto fan. He's going to try a few game benchmarks and see what happens. It seems the cards can easily reach 580 clocks, but the frames seem like they should be higher. I really think it's heat and not voltage.

I don't have a card to test it, but it seems like on water, turning OCP off and keeping it below the 1.05v Nvidia recommends should be safe. Just a theory until proof arises, but it still looks like it's heat kicking OCP in, not volts.


----------



## soilentblue

It could very well also be bad drivers. Is OCP still on if you don't tick the box in GPU-Z? I'm assuming that's a no?

edit:
Quote:


> New: added /gtx500ocp checkbox for GPU-Z to disable GTX 500 OCP (over current protection).


----------



## Masked

Do I get to say "I told you so" yet?


----------



## grunion

Quote:


> Originally Posted by *soilentblue;12895257*
> 
> guy on another site got this after 3hrs 15 min while using autofan. he's going to try a few game benchmarks and see what happens. seems the cards can easily reach 580 clocks, but the frames seem like they should be higher. i really think it's heat and not voltage.
> 
> i don't have a card to test it out but it seems like on water, turning the ocp off, and keeping it below 1.05v like nvidia recommends should be safe. just a theory right now until proof arises but it still looks like it's pointing to heat that is kicking the ocp in and not volts.


That card is throttling, poor fps and look at the gpu usage.


----------



## soilentblue

I thought throttling showed up as major spikes in GPU usage, not weak dips like those. The FPS are weak, but that could be a driver issue and not throttling.

Linus' video shows major spikes. Incredibly noticeable (except to him).


----------



## grunion

Quote:


> Originally Posted by *soilentblue;12896012*
> i thought throttling with gpu usage has major spikes, not very weak ones like those. the fps are weak but that could be driver issue and not throttling.
> 
> linus' video shows major spikes. incredibly noticeable(cept to him).


IDK

But I know FM has always given me a solid 99% usage on any and all setups I've run.

Someone here with a 590 should test it out.


----------



## soilentblue

Quote:


> Originally Posted by *grunion;12896064*
> IDK
> 
> But I know FM always gave me a solid 99% usage, any and all set ups I've ran.
> 
> Someone here with a 590 should test it out.


You could be right. It just seems like an excessive loss of FPS for an 8% dip in GPU load.


----------



## soilentblue

go here for benches done by the same guy

http://hardforum.com/showpost.php?p=1037035298&postcount=312

too much to repost.


----------



## Masked

Quote:


> Originally Posted by *soilentblue;12896322*
> go here for benches done by the same guy
> 
> http://hardforum.com/showpost.php?p=1037035298&postcount=312
> 
> too much to repost.


I believe that... I'm noticing much the same.

That is 100% a driver issue as well.

The same thing happened to the 295 series in Heaven/Vantage back when the 280 was top dog... if anyone remembers those days.

It took a few months for the drivers to mature and benches to work properly...


----------



## soilentblue

Yeah, the Street Fighter bench shows severe driver issues. This is with one GTX 580:










374.20 is more than what the GTX 590 did OC'd. Immature drivers are still better than throttling. lol


----------



## grunion

Quote:


> Originally Posted by *soilentblue;12896322*
> go here for benches done by the same guy
> 
> http://hardforum.com/showpost.php?p=1037035298&postcount=312
> 
> too much to repost.


I see you're a member there.
Ask him to test at a lower oc to rule out throttling, maybe 700mhz.


----------



## soilentblue

Quote:


> Originally Posted by *grunion;12896499*
> I see you're a member there.
> Ask him to test at a lower oc to rule out throttling, maybe 700mhz.


??? He did the benches at both stock and OC settings. Or do you mean for FurMark?

edit: nm, I know what you're saying.


----------



## soilentblue

k I asked.


----------



## kryptiq

Good luck with the cards, ladies & gents. Screw the haters; once the issues get sorted I'm sure they'll be jealous. That WC version from EVGA sure is beautiful.


----------



## Special_K

This is the only proof I have.


















I hate having to wait.


----------



## Alatar

Quote:


> Originally Posted by *Special_K;12896933*
> This is the only proof I have.
> 
> I hate having to wait.


Just post a pic or a screenie when they arrive and you'll get your place next to Juggalo's quad SLI.









I'm supposed to get my card today. It's 7am here and I have to go to school first. Let's hope the post office doesn't fail or something.


----------



## ruderthanyou

This is the best I can get right now with stock voltage and 3DMark 11 stable.

http://www.techpowerup.com/gpuz/nyz7v/


----------



## reflex99

the 'splody VRM club


----------



## Arizonian

Quote:


> Originally Posted by *reflex99;12897560*
> the 'splody VRM club


Why don't you go back to your "I'm a huge ATI fanboy" club? So inappropriate in this thread, Reflex99. Thanks for letting everyone know how little you know and how immature you are. Go back to your "I have no money and can't afford a real GPU, so I have to invade other clubs' threads and make derogatory statements to make myself feel better" club. Get a life, man.

You're not even worth a real response. Your reps must be from people just like yourself and hold very little value.


----------



## reflex99

It was a joke.

I am sorry if some people did not understand it or find it funny.


----------



## Alatar

Who gave a one-star rating?

Guys, let's get it up now!

Updating you, ruderthanyou, in a min.

Also, I'm F5'ing my mail tracking page from school


----------



## Pedros

I know that it would be frustrating to bench at 800Mhz!







Believe me ... ~700MHz is the best OC ...









Nvidia did their homework when it comes to handicapping the damn card with the OCP


----------



## rush2049

Ok did some more testing, getting some weird results here. Here are my notes:

*************************************************************
Test Note(s) in order of execution:
vantage performance run at STOCK: High Temp: ?? Score: P24334
vantage extreme run at STOCK: High Temp: 80 C --Bluescreened

reboot

3d11 performance run at STOCK: High Temp: 74 C Score: P7733

reboot

vantage extreme run at 95% fan: High Temp: 62 C Score: X17752
vantage performance run at 95% fan: High Temp: 60 C --Bluescreened

reboot

vantage performance run at 95% fan: High Temp: 60 C --Bluescreened (might have been cause I didn't clear the cache)
*************************************************************

This makes me wonder if there is a glitch in Vantage when running another test right after, only letting the GPU cool down instead of closing and re-opening the program.

I am fairly confident it is the GPU doing this, as my system was stable doing this type of thing before I switched graphics cards.

Also, that second-to-last Vantage run had the following at the end of its log before it bluescreened:

Code:

00, 28-03-2011 05:29:07, Hardware monitoring log v1.4 
01, 28-03-2011 05:29:07, GeForce GTX 590,GeForce GTX 590
02, 28-03-2011 05:29:07, GPU1 temperature    ,GPU2 temperature    ,GPU1 usage          ,GPU2 usage          ,GPU1 fan speed      ,GPU1 fan tachometer ,GPU1 core clock     ,GPU2 core clock     ,GPU1 shader clock   ,GPU2 shader clock   ,GPU1 memory clock   ,GPU1 memory usage   ,GPU2 memory clock   ,GPU2 memory usage   
**** SNIP ****
80, 28-03-2011 05:37:15, 53.000              ,54.000              ,0.000               ,0.000               ,95.000              ,2730.000            ,612.500             ,612.500             ,1225.000            ,1225.000            ,1710.000            ,193.688             ,1710.000            ,193.688             
80, 28-03-2011 05:37:16, 53.000              ,54.000              ,0.000               ,0.000               ,95.000              ,2760.000            ,612.500             ,612.500             ,1225.000            ,1225.000            ,1710.000            ,193.688             ,1710.000            ,193.688             
80, 28-03-2011 05:37:17, 53.000              ,53.000              ,0.000               ,0.000               ,95.000              ,2730.000            ,612.500             ,612.500             ,1225.000            ,1225.000            ,1710.000            ,193.688             ,1710.000            ,193.688             
80, 28-03-2011 05:37:18, 52.000              ,53.000              ,0.000               ,0.000               ,95.000              ,2730.000            ,612.500             ,612.500             ,1225.000            ,1225.000            ,1710.000            ,193.688             ,1710.000            ,193.688             
80, 28-03-2011 05:37:19, 52.000              ,53.000              ,0.000               ,0.000               ,95.000              ,2730.000            ,612.500             ,612.500             ,1225.000            ,1225.000            ,1710.000            ,193.688             ,1710.000            ,193.688             
80, 28-03-2011 05:37:20, 52.000              ,53.000              ,0.000               ,0.000               ,95.000              ,2760.000            ,612.500             ,612.500             ,1225.000            ,1225.000            ,1710.000            ,193.688             ,1710.000            ,193.688             
80, 28-03-2011 05:37:21, 51.000              ,52.000              ,0.000               ,0.000               ,95.000              ,2730.000            ,553.500             ,553.500             ,1107.000            ,1107.000            ,1603.000            ,193.688             ,1603.000            ,193.688             
****THIS LINE IS A WHOLE BUNCH OF NULL SYMBOLS****

This was during a loading screen between tests, which explains the zero GPU load, but look carefully and you will see the clock speeds all drop slightly just before it bluescreened.

I am going to try some different benchmarks and see if this type of thing repeats itself again.
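For anyone who wants to scan their own monitoring logs for that same signature, here is a minimal sketch. It assumes a simplified CSV layout with column names I made up; the real log pasted above uses fixed-width comma-padded fields, so the parsing would need adapting:

```python
import csv
import io

# Tiny excerpt of the log above in a simplified CSV form
# (the column names here are invented for the example).
LOG = """\
timestamp,gpu1_core_mhz,gpu2_core_mhz
05:37:19,612.5,612.5
05:37:20,612.5,612.5
05:37:21,553.5,553.5
"""

def find_clock_drops(text: str, column: str = "gpu1_core_mhz"):
    """Return (timestamp, previous_clock, new_clock) for every sample
    where the core clock fell compared to the previous sample."""
    drops = []
    prev = None
    for row in csv.DictReader(io.StringIO(text)):
        clock = float(row[column])
        if prev is not None and clock < prev:
            drops.append((row["timestamp"], prev, clock))
        prev = clock
    return drops

print(find_clock_drops(LOG))  # [('05:37:21', 612.5, 553.5)]
```

Run over the full log, this flags exactly the 612.5 to 553.5 step that preceded the bluescreen.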


----------



## Pedros

Hmmmm... what card did you have before the 590?
When I was testing the 590 I had lots of problems with the drivers ... had to do a clean install because the system got really unstable ...


----------



## rush2049

I had an Nvidia GTX 275 beforehand. When I installed the drivers I told it to do a clean install, so that shouldn't be the problem...

I am going to try a vantage performance run here with fan set at 95% to compare with a stock fan, then I will try another clean install with the .85 drivers and see if that fixes it.


----------



## Pedros

Those BSODs aren't normal by any means ... :x
You know, the last time this happened I had to re-install my OS







eheheh .. that's what I call a clean install


----------



## Abiosis

nVidia GeForce Driver v267.91

Version:267.91 Certified

Release Date:2011.03.28

Supported products:GTX 590

Operating System:Windows Vista 64-bit, Windows 7 64-bit

Language:English (U.S.)

File Size:127 MB

This driver adds support for the new GeForce GTX 590 dual-GPU graphics card, including support for Quad SLI when paired with a second GeForce GTX 590 card.

For 3D Vision support, you will also need to install the NVIDIA 3D Vision Controller Driver v266.21.

Other

* Installs HD Audio driver version 1.1.13.1
* Installs PhysX System Software to version 9.10.0514

http://www.nvidia.com/object/win7-wi...91-driver.html

_3 new drivers within two days...it seems they care~_


----------



## Alatar

I was wondering whether I should do a completely fresh install when I get my card, but I'm pressed for time, and if I start fooling around with the Windows installation I won't have any time to do benchies or game.

But I guess if everything works out normally I won't have to reinstall Windows entirely.


----------



## Pedros

Usually I do a fresh install ...
Just for the sake of it ... it's easier to isolate problems that way ... and you know everything will work with fresh driver installs ...


----------



## rush2049

Ok downloading the new drivers.

Here is a small comparison for everyone to see. This is the exact same settings except the fan is manually set to 95% on the one.

http://3dmark.com/compare/3dmv/3027371/3dmv/3023525

It's a 177-point difference. (I haven't used Vantage enough; let me know if 177 is within normal run-to-run variance.)

I will do another clean driver install with the .91 drivers and test again. Hoping for a large boost.....
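For what it's worth, 177 points on a run of roughly 24,300 is well under 1%, which is comfortably inside normal run-to-run noise. A quick check (the scores are taken from the posts above; the 2-3% noise figure is my own rule of thumb, not a Futuremark number):

```python
def percent_diff(a: float, b: float) -> float:
    """Difference between two benchmark scores as a percentage of the lower."""
    lo, hi = sorted((a, b))
    return (hi - lo) / lo * 100

stock_fan = 24334          # the Vantage Performance run at stock (P24334)
max_fan = stock_fan + 177  # the 95%-fan run, 177 points higher

print(f"{percent_diff(stock_fan, max_fan):.2f}%")  # 0.73%
```

At roughly 0.7%, versus the 2-3% swing you can see between back-to-back runs, the fan speed made no measurable difference here.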


----------



## RagingCain

Quote:


> Originally Posted by *Abiosis;12899561*
> nVidia GeForce Driver v267.91
> 
> Version:267.91 Certified
> 
> Release Date:2011.03.28
> 
> Supported products:GTX 590
> 
> Operating System:Windows Vista 64-bit, Windows 7 64-bit
> 
> Language:English (U.S.)
> 
> File Size:127 MB
> 
> This driver adds support for the new GeForce GTX 590 dual-GPU graphics card, including support for Quad SLI when paired with a second GeForce GTX 590 card.
> 
> For 3D Vision support, you will also need to install the NVIDIA 3D Vision Controller Driver v266.21.
> 
> Other
> 
> * Installs HD Audio driver version 1.1.13.1
> * Installs PhysX System Software to version 9.10.0514
> 
> http://www.nvidia.com/object/win7-wi...91-driver.html
> 
> _3 new drivers within two days...it seems they care~_


Very nice catch, how is your Raven project going Ambi?? Been a while since Dead Space 2









I haven't started on my Tempest mod :*( -> Just blew all my hooker money on 2x 590s
Quote:


> Originally Posted by *rush2049;12899596*
> Ok downloading the new drivers.
> 
> Here is a small comparison for everyone to see. This is the exact same settings except the fan is manually set to 95% on the one.
> 
> http://3dmark.com/compare/3dmv/3027371/3dmv/3023525
> 
> Its 177 points difference. (I haven't use vantage enough, let me know if 177 is withing normal variances)
> 
> I will do another clean driver install with the .91 drivers and test again. Hoping for a large boost.....


Don't forget that the new drivers being pumped out (every day now, it seems) are generally fixing problems, not boosting performance, so keep that in mind since you have had some BSODs.

Secondly, I had quite a few BSODs with a 1090T/1100T back in the day that turned out to be related to installing a 3rd 5870: the CPU-NB needed more voltage to handle the extra GPU. What are your settings? Maybe I can troubleshoot it with you.


----------



## Pedros

Those are some low results you have there Rush.
My 580 was doing >25000 marks ... :x


----------



## rush2049

Here are my relevant BIOS settings. I could up the NB volts; I kept them at stock back when I was doing my CPU benching....

CPU Ratio: 14
CPU Bus: 250
PCIE: 100
DRAM: 1667
CPU/NB: 2000
HT Link: 2000

CPU Volt: 1.300000
CPU/NB Volt: 1.162500
DRAM Volt: 1.500000
HT Volt: 1.20000
NB Volt: 1.10000

CPU Load-line Cali: Enabled
CPU/NB Load-line Cali: Enabled

How much would you suggest upping the NB volts? .05?


----------



## Abiosis

Quote:


> Originally Posted by *RagingCain;12899621*
> Very nice catch, how is your Raven project going Ambi?? Been a while since Dead Space 2


_Dead Space II is pretty old news now...

I rarely stick with the same game for long...







_


----------



## delavan

Quote:


> Those are some low results you have there Rush.
> My 580 was doing >25000 marks ... :x


=(
Maybe that's related to 3DMark being CPU-dependent. The X6s are number crunchers, but they don't push 3DMark scores as high as Intel chips do.

Anyway, I just checked my shipping status... my "Classified" 590 will be here today...
I feel like returning it before even opening the package... I don't like stories of expensive cards throttling down, so maybe I'll ship the card back... let's see what my etailer has to say about that... I don't feel like being stuck with a limited-edition card that everybody and their dog is laughing at... for being a massive launch screw-up!

Stupid Nvidia. They screwed the pooch with the GTX 570's weak power phases, and then they reproduce the same issue on another card 4-5 months later?
So what now, I'll have to increase the fan speed right out of the box to prevent my flagship card from throttling at stock clocks?

And the card might be running cooler than the HD 6990, but is it only cool because it's throttling down?

What's next? We wait for a guide on "how to bake a 590"?
Dunno what to do....


----------



## Pedros

delavan, the 590 runs cooler because the GPUs have super low voltages and the PCB is smaller... that way the cooling is efficient.

That's why you can't really add volts to the card... or at least, Nvidia discourages users from doing so...







Once you increase the voltage, the GPUs get hotter, the VRM load increases, and the cooling starts to become insufficient ...







The GPUs will be cooled alright ... the problem is the power circuitry.

But I mean, the damn card is a beast, just like the 6990







People won't need to overclock it ... although, if you are like me ... you'll be really tempted to do so







eheheh

It's like having a big red button on a control panel labelled "PRESS ONLY IN CASE OF EMERGENCY" ...

Man... every time you look at the damn button you'll fight not to press the damn thing


----------



## RagingCain

Quote:


> Originally Posted by *rush2049;12899676*
> here are my relevant bios settings, I could up the NB volts, I kept them at stock back when I was doing my cpu benching....
> 
> CPU Ration: 14
> CPU Bus: 250
> PCIE: 100
> DRAM:1667
> CPU/NB: 2000
> HT Link: 2000
> 
> CPU Volt: 1.300000
> CPU/NB Volt: 1.162500
> DRAM Volt: 1.500000
> HT Volt: 1.20000
> NB Volt: 1.10000
> 
> CPU Load-line Cali: Enabled
> CPU/NB Load-line Cali: Enabled
> 
> how much would you suggest up on the NB volts? .05?


I would up the core voltage to 1.325v first and see if that helps: if you have an OC that is "tripping" over an error on a core, it will significantly hurt benchmarks, if not BSOD. To see if you have a core failing, start by running Prime95's Small FFTs test. If you are 100% stable, move on to the Custom Blend test with CPU-NB / memory stress testing. Also, 250 FSB is pushing it a bit on your ASUS mobo, yet I don't see any increase in the mobo NB voltage; I would put that at 1.20, a modest increase to help stabilize the FSB.

I have to tell you, you will see significant gains on an AM3 platform if you get your CPU-NB up to 2650~2800MHz. 3000MHz would be best, and the IMC is capable of it, but that is where you have to tweak both your RAM and CPU voltages. For 2800 you will need approximately 1.31~1.33v CPU-NB depending on your CPU (maybe more, maybe less). Leave LLC on for both. Leave the HT Link speed at 2000, and if that voltage is on AUTO, leave it there; if it's not on AUTO, get it there.

What type of BSOD are you getting (which STOP code is it)? Your memory voltage looks a little low for dual channel @ 1667, though that could be the stock spec; I just don't know what RAM you have. You can push a little more voltage, but it's not strictly necessary unless it's a random BSOD below x100; x100-125 is primarily CPU clock speed/voltage related, although technically it's all CPU related.

As far as 3DMark 11 / Vantage go, AMD scores will never be as high as those of Intel-based systems. Just a fact of life; however, the GPU sub-scores should be the same, or slightly in Intel's favor. Keep that in mind when posting just a P-score or X-score.
Quote:


> Originally Posted by *Abiosis;12899691*
> _Dead Space II is pretty old news now...
> 
> I hardly hang on with the same game too long...
> 
> 
> 
> 
> 
> 
> 
> _


Tell me about it... Dead Space 2 -> Bioshock -> Bioshock 2 (half-way) -> Dead Rising 2 (first chapter) -> Dragon Age 2 (Act 3) -> Homefront (Completed) -> Crysis 2 (Completed) -> WoW Cataclysm (84 with 10% to go to 85)... back to Bioshock 2... or Crysis 2 Dx11.


----------



## Pedros

Ok... GPU score is 28339 ... my 580 at 960 would do 30350 or something ...


----------



## RagingCain

Quote:


> Originally Posted by *Pedros;12899889*
> Ok... GPU score is 28339 ... my 580 at 960 would do 30350 or something ...


Looks like SLI is not running at full steam. Have you confirmed both GPUs light up during the benchmark with Afterburner, Precision or GPU-Z?


----------



## rush2049

Well, my CPU is Prime95-stable for 24 hours. I know what I am doing when it comes to CPU/RAM overclocking. The volts are correct for the RAM (see my system in my sig for details).

If the CPU/NB voltage increase makes the PCIe interfaces more stable, I will try that after a few more test runs with the new driver. As I type this I am on run #3 in a row with Vantage and so far no BSODs at all. Too bad the release notes for the .91 driver don't say something like "fixed compatibility with AM3 platforms"; that would convince me the issue is gone. Luckily, the only time my system BSODs with the 590 is during 3DMark Vantage/11. Playing Crysis 2, BF:BC2, L4D2, or any other game on max settings causes no issues.....

As I type this, run #4 has started. No errors yet.


----------



## Pedros

Just for reference...

GTX590 @ 700










GTX 580 @ 960










GTX 580 @ default


----------



## RagingCain

Quote:


> Originally Posted by *Pedros;12899965*
> Just for reference...
> images


Looks good so far. Get the GTX 590 up to 772MHz with 2000MHz memory and you should see very close to GTX 580 SLI (at stock).
Quote:


> Originally Posted by *rush2049;12899957*
> If the CPU/NB voltage increase makes the pci-e interfaces more stable then I will try that after a few more test runs with the new driver.


Glad the new drivers are holding up for you.

More CPU-NB voltage can increase stability on a highly stressed CPU/IMC, but that usually matters with 3 or 4 GPUs, or with 2 GPUs plus some extra controllers like a sound card or RAID controller.

However, as far as performance goes, you are definitely holding your system back at the stock NB frequency, even with one GPU.


----------



## Pedros

That's a no-go, Raging ... OCP will kick in if you give it more voltage than what we were using (0.963 or 0.975, can't remember)

I still think the GTX 580 scores amazingly well for a single-GPU solution!!! :x


----------



## MrTOOSHORT

I was kind of excited about the 6990 and GTX 590, but I think I'll just wait for the eVGA GTX 580 3gb version to come out.


----------



## sl00tje

Please update topic: GeForce Driver v267.91


----------



## Pedros

Unless you really need the 3GB ... it won't perform any better IMO ...
But then again, you have a 2560x1600 screen









More memory = less OC potential = lower frequency


----------



## RagingCain

Quote:



Originally Posted by *Pedros*


That's a no go Raging ... OCP will kick in, if you give more voltage than what we were using ( 0,963 or 0,975, can't remember )

I still think the GTX580 scores amazingly well for a single gpu solution!!! :x


Are you on stock air cooling?

Can you show me this? Mine don't arrive here (hopefully) till Wednesday, otherwise I would already have been on HWBot clocking and testing every setting.

Do a comparison of 700 vs 772 and show screenshots of everything, including GPU usage.

Quote:



Originally Posted by *Pedros*


Unless you really need the 3Gb ... it won't performe any better imo ... 
But then again, you have a 2560x1600 screen









More memory = less oc potential = less frequency










The Phantoms OCed just as well as the reference GTX 580 designs. I wish I had that article, but I am sure you can find it: search for "GTX 580 Phantom". Secondly, 3GB is necessary for GTX 580s in 3D Surround / 2D Surround @ 1920x1080 and higher; a GPU memory bottleneck appears, leaving it uncompetitive with a 6970.


----------



## Pedros

I had to release the card to another reviewer .. it was a test sample from Asus.

But here's the 750MHz run







35 vs 38







You can really "feel" OCP kicking in


















As for the 580: yes, they overclock as well as reference, but the memory won't







But hey... we're talking about peanuts


----------



## L D4WG

Hi guys, I'm really looking forward to getting one of these beasts. If you could choose, which of the following two would you pick?

http://altech.com.au/displayproduct.aspx?search=gtx+590

I can get both of them for about $200-$300 cheaper than that (reseller account)

Just have to wait for them to come into stock


----------



## RagingCain

Quote:


> Originally Posted by *L D4WG;12900185*
> Hi Guys, I'm really!!! looking forward to getting one of these beasts, if you could choose which of the following 2 would you choose?
> 
> http://altech.com.au/displayproduct.aspx?search=gtx+590
> 
> I can get both of them for about $200-$300 cheaper than that (reseller account)
> 
> Just have to wait for them to come into stock


I personally would go Gigabyte; I have had good results with them. If they are all reference cards, they are all the same, so it shouldn't matter. Naturally, pick the company with the best warranty, and I guess then the goodies?


----------



## Alatar

They should all be reference designs for now, so just take the one you like more.

And oh goody goody, post office time brb


----------



## Pedros

Soooo ... any watercooled results ?!?!
This card's performance exploration is going so slowwwww








Everyone is afraid to fry the damn thing?









Sissy's ....







ehehehe


----------



## CSHawkeye

Quote:


> Originally Posted by *Pedros*
> Soooo ... any watercooled results?!?!
> This card's performance exploration is going so slowwwww
> 
> Is everyone afraid to fry the damn thing?
> 
> Sissies.... ehehehe


Hey man, I spent $700 on this. I am happy with 700/1800 at stock volts; I feel no need to go any higher.


----------



## Alatar

Excuse the bad camera; the other one that I normally use is currently in southern Europe...


----------



## Pedros

Hawkeye... did you bench games at 700MHz vs stock? That would be great to check.


----------



## soilentblue

I wonder how long before someone on water plays with the card at 1.00v and OCP off?


----------



## sl00tje

Quote:


> Originally Posted by *Alatar*


Congratulations!


----------



## CSHawkeye

Quote:


> Originally Posted by *Pedros*
> hawkeye... did you bench games at 700MHz vs stock? that would be great to check


No, I can test if you want. To be honest, I am happy as a clam now since I have this and my GTX 465 to power my two secondary monitors and PhysX. Before, I had GTX 580 SLI with an 8400GS powering my secondary screens. Temps were way hotter than what I have now since I had to sandwich the GTX 580s.


----------



## Alatar

Just a quick question: do you need the latest normal version of MSI Afterburner for it to recognize the card properly, or do you need the latest beta version?


----------



## sl00tje

My EVGA GeForce GTX 590 Classified


----------



## Alatar

updated

Been playing Crysis 1 and BFBC2; so far so good.


----------



## Juzam

http://3dmark.com/3dm11/919611

Why is my score this low? A friend who has 480 AMP! SLI and an i7 975 scored 10056 points. My CPU shouldn't be this much of a bottleneck. I blame the experimental nature of the drivers...


----------



## Pedros

dude... your core clock is at 300MHz?!?!?! ... :x

Some kind of bug or glitch in the matrix?

NVIDIA GeForce GTX 590
# of cards: 2
SLI / CrossFire: On
Memory: 1536 MB
Core clock: 300 MHz
Memory clock: 135 MHz


----------



## Juzam

Quote:


> Originally Posted by *Pedros*
> dude... your core clock is at 300MHz?!?!?! ... :x
> 
> Some kind of bug or glitch in the matrix?
> 
> NVIDIA GeForce GTX 590
> # of cards: 2
> SLI / CrossFire: On
> Memory: 1536 MB
> Core clock: 300 MHz
> Memory clock: 135 MHz


It's a 3DMark system detection bug I think...


----------



## Pedros

Ok... but... I really can't understand... maybe OCP bottlenecking?
That's the score I get with a single overclocked 580 :x

You should be getting at least 1000 marks more... :x


----------



## Juzam

Quote:



Originally Posted by *Pedros*


Ok ... but... really can't understand ... maybe OCP bottlenecking ?
That's the score i do with a single clocked 580 :x

You should be getting at least 1000 marks more ... :x


But I'm not overclocking the card :S


----------



## Pedros

And... OCP kicks in even without any overclocking... and the latest drivers perform worse than the drivers that came with the card, since the OCP is calibrated to kick in more often and earlier...

Now... what drivers are you using? I think NVIDIA launched a new set of drivers for the 590 today...

Make a Vantage run and tell me what you get...


----------



## Pedros

Let me just try to compare your graphics score with another GTX 590... the total score can be handicapped by your CPU... be back in a minute.

Checking Futuremark... everyone is getting low results with the 590

http://3dmark.com/search?resultTypeI...&chipsetId=667

This is really strange... I hope NVIDIA didn't get too excited when entering the new variables for OCP... :x

This is my run


----------



## Juzam

I can't do vantage since textures go all over the place at the first test and FPS is an amazing 35... I'm slowly starting NOT to like my 590...


----------



## Pedros

Hmmm ... remove your 280 and retest ...


----------



## grunion

Quote:


> Originally Posted by *Juzam;12903797*
> I can't do vantage since textures go all over the place at the first test and FPS is an amazing 35... I'm slowly starting NOT to like my 590...


That's an ugly SLI/CFX glitch that rears its head now and again.

Just let the test finish and restart your system; that should clear it up.


----------



## Abiosis

_It seems we've got company..._
Quote:


> Conclusion:
> 
> After our initial testing of the HD 6990 we moved the graphics card over to a backup system that we were using to test new games for our benchmarking suite. We were able to complete testing with the HD 6990 in some of our new benchmarks including; H.A.W.X 2, Lost Planet 2 and DiRT 2, however, when we were testing the performance of Dragon Age II the HD 6990 died on us. At the time of it's demise the card was set at the stock 830MHz setting and the BIOS switch was in the default position. This fact that it died could have been that we tested the graphics card at both the 375W and 450W settings, but since the review we have left the settings at default level.
> Presently this leaves Neoseeker without a HD 6990 for future testing. AMD will not warranty the card so we are left with no choice but to reach out to their partners to see if we can get a sample. So, from this point on it will only be seen in the benchmarks that were used at the time of the review until we can get a replacement.


http://www.neoseeker.com/Articles/Hardware/Reviews/AMD_HD_6990_Antilles/18.html

_This reviewer BOSCO said he had 4 HD 6990s die the same way..._


----------



## OverK1LL

Has anybody seen temps for a 590 hydro copper around anywhere?

Mine is idling only 1C above ambient and 3C, 5C (gpu1, gpu2) above ambient on load... Is that even possible? I feel like those numbers are too low to be correct...


----------



## Alatar

Quote:


> Originally Posted by *Abiosis;12904174*
> _It seems we got a company...
> 
> 
> 
> 
> 
> 
> 
> _
> 
> http://www.neoseeker.com/Articles/Hardware/Reviews/AMD_HD_6990_Antilles/18.html
> 
> _This reviewer BOSCO said he got 4 HD6990 die in a same way..._


Please don't go trolling with that. We don't need to do that (not thinking you would, but anyway...).

I seem to be having some PSU issues and the old unit seems to be dying on me. I guess it's time to buy a new one then.


----------



## Abiosis

_Hmmm... you're right ~ we don't need that~

My bad_


----------



## Juzam

Quote:


> Originally Posted by *grunion;12903989*
> That's an ugly sli/cfx glitch that rears it head now and again.
> 
> Just let the test finish and restart your system, should clear it up.


Ok did that and the result is this:








Quote:


> Originally Posted by *Abiosis;12904483*
> _Hmmm...you're right ~ we don't need that~
> 
> My bad
> 
> 
> 
> 
> 
> 
> 
> _


Did you check your inbox?


----------



## Abiosis

Quote:


> Originally Posted by *Juzam;12904654*
> 
> Did you check your inbox?


_I would say go for it...

I'm nowhere near audiophile level or anything of that nature... but I bet you'll like the sound you hear with this hardware~_


----------



## 2010rig

do we have any water results yet?


----------



## OverK1LL

Quote:


> Originally Posted by *2010rig;12905211*
> do we have any water results yet?


Only mine, but I need someone to confirm them because I have never seen temps that low, even on idle within a single loop.


----------



## Pedros

Overkill... they will be really low! Don't forget you're running two GF110s underclocked and undervolted.

So you will get... like 40ºC at load or less...


----------



## saulin

Quote:


> Originally Posted by *OverK1LL;12905307*
> Only mine, but I need someone to confirm them because I have never seen temps that low, even on idle within a single loop.


Can you overclock your card to say 800Mhz and do a 3DMark 2011 run in performance and xtreme?


----------



## 2010rig

Quote:


> Originally Posted by *saulin;12905368*
> Can you overclock your card to say 800Mhz and do a 3DMark 2011 run in performance and xtreme?


Or just 772; run some benches to see if the results match or come near 580 SLI?


----------



## saulin

Quote:


> Originally Posted by *2010rig;12905452*
> Or just 772 run some benches to see if the results match or come near 580 SLI?


Yeah man, we want benchmarks. I can't believe someone has the card on water and no benchies up hehe


----------



## 2010rig

Quote:


> Originally Posted by *saulin;12905479*
> Yeah man, we want benchmarks. I can't believe someone has the card on water and no benchies up hehe


I just want to know if the card DOES get throttled beyond 700 under water.


----------



## Juzam

I was playing Empire: Total War a moment ago and then... suddenly... BAM! PC shutdown. Reboot: no POST. Turned off the power switch behind the PSU and, just as I was reaching for it, the mesh behind the PSU burnt my fingers. It was scorching hot and a faint smell of burnt capacitor slowly crept out of it. Waited 10 minutes, then held my breath and turned the power on again. Everything is working now, but the PSU's fan is at 100% and there's really hot air coming from the back of it. I knew that Tagan's BZ series was subpar quality compared to Corsair and Enermax, but I didn't quite expect it to fail after all this time. Now I can see why I'm getting lower than usual performance. Five years' worth of capacitor aging, plus three GPUs and one wattage gobbler of an extreme edition CPU, have finally worn out my Tagan. Time to get a Corsair AX...


----------



## soilentblue

Quote:


> Originally Posted by *Juzam;12906090*
> I was playing Empire: Total War a moment ago and then... suddenly... BAM! PC shutdown. Reboot: no POST. Turned off power switch behind the PSU and just as I was reaching for it, the mesh behind the PSU burnt my fingers. It was scorching hot and a faint smell of burnt capacitor slowly creeped out of it. Waited for 10 minutes and then I held my breath and turned on power again. Everything is working now but the PSU's fan is at 100% and there's really hot air coming from the back of it. I knew that Tagan's BZ series were subpar quality compared to Corsair and Enermax but I didn't quite expect it to fail after all this time. Now I can see why I'm getting lower than usual performance. 5 years worth of capacitor aging +three GPUs and one W gobbler of an extreme edition CPU has finally worn out my Tagan. Time to get a Corsair AX...


Juzam, this isn't unusual with the Fermi line. If you go on the NVIDIA forums you will see that the 400 and 500 series prefer clean volts. The GTX 570 is a major one though.


----------



## Pedros

Quote:


> Originally Posted by *2010rig;12905636*
> I just want to know if the card DOES get throttled beyond 700 under water.


ahahah you are just like ... trying to understand if the card rocks under water or not









Quote:


> Originally Posted by *Juzam;12906090*
> I was playing Empire: Total War a moment ago and then... suddenly... BAM! PC shutdown. Reboot: no POST. Turned off power switch behind the PSU and just as I was reaching for it, the mesh behind the PSU burnt my fingers. It was scorching hot and a faint smell of burnt capacitor slowly creeped out of it. Waited for 10 minutes and then I held my breath and turned on power again. Everything is working now but the PSU's fan is at 100% and there's really hot air coming from the back of it. I knew that Tagan's BZ series were subpar quality compared to Corsair and Enermax but I didn't quite expect it to fail after all this time. Now I can see why I'm getting lower than usual performance. 5 years worth of capacitor aging +three GPUs and one W gobbler of an extreme edition CPU has finally worn out my Tagan. Time to get a Corsair AX...


Jeez.

Damn luck... maybe that's the problem there... But hey, it was a good PSU while it lasted.

The GTX 590 takes no prisoners.

eheheh


----------



## srsparky32

Quote:


> Originally Posted by *Arizonian;12898165*
> Why don't you go back to your 'I'm a huge ATI fanboy' club. So inappropriate on this thread Reflex99. Thanks for letting everyone know how little you know and how immature. Go back to your 'I have no money and can't afford a real GPU' club', so I have to invade others clubs thread and make derogatory statements to make myself feel better. Get a life man.
> 
> Your not even worth a real response. Your reps must be from people just like yourself and hold very little value.


a 6870 is a bad card? Huh, I didn't know that.

Oh, and I'm guessing you have a flawless rep record, right?

Just another martyr on OCN. What a surprise.

Oh, and just because you have more money than he does doesn't mean you are better than him.


----------



## Pedros

looooooooooooooooooooooooool


----------



## remer

So I'm dying to really push this thing. Do you think a corsair AX850 would be enough to handle any gpu overclocking with my i7 > 4GHz?


----------



## grunion

Quote:


> Originally Posted by *remer;12906972*
> So I'm dying to really push this thing. Do you think a corsair AX850 would be enough to handle any gpu overclocking with my i7 > 4GHz?


Yes

And don't post anything like that again.


----------



## Twilex

Okay, so the EK blocks on FrozenCPU sold out instantly. Where the hell are you people with water results?!?!!


----------



## OverK1LL

Quote:


> Originally Posted by *Twilex;12907166*
> Okay so the ek blocks on frozencpu sold out instantly. Where the hell are you people with water results?!?!!


I live 10 minutes away from FrozenCPU, so if you need me to swing by and bash some heads, lemme know. ;-)

Anyways,

I'd love to OC this thing, but I don't know how hard I can push it; never OCd a GPU before, so I am a bit hesitant to do it right off the bat on an $800 card. I bought the card with the intent to OC it but haven't yet. I guess now is the best time to learn.

If anyone would like to help me out, I'd be more than willing to post some results...


----------



## Twilex

Quote:


> Originally Posted by *OverK1LL;12907304*
> I live 10 minutes away from FrozenCPU, so if you need me to swing by and bash some heads, lemme know. ;-)
> 
> Anyways,
> 
> I'd love to oc this thing, but I don't know how hard I can push it; Never OCd a gpu before so I am a bit hesitant to do it right off the bat on an $800 card. I bought the card with the intent to oc it but haven't yet. I guess now is the best time to learn.
> 
> If any one would like to help me out, I'd be more than willing to post some results...


Right off the bat, get MSI Afterburner. Make sure it's the newest version. Don't touch the voltage just yet. Throw the core at 700MHz and bump the memory 100MHz. Download Vantage and do a stock run first, then do a run with the settings I just told you. Then report back.


----------



## Twilex

Quote:


> Originally Posted by *OverK1LL;12907304*
> I live 10 minutes away from FrozenCPU, so if you need me to swing by and bash some heads, lemme know. ;-)
> 
> Anyways,
> 
> I'd love to oc this thing, but I don't know how hard I can push it; Never OCd a gpu before so I am a bit hesitant to do it right off the bat on an $800 card. I bought the card with the intent to oc it but haven't yet. I guess now is the best time to learn.
> 
> If any one would like to help me out, I'd be more than willing to post some results...


P.S. I hate you for living that close to FrozenCpu >< I shop there so much =/


----------



## delavan

Sign me up!
I couldn't resist opening the box... crossing fingers... dunno if the Unigine score is good or not...

Unigine Heaven Benchmark v2.1
FPS: 99.4
Scores: 2504
Min FPS: 6.2
Max FPS: 207.5

Hardware
Binary: Windows 32bit Visual C++ 1500 Release May 21 2010
Operating system: Windows 7 (build 7601, Service Pack 1) 64bit
CPU model: Intel(R) Core(TM) i5-2500K CPU @ 3.30GHz
CPU flags: 3300MHz MMX SSE SSE2 SSE3 SSSE3 SSE41 SSE42 HTT
GPU model: NVIDIA GeForce GTX 590 8.17.12.6791 1536Mb

Settings
Render: direct3d11
Mode: 1680x1050 fullscreen
Shaders: high
Textures: high
Filter: trilinear
Anisotropy: 4x
Occlusion: enabled
Refraction: enabled
Volumetric: enabled
Replication: disabled
Tessellation: normal

----------



## L D4WG

Quote:


> Originally Posted by *RagingCain;12900301*
> I personally would go Gigabyte, I have had good results with them. If they are all references, the cards are all the same so it shouldn't matter. So naturally it would be the company with the best warranty, and I guess then goodies?


Cool, thanks. I wish they sold the EVGA cards!!! They look so much better!! Oh well, I'll be getting them at a heavily reduced price, so I shouldn't complain, plus 3 years RTB warranty!!


----------



## L D4WG

Far out, the EVGA cards are only $730 in the USA!

http://www.evga.com/products/moreinfo.asp?pn=03G-P3-1598-AR

Does anyone know of a way I could buy it and get it shipped to Australia? I would happily pay $50 in postage to get one at that price!


----------



## soilentblue

Quote:


> Originally Posted by *delavan;12908101*
> Sign me up!
> I couldn't resist opening the box... crossing fingers... dunno if the Unigine score is good or not...
> 
> Unigine Heaven Benchmark v2.1
> FPS: 99.4
> Scores: 2504
> Min FPS: 6.2
> Max FPS: 207.5
> 
> Hardware
> Binary: Windows 32bit Visual C++ 1500 Release May 21 2010
> Operating system: Windows 7 (build 7601, Service Pack 1) 64bit
> CPU model: Intel(R) Core(TM) i5-2500K CPU @ 3.30GHz
> CPU flags: 3300MHz MMX SSE SSE2 SSE3 SSSE3 SSE41 SSE42 HTT
> GPU model: NVIDIA GeForce GTX 590 8.17.12.6791 1536Mb
> 
> Settings
> Render: direct3d11
> Mode: 1680x1050 fullscreen
> Shaders: high
> Textures: high
> Filter: trilinear
> Anisotropy: 4x
> Occlusion: enabled
> Refraction: enabled
> Volumetric: enabled
> Replication: disabled
> Tessellation: normal


Your minimum should definitely not be 6 FPS. Something is up with this batch of drivers and this card.


----------



## Twilex

Quote:


> Originally Posted by *L D4WG;12908237*
> Far out, the EVGA cards are only $730 in the USA!
> 
> http://www.evga.com/products/moreinfo.asp?pn=03G-P3-1598-AR
> 
> Does anyone know of a way I could buy it and get it shipped to Australia? I would happily pay $50 in postage to get one at that price!


Only way really would be to find someone you can trust in the states who you can ship it to, then they ship it to you.


----------



## soilentblue

I'm feeling like I'm not giving this card a chance, but I may return mine (when it comes in tomorrow) for a watercooled 580 or something.


----------



## Twilex

I just hope I can get this water block on without any hassles and make sure I don't leave any cosmetic damage on the stock heatsink, just in case I need to send it back.


----------



## soilentblue

Was it just that one guy who was having problems with voltage and the latest drivers?


----------



## L D4WG

Quote:


> Originally Posted by *Twilex;12908376*
> Only way really would be to find someone you can trust in the states who you can ship it to, then they ship it to you.


Yeah, but I don't know anyone in the states. I wonder why they are $1000 on ebay.com; the person selling them can't be making many sales if they are a few hundred dollars more expensive than EVGA.com :S

I just got a quote from a website that ships products from the USA to other countries; they slapped a good $200 on top of the cost, so that's not an option.

sigh


----------



## Twilex

Quote:


> Originally Posted by *soilentblue;12908827*
> was it just that one guy that was having problems with voltage and the latest drivers?


Not sure which guy you're talking about.

Also wish masked would show us some benchies on this supposed 840MHz overclock.
Quote:


> Originally Posted by *L D4WG;12908860*
> Yeah, but I don't know anyone in the states. I wonder why they are $1000 on ebay.com; the person selling them can't be making many sales if they are a few hundred dollars more expensive than EVGA.com :S
> 
> I just got a quote from a website that ships products from the USA to other countries; they slapped a good $200 on top of the cost, so that's not an option.
> 
> sigh


Yea, and EVGA won't ship to those types of companies anyway =/ I would prolly try to find someone on here with a buttload of rep and trust them to send it to you. Otherwise you're prolly gunna be waiting awhile.


----------



## delavan

Quote:


> your minimum should definitely not be 6fps. something is up with these batch of drivers with this card.


KK, but how is the average score... making sense? I haven't noticed a stutter while playing Black Ops, same for Unigine... no visible stutter...

But I'm always looking at the FPS counter, thinking... is it the scenery that's more demanding, or the darn thing throttles... lol, conspiracy theory...


----------



## L D4WG

Quote:


> Originally Posted by *Twilex;12908912*
> 
> Yea and EVGA wont ship to those types of companies anyway =/ I would prolly try to find someone on here with a buttload of rep and trust them to send it to you. Otherwise youre prolly gunna be waiting awhile.


That is a pretty good idea, and yes, by waiting a while it would basically be waiting forever, because the prices on EVGA cards in Aus never go down. Now to find someone....


----------



## delavan

first 3DMark06 run just for kicks

It's nice, I was getting 20 000 3DMarks with my Q9550 @ 3.8 and HD 5870...
Now 32 000...


----------



## Toilet-Duck

Noticed on another forum that with the newer drivers the card is now throttling down more, and more often, in normal games, because the limits have been lowered due to the card blowouts.

Is this true?

I'm really considering this graphics card; want to avoid AMD due to their crappy drivers.


----------



## Twilex

Quote:


> Originally Posted by *delavan;12909419*
> first 3DMark06 run just for kicks
> 
> It's nice, I was getting 20 000 3Dmarks with my Q9550 @ 3.8 and HD5870...
> Now 32 000...


Very nice! Your 2500k prolly added at least 5k to your score over your old cpu. Awesome score though!


----------



## Twilex

Quote:


> Originally Posted by *Toilet-Duck;12909461*
> Noticied on another forum that the newer drivers are now throttling down more in normal games and more often with the newer driver because its put the limits have been lowered due to the card blowouts.
> 
> Is this true?
> 
> I'm really considering this graphics card, want to avoid AMD due to their crappy drivers.


Yes, the new drivers implement OCP in software, which throttles the card back when it gets too hot.


----------



## Toilet-Duck

Quote:


> Originally Posted by *Twilex;12909514*
> Yes the new drivers implement OCP software which throttle the card back when it gets too hot


Any idea what temperatures we are talking about here, high 80s/90s?


----------



## delavan

The highest I got was 79C on one core and 84C on the other... no idea how to make sure there is throttling involved... any ideas?
The full backplate on the EVGA card is nice; dunno if it actually helps... probably a little bit.


----------



## Twilex

Not sure what temp it kicks in at, tbh. There is very little info as to what makes it kick in, but the speculation is temps. People are also thinking core clock could be a factor as well.


----------



## Behemoth777

Quote:


> Originally Posted by *Arizonian;12898165*
> Why don't you go back to your 'I'm a huge ATI fanboy' club. So inappropriate on this thread Reflex99. Thanks for letting everyone know how little you know and how immature. Go back to your 'I have no money and can't afford a real GPU' club', so I have to invade others clubs thread and make derogatory statements to make myself feel better. Get a life man.
> 
> Your not even worth a real response. Your reps must be from people just like yourself and hold very little value.


Oh, I forgot, everyone who has a differing opinion is a fanboy.

Your response to his is pretty pathetic.


----------



## grunion

Quote:


> Originally Posted by *delavan;12909655*
> The highest I got was at 79C on one core and 84 on the other..no idea how to make sure there is throttling involved...any ideas?
> The full backplate on the EVGA card is nice, dunno if it actually helps...probably a little bit.


Just increase the speed 25MHz at a time, run a bench, and note the numbers.

Where the performance increase stops, or goes negative, back it down 15MHz or so and retest.
From there, if you want, give it an ever-so-slight voltage increase.
But I wouldn't until more testing has been done on the card.
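grunion's stepping procedure can be sketched as a simple loop. This is only an illustration: `run_benchmark` is a hypothetical helper standing in for an actual bench run (Vantage, Heaven, etc.) at a given core clock.

```python
# Sketch of the step-and-bench procedure described above.
# run_benchmark(core_mhz) is a hypothetical callable: it runs your
# benchmark at that core clock and returns the score.

def find_stable_core(start_mhz, run_benchmark, step=25, backoff=15):
    """Raise the core clock in `step` MHz increments until the benchmark
    score stops improving (or goes negative), then back off `backoff` MHz."""
    best_score = run_benchmark(start_mhz)
    clock = start_mhz
    while True:
        candidate = clock + step
        score = run_benchmark(candidate)
        if score <= best_score:          # gains stopped or went negative
            return candidate - backoff   # back it down and retest from here
        best_score = score
        clock = candidate
```

The returned clock is a starting point for retesting, not a guaranteed-stable value; voltage bumps stay a separate, manual decision, as grunion advises.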


----------



## delavan

Oh Grunion,

I'm not looking for an overclock on the card, I only want to figure out if it throttles down...

What would tell me... GPU-Z graphs? EVGA Precision? I need graphs for temp and clocks, plus recording...
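One low-tech way to answer that question is to let GPU-Z write its sensor log while gaming, then scan the log for core-clock dips. A minimal sketch, assuming a CSV export with hypothetical column names ("Date", "GPU Core Clock [MHz]", "GPU Temperature [C]") that you would adjust to whatever your GPU-Z version actually writes:

```python
# Scan a GPU-Z-style sensor log (CSV) for throttling: rows where the core
# clock fell below stock while the card was clearly in 3D mode.
# Column names and the 607 MHz stock default are assumptions -- match them
# to your actual log header and card.
import csv

def find_clock_drops(path, stock_mhz=607.0, tolerance=5.0):
    """Return (timestamp, clock, temp) rows where the core clock sat below
    stock but above 2D-idle levels, i.e. likely throttling under load."""
    drops = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            clock = float(row["GPU Core Clock [MHz]"])
            temp = float(row["GPU Temperature [C]"])
            if 300.0 < clock < stock_mhz - tolerance:  # below stock, not 2D idle
                drops.append((row["Date"], clock, temp))
    return drops
```

Keeping the temperature alongside each drop makes it easy to see whether dips line up with heat, or happen at low temps (which would point at OCP instead).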


----------



## OverK1LL

Quote:


> Originally Posted by *Twilex;12908651*
> I just hope i can get this water block on without any hassles and make sure i dont leave any cosmetic damage on the stock heatsink, just in case i need to send it back


Well the top part is easy! lol, but the bottom part is a different story. I think it uses a security hex. If you don't have one, you might want to add this to your tool box:

96 Piece Security Bit Set


----------



## rush2049

Alright everyone, to sum up a few things from what I have seen:

- Yes, the latest drivers (.91) throttle at stock... (the low scores I have been getting are a testament to that)

- The throttling is either not based on temperatures, or not solely based on temperatures; I kept mine under 60C and I still saw lower scores than with the earlier drivers.

- I have a feeling that the volt/amp draw is what is being measured; remember, power equals volts times amps. So when a heavy load is run, the power required increases, and a higher voltage is needed to meet demand. When the card raises the voltage to meet the need, the OCP kicks in. I am not seeing a core clock decrease though, so perhaps some other type of throttle is happening, like limiting that voltage increase (which would also explain why I am getting bluescreens under a long, heavy load).

- I am hesitant to overclock until the default clocks stop throttling; I will wait a week or so...
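The power reasoning above (P = V × I) can be made concrete with a toy calculation. To be clear, the 450W budget and per-rail amp figures below are made-up illustrative numbers, not NVIDIA's actual OCP thresholds:

```python
# Toy model of the power reasoning above: OCP watches current on the 12V
# inputs, and power scales with both voltage and amperage (P = V * I).
# The 450W budget and the rail currents are illustrative assumptions only.

def board_power(volts, amps):
    """P = V * I for a single input rail."""
    return volts * amps

def ocp_trips(rail_amps, rail_volts=12.0, limit_watts=450.0):
    """True if the combined 12V draw exceeds the assumed power budget."""
    total = sum(board_power(rail_volts, a) for a in rail_amps)
    return total > limit_watts

# Two 8-pin inputs pulling ~20 A each at 12 V -> 480 W, over the toy budget;
# the same card at ~18 A per input -> 432 W, under it.
```

This is consistent with what's observed above: a heavier load raises current draw, so a limiter like this can act without the core clock ever visibly dropping.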


----------



## Twilex

Yea, NVIDIA in their infinite wisdom thought it would be cool to make those screws Torx screws. Don't worry, I'm prepared =) Do you have the EK block, or the HC?


----------



## Twilex

Quote:


> Originally Posted by *rush2049;12909903*
> Allright everyone, to sum up a few things with what I have seen:
> 
> -Yes the latest drivers (.91) throttle at stock... (the low scores I have been getting are a testament to that)
> 
> -and the throttling is either not based upon temperatures or not solely based upon temperatures, I kept mine under 60 C and I saw lower scores than with the earlier drivers.
> 
> -I have a feeling that the volt/amp draw is what is being measured, if you remember Power equals volts times amps. So probably when a heavy load is being run the power required increases, and a higher volt is needed to meet demands. When the card raises the volt to meet the need the OCP kicks in. I am not seeing a core clock decrease though, perhaps some other type of throttle is happening, like limiting that volt increase to meet demands (which would also explain why I am getting bluescreens with a long(time) heavy load).
> 
> ---I am hesitant to overclock until the default clocks stop throttling, I will wait a week or so....


Christ could they have been any more lame with this card =/


----------



## Special_K

Can anyone verify whether the 3 holes on the back of the card (that vent inside the case) are threaded, possibly for mounting an aftermarket fan?


----------



## grunion

Quote:


> Originally Posted by *delavan;12909851*
> Oh Grunion,
> 
> I'm not looking for an overclock on the card, I only want to figure out if it throttles down...
> 
> what would tell me...GPU-Z graphs?, EVGA precision? I need graphs for temp and clocks + recording...


Well, NV states the max temp as 97°C, so thermal throttling happens somewhere around there.

OCP is the tricky one; it for sure shouldn't kick in during gaming.


----------



## OverK1LL

P33367
Stock clocks across the board...
Max temp during the bench, according to HWMon, was 30C/32C on GPU1/GPU2.

Now to try and monkey with the clocks...


----------



## Twilex

Quote:


> Originally Posted by *OverK1LL;12910271*
> P33367
> Stock Clocks around the board...
> Max temp during the bench, according to HWMon was 30C/32C on GPU1/GPU2.
> 
> Now to try and monkey with the clocks...


Jesus. Is that the EK block or the HC?


----------



## OverK1LL

Quote:


> Originally Posted by *Twilex;12910294*
> Jesus. Is that the EK block or the HC?


HC.


----------



## Twilex

What type of loop are you using? Just cpu, gpu? Those are some awesome temps


----------



## Alatar

Quote:


> Originally Posted by *Juzam;12906090*
> I was playing Empire: Total War a moment ago and then... suddenly... BAM! PC shutdown. Reboot: no POST. Turned off power switch behind the PSU and just as I was reaching for it, the mesh behind the PSU burnt my fingers. It was scorching hot and a faint smell of burnt capacitor slowly creeped out of it. Waited for 10 minutes and then I held my breath and turned on power again. Everything is working now but the PSU's fan is at 100% and there's really hot air coming from the back of it. I knew that Tagan's BZ series were subpar quality compared to Corsair and Enermax but I didn't quite expect it to fail after all this time. Now I can see why I'm getting lower than usual performance. 5 years worth of capacitor aging +three GPUs and one W gobbler of an extreme edition CPU has finally worn out my Tagan. Time to get a Corsair AX...


I got clean shutdowns with no BSODs or anything with BFBC2 SP and 3dmark Vantage.

New PSU for me I guess... I was kind of hoping my current one would be up to the task, but I guess not.

Updating atm, gonna take a min.


----------



## Twilex

If my kingwin takes a **** because of this card ill be furious


----------



## OverK1LL

Quote:


> Originally Posted by *Twilex;12910667*
> What type of loop are you using? Just cpu, gpu? Those are some awesome temps


Right now, just the GPU. It is for my June 1st build.

Black Ice SR1 Low Flow 420 and an Eheim HPPS+.

It is overkill for just one 590, but it is going to have 2 soon, so I figured I could use the cooling power.


----------



## Twilex

Nothing is overkill here! That's some good stuff right there though. Kinda wish I went with the HC; would have been cheaper in the long run. Guess we'll see how this bad boy runs in a loop with a 470, full chipset and memory block.


----------



## rush2049

Took a few pictures of where my card is installed for whoever it was that asked. They will be in the 'uber resolution' as someone called it... as I am too lazy to resize and my flickr account doesn't care about the size.

I will update this post when they are done uploading.


----------



## soilentblue

Someone asked this question in another thread and I think it's a good one....
Quote:


> could nVidia be setting the voltage limit in order to iron out the bugs on OCV/OCP


If these cards are able to reach 1.05v on reference cooling, I really don't see why the OCP is kicking in at stock, let alone at 1.05v. It's almost like the card doesn't know how to consistently ask for the proper amount of volts/amps.


----------



## soilentblue

When you buy a graphics card from the Egg, how long do you have before you can't return it for a refund?


----------



## Arizonian

Quote:


> Originally Posted by *soilentblue;12911410*
> when you buy a graphics card from the egg how long do you have before you can't return it for a refund?


Well, it depends on which card you buy and its policy. Your ASUS GTX 590 falls under the VGA Replacement-Only Return Policy.

Return for refund within: *non-refundable*
Return for replacement within: *30 days*

Products that state "This item is covered by Newegg.com's *Replacement Only 30-Day Return Policy*", or items labeled as "*Non-refundable*" (or similar labeling) must be returned to Newegg within 30 days of the invoice date for this policy to apply. Products covered by this return policy may only be returned for a replacement of the same or equivalent item. "Return" constitutes receipt of the product by Newegg, and not the mere issuance of an RMA.

The following conditions are not acceptable for return, and will result in the merchandise you have returned to Newegg being returned to you:

Cards exhibiting physical damage
Cards that are missing the manufacturer label containing model number, part number or serial number
Cards that are missing the manufacturer warranty label
Cards returned without all included accessories, bonus games, and documentation.

Sorry about that; the lower-priced video cards usually come with the refund-within-30-days offer. It just depends. I'm assuming that you meant your ASUS GTX 590 card, btw?


----------



## soilentblue

+rep man, appreciate the info. Looks like I'm stuck with the card and going to be riding these drivers out. I can always sell it, since the warranty is serial-number based, but I'll see how it looks in a month or so.


----------



## Juzam

Ok... crossed my fingers. I'll post new benchmark scores as soon as I install the PSU...


----------



## delavan

My first VANTAGE run ...590 Classified (630MHz stock clocks)

Please tell me if it's good; I'm not really familiar with 3DMark Vantage.










----------



## Juzam

Gratz. You've just scored 7000 points higher than me


----------



## Pedros

delavan, are you using the latest drivers or the ones before ?


----------



## Alatar

it says .91 in the GPU-Z window


----------



## Pedros

Yeah.... the latest ...
Btw, 36500 is nice ... I score 31k with a single 580, overclocked.


----------



## delavan

Quote:


> it says .91 in the GPU-Z window


True.

ALATAR???
For the score, it's with PhysX enabled... dunno if it makes a difference, I think it does... when you try to submit a "PhysX-enabled" score to Futuremark, it's rejected...
so we have to come up with a standard, I think...


----------



## Pedros

Mehhhh... another one bites the dust here


----------



## delavan

First 3DMark 11 run... is it a little low?

Tom's Hardware got 8934 pts in P-mode... seems close enough. What are your scores, guys/gals?
I was getting something like 4200 pts with my HD 5870 / Q9550 @ 3.6GHz...









haven't benched in years, I'm missing references...


----------



## Pedros

i've already posted my 580 scores a while back, but here's again for comparison ...


----------



## Twilex

3DMark 11 has shown that it isn't really Fermi-friendly, so I'm not gonna use it as a benchmark for the 590


----------



## Pedros

... one thing is Fermi being a "so-so" performer at DX11 ... another thing entirely is saying the benchmark isn't Fermi-friendly









If you check benchmarks of games, you'll see that DX11 games don't perform that well


----------



## OverK1LL

So, are any 590 owners worried about the issues with the cards "biting the dust"?

I'm starting to wonder if this was a good investment. Someone talk me down, please.


----------



## Alatar

Quote:


> Originally Posted by *OverK1LL;12916074*
> So, are any 590 owners worried about the issues with the cards "biting the dust"?
> 
> I'm starting to wonder if this was a good investment. Someone talk me down, please.


I think there's a saying: for every loud whiner on the net there are 100 happy people who are quiet (or something like that).

I wouldn't worry. I'm a happy camper since my card seems to be fine. My PSU is not though. The 12v rail isn't liking my upgrade... Gonna have to order a new one today.


----------



## Juzam

Quote:


> Originally Posted by *OverK1LL;12916074*
> So, are any 590 owners worried about the issues with the cards "biting the dust"?
> 
> I'm starting to wonder if this was a good investment. Someone talk me down, please.


Not at all. *Every single card that blew up was OVERCLOCKED, whether at default voltage or not*. I don't overclock my graphics cards, so I'm not worried. Just to be on the safe side I'm replacing my PSU and taking the 280 out (at least until there are proper drivers that support both). So that gives me even more reason not to be worried.


----------



## OverK1LL

Quote:


> Originally Posted by *Juzam;12916835*
> Not at all. *Every single card that blew up was OVERCLOCKED, whether at default voltage or not*. I don't overclock my graphics cards, so I'm not worried. Just to be on the safe side I'm replacing my PSU and taking the 280 out (at least until there are proper drivers that support both). So that gives me even more reason not to be worried.


Overclocked? what?


----------



## Juzam

Quote:


> Originally Posted by *OverK1LL;12917001*
> Overclocked? what?


Are you serious?


----------



## RagingCain

Quote:


> Originally Posted by *OverK1LL;12917001*
> Overclocked? what?


Listen Over, with my amateur overclocking ability and noobie-ness in general: these review sites did very amateur testing, trying to see how many volts and what overclock the card could take; some of them maxed out the voltage on just their second attempt at OCing it. Had they done it patiently and slowly, obvious things like artifacting and crashing would have occurred long before they got to 1.15~1.20 V. nVidia recommends staying at or under 0.988 V on water, and stock voltage on air cooling.

It's a weak overclocker, no doubt, but its reliability has not come into question at standard voltages and clocks. I also don't see how an overclock can fry a card if the voltage is left alone; from my experience, GPU overclocks just add instability if anything. Memory, on the other hand, can fry.

If you do get it, just don't OC till the real forum pros have had a go at it.

Kingpin has already got his LN2 pots made and has begun testing for world records.


----------



## Juzam

Quote:


> Originally Posted by *RagingCain;12917079*
> Listen Over, with my amateur overclocking ability and noobie-ness in general: these review sites did very amateur testing


Excuse my words, but I simply cannot stand ignorance. Forum whiners did even more ridiculous stuff, like running 10 demanding benchmarks in a row at their card's overclocked state. One of them (the idiot on Pedros' link, namely "Greg Brown") even mentions that he had to use towels to get the card out of the system. How ******ed do you have to be to let the card get so outrageously hot? Another one has opened threads about his failing motherboards and CPUs in the past. His PSU sucks or obviously he doesn't have a UPS to stabilize his crappy mains output.

On top of these idiots, ATi zealots further degrade the issue to the point that most people who don't have a clue get scared off and buy a 6990 instead. Just click on Pedros' link and follow it to those other forums as well. Those posts reek of ignorance and fanboyism.


----------



## OverK1LL

Quote:


> Originally Posted by *Juzam;12917026*
> Are you serious?


of course i'm not serious. are you?


----------



## saulin

Quote:


> Originally Posted by *Juzam;12917382*
> Excuse my words, but I simply cannot stand ignorance. Forum whiners did even more ridiculous stuff, like running 10 demanding benchmarks in a row at their card's overclocked state. One of them (the idiot on Pedros' link, namely "Greg Brown") even mentions that he had to use towels to get the card out of the system. How ******ed do you have to be to let the card get so outrageously hot? Another one has opened threads about his failing motherboards and CPUs in the past. His PSU sucks or obviously he doesn't have a UPS to stabilize his crappy mains output.
> 
> On top of these idiots, ATi zealots further degrade the issue to the point that most people who don't have a clue get scared off and buy a 6990 instead. Just click on Pedros' link and follow it to those other forums as well. Those posts reek of ignorance and fanboyism.


I totally agree with this


----------



## Imglidinhere

Quote:


> Originally Posted by *Juggalo23451;12882057*
> I have 2 of them


Oh my god... You have money...


----------



## CSHawkeye

I will post mine shortly, but with the 720p testing I am getting around 9300


----------



## Horsemama1956

Quote:


> Originally Posted by *Juzam;12917382*
> Excuse my words, but I simply cannot stand ignorance. Forum whiners did even more ridiculous stuff, like running 10 demanding benchmarks in a row at their card's overclocked state. One of them (the idiot on Pedros' link, namely "Greg Brown") even mentions that he had to use towels to get the card out of the system. How ******ed do you have to be to let the card get so outrageously hot? Another one has opened threads about his failing motherboards and CPUs in the past. His PSU sucks or obviously he doesn't have a UPS to stabilize his crappy mains output.
> 
> On top of these idiots, ATi zealots further degrade the issue to the point that most people who don't have a clue get scared off and buy a 6990 instead. Just click on Pedros' link and follow it to those other forums as well. Those posts reek of ignorance and fanboyism.


Considering the new drivers apparently clock it down to 550 (or below 600) when OCP kicks in, it's safe to say nVidia realizes they messed up.

Just because someone says the card is flawed (which it is) doesn't mean they are "ATI zealots". It's not the buyer of the 590 that should need to defend themselves because nVidia screwed up. This is like calling people PS3 fanboys when someone mentions the 360 had RROD issues.

We get that you need to justify your purchase and don't want to feel burned about your decision. The denial sounds just as bad. Just because your card might not die doesn't mean everyone else that ran into the issue screwed up. That's just ridiculous.


----------



## kometata

Hi all,
I will post my proof pictures in a few minutes too.

I have a question and will be very thankful if someone could help. I have two 590s and want to test them (or just one) under Linux now. There is no official driver yet. If this were a "normal" case it would be OK, but after so many posts about damaged cards and bangs even at stock clocks, I've started to worry.

What do you think? Is it a good idea to test the card with the current drivers, or should I wait? Does anybody have some experience under Linux?


----------



## Pedros

... errr ... you bought 2 590's to use under Linux ? ...

Just to try to understand ... you are joking right?


----------



## kometata

Quote:


> Originally Posted by *Pedros;12918300*
> ... errr ... you bought 2 590's to use under Linux ? ...
> 
> Just to try to understand ... you are joking right?


Not only Linux
I bought them for CUDA applications and calculations


----------



## Alatar

Quote:


> Originally Posted by *kometata;12918452*
> Not only Linux
> I bought them for CUDA applications and calculations


I'd say it doesn't hurt to try... only a few people on these forums really have Linux experience, and I doubt anyone here has experience running a system with 4 brand-new GPUs under Linux.

E: you shouldn't be frying cards just booting and such; if that happens, the card would probably have fried itself anyway.


----------



## Juzam

Quote:


> Originally Posted by *Horsemama1956;12918116*
> Considering the new drivers apparently clock it down to 550(or below 600) when OCP kicks in, it's safe to say nVidia realizes they messed up.
> 
> Just because someone says the card is flawed(which it is) doesn't mean they are "ati zealots". It's not the buyer of the 590 that should need to defend themselves because nVidia screwed up. This is like calling people PS3 fanboys when someone mentions the 360 had RROD issues.
> 
> We get you need to justify your purchase and don't want to feel burned about your decision. The denial sounds just as bad. Just because your card might not die, doesn't mean everyone else that ran into the issue at hand screwed up. That's just ridiculous.


I wasn't talking about you and since you've jumped right on the topic like a carp (it's a Turkish expression), you're on your way to becoming (if you're not already) a fanboy. Gratz.

Yeah nvidia messed up really bad. What is your degree in electronics, sir? I really want to read your academic reports on semiconductors.

I'm playing a whole variety of games and I've never seen the GPU clock go lower than stock. So that argument is unproven.

The card is perfectly operational at stock voltages and speeds so it is not flawed. You on the other hand are, since no one except me has given any attention to you (guess what your flaw is). I can't see anyone defending themselves so I don't understand your point (if there is any). If by mentioning you mean: "LAWLS TEH CARD BURNZ GET ATI PWNZOR!1!!11" yes, they are blind zealots. If anyone says "TEH XBOXZ RRODS SUX GET PS3 NOW!1!1!!11" yes, he is a PS3 fanboy, oh or didn't you know?

I get that you need to justify YOUR purchase and to prove that you're "The Uberskilled PC God" by attacking threads, so you don't feel like the underachieving ignoramus you are, but please, get a life. Everyone that ran into the issue screwed up at some point. That's why I mentioned exactly what they did before and during the screw-up. You're just ridiculous.


----------



## kometata

Quote:


> Originally Posted by *Alatar;12918569*
> I'd say it doesn't hurt to try... only a few people on these forums really have Linux experience, and I doubt anyone here has experience running a system with 4 brand-new GPUs under Linux.
> 
> E: you shouldn't be frying cards just booting and such; if that happens, the card would probably have fried itself anyway.


Thanks Alatar, I am going to try both OSes!


----------



## armartins

Quote:


> Originally Posted by *Juzam;12918727*
> I wasn't talking about you and since you've jumped right on the topic like a carp (it's a Turkish expression), you're on your way to becoming (if you're not already) a fanboy. Gratz.
> 
> Yeah nvidia messed up really bad. *(1) What is your degree in electronics, sir*? I really want to read your academic reports on semiconductors.
> 
> *(2) I'm playing a whole variety of games and I've never seen the GPU clock go lower than stock*. So that argument is unproven.
> 
> The card *(3) is perfectly operational at stock voltages and speeds* so it is not flawed. You on the other hand are, since no one except me has given any attention to you (guess what your flaw is). I can't see anyone defending themselves so I don't understand your point (if there is any). If by mentioning you mean: "LAWLS TEH CARD BURNZ GET ATI PWNZOR!1!!11" yes, they are blind zealots. If anyone says "TEH XBOXZ RRODS SUX GET PS3 NOW!1!1!!11" yes, he is a PS3 fanboy, oh or didn't you know?
> 
> I get that you need to justify YOUR purchase and try to prove that you're "The Uberskilled PC God" by attacking threads and don't feel like the underachieved ignorant you are, but please, get a life. *(4) Everyone that has ran into the issue screwed up at some point*. That's why I mentioned what they exactly did before and during the screw-up. You're just ridiculous.


(1) What's yours?
(2) Nvidia already confirmed that the drivers are NOT letting software detect when the card underclocks (look for yourself if you don't believe it)
(3) You said it yourself: "at *stock voltages AND speeds*"
(4) Wait, what's the name of this forum? Are we at stockclock.com? Man, open your eyes. The GTX 590 is a great card for someone who will just plug and play; for that "normal" user (stock cooling/clocks) it'll be much better than the 6990 (IMHO) due to acoustics, but there is no tweaking room. That's what all the ATI fanboys are bragging about, and believe it or not, it's TRUE. Why?

...you will ask. Well, I won't waste my time looking for sources, but from what I've read the reality so far is: the VRMs on the 590's PCB are the GTX 570's VRMs x2. People complain about the overclocking room on the GTX 570; these VRMs are fine for a 570 at its default speeds, or for a 590 (1024 CUDA cores) at stock, but there is little room for tweaking. And *if the card is SILENT and there is LITTLE room, or rather little GAIN, from OC, why go water?* And look around: you'll find many guys showing that even when it's stable at high clocks, the results don't scale accordingly (underclocking backfiring, hidden from the user). That is typical of Nvidia when they screw up. On the other hand, the 6990 has a robust design for two chips that consume less power, the OC results are scaling very well, and when the waterblocks come around we'll see their real potential (what scares me is the possible coil whine; that would be a no-go for me). Nvidia has very good products; I can even defend 580 SLI vs. 6990 + 6970 (trifire for $1000) in some scenarios. The 580 is still Nvidia's best call, IMHO. I won't even touch the point where people hit framebuffer walls with 580s or the 590. Call me whatever you want.









Edit: typos


----------



## rush2049

I can confirm myself that when the card throttles it does not lower the clock speeds.....

though on the other hand, the power management that does lower clock speed is rather aggressive as well (I really want nvidia to give me a way to turn that off)


----------



## soilentblue

I haven't seen anybody's card go lower than stock clocks. Not sure where that rumor came from, but no doubt it was from someone who doesn't own the card.

The OCP is flawed currently; that's the real issue. When nVidia fixes it, the card should begin to work a lot better. I've seen over a handful of cards run at 1.0 V and not blow up. The OCP is just downclocking them at points where it shouldn't. Wait for driver maturity and/or a possible BIOS flash from nVidia to fix the issue.


----------



## Juzam

Quote:


> Originally Posted by *armartins;12919331*
> (1) What's yours?
> (2) Nvidia already confirmed that the drivers are NOT letting software detect when the card underclocks (look for yourself if you don't believe it)
> (3) You said it yourself: "at *stock voltages AND speeds*"
> (4) Wait, what's the name of this forum? Are we at stockclock.com? Man, open your eyes. The GTX 590 is a great card for someone who will just plug and play; for that "normal" user (stock cooling/clocks) it'll be much better than the 6990 (IMHO) due to acoustics, but there is no tweaking room. That's what all the ATI fanboys are bragging about, and believe it or not, it's TRUE. Why?
> 
> ...you will ask. Well, I won't waste my time looking for sources, but from what I've read the reality so far is: the VRMs on the 590's PCB are the GTX 570's VRMs x2. People complain about the overclocking room on the GTX 570; these VRMs are fine for a 570 at its default speeds, or for a 590 (1024 CUDA cores) at stock, but there is little room for tweaking. And *if the card is SILENT and there is LITTLE room, or rather little GAIN, from OC, why go water?* And look around: you'll find many guys showing that even when it's stable at high clocks, the results don't scale accordingly (underclocking backfiring, hidden from the user). That is typical of Nvidia when they screw up. On the other hand, the 6990 has a robust design for two chips that consume less power, the OC results are scaling very well, and when the waterblocks come around we'll see their real potential (what scares me is the possible coil whine; that would be a no-go for me). Nvidia has very good products; I can even defend 580 SLI vs. 6990 + 6970 (trifire for $1000) in some scenarios. The 580 is still Nvidia's best call, IMHO. I won't even touch the point where people hit framebuffer walls with 580s or the 590. Call me whatever you want.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: typos


1- Lol. Did you see me trashing any card, ATI or nvidia? Please don't waste your precious calories on useless queries.

2- Link please.

3- Huh?

4- I think the name of this forum is doing fine right now. How are YOU?







Oh, and please point me in the direction of the rule that says: "Thou shalt not use this forum for discussing components at stock speeds." Oh... now you're into corporate espionage, I see... Bless you, sir! You truly know every strategy eeeeeevil nvidia employs upon poor unsuspecting customers







Now you're saying nvidia has good products? What kind of a conflict is this? Are you high? I call thee: Lost & Confused...

I hope to god that this thread does not receive any more garbage...


----------



## Twilex

I think the drama needs to **** and go to a different thread. Let's keep the discussion to useful info


----------



## RagingCain

Quote:


> Originally Posted by *Twilex;12919866*
> I think the drama needs to **** and go to a diff thread. Lets keep the discussion to useful info


Agreed. You don't see me talking smack about my benchmarks vs. 6970 CFX, 6990, 590, 570 SLI (hell, quite a few 580 SLI scores too) with my super-duper VRMs.

Get out of the thread if you have nothing to add: facts, ownership, benchmarks, or a clue <- actually, if you don't have a clue and want to ask questions pertaining to the 590, that's fine with me. The rest of you are just filling the club with useless posts.

Yes yes, our card is fail. Now leave. Trash talk somewhere else before I break my report-button cherry.

P.S. Juggalo, I hate you. I decided to add an RX360 rad, some Noiseblockers to my EK140, and switch out some compression fittings for 45-degree ones for a cleaner install. It was supposed to be just 2x GPUs out and 2x GPUs in.... damn sonar techs.


----------



## armartins

Quote:


> Originally Posted by *Juzam;12919785*
> 1- Lol. Did you see me trashing any card ati or nvidia? Please don't waste your precious calories on useless queries.
> 
> 2- Link please.
> 
> 3- Huh?
> 
> 4- I think the name of this forum is doing fine right now. How are YOU?
> 
> 
> 
> 
> 
> 
> 
> Oh and please point me in the direction of the rule that says: "Thou shall not use this forum for discussing components at stock speeds" Oh... now into corporate espionage I see... Bless you sir! You truly know every strategy eeeeeevil nvidia employs upon poor unsuspecting customers
> 
> 
> 
> 
> 
> 
> 
> Now you're saying nvidia has good products? What kind of a conflict is this? Are you high? I call thee: Lost & Confused...
> 
> I hope to god that this thread does not receive any more garbage...


Juzam, if you'd like to continue this discussion in this tone, I'll just vanish with my "garbage". I'm far from lost & confused; there is no conflict at all. I can just SEE the situation UNBIASED. The example I gave is simple: trifire will have better performance than 580 SLI in most cases, but when the drivers don't work for a game/app, 580 SLI will destroy the buggy trifire. And what if that happens with exactly the game/app you play/use most? 580 SLI is consistent and rock solid, while trifire has issues.

And

1 - This definitely isn't my personal war, but there's no need to be an engineer to understand what common sense can address. They didn't put 6+2 VRM phases per GPU because that would allow insanely high consumption (remember, 2x 580 = 2x 6-pin + 2x 8-pin), it would cost more, and it would create undesired competition for 580 SLI (market segmentation, my friend). The other side of the coin is that it wasn't enough to beat the 6990 in performance, even with NVIDIA claiming the king-of-the-hill crown. That's what I meant when I said this is typical of Nvidia. You're a user with your own thoughts and opinions, so you aren't misled. But imagine someone who just wants the fastest video card (out of the box I believe the 590 is the better solution, due to acoustics) and sees that claim. Marketing: Nvidia was always good at it, and this time it's misleading. Some sites consider an "avg of x% faster/slower than the competition"; an average over mixed resolutions says nothing (another example).

I'll quote one post from another thread (already closed because of this endless, stupid fanboy war). This post expresses my feelings precisely.
Quote:


> rx7racer
> IMO it's not only the drivers' fault but also the hardware's.
> They went with a cheap VRM solution to avoid taking up too much space on the PCB; Nvidia wanted it as small as possible.
> 
> We know what voltages the GF110 can handle, with the 580 out in the wild so long. We also see the effects of fewer phases with the GTX 570.
> 
> The GF110 can suck massive amounts of power, therefore NV has to limit it through software, which is where the drivers come in, because of the hardware. Hardware-wise, the 590's VRMs will try to pump what the GF110 is asking for, but they just can't handle it.
> 
> IMO, if the engineers had made the PCB a tad longer and kept a 6+2 phase design for each core and memory, plus one phase for the NF200 bridge chip, the GTX 590 could have been the champ all around.
> 
> Can't wait to see some custom PCB designs that incorporate a more powerful VRM setup.
> 
> Anyone that says it's completely a driver issue is a moron. The only reason the driver is an issue is because the hardware is the issue. But to stay in spec they had to hinder the hardware in some way, shape or form.
> 
> I really do wanna buy one just to play with, but it doesn't make much sense with them being so limited, IMO; even us users with GTX 480s can SLI and hit/beat the performance of a GTX 590.


----------



## Pedros

Arma... get your facts straight, man ... they didn't put more phases in because ... there wasn't any space left!!!!

They had a plan and they stuck to it: to make a card with the PCB size of a single-GPU card ...

As for tri-fire vs. SLI ... you just keep comparing 3 GPUs against 2 ... no matter how much cheaper they are ... 3 physical GPUs have higher computing power ... it's all about bandwidth! Just by using 3 PCIe slots you get more bandwidth ... so each GPU won't be 100% stressed ...

And... I think AMD's driver problems are, like, 30% problems, 70% hype ...

Nvidia has driver problems too ... but everyone just forgets about those ...
I know that in the past AMD did mess up big time on the drivers... but AFAIK these last releases are good... and if you do a clean install they will work well.


----------



## delavan

Quote:


> The VRMs on the 590's PCB are the GTX 570's VRMs x2.


No.

The 570 has 4 VRM phases.
The 580 has 6.
The 590 has 5 per GPU...

We're talking GPU VRM phases here, because the memory has some as well...

And as for the DRAMA... it blows... stop!!!!!!!!!
The HD 6990s are apparently squealing like piglets and we don't hear about it...
This thread should be fanboi-free and be educative/investigative/informative!!!


----------



## RagingCain

Quote:


> Originally Posted by *rush2049;12919366*
> I can confirm myself that when the card throttles it does not lower the clock speeds.....
> 
> though on the other hand, the power management that does lower clock speed is rather aggressive as well (I really want nvidia to give me a way to turn that off)


What is it throttling, if not clock speeds?

SLI GPU Usage?


----------



## Juzam

Quote:


> Originally Posted by *armartins;12920320*
> Juzam if you would like to proceed this discussion in this tone I'll just vanish with my "garbage".


Yes. Please, go.


----------



## OverK1LL

so is this thread neutral again?


----------



## soilentblue

For the watercooling group: I just got an email from Koolance on the differences between their 6990 block and their GTX 590 block....
Quote:


> Our GTX 590 block is due on Monday. Unlike on the AMD 6990 koolance block we will have water pass over the VRM, but the internal structure is not yet known. Thanks for your inquiry.


Interesting. I wonder which block cools the VRMs better, EK or Koolance?


----------



## soilentblue

Quote:


> Originally Posted by *RagingCain;12920752*
> What is it throttling, if not clock speeds?
> 
> SLI GPU Usage?


If he worded that correctly, then it must be clocking the volts back to stock. If he worded it incorrectly, then he must mean that it's not going lower than stock clock speeds.


----------



## delavan

Folks!

Interesting article from the TechReport... really worth reading...
http://techreport.com/discussions.x/20677

Debunks a few rumors about the beast!

Quote from the techreport:
Quote:


> All of which leads us back to exactly where we started, with no evidence of basic problems in the GTX 590's operation beyond, you know, the initial exploding drivers. Heh. We do have some evidence of additional, sloppily made insinuations of problems, which I suppose shouldn't be too surprising.


----------



## saulin

Quote:


> Originally Posted by *delavan;12921370*
> Folks!
> 
> Interesting article from the TEchreport...really worth reading...
> http://techreport.com/discussions.x/20677
> 
> Debunks a few rumors about the beast!
> 
> Quote from the techreport:


Good article. Nvidia/GTX 590 haters should read it


----------



## soilentblue

very good article.


----------



## Pedros

Sorry, but ... I mean ... this is hilarious ... I'm sorry, but I just CAN'T understand how you people get so ... I don't know ... maybe forgiving ... when we are talking about a 700-buck card...

From the TechReport piece, we 590 users are basically dependent on a driver ...
No one just stops for a second and analyses this situation from the perspective that ... the freaking card is not built as it should be... it can't handle the power ... so we use a driver that limits that power to hide the faulty design ...

We are not talking about a 570, a mid-market card...
We are talking about a card that was going to be sold for way more, but a last-minute price change due to competition made it cheaper ... and it's still expensive









Seriously ... I just can't understand how people get so "I'm OK with that" about this ...


----------



## delavan

Quote:


> Seriously ... this is something that i can't understand how people get so " i'm ok with that " ...


I know what you mean... are you planning to send it back? I don't think a newer driver will solve everything, to be honest... I don't think the card will really get better than it is... I'm giving it a few more days...

I was testing in-game with the EVGA Precision overlay, to monitor temps, GPU utilization and GPU clocks...

Interesting... I used maxfps_200 in COD Black Ops...

With Vsync OFF, I'm getting clock speeds of 630 MHz; sometimes one core throttles to 554 MHz (at around 74°C)...

With Vsync ON, at 60 FPS, I get lower temps (around 60°C) and I see GPU core clocks at 405 MHz and 554 MHz... sometimes the max clock of 630 MHz...

Strange... why is it clocking down when temps (and demand) are already lower... to get even lower temps?
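If anyone wants to do this kind of check from a log instead of eyeballing the overlay, here's a rough sketch that tallies how many samples the card spent at each core clock. The column names and the log excerpt are made up for illustration; real EVGA Precision / GPU-Z logs use their own headers, so adapt accordingly:

```python
import csv
import io
from collections import Counter

# Hypothetical monitoring-log excerpt: timestamp, core clock (MHz), GPU temp
# (C). Real Precision/GPU-Z logs use different column names -- adjust to match.
SAMPLE_LOG = """timestamp,core_clock_mhz,gpu_temp_c
00:00:01,630,74
00:00:02,554,73
00:00:03,630,74
00:00:04,405,60
00:00:05,554,61
"""

def clock_histogram(log_text):
    """Count how many samples the GPU spent at each core clock state."""
    reader = csv.DictReader(io.StringIO(log_text))
    return Counter(int(row["core_clock_mhz"]) for row in reader)

hist = clock_histogram(SAMPLE_LOG)
for clock, samples in sorted(hist.items(), reverse=True):
    print(f"{clock} MHz: {samples} sample(s)")
```

With a real log you'd see at a glance how much time the card spends at 405/554 MHz versus the full clock, instead of trying to catch the dips live.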


----------



## Pedros

Delavan... yeah, I'm sending it back tomorrow ...
And I'm going for some sort of dual-card setup ... CF or SLI ... still haven't made a decision ... either 6970s or 570s ...

I won't get a dual-GPU card from this generation ... I had the ASUS for review, had this PoV to try out ... but ... thank you, but no thanks ... :x

I'm tired of all this stuff
Again ... if they had said this would be dual 570 chips, I'd have been OK with it ... though then the price should be lower ... but with all this stuff ... I'm really tired, frustrated and meh ...

I mean, today I was playing Bad Company 2 and my computer just turned off ... and I only started testing the card (this PoV) today ...

Instead of people saying that Nvidia did a bad job designing the circuitry of the board ... no ... they are saying that the drivers are amazing and that if it were not for the drivers the card would blow ... come on ... that's not a good way of enjoying a card or having fun with it

But this is just my own opinion and my own frustration ...


----------



## delavan

Quote:


> Instead of people saying that Nvidia did a bad job designing the board's circuitry ... no ... they're saying the drivers are amazing and that if it weren't for the drivers the card would blow ... come on ... that's no way to enjoy a card or have fun with it
> 
> But, this is just my own opinion and my own frustration ...


My thoughts exactly...it's hard to enjoy when you spend so much for such a sore spot...hard to be a proud owner of something like that...it's kind of false advertising, because you pay for max steady clocks...imagine higher ambient temps in summertime...the cards will slow to a crawl...


----------



## rush2049

For whoever asked about how the card can throttle when the clock speeds aren't lowering:

There are a number of ways. First, the voltage can be limited: under normal operation, the power a component draws has to rise to meet a heavier load, and one way to throttle a card is to not let the supply rise to meet that demand, forcing the card to operate at the same clock speed on less energy. This is very hard to measure, as I know of no way to measure the wattage of the video card alone and correlate it with voltage and clock speed at the same time.

The second way to throttle is to lower clock speed, but it isn't doing that....

The third is to shut off cores temporarily (I do not think they are doing this).

So to sum things up: the voltage regulation circuitry isn't as robust as the 580's. When overclocked to 580 speeds, the performance is about equal. If you clock it past that, the throttling occurs (clock speeds aren't lowered) and you get a low score in a benchmark.

On a completely different note: the power regulation (lowering clock speeds when using only 2D or sitting on your desktop) is somewhat aggressive, and you will see the clocks spike or dip at loading screens and sometimes in the middle of a game when you are at a less demanding section of a map. This is potentially problematic and I would like a way to turn this off....

Hope that was a good summary for people. I am hoping that upcoming drivers find a good balance in volt/power draw and performance. I will be holding onto my 590 waiting for that day, which shouldn't take too long.
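This kind of "invisible" throttling (voltage/current limited while the reported clock holds steady) shows up in a monitoring log as GPU usage dips at a constant core clock. A minimal sketch of how you might flag it, assuming (clock, usage) sample pairs parsed from a GPU-Z or EVGA Precision log export (the exact log format varies by tool):

```python
# Sketch: flag "invisible" throttling in a monitoring log -- samples
# where the core clock holds its set value but GPU usage dips well
# below full load. The (clock_mhz, usage_pct) pairs are assumed to
# come from a GPU-Z / EVGA Precision log export.

def find_throttle_samples(samples, expected_clock_mhz, usage_floor=90.0):
    """Return (index, clock, usage) for every sample that looks like
    throttling: clock steady at the set speed, usage dipping."""
    flagged = []
    for i, (clock, usage) in enumerate(samples):
        if usage < 5.0:          # loading screens / idle: ignore
            continue
        if abs(clock - expected_clock_mhz) < 5.0 and usage < usage_floor:
            flagged.append((i, clock, usage))
    return flagged

# Example: a run at 650 MHz with one mid-test dip to 55% usage.
log = [(650, 99), (650, 98), (650, 0), (650, 55), (650, 97)]
dips = find_throttle_samples(log, 650)   # -> [(3, 650, 55)]
```

The near-zero filter matters because, as noted later in the thread, drops to 0% are just loading screens and CPU tests, not throttling.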


----------



## delavan

Quote:


> On A completely different note: the power regulation (lowering clock speeds when using only 2D or siting on your desktop) is somewhat aggressive and you will see the clocks spike or dip at loading screens and sometimes in the middle of a game when you are at a less demanding section of a map. This is potentially problematic and I would like a way to turn this off....


As a side note (on the 3D part), I posted on EVGA's forum. They told me to go in the driver's 3D options (manage 3D settings)...

Look for POWER MANAGEMENT MODE: set it to "prefer maximum performance" instead of "adaptive"...it helps with my 3D clock throttling...also, raising the eye candy (read: AA) helps...

hope it helps


----------



## rush2049

I have that set to maximum performance..... doesn't seem to completely remove the adaptive behavior.


----------



## Abiosis

Quote:


> Originally Posted by *rush2049;12924038*
> I have that set to maximum performance..... doesn't seem to completely remove the adaptive behavior.


_Try "force alternate frame rendering 2" ... it's worky for me ~







_


----------



## soilentblue

has anyone actually been able to come close or equal to 580 sli in gaming?

also does the new msi afterburner 2.2.0 beta 2 that was released today change anything with the gtx 590?


----------



## CSHawkeye

A *Shameful Display* this card has been; mine just died on me!! I was playing Shogun 2 and all of a sudden the screen went blank and I heard a loud pop. Going to test all my other hardware, but so far I am back up and running just fine on a GTX 465.


----------



## soilentblue

Quote:


> Originally Posted by *CSHawkeye;12925002*
> A *Shameful Display* this card has been, mine just died on me!! Was playing Shogun 2 and all of a sudden the screen went blank and heard a loud pop. Going to test all my other hardware out but so far I am back up with a GTX 465 just fine.


can you give us some info hawkeye? brand, oc'd?,latest drivers, etc.

your psu seems more than capable.


----------



## CSHawkeye

It totally is. I was using the latest drivers; it was an EVGA GTX 590.


----------



## soilentblue

hmm, i wonder what caused it to die. fan failure maybe? stock volts shouldn't be enough to kill this card.

at least it's under warranty. still a hassle returning the card however. sorry to hear man.


----------



## soilentblue

surprised it died with the latest drivers. ocp seems overly aggressive with .91


----------



## JCPUser

Quote:


> Originally Posted by *CSHawkeye;12925104*
> It totally is, using the latest drivers it was an EVGA GTX 590.


May I ask whether both the volts and clocks were at stock when it blew?


----------



## reflex99

This card...

How could nvidia not test overclocking on these things? Were they under the assumption that people don't overclock them or something?


----------



## cowie

Quote:


> Originally Posted by *CSHawkeye;12925002*
> A *Shameful Display* this card has been, mine just died on me!! Was playing Shogun 2 and all of a sudden the screen went blank and heard a loud pop. Going to test all my other hardware out but so far I am back up with a GTX 465 just fine.


sorry to hear about your loss. details on clocks, volts and temps?


----------



## rush2049

CSHawkeye, was your card at stock everything?

Overclocking is officially not guaranteed on any video card.... yes, nvidia realizes people overclock, but they aren't going to sell a video card advertising 'it overclocks to XXX speeds'. If they did, they would just sell it at that speed in the first place....

Myself, I have not gotten to the performance of 580s in SLI yet. I am going to see how high I can get without raising my voltage, as I do not want to stress the regulators until the drivers improve.


----------



## jcde7ago

So, all of these reports about 590s blowing up/dying are a bit unnerving, especially since I managed to place an order for a Classified 590 on EVGA's site today for all of the 8 minutes that their stock lasted.

I am even making sure i am staying home to be there (with my dog, lol) to sign for this, but now i am extremely nervous and may even think about returning this.

I sold both my 480s here and shelled out a *decent* amount to get a 590, with the aim of having a little less power consumption, a little less heat in my case and a LOT less noise (I ran my 480 fans pretty high frequently as I did not really want to let them go past 80 celsius when gaming).

I am not really planning on overclocking as I have no real need for it right now while i am only running a single 1920x1200 monitor, so running it stock doesn't bother me.

But if these cards are running into issues when run at STOCK, then there is a serious, serious problem here.

I might leave my 590 unopened through the weekend just to see if there are anymore developments on this - i am assuming running stock is perfectly fine and shouldn't cause problems, but if there are underlying issues I may return and wait for a revision or something of the sort.

DAMMIT NVIDIA GET YOUR ACT TOGETHER PLEASE.


----------



## H3XUS

Lol, I'm not paying $700+ for this overpriced junk. I'm sticking with my 560ti for now, and when Q4 2011 comes around, nvidia is releasing their GTX 600 series with 28nm Kepler chips to replace Fermi. Kepler runs 3-4 times faster than Fermi, and is much, much cooler and more power efficient, while maintaining a high transistor count.

Have fun with your overpriced card that doesn't work properly.

Heed this advice. Return your cards if still possible, stick with what you have/buy another of the same for a cheaper price, and SLI... make it last another 4 months.

It won't be long until it's Kepler time.


----------



## reflex99

Quote:


> Originally Posted by *H3XUS;12926409*
> Lol, I'm not paying $700+ for this overpriced junk. I'm sticking with my 560ti for now, and when Q4 2011 comes around, nvidia is releasing their GTX 600 series with 28nm Kepler chips to replace Fermi. Kepler runs 3-4 times faster than Fermi, and is much, much cooler and more power efficient, while maintaining a high transistor count.
> 
> Have fun with your overpriced card that doesn't work properly.
> 
> Heed this advice. Return your cards if still possible, stick with what you have/buy another of the same for a cheaper price, and SLI... make it last another 4 months.
> 
> It won't be long until it's Kepler time.


by that logic, when Kepler comes out, you should wait for whatever is next


----------



## jellis142

Quote:


> Originally Posted by *H3XUS;12926409*
> Lol, I'm not paying $700+ for this overpriced junk. I'm sticking with my 560ti for now, and when Q4 2011 comes around, nvidia is releasing their GTX 600 series with 28nm Kepler chips to replace Fermi. Kepler runs *3-4 times faster than Fermi*, and is much, much cooler and more power efficient, while maintaining a high transistor count.
> 
> Have fun with your overpriced card that doesn't work properly.
> 
> Heed this advice. Return your cards if still possible, stick with what you have/buy another of the same for a cheaper price, and SLI... make it last another 4 months.
> 
> It won't be long until it's Kepler time.


Kepler won't literally be 3-4 times faster; it will be 3-4 times faster PER WATT. I suspect it will be like Ivy Bridge: a die shrink with small performance increases. But I think they're going for lower power needs and cooler temps after Fermi, which ran hot.


----------



## H3XUS

Quote:


> Originally Posted by *reflex99;12926428*
> by that logic, when keplar comes out, you should wait for whatever is next


No, because Maxwell won't be released until at least 2013.

*Ivy Bridge is quite literally a smaller Sandy Bridge. Later on, when the 2012 socket is released, the processor will be utilized more efficiently... but then we're getting into another socket, another chipset, etc.


----------



## reflex99

do you work for NV?

Are you a time traveler?

Some sort of Oracle?

Maybe a Wizard?

How do you know all of this for certain?


----------



## soilentblue

lol, he doesn't. he's guessing like everyone else. even if nvidia said it would be 50% faster, you'd have to take that with a grain of salt. they always overreach on their projections, in both camps.


----------



## H3XUS

It's a little something called going to Nvidia, AMD, and Intel conferences, as well as doing your damn research.

http://www.zdnet.com/blog/computers/nvidias-life-after-fermi-kepler-in-2011-maxwell-in-2013/3952

http://www.tcmagazine.com/tcm/news/hardware/30538/nvidia-fermi-successor-called-kepler-coming-next-year

ITS NOT HARD GUYS. C'MON. Use your heads. This is even info from last year that explains it.

It's a waste of money buying the 590 or the 6990 when Kepler will have two years until Maxwell.


----------



## reflex99

so it will release 100% on time, giving exactly the performance they say it will... just like Fermi did?









Also:
Quote:


> While Nvidia showed a new openness by revealing the road map, some critics are dubious about the performance estimates, especially given many of the problems that Fermi faced as it was readied for launch. In particular, noted Nvidia skeptic Charlie Demerjian isn't as hopeful about the numbers, believing that most of the promised gains are the result of Moore's law (the amount of transistors that can fit on a integrated circuit doubles every two years - or 18 months, depending on whom you ask), rather than radical new technology.




If we went purely by roadmaps, the HD 6xxx series would be on a 28nm process, and Fermi would have released in November of 2009


----------



## H3XUS

Quote:


> Originally Posted by *reflex99;12926602*
> so it will release 100% on time, giving exactly the performance they say it will, just like Fermi did


This can't be promised, but Nvidia has gotten the worst rap in a long time over Fermi. They know that if they make the same mistake, it's downhill from here. Rest assured they'll be doing more testing on Kepler than the "testing" on Fermi.

AMD's HD 7000 series (Southern Islands) will also be using a 28nm GPU. They never say much about details, though. Once again, they talk very little about Southern Islands, then quickly lead into talk about Bulldozer.









I'm curious to see how it will all unfold.. but until then, I'm not dropping $700+ on either nvidia or amd's new card.


----------



## soilentblue

back on topic.

http://www.frozencpu.com/products/12733/ex-blc-884/EK_GeForce_590_GTX_VGA_Liquid_Cooling_Block_-_Nickel_EK-FC590_GTX_-_Nickel_Pre-Order_Item.html

looks like ek blocks for the 590 are back in stock


----------



## Juzam

Quote:


> Originally Posted by *jcde7ago;12926077*
> So, all of these reports about 590s blowing up/dying are a bit unnerving, especially since I managed to place an order for a Classified 590 on EVGA's site today for all of the 8 minutes that their stock lasted.
> 
> I am even making sure i am staying home to be there (with my dog, lol) to sign for this, but now i am extremely nervous and may even think about returning this.
> 
> I sold both my 480s here and shelled out a *decent* amount to get a 590, with the aim of having a little less power consumption, a little less heat in my case and a LOT less noise (I ran my 480 fans pretty high frequently as I did not really want to let them go past 80 celsius when gaming).
> 
> I am not really planning on overclocking as I have no real need for it right now while i am only running a single 1920x1200 monitor, so running it stock doesn't bother me.
> 
> But if these cards are running into issues when run at STOCK, then there is a serious, serious problem here.
> 
> I might leave my 590 unopened through the weekend just to see if there are anymore developments on this - i am assuming running stock is perfectly fine and shouldn't cause problems, but if there are underlying issues I may return and wait for a revision or something of the sort.
> 
> DAMMIT NVIDIA GET YOUR ACT TOGETHER PLEASE.


A friend of mine also did the same thing you're doing (selling his 480s and getting a 590). Well, he gets 1800 points less than he got with the 480s (which were factory overclocked, by the way) in 3DMark11, but he said he didn't notice any framerate drop in games. Overall he's satisfied and 100% sure he made a good investment. He does not overclock his card, so it's running perfectly fine, like mine. So like I said before, *nothing will happen to your card* if you run it at stock clocks and voltages, use the latest drivers, have a decent PSU, and your overall system stability is good.
Quote:


> Originally Posted by *H3XUS;12926409*
> Have fun with your overpriced card that doesn't work properly.


Yes. Like the GTX295 before it, I will never overclock my "overpriced junk" and continue to enjoy it for another 2 years while you stick your head underground.


----------



## RagingCain

Quote:


> Originally Posted by *H3XUS;12926689*
> This can't be promised, but Nvidia has gotten the worst rap in a long time over Fermi. *They know that if they make the same mistake, it's downhill from here. Rest assured they'll be doing more testing on Kepler than the "testing" on Fermi*.


^ That's what we call opinion.
Quote:


> Originally Posted by *H3XUS;12926562*
> It's a little something called going to Nvidia, AMD, and intel confrences, as well as doing your damn research.
> 
> urlz
> 
> ITS NOT HARD GUYS. C'MON. Use your heads. This is even info from last year that explains it.


^ Not really anything factual anywhere in the post or links.
Quote:


> AMD's HD 7000 series (southern islands) will also be using a 28nm gpu. They never say much about details though. Once again, they talk very little about southern islands, but then quickly lead in to talk about bulldozer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *I'm curious to see how it will all unfold.. but until then, I'm not dropping $700+ on either nvidia or amd's new card.*


^ But it sounds like you already know more than us, more than nVidia, more than anybody for that matter. What is there left for you to be curious about?
Quote:


> *It's a waste of money buying the 590 or the 6990 when Kepler will have two years until Maxwell.*


Quote:


> Originally Posted by *H3XUS;12926409*
> *Lol, I'm not paying $700+ for this overpriced junk.* I'm sticking with my 560ti for now, and when Q4 2011 comes around, nvidia is releasing their GTX 600 series with 28nm Kepler chips to replace Fermi. Kepler runs 3-4 times faster than Fermi, and is much, much cooler and more power efficient, while maintaining a high transistor count.
> 
> Have fun with your overpriced card that doesn't work properly.
> 
> Heed this advice. Return your cards if still possible, stick with what you have/buy another of the same for a cheaper price, and sli.. make it last another 4 months.
> 
> It won't be long until it's Kepler time.


So what exactly are you doing here? I am sure the lower-end 560 Ti has a fan club around here somewhere you can go to.
Quote:


> Originally Posted by *jcde7ago;12926077*
> So, all of these reports about 590s blowing up/dying are a bit unnerving, especially since I managed to place an order for a Classified 590 on EVGA's site today for all of the 8 minutes that their stock lasted.
> 
> I am even making sure i am staying home to be there (with my dog, lol) to sign for this, but now i am extremely nervous and may even think about returning this.
> 
> I sold both my 480s here and shelled out a *decent* amount to get a 590, with the aim of having a little less power consumption, a little less heat in my case and a LOT less noise (I ran my 480 fans pretty high frequently as I did not really want to let them go past 80 celsius when gaming).
> 
> I am not really planning on overclocking as I have no real need for it right now while i am only running a single 1920x1200 monitor, so running it stock doesn't bother me.
> 
> But if these cards are running into issues when run at STOCK, then there is a serious, serious problem here.
> 
> *I might leave my 590 unopened through the weekend just to see if there are anymore developments on this - i am assuming running stock is perfectly fine and shouldn't cause problems, but if there are underlying issues I may return and wait for a revision or something of the sort.*
> 
> DAMMIT NVIDIA GET YOUR ACT TOGETHER PLEASE.


Lifetime warranty from EVGA? What's to fear? RMAs? EVGA is very good about keeping their high-end customers happy, even if it means swapping the card for an equitable model in the event of a hardware-design SNAFU/catastrophe.

I am in a very similar boat, selling my GTX 580s, and I am not very worried. At the same time, with one GTX 590 I knew not to assume GTX 580 SLI performance, so one card would technically be a downgrade, but two cards would be quite the boost in performance.


----------



## Juzam

Quote:


> Originally Posted by *RagingCain;12927199*
> So what exactly are you doing here?


He's trying to get into my signature....and.... well it looks like he made it! YAY!


----------



## RagingCain

TiN over at KingpinCooling already has Quads under LN2.

http://kingpincooling.com/forum/showthread.php?t=1163
Quote:


> Originally Posted by *rush2049;12923669*
> For whoever asked about how the card can throttle when the clock speeds aren't lowering:
> 
> There are a number of ways, first the voltage can be limited. Under normal operation of a computer component the voltage needs to raise to meet a higher demand of power. One way a card can be throttled is to not raise the voltage to meet the demand and force the card to operate at that same clock speed with less energy. This is very hard to measure as I know of no way to measure the watt usage of a video card alone and correlate it with the voltage and clock speed at the same time.


I am not entirely sure I understand. The only times I have ever seen lower scores or minimal gains with a GPU overclock have been from CPU/memory/GPU OC instability (temps/voltages/etc.), power supplies, or benchmark compatibility issues. If you increase the clocks and something reduces the voltages (which can be monitored in GPU-Z), you would expect artifacts, crashes and BSODs, much like overclocking a CPU, not just lower benchmark scores. It's just not possible to sustain the clock speeds, drop the voltages (from invisible OCP), and keep stability.

In other words, something has to be holding back the performance of the card; that's where the throttling terminology comes from. Voltage drops without clock drops would cause instability, not poor benchmark performance. The only other thing that could throttle the card, performance-wise, would be GPU usage: forcing 75% max GPU usage as opposed to 99%.

I am very keen to investigate this as it looks more and more like a driver issue, and not an OCP issue.


----------



## rush2049

What I was getting at is that the equation for electrical power looks like this:

Power (watts) = Voltage (volts) * Current (amps)

When more watts are needed to run a heavier load, one of those two variables on the right needs to increase: either the volts or the amps. From what I am seeing on my card, the volts remain mostly constant, which is desirable. That means the amps are increasing to meet demand. My guess is that the amps supplied (or the voltage increase to meet demand) are being limited.

It is possible to keep the same clock speed without meeting the wattage demand, and the usual result is artifacts and BSODs; but if the parts are built to be used with low volts (undervolting), then they might be able to do fine....

This is somewhat speculation as I do not have access to the coding for nvidias proprietary hardware/software. I am just going to wait it out and see how stable they can make it with newer drivers then I will push my luck in the overclocking front.

edit: I am curious to see how that LN2 turns out.... I will be keeping tabs, thanks for posting.
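The relationship above is just P = V × I. A toy sketch of how an over-current cap limits delivered power even while voltage (and therefore the reported clock) stays fixed; the numbers are illustrative, not measured GTX 590 values:

```python
# Sketch: the electrical relationship behind OCP, P = V * I.
# All numbers below are made up for illustration, not measured
# GTX 590 figures.

def power_watts(volts, amps):
    """Electrical power: watts = volts * amps."""
    return volts * amps

def delivered_power(volts, demanded_amps, ocp_amp_limit):
    """If over-current protection caps the current the VRMs supply,
    the card receives less power than the load demands, even though
    voltage and clock speed stay constant."""
    return power_watts(volts, min(demanded_amps, ocp_amp_limit))

# Example: at 0.938 V a heavy load demands 400 A, but OCP caps it
# at 320 A, so delivered power falls short of demand.
demand = power_watts(0.938, 400)
supplied = delivered_power(0.938, 400, 320)
```

The gap between `demand` and `supplied` is what would show up as lost performance at an unchanged clock reading.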


----------



## JCPUser

Quote:


> Originally Posted by *RagingCain;12927303*
> 
> *snip*
> 
> I am very keen to investigate this as it looks more and more like a driver issue, and not an OCP issue.


As I have never heard of a driver that worsens as you increase the clock speed... the culprit can be found by doing a simple test.

1) Run a bench underclocked (550MHz) -- monitor GPU usage, clocks, and score
2) Run a bench at stock (607MHz) -- monitor GPU usage, clocks, and score
3) Run a bench OCed (625MHz) -- monitor GPU usage, clocks, and score
4) Keep increasing OC and volts (up to near 1V) -- monitor GPU usage, clocks, and score

If at step 1 the clocks and the GPU usage are bouncing around... then it is a driver issue. If the usage is 99% all the time in step 1 but starts bouncing around, or the score drops, in steps 2, 3, or 4, then it is throttling.

Done. Culprit identified (driver or throttling) with a short scientific study.
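The decision rule in steps 1-4 can be sketched as a small function. The data format (clock, minimum GPU usage, score per run) is my own assumption, fed from whatever monitoring tool you use:

```python
# Sketch of the driver-vs-throttling decision logic described above.
# Each run is an assumed (clock_mhz, min_usage_pct, score) tuple
# collected while benchmarking at that clock.

def classify(runs, usage_floor=90.0):
    """runs: list of (clock_mhz, min_usage_pct, score).
    If even the lowest clock can't hold full GPU usage, blame the
    driver; otherwise the first higher clock where usage drops marks
    where throttling (e.g. OCP) kicks in."""
    runs = sorted(runs)                      # order by clock
    base_clock, base_usage, _ = runs[0]
    if base_usage < usage_floor:
        return ("driver", base_clock)        # broken even underclocked
    for clock, usage, _ in runs[1:]:
        if usage < usage_floor:
            return ("throttling", clock)     # first clock that dips
    return ("clean", runs[-1][0])            # no issue observed
```

For example, runs at 553/613/650MHz where only the 650MHz run shows usage dips would classify as `("throttling", 650)`, matching the conclusion reached later in the thread.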


----------



## rush2049

Alright, let me run mine at 550 for you....

edit:
While thats running....

Does anyone know how fancy the OCP on these cards is? I am assuming the driver can control it.


----------



## rush2049

Ok, here is 553MHz vs 612MHz.

http://3dmark.com/compare/3dmv/3033474/3dmv/3027371

gpu utilization was constant >95% for both.
clock speed stayed constant at the setting for the duration of the tests.


----------



## JCPUser

Quote:


> Originally Posted by *rush2049;12927747*
> Ok here is 553 mhz vs 612 mhz.
> 
> http://3dmark.com/compare/3dmv/3033474/3dmv/3027371
> 
> gpu utilization was constant >95% for both.
> clock speed stayed constant at the setting for the duration of the tests.


Cool. Then I would say it is likely not the drivers.







Furthermore, OCP is doing nothing [significant] @ stock.

Now... I don't want to tell you how to use your rig (especially since OCing this card can be risky), but if you start OCing it and then see weird GPU usage or clock/score drops, you can determine approximately at what clock settings the card's OCP kicks in and throttles down.
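That search can be written down as a simple upward sweep. `set_core_clock` and `run_bench` below are placeholders, not real APIs; in practice you would change clocks by hand in Afterburner/Precision and read scores from 3DMark, so this is just the search logic:

```python
# Sketch: sweep clocks upward to bracket where OCP kicks in.
# set_core_clock and run_bench are hypothetical callables standing
# in for manual steps (set clock in your OC tool, run a benchmark,
# note the score).

def find_ocp_threshold(set_core_clock, run_bench,
                       start_mhz, stop_mhz, step_mhz=10):
    """Walk clocks upward; return the first clock whose score fails
    to beat the previous run (a throttled run scores no higher
    despite the higher clock). Returns None if no throttle is seen."""
    prev_score = None
    for clock in range(start_mhz, stop_mhz + 1, step_mhz):
        set_core_clock(clock)
        score = run_bench()
        if prev_score is not None and score <= prev_score:
            return clock
        prev_score = score
    return None
```

Scores are noisy run to run, so in practice you would average a few benches per clock step before trusting a single non-improving result.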


----------



## rush2049

Ok people, this is what I was telling you all... but you didn't believe me....

I clocked mine up to 650 mhz, but stock volts for a single run... then quickly put it back to stock....

Here are the scores of all three, from lowest on the left to highest clock on the right.










And here is the graphs for testing in Vantage while it is clocked at 650 mhz.










Notice the first two tests, all the dipping and what have you... that's throttling.... The drops to 0 are loading screens or CPU tests.... the bunch at the end are quick feature tests, so ignore them. The first two blocks are what to look at.


----------



## rush2049

Oh and here are the links to those three results:

@Under (553MHz): http://3dmark.com/3dmv/3033474
@Stock (613MHz): http://3dmark.com/3dmv/3027462
@Over (650MHz): http://3dmark.com/3dmv/3033502


----------



## JCPUser

Quote:


> Originally Posted by *rush2049;12927860*
> Oh and here are the links to those three results:
> 
> @Under (553 mhz): http://3dmark.com/3dmv/3033474
> @Stock (613 mhz): http://3dmark.com/3dmv/3027462
> @Over (650 mhz): http://3dmark.com/3dmv/3033502


Good Work. +Rep

It is not the drivers; the card throttles at 650MHz due to OCP. Of course, for different benchmarks or games the clock where it throttles will differ, but for all intents and purposes you have identified the culprit...

OCP.


----------



## rush2049

Thanks, hope this is proof enough for everyone.

I am playing the waiting game for new drivers, till then I will enjoy L4D2 with 32x AA and 120+ fps...... I ain't complaining....

Oh, and Alatar, change my clock in the first post for me. I ran at 650 for the one test and it was stable, but I don't feel comfortable running there all the time just yet....


----------



## CSHawkeye

Quote:


> Originally Posted by *JCPUser;12925720*
> May I ask whether both the volts and clocks were at stock when it blew?


They were. Very unhappy right now, so I guess you can make me a former member; EVGA is happily going to replace my video card, but then it's going on eBay. I am going to throw my dual 6970s back in.


----------



## Alatar

Quote:


> Originally Posted by *rush2049;12927976*
> Oh and Alatar change my clock in the first post for me, I ran at 650 for the one test and it was stable, but I don't feel comfortable running there all the time just yet....


done.

E: wait... what clocks do you want in the OP?







you're saying that you are running stock for now right?


----------



## Twilex

@rush: I see your X6 is only at 3.5GHz... could this be the bottleneck we're looking at here? We are essentially running SLI on this thing, and it's trying to get info from your CPU as fast as possible; the higher you clock your cards, the faster your processor needs to be. Just wondering if that could be the culprit before we blame the OCP. Might want to OC your CPU close to 4GHz and run the tests again.


----------



## JCPUser

From a "life long" AMD user, I have never 0% GPU in any situation outside of driver issues. But given the drivers work fine at 550MHz AND stock we have to rule those out...

I am not trying to start anything here, but it really seems like OCP.


----------



## rush2049

JCPUser, did you read what I wrote under that graph? The 0% utilization was the loading screens between tests..... it's the dips to 40-60% that are the throttling.

Twilex, I can try pushing my X6 further... but I seriously doubt that is holding me back. The reason I am at 3.5GHz is because that's how far I can run stable at stock voltage..... fairly impressive, if I do say so myself.


----------



## Twilex

Yeah, I'm thinking it's OCP, but it's still a possibility that needs to be ruled out. I haven't run Vantage in a while, so I'm not sure how CPU-intensive it is.


----------



## rush2049

well, before I go rebooting and doing all the Prime95 testing again.... let me run Vantage and see what my CPU utilization is like


----------



## RagingCain

Quote:


> Originally Posted by *rush2049;12929412*
> JCPUser, did you read what I wrote under that graph. The 0% utilization was the loading screens between tests..... its the dips to 60-40% that is the throttling.
> 
> Twilex, I can try pushing my x6 further... but I seriously doubt that is holding me back. The reason I am at 3.5 ghz is because thats how far I can run stable at stock voltage..... fairly impressive if I do say so myself.


I would say it was a good start, and I didn't mean to come off sounding like I didn't believe you. I just didn't see what you were saying.

A CPU bottleneck won't drop GPU usage down to 40%; at most you'd see maybe 80-85% at 3.5GHz versus 99% at 4.0GHz. I see the same type of thing in various video games on my current cards due to drivers.

This is what I was referring to as throttling: the GPU usage. There are plenty of causes of low Fermi GPU usage, OCP being one of them (but less prominent, as the issue also exists on cards without OCP). If there is 99% usage at stock over all tests, then I would say either the OC doesn't have enough volts, it's a driver problem, or it's a benchmark compatibility problem; I would only suspect OCP after all the others have been ruled out. I have done various gaming tests with OCP on and off, and GPU usage gets throttled in about 50% of my games; every new release, Crysis 2 included, has the issue until drivers come out to fix it.

Now that's a two-fold problem when identifying the cause: the drivers might actually be fixing things/profiles, or there are heavy OCP coding tweaks going on.

World of Warcraft and Homefront never went above 30%. It sucks, because I want a 120 FPS minimum at all times (monitor). That's entirely driver-related to me. When the 580s first came out, I had low GPU usage in everything except Unigine Heaven and 3DMark 11, which are not the most fun games to play.









@All
New Super Betas:
http://www.overclock.net/nvidia/978135-new-nvidia-drivers-270-51-beta.html

Direct Link to 270.51 Betas (x86):
http://www.nvidia.com/object/win7-winvista-32bit-270.51-beta-driver.html

Modded INF (x86) <- Use these only if your card goes undetected; the 590 should be supported.

Direct Link to 270.51 Betas (x64):
http://www.nvidia.com/object/win7-winvista-64bit-270.51-beta-driver.html

Modded INF (x64) <- Use these only if your card goes undetected; the 590 should be supported.


----------



## JCPUser

Quote:


> Originally Posted by *rush2049;12929412*
> JCPUser, did you read what I wrote under that graph. The 0% utilization was the loading screens between tests..... its the dips to 60-40% that is the throttling.
> 
> Twilex, I can try pushing my x6 further... but I seriously doubt that is holding me back. The reason I am at 3.5 ghz is because thats how far I can run stable at stock voltage..... fairly impressive if I do say so myself.


Sorry I missed that









I guess it could be the Phenom then -- that is a reasonable point. I really don't know about driver or benchmark compatibility issues, as the lower clocks are OK. It could need more voltage... I have seen OCs where, at the same clocks, increasing the voltage increased the score.

Anyway, I tried to help identify the problem with a simple test and added my 2 cents. But -- as I am just a measly 570 owner, I will leave you guys to your thread.









Cheers.


----------



## soilentblue

Quote:


> Originally Posted by *rush2049;12927584*
> What I was getting at is the equation looks like this for electricty:
> 
> Power (watts) = Voltage (volts) * Current (amps)
> 
> When more watts is needed to run a heavier load one of those two variables on the right need to increase. Either the volts or the amps. From what I am seeing on my card the volts remain mostly constant, which is desirable. That means the amp's are increasing to meet demand. Well I am guessing that the amps supplied or the volt increase to meet demand is being limited.
> 
> It is possible to keep the same clock speed without meeting the watts demand. And the result is artifacts and BSOD's, but if the parts are built to be used with low volts (underclocking) then they might be able to do fine....
> 
> This is somewhat speculation as I do not have access to the coding for nvidias proprietary hardware/software. I am just going to wait it out and see how stable they can make it with newer drivers then I will push my luck in the overclocking front.
> 
> edit: I am curious to see how that LN2 turns out.... I will be keeping tabs, thanks for posting.


OCP (over-current protection) controls the current, not the volts. Seems like the OCP is clamping current draw very aggressively.
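The relationship rush2049 and soilentblue are arguing about is just P = V × I. Here is a minimal sketch of why a current clamp caps performance at a fixed core voltage; the 0.913 V figure is the constant core voltage rush2049 reported, and the wattage/amperage numbers are purely hypothetical:

```python
# P = V * I: at a fixed core voltage, a heavier load can only be met
# with more current. If over-current protection (OCP) clamps current,
# the extra power simply isn't delivered and utilization drops.
# All load/limit numbers below are hypothetical.

def required_current(power_w, voltage_v):
    """Current (A) needed to deliver a given power at a fixed voltage."""
    return power_w / voltage_v

def delivered_power(voltage_v, current_a, ocp_limit_a):
    """Power actually delivered once OCP clamps the current."""
    return voltage_v * min(current_a, ocp_limit_a)

# GPU core held at ~0.913 V (the constant voltage rush2049 measured):
v_core = 0.913
demand = required_current(300.0, v_core)        # a 300 W load wants ~329 A
capped = delivered_power(v_core, demand, 300.0) # OCP cap leaves ~274 W
```

The point of the sketch: since the voltage rail is flat, the only knob OCP has is current, and clamping it shows up as the utilization dips in the graphs.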


----------



## rush2049

Here are two graphs made from the sensor readings during a Vantage run. These were done while the core clock was at 650 MHz.










So basically, no, the GPU isn't maxing my processor out... so increasing my CPU clock will not improve the score beyond what the increased CPU clock alone would do.

BUT something interesting happened! I monitored the gpu for this test and got the following graph:










That should look very familiar, but the interesting part is this:

http://3dmark.com/3dmv/3034243

So compare that with my last run at 650 mhz: http://3dmark.com/compare/3dmv/3033502/3dmv/3034243

...very interesting. So basically the OCP is messing with my card in a non-uniform fashion... and my score varies widely....


----------



## RagingCain

Quote:


> Originally Posted by *rush2049;12929910*
> Here is two graphs made from the sensor readings during a vantage run. These were done while the core clock was at 650 mhz.
> 
> So basically, no the gpu isn't maxing my processor out.... so increasing my cpu clock will not improve the score more than just the increased cpu clock would do.
> 
> BUT something interesting happened! I monitored the gpu for this test and got the following graph:
> 
> That should look very familiar, but the interesting part is this:
> 
> So compare that with my last run at 650 mhz:
> 
> ...very interesting, so basically the OCP is messing with my card in a non-uniform fashion.... and my score varies very widely....


Thx for being the guinea pig









If you look at your CPU core (not sure which one it's following) you are peaking at 100% (not sure why the graphs go up to 120% lol); this can lead to some throttling of the GPU, as it has to feed data to both GPUs. Perhaps try a temporary bump to 3.8~4.0 GHz and retest.

Regarding the GPU usage though, this looks/sounds more and more like voltage instability. I think we should investigate voltages & currents by running a test with the GPU-Z sensor tab open. Maybe it's even a combination of bottleneck, voltage regulation, and drivers. These scores are too low.


----------



## soilentblue

Rush, you're using the .91 drivers and not the new beta 270.51 drivers, correct? Debating whether or not to give the beta drivers a go tonight when my card arrives. Seems like there's a lot of SLI improvement, and they say they support the GTX 590.


----------



## saulin

Quote:


> Originally Posted by *RagingCain;12929968*
> Thx for being the guinea pig
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you look at your CPU core (not sure which one its following) you are peaking at 100%, this can lead to some throttling of the GPU, as it has to feed the data to both cards. Perhaps a temporary bump to 3.8~4.0 and retest.
> 
> Regarding the GPU usage though, this looks/sounds more and more like voltage instability. I think we should investigate voltages & currents running a test using the GPU-z sensor tab. Maybe even a combination of bottleneck, voltage regulation, and drivers. These scores are too low.


At 650MHz no extra voltage should be needed.

I don't get it though. Why are people so scared to overclock the card?

Use the latest drivers and set the core at 700MHz; most likely no extra voltage is needed. Set the fan at 100% if you want to be extra careful.

If it throttles you will know. From there go in 20MHz increments; if it's not stable, try adding a little voltage. I think 1.0v should be perfectly fine. Then watch if it throttles.

You will know the card throttles if your score does not increase or is lower. When this happens back off 10MHz or so, until you find your max overclock. I also suggest you try aggressive settings to make sure your CPU is not bottlenecking the card. If you are using Vantage, run the Extreme preset.
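saulin's stepping procedure above can be sketched as a simple loop. This is only an illustration of the logic, assuming a hypothetical `run_benchmark(core_mhz)` helper that returns a score (and a lower score once the card throttles); it is not a real overclocking tool:

```python
# Sketch of the manual procedure: step the core up 20 MHz at a time,
# and when the benchmark score stops improving (throttling), back
# off ~10 MHz from the failing clock. run_benchmark is a hypothetical
# callable supplied by whoever is doing the testing.

def find_max_stable_clock(run_benchmark, start_mhz=700, step=20, backoff=10):
    clock = start_mhz
    best = run_benchmark(clock)
    while True:
        score = run_benchmark(clock + step)
        if score <= best:              # score flat or lower: throttling
            return clock + step - backoff
        best, clock = score, clock + step
```

In practice each `run_benchmark` call is a full Vantage run plus a stability check, so this loop is done by hand over an afternoon rather than in code.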


----------



## rush2049

I would rather do testing without overclocking my CPU further than spend the time clocking it higher and testing until it's stable and I'm satisfied it won't cause problems itself.

I have already monitored the +12 V and +5 V rails from my mobo and I am at:
11.70 V with a tolerance of +/- 0.05 V
5.16 V constant

This was measured from idle through 30 minutes of FurMark... so yeah, I am confident my PSU is capable of holding stable voltages. There is a very heavy power draw in my case (GTX 590, Phenom II X6, 5x mechanical hard drives, Xonar D2), and my PSU is beefy enough to support it.

Voltage on the GPU is a constant 0.913 V no matter how hard it is being taxed.

The current is what the OCP is limiting, causing the % utilization to drop... when NVIDIA makes the drivers more stable that will improve, possibly when they let the card draw more power.
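For reference, those rail readings sit comfortably inside the ATX spec's ±5% tolerance, which supports the "it's not the PSU" conclusion. A quick check; the nominal/measured values are from the post above, and the tolerance is the standard ATX figure:

```python
# Sanity-check measured PSU rails against the ATX spec's +/-5%
# DC output tolerance. Measured values are rush2049's readings.

ATX_TOLERANCE = 0.05  # +/-5% per the ATX design guide

def rail_within_spec(nominal_v, measured_v, tol=ATX_TOLERANCE):
    """True if the measured rail is within tolerance of nominal."""
    return abs(measured_v - nominal_v) <= nominal_v * tol

rail_within_spec(12.0, 11.70)  # True: anywhere in 11.40-12.60 V passes
rail_within_spec(5.0, 5.16)    # True: anywhere in 4.75-5.25 V passes
```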


----------



## rush2049

Quote:


> Originally Posted by *saulin;12930116*
> At 650MHz no extra voltage should be needed.
> 
> I don't get it though. *Why are people so scared to overclock the card?*
> 
> Use the latest drivers, *set the core at 700MHz*; no voltage needed most likely. *Set the fan at 100%* if you want to be extra careful.
> 
> If it throttles you will know. From there go in 20MHz increments; if it's not stable, try adding a little voltage. I think 1.0v should be perfectly fine. Then watch if it throttles.
> 
> *You will know the card throttles if your score does not increase or is lower.* When this happens back off 10MHz or so, until you find your max overclock. I also suggest you try aggressive settings to make sure your CPU is not bottlenecking the card. If you are using Vantage then run the Extreme preset.


*Why are people so scared to overclock the card?* : Have you been reading the reports?

*set the core at 700MHz* : I tried it without increasing the voltage; no go... it just spits you back to the desktop and resets the driver back to default settings.

*Set the fan at 100%* : One of the first things I did was set up a manual profile so that the fan runs much more often... and that only gets me to the 30s at idle, which I wish was lower.

*You will know the card throttles if your score does not increase or is lower.* : Please read the whole thread before posting.


----------



## rush2049

Quote:


> Originally Posted by *soilentblue;12930075*
> rush your using the .91 drivers and not the new beta 270.51 drivers correct? debating on whether or not to give the beta drivers a go tonight when my card arrives. seems like there's alot of sli improvement and they say they support the gtx 590.


Yes, I am using the .91 drivers... on my slow internet connection it takes a while to download 200 MB drivers once or twice a day. Right after I am done typing this I am going to install 270.51 and test at stock again....

I'll let y'all know.


----------



## saulin

Quote:


> Originally Posted by *rush2049;12930230*
> *Why are people so scare to overclock the card?* : Have you been reading the reports?
> 
> *set the core at 700Mhz* : I tried it without increasing the voltage, no go..... it just spits you back to the desktop and recovers the driver back to default settings
> 
> *Set the fan at 100%* : One of the first things I did was setup a manual profile so that the fan runs much more often..... and that only gets me to 30's at idle, which I wish was lower.
> 
> *You will know the card throttles if your score does not increase or is lower.* : Please read the whole thread before posting


So I guess your card is a bad overclocker then; it needs more voltage. I would try 1.0v max. If you can't get 700MHz then I'd say it's a bad overclocker, dude. Will it do 690MHz?

Most reviews left it between 675-690MHz without added voltage.


----------



## rush2049

I refuse to push my card over 750 until the bugs at that speed are ironed out... I do not have the resources to fry a $700 card and wait for an RMA. I use the CUDA cores for programming and cannot afford the downtime.


----------



## RagingCain

Quote:


> Originally Posted by *rush2049;12930170*
> I am happier to do testing without overclocking my cpu further more than spending the time to clock it higher and test it to where it is stable and I am satisfied that it won't be causing problems itself.
> 
> I have already monitored the +12 and +5 volt rails from my mobo and I am at:
> 11.70 with a +/- of .05 volts
> 5.16 volts constant
> 
> This was measured from idle to 30 min with furmark..... so yeah, I am confident my PSU is capable of holding stable voltages. There is a very heavy draw of power in my case, (gtx 590, phenom II x6, 5x mechanical hard drives, xonar d2). And my PSU is beefy to support it.
> 
> Voltage on the gpu is a constant .9130 volts no matter how hard it is being taxed.
> 
> The current is what the OCP is limiting causing the %utilization to drop.... when nvidia makes the drivers more stable that will improve, possibly when they let the card draw more power.


Argh, 590s get here Thursday, Radiator gets here Friday, pieces parts get here between Thursday - Saturday. Gentle Typhoons/Noiseblockers should be here today sometime... If I don't get all this crap benchmarked by Sunday, I will literally go blind from anticipation.

Are you able to disable SLI on this card? Will it let you run on one GPU?

Meh, Quad-SLI is only 7k higher in Vantage than me...
http://3dmark.com/3dmv/3032848

3DMark 11 Extreme is only 300 points above me here:
http://3dmark.com/3dm11/925109

Now I am thinking I should have just gone Tri-SLI.


----------



## rush2049

RagingCain, I would say if waiting for parts forces you to hold off till Sunday to try them, then you get that whole extra week of better drivers to play with...


----------



## RagingCain

Quote:



Originally Posted by *rush2049*


RagingCain, I would say if waiting for parts forces you till sunday to try them then you have that whole extra week of better drivers to play with.....


Patience is not one of my skills.










I don't know why; it's not like the GTX 580s sold and I am destitute for pretty graphics.


----------



## saulin

Quote:


> Originally Posted by *RagingCain;12930302*
> Argh, 590s get here Thursday, Radiator gets here Friday, pieces parts get here between Thursday - Saturday. Gentle Typhoons/Noiseblockers should be here today sometime... If I don't get all this crap benchmarked by Sunday, I will literally go blind from anticipation.
> 
> Are you able to disable SLi on this card, will it let you run on one GPU?
> 
> Meh, Quad-SLI is only 7k higher Vantage than me..
> http://3dmark.com/3dmv/3032848
> 
> *3DMark 11 Extreme only has 300 points above me here
> http://3dmark.com/3dm11/925109*
> 
> Now I am thinking I should have just Tri-SLied.


That's low. This guy gets

http://3dmark.com/3dm11/901336

X5644 @ 630Mhz


----------



## soilentblue

Quote:



Originally Posted by *saulin*


That's low. This guy gets

http://3dmark.com/3dm11/901336

X5644 @ 630Mhz


It's mainly low because NVIDIA's drivers currently are not the best at DX11. AMD definitely has better DX11 drivers.


----------



## RagingCain

Quote:


> Originally Posted by *saulin;12930411*
> That's low. This guy gets
> 
> http://3dmark.com/3dm11/901336
> 
> X5644 @ 630Mhz


That's with 2x 590s, so 4 GPUs...

Not good.


----------



## soilentblue

http://www.kitguru.net/components/gr...ir-reputation/

So there's a possible BIOS flash in the future. I wonder what they have changed and how it will affect the card.


----------



## rush2049

There's some handy discussion about the voltage and current regulators going on over here:

http://semiaccurate.com/forums/showt...?t=4473&page=7

starting around the page I linked to. Seems the power circuitry is rated for what it ships at, but really runs over spec if any overclocking occurs. And electrical components like those aren't like a GPU/CPU; they are rated like fuses... go over spec and they pop....
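A rough way to see why overclocking pushes that power circuitry over spec: dynamic power scales roughly linearly with clock and with the square of voltage. A back-of-the-envelope sketch; the 365 W figure is the GTX 590's published TDP and 607 MHz its reference core clock, while the overclocked clock/voltage pair is hypothetical:

```python
# Back-of-the-envelope dynamic power scaling:
#   P ~= P0 * (f / f0) * (V / V0)**2
# Stock point: 365 W TDP at 607 MHz, ~0.913 V core.
# The 772 MHz / 1.0 V overclock below is a hypothetical example.

def scaled_power(p0_w, f0_mhz, v0, f_mhz, v):
    """Estimate board power after a clock/voltage change."""
    return p0_w * (f_mhz / f0_mhz) * (v / v0) ** 2

scaled_power(365, 607, 0.913, 772, 1.0)  # roughly 557 W
```

Even this crude estimate lands far above what VRMs sized for the stock 365 W were specified for, which is exactly the "rated like fuses" concern.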


----------



## Pedros

Excellent reading


----------



## rush2049

Quote:



Greg Brown:

Yeah, maybe we should change the title of the thread to;

"*The GTX590 Survivors Club*"


I liked it so much I cross forum quoted....


----------



## Alatar

Quote:



Originally Posted by *rush2049*


I liked it so much I cross forum quoted....


----------



## Pedros

Like I already said ... if NVIDIA had opted for the Volterra VRM components the 6990 uses ... this card would be so badass ...

It's one of those things ... it had everything to be perfect ... but human beings are eager for profit and make choices with that profit in mind ... so they turn "amazing pieces of hardware" into "amazing pieces of hot ware", if you know what I mean ...


----------



## RagingCain

I am going to be pissed if I find out modded reference or even non-reference cards get released in a few weeks/months that specifically beef up the voltage regulation and phases on this card, or a 6 GB card... GTX 595~599 for the absolute pants-off win?

Wait, I will just bench my cards till they die... and demand newer-revision replacements... EVGA for the win?

Official GTX 590 Walking Dead Club sounds catchy to me.

Sent from my DROID2


----------



## rush2049

There is a lot of talk going around about a recall... supposedly these new drivers clock down to 550 MHz by default, that is, if you don't manually set it like I did....

I wouldn't be opposed to a recall if I got a v2 board with better VRM parts... I wouldn't even mind if it was an inch longer or whatever... it already takes two of my hard drive bays, might as well expand to the very end of my case.


----------



## RagingCain

Quote:



Originally Posted by *rush2049*


I wouldn't even mind if it was an inch longer or whatever.... it already takes 2 of my hard drive bays, might as well expand to the very end of my case.


That's what she said









Sent from my DROID2


----------



## xsf

How are the results under water? Is anyone seeing whether the card still has the throttling/down-clocking problems? I haven't seen much, since only a small percentage of people own both a 590 and watercooling.


----------



## Pedros

Ahahahah ...
But a v2 can't be done "that fast" ...

I mean ... it's a lot of work! By the time they finish fabrication, Kepler will be almost around the corner.


----------



## armartins

Nice reading about the 590's VRM here: http://lab501.ro/placi-video/nvidia-...verclocking/12 (this page and the next; just use Google Translate). No time to post all my "new" findings about 590 issues... I'll post later if no one else does it before me.

P.S. Sorry Juzam, I'm back.


----------



## 2010rig

@rush2049 - just out of curiosity, I'm not sure if you mentioned it, what is your CPU clocked at?


----------



## RagingCain

Quote:



Originally Posted by *xsf*


How are the results under water? Any see if the card is still having the throttling/down-clock problems? I haven't seen much since it's a smaller percent of people owning both the 590+wc'ing









I or Juggalo will let you know Friday/Saturday or Sunday


----------



## saulin

Quote:



Originally Posted by *xsf*


How are the results under water? Any see if the card is still having the throttling/down-clock problems? I haven't seen much since it's a smaller percent of people owning both the 590+wc'ing









There are reviews of the EVGA GTX 590 Hydro Copper, and they were scared to overclock it lol.

Actually, here is a review. He overclocked it to 750MHz:

http://www.anandtech.com/show/4239/n...le-card-king/4


----------



## Juggalo23451

Quote:



Originally Posted by *RagingCain*


I or Juggalo will let you know Friday/Saturday or Sunday










I am waiting for Koolance to release their block for the 590. It looks like it is going to be a while.


----------



## soilentblue

Quote:



Originally Posted by *rush2049*


There is a lot of talk going around about a recall...... supposedly these new drivers by default clock down to 550 mhz, that is if you don't manually set it like I did....

I wouldn't be opposed to a recall if I got a v2 board with better VRM parts... I wouldn't even mind if it was an inch longer or whatever.... it already takes 2 of my hard drive bays, might as well expand to the very end of my case.



Quote:



Originally Posted by *Juggalo23451*


I am waiting for koolance to release there block for the 590. It looks like it is going to be awhile










Coming out next Monday. They said their waterblock has water running over the VRMs to help cool them, unlike their AMD 6990 waterblocks. No knowledge of their internal design as of yesterday though.


----------



## soilentblue

Quote:



Originally Posted by *saulin*


There are reviews of the EVGA GTX 590 Hydro Copper and they were scared to overclock it lol

Actually here is a review. He overclcoked it to 750Mhz

http://www.anandtech.com/show/4239/n...le-card-king/4


Unless I'm reading it wrong, they did not use the 590 HC. I don't see any reviews on any site for the 590 HC.


----------



## saulin

Quote:



Originally Posted by *soilentblue*


unless i'm reading it wrong they did not use the 590 hc. i don't see any reviews on any site for the 590 hc.


My bad, that seems to be the EVGA GTX 590 Classified.

Got it from here

http://www.evga.com/forums/tm.aspx?m=923582

I'll see if I can find a review for the HC version


----------



## Juggalo23451

Not sure if this has been seen yet
http://www.overclock.net/hardware-ne...ushed-out.html


----------



## OverK1LL

Quote:



Originally Posted by *saulin*


My bad, that seems to be the EVGA GTX 590 Classified.

Got it from here

http://www.evga.com/forums/tm.aspx?m=923582

I'll see if I can find a review for the HC version


Does anybody else here besides me have an HC?


----------



## soilentblue

Quote:



Originally Posted by *OverK1LL*


Does anybody else here besides me have an HC?


Masked does, but I haven't seen him post since the weekend.

Are you seeing OCP affect you just like rush, OverK1LL?


----------



## RagingCain

Quote:


> Originally Posted by *OverK1LL;12933229*
> Does anybody else here besides me have an HC?


Mines en route.


----------



## Draygonn

Quote:


> Originally Posted by *soilentblue;12933341*
> masked does but i haven't seen him post since the weekend.


Still waiting to see the results of his benches. Hopefully WC can keep the VRMs cooler or something.


----------



## saulin

This guy over at the EVGA forums says this about his card

http://www.evga.com/forums/tm.aspx?m=934396
Quote:


> I am using the new MSI Afterburner Beta which seems to be the only OC utility that allows voltage control on the 590 right now. Here are my results.
> 
> Core Voltage (.925 stock): 1.000 V
> Core Clock (607 ref/630 classy): 775
> Shader Clock: (1200 ref/1260 classy): 1550 MHz
> Memory Clock: (3414 ref/3456 classy): 4000 MHz
> Temps: 34c idle and 42c under full load.


Now can you guys confirm whether these cards can do this without throttling under water? He has the GTX 590 HC.


----------



## soilentblue

I saw that earlier, but I think he's using the old drivers and not the new .91 drivers.


----------



## Ghoxt

I'm so bummed about this card. I was going to join, but now I know I cannot "clock" it as advertised by ASUS, the brand I bought. Thank God I have all the packaging, original box, etc.

For $700.00 I'm returning it to the seller, in this case Newegg; false advertising, etc. Lemon law.


----------



## Abiosis

_Anyone tried the latest NVIDIA 270.51 BETA drivers yet?

Is there any difference?

I'm in a hurry to get to class... will test it out later tonight~_


----------



## AuDioFreaK39

Quote:


> Originally Posted by *Abiosis;12936201*
> _Anyone tried the latest nVidia drivers 270.51 BETA yet?
> 
> is there any different?
> 
> I'm in hurry to my class... will test it out later tonight~_


Ever since I got back to school this week, I haven't even had a chance to setup my rig or test out my new EVGA GTX 590 Quad-SLI setup.







Hopefully I get some free time this weekend.


----------



## rush2049

Quote:


> Originally Posted by *2010rig;12932087*
> @rush2049 - just out of curiosity, I'm not sure if you mentioned it, what is your CPU clocked at?


You could have checked my sig or the Vantage runs I linked, but it's a conservative 3.5 GHz at stock voltage.


----------



## delavan

I just noticed the new driver also... it mentions the 590, but nothing in particular...
On my side of things, I'm waiting for a call-back from my e-tailer... good thing I haven't touched the bundled T-shirt and mouse pad... gotta send this thing back... too much money paid for a failed power design... wonder what happened over at NVIDIA... did they forget how to make good cards?

What's up with the wimpy power phases on the 570 and 590... are they for real?


----------



## soilentblue

Quote:


> Originally Posted by *Ghoxt;12935896*
> 
> 
> 
> 
> 
> 
> 
> I'm so bummed about this card. I was going to Join but now I know I cannot "Clock" it as advertised by ASUS...Brand I bought. Thank god I have all the packaging and original box etc.
> 
> For $700.00 I'm returning it to the seller in this case New Egg, false advertising etc. Lemon Law


I don't think you can return it to the Egg if you're serious. I'm in the same boat; I bought ASUS to be able to overvolt, and now it seems like I currently can't.


----------



## soilentblue

My card should be here within the hour. I'll be installing the 270 drivers and checking benchmark performance.


----------



## 222Panther222

nvm


----------



## soilentblue

Just have to get rid of all these damn AMD drivers first though. Can't wait for the pain to begin :smh:


----------



## rush2049

Sorry everyone for not doing a run with the newest drivers right away; I had been awake for 48 hours at that point fiddling with it. I passed out for 5 hours and am taking it easy now. I will have a run with the latest drivers up later tonight...


----------



## Ghoxt

Quote:


> Originally Posted by *soilentblue;12936370*
> i don't think you can return it to the egg if your serious. i'm in the same boat. i bought asus to be able to overvolt and now it seems like I currently can't.


The nice thing about my bank is that they will confirm a product was returned and then credit my account. They really move to put money back into their hosted accounts.

Look at the cover of the box: "50% faster by Voltage Tweak" is advertised. FALSE.


----------



## hubwub

I just got a notification that the EVGA GTX 590s are back in stock on the EVGA website. I was wondering if my power supply would be enough at the moment. The page does say the minimum requirement is a 700W PSU.

I just want to make sure this is fine for now.


----------



## soilentblue

Finally, the UPS guy shows up.







Time to start removing drivers. Be back a little later.


----------



## soilentblue

Quote:


> Originally Posted by *hubwub;12937568*
> I just got a notifier that the EVGA GTX 590 are back on stock on the EVGA website. I was wondering if my power supply would be enough at the moment. It does say on the page that the minimum requirements is a 700W PSU.
> 
> I just want to make sure that this fine for now.


Seems fine to me. Don't OC anything though.


----------



## Masked

Quote:


> Originally Posted by *Ghoxt;12936824*
> The nice thing about my bank is that they will confirm a product returned and then Credit my account. They really move to put money back into their hosted account.
> 
> Look at the cover of the Box. 50% Faster by voltage tweak advertised, FALSE.


...And we wonder why the system is so broken









In any event, I've had food poisoning plus the flu since Monday; this is my first time back near a PC and I'm at work...

Work PC is going strong with the Hydro at 652/1304/1728 ~ that's stock voltages...

Home PC is going strong @ 812/1622 ~


----------



## rush2049

Ok everyone I said I would give some more tests. Here is stock clocks comparing driver .91 with 270.51.

http://3dmark.com/compare/3dmv/3027462/3dmv/3036190

A small gain..... but if you look at the attached graph:










You'll see that the performance is now dropping at stock clocks, but the performance gains in the driver have evened it out... maybe a slight score increase, but that could be due to normal variance.


----------



## soilentblue

Just got mine up and running. So far the card seems to be doing well *fingers crossed*. Gonna install the latest Afterburner and make a fan profile.


----------



## soilentblue

The latest Afterburner/drivers have the voltage completely locked for me. I can't even use the slider at all.


----------



## Juggalo23451

Quote:



Originally Posted by *Ghoxt*


The nice thing about my bank is that they will confirm a product returned and then Credit my account. They really move to put money back into their hosted account.

Look at the cover of the Box. 50% Faster by voltage tweak advertised, FALSE.


With proper cooling like LN2


----------



## 2010rig

Quote:



Originally Posted by *Juggalo23451*


With proper cooling like LN2


Doesn't say you need LN2 on the box though, just sayin'.

It just says, Voltage Tweak, shift into overdrive up to 50% faster.


----------



## soilentblue

Mafia II keeps crashing on me, not sure why. I played the first level, and it just doesn't seem to like the second chapter.


----------



## Twilex

Ughhhh, living in Germany can really blow at times. EVGA wouldn't ship to an APO, so I had to have it sent to a friend and then on to me. Probably won't be here till Monday =/ Thankfully with EVGA I can just keep blowing these up until I get a good one, or they come out with a new revision.


----------



## rush2049

Got a question for all you people who have done watercooling before.

This card is making me look seriously at watercooling for it alone, with the possibility of the CPU in the future.

What waterblock would you suggest for my ASUS 590? Or does the fact that it is an ASUS not matter?

My case is rather cramped (5 hard drives will do that to you), so I would be getting a new case as well. I am eyeing the 800D, but can be swayed to anything if there is something better...

Comments and suggestions are welcome.

The reason I am doing this is that I like to keep my components under 40C at all times, and the 590 currently hits the 60s under load...


----------



## Twilex

Quote:



Originally Posted by *rush2049*


Got a question for all you people who have done water cooling before.

This card is making me look seriously at water cooling for it alone, with the possibility of cpu in the future.

What waterblock would you suggest for my asus 590. Or does the fact that it is an asus not matter?

My case is rather cramped (5 hard drives will do that to you), so I would be getting a new case as well, I am eying up the 800D, but can be swayed to anything if there is something better.....

comments, suggestions are welcome.

The reason I am doing this is I like to keep my components under 40 C at all times, and the current 590 hits 60's under load.....


Depending on the price of the Koolance block, I would say either that or the EK block. There's also the DD block coming out, but I'm not sure on performance figures. It doesn't really matter that it's an ASUS, though. I've got an EK block on the way for my Classified, so I can let you know performance figures when I'm done with it. I will be running through a chipset block, memory block, and a GTX 470 though, so my temps might be a bit different if you only watercool the 590 and your CPU. I would highly recommend it; keeps things super cool and makes them last longer. I would also highly recommend the 800D. Such a sexy case with tons of potential. You can easily stick a GTX 360 rad in the top with no mods, or a 480 with just a single extra 120mm cut, which you could easily run your X6 and 590 through. There are so many people on here with one that have gorgeous-looking water builds you can look at for reference.


----------



## rush2049

Let me know how easy it is to remove the heatsink assembly on the 590...


----------



## Juggalo23451

Quote:



Originally Posted by *rush2049*


Got a question for all you people who have done water cooling before.

This card is making me look seriously at water cooling for it alone, with the possibility of cpu in the future.

What waterblock would you suggest for my asus 590. Or does the fact that it is an asus not matter?

My case is rather cramped (5 hard drives will do that to you), so I would be getting a new case as well, I am eying up the 800D, but can be swayed to anything if there is something better.....

comments, suggestions are welcome.

The reason I am doing this is I like to keep my components under 40 C at all times, and the current 590 hits 60's under load.....


Koolance has typically made the best blocks for NVIDIA graphics cards. As for taking the card apart and installing a waterblock, I will let you know; I will be making videos about it on my YouTube page.


----------



## Juggalo23451

Quote:



Originally Posted by *2010rig*


Doesn't say you need LN2 on the box though, just sayin'.

It just says, Voltage Tweak, shift into overdrive up to 50% faster.


That is common sense to me. You're not going to do 4GHz on a processor using stock cooling; you are going to get an aftermarket cooler to do so, so you will have better temps.


----------



## coolhandluke41

Quote:



Originally Posted by *rush2049*


Let me know how easy it is to remove the heat sink assembly on the 590.....


Here is the EK GTX 590 block:
http://www.ekwaterblocks.com/uploads...unted-1200.jpg
http://www.ekwaterblocks.com/index.p...t01returnid=17


----------



## soilentblue

Koolance blocks should be arriving Monday, I was told. EK blocks are in stock at FrozenCPU. I'm waiting to see how good the design of the Koolance block is; they already said they will definitely be flowing water over the VRMs to help cool them, unlike on their AMD 6990 waterblocks.

DangerDen's block is on DangerDen's website and so far looks to have the best flow. Swiftech makes the EVGA HC blocks.


----------



## soilentblue

Quote:



Originally Posted by *Juggalo23451*


Koolance has been typically the best block for NVIDIA graphics cards. As for as taking the card apart and installing a water block I will let you know. I will be making videos on my YouTube page about it.


DangerDen has a video on YouTube showing how to take the fan assembly off, and another on how to install their block. Get a Torx T6 driver and a Phillips and you're all set.


----------



## L D4WG

Hey guys, I have a question; hope someone can help. I'm organizing for someone to post me an EVGA 590 from the USA to Australia. Will it still be covered under warranty if it dies? I'm not exactly sure how it all works. Would I have to send it back to the USA, or could I send it to EVGA.com.au?

Thanks guys


----------



## Hawk777th

EVGA only warrants to the original purchaser with a receipt, so if it's bought in the US it won't have your name on the receipt, and therefore no warranty.

So if someone buys it here for you, they would have to be willing to have you send it back to them, and then they send it on to EVGA.


----------



## Twilex

Sooo something gorgeous came in the mail today...now this gets to sit here and tease me till I get my card =/ I love their new boxes.


















































I'm praying to God my card is here before Saturday, but knowing my luck it probably won't be till Monday or Tuesday, unfortunately. So if no one beats me to it, I'll have numbers up ASAP.


----------



## kcuestag

That looks sexy!!!!!!!!!!!!!!!!!

Congratz, let's hope you get ur card this weekend


----------



## Twilex

Quote:


> Originally Posted by *kcuestag;12944262*
> That looks sexy!!!!!!!!!!!!!!!!!
> 
> Congratz, let's hope you get ur card this weekend


I know!!!


----------



## Masked

I don't really understand what issues you guys are running into w/these cards.

After having a VERY long discussion with someone @XFX I gotta say I disagree with the theory of "throttling" on the 590.

Benchmarks aren't going to do much in terms of scaling b/c SLI hasn't been perfected yet with these cards...it's @ maybe 40% atm which, most of you see for yourselves. I'm seeing it right now @work with CS5, some high-end flash apps are making this driver "issue" very poignant but, throttling? No.

It's going to take some time for the drivers to mature but, that doesn't mean these cards are throttling in any way/shape/form UNTIL the drivers mature and we see that issue, repeatedly...That's been true of any release, even the 6990's.

That being said, gonna start benching my work PC when I get it tweaked again and the home PC will shortly follow.

Food Poisoning sucks balls.


----------



## Pedros

What, you still didn't bench your system Masked? tsk tsk

So... will we see some WC results?







I'm really looking forward to those results


----------



## grunion

Quote:


> Originally Posted by *Masked;12944507*
> I don't really understand what issues you guys are running into w/these cards.
> 
> After having a VERY long discussion with someone @XFX I gotta say I disagree with the theory of "throttling" on the 590.
> 
> Benchmarks aren't going to do much in terms of scaling b/c SLI hasn't been perfected yet with these cards...it's @ maybe 40% atm which, most of you see for yourselves. I'm seeing it right now @work with CS5, some high-end flash apps are making this driver "issue" very poignant but, throttling? No.
> 
> It's going to take some time for the drivers to mature but, that doesn't mean these cards are throttling in any way/shape/form UNTIL the drivers mature and we see that issue, repeatedly...That's been true of any release, even the 6990's.
> 
> That being said, gonna start benching my work PC when I get it tweaked again and the home PC will shortly follow.
> 
> Food Poisoning sucks balls.


Since when does CS5 support SLI?


----------



## Hawk777th

Wow dude nice water block. Really looks like nice machining on it!


----------



## JCPUser

Quote:


> Originally Posted by *Masked;12944507*
> I don't really understand what issues you guys are running into w/these cards.
> 
> After having a VERY long discussion with someone @XFX I gotta say I disagree with the theory of "throttling" on the 590.
> 
> Benchmarks aren't going to do much in terms of scaling b/c SLI hasn't been perfected yet with these cards...it's @ maybe 40% atm which, most of you see for yourselves. I'm seeing it right now @work with CS5, some high-end flash apps are making this driver "issue" very poignant but, throttling? No.
> 
> It's going to take some time for the drivers to mature but, that doesn't mean these cards are throttling in any way/shape/form UNTIL the drivers mature and we see that issue, repeatedly...That's been true of any release, even the 6990's.
> 
> That being said, gonna start benching my work PC when I get it tweaked again and the home PC will shortly follow.
> 
> Food Poisoning sucks balls.


I don't think we will EVER have concrete proof of throttling unless Nvidia comes out and says it themselves. Thus there will always be room for people like yourself to blame SLI, drivers, benchmark compat, etc, etc.

The best we are going to get are reports from users and posted testing results -- and from that people are going to draw their own conclusions. There are already some benchmarking results in this thread that imply throttling.

People are going to be convinced by different levels of evidence, so while you may not believe the card is throttling -- just because SOME people do... does not necessarily make them incorrect.


----------



## Pedros

Come on... drivers? When I had the Asus on my bench I had really good performance with the CD drivers... then the second set of drivers came and I saw the card throttle down...

Testing my POV TGT Ultra Charged (what a name for such weak performance), with the latest drivers I got erratic performance... I mean...

The problem is the drivers indeed... they are taking 30% or more of your card's performance


----------



## Masked

Quote:


> Originally Posted by *grunion;12944653*
> Since when does cs5 support sli?


CS5 doesn't support SLI per se but, when I have 3 instances open on 3 different screens, it's blatantly obvious the work isn't being split.

1st screen I'm doing Alienware's new front page graphic, 2nd screen I'm doing the new rigs and the 3rd screen I'm doing some personal tweaks.

All are CLEARLY relying on GPU1 with its workload @60% and GPU2 @1% constant.

I have seen thousands of instances of hard SLI cards, i.e. dual 580's or 6970's, splitting the workload but, this card is clearly relying on 1 GPU.
Quote:


> Originally Posted by *JCPUser;12944710*
> I don't think we will EVER have concrete proof of throttling unless Nvidia comes out and says it themselves. Thus there will always be room for people like yourself to blame SLI, drivers, benchmark compat, etc, etc.
> 
> The best we are going to get are reports from users and posted testing results -- and from that people are going to draw their own conclusions. There are already some benchmarking results in this thread that imply throttling.
> 
> People are going to be convinced by different levels of evidence, so while you may not believe the card is throttling -- just because SOME people do... does not necessarily make them incorrect.


I'm simply saying, coming from an office with practically every flavor of card...and having tested as many as we have...

I do not, what-so-ever see ANY evidence of throttling...I do see very poor driver performance...Just as I did with the initial release of the 6990's.

My home PC is at @850mhz and I see clear evidence of very poor scaling and poor drivers...On my attempt to "fix" my own drivers, I feel as if I pushed the card too much so, I decided not to play with them until I had some solid information.

Poor drivers is absolutely the cause of the issue in my opinion and it's something that is going to take time...Until then, we will see performance degrade because the card simply can't equate performance =/= speed, etc...Same exact thing with the 6990 on release...and the 295's.

@Pedro ~ Was in the hospital with food poisoning so bad my muscles stopped allowing me to throw up







~ Still recovering...Will have some form of benches up by Monday!
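The lopsided per-GPU load described above (GPU1 at 60%, GPU2 at 1% constant) is easy to sanity-check programmatically. Here's a minimal sketch; the CSV text only mimics `nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader` output, and the sample numbers are hypothetical:

```python
# Sketch: decide from per-GPU utilization readings whether work is
# actually being split across both GPUs on a dual-GPU card.

def parse_utilization(csv_text):
    """Parse '60 %'-style lines (one per GPU) into integers."""
    return [int(line.split()[0]) for line in csv_text.strip().splitlines()]

def load_is_split(utils, min_pct=10):
    """True only if every GPU carries at least `min_pct` of load."""
    return all(u >= min_pct for u in utils)

# Hypothetical reading: one GPU busy, one essentially idle.
sample = "60 %\n1 %"
print(load_is_split(parse_utilization(sample)))  # -> False
```

In practice you'd poll this in a loop while the workload runs; a single False sample just means the second GPU was idle at that instant.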


----------



## saulin

Quote:


> Originally Posted by *rush2049;12927838*
> Ok people, this is what I was telling you all... but you didn't believe me....
> 
> I clocked mine up to 650 mhz, but stock volts for a single run... then quickly put it back to stock....
> 
> Here are the scores of all three, from lowest on the left to highest clock on the right.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And here is the graphs for testing in Vantage while it is clocked at 650 mhz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Notice the two first tests, all the dipping and what have you... thats throttling.... The drops to 0 are loading screens or cpu tests.... the bunch at the end are quick features tests, so ignore them. The first two blocks is what to look at.


I think that's with the Extreme preset, right?

If not, can you try the same tests with the Extreme preset to rule out a possible CPU bottleneck?


----------



## RagingCain

Cards arrive in 3 hours. Cards are here. Afraid to open them...

Extra radiator, some water cooling stuff, and fittings all arrive tomorrow. I should be benching by tomorrow night, or Saturday. All depending on how much cable management/loop redesigning I do.

Really think I should have added a second D5 pump.

I strangely don't want to go through with all of this work...


----------



## soilentblue

I tested out the Mafia 2 benchmark and with PhysX I was barely able to edge out 35 FPS. Tested out the Metro 2033 demo and noticed minimum FPS as low as 25-27 in the benchmark. DOF was off. The card isn't as powerful as I thought it would be.

Honestly it feels like it's only running on one GPU. I downloaded the latest SLI patch but it said that I was already up to date.


----------



## Masked

Quote:


> Originally Posted by *soilentblue;12947720*
> i tested out mafia 2 benchmark and with physx i was barely able to edge out 35fps. tested out the metro 2033 demo and noticed minimum fps as low as 25-27fps with the benchmark. DOF was off. the card isn't as powerful as I thought it would be.
> 
> honestly it feels like it's only running on one gpu. i downloaded the latest sli patch but it said that I was already up to date.


Again, it took the 6990 2.5 months to get where it's currently at... Yet we expect NVIDIA to be there overnight.

There are obviously some driver issues that still need to be worked out... It's not something that's just going to happen like Dorothy clicking her damned heels...


----------



## soilentblue

You do know it really hasn't been 2.5 months, correct? I'm not saying there aren't driver issues, but I was more curious whether I was doing something incorrectly, like not checking an SLI box or something.


----------



## saulin

Quote:


> Originally Posted by *soilentblue;12947720*
> i tested out mafia 2 benchmark and with physx i was barely able to edge out 35fps. tested out the metro 2033 demo and noticed minimum fps as low as 25-27fps with the benchmark. DOF was off. the card isn't as powerful as I thought it would be.
> 
> honestly it feels like it's only running on one gpu. i downloaded the latest sli patch but it said that I was already up to date.


Something's wrong there, bro.

It should handle Mafia 2 no problem.

http://www.hardwareheaven.com/reviews/1138/pg12/asus-nvidia-geforce-gtx-590-graphics-card-launch-review-mafia-2-physx.html


----------



## RagingCain

Quote:


> Originally Posted by *soilentblue;12947720*
> i tested out mafia 2 benchmark and with physx i was barely able to edge out 35fps. tested out the metro 2033 demo and noticed minimum fps as low as 25-27fps with the benchmark. DOF was off. the card isn't as powerful as I thought it would be.
> 
> honestly it feels like it's only running on one gpu. i downloaded the latest sli patch but it said that I was already up to date.


Quote:


> Originally Posted by *saulin;12948131*
> Something wrong there bro
> 
> It should handle Mafia 2 no problem
> 
> http://www.hardwareheaven.com/reviews/1138/pg12/asus-nvidia-geforce-gtx-590-graphics-card-launch-review-mafia-2-physx.html


If I am not mistaken, didn't they use the 267.85 drivers?

Could give them a try. I have noticed about a 10~15% performance loss on these new 270 drivers.

I know everyone has jumped on the "GTX 590 is fail" bandwagon, but watch this regardless. It's still so funny.

Hitler Disappointed with GTX 590




http://www.youtube.com/watch?v=QFqxn804Vmk


----------



## cowie

Quote:


> Originally Posted by *CSHawkeye;12928772*
> They were, very unhappy right now so I guess you can make me a former member as EVGA is happily going to replace my video card, but then its going on ebay. I am going to throw back in my dual 6970's.


OK, I see you have the 6970's as backup, that's cool.
Did you get your RMA# yet?
I'd buy it from you and we can avoid eBay fees if you want? I pay via PayPal.
It would be new in the wrapper, right? Never opened?
Or if you're trying to get a good buck, can you at least post me your eBay link? Or in a PM?
Thanks


----------



## saulin

Hey guys, the 267.91 drivers seem to bring some nice improvements

http://www.fudzilla.com/reviews/item/22266-gainward-gtx-590-previewed.


----------



## Draygonn

Quote:


> Originally Posted by *RagingCain;12948193*
> Hitler Disappointed with GTX 590


"Don't worry, the cyanide will make us forget all this"


----------



## soilentblue

Quote:


> Originally Posted by *saulin;12948363*
> Hey guys the 267.91 seem to bring some nice improvements
> 
> http://www.fudzilla.com/reviews/item/22266-gainward-gtx-590-previewed.


Not sure about the rest of the tests, but I know this one is rigged. The 6990 does better than that in Metro 2033. Fud must have been using old drivers or something.










When I get home today, if I have time, I'll try out the .91 drivers and see if there's any improvement. I've got my fingers very crossed that the rumored driver advancements NVIDIA is supposed to release the first week of April are just what this card needs to shock it back to life.


----------



## rush2049

The .91 drivers show NO DIFFERENCE over the previous drivers (excluding 267.51, which I never tested). The 270.51 drivers bring lots of performance gains, but the throttling is more apparent at even lower clocks.

I also somewhat agree with Masked. It is a driver issue. I am confident that the driver is telling the card to be over-protective and throttle to save itself until they are able to work out the bugs. I will wait until the drivers are more mature to really push the card, and/or if they have to change the BIOS to fix something the driver can't, then I will wait for that as well.....

Someone asked me to run the 550, 610, 650 MHz tests at Extreme.... sure, I can do that; I will post the results later.


----------



## soilentblue

Rush, what is the actual number your volts are currently set at?


----------



## RagingCain

After debating the better half of the day I have decided I will indeed be keeping the twins.


----------



## soilentblue

One thing is for sure, this card can definitely use a waterblock.


----------



## sublimejhn

Well, I guess I may as well jump on this card while all the controversy is going on!! Can't really return it now, I'll just wait it out at stock clocks

http://www.techpowerup.com/gpuz/cbrhy/


----------



## RagingCain

I just noticed that every single melted GPU has been Asus.

Anybody remember this backplate, design/serial number sticker? I believe it was posted as an EVGA card yes?

http://www.techpowerup.com/reviews/ASUS/GeForce_GTX_590/3.html

Or specifically:









Looks familiar, only this one doesn't have silicon melted around the center?

I am sick of this crap, misinformation pisses me off.

I am going to bench these cards to hell and back to get to the end of this.


----------



## OverK1LL

Quote:


> Originally Posted by *RagingCain;12951228*
> I just noticed that every single melted GPU has been Asus.
> 
> ...
> 
> I am going to bench these cards to hell and back to get to the end of this.


Twins!! Congrats!! I'm jealous.

Anyways, I am surprised that the only cards that have melted were ASUS. I wonder what is going on. It's almost hard to believe.

You are also very brave to OC the twins just to prove a point. YOU ROCK. lol.


----------



## Pedros

He's a self-made man!!!! ehehehhe WE WANT SOLUTIONS, not problems... so... bench it... who knows, maybe water cooling is the best solution


----------



## Toilet-Duck

Nah, it's not just ASUS; Zotacs have melted as well at stock settings:










http://forums.overclockers.co.uk/showpost.php?p=18781867&postcount=139


----------



## RagingCain

Quote:


> Originally Posted by *Toilet-Duck;12951946*
> Nah, it's not just ASUS; Zotacs have melted as well at stock settings:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://forums.overclockers.co.uk/showpost.php?p=18781867&postcount=139


Thanks Duck,

Looks like that back plate style is common.


----------



## rush2049

For whoever asked: the default voltage on my card is: .9130 volts and it stays constant like it should no matter what I throw at it.


----------



## Baasha

Well, just completed my rig this week with the EVGA GTX-590 Classified Quad-SLI!









Here are some pics:


























































My idle temps are around 40C, sometimes lower (~37C), but at load the temps go up to 85-86C! Is that too high? Are those temperatures dangerous? I have left a gap between the cards (as you can tell from the pictures) so I'm not sure why it's getting so hot under load. I have 17 fans in my case and have the GTX 590 fans turned up to 75% manually during gaming.

However, add me to the list of owners of the GTX-590!









Some benchmarks with the GTX-590 Quad-SLI:


----------



## delavan

I contacted my e-tailer and got an RMA number for my card.
I cannot, for the life of me, tolerate such throttling on a new card that's worth that much money.

I noticed the throttling easily. Just play games with the EVGA Precision in-game overlay... pick GPU clock monitoring, temps, and FPS... you'll see throttle-dips... sometimes one GPU throttling to 554MHz (EVGA Classified here)...
I tried "adaptive" power management and also maximum performance...

Now I'll figure out what to buy instead... I'd like 2 x 570s in SLI... but the 570s have 4 VRMs... they are built "weak in the knees"... come on NVIDIA, take your head out of your....
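A rough way to quantify the throttle-dips described above is to log the core clock over time (from an EVGA Precision log, for example) and count samples that fall below the rated clock. A minimal sketch; the trace values are hypothetical, using the 554MHz dip mentioned above against a 607MHz stock clock:

```python
# Sketch: flag throttle dips in a series of logged core-clock samples.
# Real samples could come from an EVGA Precision log or periodic
# GPU clock polling; the trace below is made up for illustration.

def throttle_dips(samples_mhz, rated_mhz, tolerance=0.05):
    """Return indices where the clock fell more than `tolerance`
    (5% by default) below the rated clock."""
    floor = rated_mhz * (1 - tolerance)
    return [i for i, clock in enumerate(samples_mhz) if clock < floor]

# Hypothetical trace: a 607MHz card dipping to 554MHz twice under load.
trace = [607, 607, 554, 607, 607, 554, 607]
print(throttle_dips(trace, 607))  # -> [2, 5]
```

Counting dips per minute across a gaming session would make the "is it throttling or not" argument in this thread a lot more concrete than eyeballing an overlay.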


----------



## RagingCain

For the price tag, you could buy approximately 3 x reference 6950s, unlock the BIOSes and have very high-end performance.

Or purchase 2 x 6970s, as they have quality VRMs similar to the 6990's.


----------



## Masked

Quote:


> Originally Posted by *delavan;12952689*
> I contacted my Etailer and got an RMA number for my card.
> I cannot, for the life of me, tolerate such throttling on a new card, that's worth that much money.
> 
> I noticed throttling easily. Just play games with EVGA precision in-game overlay..pick GPU clock monitoring, temps, and FPS...you'll see throttle-dips...sometimes one GPU throttling to 554MHz (evga class. here)...
> I tried "adaptative" power management and also, maximum performance...
> 
> Now I'll figure out what to buy instead...I'd like 2 X 570s in SLI...but the 570 have 4vrms... they are build "weak in the knees"...common nVidia, take your head out of your....


Omg...

Are you all too young to remember the 295? Honestly?

It's not throttling...It's going to take 3-4 months for this card to work and scale 100% the way it was "intended to".

I'm sick of this "OMG MY CARD IS THROTTLING"...No, it's not.

I have a 590 at home at 850MHz, an EVGA Classy HC, and it's absolutely NOT throttling...The drivers suck, absolutely, but that =/= throttling.


----------



## RagingCain

Quote:


> Originally Posted by *Masked;12953341*
> Omg...
> 
> Are you all too young to remember the 295? Honestly?
> 
> It's not throttling...It's going to take 3-4 months for this card to work and scale 100% the way it was "intended to".
> 
> I'm sick of this "OMG MY CARD IS THROTTLING"...No, it's not.
> 
> I have a 590 at home at 850mhz, an evga classy HC and it's absolutely NOT throttling...The drivers suck, absolutely but, that =/= throttling.


Many people don't know about Fermi's launch issues, such as low GPU usage and broken SLI performance.

If that's what he wants to do, he is entitled to do it. There is so much negativity around these cards that I thought about returning mine.

I have more to lose than most, not just my cash, but I am coming from twin GTX 580s.


----------



## reflex99

I want one of these, just so i can pester eVGA for the rest of my life RMAing the thing.


----------



## rush2049

Ok everybody, here is vantage at the extreme preset.

Under: http://3dmark.com/3dmv/3039358
Stock: http://3dmark.com/3dmv/3039374
Over: http://3dmark.com/3dmv/3039425

Here are the three graphs from during the tests....

Under:









Stock:









Over:









And here are the results compared:









So in summary, there are issues like I said. I am waiting for driver improvements.


----------



## Masked

Quote:



Originally Posted by *RagingCain*


Many people don't know about Fermi's launch issues, such as low GPU usage and broken SLI performance.

If that's what he wants to do, he is entitled to do it. There is so much negativity around these cards that I thought about returning mine.

I have more to lose than most, not just my cash, but I am coming from twin GTX 580s.


And I'm coming from a 6990...

I realize I live in an office of enthusiasts, I get that however, this card has yet to mature...

Nvidia ****ed up, period...Bad bios, bad drivers...Cards over-volted; I get that.

Yet, there's so much mis-information being passed around, like throttling, that even those of you IN THIS THREAD, begin to believe it and it's pure bull****.

I just updated my NGINX server yesterday, finally ditching a lifetime of actual throttling on an apache server...So, believe me, I KNOW what throttling is.

The drivers aren't mature enough to call it anything beyond choking and I'd actually give you choking because the drivers suck huge ass but, we're FAR from throttling.

In 2-3 months if we're seeing the same results; it's throttling...For now, it's not.

Same thing goes for these cards "blowing up"...The bios sucks ass, looks like a 6th grader wrote it, I agree 100% they dropped the ball...Just don't be stupid and your card won't burn out.

I mean common sense guys...come on.

Same EXACT thing happened with the 295's...Same EXACT thing happened with the 59's and the 69's...How quickly we are to forget...


----------



## delavan

Quote:



I'm sick of this "OMG MY CARD IS THROTTLING"...No, it's not.


OK, what do you call a vid card that downclocks??? I call that throttling myself... as well as the periodic dips in FPS that come at the same time...

BTW, I remember the 295, but never looked into it myself... feed me on how well it matured over time... enlighten me...


----------



## reflex99

The 6990 has OCP (or power tune/whatever they call it), but it isn't nearly as aggressive as the 590.

We don't forget the 590, because it will explode if you try to run it without extreme OCP.

To me, the VRMs are a bigger fail than the throttling. I can deal with throttling, it is still a fast card, but if it explodes, a 9400GT is faster.


----------



## Masked

Quote:


> Originally Posted by *delavan;12953600*
> KK, how do you call a vidcard that downclocks??? I call that throttling myself...as well as the periodical dips in FPS that are coming at the same time...
> 
> BTW, I remember the 295, but never looked into it myself...feed me on how well it matured over time...enlighten me...


Quote:


> Originally Posted by *reflex99;12953621*
> The 6990 has OCP (or power tune/whatever they call it), but it isn't nearly as aggressive as the 590.
> 
> We don't forget the 590, because it will explode if you try to run it without extreme OCP.
> 
> To me, the VRMs are a bigger fail than the throttling. I can deal with throttling, it is still a fast card, but if it explodes, a 9400GT is faster.


The concept of throttling can only exist when a card is actually working and the 590 is NOT ACTUALLY WORKING!

Surprise!

When the drivers mature, you can call it throttling...

The 295 release was the same, the 2nd core didn't work for a month...The 5990, holy ****, I've got stories...This is typical.

It's going to take 2-3 months for the drivers to mature and then we'll see a, hopefully, 100% functioning card...

If it's holding back performance when that happens, we'll call it throttling but, for now...You can't really call it anything because the drivers suck so much, we don't even know how it's supposed to perform.


----------



## Strider_2001

Currently have both cores folding at 700MHz... Still won't touch the volts 'cause I am scared...









Anyway here is a screenshot of my GPU-Z.

I will upload pics of it tomorrow...


----------



## JCPUser

Quote:


> Originally Posted by *Masked;12953732*
> The concept of throttling can only exist when a card is actually working and the 590 is NOT ACTUALLY WORKING!
> 
> Surprise!
> 
> When the drivers mature, you can call it throttling...
> 
> The 295 release was the same, the 2nd core didn't work for a month...The 5990, holy ****, I've got stories...This is typical.
> 
> It's going to take 2-3 months for the drivers to mature and then we'll see a, hopefully, 100% functioning card...
> 
> If it's holding back performance when that happens, we'll call it throttling but, for now...You can't really call it anything because the drivers suck so much, we don't even know how it's supposed to perform.


You can call it "NOT WORKING" or "throttling" or say it is "being gimped by fairies" -- who *bleeping* cares -- the point is the card is not functioning as advertised so let the man return his card in peace.

Talk about forcing your opinion on others...


----------



## BigCactus

Alright, which one of you bought the 590 for $1000 on eBay and had the card fry on you? I wonder, can you get a warranty if you bought it on eBay? Man, that would just suck. And I don't see how there are all these cards on eBay and none in retail stores.


----------



## Fullmetalaj0

I have a question: if I were to pair this with an AMD CPU, would that work just fine without needing any kind of "hack"?


----------



## JCPUser

Quote:


> Originally Posted by *Fullmetalaj0;12954966*
> I have a question: if I were to pair this with an AMD CPU, would that work just fine without needing any kind of "hack"?


Yes, you would be fine. The "hack" is only needed if you have 2 separate cards.

Basically, in the case of the 590 the connection between the GPUs is all handled internally (on the card) whereas for normal SLI, let's say 570 SLI, the board/chipset is also involved which is why you need a "SLI compatible" motherboard.


----------



## Arizonian

Quote:


> Originally Posted by *reflex99;12953621*
> The 6990 has OCP (or power tune/whatever they call it), but it isn't nearly as aggressive as the 590.
> 
> We don't forget the 590, because it will explode if you try to run it without extreme OCP.
> 
> To me, the VRMs are a bigger fail than the throttling. I can deal with throttling, it is still a fast card, but if it explodes, a 9400GT is faster.


Quote:


> Originally Posted by *reflex99;12953481*
> I want one of these, just so i can pester eVGA for the rest of my life RMAing the thing.


Hey Reflex, this is the GTX 590 Club; I think you want to post your argument in the AMD 6990 vs GTX 590 thread. We are well aware of the issue as high-end enthusiasts. The GTX 590 owners don't buy your propaganda.


----------



## Pedros

One thing... people keep talking about the GTX 295 also having problems like the GTX 590...

Please... don't even say that... The problems were far from being the same...

I.e. I had a GTX 295 for 2 years, and for the last year it was running at 1.20v at 805MHz...

Never had a single freaking problem... and I know at least 10 more users with 295's without problems...

It's been 8 days since the 590 launch and I already know 4 guys who saw their 590 "blow"...

So... don't try to "excuse" 590 problems with the problems that other cards may have had or something...

People try to find excuses, or some kind of "self acceptance" for this situation, by saying that other cards had problems too... meh...

Oh... and don't forget that the 295 was cheaper than this one... I paid 499 Euros for mine...

But... I'm still looking forward to water-cooled benchmarks and some OC action... maybe what this card really wants is a cold "body"


----------



## Alatar

updated.


----------



## sl00tje

Quote:


> Originally Posted by *Alatar;12957108*
> updated.


Update the driver links


----------



## Alatar

Quote:


> Originally Posted by *sl00tje;12957133*
> Update the driver links










I completely forgot those..

I think there's no need to update the links for every single new driver released. The link now leads to the general driver download page on nvidia.com.


----------



## Masked

Quote:


> Originally Posted by *Pedros;12956969*
> One thing ... people keep talking about the GTX295 also having problems like the GTX590...
> 
> Please... don't even say that ... The problems were far from being the same...
> 
> I.E. I had a GTX295 during 2 years, where the last year was runing at 1.20v at 805Mhz ...
> 
> Never had a single freaking problem... and i know at least 10 more users with 295's without problems ...
> 
> It's been 8 days from the 590 launch and i already know 4 guys that saw the 590 "blow" ...
> 
> So... don't try to "escuse" 590 problems with the problems that other cards may have had or something ...
> 
> People try to find escuses or to try some kind of "self acceptance" for this situation by saying that other cards had problems too ... meh ...
> 
> Oh ... and don't forget that the 295 was cheaper than this one ... i paid 499Euros for mine ...
> 
> But... i'm still looking forward to water cooled benchmarks and some oc action ... maybe what this cards really wants is a cold "body"


Incorrect.

The 295 release was riddled with errors. The card did NOT launch well.

It DID recover well because NVIDIA was on top of it; HOWEVER, the damage had already been done.

I'm not trying to jump down people's throats and force my opinion, I apologize if that seems to be the case...I'm frustrated with the number of fanboys jumping ship and painting this release as far worse than it is.

The bios is shoddy...It's over-volting MOST cards at stock which is why they're frying...It's in NO WAY the card itself, it's a horribly executed bios. A fix is in the works.

Same goes for the drivers...The vast majority of cards frying ARE a combination of the bios and drivers being garbage; I 100% stand behind that and completely agree.

That doesn't mean we're giving NVIDIA a pass at being as driver responsible as ATI but, it IS the situation.

A card can really only throttle when you know the potential...It is NOT currently software controlled, it is nothing like DFS; simply, at this point, I would call it choking.

I have absolutely NONE of the issues you all complain about. I have some meh results in a few games but, the scaling on these cards is still terrible so, I'm going to wait for a couple drivers to pass my opinion...If you guys want to sell, sell...

Just don't call it throttling when there's absolutely no evidence of such.

There is evidence it can't perform but, there's no evidence of the card scaling back performance in any form/function...The 4th gear it's supposed to be hitting, just isn't there atm; when it's actually there and we're NOT hitting it because the card scales, then it's throttling...That's all I'm saying.


----------



## Arizonian

Quote:


> Originally Posted by *Masked;12957323*
> Incorrect.
> 
> The 295 release was riddled with errors. The card did NOT launch well.
> 
> It DID recover well because NVIDIA was on top of it; HOWEVER, the damage had already been done.
> 
> I'm not trying to jump down people's throats and force my opinion, I apologize if that seems to be the case...I'm frustrated with the amount of fanboys jumping ship and plaguing this release to be far worse than it is.
> 
> The bios is shoddy...It's over-volting MOST cards at stock which is why they're frying...It's in NO WAY the card itself, it's a horribly executed bios. A fix is in the works.
> 
> Same goes for the drivers...The vast majority of cards frying ARE a combination of the bios and drivers being garbage; I 100% stand behind that and completely agree.
> 
> That doesn't mean we're giving NVIDIA a pass at being as driver responsible as ATI but, it IS the situation.
> 
> A card can really only throttle when you know the potential...It is NOT currently software controlled, it is nothing like DFS; simply, at this point, I would call it choking.
> 
> I have absolutely NONE of the issues you all complain about. I have some meh results in a few games but, the scaling on these cards is still terrible so, I'm going to wait for a couple drivers to pass my opinion...If you guys want to sell, sell...
> 
> Just don't call it throttling when there's absolutely no evidence of such.
> 
> There is evidence it can't perform but, there's no evidence of the card scaling back performance in any form/function...The 4th gear it's supposed to be hitting, just isn't there atm; when it's actually there and we're NOT hitting it because the card scales, then it's throttling...That's all I'm saying.


x2 +1 rep


----------



## cowie

Quote:


> Originally Posted by *Masked;12957323*
> Incorrect.
> 
> The 295 release was riddled with errors. The card did NOT launch well.
> 
> It DID recover well, because NVIDIA was on top of it; HOWEVER, the damage had already been done.
> 
> .


No way did this card have issues that weren't about drivers only.
It had newer drivers put out that fixed a lot of the SLI scaling issues; what dual-GPU card from either brand doesn't at launch?
It was built like a tank, and mine has been overclocked and overvolted (using RivaTuner) from day one to this day. The rest I agree with.


----------



## RagingCain

A 3Dfx Voodoo GTX 590 !

http://www.geforce.com/#/News/articles/voodoo-revived
HAPPY APRIL FOOLS DAY!


----------



## Alatar

Quote:


> Originally Posted by *RagingCain;12957750*
> A 3Dfx Voodoo GTX 590 !
> 
> http://www.geforce.com/#/News/articles/voodoo-revived
> HAPPY APRIL FOOLS DAY!


omg lol







that's just awesome. Just imagine them really releasing a 3dfx version of the 590! (assuming April Fools, ofc)


----------



## Pedros

Quote:


> Originally Posted by *Masked;12957323*
> Incorrect.
> 
> The 295 release was riddled with errors. The card did NOT launch well.
> 
> It DID recover well, because NVIDIA was on top of it; HOWEVER, the damage had already been done.
> 
> I'm not trying to jump down people's throats and force my opinion, and I apologize if that seems to be the case... I'm frustrated with the number of fanboys jumping ship and painting this release as far worse than it is.
> 
> The BIOS is shoddy... It's over-volting MOST cards at stock, which is why they're frying... It's in NO WAY the card itself; it's a horribly executed BIOS. A fix is in the works.
> 
> Same goes for the drivers... The vast majority of cards frying IS a combination of the BIOS and drivers being garbage; I 100% stand behind that and completely agree.
> 
> That doesn't mean we're giving NVIDIA a pass on being as driver-responsible as ATI, but it IS the situation.
> 
> A card can really only throttle when you know its potential... It is NOT currently software controlled, and it is nothing like DFS (dynamic frequency scaling); at this point I would simply call it choking.
> 
> I have absolutely NONE of the issues you all complain about. I have some meh results in a few games, but the scaling on these cards is still terrible, so I'm going to wait for a couple of drivers before I pass my opinion... If you guys want to sell, sell...
> 
> Just don't call it throttling when there's absolutely no evidence of such.
> 
> There is evidence it can't perform, but there's no evidence of the card scaling back performance in any form/function... The 4th gear it's supposed to be hitting just isn't there atm; when it's actually there and we're NOT hitting it because the card scales back, then it's throttling... That's all I'm saying.


Masked, can you please clarify what the problems with the GTX 295 launch were? Yes, it had driver problems, as all SLI did two years ago... I just saw my card supporting many games in 2010... but as far as physical/design problems go, can you please describe what they were?

I'm not being sarcastic... I'm really trying to find out about something I wasn't aware of... I saw driver problems and heat problems (which mainly occurred at lower fan speeds)... at least on the single-PCB card... the dual-PCB one didn't have heat problems...

Sorry to ask this in this thread, I know it's about the 590, but... it's pure curiosity, and it's informative too, since we can compare each card's launch.


----------



## Masked

Quote:


> Originally Posted by *Pedros;12958102*
> Masked, can you please clarify what the problems with the GTX 295 launch were? Yes, it had driver problems, as all SLI did two years ago... I just saw my card supporting many games in 2010... but as far as physical/design problems go, can you please describe what they were?
> 
> I'm not being sarcastic... I'm really trying to find out about something I wasn't aware of... I saw driver problems and heat problems (which mainly occurred at lower fan speeds)... at least on the single-PCB card... the dual-PCB one didn't have heat problems...
> 
> Sorry to ask this in this thread, I know it's about the 590, but... it's pure curiosity, and it's informative too, since we can compare each card's launch.


Verrrry busy work day.

When the 295 was originally released... To put it bluntly, it was the same exact issues, EXCEPT the PCB was obviously much more "robust".

First week of January 2009, I believe, was the release date; don't feel like looking it up.

Anyway, we had 2 samples in the office that fried due to the cards improperly reporting temps... Those were release or beta samples, and we were sent 2 more... I think they're in a dead box in the attic, which I have zero motivation to go get. But the difference is/was that NVIDIA acted IMMEDIATELY on the issues people had.

It wasn't a week between drivers... It was days and sometimes HOURS... Go back and look through the beta drivers from Dec/Jan of 2008/2009... There were MANY, and a new BIOS. Even more on Guru3D, because people were making their own so the card would properly scale...

I might still have the emails in the Apache backup but, back then (2 years, haha), NVIDIA would actually email/talk to you... It wasn't hard to get a beta driver, like *snap*.

Same went for the 7900s and the 7950s... These dual cards having problems is nothing new... This release, there are just as many as there always have been; they're just not on top of it as much as they have been in the past.

It's very evident some people wish to throw away history and blame the card but, when there's a history of these issues occurring, it's better to just be patient.


----------



## DarthShader

Quote:


> Originally Posted by *RagingCain;12951228*
> Anybody remember this backplate, design/serial number sticker? I believe it was posted as an EVGA card yes?
> 
> Looks familiar, only doesn't have some silicon melted around the center


Nope, this one ends with a "75", while the burned EVGA card, which is an OEM part and hence has a stock template, has "53".
Quote:


> Just don't call it throttling when there's absolutely no evidence of such.


But you have evidence in this very thread, where a higher-clocked card gets lower scores. Drivers? Since when do drivers react to higher clocks like that? A CPU bottleneck? Then the scores would be equal, not lower. So how is this possible?


----------



## saulin

Quote:


> Originally Posted by *rush2049;12953493*
> Ok everybody, here is vantage at the extreme preset.
> 
> Under: http://3dmark.com/3dmv/3039358
> Stock: http://3dmark.com/3dmv/3039374
> Over: http://3dmark.com/3dmv/3039425
> 
> Here are the three graphs from during the tests....
> 
> Under:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stock:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Over:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And here are the results compared:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So in summary, there are issues like I said. I am waiting for driver improvements.


That's not throttling though. Driver issues, sure. And before, perhaps you were being bottlenecked by your CPU. Read up on Fermi's low GPU usage. They like i7s too.
Quote:


> Originally Posted by *Masked;12953341*
> Omg...
> 
> Are you all too young to remember the 295? Honestly?
> 
> It's not throttling...It's going to take 3-4 months for this card to work and scale 100% the way it was "intended to".
> 
> I'm sick of this "OMG MY CARD IS THROTTLING"...No, it's not.
> 
> *I have a 590 at home at 850mhz, an evga classy HC, and it's absolutely NOT throttling... The drivers suck, absolutely, but that =/= throttling.*


That's what I wanted to hear. Anyone that has a GTX 590 and is scared to even overclock at stock voltage really has no need to overclock, or really is way too scared. I mean, come on, the card has a warranty even if it fails overclocked, as long as you keep the voltages within limits.

Personally, if I had one of these, just to put an end to the rumours or to find out the truth, I would overclock it until I knew for a fact that it throttles. I would go as high as 1.0v too. That's just me though.

But make sure you are using extreme settings, so you are not bottlenecking the card either.


----------



## Pedros

Masked... so you clocked your card to 850? What voltages?


----------



## soilentblue

Quote:


> Originally Posted by *DarthShader;12958997*
> Nope, this one ends with a "75", while the burned EVGA card, which is an OEM part and hence has a stock template, has "53".
> 
> But you have evidence in this very thread, where a higher-clocked card gets lower scores. Drivers? Since when do drivers react to higher clocks like that? A CPU bottleneck? Then the scores would be equal, not lower. So how is this possible?


I'm seeing scores in games and benchmarks equal to single-GPU scores instead of dual-GPU scores. These drivers are not good at all currently.
Quote:


> Originally Posted by *Masked;12958955*
> Verrrry busy work day.
> 
> When the 295 was originally released... To put it bluntly, it was the same exact issues, EXCEPT the PCB was obviously much more "robust".
> 
> First week of January 2009, I believe, was the release date; don't feel like looking it up.
> 
> Anyway, we had 2 samples in the office that fried due to the cards improperly reporting temps... Those were release or beta samples, and we were sent 2 more... I think they're in a dead box in the attic, which I have zero motivation to go get. But the difference is/was that NVIDIA acted IMMEDIATELY on the issues people had.
> 
> It wasn't a week between drivers... It was days and sometimes HOURS... Go back and look through the beta drivers from Dec/Jan of 2008/2009... There were MANY, and a new BIOS. Even more on Guru3D, because people were making their own so the card would properly scale...
> 
> I might still have the emails in the Apache backup but, back then (2 years, haha), NVIDIA would actually email/talk to you... It wasn't hard to get a beta driver, like *snap*.
> 
> Same went for the 7900s and the 7950s... These dual cards having problems is nothing new... This release, there are just as many as there always have been; they're just not on top of it as much as they have been in the past.
> 
> It's very evident some people wish to throw away history and blame the card but, when there's a history of these issues occurring, it's better to just be patient.


Go on the EVGA forums. Those cards had A LOT of problems. The GTX 590 has had maybe 8 cards pop. The 295 saw more cards than that die in less than a week.


----------



## Farih

If it's not throttling, but you see the card's core frequency fall back to lower than the set rates, what do you call it then? Isn't that throttling?

Same as when you set the power limit lower in CCC: you see AMD cards throttle back in core frequency.

If this isn't throttling, what is it then?
You could say choking, but doesn't choking also throttle the card to lower core frequencies?
Still throttling in my eyes.

I'm not trying to be smart; I just want to understand the whole meaning of throttling now, since it's getting confusing.
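For what it's worth, the word can be pinned down: export a sensor log from GPU-Z or Precision while benching and check whether the core clock ever drops below the clock you set. A rough sketch of that check (the function name, tolerance and sample numbers here are made up for illustration, not from any vendor tool):

```python
def count_throttle_dips(samples_mhz, set_clock_mhz, tolerance_mhz=5):
    """Count core-clock samples that fall below the set clock.

    A card that is merely underperforming holds its set clock under load;
    a card that is throttling shows samples dipping below it.
    """
    floor = set_clock_mhz - tolerance_mhz
    return sum(1 for s in samples_mhz if s < floor)

# Made-up log: card set to 700 MHz, two dips during a bench run
log_mhz = [700, 700, 698, 650, 700, 612, 700]
print(count_throttle_dips(log_mhz, 700))  # 2
```

Zero dips over a full run would be "choking" (the performance just isn't there); repeated dips would be throttling by anyone's definition.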


----------



## soilentblue

Quote:


> Originally Posted by *Pedros;12959242*
> Masked... so you clocked your card to 850? What voltages?


And on what drivers? I'm not able to raise volts above 900mV.


----------



## Masked

Quote:


> Originally Posted by *Farih;12959337*
> If it's not throttling, but you see the card's core frequency fall back to lower than the set rates, what do you call it then? Isn't that throttling?
> 
> Same as when you set the power limit lower in CCC: you see AMD cards throttle back in core frequency.
> 
> If this isn't throttling, what is it then?
> You could say choking, but doesn't choking also throttle the card to lower core frequencies?
> Still throttling in my eyes.
> 
> I'm not trying to be smart; I just want to understand the whole meaning of throttling now, since it's getting confusing.


I personally believe that in 2-3 months, when this card ACTUALLY has a 4th gear... if it scales back then, it's throttling.

Right now the performance isn't there because the drivers aren't... There's no argument about that.

There's also a NEW BIOS being released.

I have no issue if you all choose to label it as "throttling"; I just find it to be an unfair conclusion, considering the card hasn't even matured to its potential.

It would be like seeing a 10-year-old running and saying, "Oh, that's the fastest he'll ever run; he's always going to have short legs." ~ It's not true, nor remotely accurate to the situation.


----------



## soilentblue

NVIDIA said there isn't a new BIOS being released.

Also, if it's not throttling, then a readout of the stock volts should be as flat as the core clock. I may see if I can get a readout of the volts at stock when I get home today.
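If anyone else wants to run the same check, "flatness" is trivial to script once the sensor log is exported; something like this (the function name, spread threshold and sample numbers are made up for illustration):

```python
def is_flat(samples, max_spread=0.01):
    """True if a sensor log never moves more than max_spread between its
    own extremes - e.g. a stock-voltage readout that doesn't sag under load."""
    return (max(samples) - min(samples)) <= max_spread

stock_volts = [0.938, 0.938, 0.937, 0.938]  # made-up GPU-Z style readout
print(is_flat(stock_volts))  # True
sagging = [0.938, 0.900, 0.938]             # a dip under load would fail
print(is_flat(sagging))      # False
```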


----------



## Farih

Quote:


> Originally Posted by *Masked;12959475*
> I personally believe that in 2-3 months, when this card ACTUALLY has a 4th gear... if it scales back then, it's throttling.
> 
> Right now the performance isn't there because the drivers aren't... There's no argument about that.
> 
> There's also a NEW BIOS being released.
> 
> I have no issue if you all choose to label it as "throttling"; I just find it to be an unfair conclusion, considering the card hasn't even matured to its potential.
> 
> It would be like seeing a 10-year-old running and saying, "Oh, that's the fastest he'll ever run; he's always going to have short legs." ~ It's not true, nor remotely accurate to the situation.


Am I right in saying this?

"What you see is indeed throttling, but it could be down to the bad drivers they have now. When the drivers are near perfect and it still happens, you can call that throttling. Right now, because the drivers are bad, it might throttle more or less than needed; you have no clue if it's right or not, due to the bad drivers."

Did I understand your words properly now?

I hope NVIDIA gets it all worked out soon, though, for all the people that have already bought the card.


----------



## Masked

Quote:


> Originally Posted by *Farih;12959654*
> Am I right in saying this?
> 
> "What you see is indeed throttling, but it could be down to the bad drivers they have now. When the drivers are near perfect and it still happens, you can call that throttling. Right now, because the drivers are bad, it might throttle more or less than needed; you have no clue if it's right or not, due to the bad drivers."
> 
> Did I understand your words properly now?
> 
> I hope NVIDIA gets it all worked out soon, though, for all the people that have already bought the card.


I absolutely disagree that it's even throttling currently.

1.03v (1.05 real), 850MHz/1732 ~ I see absolutely NO evidence of throttling.

I see evidence of terrible performance due to driver issues... I see evidence of non-compatibility with benches, again due to driver issues. I see a new driver on the horizon, courtesy of NVIDIA's recent "leaks".

I do not see ANY evidence whatsoever of this card throttling, yet.

If the drivers are fixed and the card is still underperforming in 2-3 months, MAYBE we'll call that throttling... Until benches actually WORK with the drivers, and until the card ACTUALLY SCALES, it's not fair to claim it's throttling.

Remember the kid with the short legs? "You can't be on the lacrosse team because your legs are too small"... That's what you're saying.


----------



## Pedros

... Masked... you must be a unique user...

I benched an Asus GTX 590 for 2 days before launch, and this week I've been bench testing the POV TGT GTX 590 Ultra Charged...

OCP does throttle the core speed down... It's not (just) about the drivers being bad... It's about NVIDIA's OCP being over-protective since the last 2 driver sets.

I tested with the CD drivers and with the update that NVIDIA launched. It was clear that the OCP was more aggressive and was holding the card back. For the record, with the first drivers I was able to get 38k in 3DMark... with the update I went to 36k... and overclocked I went to 35k...

Testing at 770 gave me lower scores than 670...

So it's pretty clear that OCP is doing its job by not allowing the card to leave that "comfort zone" of power usage... and the more overclock you give, the more aggressive the throttling will be.

So, re-test your benchmarks and "re-think" what you mean by throttling...
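To make the point concrete, this is the check I'd apply to any set of runs: flag every case where raising the clock LOWERED the score. (The helper name is made up; the pairs below are just the numbers from my runs above.)

```python
def negative_scaling_pairs(runs):
    """runs: (core_mhz, score) pairs. Return adjacent pairs where a
    HIGHER clock produced a LOWER score - the OCP-throttle signature."""
    runs = sorted(runs)
    return [(lo, hi) for lo, hi in zip(runs, runs[1:]) if hi[1] < lo[1]]

# Stock vs. overclocked on the updated driver
print(negative_scaling_pairs([(670, 36000), (770, 35000)]))
# [((670, 36000), (770, 35000))]
```

A healthy card gives an empty list here; anything else means more clock is buying you less performance.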


----------



## Masked

Quote:


> Originally Posted by *Pedros;12960178*
> ... Masked... you must be a unique user...
> 
> I benched an Asus GTX 590 for 2 days before launch, and this week I've been bench testing the POV TGT GTX 590 Ultra Charged...
> 
> OCP does throttle the core speed down... It's not (just) about the drivers being bad... It's about NVIDIA's OCP being over-protective since the last 2 driver sets.
> 
> I tested with the CD drivers and with the update that NVIDIA launched. It was clear that the OCP was more aggressive and was holding the card back. For the record, with the first drivers I was able to get 38k in 3DMark... with the update I went to 36k... and overclocked I went to 35k...
> 
> Testing at 770 gave me lower scores than 670...
> 
> So it's pretty clear that OCP is doing its job by not allowing the card to leave that "comfort zone" of power usage... and the more overclock you give, the more aggressive the throttling will be.
> 
> So, re-test your benchmarks and "re-think" what you mean by throttling...


Why would I test a product... that 1) doesn't have working drivers, 2) isn't scaling, and 3) isn't even properly regulating voltage... on a benching program that requires 1, 2 and 3 to actually test the performance of a card?

The fact is simple: as an enthusiast I would not, and as a professional I would not.

As I said, I see horrific driver performance, a bad BIOS and other culprits... I do NOT see throttling.

I'm re-thinking nothing, because we haven't even hit the stage where the card is properly scaling.

Again, refer back to history... the 7900, the 7950 AND the 295... It will be a while before we can properly bench or even see the promised results... I suggest patience


----------



## saulin

Quote:


> Originally Posted by *Pedros;12960178*
> ... Masked... you must be a unique user...
> 
> I benched an Asus GTX 590 for 2 days before launch, and this week I've been bench testing the POV TGT GTX 590 Ultra Charged...
> 
> OCP does throttle the core speed down... It's not (just) about the drivers being bad... It's about NVIDIA's OCP being over-protective since the last 2 driver sets.
> 
> I tested with the CD drivers and with the update that NVIDIA launched. It was clear that the OCP was more aggressive and was holding the card back. For the record, with the first drivers I was able to get 38k in 3DMark... with the update I went to 36k... and overclocked I went to 35k...
> 
> Testing at 770 gave me lower scores than 670...
> 
> So it's pretty clear that OCP is doing its job by not allowing the card to leave that "comfort zone" of power usage... and the more overclock you give, the more aggressive the throttling will be.
> 
> So, re-test your benchmarks and "re-think" what you mean by throttling...


The POV TGT GTX 590 Ultra Charged comes at 690MHz; how far does it overclock?

I also think rush2049's card can't be throttling; his CPU is holding him back. His scores are low too. Plus, his overclock was only 630MHz.


----------



## Pedros

Ok guys... now you are saying that NVIDIA doesn't know how to do drivers... Taking what you say, Masked... if they had driver problems in the past, surely they have learned something! And... NVIDIA is well known for capable drivers and a good dev team... If they didn't get the drivers right for the 590, then I must tip my hat to the AMD driver development team that made the 6990 work without driver problems...

It's just a matter of taking 5 minutes to think about things and leave the "colour preferences" aside! I mean... so... cards have blown... the drivers are bad...

Was there anything good about the 590?

But ok... it's the drivers... If you say it's the drivers, then it's the drivers. We'll see in a couple of months if, in fact, you could blame the drivers.

Saulin... it's just more of the same... Ultra Charged cards are just cards that were able to handle 690MHz without any voltage tweaks... and they warranty the card at those speeds... That's why it costs more money


----------



## cowie

The 7900X2 (not the 7950X2) was the only card before the 590 that had any issues with reviewers...
And the 7900X2 was not blowing up; it just had crappy drivers with hardly any scaling.
I've owned all three X2 cards built by NVIDIA since their day of release, so yes, I'm old enough to know.
Sorry for the off-topic.


----------



## Masked

Quote:



Originally Posted by *cowie*


The 7900X2 (not the 7950X2) was the only card before the 590 that had any issues with reviewers...
And the 7900X2 was not blowing up; it just had crappy drivers with hardly any scaling.
I've owned all three X2 cards built by NVIDIA since their day of release, so yes, I'm old enough to know.
Sorry for the off-topic.


This is not accurate.

The 295s had EXTREME issues with reviewers, but those reviewers simply didn't review those cards... Some were given new cards for reviewing so fast it would've made your head spin... Look back and see who reviewed the card and who was around but didn't... You'll start to see something interesting...




As several of us have said, the 295s had MORE blowouts and fried cards than this run did; HOWEVER, NVIDIA was at our side the entire time for the entire release.

As of now, for the 590 release, they're not.

@Pedros
I'm saying look at history... This is typical of ANY dual-GPU card from NVIDIA... It takes a couple of months to work itself out, AND I just got news of a new BIOS for the second time today, so I'm leaning on the expectation that within a "safe" time period we'll see a "safer and newly discovered" BIOS, soon.


----------



## delavan

Masked,

good to see you sticking to your guns... but anyway, MY card was being DOWNCLOCKED at times... call it whatever you like...

BTW, I sent my card back for a full refund... I don't like playing guinea pig for NVIDIA... You can take my name off the fan club list...

Now what I'm looking for is GTX 570s with better VRMs... The only ones are the ASUS DirectCu II, but they're 3-slot cards... HARD to SLI... doesn't fit on my mobo... All other 570s have only 4 power phases...


----------



## grunion

I don't ever recall GTX 295s blowing out.
I don't recall any negative feedback during reviews, other than hot, loud, etc.

I had 2 of them, dual PCB; are you perhaps referring to the single-PCB 295?

I can remember one game and one synthetic that didn't do well: Bioshock and 3DMark06.
Other than that, I had no issues with my 295s.
Even here in our 295 members' thread, there was more praise than anything regarding the 295.


----------



## rush2049

Hey Masked, could you post a screenshot of GPU-Z at your overclock for me? Also... run a single run of 3DMark and post the link... I am calling your bluff on that overclock unless I see some proof that it can stably complete a benchmark.
-----------------

And yes, I realize my CPU isn't amazingly awesome... I do not think it is holding back the graphics card, though. I will work on overclocking it today... see if I can break the 4GHz barrier with some voltage increases...


----------



## delavan

even with my 2500K at 4.6, the dips were there...

MASK, play with your card, use EVGA Precision or another monitoring tool that has an IN-GAME overlay... and tell me your clocks never change whatsoever...


----------



## Masked

Quote:



Originally Posted by *grunion*


I don't ever recall GTX 295s blowing out.
I don't recall any negative feedback during reviews, other than hot, loud, etc.

I had 2 of them, dual PCB; are you perhaps referring to the single-PCB 295?

I can remember one game and one synthetic that didn't do well: Bioshock and 3DMark06.
Other than that, I had no issues with my 295s.
Even here in our 295 members' thread, there was more praise than anything regarding the 295.


Perhaps not here, but on EVGA's forums, back when Asus had forums... They were littered with people who had their 295s blow out within the first 2 weeks.

Ours blew out, and then the replacements blew out... Thus, this office never actually went with the 295s; we stuck with 280s and later the 285s.

There's definitely info out there on it... It was a big deal for about 2 weeks and then forgotten about... This 590 issue isn't going to be a quick fix, unfortunately.

Quote:



Originally Posted by *rush2049*


Hey Masked, could you post a screenshot of GPU-Z at your overclock for me? Also... run a single run of 3DMark and post the link... I am calling your bluff on that overclock unless I see some proof that it can stably complete a benchmark.
-----------------

And yes, I realize my CPU isn't amazingly awesome... I do not think it is holding back the graphics card, though. I will work on overclocking it today... see if I can break the 4GHz barrier with some voltage increases...


The issue is... you're not listening.

Even if you bench this card, you won't get real results... The drivers haven't matured enough for a fair assessment, so why would I waste my time?

Since I'm still at work, here's my GPU-Z from the office PC. I step it up daily to see what I can do at stock voltages until the new drivers come out but, quite frankly, without the new drivers... I haven't even installed Vantage at home yet, and it won't do anyone any good because SLI isn't scaling properly.

When working drivers come out that actually scale, that don't choke the card... that function WELL... I'll be among the first people to give the vast majority the finger... Until then, it's sheathed.

Work PC btw:


It's 24/7 ~ I haven't started tweaking the MC yet b/c I hear that's where people have the most issues, but I have none.


----------



## soilentblue

http://www.overclock.net/nvidia/750870-gtx-295-dead.html

here's a dead one on our own site around 6 months ago

http://www.evga.com/forums/tm.aspx?m=913181

here's a 295 with the same issue as the 590


----------



## Abiosis

Quote:


> Originally Posted by *Masked;12963491*
> Perhaps not here, but on EVGA's forums, back when Asus had forums... They were littered with people who had their 295s blow out within the first 2 weeks.
> 
> Ours blew out, and then the replacements blew out... Thus, this office never actually went with the 295s; we stuck with 280s and later the 285s.
> 
> There's definitely info out there on it... It was a big deal for about 2 weeks and then forgotten about... This 590 issue isn't going to be a quick fix, unfortunately.


_I got both cards...

I had my GTX 295 (dual PCB) for over two years, and it has been overclocked to 686/1496/1213 since day one...

It plays every game smoothly (except you know).

I'm at stock (630 clock) with my GTX 590 right now...

It's fast enough for my needs... and I won't lose sleep if I don't overclock/overvolt it... and I already tested almost every game that I have... There seems to be nothing wrong with my card...

But the enthusiast folks here... spending US$700-800+ or even over a thousand... may be expecting more, which they're totally entitled to...

But nonetheless, Masked, I'm praising the positive energy that you bring to these GTX 590 issues and to the Club, as we may need it at the moment ~

Peace
_


----------



## cowie

Quote:


> Originally Posted by *soilentblue;12963752*
> http://www.overclock.net/nvidia/750870-gtx-295-dead.html
> 
> here's a dead one on our own site around 6 months ago
> 
> http://www.evga.com/forums/tm.aspx?m=913181
> 
> here's a 295 with the same issue as the 590


if that's what you call release week, then ok

when it comes to talk of the 295s, the only thing blowing was out of some people's you-know-whats....


----------



## soilentblue

here's my proof that I own one by the way.


----------



## Pedros

... You folks remind me of something funny ...

http://www.youtube.com/watch?v=CXl1GkWWGmA

This guy was amazing... The tanks were in Baghdad and he was saying they weren't anywhere near...
It's almost like the GTX 590...

I've been an NVIDIA user for many years now... but we need to clear our thoughts and see what's happening...

The silence from NVIDIA, not saying anything official... either "The card is working properly... this is how we designed it..." or "There's a problem..."... is, at the least, uninspiring...


----------



## sublimejhn

Quote:


> Originally Posted by *cowie;12963903*
> if that's what you call release week, then ok
> 
> when it comes to talk of the 295s, the only thing blowing was out of some people's you-know-whats....


Agreed. ANY card can die at some point. Finding random GTX 295s that have died should be fairly easy, considering how long they have been out at this point. I just swapped my BFG GTX 295 for this GTX 590, and I had NO issues with my old dual-PCB card. I have also NEVER heard of issues with them blowing up, other than 1 or 2 cards dying over time, as is expected with any electronics.

You keep saying the evidence of the same thing happening with the GTX 295s is out there, so we need to look it up. YOU are the one making such a bold claim that nobody here seems to remember. If the GTX 295 had these issues, YOU prove it and show some links, since YOU are making the claims. That's how using facts to back up your statements works. I can tell you right now these issues did not exist with the GTX 295. There were some driver issues related to SLI; that's about it.

While I am hoping that these issues are primarily due to heat, which watercooling limits on my card, there's no doubt the GTX 590s have some serious concerns that are NOT common.


----------



## Special_K

So TigerDirect called me to verify my order, and my furnace has been shipped out.









Check it out.


----------



## Pedros

Sublime, are you also experiencing the "throttle down" of the core speed?


----------



## Pedros

1500 USD in graphics cards...
jeeezzzzz... You have courage, I say...


----------



## OverK1LL

Is anyone else getting terrible FPS in Chess Titans?


----------



## Pedros

... wow... i'm impressed ... that's a really demanding game







eheheheh


----------



## grunion

That happens with many different cards.
I mean, you could find a dozen instances of other cards dying on any forum.

Quote:


> Originally Posted by *soilentblue;12963752*
> http://www.overclock.net/nvidia/750870-gtx-295-dead.html
> 
> here's a dead one on our own site around 6 months ago
> 
> http://www.evga.com/forums/tm.aspx?m=913181
> 
> here's a 295 with the same issue as the 590


Did you notice which 295 version those are? Both were CO-OPs.
The CO-OP was a stripped-down, money-saving version, kind of like the 590 is.


----------



## Capwn

Quote:


> Originally Posted by *Pedros;12964686*
> 1500USD in graphic cards ...
> jeeezzzzz... *You have courage i say ...*


On a OCZ 1000 watt.. I would say so.


----------



## soilentblue

Quote:


> Originally Posted by *grunion;12964845*
> That happens, with many different cards.
> I mean you could find a dozen instances of other card dying at any forum.
> 
> Did you notice which 295 version those are, both were co ops.
> The co op was a stripped down money saving version, kind of like the 590 is.


Does make sense. I didn't notice which version it was. Could be that there were more problems with that version than the other. I'm not an expert on it, I just remember there being problems. Every card has problems at launch when it comes to drivers though.


----------



## iQuit

Last night I was on Google Images trying to find a wallpaper for us GTX 590 owners... oh wait, I am also a photographer... here is my wallpaper, I hope you like it

Wallpaper here; I can't upload it, the photos are too big:
http://nik.bot.nu/view.fu?a=717043


----------



## Farih

Quote:


> Originally Posted by *Masked;12963491*
> Perhaps not here but, EVGA's forums, back when Asus had forums...They were littered with people whom had their 295's blown out within the first 2 weeks.
> 
> Ours blew out and then the replacements blew out...Thus, this office never actually went with the 295's; we stuck with 280's and later the 285's.
> 
> There's definitely info out there on it...It was a big deal for about 2 weeks then forgotten about...This 590 issue, isn't going to be a quick fix, unfortunately.
> 
> The issue is...You're not listening.
> 
> Even if you bench this card, you won't get real results...Drivers haven't matured enough to get a fair assessment so, why would I waste my time?
> 
> Since I'm still at work, here's my GPU-Z of the office PC. I step it up daily to see what I can do under stock voltages until the new drivers come out but, quite frankly, without the new drivers...I haven't even installed vantage at home yet and it won't do anyone, any good because SLI isn't scaling properly.
> 
> When working drivers come out, that actually scale, that don't choke the card...that function WELL...I'll be among the first people to give the vast majority the finger...Until then, it's sheathed.
> 
> Work PC btw:
> 
> 
> It's 24/7 ~ I haven't started tweaking the MC yet b/c I hear that's where people have the most issues but, I have none.


You try overclocking it but don't bench it?
How do you know if it's stable at those clocks? How do you know if it chokes?
Or do you test it with gaming or so?
If it's all done in 2D mode then you can't really call it an overclock.

Same as calling a CPU-Z screeny a valid overclock.... it's not.

I'd really like to know all the ins and outs of this card; one day [prolly soon] a customer is going to ask about it or want me to build one into a new computer they buy from me. I want to give them good advice and not look like a fool.
For now I would tell them to wait it out or get a 6990 [though mostly I'd try to talk them into single cards in SLI or CF]


----------



## ReignsOfPower

After moving from a 4870x2 to a 5970, I eventually caved because of ****ty drivers and fans that did my head in. Bought an EVGA GTX590 Classified card. Leaving it on stock clocks. So comfortable! I love it so far. The hot air it dumps into the case isn't bad at all if you have good case airflow. Also changed a bunch of my fans to Noctuas; got sick and tired of my Scythe crap buzzing at me after about a year. They look like **** with their cream and maroon colour scheme, but I can't deny their excellence. So far so good!

Let's move on from the arguments, everyone. I'm sure if there's a hardware recall over the VRM problems or whatever, we will all benefit from the updated hardware. So stress less and enjoy your equipment! I know I am!

Here is my proof of ownership! Also If anyone needs some benchies on a game....let me know. I have quite a few!


----------



## grunion

Nice

Crysis @ 2560x1600 VH no AA/x4 AA
M2033 @ 1080p VH AAA/tess/no physx/no dof
AVP @ 1080 maxed IQ

Could you do them at default, 680 and 730 if possible?


----------



## ReignsOfPower

Quote:


> Originally Posted by *grunion;12966142*
> Nice
> 
> Crysis @ 2560x1600 VH no AA/x4 AA
> M2033 @ 1080p VH AAA/tess/no physx/no dof
> AVP @ 1080 maxed IQ
> 
> Could you do them at default, 680 and 730 if possible?


Sure I can get to them all soon. I ran a few on stock for the moment as I've got to jet real soon!

3D Mark 11 - P9330
http://3dmark.com/3dm11/948216

3D Mark Vantage - P41053
http://3dmark.com/3dmv/3042854

3D Mark 06 - 28136 (Just for giggles....kinda low actually!)
http://3dmark.com/3dm06/15588547

Metro 2033 - VH / AAA / AF16x / DX 11 / No PhysX / No DoF @ 1920x1080









Metro 2033 - VH - MSAA 4X / AF16x / DX 11 / No PhysX / No DoF @ 2560x1600









[EDIT] Added more...
Unigine 'Heaven' Benchmark 2.5 - 2560x1600 / AF 16x / No AA / Extreme Tessellation / Shaders High / Textures High


----------



## Pedros

Nice







Hope to hear that you're problem free!







36k on the gpu ... that's around the average numbers i saw in both my benched 590's


----------



## ReignsOfPower

Quote:



Originally Posted by *Pedros*


Nice







Hope to hear that you're problem free!







36k on the gpu ... that's around the average numbers i saw in both my benched 590's










Thanks! How come you got rid of your 590s? Didn't like the fact you couldn't overclock them to 580 speeds? That's understandable, I suppose, especially for the price. However, I doubt pushing two 580 TDPs through one cooler would work very well in the first place. Unless you watercool, forget it! I pretty much upgrade to the newest dual GPU every generation. Buying a waterblock every generation would cost much more, and because I sell old to buy new, aftermarket installs would make the card harder to sell. What I do is not the most cost-effective solution, or the best bang for buck... but we're enthusiasts. We don't really care about that. Hence why we have hundreds of posts on a forum such as overclock.net. A 'waste' of money for what we use our computers for (heavy gaming and entertainment) is things like 990x's and stuff just for that higher bench score.

Bring on the Kepler dual GPU! I'm staying Green until AMD fix their vacuum cleaners and polish their drivers. 3870x2 -> 4870x2 -> 5970 - gave them a good three-year chance. Over it.


----------



## Pedros

Reigns ... i had some problems with my card ... and i really didn't like what i saw after inspecting the vrm circuit ... i always try to get cards that have a strong vrm. When testing the Asus, before launch, the drivers were ok ... i was getting almost the same benchmarks as 580 SLIs ... it was between 570 SLI and 580 SLI ... after the release of the first set of updated drivers those numbers went down to 36k ...

Meanwhile i had to return the sample to other reviewers.
Last week i got the chance to bench test a Point of View Ultra Charged ... and my experience wasn't that good. New drivers don't give you the possibility to change voltages ... and i got several lock-ups and freezes ... and erratic performance.

So... i decided not to go dual GPU this generation. My GTX295 was a performer... it was a fun card to have... this one i didn't get that much fun with ... And knowing how the vrm is built, i know the card is already at its limit...

Having 2 fully unlocked GF110s and not having the possibility to use them and have fun with them... it's not in my "notebook" as a "must have" card









So ... now ... i'm working with a Radeon 5850 that i had laying around here ... and still deciding on what to get ... eheheh...

If a 590 version with an upgraded VRM "comes to shore" ... then i'm all into it ... that's why i didn't buy anything right away ... i'm still waiting for an official position from Nvidia about this ... either "The card is working properly ... and what you see is what you get" or "The card is having some problems and we're working on a solution ..." or "Please wait for our partners to launch custom design cards ..."

Of course, i know that, knowing Nvidia, this is a long shot ... and the day when they make an official statement may never come


----------



## delavan

I got rid of mine for the same reasons PEDRO did....

ERRATIC performance... intermittent downclocks... i realise new products have teething problems, but nvidia reproduced the "weak" VRM mistakes they made with the 570s... they have no excuses; they had lots of indications they cheaped out on the 570s, they should've acknowledged it, done something about it and not repeated the same mistakes...

I'm trying to go back to the green team, but they won't let me!!!
I can't afford SLI 580s... so I look towards SLI 570s, but weak VRMs... I look towards SLI 560 Ti SCs, but then it's not that much of an enthusiast's setup anymore... 2 lesser cards... a single 580 for $500??? I might as well re-install my HD5870 and wait for something next-gen...

Or look for a crossfire setup (which I'm afraid of... driver-wise)


----------



## Masked

One day I'm going to go office space on our Apache server, I swear to god...

Anyway, I'm still holding to the fact that, nothing is working properly for this card...

It is erratic, terrible performance, absolutely agree without question...But, there is a history.

My stance is simple and hasn't changed from day 1...Give this card some time; they will hopefully "fix" its issues.

If you could see Alienware's database of 295 RMAs, I could prove my stance without question but, unfortunately, I can't...I can tell you there were over 600 RMAs within the 1st week of OUR releasing it...I wasn't doing our inventory back then but we ordered MAYBE 1500 cards. 900/1500 is NOT reliable...And it was certainly more problem-prone than this release...Same with the 7900...

The same goes for this release...The drivers aren't there, in my opinion, the bios isn't there...Any fix is going to take time.

You're either willing to give it time or, not.

P.S. I overclock and I bench; however, I don't needlessly stress my parts without knowing it's for a cause and is actually going to perform...With the drivers and BIOS the way they are now, why risk my card blowing out? It's not YOUR risk...My work PC is functioning fantastically OC'd at 702; CS5 and Flash just kicked it up to 100% and it's still going. ~ 24/7


----------



## kcuestag

Did Nvidia announce if they're making a new Revision of the GTX590?

My other friend's GTX590 just died with no OC.

I was going to buy a GTX590 for my 2nd RIG mainly for Folding@home 24/7, but I think I may wait for a new revision. Do you think Nvidia will release a better revision with higher-quality VRMs/phases?

I am quite scared tbh, I was about to buy one yesterday


----------



## Arizonian

Quote:



Originally Posted by *kcuestag*


Did Nvidia announce if they're making a new Revision of the GTX590?

My other friend's GTX590 just died with no OC.

I was going to buy a GTX590 for my 2nd RIG mainly for Folding@home 24/7, but I think I may wait for a new revision. Do you think Nvidia will release a better revision with higher-quality VRMs/phases?

I am quite scared tbh, I was about to buy one yesterday










Baloney, what did your friend's card have to do with you buying a GTX 590 only if they come out with a second revision? You were making a backhanded point about the VRM issue. I know this because I read your posts on ATI threads and you've put down Nvidia, and you have no intention of buying one. That was an underhanded troll.


----------



## kcuestag

Quote:



Originally Posted by *Arizonian*


Baloney, what did your friends card have to do with you buying a GTX 590 only if they come out with a second revision? You were making a back handed point about the Vram issue. I know this because I read your posts on ATI threads and you've put down Nvidia and you have no intention on buying one. That was an under handed troll.


No, you're completely wrong about my point.

I'm asking whether Nvidia is going to make a new revision or not.

I was actually about to click the BUY button last night for a POV 590, but before I do so, I would like to know if there's a permanent fix for it?

Trust me, I'm not a fanboy of any brand; I've also got a GTX570 and GTX285 in my other rigs, soon to be 2x 570s in my 2nd rig as well (although I don't know how well it will do on a 1055T; might need to grab an i5 2500K hehe).

Anyways, I'm sorry if I bothered you, it was not by any means my intention.

I definitely want a GTX590 as soon as I am guaranteed I will be safe with it, trust me, I'm in love with that card on a single pcb.


----------



## Arizonian

Quote:



Originally Posted by *kcuestag*


No, you're completely wrong about my point.

I'm asking wether Nvidia is going to make a new revision or not.

I was actually about to click the BUY button last night for a POV 590, but before I do so, I would like to know if there's a permanent fix for it?

Trust me, I'm not a fanboy of any mark, I've also got a GTX570 and GTX285 in my other rigs, soon to be x2 570 in my 2nd rig as well (Although I don't know how good it will do on a 1055T, might need to grab an i5 2500k hehe).

Anyways, I'm sorry if I bothered you, it was not by any means my intention.

I definitely want a GTX590 as soon as I am guaranteed I will be safe with it, trust me, I'm in love with that card on a single pcb.


Ok, time will tell. Sorry to have called you out if your intentions were true.


----------



## kcuestag

Quote:



Originally Posted by *Arizonian*


Ok, time will tell. Sorry to have called you out if your intentions were true.










No problem, I may have used the wrong words; English is not my main language, so many times I say X when I am trying to explain Y.

If you had to choose a 590 brand for me, which would it be?

I heard EVGA has a great Warranty support service, and a program called step up.

Also heard PoV is good too, which one should I pick? I honestly don't mind the stickers at all, I'm probably removing them as soon as I buy it, like i did on my 6970's.

Again, I'm sorry for the confusion, wasn't my intention.


----------



## Masked

Quote:



Originally Posted by *kcuestag*


No problem, I may have entered the wrong words, English is not my main language, so many times I can tell X when I am trying to explain Y.

If you had to choose a 590 brand for me, which would it be?

I heard EVGA has a great Warranty support service, and a program called step up.

Also heard PoV is good too, which one should I pick? I honestly don't mind the stickers at all, I'm probably removing them as soon as I buy it, like i did on my 6970's.

Again, I'm sorry for the confusion, wasn't my intention.


You will likely not be able to step up from the 590s in any form/fashion...You're also not allowed to step TO this card in any form/fashion. HOWEVER, EVGA has a very good reputation AND a lifetime warranty...They assist at every turn and actually have extremely good service. I would strongly suggest EVGA.

And on the 2nd revision, I've heard no...This was a limited release; they seem to be standing by it.


----------



## soilentblue

anyone running 2560x1440/1600 able to run crysis 1 fullscreen without a cursor bug?

could just be the two games i've tried but experience so far has been dreadful.


----------



## kcuestag

Quote:



Originally Posted by *Masked*


You will not likely be able to step up from the 590's in any form/fashion...You're also not allowed to step TO this card in any form/fashion HOWEVER, Evga has a very good reputation AND a lifetime warranty...They assist at every turn and actually have extremely good service. I would strongly suggest EVGA.

And on the 2nd revision, I've heard no...This was a limited release; they seem to be standing by it.


Thanks, I think EVGA will be my choice then


----------



## DarthShader

Quote:


> Originally Posted by *Masked;12963491*
> Even if you bench this card, you won't get real results...Drivers haven't matured enough to get a fair assessment so, why would I waste my time?


Maybe to document the driver/BIOS improvements over time? Or to point out and convince others it's indeed a driver issue?

I wonder if any reviewers had these driver issues; I didn't see anybody complain.
Quote:


> Originally Posted by *ReignsOfPower;12970034*
> Bring on Kepler dual GPU! I'm staying Green until AMD fix their vaccume cleaners and polish their drivers. 3870x2 -> 4870x2 -> 5970 - gave them a good three year chance. Over it.


That's a bit ironic you changed to a GTX590, plagued with bad drivers in Masked's opinion. More interesting is, you don't seem to have/notice any issues with it, do you?


----------



## kcuestag

Quote:


> Originally Posted by *DarthShader;12973722*
> 
> That's a bit ironic you changed to a GTX590, plagued with bad drivers in Masked's opinion. More interesting is, you don't seem to have/notice any issues with it, do you?


I also find it a bit ironic considering how good the HD5970 was, and still is.

I had it for a good 9-12 months before I moved to 2x HD6970, and I never had any driver problems, nor noise problems, even with the stock cooler









Although I must say it would be nice if they started using the same type of fans as the Nvidia cards, which are a bit quieter (one of my Sapphire 6970s is non-reference and has one of those fans like the Nvidia cards, and I must agree they're a lot quieter ^^).

Anyways, I am thinking about ordering the EVGA GTX590 on Monday for my 2nd rig


----------



## Masked

Quote:


> Originally Posted by *DarthShader;12973722*
> Maybe to document the driver/bios improvments over time? Or to point out and convince others it's indeed a driver issue?
> 
> Wonder if any reviewers had these driver issues, didn't see anybody complain.
> 
> That's a bit ironic you changed to a GTX590, plagued with bad drivers in Masked's opinion. More interesting is, you don't seem to have/notice any issues with it, do you?


So, "you'll" sit here claiming your card is throttling, even though it's not performing or scaling correctly and...That's not a driver issue at all, obviously, right?


----------



## kcuestag

Yeah, my friend's 590 didn't throttle either, there must be something going on in *DarthShader*'s GPU.


----------



## RagingCain

Sent from my DROID2


----------



## RagingCain

There could be a million reasons for issues. I am not siding with Masked specifically, but he is probably correct that driver scaling problems are the reason for the benchmark scores.

My loop is rebuilt, cards installed... just troubleshooting flow... hoping juggalo will give a fellow ex-sailor some help.

I will bench the hell out of the cards when I am fully up and running.

Here are four main culprits for invisible (not registered by any sensor) performance drops:

GDDR5 error correction - triggered by voltages to the GPU or memory that are too low or too high. Primary symptom: stutter and/or low scores at higher clocks. Overclock too high, even with max voltage, and your scores will dip. Proven a gillion times, especially by HWBot benchers. I have first-hand experience: as of last week (last time I checked) I was 20th in the world for certified 3DMark11 Extreme scores, and it was actually my 3rd-highest overclock that produced the best numbers, not the highest or second highest.

Power draw issues - the power supply has voltage ripples big enough to trigger microstutter and lower-than-anticipated voltages, but not enough to crash or register on a voltmeter (it happens too quickly).

No extra PCI-Express juice via molex or PCI-Express boosters. Usually only an issue for 3 or 4 cards, but this puppy draws the maximum PCI-Express power at the single-slot limit, so this may be an issue.

Of course, drivers. They can cause erratic performance issues, artifacts, microstutter, instability, and SLI issues. Dual-GPU cards of all kinds/manufacturers are known to perform better after drivers mature.

Those are logical explanations for everything you boys might be seeing.

Sent from my DROID2
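RagingCain's EDC point above can be sketched with a toy model: raw memory-clock gains get eaten by retried transfers once the clock passes its stable limit, so the best effective throughput lands at a mid-level overclock. The curve shape and every number here are illustrative assumptions, not measured GTX 590 behavior.

```python
# Toy model of GDDR5 error correction (EDC) masking instability:
# past some clock, retried transfers eat the raw-clock gains, so the
# best effective throughput comes from a mid-level overclock.
# All figures are made up for illustration.

def effective_throughput(mem_clock_mhz, stable_limit_mhz=900):
    """Raw clock scaled down by a retry rate that grows steeply
    once the clock passes the (assumed) stable limit."""
    overshoot = max(0.0, mem_clock_mhz - stable_limit_mhz)
    retry_rate = min(0.9, (overshoot / 100.0) ** 2 * 0.5)
    return mem_clock_mhz * (1.0 - retry_rate)

if __name__ == "__main__":
    for clk in (850, 900, 950, 1000, 1050):
        print(clk, round(effective_throughput(clk), 1))
```

In this model the 900 MHz setting beats every higher clock, which is the same pattern as a "3rd-highest overclock producing the best numbers".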


----------



## DarthShader

Quote:


> Originally Posted by *Masked;12973813*
> So, "you'll" sit here claiming your card is throttling, even though it's not performing or scaling correctly and...That's not a driver issue at all, obviously, right?


No, throttling is not a driver issue; it's how it's supposed to work! A driver issue would be uniform, i.e. regardless of the clocks you run it would still be bad, and getting worse at lower clocks. A CPU bottleneck would keep the results constant regardless of GPU clocks. And here it actually gets better as you back off the overclock, as reported by Pedros and proven by rush2049. Combined with the fact that there is such a mechanism built into the card, to throttle when things get dicey, I don't understand why there's so much denial about it.

It's actually the best thing the card can do now. But if you want to keep claiming they didn't get the drivers and BIOS right, after working on the card for over two years and it being little more than SLI on one PCB, then you make the nV driver team look like clueless baboons. Which they are not, nor is that your intention, is it? The fault lies in the hardware, and the software team can only do so much with it.

However, if you can show me such driver-induced behavior on a card without any OCP circuits, I'll be happy to learn something new and look into it.
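The diagnostic logic in the post above (flat scores = CPU bottleneck; scores that get worse as you push clocks = throttling) can be sketched roughly. The thresholds and sample data are made up for illustration; a real diagnosis would need far more care.

```python
# Rough sketch of the bottleneck-vs-throttling argument: given benchmark
# scores at several core clocks, guess which failure mode fits.
# Thresholds are arbitrary assumptions.

def classify(samples):
    """samples: list of (core_clock_mhz, score), sorted by clock."""
    scores = [s for _, s in samples]
    spread = (max(scores) - min(scores)) / max(scores)
    if spread < 0.03:
        return "cpu-bottleneck"   # score constant regardless of GPU clock
    if scores[-1] < scores[0]:
        return "throttling"       # raising clocks makes results worse
    return "normal-scaling"       # score rises with clock

if __name__ == "__main__":
    print(classify([(600, 9000), (650, 9050), (700, 9020)]))  # flat scores
    print(classify([(600, 9000), (650, 9400), (700, 8200)]))  # worse when pushed
```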








Quote:


> Originally Posted by *kcuestag;12973854*
> Yeah, my friend's 590 didn't throttle either, there must be something going on in *DarthShader*'s GPU.


waitwaitwait, I don't have the card.









RagingCain,

good point about the GDDR5, but I don't think people clocked it very high, did they? It's set very low at stock too. Good luck with your tests; be careful though, proving someone wrong on teh internets isn't worth the stress sometimes...


----------



## kcuestag

Oh, I thought you had a 590 which was throttling.

My bad then, sorry!


----------



## Masked

Quote:


> Originally Posted by *DarthShader;12975006*
> No, throttling is not a driver issue, it's how it's supposed to work! A driver issue would be uniform ie. regardless of the clocks you run, it would be still bad and getting worse with lower clock. A CPU bottleneck would keep the results constant regardless of GPU clocks. And here it is actually becoming better as you back off with the overclocking, as reported by Pedros and proven by rush2049. Combined with the fact, that there is such mechanism built into the card, to throttle when things get dicey, I don't understand why so much denial about it?
> 
> It's actualy the best thing the card can do now. But if you want to continue to claim they didn't get the drivers and bios right, after working on the card for over two years and it being little more than SLI on one pcb, then you make nV driver team look like clueless baboons. Which they are not and neither that's your intention, is it? The fault lies in the hardware and the software team can only do so much with it.
> 
> However, if you can show me such driver induced behavior on a card without any OCP ciruits, I'll be happy to learn something new and look into it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> waitwaitwait, I don't have the card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> RagingCain,
> 
> good point about the GDDR5, but I don't think people clocked it very high, did they? It's set very low for stock too. Good luck with your tests, be careful though, proving someone wrong on teh internets isn't worth the stress sometimes...


After writing my own drivers for several years...I know it's a driver issue...I've been in contact with Nvidia about these issues.

In Adobe I'm only using a single core...In comparison to hard SLI where I utilize both...

That alone proves to me, absolutely, that it's not scaling in SLI properly; it's not actually even performing properly...

I have one at home at 850mhz and one at work at 725mhz (right now, stable), and all I see is horrible performance at higher clocks because the card isn't scaling properly.

What I don't understand is why everyone claims it's throttling when it's only been proven to be utilizing 40%...That means it's not scaling.

Until it actually scales and performs properly; the entire argument is null.

I've promoted patience through this entire conversation and I will continue to promote that.


----------



## rush2049

Well, sorry I haven't been active on here the last two days..... I got a new case, the Corsair 800D, and I wired up my psu differently than I did in the past..... and my system isn't posting anymore..... probably put too many amps on a single rail..... I can't figure out how to get it to post.....

Dang card draws too much power.... lol..... hope I didn't wreck my psu for good; hopefully if I can get the rails figured out, that fixes it.


----------



## RagingCain

@D
The GDDR5 error checking can be triggered by either GPU or memory speeds. It's similar to an error in a P95 Blend test, which can be caused by CPU, memory, or IMC voltages being too low/high. Which is why you test the CPU first with Small FFTs, then the Blend test.

In this case, no offense intended, but all of you guys seem like you aren't testing for stability with any form of GPU testing in OCCT or an artifact scanner. Just increasing GPU clocks, maybe a nudge on voltage, and running Vantage, Heaven, or 3DMark 11.

I don't believe what any of you are seeing is throttling, especially with no drops in voltage, no drop in clock speeds, nor anything else measurable decreasing. The only thing we see is low GPU usage, and I have seen this on my 580s, which have primitive OCP; I have also seen it while OCP is disabled, and it made no difference. Take Dragon Age 2, for example: it had about 50-60% GPU usage, and with drivers two weeks later, 90% on both. That being said, performance could be purposely held back, as I suspect it sometimes is for all Fermis.

My God, the cards are cursed: the fan died on the PSU and now I have 0 water flow.

Spliced a brand new Noiseblocker in there and OMG is it quiet. An inch from my ear, I can't hear it blowing. So impressed.

The new Typhoons are phenomenal too! I can feel them from 3 ft away and they're barely above a whisper. My pump is now the loudest thing in my computer.

Be back in a bit. Tearing water loop apart.... again. Sigh....

Sent from my DROID2
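The distinction RagingCain draws above, that real throttling shows up as clocks dropping under load while poor SLI scaling shows steady clocks with low usage, could be checked over logged samples along these lines. The 607 MHz figure is the GTX 590 reference core clock; the log format and thresholds are assumptions, not any real monitoring tool's output.

```python
# Sketch of the throttling-vs-scaling check: throttling = core clock
# visibly backed off under load; poor scaling = clocks held, GPU idle.
# Log format and cutoffs are assumed for illustration.

def diagnose(samples, rated_clock=607):
    """samples: list of (core_clock_mhz, gpu_usage_pct) taken under load."""
    min_clock = min(c for c, _ in samples)
    avg_usage = sum(u for _, u in samples) / len(samples)
    if min_clock < rated_clock * 0.95:
        return "throttling"     # clock dropped under load
    if avg_usage < 60:
        return "poor-scaling"   # clocks held, but the GPU sits idle
    return "healthy"

if __name__ == "__main__":
    print(diagnose([(607, 40), (607, 45)]))   # steady clock, ~40% usage
    print(diagnose([(405, 99), (607, 99)]))   # clock fell mid-run
```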


----------



## ReignsOfPower

Quote:


> Originally Posted by *DarthShader;12973722*
> That's a bit ironic you changed to a GTX590, plagued with bad drivers in Masked's opinion. More interesting is, you don't seem to have/notice any issues with it, do you?


I hate to break this to you, but who in the world is 'Masked'? Secondly, if I don't have problems, I don't have problems. Don't know how else I can say it.
Quote:


> Originally Posted by *kcuestag*
> I also find it a bit ironic considering how good the HD5970 was, and still is.


Sure was, sure is. Didn't like the audible cooling levels. Didn't like the bugs I experienced with the drivers. Sometimes you would find a driver release was great, sometimes they were buggy. My personal experiences ranged from Cat 10.1 - > 11.4p with everything in between (and including Von Modded drivers).

This is a GTX590 owners club. That means talking about ways we can tweak/improve performance and posting our performance results and findings. If you're all going to sit there endlessly trying to win an internet battle over the throttling issues of some and the VRM 'fireworks' of others, make another thread. "GTX590 Debate(s)" sounds good to me. It's been discussed; the problem exists for some, and doesn't for others.

I'm going to post some benchies in Crysis and AvP later this morning.




http://www.youtube.com/watch?v=Kq40MqiPkJQ

[EDIT] - Ah Masked is a user here. Didn't realise. No offence!


----------



## DIABLOS

In the twelve years I've been interested in hardware, I for one have never heard of a poorly optimised driver making cards perform worse when overclocked...


----------



## Arizonian

Quote:


> Originally Posted by *ReignsOfPower;12975995*
> I hate to break this to you, but who in the world is 'masked'? Secondly, If I don't have problems, I don't have problems. Don't know how else I can say it.
> 
> Sure was, sure is. Didn't like the audible cooling levels. Didn't like the bugs I experienced with the drivers. Sometimes you would find a driver release was great, sometimes they were buggy. My personal experiences ranged from Cat 10.1 - > 11.4p with everything in between (and including Von Modded drivers).
> 
> This is a GTX590 owners club. That means talking about ways we can tweak/improve performance and post our performance results and findings. If you're all going to sit there endlessly trying to win a internet battle over throttling issues of some and the vrm 'fireworks' of others, make another thread. "GTX590 Debate(s)" sounds good to me. It's been discussed, the problem exists for some, and doesn't for others.
> 
> I'm going to post some benchies in Crysis and AvP later this morning.
> 
> 
> 
> 
> 
> [EDIT] - Ah Masked is a user here. Didn't realise. No offence!


This x2 - funny how the ATI owners came over to invade the GTX 590 club thread to post propaganda. Like those of us who spent this much on one video card don't know about computers or video cards.


----------



## Pedros

Arizonian... some don't know at all


----------



## RagingCain

Alright! One card is dead.

I am never water cooling another computer for the rest of my life...

Sent from my DROID2


----------



## Pedros

You are kidding right Raging? :x


----------



## RagingCain

Quote:


> Originally Posted by *Pedros;12976473*
> You are kidding right Raging? :x


Sigh... sadly no.

90% sure it will RMA fine, I didn't do anything technically wrong.

9% sure it will heal with time + evaporation if that is the reason.

2% sure a blow dryer may help.

1% sure it was just DOA.

0.00001% sure it's the power supply.


----------



## Pedros

Jeez :x good luck with that mate :x That's why i use those quick-disconnect connectors where the water is retained when you unplug ... i'm always changing parts to test under water ... so it's the best ... you just have to start the pumps for the system to fill again and you're ready to go


----------



## Masked

Quote:


> Originally Posted by *DIABLOS;12976144*
> I for one in the twelve years i've been interested in hardware have never heard of a poorly optimised driver making cards perform worse when overclocked...


...That's not true at all.

The 7900s ran SO BADLY in benches and had so many problems that the 7950 was made to remedy them...

I'd love an explanation, though, as to why my card is running 90% on 1 core and 0% on the other...Perhaps it's throttling back the 2nd core?!?!?!?


----------



## RagingCain

Quote:


> Originally Posted by *Pedros;12977239*
> Jeez :x good luck with that mate :x That's why i use those connectors that you can just unplug but the water is retained ... i'm always changing parts to test under water ... so it's the best ... you just have to start the pumps for the system to be filled again and you're ready to go


Got off the phone with a good ole Southern Boy named Jim who works for EVGA.

That alone scared me, until he started pulling up the tech specs on my specific power supply. I was pleasantly surprised by a knowledgeable tech support guy. I am definitely testing the second card by itself, but it looks like it's the power supply.

Cards might be pulling more amperage than my rails can provide.

EVGA steady blinking light = power supply error/issue.
^ Like a safe mode, preventing full start up of the GPU.

Wet/dead/damaged GPU = no blinking light period.


----------



## remer

There is no way 1200W isn't enough. How many rails does that thing have? Maybe switch it to another rail?


----------



## Masked

Quote:


> Originally Posted by *remer;12977677*
> There is no way 1200W isn't enough. How many rails does that thing have? Maybe switch it to another rail?


Ultras are known for either having frequent issues or being amazing...It's obvious he got the issues


----------



## remer

Edit: I would think one 85-amp 12V rail should be plenty, but I'm no PSU expert.
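For a rough sanity check of that estimate: NVIDIA rates the GTX 590 at 365 W TDP, so two of them plus a guessed 250 W for CPU and the rest of the system sit uncomfortably close to an 85 A 12 V rail's 1020 W. The non-GPU figure is an assumption, and transient spikes can exceed TDP.

```python
# Quick 12 V budget check behind the "85 A should be plenty" estimate.
# GPU TDP is NVIDIA's published 365 W per GTX 590; the rest-of-system
# draw is a rough assumption, and spikes can exceed TDP.

RAIL_VOLTS = 12.0
RAIL_AMPS = 85.0

def headroom_watts(gpu_count, gpu_tdp=365.0, rest_of_system=250.0):
    capacity = RAIL_VOLTS * RAIL_AMPS             # 1020 W on the 12 V rail
    draw = gpu_count * gpu_tdp + rest_of_system   # GPUs plus assumed system load
    return capacity - draw

if __name__ == "__main__":
    print(headroom_watts(2))   # 1020 - (730 + 250) = 40 W of headroom
```

40 W of headroom at sustained TDP is thin enough that a ripple or spike could plausibly trip a protection circuit, which fits the blinking-light symptom described earlier.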


----------



## RagingCain

Well, I am going through the tech specs of my ULTRA X4 rather than reinstalling the loop, as it's easier. Judging by everything I have read, it's looking within spec for 2x GTX 590s.

Here is real world data:
http://www.plugloadsolutions.com/psu_reports/ULTRA%20PRODUCTS_X4%201200W_ECOS%201613_1200W_Report.pdf

Anybody see something I am missing?

I have had TriFire 5870s and 2x 580s on this system. No issues so far, save for the fan dying earlier today. Just gave it a hefty upgrade with the Noiseblocker.


----------



## Masked

Ultras occasionally just die... Some rails stop working... Like I said, either you get an amazing one or a dud.

Bought 12 for our office PCs b/c I swore by them... Only 2 are left 1.5 years later...

If it doesn't seem to be working or amping properly, chances are, it's not...Yours could have just randomly died.


----------



## remer

Maybe it's an onboard problem with the power, like the VRMs or mosfets or something.


----------



## remer

Quote:


> Originally Posted by *Masked;12977781*
> Ultra's occasionally just die...Some rails stop working...Like I said, either you get an amazing one or a dud.
> 
> Bought 12 for our office PC's b/c I swore by them...Only 2 are left 1.5 years later...
> 
> If it doesn't seem to be working or amping properly, chances are, it's not...Yours could have just randomly died.


Wow, that sucks. Maybe a Corsair AX1200 will fix the problem.


----------



## RagingCain

Despite not trusting Ultra products in general, this thing has been a quiet, non-whining beast putting up with my crap for over 6 months. I wouldn't have noticed if the fan had died, it's so quiet.

I will verify the card is good by itself before I look into another PSU.

I think a part of me is hoping it's not the card. That, and I just voided my PSU warranty by repairing/replacing the fan 2 or 3 hours ago.


----------



## soilentblue

Quote:


> Originally Posted by *RagingCain;12977619*
> Got off the phone with a good ole Southern Boy named Jim who works for EVGA.
> 
> That alone scared me, until he started pulling up the tech specs on my specific power supply. Pleasantly surprised by knowledgeable tech support guy. I am definitely testing the second card by itself, but it looks like its the power supply.
> 
> Cards might be pulling more amperage than my rails can provide.
> 
> EVGA steady blinking light = power supply error/issue.
> ^ Like a safe mode, preventing full start up of the GPU.
> 
> Wet/dead/damaged GPU = no blinking light period.


southern people can't be knowledgeable in tech?


----------



## grunion

Quote:


> Originally Posted by *RagingCain;12977842*
> Despite not trusting Ultra products in general, this thing has been a quiet, non-whining beast putting up with my crap for over 6 months. I wouldn't have noticed if the fan had died, it's so quiet.
> 
> I will verify the card is good by itself before I look into another PSU.
> 
> I think a part of me is hoping it's not the card. That, and I just voided my PSU warranty by repairing/replacing the fan 2 or 3 hours ago.


Ultra is superb at warranty replacement; 9 times out of 10 they don't even want the old unit back.

Just tell them it's not playing nicely with your Fermi cards; it'll likely be no questions asked.


----------



## soilentblue

Quote:


> Originally Posted by *Masked;12977335*
> ...That's not true at all.
> 
> The 7900's ran SO BADLY in benches and had so many problems the 7950 was made to remedy them...
> 
> I'd love an explanation, though, as to why my card is running 90% on 1 core and 0% on the other...Perhaps it's throttling back the 2nd core?!?!?!?


i'm not seeing 0 but it's definitely not scaling correctly at all. 95% and 28% in crysis for me


----------



## grunion

^^Will SLI work in windowed mode?


----------



## remer

I think crysis 2 is having sli issues. Try setting the "Multi-GPU render mode" to "force alternate frame rendering 1" in the nv control panel if you haven't already.

Edit: I see you are talking about the original Crysis. It still might not automatically do SLI in windowed mode, or it might flicker like crazy.


----------



## soilentblue

Quote:


> Originally Posted by *grunion;12978354*
> ^^Will sli work in windowed mode?


it is working now but it's just not scaling well.


----------



## RagingCain

Alrighty, looks like card 2 was DOA

No fear. I swear to God, I actually felt better getting an RMA approved from EVGA than I did opening the cards.

I had no prior experience with EVGA tech support, which is also a positive note, but after talking to them... I felt better as a person. Best customer service ever.

I had a small complaint (a chipped gold pin on the Express pins) and the guy said to me, "well sir, you paid enough money to have a quality product without any issues," then proceeded to finish RMA #1, and told me to send some pictures and he would confirm the RMA of card 2, despite it being able to post. That's as far as I got with the cards today.

I am now an EVGA customer for life.
Quote:


> Originally Posted by *soilentblue;12977933*
> southern people can't be knowledgeable in tech?


Doesn't everyone know this? j/k

No, I am not a big fan of tech support usually. I generally have to get to tier 3 before I stop confusing them with computer words.


----------



## Special_K

So my cards will be here on Monday, which gives me 4 days before a LAN party to work out a custom intake/exhaust for the cards, so they don't draw air from inside the case and don't vent back into it. I might post pics of what I did before I leave. It will be a lot of plexiglass work though.


----------



## soilentblue

i will say this though. if I play crysis on the nvidia autofan setting then I can play for like 30 minutes and then it blue screen crashes. With a custom profile for the fan it hasn't crashed yet and i've been playing for over 2 hours stressing the card.

after the computer would restart i've seen the card between 68°-79°C. it can get hot quick and i don't think it likes it up there, even though nvidia said the thermal threshold was up to the mid 90s or something around there.


----------



## soilentblue

i wonder how it performs


----------



## Juggalo23451

Quote:



Originally Posted by *soilentblue*

i wonder how it performs


Won't be available till April 11 I think.
I wanna see the flow design of the card


----------



## rush2049

so yeah, unlike RagingCain..... my psu is not up to the challenge of running this beast. I had it on three rails in my old case. But I just purchased an 800D and moved all my stuff over...

well, to make a long story short, I did not wire it correctly, and when I pushed the on button it destroyed my psu..... pulling 50A for the card on one rail ain't good when it's rated for 25A per rail.

So I have a Corsair AX1200 coming in the mail, should be here Tuesday.... till then I am confined to this laptop, staring at the beautiful case and the 590 sitting there laughing at me.
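The failure mode above is just Ohm's-law arithmetic: a 590 pulling on the order of 600W is ~50A at 12V, double a 25A per-rail rating. A rough sketch of the check (the wattage figure and per-rail limit are the numbers from this post, not a spec):

```python
import math

# Rough per-rail sanity check, using the figures reported above: an
# overclocked GTX 590 pulling on the order of 600W is ~50A at 12V, while
# each rail on this (hypothetical multi-rail) PSU is rated for 25A.
CARD_WATTS = 600.0     # worst-case draw reported in this thread
RAIL_VOLTS = 12.0
RAIL_AMP_LIMIT = 25.0  # per-rail rating quoted above

def amps_drawn(watts: float, volts: float = RAIL_VOLTS) -> float:
    """Current drawn at a given wattage (I = P / V)."""
    return watts / volts

def rails_needed(watts: float, amp_limit: float = RAIL_AMP_LIMIT) -> int:
    """Minimum number of rails to stay within the per-rail rating."""
    return math.ceil(amps_drawn(watts) / amp_limit)

print(amps_drawn(CARD_WATTS))   # 50.0 A total
print(rails_needed(CARD_WATTS)) # 2 rails minimum at 25 A each
```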


----------



## soilentblue

Quote:



Originally Posted by *Juggalo23451*


Won't be available till April 11 I think.
I wanna see the flow design of the card


as do i. I'm curious to see whether the fins for the gpu are set into the block or raised, and also whether there are designs made for better vrm cooling than the manufacturers who had cards on launch day. i would assume those who didn't get a card till late are going to be the ones to implement some type of vrm cooling specifically for this card, but that could also be wishful thinking.

Quote:



Originally Posted by *rush2049*


so yeah, unlike ragincain..... my psu is not up to the challenge of running this beast. I had it on three rails in my old case. But I just purchased an 800D and moved all my stuff over...

well to make a long story short I did not wire it correctly, and when I pushed the on button it destroyed my psu..... pulling 50A for the card on one rail ain't good when its rated for 25A single rail.

So I have a corsair ax1200 coming in the mail, should be here tuesday.... till then I am confined to this laptop and staring at the beautiful case and the 590 sitting there laughing at me.


sorry to hear that rush. at least it's a hell of a psu you are getting though.


----------



## Juggalo23451

Quote:


> Originally Posted by *rush2049;12980370*
> so yeah, unlike ragincain..... my psu is not up to the challenge of running this beast. I had it on three rails in my old case. But I just purchased an 800D and moved all my stuff over...
> 
> well to make a long story short I did not wire it correctly, and when I pushed the on button it destroyed my psu..... pulling 50A for the card on one rail ain't good when its rated for 25A single rail.
> 
> So I have a corsair ax1200 coming in the mail, should be here tuesday.... till then I am confined to this laptop and staring at the beautiful case and the 590 sitting there laughing at me.


That bites bro, I ordered the SilverStone 1500 watt psu, just to be safe. Overkill, yes, but this is OCN, there is no such thing lol.
http://www.newegg.com/Product/Product.aspx?Item=N82E16817256054


----------



## Alatar

Quote:


> Originally Posted by *rush2049;12980370*
> So I have a corsair ax1200 coming in the mail, should be here tuesday.... till then I am confined to this laptop and staring at the beautiful case and the 590 sitting there laughing at me.


Heh I have an AX1200 coming too...

I was getting clean shutdowns in the most intensive gameplay moments in BC2, 3DMark etc. with both cores enabled.

Decided to do some testing, and since I'm not an electrical engineer or anything, I asked my granddad to bring some equipment. We tested the wattage my rig pulls, etc. and came to the conclusion that the first 12V rail of my PSU, which handles the mobo, processor, and the GFX card (well, the stuff it pulls from the PCI-E slot), isn't up to the task. It can take the card with a stock processor, but when I overclock to 3.8GHz+ the problems start.


----------



## sl00tje

I have had contact with member ruderthanyou and our stock voltages are not the same. What are your stock voltages? Mine are (EVGA GeForce GTX 590 Classified):

MSI Afterburner 2.2.0 Beta 2 (download):

GPU1: 912
GPU2: 913

GPU-Z v0.5.2 (download):

GPU1: 0.9120 V
GPU2: 0.9130 V

EVGA OC Scanner 1.6.1 (download):

GPU1: 0.912V
GPU2: 0.913V


----------



## Alatar

according to GPU-Z v0.5.2 it's 0.9250V (and drops to 0.8888 when there is no load).

E: afterburner beta also shows a 925mV reading


----------



## sublimejhn

Mine- .912 under load
.875 idle

I can't believe how many people are having PSU issues. Now you all have me worried. I still have the same old Coolermaster PSU that I've had for a good 5 years now, and it seems to be doing just fine. It wasn't even a great PSU to begin with!! The thing has six rails, which is a bit odd


----------



## Alatar

I seem to have a bit higher voltage :[


----------



## sl00tje

Quote:


> Originally Posted by *Alatar;12981172*
> according to GPU-Z v0.5.2 it's 0.9250V


Quote:


> Originally Posted by *sublimejhn;12981195*
> Mine- .912 under load


For both GPU1 and GPU2 the same?


----------



## sublimejhn

Quote:



Originally Posted by *sl00tje*


For both GPU1 and GPU2 the same?


Sorry, not the exact same, but close. GPU2- .913 under load. Same at idle


----------



## Alatar

Quote:



Originally Posted by *sl00tje*


For both GPU1 and GPU2 the same?


yes. both GPUs.

And correction to the idle voltage, it's 0.875


----------



## whosjohnny

Here is the proof for my EVGA GTX 590 on EVGA's STOCK VOLTAGES! GPU1: 0.938V, GPU2: 0.925V =D
EVGA GTX 590 [email protected], 2000Memory
EVGA GTX 590 [email protected], 2000Memory
PC Power&Cooling 750W continuous, 830W peak, rated @82% efficiency.
I have NO HARD DRIVE inside the Antec 900 case to maximize airflow and minimize ambient temp.
I reversed the TOP FRONT FAN to suck the HEAT directly out from the GPU1 exhaust (located at the end of the card, toward the front of the case). I have a 120mm fan that draws cool air from outside the case directly on top of the GTX 590. The only things inside are the EVGA 790i ULTRA SLI mobo, a USB3 card, a Q9450 @3.2GHz, 4GB DDR3 1600MHz 7-6-6-24-2T, and 1 DVD-ROM.

http://johnten.com/images/EvgaGTX590..._AirCooled.jpg
http://johnten.com/images/OC775_2000...oArtifacts.jpg
http://johnten.com/images/OC775_2000...oArtifacts.jpg
http://johnten.com/images/OC800_2000_ZeroArtifacts.jpg


----------



## sl00tje

Ok, here's the list (so far):

Tips and ideas are welcome!


----------



## sl00tje

Quote:



Originally Posted by *whosjohnny*


Here is the proof for my EVGA GTX 590 on EVGA's STOCK VOLTAGES! (0.938V)


I see different voltages in your pictures. What are your stock voltages for GPU1 and 2?


----------



## whosjohnny

GPU1 Stock Voltage = 0.938V
GPU2 Stock Voltage = 0.925V

If you study the card, GPU2 has less ventilation than GPU1 (located toward the front of the case); GPU2 is constricted by the DVI connectors toward the back of the case.

Just my guess why EVGA did that.


----------



## sublimejhn

Some of these OCs seem crazy. I tried running mine at 680 core the other day and it crashed during a 3DMark 11 run :/ I guess I have plenty of room to increase voltage, but I would like to avoid that for now until a bit more is known about the issues the cards are having


----------



## sl00tje

List updated:


----------



## Alatar

Quote:



Originally Posted by *sl00tje*


List updated:


hmm, I guess I could do that Google Docs spreadsheet so you wouldn't have to post pics every time heh


----------



## whosjohnny

Hey Sublime... Just my guess: I'm currently running an 800MHz core stable on stock 0.938V/0.925V, air cooled... What is your power supply and the ambient temperature in the room where your case is located?

I have 2 theories.
1. Power supply. As an electrical engineer, I understand that electronics last longer with a more continuous, clean flow of current under load and spikes. That requires both a good power supply and a good motherboard that is not skimpy on capacitors. My $350 EVGA mobo is designed for that and is powered by PC Power&Cooling, one of the best power supply companies out there. I even received a technical printout of the actual factory test when I got my power supply.

2. Ambient temperature. Although this is just speculation, I believe a 70 degree room vs. an 85 degree room makes a difference in how much you can overclock stable, due to the transfer of heat to the GPUs themselves. Higher clock, higher temperature.


----------



## sublimejhn

Quote:



Originally Posted by *whosjohnny*


Hey Sublime... Just my guess: I'm currently running an 800MHz core stable on stock 0.938V/0.925V, air cooled... What is your power supply and the ambient temperature in the room where your case is located?

I have 2 theories.
1. Power supply. As an electrical engineer, I understand that electronics last longer with a more continuous, clean flow of current under load and spikes. That requires both a good power supply and a good motherboard that is not skimpy on capacitors. My $350 EVGA mobo is designed for that and is powered by PC Power&Cooling, one of the best power supply companies out there. I even received a technical printout of the actual factory test when I got my power supply.

2. Ambient temperature. Although this is just speculation, I believe a 70 degree room vs. an 85 degree room makes a difference in how much you can overclock stable, due to the transfer of heat to the GPUs themselves. Higher clock, higher temperature.


Well, my PSU is definitely outdated; I've had it for around 5 years. It's a Coolermaster Real Power 850W. I don't think it's the problem though, as I have no instability issues whatsoever with my system; this specific card just doesn't OC well at stock voltage.

My ambient is warm for sure; I live in AZ and the last couple of days have been hot. The thermometer is reading 78 degrees, but that's in the hallway and it's quite a bit warmer in my computer room. Either way, I'm on water and temps never even get above 50 degrees.

I will probably just need to give my card some voltage if I want to do any real overclocking

edit: Room temp in Fahrenheit, card temps in Celsius. Us crazy Americans!!
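Since the post mixes Fahrenheit room temps with Celsius card temps, here are the standard conversions applied to the numbers above (a trivial sketch):

```python
# Converting the temperatures mentioned above: room temps are quoted in
# Fahrenheit, card temps in Celsius.

def f_to_c(f: float) -> float:
    return (f - 32.0) * 5.0 / 9.0

def c_to_f(c: float) -> float:
    return c * 9.0 / 5.0 + 32.0

print(round(f_to_c(78), 1))  # 78F hallway is about 25.6C
print(round(c_to_f(50)))     # 50C card load temp is 122F
```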


----------



## Alatar

we now have a spreadsheet and I'd like someone to try editing it, to see if it's working properly.

Thanks guys.

I can also add stuff later and as always all suggestions are welcome.


----------



## ReignsOfPower

Hey guys,
Had some problems running the Crysis Benchmark tool this morning, hence why I didn't post any results.

Here are my voltages for reference:

GPU1
IDLE 0.8750
LOAD 0.9130

GPU2
IDLE 0.8750
LOAD 0.9250

EVGA GTX590 Classified GPU 630 Shader 1260 Memory 1728


----------



## sl00tje

Quote:



Originally Posted by *Alatar*


..so you wouldn't have to post pics every time heh


I like to post pics

Quote:



Originally Posted by *ReignsOfPower*


Here are my voltages for reference


Thanks, ReignsOfPower. List updated:


----------



## Alatar

Quote:



Originally Posted by *sl00tje*


I like to post pics


Yeah, I can understand that

But I made the spreadsheet and it can now be found in the OP. I'd appreciate it if someone other than me tried to edit it.


----------



## sublimejhn

Quote:



Originally Posted by *Alatar*


Yeah, I can understand that

But I made the spreadsheet and it can now be found in the OP. I'd appreciate it if someone other than me tried to edit it.


Well, I haven't messed with Google Docs much. But I sent a request to edit it, so when you accept it I will throw my info in there


----------



## crimsonberkut

i'm about to buy a GTX 590, do you think the Gigabyte Odin GT 800W PSU is enough?

thanks


----------



## Alatar

Quote:



Originally Posted by *sublimejhn*


Well I haven't messed with google docs much. But I sent a request to edit it, so when you accept it I will throw my info in there


it shouldn't need any permissions. I set it up so anyone with the link could edit it.


----------



## sublimejhn

Quote:



Originally Posted by *Alatar*


it shouldn't need any permissions. I set it up so anyone with the link could edit it.


Well, it is not allowing me to edit it; I had to request permission, which as of yet does not appear to have been accepted. So I'm not sure what the problem is


----------



## Pedros

For those who clocked the cards at 800MHz ... can you show us some benches?

I got really curious about checking those.

If at 800MHz under water it can give a good performance increase, that would be great ... although I really think you won't see any real gains at 800MHz due to the OCP, but I'm really interested to check that.


----------



## Masked

Quote:



Originally Posted by *Pedros*


For those who clocked the cards a 800Mhz ... can you show us some benches?

i got really curious on checking those

If at 800Mhz under water it can have a good performance increase, that would be great ... although i really think you won't see any real gains at 800Mhz due to the ocp but i'm really interested to check that.


I ran a bench @ work last night just to see something and the 2nd core isn't scaling properly.

Mid-Vantage I was showing 90% and 30%.

So it's NOT because of the OCP...It's due to scaling that these cards are seeing POOR performance...

They're not even scaling properly in multi-window apps... Much less Crysis... Not going to miraculously start working in benches any differently unless God comes down and blesses the damn thing...

BTW, I would like to quote you from another thread...

Quote:



Originally Posted by *Pedros*


But you know ... having a GTX 590 gives you some kind of "status" ... people just don't care how bad it sucks ... they just want to show that they have the 590 ...

and most of 590 users try to show how much they really understand about graphic cards ... but in the end ... if you show him a mosfet & a memory chip ... for him ... they are the same ...


This is absolutely NOT true and considering your apparent stance over the past few weeks, I'd tread carefully and bench your own 590's...


----------



## Pedros

Masked... just show us some benchmarks ... not just talk ...

and the last quote ... my friend... it's true ... there are several users that just wanted the 590 to be a solid card that would deliver enough power ... that's really ok ...

But then, there are others that just try to make it a card that really isn't ... and they don't understand a damn thing about how to analyze a freaking pcb, but even though, they try to have technical discussions and try to defend some ideas that are really foolish ...

Here...
I think this thread is full of "ideas" but lacking actual experiments ... you included, Masked...

750MHz

700MHz
Oh w8... it scales well at 700MHz but doesn't scale well at 750 ... hmmm yes ... must be the drivers not optimizing the scaling at 750 ...

Tsk tsk ...


----------



## kcuestag

@RagingCain I'm sorry to hear about your loss

Good luck with the RMA, and let's keep our fingers crossed so you don't have to wait too long to get a replacement and start rocking with those beasts!!!!

Again, sorry to hear about your loss

PS: Pedro, come to Steam


----------



## kazukun

Zotac GTX590 Quad SLI

Core700/Mem1900/Shader1400

3DMARK11
P15297
Graphics 18347
Physics 11040
Combined 9172

3DMARK VANTAGE
P59274
GPU 54322
CPU 81586

disintegrated


----------



## Alatar

bravo, love the WC loop as well. Do you have any plans to put the cards under water?

Updating in a sec


----------



## Masked

Quote:



Originally Posted by *Pedros*


Masked... just show us some benchmarks ... not just talk ...

and the last quote ... my friend... it's true ... there are several users that just wanted the 590 to be a solid and a card that would deliver enough power ... that's really ok ...

But then, there are others that just try to make it a card that really isn't ... and they don't understand a damn thing about how to analyze a freaking pcb, but even though, they try to have technical discussions and try to defend some ideas that are really foolish ...

Oh w8... it scales well at 700Mhz but doesn't scale well at 750 ... hmmm yes ... must be the drivers not optimizing the scalling at 750 ...

Tsk tsk ...


Did you bother checking usage core vs core? No...

You're taking a single assessment @ what YOU believe to be 100%, and it's not.

90/30 in vantage says worlds more than any bench you can ever show me...

If a bench is only using 30% of its second core... How is that at all throttling? It's not... It's absolutely ****ty scaling.

EVGAjacob just made a comment on the EVGA forums that they're aware of the issues and working on them...Meaning, IT IS AN ISSUE.

You can see it for yourself with Precision, with Aida... Check it on your own time... Apparently I just bought this card to own it, so you can absolutely help yourself.


----------



## grunion

Quote:



Originally Posted by *whosjohnny*


Here is the proof for my EVGA GTX 590 on EVGA's STOCK VOLTAGES! GPU1:0.938V, GPU2:0.925V =D
EVGA GTX 590 [email protected], 2000Memory
EVGA GTX 590 [email protected], 2000Memory
PC Power&Cooling 750w Continuous, 830w Peak rated @82% efficiency.
I have NO HARDDRIVE inside the Antec 900 Case to maximize Air Flow and minimize ambient temp.
Reverse the TOP FRONT FAN to suck the HEAT directly from the GPU1 exhaust (located at the end of the card toward the front of the case) out. I have 120mm FAN that draws cool air from outside the case directly on top of the GTX 590. Only thing inside is EVGA i790 ULTRA SLI Mobo, USB3 card, Q9450 @3.2Ghz, 4G DDR3 1600mhz 7-6-6-24-2T, 1 DVD-ROM.

http://johnten.com/images/EvgaGTX590..._AirCooled.jpg
http://johnten.com/images/OC775_2000...oArtifacts.jpg
http://johnten.com/images/OC775_2000...oArtifacts.jpg
http://johnten.com/images/OC800_2000_ZeroArtifacts.jpg


What does the gpu usage do when run at a standard fullscreen res, 1080, 1200?

Quote:



Originally Posted by *kazukun*


Zotac GTX590 Quad SLI

Core700/Mem1900/Shader1400

3DMARK11
P15297
Graphics 18347
Physics 11040
Combined 9172

3DMARK VANTAGE
P59274
GPU 54322
CPU 81586


Very nice, could you do some runs with AB displaying the gpu usage?


----------



## kcuestag

Whoah! *kazukun* your setup looks BADASS!

I just fell in love with your rig

Congratz, now it's time to enjoy it


----------



## soilentblue

Quote:


> Originally Posted by *Juggalo23451;12980345*
> Won't be available till April 11 I think.
> I wanna see the flow design of the card


i'm going to assume they used the same internal design for the 590 as they did for the 6990.


----------



## soilentblue

i could definitely be wrong, but it seems like the gpu on the left would not get cooled as effectively as the one on the right, since water isn't running through the fins.


----------



## Pedros

great stuff kazukun!!!


----------



## Huckleberry

So, I have definitely been considering getting one of the hydro copper 590s from evga over the hydro copper 580s due to the $180 price difference. While the performance is not going to equal 2 580s, it still costs a decent bit less. For those that own them, and especially the water cooled versions, are you satisfied thus far with the purchase? I realize overclocking will not be as substantial as with a 580, but that is not what I would be looking to do anyways.


----------



## EvilClocker

I just installed the EK block on my EVGA GTX 590.. Wow, what a temp difference.. but sadly with the 270.51 there's no way to voltage tweak? Anyone got a workaround?


----------



## grunion

Quote:


> Originally Posted by *EvilClocker;12987290*
> I just Installed EK Block on my EVGA GTX 590.. Wow What a Temp Difference.. but sadly with the 270.51 no way to Voltage Tweak ? Anyone got a work around ?


None as of yet, roll back to a version without the lock.
But be very very careful.


----------



## soilentblue

Quote:


> Originally Posted by *EvilClocker;12987290*
> I just Installed EK Block on my EVGA GTX 590.. Wow What a Temp Difference.. but sadly with the 270.51 no way to Voltage Tweak ? Anyone got a work around ?


evil what are your temps at idle and load? the ek block revision is very nice. makes me want one for sure.

i believe with 270.51 you are limited to 900mv

have you overclocked it on stock volts also?


----------



## soilentblue

Quote:


> Originally Posted by *grunion;12987434*
> None as of yet, roll back to a version without the lock.
> But be very very careful.


isn't it throwing out more volts than what you assign to it? i wouldn't overvolt.


----------



## grunion

Quote:


> Originally Posted by *soilentblue;12987922*
> isn't it throwing out more volts than what you assign to it? i wouldn't overvolt.


IDK

Only heard that from one person.

I'm sure someone with a good voltmeter and knowledge of the measuring points could tell us.

I definitely agree with you, no overvolting until NV says so.


----------



## soilentblue

i'm as curious as everyone else, but there's no point in taking a chance without being on water, and i don't even feel it's worth it right now due to the inefficiency of these drivers.

rumor is the big nvidia drivers are coming the first week of april so we will see. possibly coming with nvidia's new SSAA (i think that's the letters) that is equal to MLAA in quality and fps


----------



## Juggalo23451

Quote:


> Originally Posted by *soilentblue;12988118*
> i'm as curious as everyone else but there's no point in taking a chance without being on water and i don't even feel it's worth it right now due to the ineffeciency of these drivers.
> 
> rumor is the big nvidia drivers are coming the first week in april so we will see. possibly coming with nvidia's new SSAA(i think that's the letters) that is equal to MLAA in quality and fps


That would be a great early Bday present


----------



## Masked

Quote:


> Originally Posted by *grunion;12988053*
> IDK
> 
> Only heard that from one person.
> 
> I'm sure someone with a good volt meter and knowledge of the measuring points someone could tell us.
> 
> I definitely agree with you, no overvolting until NV says so.


I was running 1.05 for a while until I found out that b/c of the VRMs it was in actuality +/- .03...

As we all know, over 1.05 is the magic number for "puff puff crackle," so I turned it down to 1.02 AND was able to maintain the same OC of 850MHz, so I believe it's +.03.

Was using a Northern Industrial when I came upon the discovery of 1.08 and swiftly changed that btw...
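The arithmetic in the last few posts, as a sketch: if the VRMs overshoot the software setpoint by ~0.03V, a 1.05V setpoint really lands at ~1.08V, so the setpoint has to be derated to 1.02V to stay at the 1.05V ceiling (the offset and ceiling are this thread's reported figures, not measured specs):

```python
# Sketch of the overshoot arithmetic reported in this thread: the VRMs
# deliver roughly +0.03V over the software setpoint, and ~1.05V actual is
# treated as the practical ceiling. Figures are forum reports, not specs.
OVERSHOOT = 0.03  # reported delta between setpoint and measured voltage
CEILING = 1.05    # reported "don't exceed" actual voltage

def actual_voltage(setpoint: float, overshoot: float = OVERSHOOT) -> float:
    """Estimated voltage the card actually sees for a given setpoint."""
    return round(setpoint + overshoot, 3)

def max_safe_setpoint(ceiling: float = CEILING,
                      overshoot: float = OVERSHOOT) -> float:
    """Highest software setpoint that keeps the actual voltage at the ceiling."""
    return round(ceiling - overshoot, 3)

print(actual_voltage(1.05))  # 1.08 - what the multimeter reportedly showed
print(max_safe_setpoint())   # 1.02 - the derated setpoint used above
```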


----------



## EvilClocker

I am running a Swiftech MCR320-Drive rad with 3/8" tubing, with the 590 the only thing in the loop. Getting 42C running Furmark 1.9; playing games it stays around 36-38C. Funny thing is I only have two 120mm fans on my triple rad at the moment LOL, so I figure once I put the 3rd fan on the rad the temp will drop a little more

I feel a lot better now with the block on there. I got the heat under control; just hoping I can tweak the V to about 1.00V so I can overclock it to about 720


----------



## Masked

Quote:


> Originally Posted by *EvilClocker;12988413*
> I am running a Swiftech MCR320-Drive rad with 3/8" tubing, with the 590 the only thing in the loop. Getting 42C running Furmark 1.9; playing games it stays around 36-38C. Funny thing is I only have two 120mm fans on my triple rad at the moment LOL, so I figure once I put the 3rd fan on the rad the temp will drop a little more
> 
> I feel a lot better now with the block on there. I got the heat under control; just hoping I can tweak the V to about 1.00V so I can overclock it to about 720


I get 750 stock...800+ will be 1.00v


----------



## soilentblue

Quote:


> Originally Posted by *EvilClocker;12988413*
> I am running a Swiftech MCR320-Drive rad with 3/8" tubing, with the 590 the only thing in the loop. Getting 42C running Furmark 1.9; playing games it stays around 36-38C. Funny thing is I only have two 120mm fans on my triple rad at the moment LOL, so I figure once I put the 3rd fan on the rad the temp will drop a little more
> 
> I feel a lot better now with the block on there. I got the heat under control; just hoping I can tweak the V to about 1.00V so I can overclock it to about 720


very nice. i'm hoping to get temps around that. this card has heat problems that the reference fan and autofan combo just aren't good enough to deal with imo.


----------



## EvilClocker

675 is the highest my card will go at stock volts.. I'm using the Heaven benchmark to verify my overclocks.. it needs a little more volts


----------



## RagingCain

You guys should be using EVGA OC Scanner or OCCT GPU stress test to verify stability.

@Juggalo
It's always the CPU block.

Everything is up and running back on the 580s.

590s go back to EVGA tomorrow and replacements will be here in a week or so, that's cool. At least the mouse pad is nice

Quote:


> Originally Posted by *Masked;12988455*
> I get 750 stock...800+ will be 1.00v


I have 3 reports of varying stock voltages.

Some are saying theirs are 0.880, 0.918, and 0.963.
Quote:


> Originally Posted by *grunion;12988053*
> IDK
> 
> Only heard that from one person.
> 
> I'm sure someone with a good volt meter and knowledge of the measuring points someone could tell us.
> 
> I definitely agree with you, no overvolting until NV says so.


I heard this as well; it's overshooting by up to 0.025V, according to some EVGA users (ones with more than 4 posts) on the forum.

The drivers have locked the voltage, so it's all users too, not just EVGA buyers.
Quote:


> Originally Posted by *kazukun;12982618*
> Zotac GTX590 Quad SLI


Beautiful rig Kaz, well done, awesome first post. Welcome to OCN.


----------



## Juggalo23451

Quote:



Originally Posted by *RagingCain*


You guys should be using EVGA OC Scanner or the OCCT GPU stress test to verify stability.

@Juggalo
It's always the CPU block.

Everything is up and running back on the 580s.

The 590s go back to EVGA tomorrow and the replacements should be here in a week or so, which is cool. At least the mouse pad is nice.


I have three reports of varying stock voltages.

Some are reporting 0.880V, 0.918V, and 0.963V.


Umm, what about the CPU block? I'm lost.


----------



## RagingCain

Quote:


> Originally Posted by *Juggalo23451;12991092*
> Umm, what about the CPU block? I'm lost.


There was gunk in the CPU block causing the flow restriction. Perhaps draining it carried more gunk to the block that had been sitting in the tubes... not sure, but it's a very high flow rate now.

Finally decided to tear it apart. Luckily, it's a little older now and the O-rings are retaining their shape. Last time I was so frustrated I almost gave myself a seizure.

I knew it in my gut... I just didn't want to tear it apart. New rule: whenever the water loop gets taken apart, the CPU block comes out for a cleaning too. It's the only spot for clogging in my loop.

Edit:
We have 600 posts with only about 10 members? I wonder how much arguing, bickering and trolling went on during that time.


----------



## EvilClocker

Hopefully NVIDIA's next drivers will let us tweak the voltage a little. Going back to an older driver is not an option, since the 270 drivers are a lot faster with Surround for me.


----------



## jcde7ago

Quote:



Originally Posted by *EvilClocker*


Hopefully NVIDIA's next drivers will let us tweak the voltage a little. Going back to an older driver is not an option, since the 270 drivers are a lot faster with Surround for me.


I honestly don't recommend overvolting at all at the moment, period.

Just find your max overclock at stock voltage and stick with it for the time being, until a BIOS is released that maybe gives us some leeway with the voltages (though I can't imagine it being anything more than a 5% or so overvolt).

It's always nice to overclock or have the ability to should we choose to, but the 590's internal design just doesn't allow us that freedom unfortunately. Unless you absolutely need a high OC, the card itself on stock clocks or at an OC of around 650 on stock voltages should be far more than sufficient for any game out there right now, unless you're running 3D Vision/Surround.


----------



## sl00tje

Quote:



Originally Posted by *Alatar*


It shouldn't need any permissions; I set it up so anyone with the link could edit it.


I've just tried to edit the sheet, but it says you need permission.

Quote:



Originally Posted by *RagingCain*


You guys should be using EVGA OC Scanner or OCCT GPU stress test to verify stability.


Found this in the EVGA forum:

Quote:



Originally Posted by *EVGA_JacobF*

Well, OC Scanner does a great job on a single GPU, but not the best on dual GPU today (we are working on this). So it is definitely recommended to still try a full-screen 3D app like the Heaven benchmark!


I think we have to wait for a software update.


----------



## Masked

Quote:


> Originally Posted by *jcde7ago;12991803*
> I honestly don't recommend overvolting at all at the moment, period.
> 
> Just find your max overclock with stock voltages and stick with it for the time being, until a BIOS is maybe released that gives us some leeway with the voltages (though I can't imagine it's anything greater than a 5% or so overvolt).


I was very unhappy to find that when my card was set to 1.05V it was actually at 1.08V. That also explains why a lot of cards have been blowing up.

At 1.02V I'm solid and my volt meter reads 1.05V... still slugging along at 850MHz without a single issue.

At work I'm finding the same issue... I'm on site right now, but the meter turned up 1.03V when I clearly had the card at 1.00V...

Mine are back-to-back serials... perhaps I just got the two that overvolt by 0.03V?

In any event, I would NOT go over 1.00V... not at all.
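
The set-versus-measured gap described here is easy to sanity-check. A minimal sketch (the readings are illustrative figures pulled from this discussion, not measurements of any particular card):

```python
# Compare software-set voltages against multimeter readings and flag
# anything that lands past the 1.00V "do not exceed" point suggested
# above. Readings are illustrative, not from any specific card.

SAFE_CEILING = 1.00  # volts

readings = [
    # (set voltage, measured voltage)
    (1.05, 1.08),
    (1.02, 1.05),
    (1.00, 1.03),
]

for v_set, v_meas in readings:
    overshoot = round(v_meas - v_set, 3)
    status = "OVER ceiling" if v_meas > SAFE_CEILING else "ok"
    print(f"set {v_set:.2f}V -> measured {v_meas:.2f}V "
          f"(+{overshoot:.3f}V, {status})")
```

On these numbers every card overshoots by a consistent 0.03V, which is the pattern reported above.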
Quote:


> Originally Posted by *sl00tje;12993622*
> Found this in the EVGA forum:
> 
> I think we have to wait for a software update


I said that like...yesterday omgz

Jacob also mentioned there were scaling issues in MOST benches because the drivers weren't yet optimized for SLI, much less benching.


----------



## soilentblue

http://www.hardocp.com/article/2011/04/03/asus_geforce_gtx_590_overclocking_followup

New article on HardOCP overclocking the GTX 590, just out on 4/4. It's a very good article and shows the OCP kicking in.


----------



## saulin

Quote:


> Originally Posted by *soilentblue;12996726*
> http://www.hardocp.com/article/2011/04/03/asus_geforce_gtx_590_overclocking_followup
> 
> New article on HardOCP overclocking the GTX 590, just out on 4/4. It's a very good article and shows the OCP kicking in.


The best article so far.

So upping the voltage will throttle the card, and upping the voltage while increasing the clocks will throttle the card even more.

It seems like NVIDIA really is blocking GTX 590 users from increasing the voltage on the newer drivers. I bet this would never have happened if people hadn't made such a huge deal out of the couple of cards that blew up from too much voltage.

So I guess now, if people really want to find out what the max overclock is, they have to use old drivers and probably bump the voltage to 1V max, because the new drivers will throttle the card.

This also means that, for now, you guys are pretty much limited to what the card can do on stock voltage, until someone finds a way to remove OCP.

There is something the article doesn't mention, though, that might be worth exploring.

What triggers OCP? Power draw alone, or is it also heat related? Will cooling help? Do water-cooled cards overclock more on stock voltage?


----------



## Masked

Quote:


> Originally Posted by *saulin;12996916*
> The best article so far
> 
> So upping the voltage will throttle the card and upping the voltage and increasing the clocks will throttle the card even more.
> 
> It seems like Nvidia really is blocking GTX 590 users from increasing the voltage on the newer drivers. I bet this would have never happened if people would not have made such a huge deal out of the couple of cards that blew up because they put too much voltage.
> 
> So I guess now if people really want to find out what the max overclock really is, they have to use old drivers and probably bump the voltage to 1v max. because the new drivers will throttle the card.
> 
> This also means that pretty much you guys are limited to what the card can do on stock voltage for now till someone finds a way to remove OCP.
> 
> There is something the article doesn't mention though that might be worth exploring.
> 
> What causes OCP. The Power draw or is it also heat related? Will cooling help? Do Watercooled cards overclock more on stock voltage?


I still don't agree that the card "throttles" due to voltage.

850MHz at 1.02V (1.05V real) and shoddy performance says to me there are some other issues here.

Running real-world programs, I'm seeing 90% on core 1 vs 30% on core 2. Same thing in benches.

I'm not going to call it throttling, because I don't accept that this card throttles; I think there are very serious scaling issues. EVGA has mentioned them several times, especially in benches. Once those issues are resolved with new drivers, I think a lot of this "throttling" talk won't be relevant...

For anyone with AIDA or Precision who has actually watched their cores during benches: each program gives a different number, and all of them are monitoring-based. So, straight up, the card isn't performing.

Unfortunately [H] tested a stock card, so we were unable to see core 1 vs core 2, but I'd bet you $$$$ the loads weren't even throughout their testing.


----------



## saulin

Quote:


> Originally Posted by *Masked;12997077*
> I still don't agree that the card "throttles" due to voltage.
> 
> [email protected] (1.05 real) and shotty performance says to me there are some other issues here.
> 
> In running real world programs, I'm seeing a 90% core 1 vs 30% core 2 ~ Same thing in benches.
> 
> I'm not going to call it throttling because I'm firmly against the fact that this card apparently, throttles; but, I think there are very serious scaling issues...EVGA has mentioned them several times, especially in benches...I think that once those issues are resolved with new drivers...A lot of this "throttling" talk won't be relevant...
> 
> To anyone with Aida or precision, anyone who actually watched their cores during most benches, each program is giving a different #...all are monitoring based...So, straight up, the card isn't performing.
> 
> Unfortunately hard did the testing as a base card and we were unable to see core 1 vs core 2 but, I'd bet you $$$$ they weren't even throughout their testing.


Your card is water cooled, right?

Are you able to do 750MHz on it without touching the voltage?

Or how far can you take it before having to up the voltage? I'm sure the cards are capable of good overclocks, but somehow they are being crippled.


----------



## grunion

Quote:


> Originally Posted by *saulin;12996916*
> The best article so far
> 
> So upping the voltage will throttle the card and upping the voltage and increasing the clocks will throttle the card even more.
> 
> It seems like Nvidia really is blocking GTX 590 users from increasing the voltage on the newer drivers. *I bet this would have never happened if people would not have made such a huge deal out of the couple of cards that blew up because they put too much voltage.*
> 
> So I guess now if people really want to find out what the max overclock really is, they have to use old drivers and probably bump the voltage to 1v max. because the new drivers will throttle the card.
> 
> This also means that pretty much you guys are limited to what the card can do on stock voltage for now till someone finds a way to remove OCP.
> 
> There is something the article doesn't mention though that might be worth exploring.
> 
> What causes OCP. The Power draw or is it also heat related? Will cooling help? Do Watercooled cards overclock more on stock voltage?


Your twisted views are unbelievable.
More cards would have blown up if reviewers had never bothered to OC it.
Believe it or not, it's a good thing it happened, or every Tom, Dick and Harry would be popping cards.

Quote:


> This also means that pretty much you guys are limited to what the card can do on stock voltage for now till someone finds a way to remove OCP.


Are you serious?

Look what happened when the OCP was disabled, or not present.
Cards blew up.


----------



## saulin

Quote:


> Originally Posted by *grunion;12997127*
> Your twisted views are unbelievable.
> More cards would have blown up if reviewers never bothered to oc it.
> Believe it or not, it's a good thing it happened or ever tom dick and harry would be popping cards.
> 
> Are you serious?
> 
> Look what happened when the OCP was disabled, or not present.
> Cards blew up.


But we know 1V should be safe now, so as long as people can at least add some voltage, the overclocks should increase. At this point, though, overclocking is very limited if you can only run stock voltage.


----------



## Masked

Quote:


> Originally Posted by *saulin;12997126*
> Your card is water cooled right?
> 
> Are you able to do 750Mhz on it without touching the voltage?
> 
> Or how far can you take it before having to up the voltage? I'm sure the cards are capable of good overclocks but somehow they are being crippled.


Stock with the Hydro Copper I'm pushing 750MHz with absolutely no voltage tweaks; the card is stable 24/7 and runs at 27-35C...

At home I'm pushing 1.02V at 850MHz, and scaling is absolutely horrible, as I've said several times.

Scaling is terrible at work on the stock card as well...

I wouldn't call it throttling yet, at least not until the scaling "issues" are resolved...


----------



## EvilClocker

I can't figure out how you are running 750MHz with no voltage tweak. I must have gotten a crappy card; mine will not do anything above 680MHz at stock vcore.

And I have a 1200W power supply with a 100A single rail.


----------



## soilentblue

Quote:


> Originally Posted by *Masked;12997238*
> Stock with the hydro-copper, I'm pushing 750mhz with absolutely no voltage tweaks, card is stable 24/7 and runs at @27-35C...
> 
> At home I'm pushing, 1.02v @850mhz and scaling is absolutely horrible as I've said several times.
> 
> Scaling is terrible at work on the stock as well...
> 
> I wouldn't move to call it throttling yet until in the very least, the scaling "issues" are resolved...


Scaling has nothing to do with that article; it wasn't about FPS but about what the card does when the volts are raised. It's definitely throttling. Whether or not NVIDIA wants to change the throttling algorithms is up to them.


----------



## saulin

Quote:


> Originally Posted by *EvilClocker;12997270*
> I can;t Figure out how u are running 750mhz with no voltage tweak. I must have gotten a crappy card. mine will not do anything above 680mhz will stock vcore.


Is your card water cooled too? I guess you did get a crappy card, because HardOCP pushed their air-cooled card to 723MHz on 0.963V, which I guess might be a tiny bump on the voltage.


----------



## soilentblue

Quote:


> Originally Posted by *EvilClocker;12997270*
> I can;t Figure out how u are running 750mhz with no voltage tweak. I must have gotten a crappy card. mine will not do anything above 680mhz will stock vcore.


I was about to say the same thing. Either you have the best card shipped to date, or your numbers are definitely incorrect. What driver set are you using, Masked?


----------



## grunion

Quote:


> Originally Posted by *saulin;12997149*
> But we know 1v should be safe now. So as long as people can at least add some voltage the overclock should increase. However at this point overclock is very limited if you can only do stock voltage.


You can add all the voltage you want, but the OCP will say, "Uh uh, you're not going to draw 50 amps (arbitrary number) through me."
Once that happens, throttling occurs.

Now tell me why you think people would want to disable the OCP?
Is that a smart thing to do?
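
The description above boils down to a simple current-limit rule. A toy sketch of that behaviour (the 50A limit is the arbitrary figure from the post; the draw model and the 5% throttle step are invented for illustration, not NVIDIA's actual algorithm):

```python
# Toy model of overcurrent protection (OCP): when estimated current
# draw exceeds the limit, clocks are cut back until the draw fits
# under it again. The 50A limit is the arbitrary figure quoted above;
# the draw model and 5% step are invented for illustration.

OCP_LIMIT_AMPS = 50.0
THROTTLE_STEP = 0.05   # fraction of clock removed per OCP trip

def estimated_draw(clock_mhz, volts, k=0.077):
    # toy model: current grows with clock and with voltage squared
    return k * clock_mhz * volts ** 2

def apply_ocp(clock_mhz, volts):
    """Return the clock the card actually runs after OCP kicks in."""
    while estimated_draw(clock_mhz, volts) > OCP_LIMIT_AMPS:
        clock_mhz *= 1.0 - THROTTLE_STEP
    return clock_mhz

print(apply_ocp(700, 0.963))  # near stock voltage: runs as requested
print(apply_ocp(800, 1.05))   # overvolted: clocks get cut back
```

The point matches the post's framing: adding voltage alone doesn't help, because the draw rises and OCP just claws the clocks back.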


----------



## RagingCain

Quote:


> Originally Posted by *saulin;12996916*
> The best article so far
> 
> So upping the voltage will throttle the card and upping the voltage and increasing the clocks will throttle the card even more.
> 
> It seems like Nvidia really is blocking GTX 590 users from increasing the voltage on the newer drivers. I bet this would have never happened if people would not have made such a huge deal out of the couple of cards that blew up because they put too much voltage.
> 
> So I guess now if people really want to find out what the max overclock really is, they have to use old drivers and probably bump the voltage to 1v max. because the new drivers will throttle the card.
> 
> This also means that pretty much you guys are limited to what the card can do on stock voltage for now till someone finds a way to remove OCP.
> 
> There is something the article doesn't mention though that might be worth exploring.
> 
> What causes OCP. The Power draw or is it also heat related? Will cooling help? Do Watercooled cards overclock more on stock voltage?


Unfortunately, this would be power draw/power usage related OCP, not triggered by heat. Heat would be a separate type of throttling, one of clocks and voltages.

Water cooling would/should have no effect on the card's OCP, and thus wouldn't allow any better clocks at all.


----------



## soilentblue

Quote:


> Originally Posted by *grunion;12997346*
> You can add all the voltage you want, but the OCP will say uh uh you're not going to draw 50amps(arbitrary number) through me.
> Once that happens, throttling occurs.
> 
> Now tell me why you think people would want to disable the OCP?
> Is that a smart thing to do?


^^^this is exactly why people are not removing the OCP.


----------



## Masked

Quote:


> Originally Posted by *soilentblue;12997280*
> scaling has nothing to do with that article. it wasn't about fps but what the card does when the volts are raised. it's definitely throttling. whether or not nvidia wants to change the throttling algorithms is up to them.


Again, disagree... [H]'s test WAS an FPS-based test, and with the voltages raised the card spiked downward.

However, because it was an FPS-based test, there was NO CORE MONITORING... So I believe [H] experienced the exact same thing I am: 90%/30%. That =/= throttling.

It does seem correlated: when the voltage goes up, scaling goes to ****. So if those are their findings, I absolutely agree with them, but that's NOT a throttling issue...

Now, if OCP is kicking in and limiting power to the 2nd core, that I can understand, but there's no evidence of that atm, just terrible scaling.
Quote:


> Originally Posted by *saulin;12997283*
> Is your card water cooled too? I guess you did get a crappy card cause HOCP pushed their air cooled card to 723Mhz and it was using 0.963v, I guess it might have a tiny bump on the voltage


Yes, it's WC'd... there's a screenshot 2-3 pages back where I showed it at 700MHz. I kept going up until I BSOD'd; it's 24/7 stable at stock.
Quote:


> Originally Posted by *soilentblue;12997288*
> i was also about to say the same thing. either you have the best card shipped to date or your numbers are definitely incorrect. what driver set are you using masked?


I have the betas at home and .91 at work.


----------



## Masked

Quote:


> Originally Posted by *Masked;12963491*
> Work PC btw:
> 
> 
> It's 24/7 ~ I haven't started tweaking the MC yet b/c I hear that's where people have the most issues but, I have none.


Found it! Page IDK, way back -- 40-something.

This was before I pushed it to 750MHz at stock volts.


----------



## RagingCain

Quote:


> Originally Posted by *Masked;12997371*
> Again, disagree...Hard's test WAS an FPS based test and with the voltages raised, the card spiked downward.
> 
> However, due to it being an FPS based test, there was NO CORE MONITORING...So, Hard, I believe experienced the same exact thing I am: 90%//30% ~ That =/= throttling.


There is a flaw in your theory that this is just driver issues or typical low GPU usage.

Stock clocks and voltages give higher performance than overclocked ones; performance decreases compared to stock. I think what we are seeing is GPU usage throttling to reduce power usage/voltages. Although your case is unusual, with one GPU at 30% usage and the other not, there is likely some poor scaling as well. It just doesn't sit with the fact that a card at stock performs better than an overclocked one. I would anticipate that GPU usage would correspond inversely to the speed of your clocks: 100% at stock clocks, 80% at 700, etc. Now add low GPU usage or a lack of SLI optimizations, and the OCP issue is exacerbated.

Which is what I suspected back when Rush was claiming the cards were throttling yet the speeds and voltages did not waver.
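
The inverse-usage expectation above can be written down directly. A sketch (607MHz is the GTX 590's stock core clock; the proportional cap is the hypothesis from the post, not a confirmed mechanism):

```python
# Hypothesis from the post above: under a fixed power budget, reported
# GPU usage should fall roughly in inverse proportion to the core
# clock. 607 MHz is the GTX 590's stock core clock; the cap model
# itself is a guess, not a confirmed mechanism.

STOCK_MHZ = 607

def expected_usage(clock_mhz):
    """Predicted usage cap (%) at a given core clock."""
    return min(100.0, 100.0 * STOCK_MHZ / clock_mhz)

for clock in (607, 700, 772, 850):
    print(f"{clock} MHz -> ~{expected_usage(clock):.0f}% usage cap")
```

A clean inverse cap like this would pull both cores down equally, which is part of why the uneven 90/30-style readings don't fit the model on their own.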


----------



## Masked

Quote:


> Originally Posted by *RagingCain;12997737*
> There is a flaw in your theory that this is just driver issues, or typical low GPU usage
> 
> Stock clocks and voltages have higher performance than overclocked ones. Performance decreases compared to stock. I think what we are seeing is GPU usage throttling to reduce power usage/voltages. Although your case is unique in one GPU has low usage at 30% and the other is not, there is also likely of poor scaling as well. It just doesn't sit with the fact that a card at stock performs better than an overclocked one. I would anticipate that the GPU usage would correspond inversely to the speed of your clocks, 100% at stock clocks, 80% at 700, etc etc. Now incur low GPU usage or lack of SLI optimizations and the OCP issue is exacerbated.
> 
> Which is what I suspected back when Rush was claiming the cards were throttling yet the speeds and voltages did not waiver.


My biggest issue with the throttling theory is: if the card were throttling, GPU usage would be at 50/50 or 60/60 or 70/70, etc... Instead, for most users I've spoken to, it's actually 90/30, 70/20, etc...

That says to me it's either limiting power per core OR it's not scaling properly... I've been taking the scaling route because I personally don't think NVIDIA would be that stupid, but if I'm wrong, so be it.


----------



## grunion

Quote:


> Originally Posted by *Masked;12997803*
> My biggest issue with the throttling theory is, if the card was throttling, GPU usage would be at 50/50 or 60/60 or 70/70 etc...Instead, for most users I've spoken to, it's actually either 90/30, 70/20 etc...
> 
> That says to me it's either limiting power by core OR it's not scaling properly...I've been taking the scaling route because I personally don't think NVIDIA would be that stupid but, if I'm wrong, so-be-it.


What if only one core is tripping its dedicated OCP? Each core has its own.
Maybe that's why people see 90/30, 80/20, etc.

Scaling, in my experience, affects both cores equally: if a game has poor scaling, then both cores sit at 20, 30, 40, 50%, not one at 50% and the other at 80%.


----------



## Masked

Quote:


> Originally Posted by *grunion;12997863*
> What if only 1 core is tripping its dedicated ocp, each core has its own.
> Maybe why people see 90/30, 80/20, etc.
> 
> Scaling in my experience affects both cores equally, if a game has poor scaling then both cores are at 20, 30, 40, 50%, not one at 50 and the other at 80%


At this point, I have to agree.

Scaling, however, especially on the 7900s and the 295s, was never even, thus my immediate "scaling issues" response...

EVGA has acknowledged there are "scaling issues" and is working toward resolving them, but I agree the OCP could be limiting the other core's performance in an effort to curb the voltage.

That seems logical, albeit absolutely ******ed, to limit one core, but it would also explain the many issues people are having.

So I'll go with that. I tip my hat to you, sir.


----------



## soilentblue

Quote:


> Originally Posted by *Masked;12997803*
> My biggest issue with the throttling theory is, if the card was throttling, GPU usage would be at 50/50 or 60/60 or 70/70 etc...Instead, for most users I've spoken to, it's actually either 90/30, 70/20 etc...
> 
> That says to me it's either limiting power by core OR it's not scaling properly...I've been taking the scaling route because I personally don't think NVIDIA would be that stupid but, if I'm wrong, so-be-it.


That's not necessarily true. It could just as easily be throttling one core instead of both, making the throttling present but reported to the GPU software incorrectly. OCP could very well be throttling one GPU instead of both, and scaling really isn't as off as we think. Who knows, really, but there's definitely throttling; otherwise there would be no voltage "sweet spot".

I think NVIDIA will be fine-tuning OCP on the card in the next 590-exclusive driver set more than adjusting the scaling, and I really think they will never admit to it.


----------



## soilentblue

dah grunion beat me to it lol


----------



## soilentblue

Quote:


> Originally Posted by *Masked;12997927*
> At this point, I have to agree.
> 
> Scaling however, especially on the 7900's and the 295's was never even, thus my immediate "scaling issues"...
> 
> EVGA has acknowledged there are "scaling issues" and are working toward resolving them but, I happen to agree with the OCP possibly limiting the other core's performance in an effort to curb the voltage.
> 
> That seems logical all-be-it absolutely ******ed to limit 1 core but, it would explain the many issues people are having, as well.
> 
> So I'll go with that, I tip my hat to you, sir.


Just a theory, but if NVIDIA's quick fix for the CD drivers was to use the OCP from the 570, due to both cards having the same phase layout, that would explain why it only sees one GPU. We'll never know the truth, but it's always fun to guess.


----------



## Pedros

Doesn't work that way, soil... basically what OCP reads is the power requirement from both GPUs, but the calculations have an order, which is basically:

total consumption = GPU 1 + GPU 2 + memory & other circuitry.
What matters is GPU 1 + GPU 2, since the memory & other circuitry doesn't fluctuate much in power terms.

So what OCP does: if total consumption > OCP limit, the GPU that is requiring more power becomes the target of heavier throttling.

One GPU is not equal to the other; there are voltage fluctuations, and one GPU may need a little more voltage than the other, so the core requiring more power is the first to be handicapped. The OCP algorithm works out that for x amount of processing, each GPU must fall within an interval y to z of power requirement. If one of the GPUs gets out of scale, it's throttled down immediately.

Although they must be synced, the power needs differ (even if the difference is minimal... that was the main idea behind binned GPUs: to have the lowest and closest VID possible).

Even the core load isn't always 50/50...
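
Sketched in code, that behaviour looks roughly like this (all wattages and the throttle step are invented for illustration; the real algorithm is NVIDIA's and unpublished):

```python
# Sketch of the per-GPU OCP behaviour described above: total board
# draw is GPU 1 + GPU 2 + memory & other circuitry, and when the
# total exceeds the limit, the GPU requiring more power is the one
# throttled first. All figures are invented for illustration.

OCP_LIMIT_W = 365.0  # board power limit (illustrative)

def throttle(gpu1_w, gpu2_w, other_w, step=0.9):
    """Cut the hungrier GPU by `step` until the total fits the limit."""
    while gpu1_w + gpu2_w + other_w > OCP_LIMIT_W:
        if gpu1_w >= gpu2_w:
            gpu1_w *= step   # GPU 1 is asking for more: it gets cut
        else:
            gpu2_w *= step
    return gpu1_w, gpu2_w

# One core asking for noticeably more power than the other:
g1, g2 = throttle(gpu1_w=190.0, gpu2_w=150.0, other_w=60.0)
print(round(g1, 1), round(g2, 1))  # only the hungrier GPU was cut
```

Because only the GPU that is out of its power interval gets pulled down, the two cores can end up reporting very different utilisation, consistent with the uneven splits discussed above.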


----------



## DarthShader

Quote:


> Originally Posted by *Masked;12997803*
> My biggest issue with the throttling theory is, if the card was throttling, GPU usage would be at 50/50 or 60/60 or 70/70 etc...Instead, for most users I've spoken to, it's actually either 90/30, 70/20 etc...


Maybe it's indeed a driver issue and the throttling is not even?

Got two questions:

1) Do you experience microstuttering when playing a game that shows uneven GPU usage afterwards?
2) Does your lower-clocked work card behave like that too?


----------



## Masked

Quote:


> Originally Posted by *DarthShader;12998256*
> Maybe it's indeed a driver issue and the throttling is not even?
> 
> 
> 
> 
> 
> 
> 
> Got two questions:
> 
> 1) Do you experience microstuttering when playing a game that shows you uneven GPU usage afterwards?
> 2) Does your lower-clocked work card behave like that too?


1) No, never on either...

I haven't had much time to game at home...It so happens that I'm moving Apache servers to other hosts this week so my play time has been absolutely abysmal.

In Crysis//Crysis 2 I've seen absolutely no stuttering...It's been 100% (This is the 850mhz card)

2) I run different apps on my work PC that are just AS graphic intensive BUT, are of other media.

So for example, I have 2x or 3x monitors at work, depending on what I need done and typically I'm too "tired/lazy" to switch it back to 2 so, I'm on 3 regularly.

Anyway, say my boss wants a new background for our ALX custom customers...I port open 3 windows of CS5 just because it makes my life easier, a beta-flash program I can't really speak about and typically ACDSee...

When I have them ALL open, 1 core spikes to 80% let's say, the 2nd core chills at 30%...Keep in mind, I'm USING all 3 monitors extensively and it doesn't change.

If I watch a BRD movie, say Avatar on the 3rd screen...The 1st core goes up to 90% and the second core sneaks up to 40-45% but, it's always a drastic difference.

With EVGA claiming it was/is scaling issues (Same thing for benches) I'm still under the assumption it's MOSTLY a scaling/driver issue BUT, I can//do see how OCP limiting COULD be a major factor as well...Which, as was previously stated, Nvidia will never admit to, regardless.

So the real question in my eyes atm is...Where does scaling start/end in regards to the OCP limiting the cores?


----------



## armartins

Masked, I have no experience with SLI, but I do know it can use both cores in windowed gaming (and by inference desktop tasks/apps), while Crossfire won't. Honestly, though, I wouldn't expect both cores to have the load evenly split in windowed/desktop tasks. The numbers from the benchmarks in the reviews suggest the card is indeed scaling better. And I also can't understand why low benchmark results while overclocked would be caused by drivers or poor scaling; that would affect both scenarios, and the benchmarks should scale accordingly. I know that sometimes there are invisible bottlenecks, and that may be the case, but the OCP argument is stronger and more logically plausible IMHO.


----------



## Masked

Quote:


> Originally Posted by *armartins;12998621*
> Masked I have no experience with SLI but I do know it can use both cores in windowed gaming (and by inference desktop tasks/apps) while Crossfire won't do it. But honestly I don't see both cores having the load really evenly split while windowed/desktop tasks. The numbers from the benchmarks in the reviews suggest that the card is indeed scaling better. And I also can't understand why would the low results on benchmarks while overclocked be caused by drivers or poor scaling. It would affect both scenarios and the benchmarks should scale accordingly. I know that sometimes there are invisible bottlenecks and that may be the case. The OCP argument is strong and more logically plausible IMHO.


SLI scaling in real-time apps spreads just as it would in hard SLI: 50/50 or something close. It's the same with the 580; I could show a concrete example with nearly every card in my office.

However, with this card the demand on the core is so high that it's not scaling, and I'm beginning to side with the OCP argument because, quite frankly, nothing else makes sense.

90/30 is a bit extreme, benches/apps or not.


----------



## soilentblue

Koolance released pics of their cooler.


----------



## Special_K

Installed and running nicely; now to add my custom ventilation so they run nice and warm (better than hot; cool won't happen).


----------



## Farih

Very sexy, grats on your cards!

Now we want to see benchies together with that beastly 980X.


----------



## maur0

This is un-effin-believable. 60+ pages and people still have no idea why/when OCP kicks in.
One guy wants to remove OCP, another wants to OC by setting 1V and then raising the core,

and a third, after 60+ pages, keeps claiming there is no throttling, although HWBot has been full of bad scores with high GPU clocks from day 1, and although it's visible from [H] that higher MHz = lower FPS and higher overvolt = lower FPS. Yet he keeps repeating "bad scaling", and he bases this on Adobe and movie playback. A few [email protected] semi-interesting benchmarks and the rest is blah blah blah.


----------



## Open1Your1Eyes0

*Any owners of an EVGA GTX590, I'm looking to buy the mousepad that came with your package so feel free to PM me anytime.

See thread: Link*


----------



## remer

I just got my new PSU installed. I don't have time tonight to fiddle much, but I got a 3DMark11 score of P9218 at 680/1360/1850, stock voltage, on the 267.91 drivers. I think I'll leave it at that until I have a chance to spend some quality time getting things tuned in.


----------



## jcde7ago

Got mine set up tonight, running at 670/1340/1770 at stock volts, stable via 3DMark11, Heaven and Vantage. Will probably push a wee bit more later tonight.

Also played a couple of hours of Crysis 1/2; seems good so far.

Coming from the heat, power consumption and noise of 480s in SLI, this card is a welcome change.

While it may not have the OC potential that pretty much every enthusiast card has, it's still a beast. Sure, 580s in SLI will beat it by clocking 170-200MHz above this on air, but I don't think that's worth an extra $300, on top of the heat, noise, power consumption etc. That's just me, though, and to each his own. I am very happy with my card so far... 670/1340/1770 is a respectable OC, and I could pretty much hit 720ish with a slight overvolt, but I'm fine with my current OC without having to overvolt, so I may leave it here for the time being.

Pics (bear with me and my crappy iPhone 4 pics!):

The GTX 590 and the GTX 480s it replaced. It was indeed hard to ship those off today; they were such awesome cards.


----------



## RagingCain

Soooo shexy, nice Special_K and JC









I hope you have fun with them.


----------



## rush2049

Quote:



Originally Posted by *Masked*


** snip **

To anyone with Aida or precision, anyone who actually watched their cores during most benches, each program is giving a different #...all are monitoring based...So, straight up, the card isn't performing.

** snip **


When testing my card I made sure to show the graphs of GPU usage... of both cores. You can clearly see the cores dropping and then rising again during the same sections of the same benchmark. So whatever work the cards were doing in the middle sections of the Vantage benchmarks makes them drop down. Whether it's the cards pulling too much voltage/amperage to run shader X or Y number of polygons, I do not know... but it is repeatable and can be seen on any of the graphs I posted.

To those who think it was my CPU limiting the GPU: if that were the case, the cards would not jump to 100% near the end/beginning of the tests - they would stay low the whole time... and the one time I monitored both GPU usage and CPU usage, the CPU should have been maxed when the GPU was dipping, but that wasn't the case...
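For what it's worth, the dips rush2049 describes can be pulled out of a logged utilization trace automatically - a minimal sketch over hypothetical sample data (on a real rig you'd feed it per-GPU readings from whichever monitoring tool you trust):

```python
def find_dips(samples, threshold=50):
    """Return (start, end) index pairs where GPU utilization (%) stays
    below `threshold` - i.e. the mid-benchmark drops visible on the graphs."""
    dips, start = [], None
    for i, u in enumerate(samples):
        if u < threshold and start is None:
            start = i                      # dip begins
        elif u >= threshold and start is not None:
            dips.append((start, i))        # dip ends, usage recovered
            start = None
    if start is not None:                  # trace ended mid-dip
        dips.append((start, len(samples)))
    return dips

# Hypothetical trace: full load, a mid-test dip, then full load again.
trace = [98, 97, 99, 35, 30, 28, 96, 99]
print(find_dips(trace))  # -> [(3, 6)]
```

If the same dip windows show up run after run, that supports "repeatable throttling" over "CPU limit".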


----------



## Levesque




----------



## kazukun

The difference between the two prints:
















Zotac GTX 590
The serial numbers differ by one.


----------



## Alatar

Updated.

Removed delavan from the list since he got rid of his card. If I missed something, shoot me a PM or post it here, thanks.


----------



## Masked

Quote:


> Originally Posted by *rush2049;13005059*
> When testing my card I made sure to show the graphs of gpu usage.... of both cores. You can obviously see the cores dropping and then raising again during the same sections of the same benchmark. So whatever function the cards were doing at the middle sections of the vantage bechmarks make the cards drop down. If it is the cards pulling too much voltage/amperage to accomplish shader X or Y number of polygons, I do not know.... but it is repeatable and can be seen repeatedly on any of the graphs I posted.
> 
> To those that think it was my cpu limiting the gpu, if that were the case the cards would not jump to 100% near the end/begining of the tests and would stay low always.... and the one time I monitored both the gpu usage and the cpu usage, the cpu should have been maxed when the gpu was dipping, but that wasn't the case.....


The major issue I have is...It's not all being reported as the same usage.

Aida will show 90/30 ~ Keep in mind, this is an award-winning monitoring/server-monitoring program that's far beyond reproach.

Precision will show 60/40 ~ Sometimes inaccurate, but a good program.

MSI will show 70/35 ~ Again, same as precision.

Benchmarks = 60/60 ~ 50/50 ~ AND an Evga tech has stated SEVERAL times on their forums, there are ISSUES with scaling and GPU usage reports.

At this point, I'm siding with Aida...If it says 90/30, while Jacob is saying benches are having issues...That's what it is.

If it's a cause of the OCP, I'm by far no expert but, I do have to side with these guys that it IS 90/30 regardless of what the benches are reporting...

While I don't approve of Pedro's "desktop paperweight" persona, I do agree that the higher core is the one limited and if it's the 2nd core in our case, this makes a lot of sense.

So at this point it's clearly, at least IMO, a case of hoping that drivers will fix the "scaling" issues a lot of us are seeing.
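One way to cut through the disagreement between tools is to poll the driver itself and log the per-GPU number - a minimal sketch, assuming a modern nvidia-smi build with `--query-gpu` support (newer than the utilities being compared above):

```python
import subprocess

def parse_utilization(csv_text: str) -> list[int]:
    """Parse the output of
    `nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader`
    (one line per GPU, e.g. "90 %" then "30 %") into per-GPU percentages."""
    return [int(line.split()[0]) for line in csv_text.strip().splitlines()]

def poll_gpus() -> list[int]:
    """Ask the NVIDIA driver directly for each GPU's current utilization."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_utilization(out)

print(parse_utilization("90 %\n30 %"))  # -> [90, 30]
```

Logging that at 1-second intervals during a bench run gives a single driver-side series to compare against whatever Aida/Precision/MSI display.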


----------



## kazukun

Quote:


> Originally Posted by *Alatar;13007148*
> updated.
> 
> removed delavan from the list since he got rid of his card. If I missed something shoot me a pm or post it here thanks


Hi
[email protected]


----------



## Alatar

Sorry about that kazukun, fixed now.

Ughh... my PSU still hasn't shipped. I'd like to do some testing and heavy gaming, but without it I'm at a loss :/


----------



## kazukun

Alatar, thank you for the update.
I'm attaching a water block late this week.
I'll report back once it's installed.


----------



## grunion

Quote:


> Originally Posted by *kazukun;13006426*
> Difference in two pieces of prints
> 
> ZotacGTX590
> One serial number difference


What are the manufacturing weeks of those cores?


----------



## kazukun

Quote:


> Originally Posted by *grunion;13008118*
> What are the manufacturing weeks of those cores?


Weeks 45 & 49, I think?

These are the batch numbers:
45N3XC 1045A1 & 49N3FN 1049A1
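For reference, the week can be read straight out of those lot codes, assuming the second field starts with YYWW (so 1045 = year 2010, week 45 - an inference from this thread, not official NVIDIA documentation):

```python
def decode_lot_code(lot: str) -> tuple[int, int]:
    """Decode a GF110 lot code like '45N3XC 1045A1' into (year, week).

    Assumes the second field encodes YYWW: '1045' -> year 2010, week 45.
    This encoding is inferred from the thread, not from official docs.
    """
    date_field = lot.split()[1]        # e.g. '1045A1'
    year = 2000 + int(date_field[:2])  # '10' -> 2010
    week = int(date_field[2:4])        # '45' -> week 45
    return year, week

print(decode_lot_code("45N3XC 1045A1"))  # -> (2010, 45)
print(decode_lot_code("49N3FN 1049A1"))  # -> (2010, 49)
```

Both cores decoding to late 2010 matches grunion's observation below that the chips predate the card's March 2011 launch by months.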


----------



## grunion

Quote:


> Originally Posted by *kazukun;13008630*
> 45weeks & 49weeks ??
> 
> This is a batch number.
> 45N3XC 1045A1 & 49N3FN 1049A1


Wow, those cores are from last year - they've really been working on the 590 for a good while.


----------



## Masked

Quote:


> Originally Posted by *grunion;13008652*
> Wow those cores are from last year, they've really been working on the 590 for a good while.


My understanding is they started being picked when the 580s were initially in production...

Then when the second/third runs started, they kept picking, obviously.

Would explain their "cherry picking" explanation to an extent.


----------



## soilentblue

They should have taken as much care with the rest of the card and the software as they did with the GPUs.


----------



## jcde7ago

Quote:


> Originally Posted by *soilentblue;13008993*
> they should have taken care towards the rest of the card and software as much as they did the gpus.


The thing is, I don't think Nvidia sought to achieve the performance levels of 580's in SLI with the 590 - and I think that's a huge point that people are missing.

This is a single-card, dual-GPU solution for people like me who want a performance increase past a single GTX 580 but with reduced heat, power consumption, and noise.

The above is exactly the reason why I switched from 2x480s to a single GTX 590. And let me tell you, it was very much worth it.

Would it be awesome if I could OC to the performance levels of 2 stock 580s?

Absolutely.

But I also paid $300 less for my card, and I still get the benefit of 1,024 CUDA cores, etc. I just have to be happy with an overclock on stock volts at about 700 MHz, which, for $300 less and with less noise, heat and power consumption, I am willing to live with.

Yes, it has failed to compete with the 6990 in the way that people expected it to - but realistically, people were expecting too much from a card that wasn't really supposed to match the power of 2 stand-alone 580s, period.


----------



## soilentblue

Quote:



Originally Posted by *jcde7ago*


The thing is, I don't think Nvidia sought to achieve the performance levels of 580's in SLI with the 590 - and I think that's a huge point that people are missing.

This is a single-card, dual-GPU solution for people like me who want a performance increase past a single GTX 580 but with reduced heat, power consumption, and noise.

The above is exactly the reason why I switched from 2x480s to a single GTX 590. And let me tell you, it was very much worth it.

Would it be awesome if I could OC to the performance levels of 2 stock 580s?

Absolutely.

edit: if you are going to run less VRAM than your competitor, then you'd better be sure you can keep up with them another way. Running this single card like 580 SLI is the way. It's the best way. There is no grey area in business.

But I also paid $300 less for my card, and I still get the benefit of 1,024 CUDA cores, etc. I just have to be happy with an overclock on stock volts at about 700 mhz which, for $300 less and coming with less noise, heat and power consumption, i am willing to live with.

Yes, it has failed to compete with the 6990 in the way that people expected it to - but realistically, people were expecting too much from a card that wasn't really supposed to match the power of 2 stand-alone 580s, period.


They built the card to compete first and foremost. If it doesn't compete, then it's not a success. Every card will have an owner that likes it, but that was not their goal while they were R&D'ing the card. If it works well for you then that's good, but from Nvidia's view (they are a business first) it's not a success.

Currently I wouldn't even recommend this card to anyone above 1920x1200. Maybe in a month or two with better drivers, but I'll never tell someone to buy a product off of "maybes". I would have rather spent an extra 100 bucks and had a card that could clock to 580 SLI levels.

For me the best thing about this card currently is the CUDA cores........period

Quote:



Yes, it has failed to compete with the 6990 in the way that people expected it to - but realistically, people were expecting too much from a card that wasn't really supposed to match the power of 2 stand-alone 580s, period.


How else were they going to win against the 6990 unless they ran it at 580 SLI levels? They knew they didn't have the VRAM on the card to compete. They knew what they were about to push out.


----------



## Masked

Quote:



Originally Posted by *soilentblue*


*They built the card to compete first and foremost. If it doesn't compete, then it's not a success.* Every card will have an owner that likes it, but that was not their goal while they were R&D'ing the card. If it works well for you then that's good, but from Nvidia's view (they are a business first) it's not a success.

Currently I wouldn't even recommend this card to anyone above 1920x1200. Maybe in a month or two with better drivers, but I'll never tell someone to buy a product off of "maybes". I would have rather spent an extra 100 bucks and had a card that could clock to 580 SLI levels.

For me the best thing about this card currently is the CUDA cores........period


This is false and I absolutely disagree.

This card was never built to compete, NVIDIA stated that from day 1 in their press releases...November, December, January...Never ever said ANYTHING about competition.

They DID say quieter, less power hungry, EXTREMELY LIMITED, CHERRY PICKED solution...Not ONCE did any press release EVER put forth or push competition as any other previous card release has.

The sampling that was given to us came with a note, and it did not say ANYTHING about competition...It said a quieter, dual-GPU solution...Even on the survey they released to reviewers it was blatantly obvious from the line of questioning: THIS CARD WAS NOT MADE TO COMPETE.

As I've said from day 1, HOW in god's name is a LIMITED EDITION anything going to compete with something STILL IN PRODUCTION?

Answer is simple: It can't because it doesn't have the numbers.

This was made as a more sensible dual-GPU solution for its niche users, and it very much sold out instantly and succeeded in that regard...

Meant to compete? Even a high school student could tell you it would be ridiculous.


----------



## soilentblue

Quote:



Originally Posted by *Masked*


This is false and I absolutely disagree.

This card was never built to compete, NVIDIA stated that from day 1 in their press releases...November, December, January...Never ever said ANYTHING about competition.

They DID say quieter, less power hungry, EXTREMELY LIMITED, CHERRY PICKED...Not ONCE did any press release EVER put forth or push competition as any other previous card release has.

The sampling that was given to us came with a note, and it did not say ANYTHING about competition...It said a quieter, dual-GPU solution...Even on the survey they released to reviewers it was blatantly obvious from the line of questioning: THIS CARD WAS NOT MADE TO COMPETE.

As I've said from day 1, HOW in god's name is a LIMITED EDITION anything going to compete with something STILL IN PRODUCTION?

Answer is simple: It can't because it doesn't have the numbers.

This was made as a more sensible dual-GPU solution for its niche users, and it very much sold out instantly and succeeded in that regard...

Meant to compete? Even a high school student could tell you it would be ridiculous.


Masked, get real man. They aren't in the business of petting unicorns and saving the planet. They are a business and what they say is no different from other companies. They are there to make money, and if AMD wasn't there then they would have higher prices and make more money.

Also, Nvidia themselves said that they had the fastest card. They said that to give the impression that their card was #1. That's competitive talk.

Limited edition does not mean it can't compete.

Not sure if the high school student line was a jab at me or not, but even you posted when the GTX 590 came out that you wanted to wait for reviews to see which one was the winner. Obviously you thought it had a chance of competing, and it's obvious that if the card can take the volts at 1.0V without OCP gimping it, then it will definitely compete, so I'm not sure why the remark.


----------



## Masked

Quote:


> Originally Posted by *soilentblue;13010175*
> masked get real man. they aren't in the business of petting unicorns and saving the planet. they are a business and what they say is no different from other companies. they are there to make money and if amd wasn't there then they would have higher prices and make more money.
> 
> also it was said from nvidia themselves that they had the fastest card. they said that to give the impression that their card was #1. that's competitive talk.
> 
> limited edition does not mean it can't compete.


Absolutely incorrect.

Business 101 my friend.

If you have 1000 units and your competition has 10000 units, what do you do?

You appeal to the niche market with a limited, "better niche" solution.

They absolutely did.

This is a card that's meant to be a quieter, more efficient, less problematic solution, and again, you'd be a fool to say it didn't succeed in that regard.

It was never made to compete clock for clock, the PCB is direct proof of that.

They made something for a niche market that was absolutely a hit...It absolutely sold out and is significantly more popular card for card than the 6990...But, again, that gets into semantics.

Facts are simple: Nvidia did NOT make enough to compete...They did make enough to control AND satisfy a niche market and did so absolutely; this card was never made to compete CLOCK FOR CLOCK...Again, it was made for a niche market and it without a shadow of a doubt, succeeded.

A high school student could tell you the difference; it's free-market capitalism...There's no way something limited can EVER compete with something in current production UNLESS it's appealing to a certain market...They did, and they hit a home run.


----------



## soilentblue

It is not less problematic. As long as you don't touch the clocks it's quieter and cooler than its competitor, but you already said it's not meant to compete, so why bring that up?

I never said anything about competing clock for clock. You should never judge AMD vs Nvidia clock for clock. Why do you keep saying that?

You are talking about everything but the way it performs power-wise. You wouldn't even be saying this if the card were more powerful. I'm done, as this is getting nowhere, but sorry, I have a card that isn't watercooled, and believe me, once clocked up, even on stock volts, it's not quiet (due to needing more than auto fan), it's not cooler, and it's not power efficient. All of this I can deal with, as it's an enthusiast card and that's what you get when you put two GPUs on one card. It does, however, currently not perform against its competition.

nvidia dressed their card up like the milk man and people are mad that everyone is asking for milk


----------



## Masked

Quote:



Originally Posted by *soilentblue*


nvidia dressed their card up like the milk man and people are mad that everyone is asking for milk


That made me lol...









I'm just saying that, the card sold out...In less time than the 6990 did...With more numbers than the 6990 had on release...

And as to its further success, we'll have to see, but saying it wasn't "good" for NVIDIA just isn't true.

They sold out of an ENTIRE RUN, within 8 hours...How is that not successful?


----------



## saulin

Quote:



Originally Posted by *soilentblue*


It is not less problematic. As long as you don't touch the clocks it's quieter and cooler than its competitor, but you already said it's not meant to compete, so why bring that up?

I never said anything about competing clock for clock. You should never judge AMD vs Nvidia clock for clock. Why do you keep saying that?

You are talking about everything but the way it performs power-wise. You wouldn't even be saying this if the card were more powerful. I'm done, as this is getting nowhere, but sorry, I have a card that isn't watercooled, and believe me, once clocked up, even on stock volts, it's not quiet (due to needing more than auto fan), it's not cooler, and it's not power efficient. All of this I can deal with, as it's an enthusiast card and that's what you get when you put two GPUs on one card. It does, however, currently not perform against its competition.

nvidia dressed their card up like the milk man and people are mad that everyone is asking for milk



Judging by the reviews. Stock vs stock, the GTX 590 wins because it actually is cooler and less noisy and they perform about the same.

Once overclocked. I'm not sure but I can tell you that in reviews the 6990 did not overclock that great either. I wouldn't think the 6990 pulls ahead that much when overclocked.

Plus Fud did have some benchmarks with newer Nvidia drivers that showed an improvement for the GTX 590 on benchmarks where it got beaten by the 6990. If anything, once drivers mature for both cards, the GTX 590 might end up being the faster card overall.


----------



## soilentblue

Quote:



Originally Posted by *Masked*


That made me lol...









I'm just saying that, the card sold out...In less time than the 6990 did...With more numbers than the 6990 had on release...

And as to its further success, we'll have to see, but saying it wasn't "good" for NVIDIA just isn't true.

They sold out of an ENTIRE RUN, within 8 hours...How is that not successful?


I don't go by that because of the limited amount that Nvidia had. I also don't know how many were supplied to each vendor. AMD sold out in a bunch of places as well. I also know that Nvidia wanted to have the fastest card in the dual-GPU market, and they failed at that. Money determines success first in a business, and I don't know their complete stats, but selling out doesn't always mean success. The PS3 wasn't a success until year 2. The N64 sold over 20 million and still wasn't a success. There are definitely ways to sell out of stock and still not be successful. The card was obviously constrained by budget, size, and power, or Nvidia would have just used the best of the best. The PR has definitely not helped the success of the card. Bad PR in some situations is still very much bad PR, and that matters when you are marketing to a specific "niche".

Quote:



Originally Posted by *saulin*


Judging by the reviews. Stock vs stock, the GTX 590 wins because it actually is cooler and less noisy and they perform about the same.

Once overclocked. I'm not sure but I can tell you that in reviews the 6990 did not overclock that great either. I wouldn't think the 6990 pulls ahead that much when overclocked.

Plus Fud did have some benchmarks with newer Nvidia drivers that showed an improvement for the GTX 590 on benchmarks where it got beaten by the 6990. If anything once drivers mature for both cards the GTX 590 might end up being the faster card overal.


The GTX 590 does better with DX10 games but is beaten by the 6990 at DX11. A lot of people are hoping that the April drivers bring this lackluster DX11 performance to an end. Even in some DX10 games it's beaten by the 6990, not just in average fps but in min fps as well, depending on the screen res.


----------



## Masked

Quote:



Originally Posted by *soilentblue*


I don't go by that because of the limited amount that nvidia had. i also don't know how many were supplied to every vender. amd sold out in a bunch of places as well. I also know that nvidia wanted to have the fastest card in the dual gpu market and they failed at that. money determines success first in a business and i don't know their complete stats but selling out doesn't always mean a success. the ps3 wasn't a success until year 2. the n64 sold over 20 million and still wasn't a success. there's definitely many ways to sell out of stock and still not be successful. the card obviously was constrained by budget, size, and power or nvidia would have just used the best of the best. the PR has definitely not helped the success of the card. bad PR in some situations is still very much bad PR and that matters when you are marketing a specific "niche".


I would like evidence of that because in every "leaked" release since November, I have yet to see 1 release that displays those intentions.

Nvidia sold out on Tigerdirect within 30 minutes. Sold out on Newegg within 15 minutes...Sold out on EVGA within 1hr...

6990 was present for days.

I hold to my statement, never once was this card meant to be the fastest, never once did they claim it was.

The ONLY evidence of any competition was a press release from AMD claiming to be the best and it was NEVER refuted by Nvidia.


----------



## soilentblue

Quote:


> Originally Posted by *Masked;13010761*
> I would like evidence of that because in every "leaked" release since November, I have yet to see 1 release that displays those intentions.
> 
> Nvidia sold out on Tigerdirect within 30 minutes. Sold out on Newegg within 15 minutes...Sold out on EVGA within 1hr...
> 
> 6990 was present for days.
> 
> I hold to my statement, never once was this card meant to be the fastest, never once did they claim it was.
> 
> The ONLY evidence of any competition was a press release from AMD claiming to be the best and it was NEVER refuted by Nvidia.


http://news.yahoo.com/s/zd/20110327/tc_zd/262388

It was there, but I can't seem to find the actual image that Nvidia used. They may have taken it down now.

Also, I thought the GTX 590 was on TigerDirect for two hours before it sold out. It went live at midnight where I live.

Also, there are a lot more AIB partners for the 6990 than for Nvidia, so that's not a fair comparison. There's more stock of the 6990, so selling out will not be as fast. That doesn't mean the 6990 isn't getting more sales or making more money per card than the 590.


----------



## soilentblue

http://pressroom.nvidia.com/easyir/c...ue&prid=736275

here it is. i knew i saw it somewhere.

Quote:



Innovative Dual GPU Design Is Also Dramatically Quieter Than Competitive Products


Tell me they aren't competing.


----------



## Masked

Quote:



Originally Posted by *soilentblue*


http://news.yahoo.com/s/zd/20110327/tc_zd/262388

it was there but I can't seem to find the actual image that nvidia used. they may have taken it down now.

also i thought the gtx 590 was on tigerdirect for two hours before it sold out. it went live at midnight where i live.

also there's alot more AIB for the 6990 than for nvidia so that's not a fair comparison. there's more stock with the 6990 so selling out will not be as fast. that doesn't mean the 6990 isn't getting more sales or making more money per card than the 590.


I'm actually wrong and I fold...Nvidia's own front page is rife with bull****









I apologize.

Official Press Release

Insert foot in mouth...

That's really interesting because UNTIL that release on the 24th, there was absolutely NOTHING they put out claiming it was the fastest...It was a quieter solution meant for a niche market.

**** you Nvidia, I'm losing more and more of these arguments because of your TERRIBLE PR.

I am going to add that the card, in terms of inventory, sold out so fast, it was absolutely a hit/success.


----------



## jcde7ago

Quote:


> Originally Posted by *soilentblue;13010820*
> http://news.yahoo.com/s/zd/20110327/tc_zd/262388
> 
> it was there but I can't seem to find the actual image that nvidia used. they may have taken it down now.
> 
> also i thought the gtx 590 was on tigerdirect for two hours before it sold out. it went live at midnight where i live.
> 
> also there's alot more AIB for the 6990 than for nvidia so that's not a fair comparison. there's more stock with the 6990 so selling out will not be as fast. that doesn't mean the 6990 isn't getting more sales or making more money per card than the 590.


You're delusional if you think the 590 was meant to compete with highly-clocked 580's in SLI.

That's like Nvidia killing its own fastest-single-gpu card market - what's not to like about the GTX 590 if it can do anything a pair of 580s can, consume less power, output less heat and noise, AND cost $300 less?

They stuck with this design to limit thermals, power consumption and noise, knowing that they did so at the expense of added performance.

People are trying to milk this card for much more than it's capable of, without wondering why it costs $300 less than a pair of cards it's "supposed to" compete with while having none of the disadvantages.

This is basic economics here.


----------



## Masked

Quote:



Originally Posted by *jcde7ago*


You're delusional if you think the 590 was meant to compete with highly-clocked 580's in SLI.

That's like Nvidia killing its own fastest-single-gpu card market - what's not to like about the GTX 590 if it can do anything a pair of 580s can, consume less power, output less heat and noise, AND cost $300 less?

They stuck with this design to limit thermals, power consumption and noise, knowing that they did so at the expense of added performance.

People are trying to milk this card for much more than it's capable of, without wondering why it costs $300 less than a pair of cards it's "supposed to" compete with while having none of the disadvantages.

This is basic economics here.


I DO agree with this, from a business standpoint, minus their claims...This release was a hit.


----------



## saulin

Quote:



Originally Posted by *jcde7ago*


You're delusional if you think the 590 was meant to compete with highly-clocked 580's in SLI.

*That's like Nvidia killing its own fastest-single-gpu card market* - what's not to like about the GTX 590 if it can do anything a pair of 580s can, consume less power, output less heat and noise, AND cost $300 less?

They stuck with this design to limit thermals, power consumption and noise, knowing that they did so at the expense of added performance.

*People are trying to milk this card for much more than it's capable of, without wondering why it costs $300 less than a pair of cards it's "supposed to" compete with while having none of the disadvantages.*

This is basic economics here.


Very well said.

Also, I noticed their claim on their site, hahaha. But believe me, if you were to test more games than what's being tested in reviews, the GTX 590 would come out on top, since most games are TWIMTBP titles.

On the other hand, in most reviews the GTX 590 wins half of the games and the 6990 wins the other half, and there is no way the 6990 could claim the title of world's fastest card in that case.


----------



## jcde7ago

Quote:



Originally Posted by *Masked*


I DO agree with this, from a business standpoint, minus their claims...This was absolutely a hit.


Yes, but I wouldn't pay attention to those claims - we all know Nvidia's engineering department is just sneering at their marketing department for the way the card has been marketed. Yes, it's been an overwhelming success in terms of sales and popularity, but they need to not boast about it being "the fastest GPU in the world," because it clearly was not engineered to be that way.

It's meant to fill a niche market, which is where I agree with you about it being a "Limited Edition."


----------



## soilentblue

Quote:


> Originally Posted by *jcde7ago;13010918*
> You're delusional if you think the 590 was meant to compete with highly-clocked 580's in SLI.
> 
> That's like Nvidia killing its own fastest-single-gpu card market - what's not to like about the GTX 590 if it can do anything a pair of 580s can, consume less power, output less heat and noise, AND cost $300 less?
> 
> They stuck with this design to limit thermals, power consumption and noise, knowing that they did so at the expense of added performance.
> 
> People are trying to milk this card for much more than its capable and not wondering why it costs $300 less than a pair of cards it's "supposed to" compete with while having none of the disadvantages.
> 
> This is basic economics here.


If they wanted to beat the 6990, then they didn't have a choice. It's already been established that they wanted to beat the 6990, and the 590 sits at 570 SLI power while the 6990 sits at 580 SLI power. For those that don't like overclocking, 580 SLI is still there. For those that like taking cards and overclocking, the 590 could have been there, but it currently isn't. You cannot say that they wanted to beat the 6990 but didn't want 580 SLI speed. It's a contradiction.


----------



## soilentblue

Quote:



Originally Posted by *jcde7ago*


Yes, but I wouldn't pay attention to those claims - we all know Nvidia's engineering department is just sneering at their marketing department for the way the card has been marketed - yes, it's been an overwhelming success in terms of sales and popularity, but they need to not boast about it being "the fastest gpu in the world," cause it clearly was not engineered to be that way.

It's meant to fill a niche market, which is where I agree with you about it being a "Limited Edition."


That absolutely doesn't change the fact that it's marketed for a specific purpose and fails to deliver. If the marketing hadn't been what it was, my claims wouldn't hold water.

Same thing I told Masked: you also don't know how much it cost to make the card, and with it being a limited item it's probably priced pretty close to cost. They may not even be making money off of it. It could simply be to gain the crown, so to speak, but I'm sure some profit is made off of it.

Selling out does not always equal success. Perception of success? Sure, but not business $ucce$$.


----------



## jcde7ago

Quote:



Originally Posted by *soilentblue*


if they wanted to beat the 6990 then they didn't have a choice. it's already been established that they wanted to beat the 6990 and the 590 sits at 570 sli power. the 6990 sits at 580 sli power. for those that don't like overclocking the 580 sli is still there. for those that like taking cards and overclocking the 590 could have been there but it currently isn't. you can not say that they wanted to beat the 6990, but didn't want to have 580 sli speed. it's an oxymoron.


I for one would never try to speak to what Nvidia's intentions were with the GTX 590 in regards to competing with the 6990. If they came out and said that they wanted to compete with the 6990, that's different from their claim of being "the fastest GPU in the world" - we all know that's just BS propaganda. It's not necessarily an official stance of "we're definitely at war with the 6990 here."

I know better than to spew crap I know nothing of, so whatever they said about wanting to beat/crush/slice/dice/cripple/maim the 6990 is on them.

Which again brings me to the point that the engineering and marketing aspects of this card do not agree with each other. It was built one way and marketed another, that's for sure.


----------



## soilentblue

Quote:


> Originally Posted by *Masked;13010886*
> I'm actually wrong and I fold...Nvidia's own front page is ripe with bull****
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I apologize.
> 
> Official Press Release
> 
> Insert foot in mouth...
> 
> That's really interesting because UNTIL that release on the 24th, there was absolutely NOTHING they put out claiming it was the fastest... It was a quieter solution meant for a niche market.
> 
> **** you Nvidia, I'm losing more and more of these arguments because of your TERRIBLE PR.
> 
> I am going to add that the card, in terms of inventory, sold out so fast, it was absolutely a hit/success.


I'm not saying you're lying, Masked. Just that the website has been up since launch claiming it; then AMD calls them out on it, and they give no response, but the website stays up. It's a lie, unless they're leaving it up because they know something about the next drivers that I don't. Still a very far-fetched assumption, though.


----------



## grunion

Two years ago Jen-Hsun Huang said NV would not do a dual-GPU card if it was not the fastest solution.
Hmm.

We really shouldn't be crapping up this thread; we have like 5 actual owners, but the others that are posting, me included, are kind of making a mess of this thread.

Anyway, compared to the bullet/bomb/nuke-proof rev 1 GTX 295, this 590 does not carry on the crown as the best.
That's what is disappointing to me.


----------



## Pedros

If you think that there are only 1000 units of a 590 ... meh ... think again ...

btw...
http://techreport.com/discussions.x/20711


----------



## RagingCain

Quote:



Originally Posted by *grunion*


Two years ago Jen-Hsun Huang said NV would not do a dual-GPU card if it was not the fastest solution.
Hmm.

We really shouldn't be crapping up this thread; we have like 5 actual owners, but the others that are posting, me included, are kind of making a mess of this thread.

Anyway, compared to the bullet/bomb/nuke-proof rev 1 GTX 295, this 590 does not carry on the crown as the best.
That's what is disappointing to me.


That sentiment completely makes sense to me. I, on the other hand, wanted a little bit more performance than 2x GTX 580s, yet have only ever wanted two cards at the most.

In regards to efficiency, base-clock performance, compactness, and power requirements, I was indeed impressed. I am of course looking at it from the perspective of an avid video gamer and SLI GTX 580 user. I am sorry, but what they accomplished is still very impressive to me. It is a shame about the overclocking headroom, but again, it's 2x GF110s on this puppy. I just couldn't imagine it being done before it was released.

In that regard, I think it's an absolute success; after all, is there a more powerful Nvidia card? On the other side, it failed to live up to Nvidia's own pursuit of raw performance, robustness, and of course, unofficially, its overclockability.

In regards to marketing, I have ignored most of it.


----------



## Masked

Quote:



Originally Posted by *Pedros*


If you think that there are only 1000 units of a 590 ... meh ... think again ...

btw...
http://techreport.com/discussions.x/20711


Nothing new...


----------



## CSHawkeye

You moved to a GTX 590?


----------



## RagingCain

Updated the members list, but Alatar, you need to resize it so it fits in the post better.

I recommend appending this to the end of the embed URL:
&w=93&h=500&gid=0&single=true

Width will be 93, height will be 500 (although it can be more).
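For anyone else embedding the members spreadsheet, those are just query parameters tacked onto the published-document URL. A minimal sketch in Python; the base URL below is a made-up placeholder, not the real document link:

```python
from urllib.parse import urlencode

# Hypothetical published-spreadsheet URL; the real document link differs.
base = "https://spreadsheets.google.com/pub?key=EXAMPLE_KEY"

# Sizing parameters recommended above: width 93, height 500.
params = {"w": 93, "h": 500, "gid": 0, "single": "true"}

embed_url = base + "&" + urlencode(params)
print(embed_url)
# https://spreadsheets.google.com/pub?key=EXAMPLE_KEY&w=93&h=500&gid=0&single=true
```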


----------



## Arizonian

Quote:


> Originally Posted by *grunion;13011258*
> 2 years a go Jen sun Huang said NV would not do a dual gpu card if it was not the the fastest solution.
> Hmm
> 
> We really shouldn't be crapping up this thread, we have like 5 actual owners, but the others that are posting, me included and kind of making a mess of this thread.
> 
> Anyway, compared to the bullet/bomb/nuke proof rev 1 GTX295 this 590 does not pass on the crown as being the best.
> That's what is disappointing to me.


It shows true colors that Nvidia has been on top for sooo long and Nvidia fans don't go to ATI club threads and troll.

Yet ATI fans seem compelled to come over to our threads to troll, rub things in, and put us down; that says a lot about them.

If you're not an Nvidia GTX 590 owner and don't have a true interest in Nvidia, please refrain. ATI fans: start your own "I hate Nvidia" thread and leave this one alone.

We use this thread to communicate with each other on OCN regarding our cards. I sort of find it disturbing to have to deal with you guys. Just look back on these pages and you'll see how many ATI fans felt they had to post here. After a long day of work I like to read this thread to see how the new card series is going, good or bad. We don't sit around knocking ATI, because we don't care. These guys spent a lot of money, and this is where we get info back and forth between us.

/end rant


----------



## RagingCain

Quote:


> Originally Posted by *Arizonian;13017182*
> It shows true colors that Nvidia has been on top for sooo long and Nvidia fans don't go to ATI club threads and troll.
> 
> Yet ATI fans seem compelled to come over to our threads to troll, rub things in, and put us down; that says a lot about them.
> 
> If you're not an Nvidia GTX 590 owner and don't have a true interest in Nvidia, please refrain. ATI fans: start your own "I hate Nvidia" thread and leave this one alone.


BUT WHY WHEN 6990+6970 IS SO MUCH BETTER?!?!?!?!?!!?!?!?!?!

This one guy keeps saying that in every thread he goes into. Of course, all of them seem to be related to the GTX 590 in some way.

Just ignore it, or report them for trolling.

That's one thing that does annoy me here on Overclock.net: for every respectable member with an opinion worth listening to, there are 500 idiots, yet there is no filter. Maybe a minimum-rep filter/forum would alleviate some of that... I don't know.

I wouldn't even mind deleting the first 68 pages of posts. Just keep proof posts and the like, and useful information.


----------



## Stefy

Quote:


> Originally Posted by *RagingCain;13018252*
> BUT WHY WHEN 6990+6970 IS SO MUCH BETTER?!?!?!?!?!!?!?!?!?!
> 
> This one guy keeps saying that in every thread he goes into. Of course, all of them seem to be related to the GTX 590 in some way.
> 
> Just ignore it, or report them for trolling.
> 
> That's one thing that does annoy me here on Overclock.net: for every respectable member with an opinion worth listening to, there are 500 idiots, yet there is no filter. Maybe a minimum-rep filter/forum would alleviate some of that... I don't know.
> 
> I wouldn't even mind deleting the first 68 pages of posts. Just keep proof posts and the like, and useful information.


I really don't think an opinion is worth anything in the 590 vs. 6990 debate. The 6990 is a better card; there's nothing more to it. Personally I like to make fun of Nvidia, not because I hate them, but because they really failed with the 570 and 590. Instead of getting mad, people should just move on. Everyone fails; the more you laugh about it, the harder you'll get hit if Nvidia makes a comeback.

In such debates there will always be trolling. Just deal with it or move on.


----------



## Alatar

Quote:


> Originally Posted by *RagingCain;13012868*
> Updated the member's list, but Alatar you need to resize so it fits in the post better.
> 
> I recommend this at the end:
> &w=93&h=500&gid=0&single=true
> 
> Width will be 93, height will be 500 (although it can be more)


I'll resize and tweak it when I get home from my chemistry exam.

And to the poster above me: yes, there is room for opinions, because there are pros and cons for both the 590 and the 6990, but we do not want that discussion here.


----------



## Special_K

Just to show these cards are not gimped...by any means.


----------



## jcde7ago

Quote:


> Originally Posted by *Special_K;13018683*
> Just to show these cards are not gimped...by any means.


Wow, those are some pretty nice clocks for such low voltages.

Curious though: were you not able to clock the memory up much, or did you just choose not to?


----------



## Arizonian

Quote:


> Originally Posted by *Stefy;13018353*
> I really don't think an opinion is worth anything in the 590 vs. 6990 debate. The 6990 is a better card; there's nothing more to it. Personally I like to make fun of Nvidia, not because I hate them, but because they really failed with the 570 and 590. Instead of getting mad, people should just move on. Everyone fails; the more you laugh about it, the harder you'll get hit if Nvidia makes a comeback.
> 
> In such debates there will always be trolling. Just deal with it or move on.


The point is we don't go over to the 6990 club thread and post about the 6990s that just died or the 6950s that artifact after a BIOS flash. There's a proper place to play fanboy; club threads aren't it.

So now that you've made your comment and have nothing to contribute to this 590 club, you can move on. Point taken. Go back to the Red Tide ATI fanboy club and waste their time with your pointless comments.

Club forums should have the ability to have their own moderator to delete posts that are off topic to the club members' points of interest. That's all I'm going to say, as I'm not going to get into this on this thread any longer.

**Arizonian goes back to quietly reading valid posts that matter to the 590 club and Nvidia enthusiasts.**


----------



## zerounleashednl

@Special_K: What's the impact of the CPU clock when running 2x 590s? I see you've got an i7 980. Would you run a 3DMark11 bench at the default 3.3GHz clock and at a 4.3GHz OC with the 2x 590s for me?


----------



## Draygonn

Quote:


> Originally Posted by *Arizonian;13018960*
> There's a proper place to play fanboy; club threads aren't it.


I always felt that club threads should have different rules than normal threads, such as no fanboy bashing and more off-topic leeway for owners. If you want to bash 590s there are plenty of other threads. Not as many as Crysis threads, but still...


----------



## ReignsOfPower

I'm still not part of the memberlist







Posted a while back lol


----------



## Alatar

sorry 'bout that, added.


----------



## ReignsOfPower

Quote:


> Originally Posted by *Alatar;13021011*
> sorry 'bout that, added.


Thanks babes! haha
Sorry about the lack of input in terms of benchies on my 30". Been busy with work.







I also just purchased another Dell 3011, so I have two now.







I heard you can daisy-chain more than one screen through the DP port on these cards. Is this true? It would improve the cable clutter behind these beasties if that were the case!


----------



## Masked

Quote:



Originally Posted by *ReignsOfPower*


Im still not part of the mamberlist







Posted a while back lol


I think that's the first time I've ever seen "mamberlist" spelt with an a...


----------



## ReignsOfPower

Quote:



Originally Posted by *Masked*


I think that's the first time I've ever seen "mamberlist" spelt with an a...


Whoops


----------



## Masked

Quote:



Originally Posted by *ReignsOfPower*


Whoops










I haven't had my coffee yet this morning, so it actually made me laugh sounding it out... Then it was like


----------



## CSHawkeye

Well, I got my new card back from EVGA's RMA. It was pretty fast, and they even gave me a letter apologizing for the inconvenience. The new card is running nice and smooth on the 270.51 drivers.


----------



## Alatar

Quote:



Originally Posted by *RagingCain*


Updated the member's list, but Alatar you need to resize so it fits in the post better.

I recommend this at the end:
&w=93&h=500&gid=0&single=true

Width will be 93, height will be 500 (although it can be more)


Again, thanks for that.









Resized now, but I still think that 100 is a better width.


----------



## Masked

Quote:



Originally Posted by *Alatar*


again thanks for that









resized now. But I still think that 100 is a better width.


Hey Mr. Alatar,

I requested permission to edit that document, but meh, maybe someone else would like to change it for me.

2x GTX 590 Hydro Coppers ~

I installed the new drivers last night at home and am no longer at 850MHz; 750 on both now ~ I'll get you a screenshot and verification when I get home...


----------



## Alatar

Quote:



Originally Posted by *Masked*


Hey Mr. Alatar,

I requested permission to edit that document, but meh, maybe someone else would like to change it for me.

2x GTX 590 Hydro Coppers ~

I installed the new drivers last night at home and am no longer at 850MHz; 750 on both now ~ I'll get you a screenshot and verification when I get home...


I'm giving edit permissions to whoever wants them, as soon as I see the notification email in my Gmail. You probably already have permission.


----------



## Masked

Last night I came to the conclusion that 850MHz wasn't worth it.

One core was trying to get more power while the other core was choking so horribly that I actually received a score lower than my former 480s in SLI, so that train quickly came to a grinding halt...

After getting the new drivers and upping it as much as I could on stock, I am able to get 750/1500... 700-750MHz seems to be about the limit atm before it's "choked"...

Think I'm going to chill at 750 and start benching a little.


----------



## soilentblue

For those that want an EK waterblock and backplate: FrozenCPU said they will be getting backplates for the GTX 590 waterblocks as soon as they become available, but there's no ETA. Last they heard was two weeks.


----------



## Special_K

@jcde7ago: I was just trying the core clock for the moment. I went in 10MHz increments until I found it unstable; after that I looked for ways to increase the slider bar's limit. I can try memory tonight.

@zerounleashednl: I will try to run those settings tonight and see what I get. I already ran Vantage at 4GHz with stock 590s and got 40k. I haven't had a chance to run 3DMark11 yet, as I haven't had a DX11 card before.
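The step-until-unstable routine described above is essentially a linear search for the highest passing clock. A rough sketch of the idea in Python, where `is_stable` is a hypothetical stand-in for whatever stress test or benchmark you actually run at each step:

```python
def find_max_stable_clock(start_mhz, step_mhz, is_stable, limit_mhz=1000):
    """Raise the clock in fixed steps until the stress test fails,
    then report the last clock that passed."""
    clock, last_good = start_mhz, None
    while clock <= limit_mhz:
        if not is_stable(clock):
            break  # first unstable clock found; stop stepping
        last_good = clock
        clock += step_mhz
    return last_good

# Hypothetical stress test: pretend anything under 740 MHz passes.
print(find_max_stable_clock(607, 10, lambda mhz: mhz < 740))  # 737
```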


----------



## RagingCain

Quote:



Originally Posted by *Special_K*


@jcde7ago: I was just trying the core clock for the moment. I went in 10MHz increments until I found it unstable; after that I looked for ways to increase the slider bar's limit. I can try memory tonight.

@zerounleashednl: I will try to run those settings tonight and see what I get. I already ran Vantage at 4GHz with stock 590s and got 40k. I haven't had a chance to run 3DMark11 yet, as I haven't had a DX11 card before.


Hey Special, just mentioning it since you might already know: you can get a free copy of 3DMark11 with your cards (saves you from having to buy it).

All EVGA GTX 5xx card owners can get this:
http://www.evga.com/forums/tm.aspx?m=672594

Might be worth mentioning in the first post too.


----------



## Twilex

My baby finally came in. Putting the EK block on as we speak. I have to redo my entire loop tonight, so I'll see about getting benchies up; might have to wait till this weekend though.

Not 56k Friendly


----------



## Special_K

@RagingCain: thank you for the message. I got my serial just today, after registering my cards on day one. It took them a while to send me the link, but I have it now.


----------



## rush2049

Got my system back up and running. Turns out that when the video card requested too many amps on the rail it was on, my power supply complied. The problem was that the amperage was over my power supply's specifications, and the voltages went crazy high/low. In the end it was my motherboard that blew first, leaving the system doing nothing but powering the fans when turned on. No POST, no video, no beeps.

I am lucky it didn't blow my CPU or anything else... teaches me to check the power requirements and make sure I wire things up to enough rails to support power-hungry components.

So now I am rocking a Corsair AX1200 power supply, a Corsair Obsidian 800D case, and an Asus Crosshair IV Formula motherboard.

My temps are more stable than they ever were, and I have a single-rail PSU, so I do not have to worry about power issues.

...now begins the wait for better drivers; till then I will be gaming up a storm.
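A quick sanity check on the multi-rail math being described: a rail's capacity is just its voltage times its rated amperage, and a GTX 590's 365W board power easily outruns one modest 12V rail. A small illustration; the 20A rail rating is a made-up example, while 365W is Nvidia's published TDP for the card:

```python
def rail_watts(volts, amps):
    # A rail's deliverable power is voltage times rated amperage.
    return volts * amps

single_rail = rail_watts(12, 20)   # hypothetical 12V/20A rail: 240W
card_tdp = 365                     # GTX 590 board power per Nvidia's spec

# One such rail alone cannot feed the card at full load.
print(card_tdp <= single_rail)  # False
```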


----------



## jcde7ago

Quote:


> Originally Posted by *rush2049;13026454*
> Got my system back up and running. Turns out that when the video card requested too many amps on the rail it was on, my power supply complied. The problem was that the amperage was over my power supply's specifications, and the voltages went crazy high/low. In the end it was my motherboard that blew first, leaving the system doing nothing but powering the fans when turned on. No POST, no video, no beeps.
> 
> I am lucky it didn't blow my CPU or anything else... teaches me to check the power requirements and make sure I wire things up to enough rails to support power-hungry components.
> 
> So now I am rocking a Corsair AX1200 power supply, a Corsair Obsidian 800D case, and an Asus Crosshair IV Formula motherboard.
> 
> My temps are more stable than they ever were, and I have a single-rail PSU, so I do not have to worry about power issues.
> 
> ...now begins the wait for better drivers; till then I will be gaming up a storm.


Nice!

I also want to point out that if you have a single GTX 590 and are running another card for PhysX (like I am with my 9800 GT), you may get some flickering in Crysis 2, most likely due to the drivers.

If you are getting this flickering, disabling the PhysX card (the card not in the SLI "loop") will fix the issue, as will pausing the game, changing the graphics from Extreme to High, and then back to Extreme again.

I thought something was wrong with my card, but it's been flawless with every other game, 3DMark11, Heaven, etc., and I stumbled upon some 580/general SLI users who were having the same issue with Crysis 2; this solved it for them.


----------



## RagingCain

ME2 is gone


----------



## jcde7ago

Successfully did some Heaven/3DMark11 runs at 700/1400/1800 clocks @ stock volts.

http://www.techpowerup.com/gpuz/chyhc/


----------



## RagingCain

Quote:


> Originally Posted by *jcde7ago;13029833*
> Successfully did some Heaven/3DMark11 runs at 700/1400/1800 clocks @ stock volts.
> 
> http://www.techpowerup.com/gpuz/chyhc/


Nice, I updated your stats on the main page.


----------



## whiz882

Nvidia Zotac GeForce GTX 590 Quad SLI



















☆ My system's 3DMark Vantage score

CPU clock (FSB × multi): 4499.3MHz (204.5 × 22)

Vcore: 1.343V (CPU-Z)

QPI voltage: 1.45V

Memory voltage: 1.64V

DRAM frequency: 1022.6MHz

DRAM timings: 8-8-8-20-1T

VGA clock/mem/shader: 750/1900/1500

Score: 64481


----------



## Masked




----------



## RagingCain

Quote:


> Originally Posted by *Masked;13030178*


I thought you said you had 2 HydroCoppers?


----------



## Masked

Quote:


> Originally Posted by *RagingCain;13030330*
> I thought you said you had 2 HydroCoppers?


One's at work, one's at home...

There are 12 in the office, not including mine.

Dry cleaning on the chair + crappy wallpaper = home


----------



## ReignsOfPower

Since we are all posting pics of our 590 rigs, here is mine.


----------



## Alatar

The package that has my AX1200 in it has been posted.









With some amazing luck I might even get it by tomorrow, if not then it'll be here on Monday.


----------



## JY

this thread makes me sad


----------



## saulin

Quote:


> Originally Posted by *whiz882;13029925*
> Nvidia Zotac GeForce GTX 590 Quad SLI
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ☆ My system's 3DMark Vantage score
> 
> CPU clock (FSB × multi): 4499.3MHz (204.5 × 22)
> 
> Vcore: 1.343V (CPU-Z)
> 
> QPI voltage: 1.45V
> 
> Memory voltage: 1.64V
> 
> DRAM frequency: 1022.6MHz
> 
> DRAM timings: 8-8-8-20-1T
> 
> VGA clock/mem/shader: 750/1900/1500
> 
> Score: 64481


That seems like a pretty sweet score! Can you do a Crysis benchmark run with those clocks?


----------



## RagingCain

Can someone explain to me what's wrong with this monitor?

I am not understanding the price.

http://www.newegg.com/Product/Product.aspx?Item=N82E16824160062&cm_re=AOC-_-24-160-062-_-Product

It's just soooo thin... I am so tempted to get three.


----------



## rush2049

That contrast ratio isn't very good. DCR stands for dynamic contrast ratio, i.e. the brightest white it can ever display compared to the darkest black it can ever display, measured at different times with different backlight settings.

A better contrast ratio measurement is SCR, or static contrast ratio: the brightest white it can display while at the same time displaying the darkest black.

So I would say the contrast figure they are posting is misleading... and because they posted that misleading statistic, I question the response-time claim as well...
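To put numbers on that distinction: both figures are just peak white luminance divided by black luminance, but the dynamic number lets the backlight change between the two measurements. A small illustration with made-up luminance values, chosen so the dynamic figure lands on the sort of 80,000:1 marketing number under discussion:

```python
def contrast_ratio(white_nits, black_nits):
    # Contrast ratio: peak white luminance over black luminance.
    return white_nits / black_nits

# Static: white and black measured on the same frame, same backlight.
static = contrast_ratio(300, 0.3)
# Dynamic: black remeasured with the backlight dimmed way down.
dynamic = contrast_ratio(300, 0.00375)

print(f"{static:.0f}:1")   # 1000:1
print(f"{dynamic:.0f}:1")  # 80000:1
```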


----------



## RagingCain

Quote:



Originally Posted by *rush2049*


That contrast ratio isn't very good. DCR stands for dynamic contrast ratio, i.e. the brightest white it can ever display compared to the darkest black it can ever display, measured at different times with different backlight settings.

A better contrast ratio measurement is SCR, or static contrast ratio: the brightest white it can display while at the same time displaying the darkest black.

So I would say the contrast figure they are posting is misleading... and because they posted that misleading statistic, I question the response-time claim as well...


My current monitor is rated by DCR at 80,000:1 and does a fantastic job. This monitor claims a response time of 5ms, but I wonder if it's BTB and not GTG. A lot of companies claim 2ms/5ms but have to specify GTG, as it's easier to achieve than BTB. I am assuming it does have some money-saving features on it, but at this price I am sure they can be justified.

The price is very alluring. I will get one and let you all know how it goes, then consider an additional two.

The size/bezel is very decent for a tri-monitor setup. I was supposed to wait a while on this... but now I am not sure.

5760x1080 has to be decently playable with 2x GTX 590s.

EDIT:
After playing with the settings, the monitor is a cherry; I picked up 2 more for Surround. Now, if only I had 2x GTX 590s...


----------



## whiz882

Quote:



Originally Posted by *saulin*


That seems like a pretty sweet score? Can you do a Crysis benchmark run with those clocks?


Is this a pretty sweet score? Thank you!









The current GTX 590 driver (270.51) cannot change the voltage,

so I'll run the Crysis bench after the driver has been updated.

Will post here soon!

↓ Max voltage is at the default


----------



## zerounleashednl

Quote:



The current GTX 590 driver (270.51) cannot change the voltage,

so I'll run the Crysis bench after the driver has been updated.

Will post here soon!

↓ Max voltage is at the default


Hmm... could Afterburner override this in a new (beta) version?


----------



## whiz882

Quote:



Originally Posted by *zerounleashednl*


Hmm... could Afterburner override this in a new (beta) version?


Afterburner (v2.2.0 Beta 2) and the GeForce driver don't have new versions yet.


----------



## whiz882

Thanks, Alatar!

That paste is complete :specool:


----------



## Alatar

Quote:


> Originally Posted by *whiz882;13046259*
> Thanks, Alatar!
> 
> That paste is complete :specool:


np, glad I could help


----------



## soilentblue

Man, I was really hoping Nvidia would release some drivers this week.


----------



## saulin

Quote:


> Originally Posted by *whiz882;13043110*
> Is this a pretty sweet score? Thank you!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The current GTX 590 driver (270.51) cannot change the voltage,
> 
> so I'll run the Crysis bench after the driver has been updated.
> 
> Will post here soon!
> 
> ↓ Max voltage is at the default


That kind of sucks, not being able to up the voltage at all now.

How about flashing the cards to the original Asus GTX 590 BIOS and using the Asus Voltage Tweak utility?

Or do the drivers still prevent the voltage from being touched at all?


----------



## soilentblue

I have the original Asus BIOS and it's still limited by the drivers. There's a new BIOS, but I was told by Asus that it's only to make the fan less noisy.


----------



## saulin

Quote:


> Originally Posted by *soilentblue;13048606*
> I have the original Asus BIOS and it's still limited by the drivers. There's a new BIOS, but I was told by Asus that it's only to make the fan less noisy.


Oh, that sucks, man. Someone needs to find a way to bypass the lock in the driver.









I would probably try the BIOS of that GTX 590 that comes factory overclocked to 690MHz; that one may add a little extra voltage.

On my Zotac GTX 480 I found the best BIOS for me to be the one from the EVGA GeForce GTX 480 Hydro Copper FTW.


----------



## Alatar

Yeah, what soilentblue said. I'm on the original Asus BIOS as well, and the voltage is capped at 925mV.

And I just got my AX1200 installed


----------



## Masked

Quote:


> Originally Posted by *Alatar;13048804*
> Yeah, what soilentblue said. I'm on the original Asus BIOS as well, and the voltage is capped at 925mV.
> 
> And I just got my AX1200 installed


The AX1200 is an amazing PSU... I swear by them now!

I'm actually working on rewriting the driver, but I found quite a few people on the forum having issues with one of our mobile drivers, so I'm working on that atm.

Soon, though, soon!


----------



## PizzaMan

I wouldn't increase the voltage on these puppies if I were you. Nvidia is capping them with the drivers because they blow up quicker than 570s; 4 phases per GPU just isn't enough. They are also downclocking the poo out of them when the current limit gets high. Nvidia is pumping out the poorest PCBs I've ever seen, and now they're trying to cover their butts with these new neutered drivers. It's just too sad, and I'm a big NV fan.









Google "dead GTX 590" and you'll see what I mean.


----------



## saulin

Quote:


> Originally Posted by *PizzaMan;13048922*
> I wouldn't increase the voltage on these puppies if I were you. Nvidia is capping them with the drivers because they blow up quicker than 570s; 4 phases per GPU just isn't enough. They are also downclocking the poo out of them when the current limit gets high. Nvidia is pumping out the poorest PCBs I've ever seen, and now they're trying to cover their butts with these new neutered drivers. It's just too sad, and I'm a big NV fan.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Google "dead GTX 590" and you'll see what I mean.


Yeah, but some people got some decent overclocks, 725-750MHz, with just a minor bump, and now they can't get that. [H]OCP only used 0.963V, which isn't much at all.


----------



## soilentblue

Quote:


> Originally Posted by *saulin;13048977*
> Yeah, but some people got some decent overclocks, 725-750MHz, with just a minor bump, and now they can't get that. [H]OCP only used 0.963V, which isn't much at all.


0.963V is more than I can use, lol.


----------



## Masked

Quote:


> Originally Posted by *saulin;13048977*
> Yeah, but some people got some decent overclocks, 725-750MHz, with just a minor bump, and now they can't get that. [H]OCP only used 0.963V, which isn't much at all.


I'm sticking to the old drivers until I can get an edited one pushed out for you guys.

I was doing 750-800MHz at 0.95V, and 850MHz at 1.00V.

My card didn't explode; I really don't see how 0.95 or even 0.925 is a crime, to be quite honest.


----------



## soilentblue

Quote:


> Originally Posted by *Masked;13049103*
> I'm sticking to the old drivers until I can get an edited one pushed out for you guys.
> 
> I was doing 750-800MHz at 0.95V, and 850MHz at 1.00V.
> 
> My card didn't explode; I really don't see how 0.95 or even 0.925 is a crime, to be quite honest.


I think the consensus is that even though you can push 750-800, you are not getting the constant volts/amps needed to maintain that OC. No one has successfully kept it overclocked at that level while still seeing a consistent performance increase.

For the record, I definitely want the card to succeed; I'm just wary that it's too much. The only thing giving me hope, Masked, is that Nvidia said, before all this "blowing up" stuff, that you could volt up to 1.05, so apparently they successfully got that high without problems. Not sure if they knew the OCP would kick in, though.


----------



## Masked

Quote:


> Originally Posted by *soilentblue;13049186*
> I think the consensus is that even though you can push 750-800, you are not getting the constant volts/amps needed to maintain that OC. No one has successfully kept it overclocked at that level while still seeing a consistent performance increase.
> 
> For the record, I definitely want the card to succeed; I'm just wary that it's too much. The only thing giving me hope, Masked, is that Nvidia said, before all this "blowing up" stuff, that you could volt up to 1.05, so apparently they successfully got that high without problems. Not sure if they knew the OCP would kick in, though.


In [H]'s benches, it can be seen that 750MHz makes a big difference with only a small voltage increase...

If there truly is an OCP/"throttling" issue, it's clearly in how the card handles the power management of the second core.

We didn't see that in the first two drivers, only the third and fourth.

So I think there is some success to be had UNDER 1.00V; perhaps 0.95 is the magic number?


----------



## helifax

Hi guys.

Here is my OCed Asus GTX 590 card + EK waterblock:

Currently the card is using the latest drivers (270.51) and is OCed to:
723MHz core / 1446MHz shaders / 3700MHz memory (see attachments)


----------



## soilentblue

I wonder if 0.963 is only achievable with Beta 1? I have MSI Beta 2 and it's limited to 0.935 or something around there.


----------



## armartins

Quote:



Originally Posted by *jcde7ago*


The thing is, I don't think Nvidia sought to achieve the performance levels of 580's in SLI with the 590 - and I think that's a huge point that people are missing.

This is a single-card, dual-GPU solution for people like me who want a performance increase past a single GTX 580 but with reduced heat, power consumption, and noise.

The above is exactly the reason why I switched from 2x480s to a single GTX 590. And let me tell you, it was very much worth it.

Would it be awesome if I could OC to the performance levels of 2 stock 580s?

Absolutely.

But I also paid $300 less for my card, and I still get the benefit of 1,024 CUDA cores, etc. I just have to be happy with an overclock on stock volts at about 700MHz, which, for $300 less and with less noise, heat, and power consumption, I am willing to live with.

Yes, it has failed to compete with the 6990 in the way that people expected it to - but realistically, people were expecting too much from a card that wasn't really supposed to match the power of 2 stand-alone 580s, period.


Sorry to quote this; I've been out of the thread for a few days, but this post is really worth mentioning. OC it on stock voltage and be happy: this is A LOT of performance. And I see the worthwhile trade you made from the 480s; noise is absolutely a no-go for me.


----------



## helifax

Quote:



Originally Posted by *soilentblue*


I wonder if 0.963 is only achievable with Beta 1? I have MSI Beta 2 and it's limited to 0.935 or something around there.


Unfortunately, no.
Nvidia "happened" to lock the core voltage to the minimum value in your card's BIOS. In my case (Asus card) that was 0.913V, which was somewhat undervolted...
So in order to achieve this, I "played" with the GPU's BIOS and increased the minimum voltage to 0.963V (the upper margin of the official spec, 0.913-0.963V).
I do not encourage anyone to do this on air cooling (especially if they don't know what they are doing).

Also, I don't encourage anyone to go above the 0.963V threshold at all. The card is working flawlessly at the mentioned freqs (723MHz core / 2x1850MHz memory), even with 3D Vision (3D Vision "loads" your card even more), although some games might "bring down" the drivers. Until we see a final release of the 270 drivers, this is nothing to worry about.

Also, going further than these frequencies will make your card throttle as the "power saver" kicks in.

Best regards


----------



## saulin

Quote:



Originally Posted by *helifax*


Unfortunately, no.
Nvidia "happened" to lock the core voltage to the minimum value in your card's BIOS. In my case (Asus card) that was 0.913V, which was somewhat undervolted...
So in order to achieve this, I "played" with the GPU's BIOS and increased the minimum voltage to 0.963V (the upper margin of the official spec, 0.913-0.963V).
I do not encourage anyone to do this on air cooling (especially if they don't know what they are doing).

Also, I don't encourage anyone to go above the 0.963V threshold at all. The card is working flawlessly at the mentioned freqs (723MHz core / 2x1850MHz memory), even with 3D Vision (3D Vision "loads" your card even more), although some games might "bring down" the drivers. Until we see a final release of the 270 drivers, this is nothing to worry about.

Also, going further than these frequencies will make your card throttle as the "power saver" kicks in.

Best regards


Nice, so at least in your case you were able to increase it a little by editing the BIOS.

I think the BIOS for that POV card that comes clocked at 690MHz might have a tiny voltage bump.


----------



## helifax

Quote:



Originally Posted by *saulin*


Nice, so at least in your case you were able to increase it a little by editing the BIOS.

I think the BIOS for that POV card that comes clocked at 690MHz might have a tiny voltage bump.


Only one way to find out: GPU-Z or MSI Afterburner can tell you. They both read the current value as reported in software.
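If you'd rather log it than eyeball a gauge, GPU-Z can also write its sensor readings to a CSV file, which is easy to post-process. Here's a minimal sketch - the column names below are illustrative samples only, so check the header of your own log, since they vary by GPU-Z version and card:

```python
import csv
import io

# Illustrative GPU-Z-style sensor log; real column headers vary by version.
sample_log = io.StringIO(
    "Date,GPU Core Clock [MHz],GPU Temperature [C],VDDC [V]\n"
    "2011-04-17 12:00:01,607.5,62,0.913\n"
    "2011-04-17 12:00:02,607.5,63,0.913\n"
    "2011-04-17 12:00:03,607.5,64,0.925\n"
)

rows = list(csv.DictReader(sample_log))
vddc = [float(row["VDDC [V]"]) for row in rows]
print(f"VDDC min {min(vddc):.3f} V, max {max(vddc):.3f} V")
```

Point it at GPU-Z's real log file instead of the in-memory sample, and the min/max tells you at a glance whether the driver is holding the core at its BIOS minimum.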


----------



## brackberry

Just got the EVGA GTX 590 classified edition









bought as soon as I got the availability notification


----------



## whiz882

Quote:



Originally Posted by *saulin*


That kind of sucks about not being able to up the voltage at all now.

How about flashing the cards to the Original Asus GTX 590 Bios and using the Asus Voltage Tweak Utility?

Or do the drivers still prevent the voltage from being touched at all?


Maybe changing the VBIOS would let you up the voltage, I think.

But I don't want to change my card's BIOS right now. I'll wait for an NVIDIA driver update.


----------



## RagingCain

I wouldn't touch the BIOS at all. If the card does fry (*not saying it will*), all warranties are voided when they check the BIOS out.

The last I heard, drivers 267.84/85 do not have locked-down voltages, yet still have the recommended newer OCP. The voltage lockdown was introduced with driver sets 267.91/270.51. Therefore, there's a non-warranty-voiding way to overclock: just revert to those drivers if you want to up the voltage.

I think it would be downright moronic to edit the BIOS at this point, *even on water*, especially with the other drivers as an option.


----------



## whiz882

Quote:



Originally Posted by *RagingCain*


I wouldn't touch the BIOS at all. If the card does fry (*not saying it will*), all warranties are voided when they check the BIOS out.

The last I heard, drivers 267.84/85 do not have locked-down voltages, yet still have the recommended newer OCP. The voltage lockdown was introduced with driver sets 267.91/270.51. Therefore, there's a non-warranty-voiding way to overclock: just revert to those drivers if you want to up the voltage.

I think it would be downright moronic to edit the BIOS at this point, *even on water*, especially with the other drivers as an option.



I understand drivers 267.84/85 do not have locked voltages.

But I think 267.91/270.51 is the best driver so far!

I hope an early driver update will let us boost voltage again.


----------



## Alatar

heh, gotta go and break that 9K mark


----------



## helifax

Quote:


> Originally Posted by *RagingCain;13055398*
> I wouldn't touch the BIOS at all. If the card does fry (*not saying it will*), all warranties are voided when they check the BIOS out.
> 
> The last I heard, drivers 267.84/85 do not have locked-down voltages, yet still have the recommended newer OCP. The voltage lockdown was introduced with driver sets 267.91/270.51. Therefore, there's a non-warranty-voiding way to overclock: just revert to those drivers if you want to up the voltage.
> 
> I think it would be downright moronic to edit the BIOS at this point, *even on water*, especially with the other drivers as an option.


Well...the latest drivers are ALWAYS the better ones to use since they offer additional layers of protection. I remember the original 267.52 blowing UP cards, since the "prevent card from overloading/overheating (TDP)" protection wasn't working "as intended". The latest 270.51 drivers have multiple fixes in many areas and improvements for 3D Vision (& Surround) users like myself. Trust me, I played with the 267.85 drivers a lot before I came to this conclusion. Secondly, "editing" the BIOS is not for everybody...the card has enough juice as it is. Thirdly, it is common knowledge (or should be) that the more heat an electronic device produces, the more power it draws in order to work as intended, which inevitably leads to more heating... So cooling is an ESSENTIAL part of any electronic device! The 267.85 drivers overheated the card like hell because the TDP limit wasn't kicking in when it should have.

So, *ALWAYS KEEP YOUR DRIVERS UPDATED* - it could save your card's life!!

This card even at stock with the air cooler is a bit "too hot" for my taste....85°C is a lot. On water, 55°C (when OCed and overvolted) gives you about a 30°C margin (imagine how you feel at 0 degrees versus how you feel at 30 degrees). Keeping the temperatures low also reduces the chance of the VRMs overheating and the voltage starting to "jump", which would eventually kill the VRM chips.
As for the warranty...I never saw a card fail IF it was cooled properly! I remember a GT7600 (@ stock) I had with faulty memory, because Leadtek chose not to cool the memory modules (I still have that card around here somewhere and it still works in 2D). That was the only card I ever saw "fry up". The moral is: "Not overclocking can fry your card, overclocking can fry your card....but above all.....HEAT fries your card." So, even if you run the card @ stock, be sure to cool it... This way no warranty will ever be needed. It is good practice to always read the product datasheet (specs) to check that your card is working within the corresponding parameters, in order to get the best from it.
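The heat/power point above can be put in rough numbers with the usual CMOS approximation that dynamic power scales with V² × f. This is only a back-of-the-envelope model (it ignores leakage, the memory, and VRM losses), using the voltages and clocks discussed in this thread:

```python
def relative_power(v_old, f_old, v_new, f_new):
    """Approximate dynamic power scaling: P is roughly proportional to V^2 * f."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Stock-ish 590 core (0.913 V @ 607 MHz) vs. the BIOS-edited
# settings discussed above (0.963 V @ 723 MHz).
ratio = relative_power(0.913, 607, 0.963, 723)
print(f"~{(ratio - 1) * 100:.0f}% more dynamic power per GPU")  # ~33% more
```

A ~5% voltage bump plus a ~19% clock bump lands around a third more heat per GPU, which is why the same overclock that is fine under water can be risky on the stock air cooler.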

I do believe anyone who spent the money to get one or two GTX 590s should go for an aftermarket cooling solution and a large case (water still needs to be cooled by air).

As for the BIOS edit, most people *shouldn't* touch it. I, myself, took a "leap of faith" in this direction. And if you happen to edit it, *ALWAYS* stay within the official specifications (for the voltage). Also, *never, ever* jump the voltage or frequencies massively (e.g. from 600MHz to 900MHz, or 0.9 V to 1.2 V), as this is an incredible SHOCK to the hardware and no kind of protection can save you...

Best regards


----------



## Masked

Quote:


> Originally Posted by *helifax;13059426*
> Well...the latest drivers are ALWAYS the better ones to use since they offer additional layers of protection. I remember the original 267.52 blowing UP cards, since the "prevent card from overloading/overheating (TDP)" protection wasn't working "as intended". The latest 270.51 drivers have multiple fixes in many areas and improvements for 3D Vision (& Surround) users like myself. Trust me, I played with the 267.85 drivers a lot before I came to this conclusion. Secondly, "editing" the BIOS is not for everybody...the card has enough juice as it is. Thirdly, it is common knowledge (or should be) that the more heat an electronic device produces, the more power it draws in order to work as intended, which inevitably leads to more heating... So cooling is an ESSENTIAL part of any electronic device! The 267.85 drivers overheated the card like hell because the TDP limit wasn't kicking in when it should have.
> 
> So, *ALWAYS KEEP YOUR DRIVERS UPDATED* - it could save your card's life!!
> 
> This card even at stock with the air cooler is a bit "too hot" for my taste....85°C is a lot. On water, 55°C (when OCed and overvolted) gives you about a 30°C margin (imagine how you feel at 0 degrees versus how you feel at 30 degrees). Keeping the temperatures low also reduces the chance of the VRMs overheating and the voltage starting to "jump", which would eventually kill the VRM chips.
> As for the warranty...I never saw a card fail IF it was cooled properly! I remember a GT7600 (@ stock) I had with faulty memory, because Leadtek chose not to cool the memory modules (I still have that card around here somewhere and it still works in 2D). That was the only card I ever saw "fry up". The moral is: "Not overclocking can fry your card, overclocking can fry your card....but above all.....HEAT fries your card." So, even if you run the card @ stock, be sure to cool it... This way no warranty will ever be needed. It is good practice to always read the product datasheet (specs) to check that your card is working within the corresponding parameters, in order to get the best from it.
> 
> I do believe anyone who spent the money to get one or two GTX 590s should go for an aftermarket cooling solution and a large case (water still needs to be cooled by air).
> 
> As for the BIOS edit, most people *shouldn't* touch it. I, myself, took a "leap of faith" in this direction. And if you happen to edit it, *ALWAYS* stay within the official specifications (for the voltage). Also, *never, ever* jump the voltage or frequencies massively (e.g. from 600MHz to 900MHz, or 0.9 V to 1.2 V), as this is an incredible SHOCK to the hardware and no kind of protection can save you...
> 
> Best regards


...I read the first paragraph and skipped ahead...TL;DR, you're wrong.

The latest drivers ARE NOT the best...for anything; that's why they have betas.

Hell, I remember the final drivers of the 480 release being TERRIBLE, so bad that MOST users had to go backwards, so this sentiment is absolutely FALSE.

The max voltage OC is 1.05 V; that was stated by NVIDIA...The issue at this point is that the OCP controls where the voltage goes and, 99.9% of the time, it's limiting the 2nd core...thus the low scores.

The issue is the drivers controlling the OCP and being the limiting factor OF the card...

NVIDIA scaled back the ability to modify voltage because, even after their warning, morons were overvolting and frying their cards.

Ugh, you people need to get informed and stop spreading misinformation like the black plague...for real.


----------



## RagingCain

Quote:


> Originally Posted by *helifax;13059426*
> Snippet


The new OCP was introduced in 267.84, hotfix 267.85.

There was a voltage lock down at 267.91 and subsequently the 270.51 branch as well.

Newer usually means more compatibility and better/newer profiles, true, but the new drivers don't necessarily offer any better OCP. In our case, they have completely removed our voltage control - arguably "more" OCP. That was all that changed, unless you have proof otherwise. If you want to OC, just use the previous drivers. You never *HAVE* to change drivers if you like what you have. Any physical hardware problems generated from this are still covered by warranties.

Editing the BIOS for an overvolt at this point is straight-up dumb, unless you have no vested interest in the card (reviewer/affluent/manufacturer). There is nothing you can do with your card once it's toast, so editing the BIOS on an already-sensitive card isn't wise.

If you want stability and more compatibility, then use the 267.91/270.51 driver set, and leave the overvolting alone.

There are people way smarter and more tech-savvy than me who are waiting to play around with this card, and it would behoove you to follow in their footsteps. I, and a few others, feel this OCP voltage lockdown is a temporary measure, in place for various reasons such as more testing or until the PR hoopla dies down.


----------



## helifax

Quote:


> Originally Posted by *Masked;13060329*
> The latest drivers ARE NOT the best...For anything, that's why they have beta's.


Did I say they are the BEST?? Where? I said they have improvements, and I stated why!
Quote:


> Max Voltage OC is 1.05, that was stated by nvidia...The issue at this point is the OCP controls where the voltage is going and 99.9% of the time, it's limiting the 2nd core...Thus the low scores.


The lower score is because of throttling, which happens because of the fail-safe mechanism.
Ever tried driving your car at MAX SPEED constantly? Well, give it a try - maybe you'll blow up in the process... That is why I STATED 0.963 V.

Quote:


> Nvidia scaled back the ability to modify voltage because even after their warning, morons were overvaulting and frying their cards.
> Ugh, you people need to be informed and stop spreading mis-information like the black plague...For real.


Yes, because of MORONS in the first place, WHO don't know anything about electronics (what a hertz is, for example, or how voltage affects frequency), the cards blew! I am not spreading anything... I just said what I did...I don't give a damn if you agree with me or not. IT IS UP TO YOU TO USE YOUR MIND and decide whether you want to do what I did or not.


----------



## PizzaMan

Morons design a 580 die with only 4 phases.................


----------



## Masked

Quote:


> Originally Posted by *helifax;13062351*
> Did I say they are the BEST?? Where? I said they have improvements, and I stated why!
> 
> The lower score is because of throttling, which happens because of the fail-safe mechanism.
> Ever tried driving your car at MAX SPEED constantly? Well, give it a try - maybe you'll blow up in the process... That is why I STATED 0.963 V.
> 
> Yes, because of MORONS in the first place, WHO don't know anything about electronics (what a hertz is, for example, or how voltage affects frequency), the cards blew! I am not spreading anything... I just said what I did...I don't give a damn if you agree with me or not. IT IS UP TO YOU TO USE YOUR MIND and decide whether you want to do what I did or not.


Wrong, no no no no no NO.

The lower score is NOT due to throttling.

My card was at 850MHz for 2 weeks...my other at 750MHz for 2 weeks...OCP kicking in LIMITED the 2nd core's potential.

In every single bench ANY of us ran, the 2nd core was being limited by the demand for voltage...that =/= throttling, PERIOD.

Performance wasn't limited via throttling; OCP limited the power draw of the 2nd core, and it was unable to perform - THUS the 90/30, 80/40 etc. performances that "most of us" recorded.

Go back in this thread and educate yourself before spouting off MORE misinformation.

Now, I have edited my own BIOS, because I talk to Nvidia daily; I'm aware of what's safe and what's not safe.

I'm also in the process of writing my own drivers for Guru3D that WILL allow safe overvolting of the card and non-OCP distribution of the voltage.

The safe limit is NOT .963 - that's ABSOLUTELY incorrect.

Nvidia themselves, in 2 releases, 1 via them and 1 via Asus, stated the wall as being 1.05v...period.

.963 is also NOT necessary on most of these cores to hit 750-800MHz.

As shown by [H] in their recent review (I suggest you read it before posting again), it was CLEARLY determined that the fault of this card lies in the LIMITING of voltage to the cores...NOT a control of performance, a control of POWER.

The OCP is clearly to blame, and it CAN, according to NVIDIA, be controlled via drivers...

The ONLY reason it's currently driver-controlled is because of the RMA numbers from "MORONS" continuing to disregard NVIDIA's release and overvolt to UNSAFE levels.

JUST got off the phone with them 2 minutes ago to discuss RAID controllers, brought up this issue and was given the above answers.


----------



## helifax

Quote:


> Originally Posted by *Masked;13062555*
> Wrong, no no no no no NO.
> 
> The lower score is NOT due to throttling.
> 
> My card was at 850MHz for 2 weeks...my other at 750MHz for 2 weeks...OCP kicking in LIMITED the 2nd core's potential.
> 
> In every single bench ANY of us ran, the 2nd core was being limited by the demand for voltage...that =/= throttling, PERIOD.
> 
> Performance wasn't limited via throttling; OCP limited the power draw of the 2nd core, and it was unable to perform - THUS the 90/30, 80/40 etc. performances that "most of us" recorded.
> 
> Go back in this thread and educate yourself before spouting off MORE misinformation.
> 
> Now, I have edited my own BIOS, because I talk to Nvidia daily; I'm aware of what's safe and what's not safe.
> 
> I'm also in the process of writing my own drivers for Guru3D that WILL allow safe overvolting of the card and non-OCP distribution of the voltage.
> 
> The safe limit is NOT .963 - that's ABSOLUTELY incorrect.
> 
> Nvidia themselves, in 2 releases, 1 via them and 1 via Asus, stated the wall as being 1.05v...period.
> 
> .963 is also NOT necessary on most of these cores to hit 750-800MHz.
> 
> As shown by [H] in their recent review (I suggest you read it before posting again), it was CLEARLY determined that the fault of this card lies in the LIMITING of voltage to the cores...NOT a control of performance, a control of POWER.
> 
> The OCP is clearly to blame, and it CAN, according to NVIDIA, be controlled via drivers...
> 
> The ONLY reason it's currently driver-controlled is because of the RMA numbers from "MORONS" continuing to disregard NVIDIA's release and overvolt to UNSAFE levels.
> 
> JUST got off the phone with them 2 minutes ago to discuss RAID controllers, brought up this issue and was given the above answers.


Yes man, I agree 100%. Perhaps I didn't make myself clear about the 0.963 V. I didn't say it was the max limit...it is actually in the middle of the accepted interval, 0.913-1.05 V. I just said it would be safer to use 0.963 than 1.05. Also, if you have 3D Vision and enable it via the NVIDIA control panel (but in-game still play plain 3D), this has an effect on games. The card apparently needs more power (for me, plain 3D worked flawlessly at 0.938 V, but when switching to stereo 3D the drivers restarted; increasing to 0.963 V solved the problem - I haven't tested 0.950).

As for the reviews..trust me, I've read all of them...been reading for the past 2 weeks...

Ps:
Quote:


> [H] in their recent review


I don't really know who [H] is...enlighten me...please


----------



## Masked

Quote:


> Originally Posted by *helifax;13063327*
> Yes man, I agree 100%. Perhaps I didn't make myself clear about the 0.963 V. I didn't say it was the max limit...it is actually in the middle of the accepted interval, 0.913-1.05 V. I just said it would be safer to use 0.963 than 1.05. Also, if you have 3D Vision and enable it via the NVIDIA control panel (but in-game still play plain 3D), this has an effect on games. The card apparently needs more power (for me, plain 3D worked flawlessly at 0.938 V, but when switching to stereo 3D the drivers restarted; increasing to 0.963 V solved the problem - I haven't tested 0.950).
> 
> As for the reviews..trust me, I've read all of them...been reading for the past 2 weeks...
> 
> Ps:
> I don't really know who [H] is...enlighten me...please


[H] = Hard Forums...

Nvidia has stated a couple of times that the hard cap is 1.05v...that's where overvolting tops out, and anything higher fries the card.

Right now, depending on the brand, you're locked at ~.913.

This very second, core 1 is at .938, core 2 at .975.

When you hit 1.05v, core 2 hits 1.2v ~ that's why THIS = MAX.

If you were to SAFELY control the voltage around .975-1.00, you're STILL not looking at the card frying.

.963 would be a good median; however, the throttling issue isn't the problem, because the cards aren't even getting to the point where performance is/can be throttled...The OCP is completely limiting performance atm; THAT'S the issue.
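Masked's description of OCP gating power rather than clocks can be pictured as a simple power governor: estimate board draw from voltage and clock, and pull the voltage back until the estimate fits under a fixed cap. This toy sketch illustrates that idea only - the P ∝ V² × f model and the base numbers (0.913 V, 607 MHz, 365 W TDP) are assumptions, not NVIDIA's actual OCP algorithm:

```python
def ocp_clamp(v_request, f_mhz, limit_w, base=(0.913, 607.0, 365.0)):
    """Toy OCP: scale the requested voltage back until the estimated
    board power (P ~ V^2 * f, anchored at a known base point) fits
    under the cap. Illustrative only."""
    v_base, f_base, p_base = base
    estimate = p_base * (v_request / v_base) ** 2 * (f_mhz / f_base)
    if estimate <= limit_w:
        return v_request
    # Invert P ~ V^2 * f to find the largest voltage that fits the cap.
    return v_base * (limit_w / (p_base * (f_mhz / f_base))) ** 0.5

print(ocp_clamp(0.913, 607, 365))          # stock request fits: 0.913
print(round(ocp_clamp(1.2, 772, 365), 3))  # overvolted request gets pulled below stock
```

Under a model like this, pushing clocks and voltage doesn't trip a thermal throttle; the governor simply starves the core of voltage, which matches the "second core getting limited" behaviour people report.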


----------



## Twilex

Here's Crysis so far with a dedicated 470 for PhysX. For some god-awful reason I'm getting horrible crashes with the 267.91 drivers when trying to run 3DMark11 and Vantage. Tried going with the 270.51s and they won't even finish installing. Guess I'm gonna wait till new drivers come out before I can reliably run more benchmarks. I got 3DMark11 to go all the way through a few times, and P9043 was my highest before I got fed up with the crashes. Can't seem to go past 700MHz without receiving negative gains, although it's completely stable up to 750MHz (and maybe more) on stock voltage. Anyways, candy.


----------



## helifax

Quote:


> Originally Posted by *Masked;13064186*
> [H] = Hard Forums...
> 
> Nvidia has stated a couple of times that the hard cap is 1.05v...that's where overvolting tops out, and anything higher fries the card.
> 
> Right now, depending on the brand, you're locked at ~.913.
> 
> This very second, core 1 is at .938, core 2 at .975.
> 
> When you hit 1.05v, core 2 hits 1.2v ~ that's why THIS = MAX.
> 
> If you were to SAFELY control the voltage around .975-1.00, you're STILL not looking at the card frying.
> 
> .963 would be a good median; however, the throttling issue isn't the problem, because the cards aren't even getting to the point where performance is/can be throttled...The OCP is completely limiting performance atm; THAT'S the issue.


Thanks for clarifying this...so the OCP is the killer....well, waiting for a fix..if one ever comes...


----------



## reflex99

So instead of getting trolled to hell over in my thread, I'll post here:

Do you guys think the EVGA 590 will still be available in 3-ish weeks?

(I already know Masked's opinion, so no need to post that again)


----------



## Masked

Quote:



Originally Posted by *reflex99*


So instead of getting trolled to hell over in my thread, I'll post here:

Do you guys think the EVGA 590 will still be available in 3-ish weeks?

(I already know Masked's opinion, so no need to post that again)


Lol, I think you'll be able to find a used one in a few weeks, yes.

New? Ehhh maybe.

They're staggering the release, so it's quite possible you might, but the chances are kind of slim.









I heard a rumor that Amazon is staggering theirs + I know Newegg is.

I'd put a notify on each and chill until it shoots off...


----------



## jcde7ago

Quote:



Originally Posted by *reflex99*


So instead of getting trolled to hell over in my thread, I'll post here:

Do you guys think the EVGA 590 will still be available in 3-ish weeks?

(I already know Masked's opinion, so no need to post that again)


I'd say you have until probably the end of the month, if EVGA/Newegg continue to stagger their availability like they have been.

Past that, probably not, since this has already been confirmed to be a limited run from the beginning.


----------



## reflex99

I think I am going to sell all my stuff then and buy the 590 on Tuesday when I get back from vacation.


----------



## Masked

Quote:


> Originally Posted by *reflex99;13066097*
> I think I am going to sell all my stuff then and buy the 590 on Tuesday when I get back from vacation.


I believe some are up on Amazon live right now...I don't know their situation - they normally call you - but my guess would be there may still be a couple next week.









Btw, I didn't mean to crap in your thread like that...I guess he's a known troll and I gave in; apologies.


----------



## reflex99

You fell into the trap.

We filled each other's inboxes because he kept saying that the P8P67 Pro had like a 3+1 phase VRM or something stupid like that.

A certain online retailer has the EVGA 590 in stock.


----------



## Masked

Quote:


> Originally Posted by *reflex99;13066163*
> A certain online retailer has evga 590 in stock.


Oh I know who







~ Don't want to ruin your chances.

And aye, some of the PMs I got about him...ugh, I definitely should NOT have indulged that one...









Anyway, it's definitely the way I'd go for a single-card solution, but make sure you acknowledge the fact that it will never quite live up to 2x 580s etc...just better to get that over with beforehand, imo.


----------



## Arizonian

My understanding is that each company has set their own limits. I know EVGA said about 1000+ but never gave a solid number. I didn't think Nvidia would limit manufacturers. I wonder why, if it's still popular, it would be limited though. A lot of people are doubling up on them, go figure.


----------



## Twilex

Dropped the clocks a bit and she seems more stable


































Pretty sure that Vantage score is lame =/


----------



## Masked

Quote:


> Originally Posted by *Arizonian;13066453*
> My understanding is that each company has set their own limits. I know EVGA said about 1000+ but never gave a solid number. I didn't think Nvidia would limit manufacturers. I wonder why, if it's still popular, it would be limited though. A lot of people are doubling up on them, go figure.


The issue, and why they're locking clocks...hearsay, however: people were still overvolting to 1.2 V...Apparently the number of RMAs is pretty intense, all things considered, so...this was basically done as a "STOP" precaution.

Hopefully they'll back it off and set a max limit...

If you actually monitor the voltage of BOTH your cores, you'll notice the 2nd core is a bit higher ~ it's also the first core to get limited if you push the card hard...That's a tell-tale sign IMO.

Vantage is one of the benches NVIDIA/EVGA etc. acknowledged there were issues with...


----------



## reflex99

Now I need to convince my mom to let me use her PayPal to buy this thing (my cash, obviously).

Ugh, I need to get my own PayPal....


----------



## Masked

Quote:


> Originally Posted by *reflex99;13067574*
> Now I need to convince my mom to let me use her PayPal to buy this thing (my cash, obviously).
> 
> Ugh, I need to get my own PayPal....


You know...I'm still kind of 50/50 about PayPal...

As a business office, it's my best friend, it really is...it makes my life insanely easy.

As an individual, I'm 50/50...been screwed a few times...come out on top a few times...I think it's mostly the OTHER person that makes the biggest difference, IMO.


----------



## jcde7ago

We're going to need to give Nvidia some time to get a decent set of drivers out for the 590 - if they can follow the path they did with, say, the GTX 470/480, we'll see some noticeable improvements once they get some issues ironed out (for example, a lot of SLI owners are STILL complaining about Crysis 2 issues, while some are not having any problems).

FWIW, the 270.51 drivers, while decent, aren't really anything exceptional. I'll wait until the whole blown-out-of-proportion VRM issue goes by the wayside and Nvidia gives us drivers that let us overvolt safely to around 1.02 V and get at least an average OC of 750MHz without having to worry about OCP issues.

For now though, I am enjoying an absolutely butter-smooth experience with every single game I play, with barely-audible fan noise and very little heat in my case (this is where the Antec 1200 comes in handy - it displaces any trace of heat inside itself)!

I was able to run 700/1400/1770 fairly stable, but I've dropped it down to 670/1340/1750 as there was really no need for the extra increase. I'll wait for a different BIOS/set of drivers before I overvolt and go for at least 720-730MHz.


----------



## reflex99

I hate PayPal.

I only use it to buy stuff.

And I try to only use it to sell low-priced stuff.

Ideally, if I must sell something high-value through PP, I empty the account ASAP.

But it's fine for online retailers, because they aren't going to scam you.

I also use it to deposit money from bitcoins.


----------



## Masked

Quote:


> Originally Posted by *jcde7ago;13067663*
> We're going to need to give Nvidia some time to get a decent set of drivers out for the 590 - if they can follow the path they did with, say, the GTX 470/480, we'll see some noticeable improvements once they get some issues ironed out (for example, a lot of SLI owners are STILL complaining about Crysis 2 issues, while some are not having any problems).
> 
> FWIW, the 270.51 drivers, while decent, aren't really anything exceptional. I'll wait until the whole blown-out-of-proportion VRM issue goes by the wayside and Nvidia gives us drivers that let us overvolt safely to around 1.02 V and get at least an average OC of 750MHz without having to worry about OCP issues.
> 
> For now though, I am enjoying an absolutely butter-smooth experience with every single game I play, with barely-audible fan noise and very little heat in my case (this is where the Antec 1200 comes in handy - it displaces any trace of heat inside itself)!
> 
> I was able to run 700/1400/1770 fairly stable, but I've dropped it down to 670/1340/1750 as there was really no need for the extra increase. I'll wait for a different BIOS/set of drivers before I overvolt and go for at least 720-730MHz.


Believe it or not...The cores are actually really good cores.

They overclock VERY well.

For example, as I was clocking down my card @home, I was able to do [email protected] ~ obviously core 2 was higher, but these cards, ironically, overclock very well.

If people used their brains and we were allowed to go even to .995 or .975...the vast majority of us would see huge gains.

The biggest thing that affected us was the morons who chose to OC it regardless of Nvidia's warnings...Essentially that was the reason for the *****-slap lock, and I can't quite say I completely blame them.

It may be a while, though, before we see some real improvements.


----------



## kazukun

The water block arrived








I drilled the hole








To let the LED shine through








Done









http://blue.ap.teacup.com/kazukun/100.html


----------



## reflex99

Even if they didn't have the voltage lock, these cards wouldn't OC well.

They took the best 580 dies and wasted them on a card with a VRM setup that can't handle the load.

If these cards could take 1.2v+, they would be monsters.


----------



## jcde7ago

Quote:


> Originally Posted by *Masked;13067702*
> Believe it or not...The cores are actually really good cores.
> 
> They overclock VERY well.
> 
> For example, as I was clocking down my card @home, I was able to do [email protected] ~ obviously core 2 was higher, but these cards, ironically, overclock very well.
> 
> If people used their brains and we were allowed to go even to .995 or .975...the vast majority of us would see huge gains.
> 
> The biggest thing that affected us was the morons who chose to OC it regardless of Nvidia's warnings...Essentially that was the reason for the *****-slap lock, and I can't quite say I completely blame them.
> 
> It may be a while, though, before we see some real improvements.


Well yeah, that's exactly what I am talking about - there is a huge, ridiculous difference between overvolting from the STOCK 0.913 V to somewhere between 0.963 V and 1.02-1.05 V, and, say, overvolting upwards of 1.1 V WITHOUT OCP, which was the reason for the whole VRM-blowup issue.

It doesn't change the fact that Nvidia cheaped out on the VRMs - no one can deny that. They could have gone a much better route, though the cost of the card would undoubtedly have gone up.

That doesn't mean the card isn't already powerful enough as it is, considering it costs $300 less than 580s in SLI, and it does have decent overclocking potential if overvolted to around 1.00 V, which would be perfectly safe and allow us to reach at least 725MHz and push close to stock 580 SLI performance.

It'll take some time no doubt, but I hope those drivers/BIOSes eventually come.


----------



## Masked

Quote:


> Originally Posted by *reflex99;13067755*
> even if they didn't have the voltage lock, these cards wouldn't OC well.
> 
> They took the best 580 dies, and wasted them on a card with a VRM setup that cannot handle the load.
> 
> If these cards could take 1.2v+ then they would be monsters.


The issue really isn't the voltage lock, it's the OCP.

Even if you unlocked the voltage, the OCP is still standing there shocking your balls with a cattleprod.

However, these dies are actually EXTREMELY efficient!
Quote:


> Originally Posted by *jcde7ago;13067767*
> Well yeah, that's exactly what I am talking about - there is a huge, ridiculous difference between overvolting from the STOCK 0.913V to 0.963V or even 1.02-1.05V and, say, overvolting upwards of 1.1V WITHOUT OCP, which was the reason for the whole VRM-blowup issue.
> 
> It doesn't solve the fact that Nvidia cheapened out on the VRMs, no one can deny that - they could have gone a much better route, though the cost of the card would undoubtedly have gone up.
> 
> That doesn't mean that the card isn't already powerful enough as it is, considering that it costs $300 less than 580s in SLI, and the fact that it does have decent overclocking potential if overvolted to around 1.00V, which would be perfectly safe and allow us to reach at least 725MHz and push it close to stock 580 SLI performance.
> 
> It'll take some time no doubt, but I hope those drivers/BIOS eventually come.


Definitely...Completely concur.

I think we'll see custom drivers // Manufacturer bios FAR before we see Nvidia accept the "issues" at hand...

They're too high on their horse for this one...That press release alone man









Personally, I think this release will go underground like the 9800s had to...


----------



## reflex99

do we think that my PSU will be able to handle this card?

(see sig rig)


----------



## jcde7ago

Quote:


> Originally Posted by *Masked;13067816*
> The issue really isn't the voltage lock, it's the OCP.
> 
> Even if you unlocked the voltage, the OCP is still standing there shocking your balls with a cattleprod.
> 
> However, these dies are actually EXTREMELY efficient!
> 
> Definitely...Completely concur.
> 
> I think we'll see custom drivers // Manufacturer bios FAR before we see Nvidia accept the "issues" at hand...
> 
> They're too high on their horse for this one...That press release alone man
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Personally, I think this release will go underground like the 9800s had to...


Agreed.

This card would have been received much, much better if Nvidia hadn't marketed it as the "world's fastest GPU" and had instead positioned it as a slightly more affordable alternative to 580s in SLI: a single card with *respectable* (not amazing, not extreme) overclocking potential that only comes close to or about even with stock 580s in SLI, but again, costs $300 less, gives off less heat and noise, and draws less power.

But since they marketed this as their new flagship "uber enthusiast" card, and it didn't deliver because of the VRMs (and it was being pushed too far past design specs without OCP) - you have critical failure and the card's reputation goes down the drain at launch.

All of that could have easily been avoided had this been marketed a little better, or if they had gone all out and bumped the price and _*actually went for*_ the "fastest GPU" crown by using better/more VRMs.

It's a good thing that we can see the whole picture though - a lot of misinformation has been spread due to the whole VRM fiasco, but honestly, these are wonderfully-performing cards at their price point; they are exactly where they should be at given their price and stock performance compared to stock 580s in SLI.


----------



## reflex99

my view is that a card shouldn't need OCP to not burn.

The 6990 is doin' it right. They only use OCP to keep it within PCIe spec, and there is a switch to disable it entirely. Add that to the fact that it can hold up to high voltages, and it is clearly a better card.

But the 590 is more fun, and that is why i plan to get one.


----------



## Levesque

Quote:


> Originally Posted by *reflex99;13067977*
> The 6990 is doin' it right. They only use OCP to keep it within PCIe spec, and there is a switch to disable it entirely. Add that to the fact that it can hold up to high voltages, and it is clearly a better card.


You got it right. AMD made a real card for the ''uber enthusiast''. But then:
Quote:


> Originally Posted by *reflex99;13067977*
> But the 590 is more fun, and that is why i plan to get one.


----------



## reflex99

i am crazy, remember?


----------



## Levesque

Quote:


> Originally Posted by *reflex99;13067977*
> The 6990 is doin' it right. They only use OCP to keep it within PCIe spec, and there is a switch to disable it entirely. Add that to the fact that it can hold up to high voltages, and it is clearly a better card.
> 
> But the 590 is more fun, and that is why i plan to get one.


Sorry guys, I will leave your tea-party.

But, at least, you have to agree!







That quote was EPIC! Put aside all your anger, forget the 6990 and the 590, green and red, and smile. It was really an epic ''LOL WUT'' moment just there!

Sorry. I'm leaving. But don't tell me you didn't laugh a little bit deep down inside. Be honest. Those true LOL WUT moments are a rarity!









You are all so serious and full of anger. ''Garbage'', ''troll''. Relax guys. THAT was funny!

Masked. So now I can't ''browse'' where I want to? Are you the new ''browser'' police?







Ok ok I'm leaving, and I will continue to browse in here! Sorry. I have too much fun reading things like that!


----------



## reflex99

if i was going for performance, i would get the 6990.

But i am in it for the lolz. (plus i wanna see some VRM's blow with my own eyes)

I didn't really see you as trolling btw.......


----------



## Alatar

To the guys with a single GTX 590

What do you get with 3dmark Vantage? (Perf. preset)


----------



## Twilex

Quote:



Originally Posted by *Alatar*


To the guys with a single GTX 590

What do you get with 3dmark Vantage? (Perf. preset)


32.7k


----------



## Alatar

Quote:



Originally Posted by *Twilex*


32.7k


OK then, seems like my score was on par; I was pushing near 33k too.


----------



## Twilex

Quote:


> Originally Posted by *kazukun;13067752*
> water block came
> 
> 
> 
> 
> 
> 
> 
> 
> I dig the hole
> 
> 
> 
> 
> 
> 
> 
> 
> To be able to make LED shine
> 
> 
> 
> 
> 
> 
> 
> 
> Completion
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://blue.ap.teacup.com/kazukun/100.html


I guess I'll be the first to make a useful post in about 5 pages and tell you how nice those blocks are! I've got the black acetal one on my 590 and it's gorgeous. Here's what you're expecting for temps. This is in a loop with a BI 360 extreme rad, a GTX 470 (PhysX), a full mobo block and a memory block.


----------



## Twilex

Quote:


> Originally Posted by *Alatar;13070296*
> OK than seems like my score was on par, was pushing near 33k too.


Well that helps me out; I figured it was low =/ Guess not though.


----------



## Twilex

Cool, I like how Photobucket is resizing my images to half scale... In any case that says 37C on core 1 and 41C on core 2 at full load. Very cool with the block on.


----------



## helifax

Hi,

Well last night I found that the OCP is a "moron". I'll go through the details so you can see what I mean:

- Boot up Windows
- Set 723MHz Core / 1446MHz Shaders / 2x1850MHz Memory @ 0.963V (at 0.950V the drivers crash when 3D Vision is enabled)
- Enter Metro 2033 (3D Vision on) and played about 1:20 hours.
- Temps are low, maximum was 45 degrees Celsius.
- Drivers crash and restart (apparently OCP kicks in).
- Set the voltages/frequencies again to the above-mentioned values.
- Enter Metro 2033, drivers crash in like 10-20 sec. (no matter how many times I try, I get the same result).
- Reboot Windows, set the frequencies again.
- Enter Metro 2033... I can play again for like 1-2h.
PS: I got the overclock values from the [H]ardOCP article.

What I find strange is that the OCP will kick in only after a period of time... If anyone can tell me why that is, I would be grateful.

PS: I found out that 705MHz/1410MHz/2x1900MHz @ 0.963V gives the same performance and greater stability over time.
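Since OCP intervention shows up as a sudden core-clock drop under load, one way to pin down when it kicks in is to log the core clock once a second (GPU-Z and Afterburner can both dump logs) and scan the samples for dips. A minimal sketch of such a scan; `find_throttle_events` is a hypothetical helper name and the 5% tolerance is an assumption, not anything from NVIDIA's tools:

```python
def find_throttle_events(samples_mhz, target_mhz, tolerance=0.05):
    """Return the indices of samples where the core clock fell more than
    `tolerance` (default 5%) below the target clock while under load."""
    floor = target_mhz * (1.0 - tolerance)
    return [i for i, clk in enumerate(samples_mhz) if clk < floor]

# Example: a 723MHz overclock that throttles on the last two samples.
log = [723, 723, 722, 607, 405]
print(find_throttle_events(log, 723))  # -> [3, 4]
```

Fed a whole session log, this should show a clean run for the first hour or so and then a cluster of flagged samples right where the driver crashes.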


----------



## kazukun

Start

Room temperature 17.8 degrees
Water temperature 20.2 degrees
PC running for two and a half hours


----------



## RagingCain

Quote:



Originally Posted by *reflex99*


So instead of getting trolled to hell over in my thread, i'll post here:

Do you guys think that eVGA 590 will still be available in 3ish weeks?

(I already know Masked's opinion, so no need to post that again)


I guess my opinion didn't count?

Sent from my DROID2


----------



## Alatar

what are the load temps kazukun?


----------



## reflex99

Quote:



Originally Posted by *RagingCain*


I guess my opinion didn't count?

Sent from my DROID2


more like:

your opinion was lost amid the trolling/arguing/name-calling.


----------



## ArcticZero

Hey guys. I have a friend who's thinking of picking up a GTX590.

What would be safe max voltages for him to try? He has a tendency to OC everything and skip stock speeds for anything (if he could OC his keyboard, he would!)

Also, I'm thinking of a brand to best recommend to him. I know everything's reference as of now, however I was thinking more of the warranty and bundle. I haven't been in the Nvidia scene as of late, but I am interested for his sake. I would assume ASUS and EVGA are safe bets?


----------



## Masked

Quote:



Originally Posted by *ArcticZero*


Hey guys. I have a friend who's thinking of picking up a GTX590.

What would be safe max voltages for him to try? He has a tendency to OC everything and skip stock speeds for anything (if he could OC his keyboard, he would!)

Also, I'm thinking of a brand to best recommend to him. I know everything's reference as of now, however I was thinking more of the warranty and bundle. I haven't been in the Nvidia scene as of late, but I am interested for his sake. I would assume ASUS and EVGA are safe bets?










#1, right now you can't alter your voltages so, there's nothing to OC.

#2, the card is EXTREMELY limited and on a controlled release so, I'd get the first one you could/can get, otherwise, you're not getting one.

There are a "few" vendors with some left but, very very few.


----------



## saulin

Quote:



Originally Posted by *ArcticZero*


Hey guys. I have a friend who's thinking of picking up a GTX590.

What would be safe max voltages for him to try? He has a tendency to OC everything and skip stock speeds for anything (if he could OC his keyboard, he would!)

Also, I'm thinking of a brand to best recommend to him. I know everything's reference as of now, however I was thinking more of the warranty and bundle. I haven't been in the Nvidia scene as of late, but I am interested for his sake. I would assume ASUS and EVGA are safe bets?










You can't change voltage right now unless you modify the voltage values in the BIOS.

Nvidia said up to 1.05v should be safe, however I don't think you need to go that high for the 20% or so overclock you might get before OCP kicks in. Also EVGA cards seem to have the best warranty.


----------



## capchaos

You can alter the voltages, but only through the BIOS. The NVIDIA driver sets your voltage to the minimum voltage for the 3D clock. I flashed my BIOS with a minimum 3D voltage of 0.963V and all is good here.


----------



## remer

I've been tweaking my overclock. I rolled back to the 267.85 drivers so I could do some overvolting. In my case anything above 0.930v is causing some throttling, even with a custom fan profile. The highest 3DMark11 score I was able to get was P9551 with 690/1380/3600 and 0.925v.

At these clocks I appear to be right on the edge of OCP. Anything higher caused a drastic decline in performance. I haven't played around with the memory too much. The stability for me goes away somewhere between 1800 and 1850 MHz.

I think I'll stick with these clocks for a while. Hopefully the upcoming drivers will allow a little overvolting.




















1253 Heaven Benchmark, 1920x1080 8XAA


----------



## ArcticZero

Thank you guys for the quick responses. I was worried I would get kicked out for owning a 6990, but I guess that was a silly assumption.









Right, that seems to be the case here as well. There are hardly (if any) 590's available on the local market, and they are way overpriced at the moment. It's a shame you can't OC at the moment, but I'm sure that will be resolved eventually. But I'll be certain to let him know that before he picks one up.

Thanks again!


----------



## Masked

I have 6 PC's here with modified BIOS's and Solo 590s...The issue is, that even if you roll back the drivers and modify the bios, OCP is STILL kicking in and limiting the 2nd core...

So, I wouldn't exactly depend on modified bios...


----------



## 2010rig

you guys might get a kick out of this, a broken hearted 6990 owner switched to the 590 because he was having too many issues with his 6990.





http://www.youtube.com/watch?v=Q0FmH4RG3GM


----------



## saulin

Quote:


> Originally Posted by *Masked;13072266*
> I have 6 PC's here with modified BIOS's and Solo 590s...The issue is, that even if you roll back the drivers and modify the bios, OCP is STILL kicking in and limiting the 2nd core...
> 
> So, I wouldn't exactly depend on modified bios...


At what clocks is it kicking in? and at what voltages too?

What seems to be the average overclock?

Also what are you using to test the overclock? Something like Furmark might be too extreme since it will draw way too much power


----------



## Masked

Quote:


> Originally Posted by *saulin;13072340*
> At what clocks is it kicking in? and at what voltages too?
> 
> What seems to be the average overclock?
> 
> Also what are you using to test the overclock? Something like Furmark might be too extreme since it will draw way too much power


Everything is drawing too much power.

I can leave all 6 at 1.00v...The issue comes when core 2 draws more than .95 which is GOING to happen because even @stock the 2nd core is drawing too much power.

The MAJOR issue is, even at stock the OCP is regulating the 2nd core.

That's why there are issues with benching...Even when benching stock, your 2nd core is being limited by 20%ish...


----------



## capchaos

I can do 720 @ .963 no problem in games with a good performance increase, but in benchmarks it seems the OCP is way more strict and will usually crash the driver. I think they need to iron out their drivers a lot better. I'm sure in the coming months it will be great, I hope


----------



## Masked

Quote:


> Originally Posted by *capchaos;13072393*
> I can do 720 @ .963 no problem in games with a good performance increase, but in benchmarks it seems the OCP is way more strict and will usually crash the driver. I think they need to iron out their drivers a lot better. I'm sure in the coming months it will be great, I hope


You can up this as high as you want...The issue is OCP is limiting you from day 1.

Your 2nd core draws more power than the first core and once it hits X it starts to scale down the voltage TO that core.

You're not 100% on both cores at any time...

For example, I'm 90/40 right now...that 40% is due to the voltage being over X...

I can try all of the drivers, same thing happens because OCP has a max regardless of what you OC to...Once you hit that wall, that's it.


----------



## Arizonian

Quote:


> Originally Posted by *ArcticZero;13072244*
> Thank you guys for the quick responses. I was worried I would get kicked out for owning a 6990, but I guess that was a silly assumption.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Right, that seems to be the case here as well. There are hardly (if any) 590's available on the local market, and they are way overpriced at the moment. It's a shame you can't OC at the moment, but I'm sure that will be resolved eventually. But I'll be certain to let him know that before he picks one up.
> 
> Thanks again!


The GTX 590's are the same price as the AMD 6990's, fyi. Glad you found the answer you were looking for however.


----------



## helifax

I found this review: Hardware Heaven GTX 590 Review and I can confirm what they are saying is true. Works very well both in plain 3D and stereo 3D (with 3D Vision).

Now something I found out while using 3D Vision and OC-ing:

@ 710MHz Core / 1820-1850MHz Memory @ 0.963V the drivers crashed.
@ 710MHz Core / 1900MHz Memory @ 0.963V the drivers WORK flawlessly.

So it seems the memory frequency needs to be increased above 1850. Hope this helps those who want to overclock beyond 710MHz.

Regards

PS: on 3DMark2011 on the Performance profile my card scores 10260. (Too bad my AMD Phenom can't handle this card too well... so overall I get around P8000. CPU upgrade pending). This is about 1000 points higher than the 9260 I get @ stock speeds.


----------



## ezveedub

Quote:


> Originally Posted by *Arizonian;13072442*
> The GTX 590's are the same price as the AMD 6990's, fyi. Glad you found the answer you were looking for however.


I don't think he's in the US where the pricing is the same


----------



## RagingCain

So, I am still using the Acer for the center monitor, just can't bring myself to open the third one. May even return it; I like the idea of having at least one monitor that's 120Hz/3D.


















This is what it looks like so far.

Now I just need my GTX 590s back from RMA to power these puppies. These AOCs produce a beautiful display: great color, great brightness, and much less heat than I am used to. Also about 1/3 the thickness of my Acer.

The price makes them unbeatable. Considering my center monitor cost $369.99 when I got it, I am feeling embarrassed at the price, despite it being 120Hz.


----------



## rush2049

Nice!


----------



## reflex99

Good news 590 bros. My mom ok'd the acquisition of a 590. Now I just need to get it past my dad.


----------



## RagingCain

Quote:


> Originally Posted by *reflex99;13078338*
> Good news 590 bros. My mom ok'd the acquisition of a 590. Now I just need to get it past my dad.


I am glad the trolling didn't get to you, but beware, the 3GB GTX 580 is about to land on US shores (week or two), making it very future-proof.

Maaaaaake sure its what YOU want, and not what everybody else wants for you









If you don't expect to overclock it 300 MHz like some people think, I believe you will be pleasantly satisfied with your performance.

Upgrade YOUR CPU







next!

Quote:


> Originally Posted by *rush2049;13078293*
> Nice!


Thanks







, you bring your GTX 590 over, and I will supply beerz. The center monitor is off (taller) by a few millimeters, can you guys tell/what do you think looks good??


----------



## jcde7ago

Quote:


> Originally Posted by *RagingCain;13078383*
> I am glad the trolling didn't get to you, but beware, the 3GB GTX 580 is about to land on US shores (week or two), making it very future-proof.
> 
> Maaaaaake sure its what YOU want, and not what everybody else wants for you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you don't expect to overclock it 300 MHz like some people think, I believe you will be pleasantly satisfied with your performance.
> 
> Upgrade YOUR CPU
> 
> 
> 
> 
> 
> 
> 
> next!
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> , you bring your GTX 590 over, and I will supply beerz.


I agree with this, especially if you plan on running both Surround AND 3D.

2x 3GB 580s would just destroy a single 590 with your triple monitor setup, due to the overclockability and all of that extra VRAM (of course, it's probably going to cost an extra $450-500 total, I am guessing).

Granted, the 590 will do perfectly fine...but again, a 3GB 580 will just be an absolute monster for multi-monitor setups.

Personally, i am going to grab a 2nd monitor and call it a day...I don't have nearly that amount of desk space to be using 3 monitors anyways.


----------



## ablearcher

yeah, I'm waiting for the 3GB GTX580s, first. If there is a 6GB GTX590, I'm gonna be there, too







AMD doesn't seem to like large memory buses (allowing larger VRAM capacity), so I'm not sticking with AMD for much longer.


----------



## RagingCain

Quote:


> Originally Posted by *ablearcher;13078470*
> yeah, I'm waiting for the 3GB GTX580s, first. If there is a 6GB GTX590, I'm gonna be there, too
> 
> 
> 
> 
> 
> 
> 
> AMD doesn't seem to like large memory buses (allowing larger VRAM capacity), so I'm not sticking with AMD for much longer.


I do know (pretentious-sounding, I know, but I already asked Jacob) that EVGA will not be making a 6GB GTX 590, making me believe that we probably won't be seeing it. As for the other manufacturers, it's possible but unlikely. Dual-GPUs are a very niche market, and releasing a more expensive memory-expanded one is even more of a niche, so it might not even be profitable. There will also not be any HydroCopper 3GB, which surprised me a bit, but according to him, the old waterblocks fit.

I was half tempted to refund the GTX 590s for 3 of them to be honest... but with all the water block switching, time, energy, and my laziness in general, I will remain faithful to the GTX 590 purchase due to its uniqueness. I have never owned a dual-GPU before; always avoided them like the plague.

Now with the introduction of properly threaded DX11 for both CPU and GPU, I think having the extra GPU will matter more than having the fastest GPU.

Similar to a 2.66 GHz Quad vs. a 3.2 GHz Core2Duo.


----------



## ablearcher

Quote:


> Originally Posted by *RagingCain;13078539*
> I do know (pretentious-sounding, I know, but I already asked Jacob) that EVGA will not be making a 6GB GTX 590, making me believe that we probably won't be seeing it. As for the other manufacturers, it's possible but unlikely. Dual-GPUs are a very niche market, and releasing a more expensive memory-expanded one is even more of a niche, so it might not even be profitable. There will also not be any HydroCopper 3GB, which surprised me a bit, but according to him, the old waterblocks fit.
> 
> I was half tempted to refund the GTX 590s for 3 of them to be honest... but with all the water block switching, time, energy, and my laziness in general, I will remain faithful to the GTX 590 purchase due to its uniqueness. I have never owned a dual-GPU before; always avoided them like the plague.
> 
> Now with the introduction of properly threaded DX11 for both CPU and GPU, I think having the extra GPU will matter more than having the fastest GPU.
> 
> Similar to a 2.66 GHz Quad vs. a 3.2 GHz Core2Duo.


argh.... I guess I will be stuck with 3GB GTX580s as my only immediate upgrade option, then







Thank you for the info, though! At least JacobF of EVGA promised availability of the 3GB variants within this month, lol... just in time for my tax refund







I guess it's time for me to go to the GTX 580 club, then. Thanks!


----------



## RagingCain

Quote:


> Originally Posted by *ablearcher;13078580*
> argh.... I guess I will be stuck with 3GB GTX580s as my only immediate upgrade option, then
> 
> 
> 
> 
> 
> 
> 
> Thank you for the info, though! At least JacobF of EVGA promised availability of the 3GB variants within this month, lol... just in time for my tax refund
> 
> 
> 
> 
> 
> 
> 
> I guess it's time for me to go to the GTX 580 club, then. Thanks!


Hey, I am all for good information:

Here is the actual thread where we had a little Q&A with Jacob:
http://www.evga.com/forums/fb.ashx?m=951850

According to him, the release is within 1-2 weeks; all the other good info is in there too.

I am also selling my GTX 580s if interested, pretty water block installed


----------



## Juggalo23451

Quote:


> Originally Posted by *Alatar;13068647*
> ...
> 
> We don't need this bickering in this thread
> 
> Would you guys please stop?
> 
> Anyways I'm adding juggalos GTX 590 taken apart thread to the OP since it seems to have some pretty good info.
> 
> Thanks for that Juggalo


No problem, glad to have helped out


----------



## ReignsOfPower

So guys, I've finally got a stint of spare time. Going to run a few benchmarks for you lot on my 30"

AVP 2560x1600 Max Everything (can't seem to find the right benchmark tool, can someone direct me?)
Crysis 2560x1600 Max Everything
Unigine Heaven Benchmark 2560x1600 Max Everything
3D Mark Scores are in my siggy.

Wish me luck!









[EDIT]
Ok, I've already done some benchies as per here:
http://www.overclock.net/12967945-post492.html

Here is one overclocked 3D Mark 11 run for you guys: P9881
http://3dmark.com/3dm11/1008332

Following are stock results

Unigine Heaven 2.5 1600P, 8xAA, 16xAF, Extreme Tess









Crysis 1600P, DX10, Very High All, 8xAA, 16xAF









Cinebench 10.5 OpenGL Test (not sure if a non-fullscreen app activates dual-GPU cards)


----------



## soilentblue

Can someone please update the first post for me? 685MHz / 1900 memory @ stock .925v for me.


----------



## grunion

Quote:



Originally Posted by *ReignsOfPower*


So guys, I've finally got a stint of spare time. Going to run a few benchmarks for you lot on my 30"

AVP 2560x1600 Max Everything (can't seem to find the right benchmark tool, can someone direct me?)
Crysis 2560x1600 Max Everything
Unigine Heaven Benchmark 2560x1600 Max Everything
3D Mark Scores are in my siggy.

Wish me luck!









[EDIT]
Ok, I've already done some benchies as per here:
http://www.overclock.net/12967945-post492.html

Here is one overclocked 3D Mark 11 run for you guys: P9881
http://3dmark.com/3dm11/1008332

Following are stock results
--------------------------

Crysis 1600P, DX10, Very High All, 8xAA, 16xAF











Can you run that at x0 and x4 AA please?


----------



## soilentblue

I ran crysis last night and received an average of 50fps at x4 AA on my resolution


----------



## Masked

I've been getting [email protected]

1 thing I have been noticing (over-zealous or not) is the difference between gaming and benching.

Say I pull out metro...The card is running well, cores are at 80/80 ~ Even usage...90/85 etc, even enough to say it's working as intended.

The moment I pull up and bench, like Heaven or 3Dmark, I see 80/50, 70/20, 100/40...This is a major issue to me...

If we do a bit more research via Aida, core 1 is at .913, core 2 is at .950 ~

If I reset the card to stock settings and run the same bench, I get the same results only a little less extreme...70/50, 60/40...However, it's obvious the 2nd core is still being limited.

In editing my drivers, I find there are several lines of code controlling the max voltage on the cores themselves...Which, if the card is ramping up to 100%, the 2nd core would then be limited because it's already [email protected] over the 1st core.

What it doesn't explain is why we see such good performance in gaming, even to the extent of 100/100 vs benching where we see something like 80/40.

It definitely explains the reviews, especially when overclocked...Which, I'm also shocked not a single one of them bothered to test core 1v vs core 2v...They simply just popped the voltage to 1.2...No wonder they fried...Seriously!

I'm on the fence... In looking at these drivers, I don't really understand what they're thinking. Jacob on EVGA is definitely 100% right though: benches are not functioning correctly, and they're actually cutting down performance by a tremendous amount.
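The usage pairs above make the pattern easy to quantify: averaging the gap between the two cores' utilization cleanly separates the benching case from the gaming case. A quick sketch using Masked's own numbers; `core_imbalance` is a made-up helper name, not a feature of any monitoring tool:

```python
def core_imbalance(usage_pairs):
    """Average utilization gap between core 1 and core 2, in percentage
    points. usage_pairs is a list of (core1_pct, core2_pct) samples."""
    gaps = [c1 - c2 for c1, c2 in usage_pairs]
    return sum(gaps) / len(gaps)

# Masked's benching samples vs. his gaming samples:
bench = [(80, 50), (70, 20), (100, 40)]  # core 2 clearly being held back
game = [(80, 80), (90, 85)]              # near-even usage
print(core_imbalance(bench))  # about 46.7 points of imbalance
print(core_imbalance(game))   # 2.5 points of imbalance
```

A gap of ~47 points while benching versus ~2.5 while gaming is consistent with OCP clamping the second core only when the benchmark pushes both cores to their power window.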


----------



## maur0

Linkie @ Jacob?


----------



## soilentblue

aquagrafx how to on installing the block. the waterblock seriously looks better in this vid than it does in any of the picts so far. I think it's the silver on silver.


----------



## jcde7ago

Quote:



Originally Posted by *maur0*


Linkie @ Jacob?


Fairly certain it's in one of the 'official benchmark scores' threads over at the EVGA 500 Series forum... I'd look it up for you but I am on my morning work commute and using Tapatalk :/


----------



## Masked

Quote:



Originally Posted by *maur0*


Linkie @ Jacob?


Don't remember ~ It's been linked and ref'd quite a few times...Was just re-linked yesterday in another thread.

Basically the drivers aren't controlling the window correctly and thus the benches are not performing properly... On top of that, the card attempts to overvolt and hits a wall...

The 2nd core voltage is higher than the first, so the 2nd core is already hitting the wall when the card attempts to increase the voltages.

Wall

Benches and SOME games are illustrating this wall; some more than others.


----------



## Twilex

errrr wish i would have known EVGA was in the midst of making a 3gb 580 >< Guess we'll see how this pans out...


----------



## Masked

Quote:



Originally Posted by *Twilex*


errrr wish i would have known EVGA was in the midst of making a 3gb 580 >< Guess we'll see how this pans out...


That was rumored about a month ago too









Some of you guys really need to start learning how to use this mythical service called "Google", ooooooooh shiny!

That aside, sending you a PM Twilex


----------



## saulin

Another review

http://www.fudzilla.com/reviews/item...2-pov-?start=8

This time it's the overclocked GTX 590. Just like I thought, it seems to come with a voltage of 0.963 and it seems to do 725MHz.

I'm starting to think that this is the max voltage these cards will take and as far as they will probably go, since more voltage draws more power.

POV is supposed to release a water-cooled card with even higher clocks, so I'm wondering if that will just come overclocked to the max, which is probably around 720MHz
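The "more voltage draws more power" point can be roughed out with the usual first-order CMOS rule that dynamic power scales with f·V². Taking the 590's stock 607MHz @ 0.913V (the stock values mentioned earlier in the thread) as the baseline, the sketch below estimates the relative draw of the 725MHz overclock; it ignores static leakage, so the real increase is somewhat larger:

```python
def relative_power(f_mhz, v, f0_mhz=607.0, v0=0.913):
    """First-order dynamic-power estimate: P scales with f * V^2.
    Leakage is ignored, so this understates the true increase."""
    return (f_mhz / f0_mhz) * (v / v0) ** 2

# The 725MHz @ 0.963V overclock vs. the 590's stock 607MHz @ 0.913V:
print(f"{relative_power(725, 0.963):.2f}x stock power")  # roughly 1.33x
```

Even this optimistic estimate puts the overclock at about a third more power than stock, which is why OCP has so little headroom to give at 0.963V.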


----------



## soilentblue

Damn, that's not bad at all from POV. May need that BIOS lol. I wonder if the volts will stay the same for the watercooled "beast" version.


----------



## Mirjalovic

Getting mine in few days, i hope my card will be fine.


----------



## Alatar

updated

post if I forgot something


----------



## remer

Quote:



Originally Posted by *Alatar*


updated

post if I forgot something


I've been running at an overclock. Here's the proof.

Quote:



Originally Posted by *remer*


I've been tweaking my overclock. I rolled back to the 267.85 drivers so I could do some overvolting. In my case anything above 0.930v is causing some throttling, even with a custom fan profile. The highest 3DMark11 score I was able to get was P9551 with 690/1380/3600 and 0.925v.

At these clocks I appear to be right on the edge of OCP. Anything higher caused a drastic decline in performance. I haven't played around with the memory too much. The stability for me goes away somewhere between 1800 and 1850 MHz.

I think I'll stick with these clocks for a while. Hopefully the upcoming drivers will allow a little overvolting.


----------



## Masked

If you guys use aida, rivatuner, msi, even Dxdiag ~ I'd be interested to know at what voltage your cores are...

Both stock and while benching ~ Especially anyone over 700mhz ~


----------



## Alatar

Quote:



Originally Posted by *remer*


I've been running at an overclock. Here's the proof.


corrected


----------



## soilentblue

Quote:



Originally Posted by *Masked*


If you guys use aida, rivatuner, msi, even Dxdiag ~ I'd be interested to know at what voltage your cores are...

Both stock and while benching ~ Especially anyone over 700mhz ~


gpu #1 .925mv
gpu #2 .913mv

that's my max for the two. about to try and run crysis to see what the volts get up to


----------



## Masked

Quote:



Originally Posted by *soilentblue*


gpu #1 .925mv
gpu #2 .913mv

that's my max for the two. about to try and run crysis to see what the volts get up to


That's interesting...Even at work atm: Core 1 .925v; Core 2 .940v

This card is stock too...I find that extremely interesting.

What are your cores VS voltage while running something like Crysis 2 vs Benching?


----------



## soilentblue

Cores are at 685 and volts seem to be consistently stable for both chips. GPU load during the Crysis 1 benchmark fluctuates between 95%-64% constantly. I have yet to see gameplay or a benchmark with a consistent 95-99% GPU load.


----------



## capchaos

My core voltages used to be all different, but after changing the min 3D voltage in the BIOS they all stay the same. I can confirm this because I had 4 instances of GPU-Z running to monitor all 4 cores


----------



## soilentblue

capchaos, you have any pics of your setup?


----------



## ReignsOfPower

Quote:


> Originally Posted by *grunion;13082761*
> Can you run that at x0 and x4 AA please?


I've done this before, if I recall it was 54FPS 4x AA and about 60 on 0x AA. When I get home from work I'll run those settings for you.


----------



## capchaos

this is my rig










----------



## Freiya

That looks amazing, what fan is that?


----------



## jcde7ago

HOLY FANS, BATMAN!


----------



## capchaos

Those are 18 Aerocool Shark fans in push/pull on a Mo-Ra3 rad


----------



## Twilex

My voltages are .913 and .938. They don't seem to fluctuate at all no matter what I do. I have only really been playing Bad Company 2 and they both stick at 95%+ usage the entire time I'm playing.


----------



## RagingCain

Quote:



Originally Posted by *capchaos*


Those are 18 Aerocool Shark fans in push/pull on a Mo-Ra3 rad


Very bright. I like the dark loop setups; I should post mine.

Is that your only RAD or is that just one RAD external? Why didn't you go for GentleTyphoons if you don't mind me asking? Some package deal?


----------



## capchaos

I just like the airflow of the Shark fans. They are running at 7 volts instead of 12v; 12v sounded like a wind tunnel. I have a 240 between the 990X and the GPUs. Don't know if I really need it though, the Mo-Ra3 1080 can handle it all.


----------



## RagingCain

Quote:



Originally Posted by *capchaos*


I just like the airflow of the Shark fans. They are running at 7 volts instead of 12v; 12v sounded like a wind tunnel. I have a 240 between the 990X and the GPUs. Don't know if I really need it though, the Mo-Ra3 1080 can handle it all.


Next time I am in Perry I will have to invite myself over haha.

Just as well, I currently have 6 of those 150 CFM GentleTyphoons... there is nothing gentle or quiet about them. I don't have a fan controller with enough amperage to put them on my external PA120.3 radiator in push/pull, so I am just running 3 Ultra Kazes on low speed. Seems to be working, great temps in Prime95 and Furmark.


----------



## capchaos

Think I'm about to go to three 3GB 580s even though I've only had these for a short time. I do think the 590s are great cards, I don't care how many people talk crap about them. I just need more vram for my three 30-inchers. Will probably move one to my other rig that is hooked up to my TV and sell the other.


----------



## reflex99

eVGA 590 ordered from TigerDirect

Should be here around the 25th


----------



## Alatar

Congratulations on your purchase!

Post some pics when you get it


----------



## reflex99

I am excited. First "new" (read: not used) computer part I have bought in a loooooong time.

Having a warranty in MY name is going to be awesome.

As Ms. Black put it:
We We We so excited
We So excited!


----------



## jcde7ago

Quote:



Originally Posted by *capchaos*


Think I'm about to go to three 3GB 580s even though I've only had these for a short time. I do think the 590s are great cards, I don't care how many people talk crap about them. I just need more vram for my three 30-inchers. Will probably move one to my other rig that is hooked up to my TV and sell the other.


Wow, 3x30 inch monitors? At 2560x1600 each?

If so, I'm extremely jealous... and hell, you might need Tri/Quad SLI 3GB 580s to handle that amount of screen real estate.


----------



## reflex99

Anyone want to share their experience with surround?

The first time I tried it (2x 470s), it did not work at all. So I am hoping that they have fixed it by now.


----------



## capchaos

As long as I don't touch any aa setting I can play most games in surround with 60+fps but I am at the vram limit almost always


----------



## zerounleashednl

Quote:



As long as I don't touch any aa setting I can play most games in surround with 60+fps but I am at the vram limit almost always


How do you measure the load on the vram?


----------



## Freiya

^

Gpu Z


----------



## capchaos

Or msi afterburner with the on-screen display
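If you'd rather log it from a script instead of watching an overlay, newer NVIDIA drivers expose the same counters through the `nvidia-smi` command-line tool. A minimal Python sketch; the `--query-gpu` flags assume a driver recent enough to support the CSV query interface, and the sample string stands in for real output from a dual-GPU card:

```python
import subprocess

def gpu_memory(smi_output=None):
    """Parse `nvidia-smi --query-gpu=memory.used,memory.total
    --format=csv,noheader,nounits` into (used_mb, total_mb) per GPU."""
    if smi_output is None:
        smi_output = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"],
            text=True)
    return [tuple(int(v) for v in line.split(","))
            for line in smi_output.strip().splitlines()]

# Stand-in for real output from a dual-GPU card:
sample = "1421, 1536\n1398, 1536"
print(gpu_memory(sample))  # [(1421, 1536), (1398, 1536)]
```

GPU-Z and Afterburner are still the easier options for watching it live; this is just handy for logging vram use over a long gaming session.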


----------



## RagingCain

Quote:


> Originally Posted by *reflex99;13092930*
> Anyone want to share their experience with surround?
> 
> The first time i tried it (2x470s), it did not work at all. So i am hoping that they have fixed it by now.


I just set mine up two days ago. It is working well; the frame rate is low (40-60) in Crysis 2, but hey, that's only on 2x 580s at 5760x1080.

At 1920x1080, I am near 180 fps, so if I ever need to switch back I will


----------



## Triangle

Shall I get a 590??


----------



## reflex99

My card is backordered.

Fuuuuuuuuuuuuuuu tigerdirect


----------



## RagingCain

Quote:


> Originally Posted by *reflex99;13104710*
> My card is backordered.
> 
> Fuuuuuuuuuuuuuuu tigerdirect


Call EVGA Sales, I got off the phone with them today. *hint hint*

Both my replacement cards are on their way back, and I actually spoke to Jacob's boss. It felt pretty cool.


----------



## Triangle

I think I will get one...


----------



## reflex99

Quote:


> Originally Posted by *RagingCain;13105000*
> Call EVGA Sales, I got off the phone with them today. *hint hint*
> 
> Both my replacement cards are on their way back and actually spoke to Jacob's boss. It felt pretty cool.


confused face









are you implying that it is possible to get one directly from them if you work them on the phone?

(if i did, i would have to pay $70+ in tax though)


----------



## rush2049

Okay so question..... was playing some l4d2 earlier and it BSOD on me..... I had reverted everything in my system to stock, stock cpu, stock gpu, stock mobo, stock ram......

I didn't even have a custom fan profile for the 590 going and it BSOD'ed after ~1 hour of L4D2..... now I don't know what would the cause be, but I am guessing heat.

I tried to start up immediately and got screens looking like the following (frozen at this point):










See the pixels of color across the top..... now what is people's opinion on this?

After 20-30 min of waiting with the power off / cord pulled out, it booted up fine and I checked all the temps, and they are all < 40 C...... I don't know what is going on.


----------



## whiz882

GTX590 Quad SLI Reviews










☆3DMARK VANTAGE / CPU 4.8GHz

Clock(MHz)FSB:4818.7MHz 209.5×23

Vcore(V):1.488v(CPUZ)

QPI voltage:1.47v

Memory voltage:1.65v

DRAM Frequency:1047.5MHz

DRAM Timing:8-8-8-20-1T

VGA clock/mem/shader:750/1900/1500

Score: 65280










Crysis Benchmark Results Thread ↓

http://www.overclock.net/benchmarking-software-discussion/303427-crysis-benchmark-results-thread.html

☆Crysis DX10 Results : 2560x1600, 8xAA , Very High

Clock(MHz)FSB:4009.9MHz 211.0×19

Vcore(V):1.191v(CPUZ)

QPI voltage:1.445v

Memory voltage:1.64v

DRAM Frequency:1055.1MHz

DRAM Timing:8-8-8-20-1T

VGA clock/mem/shader:750/1900/1500

Avg. FPS :72.815










☆Crysis DX9 Results : 1280x1024, No AA, High.

Clock(MHz)FSB:5011.8MHz 200.5×25

Vcore(V):1.521v(CPUZ)

QPI voltage:1.475v

Memory voltage:1.64v

DRAM Frequency:1002.4MHz

DRAM Timing:8-8-8-20-1T

VGA clock/mem/shader:750/1900/1500

Avg. FPS :127.16


----------



## derickwm

Good god man..that is amazing.


----------



## Masked

Quote:


> Originally Posted by *RagingCain;13105000*
> Call EVGA Sales, I got off the phone with them today. *hint hint*
> 
> Both my replacement cards are on their way back and actually spoke to Jacob's boss. It felt pretty cool.


EVGA is legit people.

Back in the day we used to have EVGA-exclusive contracts (back when things were really custom), and when I first started working here in 2005 they dissolved most of those contracts (was the start of Dell)... It's unfortunate, because every call I had to make to them was ridiculously easy, pain free, and I actually enjoyed it!

Now I get a new guy, just as legit, but I miss those days of RMA'ing ridiculously crazy inventory because of a bad capacitor or something; serious $$$$..

And Mr. Rush, I'd pick up a wonderful program named RivaTuner, AIDA or something similar... Monitor your real-time fan speeds, not your "reported" fan speeds... It sounds like you, sir, are having a heat issue.

If you monitor them and the problem does indeed persist, I'd contact your vendor for RMA.


----------



## Farih

Quote:


> Originally Posted by *rush2049;13107248*
> Okay so question..... was playing some l4d2 earlier and it BSOD on me..... I had reverted everything in my system to stock, stock cpu, stock gpu, stock mobo, stock ram......
> 
> I didn't even have a custom fan profile for the 590 going and it BSOD'ed after ~1 hour of L4D2..... now I don't know what would the cause be, but I am guessing heat.
> 
> I tried to start up immediately and got screens looking like the following (frozen at this point):
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> See the pixels of color across the top..... now what is peoples opinion on this?
> 
> After 20-30 min of waiting with the power off / cord pulled out, it booted up fine and I checked all the temps, and they are all < 40 C...... I don't know what is going on.


That seems like some sort of artifacting m8









I had weird dots like that on my first XFX 5870 too; I would RMA it if you can't solve it.


----------



## soilentblue

Quote:


> Originally Posted by *rush2049;13107248*
> Okay so question..... was playing some l4d2 earlier and it BSOD on me..... I had reverted everything in my system to stock, stock cpu, stock gpu, stock mobo, stock ram......
> 
> I didn't even have a custom fan profile for the 590 going and it BSOD'ed after ~1 hour of L4D2..... now I don't know what would the cause be, but I am guessing heat.
> 
> I tried to start up immediately and got screens looking like the following (frozen at this point):
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> See the pixels of color across the top..... now what is peoples opinion on this?
> 
> After 20-30 min of waiting with the power off / cord pulled out, it booted up fine and I checked all the temps, and they are all < 40 C...... I don't know what is going on.


did it blue screen with a "dumping physical memory" issue?


----------



## RagingCain

Quote:


> Originally Posted by *whiz882;13107703*
> GTX590 Quad SLI Reviews
> 
> ☆3DMARK VANTAGE / CPU 4.8GHz
> Clock(MHz)FSB:4818.7MHz 209.5×23
> Vcore(V):1.488v(CPUZ)
> QPI voltage:1.47v
> Memory voltage:1.65v
> DRAM Frequency:1047.5MHz
> DRAM Timing:8-8-8-20-1T
> VGA clock/mem/shader:750/1900/1500
> Score: 65280
> 
> Crysis Benchmark Results Thread ↓
> http://www.overclock.net/benchmarking-software-discussion/303427-crysis-benchmark-results-thread.html
> 
> ☆Crysis DX10 Results : 2560x1600, 8xAA , Very High
> Clock(MHz)FSB:4009.9MHz 211.0×19
> Vcore(V):1.191v(CPUZ)
> QPI voltage:1.445v
> Memory voltage:1.64v
> DRAM Frequency:1055.1MHz
> DRAM Timing:8-8-8-20-1T
> VGA clock/mem/shader:750/1900/1500
> 
> Avg. FPS :72.815
> 
> ☆Crysis DX9 Results : 1280x1024, No AA, High.
> Clock(MHz)FSB:5011.8MHz 200.5×25
> Vcore(V):1.521v(CPUZ)
> QPI voltage:1.475v
> Memory voltage:1.64v
> DRAM Frequency:1002.4MHz
> DRAM Timing:8-8-8-20-1T
> VGA clock/mem/shader:750/1900/1500
> 
> Avg. FPS :127.16


Good job on the benchies Whiz, but why is your QPI so freaking high for a 990X?

Look at your GPU usage under the Crysis benchmark; it looks to be about 65~70% as well. I suspect you might have an unstable overclock, probably CPU territory.

On the Crysis 4GHz run, you look to be undervolted quite a bit; have you run stability testing? If you are getting core errors, it's going to greatly reduce your score. Overvolting the QPI can increase the likelihood of this as well.

You may be able to increase your scores greatly without increasing your speed, just by adjusting the voltages.

I personally run (for 24/7) 4.100 GHz @ 1.270v (full load), so about 1.31875v in BIOS with VDroop on. My QPI is only +75mV, so it's the stock 1.200v + 75mV = 1.275v, with 30mV of LLC making it 1.305~1.310v at full load. You are entitled to run things however you want, but the Gulftowns are more sensitive to voltage when it comes to errors; too high or too low can cause them. Those voltages are as low as I can get without either a core failure or a BSOD. IBT 100 passes on Max Memory, and Prime95 Custom Blend with 21000 MB of RAM stable for 16 hours straight.

The max safe vCore & QPI voltage I have read is approximately 1.40v. Most enthusiasts agree going over 1.400v on QPI causes errors and does not aid the overclock. On the vCore, error town (without sub-zero) begins appearing around 1.59~1.60v, making your max OC limited by vCore.

Also, what are your Uncore/Northbridge & QPI frequencies? Having a stock Uncore can hold back your benchmarking. QPI should be set to around 3.5 GHz, and if you don't have direct control over that, you have to adjust the Bclk and lower the CPU multiplier. It's more complicated, I know, but worth it. If you scroll down EVGA's hall of fame, you can see I am in 4th place on 3DMark11 for 2x GPUs, primarily due to a boost in my Combo score at the end, nearly 1000 higher, just from raising my Uncore to a stable 3.6 GHz as well. It increases system intercommunication greatly. In addition to that, I have added a small boost to my IOH Core voltage, which has been shown to increase the performance of multi-card / multi-GPU setups. I run that at approximately 1.100v to 1.225v; not sure what the live voltages are, but I assume it doesn't go much higher or lower. Roughly 0.060v for every GPU.
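The QPI voltage arithmetic above is easy to sanity-check with a throwaway helper. The numbers are the ones quoted in this post (stock 1.200v, +75mV offset, 30mV of LLC), not universal targets:

```python
def qpi_load_voltage(stock=1.200, offset=0.075, llc=0.030):
    """Set voltage is stock + BIOS offset; load-line calibration (LLC)
    adds a little back under load instead of letting it droop."""
    set_v = stock + offset
    load_v = set_v + llc
    return set_v, load_v

set_v, load_v = qpi_load_voltage()
print(f"set: {set_v:.3f}v, full load: {load_v:.3f}v")  # set: 1.275v, full load: 1.305v
```

Swap in your own offset and LLC numbers to see roughly where your load voltage should land before trusting a software readout.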








Quote:


> Originally Posted by *rush2049;13107248*
> Okay so question..... was playing some l4d2 earlier and it BSOD on me..... I had reverted everything in my system to stock, stock cpu, stock gpu, stock mobo, stock ram......
> 
> I didn't even have a custom fan profile for the 590 going and it BSOD'ed after ~1 hour of L4D2..... now I don't know what would the cause be, but I am guessing heat.
> 
> I tried to start up immediately and got screens looking like the following (frozen at this point):
> 
> See the pixels of color across the top..... now what is peoples opinion on this?
> 
> After 20-30 min of waiting with the power off / cord pulled out, it booted up fine and I checked all the temps, and they are all < 40 C...... I don't know what is going on.


Where it freezes (not just how) can mean many things, and where you froze really opens up a ton of possibilities. It could be unstable clocks (which you said are running stock), GPU heat, GPU drivers, or system files corrupting.

Every BSOD has a chance to break Windows; we are just living in an age where there is tons of redundancy and repairability. I doubt the card has damage, because the Windows loading logo produces virtually no load on the GPU, so I would say it's software-based artifacting as opposed to a physical problem, UNLESS you continue to have issues in games.

The length of time you reported would definitely have me concerned, because an hour in game should already be at its maximum temp for whatever game that is: 70c, 80c, 85c, 90c <- entering the unsafe zone there, etc. So try it again using an on-screen temp monitor such as Afterburner's OSD, which can monitor everything (it can be somewhat annoying while actually playing, but just keep an eye on temps for an hour). If the same thing occurs and it's 85c and below on each GPU, I would suspect either the driver OR another piece of hardware such as RAM. At which point you might want to post more data for us to diagnose better, like the memory dump/stop code etc.

Your best bet is to see what caused it (I suspect drivers) and then just re-install them to see if it helps. Could just be a driver bug with L4D; nVidia can have them, RARE I know









To find out some info on the Crash (for anybody who wants to know)
Start -> Control Panel -> Administrative Tools -> Computer Management -> Event Viewer

Display all sub-options after selecting Event Viewer, then go to Windows Logs and expand it.

There are two spots I check, Applications and then System, looking primarily for a red circle with a ! (exclamation point) in it. This will give you basic info (without having to tear through a memory dump, if you did in fact get one.)

Once you find it, you can open it up into its own window or just click on it and see the summary below the list of what kind of error it was. You can even post it in here if you are unsure; it has an .XML format which is easy to follow if you are familiar with XML.

Look for values or .dll/.sys entries (like nvlddmkm) that start with nv
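If you save the log out of Event Viewer as .xml, you can filter it programmatically too. A rough sketch, assuming the exported events are wrapped in a single root element (the sample entries below are made up for illustration; real exports use the same per-event layout):

```python
import xml.etree.ElementTree as ET

# Default namespace used by Windows Event Log XML exports.
NS = {"e": "http://schemas.microsoft.com/win/2004/08/events/event"}

def nvidia_events(xml_text):
    """Return provider names of events whose provider starts with 'nv'
    (e.g. the nvlddmkm display driver) from an exported event log."""
    root = ET.fromstring(xml_text)
    hits = []
    for event in root.findall("e:Event", NS):
        provider = event.find("e:System/e:Provider", NS)
        name = provider.get("Name", "") if provider is not None else ""
        if name.lower().startswith("nv"):
            hits.append(name)
    return hits

sample = """<Events>
  <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
    <System><Provider Name="nvlddmkm"/><Level>2</Level></System>
  </Event>
  <Event xmlns="http://schemas.microsoft.com/win/2004/08/events/event">
    <System><Provider Name="Microsoft-Windows-Kernel-Power"/><Level>2</Level></System>
  </Event>
</Events>"""
print(nvidia_events(sample))  # ['nvlddmkm']
```

Anything sourced from the display driver showing up around the crash time points at the GPU driver rather than other hardware.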


----------



## Alatar

Hey guys

Any idea if the EK backplate for the 590 would fit the card if it doesn't have a waterblock installed? I'm looking at the installation manual and they don't mention if it fits without a WB. Any ideas?

Here's the manual

My asus card has the reference backplates that I don't like so much, so I'm looking for alternatives.

Thanks


----------



## Masked

Quote:


> Originally Posted by *Alatar;13110071*
> Hey guys
> 
> Any idea if the EK backplate for the 590 would fit the card if it doesn't have a waterblock installed? I'm looking at the installation manual and they don't mention if it fits without a WB. Any ideas?
> 
> Heres the manual
> 
> My asus card has the reference backplates that I don't like so much, so I'm looking for alternatives.
> 
> Thanks


They fit the 480's...I imagine they'll fit the 590's.

They're connected to the card VIA the 4 corners...


----------



## Alatar

Quote:



Originally Posted by *Masked*


They fit the 480's...I imagine they'll fit the 590's.

They're connected to the card VIA the 4 corners...


hmm thanks for that, I guess I could shoot them an e-mail before ordering one...


----------



## potitoos

just post a pic or a screenie when they arrive and you'll get your place next to juggalo's quad SLI

I'm supposed to get my card today, it's 7am here, have to go to school first. Let's hope the post office doesn't fail or something.


----------



## soilentblue

Quote:


> Originally Posted by *Alatar;13110476*
> hmm thanks for that, I guess I could shoot them an e-mail before ordering one...


I would shoot them an email, but it should fit without the waterblock; no reason why it shouldn't.


----------



## RagingCain

Quote:



Originally Posted by *reflex99*


confused face









are you implying that it is possible to get one directly from them if you work them on the phone?


Big resounding yes, PM me your email address.


----------



## ranerX3

OUCH! This club makes me wanna own GTX 590 SLI







...


----------



## Masked

Quote:



Originally Posted by *RagingCain*


Big resounding yes, PM me your email address.


They don't have many left though from my understanding VIA Jacob.

I'd get on it soon if you want one.

It's interesting there isn't an R2; it's just staggered.

First time Nvidia has ever done that.


----------



## RagingCain

Quote:



Originally Posted by *Masked*


They don't have many left though from my understanding VIA Jacob.

I'd get on it soon if you want one.

It's interesting there isn't an R2; it's just staggered.

First time Nvidia has ever done that.


I have confirmed they do not physically have any in; mine were part of the last 3 shipped yesterday. They expect to have some more in stock relatively soon.

I did speak to him privately about RMAs and promised to keep what he said private.

I don't believe we have much to worry about, period. As a reminder, any OCing/voltage increasing done within the allowable realms of the drivers is indeed covered by warranty; editing the BIOS is not. They can even tell if you have flashed a BIOS other than the factory one and which revision it was, so if it was anything other than an official update you could risk voiding the warranty right off the bat.


----------



## Masked

Quote:



Originally Posted by *RagingCain*


I have confirmed they do not physically have any in; mine were part of the last 3 shipped yesterday. They expect to have some more in stock relatively soon.

I did speak to him privately about RMAs and promised to keep what he said private.

I don't believe we have much to worry about, period. As a reminder, any OCing/voltage increasing done within the allowable realms of the drivers is indeed covered by warranty; editing the BIOS is not. They can even tell if you have flashed a BIOS other than the factory one and which revision it was, so if it was anything other than an official update you could risk voiding the warranty right off the bat.


RMA's are covered via all retailers at this point...Nobody should worry about RMA's and those quantities.

EVGA I believe is 3/4 covered which, will never come to fruition unless we're all EXTRA ******ed one day.

I don't think EVGA would void anyone for editing the drivers, but the BIOS, most definitely.

In any event, I think you're absolutely correct and nobody has anything to worry about


----------



## coolhandluke41

Can someone confirm that you can run two 590s in quad on any mobo with 2 PCIe slots @ x16/x16? The MB in question is the G1 Sniper.
Thank you


----------



## RagingCain

Quote:



Originally Posted by *coolhandluke41*


Can someone confirm that you can run two 590s in quad on any mobo with 2 PCIe slots @ x16/x16? The MB in question is the G1 Sniper.
Thank you


Yes? I don't see why not...


----------



## whiz882

Quote:



Originally Posted by *RagingCain*


Good job doing benchies Wiz, but why is your QPI so freaking high for a 990x?

Look at your GPU usage under the Crysis benchmark, its looks to be about 65~70% as well. I anticipate you might have an unstable overclock, probably CPU territory.

On the Crysis 4GHz run, you look to be undervolted quite a bit, have you run stability testing? If you are getting core errors, its going to greatly reduce you score. Over voltage on the QPI can increase likelihood of this as well.

You maybe able to increase your scores greatly without increasing your speed, just adjusting the voltages.

I personally run (for 24/7) 4.100 GHz @ 1.270v (full load), so about 1.31875v in BIOS with VDroop on. My QPI is only +75mV, so its stock of 1.200v + 75 = 1.275v, with a 30 mV LLC, making it 1.305~1.310 at full load. You are entitled to run things however you want, but the Gulftowns are more sensitive from voltages when it comes to error, too high or too low can cause errors. Those voltages are as low as I can get without getting either a Core failure / BSOD. IBT 100 passes Max Memory and Prime95 Custom Blend with 21000 MB of RAM stable for 16 hours straight.

The max safe vCore & QPI voltages I have read is approximately 1.40v. However, most enthusiasts agree going over 1.400v on QPI causes errors, and does not aid overclock. Now on the vCore, error town (without sub-zero) begins appearing around 1.59~1.60v, thus making your max OC limited to vCore.

Also what are your Uncore/Northbridge & QPI frequencies? Having a stock Uncore can hold back your benchmarking. QPI should be set to around 3.5 GHz and if you don't have direct control over that, have to adjust Bclk and lower CPU multiplier. Its more complicated I know but worth it. If you scroll down the EVGA's hall of fame, you can see I am in 4th place on 3DMark11 for 2x GPUs, but its primarily due to a boost from my Combo score at the end, nearly a 1000 higher, just from raising my Uncore to a stable 3.6 GHz as well. It increases system intercommunication greatly. In addition to that, I have added a small boost to my IOH Core voltage, which has been shown to increase the performance of multiple card / multi GPU setups. I run that at approximately from 1.100v to 1.225v, not sure what the live voltages are but I assume it doesn't go much higher or lower. 0.060v for every GPU











RagingCain, thanks for your advice and kindness. My QPI voltage goes over 1.400v because the northbridge frequency is adjusted to over 4000MHz; the score is better that way.


----------



## rush2049

Well, for me my stability issues seem to be resolved when I force a custom fan profile on the card..... keeping it in the 50-65 degree range.

I hope nvidia releases a new driver here before portal 2 comes out.... would love some performance increases.....


----------



## kcuestag

The GTX590 arrived 2 days ago at a friend's house. He's using the latest drivers from Nvidia's site; so far so good, the card performs like a beast and is decently quiet. I am impressed









So yeah, I'll tell him to register in OCN so he can join the 590 owners club!









By the way, he's on a Corsair HX750, you guys think he'll be fine? He's got an i7 930 @ 4.2Ghz HT.


----------



## Arizonian

Quote:


> Originally Posted by *kcuestag;13120374*
> GTX590 arrived 2 days ago at a friend's house, he's using latest drivers from Nvidia's site, so far so good, the card performs like a beast and has a decent low noise, I am impressed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So yeah, I'll tell him to register in OCN so he can join the 590 owners club!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> By the way, he's on a Corsair HX750, you guys think he'll be fine? He's got an i7 930 @ 4.2Ghz HT.


That power supply will be perfect for his GTX 590, he won't have any problems unless he's thinking SLI for quad power. Glad it's all good. He's going to cut through FPS like a hot knife through butter, without any over clocking.


----------



## kcuestag

Hehe, I don't think he can afford another GTX590 right now, plus he won't really need more GPU power for at least a year or 2









Thanks a lot, then we'll keep him that PSU







Thank you!


----------



## krazyatom

hey guys,

The only reason I want to get the GTX 590 over the AMD 6990 is that it's shorter than the 6990.
The 6990 doesn't fit in my case, so my only option is the GTX 590. I saw many reviews and it seems the 6990 is better overall. What should I do?


----------



## kcuestag

If you don't want to change your case, get a GTX590; it's a beast of a card and it's almost as good as the HD6990. Neither card will disappoint you









Choose whichever your heart tells you


----------



## Levesque

Quote:


> Originally Posted by *krazyatom;13120898*
> hey guys,
> 
> Only reason I want to get gtx 590 over amd 6990 is that they're shorter than 6990.
> 6990 doesn't fit in my case, so my only option is gtx 590. I saw many reviews and it seems 6990 is better overall. What am I going to do?


You could always buy a new case, since graphics cards are getting bigger and bigger every year. And come see us in the 6990 owners thread too; some crazy OCs under water are starting to show up.


----------



## blues man

What's New


----------



## Masked

Quote:



Originally Posted by *Levesque*


You could always buy a new case, since graphics cards are getting bigger and bigger every year. And come see us in the 6990 owners thread too; some crazy OCs under water are starting to show up.


Reported...again...for the umpteenth time this morning...Congratulations for being a troll on epic scale.

The volts on my C1 vs C2 at work are off a bit... One's at .913, the other is at .887 ~ This is my work machine so I don't quite give a rat's ass; however, this is disconcerting considering C1 is supposed to match C2...


----------



## kcuestag

Quote:



Originally Posted by *Masked*


Reported...again...for the umpteenth time this morning...Congratulations for being a troll on epic scale.

The volts on my C1 vs C2 at work are off a bit... One's at .913, the other is at .887 ~ This is my work machine so I don't quite give a rat's ass; however, this is disconcerting considering C1 is supposed to match C2...


Actually I wouldn't worry, I've seen many other GTX590s where the cores have different voltages.

I take it as a normal thing; I wouldn't worry if your cores have different voltages


----------



## Masked

Quote:



Originally Posted by *kcuestag*


Actually I wouldn't worry, I've seen many other GTX590's where the cores have different voltages.

I take it as a normal thing? I wouldn't worry if ur cores have different voltages


















Go back like 30 pages...

My cards are serial "twins", one right after the other...One card has a .95, .938; other card has whatever I just said.

They can both overclock to the same number however, both aren't volting the same at the same apparent numbers...Thus my above post.

If you guys are going to comment in here, please have SOME semblance of the previous conversation and its context.

ESPECIALLY if you're trolling and don't even own the damn card. (That was aimed at Levesque)


----------



## Alatar

Oh wow, just got a reply from EKWB and they said the backplate won't fit with the stock cooler...

Well I guess I'll stick with the stock one then.


----------



## Masked

Quote:



Originally Posted by *Alatar*


Oh wow, just got a reply from EKWB and they said the backplate won't fit with the stock cooler...

Well I guess I'll stick with the stock one then.


Really? That's very odd.

On the 480s/580s you could use it with or without the waterblock because the connectors themselves are washers that go on the 4 corners...

Hrm.

I wonder what changed?


----------



## reflex99

Glorious news comrades!

I just got an e-mail from TigerDirect. Basically, it says that they will be getting more 590s and they will be able to fulfill my order.









I was getting worried for a second there.

EDIT: What prevents you from just rolling back drivers to enable voltage control again?


----------



## kcuestag

Wait, how many did you order?









Congrats!


----------



## reflex99

1 lol.

Right after I ordered though, they changed the webpage to "item is unavailable" and sent me an email saying it was back ordered.

This is what i got today though:

Quote:



Due to an unusually large response, we are temporarily out of stock on the item(s) listed below. Additional quantities are on order and we expect them soon. Your order will receive priority upon receipt in our warehouse and we expect to be able to ship your order very soon. We are very sorry for the delay and assure you that we are making every effort to speed your order to you. If you would like to cancel or substitute your item please give us a call at 1-800-800-8300.


----------



## kcuestag

Oh ok, let's hope you get your card soon








Congrats, I want some pictures once it arrives!


----------



## soilentblue

Quote:



Originally Posted by *reflex99*


1 lol.

Right after i ordered though, they changed the webpage to "item is unavailible", and sent me an email saying it was back ordered.

This is what i got today though:


Might have one up for sale soon if you're interested







I need more vram.


----------



## reflex99

Let's hope it gets here before the end of the month.









Edit: I want my evga one. I need a new size xl shirt.


----------



## RagingCain

Quote:



Originally Posted by *soilentblue*


Might have one up for sale soon if you're interested







I need more vram.


Going for the GTX 3GB?

51% of me wants to keep the GTX 590s and 49% wants to switch to 3GB GTX 580s.


----------



## reflex99

590 is bigger epeen


----------



## RagingCain

Quote:



Originally Posted by *reflex99*


590 is bigger epeen


Its already big.

I'm worried that GTX 590 SLI isn't really that much more powerful than GTX 580 SLI overclocked like mine, which would really put it behind overclocked GTX 580 Tri-SLI. Just thinking in terms of the performance I see now vs. the performance I will get.

I can't help thinking 4 GPUs, with time, driver maturity and a slight overclock, is the best route. There are a lot of uncertain factors though: driver performance is never guaranteed, overclocking is the same, and which titles benefit from 4 GPUs is still yet to be seen.

With the exception of the newer GTX 580s, it looks like gaming @ 5760x1080 is fairly limited in performance on all GPUs, and apparently 3GB is good for 3x 2560x1600 monitors. Not to mention I can't see myself gaming in all titles on 3 monitors. I think it would be awesome for C&C, Starcraft 2, or Supreme Commander 1&2. Maybe WoW, but that's it. FPS games would probably be best played on a single monitor.

Jesus, its like living with two people in my head, hearing all the pros and cons over and over and over.


----------



## soilentblue

Quote:



Originally Posted by *RagingCain*


Going for the GTX 580 3GB?

51% of me wants to keep the GTX 590s and 49% wants to switch to 3GB GTX 580s.


Dragon Age 2 has me crashing the card from the VRAM limit. Pretty sure The Witcher will as well. Flat out, I can't use the card with my monitor. Just not enough VRAM. Not a bad card, and drivers will make it better, but the VRAM will never be going up.

EVGA GTX 580 3GB ($640ish each) or AMD 6990 ($700).


----------



## Masked

Quote:



Originally Posted by *RagingCain*


It's already big.

Worried that GTX 590 SLI isn't really that much more powerful than overclocked GTX 580 SLI like mine, which would really put it behind overclocked GTX 580 Tri-SLI. Just thinking in terms of the performance I see now vs. the performance I will get.

I can't help thinking 4 GPUs, with time, maturity in drivers, and a slight overclock is the best route. It looks like gaming @ 5760x1080 is fairly limited in performance on all GPUs, and apparently 3GB is overkill for that resolution. Not to mention I can't see myself gaming in all titles on 3 monitors. I think it would be awesome for C&C, Starcraft 2, or Supreme Commander 1&2. Maybe WoW, but that's it.

Jesus, it's like living with two people in my head, hearing all the pros and cons over and over and over.


To be quite honest, it's all about what you're looking for.

Considering the room I'm seeing between playing it safe and being stupid; I have no doubt this card will EVENTUALLY be opened up...The question is: when?

If you're looking for performance, the 580's are the way to go...This is a hands down, no bull**** answer.

Considering the scaling possible, I think 2x590's will BARELY beat out a tri-SLI 580 configuration...Right now, they're JUST under but, I eventually see the 590 coming out ahead.

That being said, if you're looking for a single slot solution, obviously the 590 is your card...

It's all about what YOU as the end user actually want...Performance wise, there's no argument 580's will win but, for the same price, the 590's are the better choice.

It's definitely a tough decision if you're on the fence because 2x590's = 3x580's...


----------



## Masked

Quote:



Originally Posted by *soilentblue*


Dragon Age 2 has me crashing the card from the VRAM limit. Pretty sure The Witcher will as well. Flat out, I can't use the card with my monitor. Just not enough VRAM. Not a bad card, and drivers will make it better, but the VRAM will never be going up.

EVGA GTX 580 3GB ($640ish each) or AMD 6990 ($700).


...I'm in the beta for The Witcher 2 and I don't crash with the 590...DA2 also never crashes...

Wouldn't say it's the vram but, that's just me.


----------



## soilentblue

Quote:



Originally Posted by *Masked*


...I'm in the beta for The Witcher 2 and I don't crash with the 590...DA2 also never crashes...

Wouldn't say it's the vram but, that's just me.


I've seen as high as 1800+ MB of VRAM in DA2 from others with 2GB cards. It's the VRAM. I am also pushing more pixels than you, though, and the Witcher 2 demo won't have the full force to crash the 590 like the final game.

how do you like the beta?


----------



## Masked

Quote:



Originally Posted by *soilentblue*


I've seen as high as 1800+ MB of VRAM in DA2 from others with 2GB cards. It's the VRAM. I am also pushing more pixels than you, though, and the Witcher 2 demo won't have the full force to crash the 590 like the final game.

how do you like the beta?


The beta's amazing but, I also only run it on 1 window so I can see why you'd have an issue.

NDA is pretty strict on this one so, that's all I can say.

I have other stuff going on so, I hear that.

I don't think their final release will be that taxing but, I could be wrong.

We were tossed directly into the beta so, I've never actually seen the demo TBH...I'll ask for a copy and let you know what I find, though.

I really can't see them requiring that much VRAM considering the average consumer...They're very level-headed about that.


----------



## toX0rz

stock


----------



## kcuestag

Very nice! Congrats on your purchase, looks very pretty!


----------



## Alatar

well well, the first MSI card to be seen around here









E: added.


----------



## RagingCain

Haha, I knew MSIalex was holding out on me!

@Congrats Toxor, do you feel like pulling off the heatsink and having some snapshots taken of both the front and the back? It looks reference, but I can't tell 100%. I was wondering if MSI put in an extra VRM or two.









It is reference design, still good


----------



## Cyanotical

Quote:



Originally Posted by *reflex99*


1 lol.

Right after I ordered, though, they changed the webpage to "item is unavailable" and sent me an email saying it was back-ordered.

This is what i got today though:


ah ***, I ordered two of them for my new build and got no such email


----------



## toX0rz

Quote:



Originally Posted by *kcuestag*


Very nice! Congrats on your purchase, looks very pretty!



Quote:



Originally Posted by *RagingCain*


Haha, I knew MSIalex was holding out on me!

@Congrats Toxor, do you feel like pulling off the heatsink and having some snapshots taken of both the front and the back? It looks reference, but I can't tell 100%. I was wondering if MSI put in an extra VRM or two.










thanks guys









@ RagingCain
well, I'd like to but I don't want to void the warranty (yeah, this isn't EVGA) and since I don't have a watercooling system I don't plan to put any block on it either.

All I know is that they are putting all solid caps on this card, but I dunno about the VRMs.
I know that MSI has been advertising this card as being OCable to 840MHz though, as various sites state:

http://www.techpowerup.com/142850/MS...hics-Card.html

http://news.softpedia.com/news/MSI-A...d-191424.shtml

Maybe MSIAlex can clear it up.


----------



## RagingCain

Quote:



Originally Posted by *toX0rz*


thanks guys









@ RagingCain
well, I'd like to but I don't want to void the warranty (yeah, this isn't EVGA) and since I don't have a watercooling system I don't plan to put any block on it either.

All I know is that they are putting all solid caps on this card, but I dunno about the VRMs.
I know that MSI has been advertising this card as being OCable to 840MHz though, as various sites state:

http://www.techpowerup.com/142850/MS...hics-Card.html

http://news.softpedia.com/news/MSI-A...d-191424.shtml

Maybe MSIAlex can clear it up.


Yeah, I saw those. If you do overclock, just be a little careful.


----------



## Masked

Quote:



Originally Posted by *toX0rz*


thanks guys









@ RagingCain
well, I'd like to but I don't want to void the warranty (yeah, this isn't EVGA) and since I don't have a watercooling system I don't plan to put any block on it either.

All I know is that they are putting all solid caps on this card, but I dunno about the VRMs.
I know that MSI has been advertising this card as being OCable to 840MHz though, as various sites state:

http://www.techpowerup.com/142850/MS...hics-Card.html

http://news.softpedia.com/news/MSI-A...d-191424.shtml

Maybe MSIAlex can clear it up.


What you guys aren't seeing are the issues.

Initially, AND at the time of production, that card and all COULD OC to 850MHz...The problem was that people did.

From my understanding, a LARGE chunk have already been RMA'd due to the stupidity of others.

NVIDIA then took it upon themselves to LOCK our cards due to the morons who OC'd beyond NVIDIA's suggested voltages.

That's not a fault of MSI; that's a decision by NVIDIA, and it's absolutely still working as intended.

Blame the community for that one.


----------



## RagingCain

Quote:



Originally Posted by *Alatar*


well well, the first MSI card to be seen around here









E: added.


Hey Alatar, you mind if I edit the spread sheet a bit?


----------



## Alatar

Quote:



Originally Posted by *RagingCain*


Hey Alatar, you mind if I edit the spread sheet a bit?


it's there for peeps to edit.

E: if you don't have permission, you need that before editing. Just request it and I'll add you when I get the mail.


----------



## RagingCain

@Alatar
I have permission lol, I was the one who set it up to begin with, hehe.

@Twilex/Tox
Added you both to the spreadsheet.

Okay, I have put Quads at the top of the list (as far as I can tell); let me know if I made a mistake. Let's try to fill out the information as accurately as we can.

I have included space for 3 benchmarks, with the idea that more can be added, but since just about anyone has or can get 3DMark 11 and/or Unigine Heaven, I have started with those two.

The post that has the benchmark info will be the hyperlink that goes in the proof box.

*3DMark 11 Requirements*:
Screenshot of all 4 scores with GPU-Z open, plus the 3DMark 11 verification link.

*Unigine Requirements* - still to be agreed upon:

For Unigine we have to agree on a standard preset for benchmarking. How does this sound:

1920x1080x8AA
All Settings on Highest, Extreme Tessellation.

Alternatively, we can do 2 or 3 resolutions if you would like; I think it's only fair that if you have higher resolutions, you should get a chance to use them.

1920x1080x8AA, 1920x1200x4AA, and 2560x1600x0AA (VRAM considerations). Maybe even a special category for multi-monitors.

Requires a screenshot of the score while the benchmark is still running, and another image of GPU-Z for clock/driver information.

What do you guys think? Let me know what other benchmarks you guys want, I am assuming 3DMark Vantage will be the next suggestion.

One other thing I would like to add is Power Supply, for actual reference to find out what's working and what's not; could be useful for future members.


----------



## Alatar

Quote:



Originally Posted by *RagingCain*


@Alatar
I have permission lol, I was the one who set it up to begin with, hehe.


yeah sorry about that heh, I just posted and didn't even look who I quoted


----------



## RagingCain

Okay guys, I have tweaked the spreadsheet quite a bit, please feel free to have a look.

Post any corrections or updates and I will do my best to stay on top of it. Rather than have gaps, I have filled in the data by scavenging throughout the thread (saving other people the work), or by filling in drivers used with the latest and clocks @ stock. Like I said, any changes, let me know.

Another good reason to fill out sig rigs informatively.

I think we should add a couple of easy benchmarking games:
Crysis (DX10), Metro 2033 (DX11), and AvP (DX11)?

What do you guys think?


----------



## Masked

I'd agree to that...Common games where we can test performance + resolutions etc.


----------



## reflex99

I can't do 1080p >.>


----------



## RagingCain

Quote:


> Originally Posted by *reflex99;13127181*
> I can't do 1080p >.>


You get ONE benchmark at 1680x1050 then, your pick =).

Just kidding we can do 3 resolutions for the games:
1680x1050, 1920x1080, 2560x1600.

I will adjust Heaven benchmark as well.

Since most monitors that have 1920x1200 can produce 1920x1080 resolution, I think that will easily cover everyone in this thread if you just go by those 3.

Metro 2033 is ready for benchmarks.

Figured Metro 2033 on Very High, DoF enabled, everything enabled but PhysX.

Has room for both AAA and 4xAA results.
1680x1050/1920x1080/2560x1600 Results tables included.

@Alatar
Do you think you could include the settings for all the benchmarks in the main post (the settings can be found in between this post and the last one.)


----------



## Masked

Quote:


> Originally Posted by *RagingCain;13127349*
> You get ONE benchmark at 1680x1050 then, your pick =).
> 
> Just kidding we can do 3 resolutions for the games:
> 1680x1050, 1920x1080, 2560x1600.
> 
> I will adjust Heaven benchmark as well.
> 
> Since most monitors that have 1920x1200 can produce 1920x1080 resolution, I think that will easily cover everyone in this thread if you just go by those 3.


I'm a stubborn ass and I use older Syncmasters but I'm swapping soon, I think...For the time being, just 1680.


----------



## remer

The new members list looks great.


----------



## ReignsOfPower

I don't have a Gmail account at the moment, so if anyone would like to add my 3DMark 11 score in, by all means do so







Check my siggy for details, was P9881 IIRC


----------



## RagingCain

Quote:


> Originally Posted by *ReignsOfPower;13130756*
> *I don't have a Gmail account* at the moment, so if anyone would like to add my 3DMark 11 score in, by all means do so
> 
> 
> 
> 
> 
> 
> 
> Check my siggy for details, was P9881 IIRC


Serious?

Updated, but I need your voltages for that 682MHz core overclock.

Also put a link to the 3DMark score in your post (in case you ever change your signature).

Data integrity


----------



## ReignsOfPower

Quote:


> Originally Posted by *RagingCain;13131460*
> Serious?
> 
> Updated, but I need your voltages for that overclock of 682 Core.
> 
> Also put a link to the 3DMark score in your post (if you ever change signature.)
> 
> Data integrity


Oh I have one, but I'm trying to keep it super clean. Android device only and stuff. I'll make another for general things (which I mainly use Hotmail for anyways). Oh, and as far as voltages go, I didn't move them at all from those already listed in the spreadsheet. Stock volts. Unless of course they fluctuate a bit when I bump up the speeds?

Also - Thanks for adding that in! Looks ace. I'll churn out all the other benchmarks for you guys when I get home


----------



## helifax

Quote:


> Originally Posted by *reflex99;13127181*
> I can't do 1080p >.>


Seconded; my Samsung 2233RZ monitors can't go beyond 1680x1050


----------



## Alatar

Quote:


> Originally Posted by *RagingCain;13127349*
> @Alatar
> Do you think you could include the settings for all the benchmarks in the main post (the settings can be found in between this post and the last one.)


updated the OP, the benchmark settings are below the spreadsheet. Just let me know if I got something wrong. I'm also assuming that metro 2033 has AAA and 4AA results for the 1680x1050 resolution?

Absolutely terrific job with the spreadsheet BTW!


----------



## zerounleashednl

Guys, I'm thinking about getting 1 or maybe even 2 GTX 590s. I'm really attracted to these cards...









But I'm a bit worried the VRAM will not be enough for the next wave of games coming out in 2011/2012. Can you give an example of the VRAM usage of a 590 in a recent game like Bad Company 2 or Crysis 2 with all settings at max on 1920x1080 (or 1200)?

Thx


----------



## Masked

Quote:


> Originally Posted by *zerounleashednl;13135138*
> Guys, I'm thinking about getting 1 or maybe even 2 GTX 590s. I'm really attracted to these cards...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But I'm a bit worried the VRAM will not be enough for the next wave of games coming out in 2011/2012. Can you give an example of the VRAM usage of a 590 in a recent game like Bad Company 2 or Crysis 2 with all settings at max on 1920x1080 (or 1200)?
> 
> Thx


I'd be more worried about finding 2, personally...Considering the last staggering is upcoming...


----------



## zerounleashednl

Quote:


> I'd be more worried about finding 2, personally...Considering the last staggering is upcoming...


I don't understand what you're saying...


----------



## Masked

Quote:



Originally Posted by *zerounleashednl*


I don't understand what you're saying...










The 590 was a staggered release. There was only 1 production run.

Meaning they had 5 quantities, right? So 1+1+1+1+1 = 5.

2 Quantities were sold on release day.

1 a week later.

1 a week after that.

That means there's only 1 release left and after that they're gone.

When I say gone I mean the rest will be used for RMA's.

So if you're even thinking about getting a NIB 590 (New in box) you have the next release or it's not happening.


----------



## reflex99

So when do you think they will release the last one?


----------



## Masked

Quote:



Originally Posted by *reflex99*


So when do you think they will release the last one?


From what I'm being told, they're dipping into their RMA pool so, I'd imagine 1 more release period.

When? I couldn't tell you...

Maybe this week? Maybe they'll let anticipation build and next week?

My understanding right NOW is if you guys are getting cards from vendors like Tigerdirect // etc...They're open box returns that original owners didn't keep/open etc.

If you'd noticed, 6990s didn't sell out on the day of release from TD or Newegg; it happened a week later (reception, return, and re-purchase after the restocking fee). So, if any of you are getting cards in the interim, it's likely a return from the 3rd release...

That's just the reality of limited production.


----------



## RagingCain

Quote:



Originally Posted by *zerounleashednl*


Guys, i'm thinking about getting 1 or maybe even 2 GTX590's. I'm really attracted to these cards...









But I'm a bit worried the VRAM will not be enough for the next wave of games coming out in 2011/2012. Can you give an example of the VRAM usage of a 590 in a recent game like Bad Company 2 or Crysis 2 with all settings at max on 1920x1080 (or 1200)?

Thx










Well I will be happy to post some actual screenshots for you later today or tomorrow, however (since I do have GTX 580s) I can tell you as of right now there is nothing you can't play at 1920x1080 on max settings (VRAM wise), and I would even bet the same for 1920x1200.

Games that get close are DA2 + HQ textures / GTA4 / Oblivion + mods. The most I have seen is DA2 at 1920x1080 with everything on and the HQ texture pack installed. Never went above 1378MB <- highest I ever saw.
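For anyone who wants to log their own numbers instead of eyeballing GPU-Z while playing, here is a rough, hypothetical Python sketch (not something posted in this thread) that polls VRAM usage through NVIDIA's NVML library via the `pynvml` bindings (`pip install nvidia-ml-py` assumed); the 90% `near_limit` threshold is my own arbitrary pick for "getting close to the 1.5GB-per-GPU wall".

```python
# Hypothetical sketch: poll VRAM usage via NVML, the same counter GPU-Z reads.
import time

def used_mib(used_bytes):
    """Convert NVML's byte count to the MiB figure tools like GPU-Z show."""
    return used_bytes / (1024 * 1024)

def near_limit(used_bytes, total_bytes, threshold=0.9):
    """True once usage crosses the fraction of VRAM where crashes tend to start."""
    return used_bytes >= threshold * total_bytes

def poll(seconds=60, interval=1.0):
    # pynvml imported here so the helpers above work without a GPU present
    import pynvml
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU of the 590
    try:
        for _ in range(int(seconds / interval)):
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
            flag = " <- close to the limit!" if near_limit(mem.used, mem.total) else ""
            print("VRAM: %.0f / %.0f MiB%s"
                  % (used_mib(mem.used), used_mib(mem.total), flag))
            time.sleep(interval)
    finally:
        pynvml.nvmlShutdown()
```

Run `poll()` in the background while alt-tabbed out of the game to see how close DA2 or GTA4 actually gets.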

Quote:



Originally Posted by *remer*


The new members list looks great.










Thanks, I appreciate it









Quote:



Originally Posted by *Alatar*


updated the OP, the benchmark settings are below the spreadsheet. Just let me know if I got something wrong. I'm also assuming that metro 2033 has AAA and 4AA results for the 1680x1050 resolution?

Absolutely terrific job with the spreadsheet BTW!










Great eye, fixed!

I should have Crysis benchmarks setup next.

I am willing to take some input on other titles; I believe next up would be Aliens vs. Predator. Takes a little setting up for the benchmark, I know, but I can't think of any other decent DX11 benchmarks. We can always have Far Cry 2, but that's already pretty high FPS-wise.


----------



## RagingCain

Double Post..... ...my bad, no coffee yet.


----------



## holyhyperion

Great thread guys. When I go back to my apartment on Sunday from visiting my folk's house, I'll post pics and proof of my setup. Can't wait until then.


----------



## zerounleashednl

Quote:



The 590 was a staggered release. There was only 1 production run.

Meaning they had 5 quantities, right? So 1+1+1+1+1 = 5


Hmmm interesting... where do you get your information from? A few Google searches only bring up the limited releases like the EVGA Classified, but nothing about limited production of the general GTX 590...


----------



## Masked

Quote:



Originally Posted by *zerounleashednl*


Hmmm interesting... where do you get your information from? A few Google searches only bring up the limited releases like the EVGA Classified, but nothing about limited production of the general GTX 590...


I'm an Admin for a major computer company that had betas of these cards in November.

In November it was stated publicly via a "leak" that the card was an extremely limited production run. Same in December, January, Feb and March prior to its release, then again on release day.

The card had 1 production run due to the picking of GPUs, and it's on a staggered release because of that.


----------



## reflex99

Quote:


> Originally Posted by *zerounleashednl;13136012*
> Hmmm interesting... where do you get your information from? A few Google searches only bring up the limited releases like the EVGA Classified, but nothing about limited production of the general GTX 590...


He works for _________.

The hint is in his avatar.


----------



## Masked

This is the article in Feb I mentioned:

Geeks3d

Another in late January:

NordicHardware

Etc etc etc...


----------



## RagingCain

Quote:


> Originally Posted by *zerounleashednl;13136012*
> Hmmm interesting... where do you get your information from? A few Google searches only bring up the limited releases like the EVGA Classified, but nothing about limited production of the general GTX 590...


I do not have any numbers like Masked said, but I did talk to the Product Manager at EVGA, and he is expecting a few shipments this afternoon or Monday afternoon. He didn't confirm any more in the future, but he also didn't deny it. He also mentioned RMA stock has not been touched and is not allowed for sale at this time, and gave me some RMA figures (privately) which I will keep that way. I just reiterate what I said before: if you were worried about getting an RMA, it's statistically in YOUR favor. Haha, they extended tons of courtesy to me and my RMAs, actually called me up personally, though only one was truly an RMA and the other just had a manufacturing defect which I didn't care for.

They highly recommended jumping on the AUTO-NOTIFY list, as the second a UPC/serial is entered into the system, it's immediately emailed out to you all.

Auto-Notify List for 2x GTX 590 HCs
Auto-Notify List for 2x GTX 590
Auto-Notify List for 1x GTX 590 HC
Auto-Notify List for 1x GTX 590


----------



## Masked

Whenever I call them it's such a LOOOOONNNNGGGGGG conversation.

We end up talking about samples and omg...3-4 hours pass in the meantime.

Just haven't had time to call them lately, Jacob, Mark or even Tom









My sources are saying this is really the "last big push" for 590 retail so, I'd get one while/if you can click fast enough once you're notified...If not, good riddance.


----------



## RagingCain

Well hello everybody







UPS came today... and then I did too


















Just stock, I know I know.

Edit:
Anybody else getting a hard lock-up/freeze with an OC and 3DMark 11? I was hoping the app/driver would crash, but rebooting takes forever!


----------



## capchaos

YUP. I couldn't go past 650MHz in 3DMark 11 without upping voltage in Quad SLI


----------



## RagingCain

Quote:



Originally Posted by *capchaos*


YUP. I couldn't go past 650MHz in 3DMark 11 without upping voltage in Quad SLI


It's definitely not artifacting; I think it's OCP/driver related. It's 645MHz for me. Oh well, have to revert to 267.85.

Anybody testing these 268 series drivers?


----------



## ReignsOfPower

*EVGA GTX590 Classified Overclocked Benchmarks
Core 682MHz Shader 1364MHz Memory 3692MHz (GPU1 0.925V GPU2 0.925V)*
3D Mark 11 P9881
http://3dmark.com/3dm11/1008332

3D Mark 11 X3363
http://3dmark.com/3dm11/1036276

*EVGA GTX590 Stock Clocks Benchmarks
Core 630MHz Shader 1260MHz Memory 3456MHz (GPU1 0.925V GPU2 0.925V)*
Unigine Heaven 2.5 Very High + Extreme Tess. (1680x1050x8AA) AF16x
http://img641.imageshack.us/img641/6420/heaven01.png

Unigine Heaven 2.5 Very High + Extreme Tess. (1920x1080x8AA) AF16x
http://img824.imageshack.us/img824/5409/heaven02.png

Unigine Heaven 2.5 Very High + Extreme Tess. (2560x1600x0AA) AF16x
http://img717.imageshack.us/img717/2...extremerun.png

Metro 2033 Max Settings + DoF On + Tess On + PhysX Off (1680x1050xAAA)
http://img132.imageshack.us/img132/2872/metro01.png

Metro 2033 Max Settings + DoF On + Tess On + PhysX Off (1680x1050x4AA)
http://img708.imageshack.us/img708/4002/metro02.png

Metro 2033 Max Settings + DoF On + Tess On + PhysX Off (1920x1080xAAA)
http://img849.imageshack.us/img849/2530/metro03.png

Metro 2033 Max Settings + DoF On + Tess On + PhysX Off (1920x1080x4AA)
http://img228.imageshack.us/img228/4845/metro04.png

Metro 2033 Max Settings + DoF On + Tess On + PhysX Off (2560x1600xAAA)
http://img51.imageshack.us/img51/7376/metro05.png

Metro 2033 Max Settings + DoF On + Tess On + PhysX Off (2560x1600x4AA)
http://img810.imageshack.us/img810/2636/metro06.png

*The System (Everything Air Cooled)*
























i7 920 D0 @ 4.24 GHz 1.24V
3x2GB Corsair CM3 1600MHz CL7
ASUS Rampage 3 GENE mATX 0704
EVGA GTX590 Classified Edition
Corsair HX1000W
DELL U3011
OCZ Vertex 2 128GB SATAII
WD Black Caviar (64MB) 2TB SATAII
Auzentech Prelude 7.1
Windows 7 x64 Ultimate SP1
270.51 Drivers

*Picture of System*
http://img534.imageshack.us/img534/8...0407194816.jpg


----------



## rush2049

What 268 drivers are you talking about?


----------



## 2010rig

I'm one step closer to being convinced to pick up a 590
http://forums.nvidia.com/index.php?showtopic=197878

Does anyone here know if Adobe Premiere truly utilizes Both GPU cores?

I've signed up for the notification list. Thanks RagingCain.


----------



## rush2049

I believe Masked said something earlier about CS5 utilizing both, but only the main one at 100% and the secondary at a measly 20%... basically bad drivers for SLI performance right now.


----------



## helifax

Alatar, if I might have a suggestion:

You could add benchmark entries to the list on how the card performs with NVIDIA 3D Vision. As we all know, 3D Vision takes its toll on the card only when the stereoscopic 3D option is enabled in the NVIDIA control panel (even if the card is used in plain 3D).
I think some of us will find it interesting to see how the card performs in stereo 3D.
You could add a section for single monitor and one for 3D Vision Surround (using 3 monitors in 3D).

Best regards


----------



## kazukun

Got a good score

[email protected](210.5x23)
[email protected] Mem950MHz Shader1500MHz

3DMARK VANTAGE
P65080
GPU　59608
CPU　89814









3DMARK11
P16476
Graphics　19302
Physics　12457
Combined　10209


















http://blue.ap.teacup.com/kazukun/102.html


----------



## Arizonian

Quote:


> Originally Posted by *kazukun;13149380*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://blue.ap.teacup.com/kazukun/102.html


----------



## 2010rig

Quote:



Originally Posted by *rush2049*


I believe masked said something earlier about CS5 utilizing both, but only the main one 100% and the secondary a measly 20%....... basically bad drivers for sli performance right now.


Yea, I remember him saying that. AFAIK Adobe Premiere specifically does not recognize an SLI setup, so I wonder what the deal is with a dual-GPU card.


----------



## reflex99

So what happens if my card takes longer than 30 days to get here?

Can I not register the warranty?

If so, that would be a classic evga jerk move.


----------



## Baasha

Quote:



Originally Posted by *kazukun*


http://blue.ap.teacup.com/kazukun/102.html


Kazukun,

Excellent scores and awesome rig!

Can you run the Xtreme setting for 3DMark 11? I have the air-cooled EVGA GTX-590 Quad-SLI at stock clocks and would like to compare.


----------



## Baasha

Guys,

Need opinions/advice here.

I have the EVGA GTX-590 Classified Quad-SLI and it's running great. They're at stock clocks/voltage.

The issue I'm having is that I have 3 Dell U3011s, so my resolution is 8100x1600 (bezel-corrected), and I'm having to turn my AA settings down in games. I can run 4xMSAA in BC2 with 16xAF, but in Bulletstorm, for example, I can only play at 0xAA. Even if I put it at 2xAA, it slows down to 1-2FPS. I guess it's running out of VRAM.
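For a rough sense of why AA gets so expensive at that resolution, here is a back-of-envelope sketch (my own simplified arithmetic, not from this thread): each MSAA sample stores its own color and depth value, so the render targets alone eat a sizeable chunk of the 590's 1.5GB-per-GPU budget before a single texture is counted. It ignores compression and everything else a real driver does.

```python
# Simplified framebuffer cost model: pixels x samples x (color + depth bytes).
def msaa_framebuffer_mib(width, height, samples, bytes_per_color=4, bytes_per_depth=4):
    pixels = width * height
    # each MSAA sample carries its own color and depth value
    bytes_total = pixels * samples * (bytes_per_color + bytes_per_depth)
    return bytes_total / (1024 * 1024)

for aa in (1, 2, 4):
    print("%dxAA @ 8100x1600: %.0f MiB for the render targets alone"
          % (aa, msaa_framebuffer_mib(8100, 1600, aa)))
```

At 4xAA that is roughly 400 MiB of the ~1536 MiB per GPU gone before textures, shadow maps, or driver overhead, which is consistent with VRAM running out once the rest of the game's assets pile on.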

I just got these cards (doh!) and now I'm thinking whether I should get 3 GTX-580 3GB versions(?). EVGA is about to release them in the next few days.

First question: will tri-SLI GTX 580s outperform the 590 Quad-SLI at these resolutions? With my current settings in BC2 I get about 80-110FPS, so it plays fantastically well. However, turning down the bells and whistles (AA/HBAO) defeats the purpose of playing at such an incredible resolution, doesn't it? I mean, I had (now it's for sale) one GTX-580 SC and I was able to max out all settings on one 30" monitor (2560x1600).

Second question is I still have time to "return" the cards to EVGA but they want to charge me a 15% restocking fee which is about $220!







That's a bit ridiculous for keeping the card for about 2 weeks! I could turn around and sell them independently, however that may be a hassle (unless someone here wants them).

Let me know your thoughts. I mainly want the best GPU solution I can get but I need 1 PCI-E slot for my sound card (even after I remove my RAID-controller and use Intel ICH10R). Now, with 590 Quad-SLI, I can use BOTH my sound card AND my RAID controller without issue. If I didn't get my sound card, I would get 4 GTX-580 3GB and that would definitely outperform the 590 Quad-SLI.


----------



## ranerX3

2 HD 6990s in Quad CrossFire should probably do a better job since they have 1 more GB of memory (but they will make more noise and will run much, much hotter if you put them in adjacent slots, and even with some space between them they will still run hotter...)

or just get 3/4 GTX 580 3GBs; they should outperform GTX 590 SLI in terms of performance, but they don't have more memory :/

you can also wait; maybe some company will release a 6GB version of the GTX 590 or an 8GB version of the 6990, but there is no guarantee something like that will happen...


----------



## toX0rz

Encountering the first problem with the GTX 590.

When I play Crysis Warhead, it only utilizes both GPUs at 30-40% most of the time, rarely going up to 60%. (Using EVGA Precision to read the values.)

I've had this happen in older games which didn't require that much power (power management), but in this case the game starts to stutter @ 20-30 FPS just because the GPUs are not being utilized at all...

anyone here that can try to replicate it?

oh and the power management setting in the driver is set to maximum performance, so it can't be that.
Driver is 270.51 beta.


----------



## RagingCain

QuadSLI doesn't seem to want to work on 267.85 drivers for me. Only enabled when using Surround.

Must be some Surround / SLI glitch when multiple monitors are detected.

Looks like my benching will have to wait till voltage is unlocked again.

Secondly, for those that have the Hydro Coppers from EVGA: have you noticed any indication of reduced flow? My GPU temps are great, but my CPU temps are 5~10°C higher at max load. My flow is acceptable but looks to be about half of what it was with 2x MCP655 pumps running together. I am entirely surprised that the cards would be this restrictive in flow rate.


----------



## kazukun

Quote:



Originally Posted by *Baasha*


Kazukun,

Excellent scores and awesome rig!

Can you run the Xtreme setting for 3DMark 11? I have the air-cooled EVGA GTX-590 Quad-SLI at stock clocks and would like to compare.



A bit more tuning & the Xtreme measurement









[email protected](210.5x23)
1683MHz 8-8-8-20 2T　→　1683MHz 7-7-7-18 1T

[email protected] Mem950MHz Shader1500MHz

3DMARK VANTAGE
P65080→P65828
GPU　59608→60325
CPU　89814→90632









3DMARK11
P16476→P16555
Graphics　19302→19368
Physics　12457→12774
Combined　10209→10062









Xtreme Measurement

3DMARK VANTAGE
X39209
GPU　38078
CPU　89972









3DMARK11
X6444
Graphics　6523
Physics　12962
Combined　4029









http://blue.ap.teacup.com/kazukun/103.html


----------



## sublimejhn

Quote:


> Originally Posted by *RagingCain;13152440*
> QuadSLI doesn't seem to want to work on 267.85 drivers for me. Only enabled when using Surround.
> 
> Must be some Surround / SLI glitch when multiple monitors are detected.
> 
> Looks like my benching will have to wait till voltage is unlocked again.
> 
> Secondly, for those that have the Hydro Coppers from EVGA: have you noticed any indication of reduced flow? My GPU temps are great, but my CPU temps are 5~10°C higher at max load. My flow is acceptable but looks to be about half of what it was with 2x MCP655 pumps running together. I am entirely surprised that the cards would be this restrictive in flow rate.


While I can't give any specifics on flow rate, I can say I leave my res about 1/3 empty. With my watercooled GTX 295 I could clearly see the water in the res moving at a much faster rate than with the GTX 590 Hydro Copper. The only other thing I have in the loop is my northbridge, so it doesn't matter much to me, but it's definitely more restrictive.


----------



## Juggalo23451

Quote:


> Originally Posted by *sublimejhn;13158982*
> While I can't give any specifics on flow rate, I can say I leave my res about 1/3 empty. With my watercooled GTX 295 I could clearly see the water in the res moving at a much faster rate than with the GTX 590 Hydro Copper. The only other thing I have in the loop is my northbridge, so it doesn't matter much to me, but it's definitely more restrictive.


The Hydro Coppers on some of the EVGA cards, like the GTX 590's, are made by Swiftech. These blocks are highly restrictive compared to other blocks, e.g. Koolance and EK. That's why I never buy cards with a waterblock from a manufacturer.


----------



## RagingCain

Quote:



Originally Posted by *Juggalo23451*


The Hydro Coppers on some of the EVGA cards, like the GTX 590's, are made by Swiftech. These blocks are highly restrictive compared to other blocks, e.g. Koolance and EK. That's why I never buy cards with a waterblock from a manufacturer.


Okay, phew, I thought I was having an issue with my loop. I assume if one is restrictive, then two is just dastardly. Good thing I upgraded to a dual pump; I was about to rebuild the loop for the 4th time. It sucks that CPU temps are higher, but the GPUs are cooled better than the DangerDens I had. An hour running Unigine only hit 38°C on all 4 GPUs; I was able to hit 60°C on my GPUs with hours of gameplay on the old DDs.

The micro-stutter is really, really bad in Heaven (but so is the GPU usage); so far I am not impressed by the performance of the GTX 590s (coming from SLI GTX 580, mind you).

Anybody who had a GTX 295: can you tell me if the stutter got better over time with newer drivers? I am assuming it's just Quad-SLI issues.


----------



## $ilent

Can I just ask, how does the GTX 590 compare with the GTX 480? Both have DirectX 11, right?

'Cos you can get a GTX 480 for £200 now; with that in mind you can have 2 GTX 480s in SLI for less than the price of a GTX 590. What would the performance be like?


----------



## anand00x

I purchased the EVGA GTX 590 last night for $768 from Provantage. I also have a VisionTek ATI 6990 that I was going to install in my new build. Unfortunately it has been sitting around waiting for my Asus Maximus to come in (hopefully by Wed). I purchased the 590 because I am worried about the noise the 6990 is going to produce. I am new to Nvidia, coming from 1GB 5870's, 2 x 2GB 6970s, and a not-even-opened-yet $699.00 6990. I have read many reviews and have heard the 590 is a much quieter card. I also like the fact that I can do 3 monitors on this single card. 3D Vision is definitely a plus. I have a few questions:

1. Can I still do 3D Vision on my 3 x Asus VE278Q LED monitors even without 120Hz?
2. What is a recommended dedicated PhysX card for the GTX 590? Will a GTX 285 suffice?
3. Also, is there any benefit to adding IC Diamond 7 Carat compound between the heatsinks and GPUs?


----------



## RagingCain

Quote:



Originally Posted by *anand00x*


I purchased the EVGA GTX 590 last night for $768 from Provantage. I also have a VisionTek ATI 6990 that I was going to install in my new build. Unfortunately it has been sitting around waiting for my Asus Maximus to come in (hopefully by Wed). I purchased the 590 because I am worried about the noise the 6990 is going to produce. I am new to Nvidia, coming from 1GB 5870's, 2 x 2GB 6970s, and a not-even-opened-yet $699.00 6990. I have read many reviews and have heard the 590 is a much quieter card. I also like the fact that I can do 3 monitors on this single card. 3D Vision is definitely a plus. I have a few questions:

1. Can I still do 3D Vision on my 3 x Asus VE278Q LED monitors even without 120Hz?
2. What is a recommended dedicated PhysX card for the GTX 590? Will a GTX 285 suffice?
3. Also, is there any benefit to adding IC Diamond 7 Carat compound between the heatsinks and GPUs?


No, I don't think it works like that; you need 120Hz monitors for 3D Vision from NVIDIA.

I believe PhysX on these 5xx series cards is handled very well, with minimal loss of performance. A GTX 285 would, I believe, be overkill for all but one PhysX title I am familiar with. I would consider that one a poorly implemented game (Mafia II), since it can get better use of PhysX from a GTX 480.

High-end paste usually trumps whatever they normally apply, but it's up to you; I would wait, put it in the system, and check out temps before you tear it apart. I change my pastes every six months or so.

Quote:



Originally Posted by *$ilent*


Can I just ask how does the gtx 590 comapre with the gtx 480? Both have directx 11 right?

Cos you can get a gtx 480 for Â£200 now, with that in mind you can have 2 gtx 480s in SLI for less than price of gtx 590, what would the performance be like?


2x GTX 480 > 1x GTX 590 performance/benchmark-wise.

The GTX 590 is cooler, quieter, better at tessellation, CUDA and PhysX naturally, and supports 3 monitors off one card, including 3D Vision. Less performance, more energy efficient, new drivers (which need to mature).

GTX 480 SLI has more power, is a little noisier/hotter, and still does decent CUDA/PhysX. It supports 3 monitors / 3D, but only in SLI; it has more mature drivers and it's cheaper to boot.


----------



## Arizonian

Quote:



Originally Posted by *anand00x*


I purchased the EVGA GTX 590 last night for $768 from Provantage. I also have a VisionTek ATI 6990 that I was going to install in my new build. Unfortunately it has been sitting around waiting for my Asus Maximus to come in (hopefully by Wed). I purchased the 590 because I am worried about the noise the 6990 is going to produce. I am new to Nvidia, coming from 1GB 5870's, 2 x 2GB 6970s, and a not-even-opened-yet $699.00 6990. I have read many reviews and have heard the 590 is a much quieter card. I also like the fact that I can do 3 monitors on this single card. 3D Vision is definitely a plus. I have a few questions:

1. Can I still do 3D Vision on my 3 x Asus VE278Q LED monitors even without 120Hz?
2. What is a recommended dedicated PhysX card for the GTX 590? Will a GTX 285 suffice?
3. Also, is there any benefit to adding IC Diamond 7 Carat compound between the heatsinks and GPUs?


I can only answer question #1 - a 120Hz monitor is required to run 3D Blu-ray movies or 3D gaming.

How do 3D glasses produce 3D vision? 3D is enabled by sending a different image to each eye, and the 3D glasses help the eyes separate these two image streams. This is pretty much the same concept as in the real world, where your two eyes see two different images and the brain translates this into 3D. A 120Hz monitor is required because the frames alternate between the eyes, leaving each eye an effective 60Hz.

Here is a link to a thread I'm working on as I've been learning about 3D monitors. It's sort of a hardware review; I've put the info together in one place. I'm by no means an expert, but I'm doing my homework as I learn about monitors.
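The 120Hz requirement can be sketched as a quick calculation: frame-sequential (shutter-glasses) 3D alternates frames between eyes, so each eye only sees half the panel's refresh rate. A minimal illustration, not NVIDIA's actual driver logic:

```python
def per_eye_rate(panel_hz: float) -> float:
    """Refresh rate each eye actually sees with frame-sequential 3D."""
    # Frames alternate left/right, so each eye gets half the panel refreshes.
    return panel_hz / 2

print(per_eye_rate(120))  # 60.0 -> a comfortable 60Hz per eye
print(per_eye_rate(60))   # 30.0 -> too low, hence the 120Hz requirement
```

This is why a 60Hz panel like the VE278Q can't do 3D Vision: each eye would be left with only 30Hz.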


----------



## Arizonian

Whoops double post.


----------



## DyWaN

Hi guys, just got my Gigabyte GTX 590 with some fancy toolbox







pushed the cores & mems to 675/1350/2000

The card itself









The clocks atm









cheers


----------



## reflex99

^what country are you in?


----------



## DyWaN

Indonesia


----------



## reflex99

do you guys get MSi 590's there?

All we get is eVGA and Asus


----------



## Nexus6

Quote:


> Originally Posted by *DyWaN;13170467*
> Indonesia


Cool! Apa kabar? (How are you?)


----------



## DyWaN

Quote:


> Originally Posted by *reflex99;13170497*
> do you guys get MSi 590's there?
> 
> All we get is eVGA and Asus


I think we already have someone with an MSI:
http://www.overclock.net/13123380-post932.html
Quote:


> Originally Posted by *Nexus6;13170501*
> Cool! Apa kabar? (How are you?)


hi Nexus6







kabar baik (doing well)








Did you learn that in Bali, or somewhere else? hehe


----------



## reflex99

ok, I guess they're Europe-only then.


----------



## Nexus6

Quote:



Originally Posted by *DyWaN*


hi Nexus6







kabar baik (doing well)








Did you learn that in Bali, or somewhere else? hehe


Nope!!! Orang Indo.......udah lama di U.S. (I'm Indonesian; I've been in the U.S. a long time.)
My Indo is really, really, really bad.
Nice to see another Indo on OCN.
Awesome card you got. How much did you spend on that in Rp?


----------



## DyWaN

Nice to see you too.

I think it's a bit overpriced here. I got it for 810 USD (somewhere around IDR 7.0xx.xxx). It's quite difficult to get a 590 back here, esp. the Gigabyte brand, as they got only 4 of 'em. But I did see 590s from MSI retailing for about USD 60 less. Maybe it's overpriced because of its fancy toolbox and the gaming mouse bundled...


----------



## RagingCain

(This will be my benchy post @ stock.... unfortunately, even a few MHz and it's goodbye, stability.)

*270.61 WHQL was released today *









Going to give them a whirl.

*Surround QUAD-SLI Users (like two of us)*:
For those having weird SLI/Surround-related trouble, I advise you to disable either one BEFORE installing the new drivers if you are using the Clean Install method. Upon install, to activate Surround, the Control Panel is now forcing me to use the second card for a monitor; it will refuse to activate Surround until I move a monitor over. Looks like they have identified a problem with one card + 3 displays, or perhaps it's just a hiccup for the Quadies. Such a pain. *Recommendation: two monitors on the top card in its two DVI connectors, then monitor number 3 in either of the second card's DVI connectors, to avoid issues setting up the new driver.*

Max Voltage Still: 0.925

Will test performance a bit later, as I am still recovering from RAID issues. I noticed a ton of computational errors yesterday in BOINC, all of them GPU-accelerated WUs. No OC on the GPUs, and the CPU/memory OC is stable; I wasn't sure if it was the RAID or what, but that was on drivers 270.51. Just a heads up, you BOINCers.

Multi-monitor performance seemed very weak at first. Not in game, as I hadn't tried that yet, but dragging windows around was terribly unsmooth, and even resizing a window could freeze the whole system for a second or two; responsiveness was just awful in my opinion. After a reboot or two (or twelve; fresh install of Win 7 this morning) responsiveness has increased tremendously. Not sure what changed, but it's working better now on the desktop. This was all with Aero on, after the post-install Win 7 system performance check to unlock Aero. Leaving it on Aero Basic does have similar effects of windows dragging unsmoothly, etc.

Some Crysis Candy:
A Crysis run from earlier; not very special, but that's probably the GPU usage at fault there.










On the other hand, 5760x1080 totally impressed me. The average was probably 50 fps, all settings Very High, no AA. I am sure it would get worse over time, yet it still amazed me.

GPU memory usage: ~1350MB, with a bit of spiking to 1400MB. AA puts it at 1450MB; it runs smooth until it spikes, then it chokes down to 7fps before getting back up.

Here are a few screenies (just a note: they're JPGs, so image quality is crap):


You can see the GPU speeds, usage, memory, and fps in the top-left corners. Just wanted to add that the FPS shown is lower than what it actually was in game; I had two apps taking screenshots at the same time from the same key, so it caused a stutter. These are all from STEAM, hence the image quality and the lower FPS.

Apparently my PNG images are too big for ImageShack to show them uncompressed.









Edit #2342
Apparently, whatever SLI/Surround performance issue they had seems to work better now. If I switch to "Single Monitor Performance" while in Surround, it now gives me full QUAD-SLI performance @ 1920x1080 on one monitor. IF I leave it on Multi-Monitor it's still good, just not as good as Single-Monitor. BEFORE, I had to switch from Surround to SLI or I would get around 15 fps no matter what. This is a huge relief, as I can just leave it on Surround now!

This is Surround with Single Monitor performance, compared to the above, which was SLI only, with no other monitors running.









Amazingly, 0xAA has the same performance as 8xQAA. And to touch on in-game performance: there was virtually no stuttering (micro or otherwise) @ 1920x1080, and just marginal stuttering at 5760x1080 in Crysis. It does run better than SLI GTX 580 in game at my resolution.


----------



## ReignsOfPower

Nice scores there, bud. I lol'ed at the first Crysis picture and how long Nomad's arms are. Looks like the elastic dude from the Fantastic Four.









Did you notice any performance increases over the non-WHQL drivers? Would love to see some more speed improvements in future releases for the 590


----------



## reflex99

No HUD fix for Crysis?

This is what makes me rage about Eyefinity/Surround: I waste so much time trying to get my HUD to work correctly.


----------



## RagingCain

Quote:


> Originally Posted by *ReignsOfPower;13179764*
> Nice scores there, bud. I lol'ed at the first Crysis picture and how long Nomad's arms are. Looks like the elastic dude from the Fantastic Four.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did you notice any performance increases over the non-WHQL drivers? Would love to see some more speed improvements in future releases for the 590


In all honesty, I can't really say; I have only had a working system for less than a day. I will be able to tell in Unigine, but that's about the only thing I tested. Everything else went kablooey after the 590 installs due to the new HDDs failing in my RAID (I was super pissed; an angry flaming kitty had nothing on me, seeing as I had two RMAed hard drives fail for the exact same reason).

Yeah, I didn't adjust the FoV in the HUD; I was still trying to get everything re-installed. I have 8 games installing off STEAM, Metro 2033/AvP being two of them, plus WoW / SC2 downloading and installing. So I know that alone probably took a few fps off my score.

I doubt there are any performance gains over 270.51. I did see good improvements from 267.85 to 270.51 when I was trying to OC, but I didn't get to save any numbers due to RAID hell.
Quote:


> Originally Posted by *reflex99;13179792*
> No HUD fix for Crysis?
> 
> This is what makes me rage about Eyefinity/Surround: I waste so much time trying to get my HUD to work correctly.


You have to understand that some of it isn't a driver issue; it's programmed that way in the game. Don't forget Crysis is already 4 years old now, AND I seriously doubt they were testing the game on multiple monitors, as nothing could max it out then. Just because the drivers let you do it doesn't mean the game was designed for it.

Your best bet is to hope future titles support Surround/Eyefinity; when they don't work, then you can be mad. When the old titles work, you're just pleasantly surprised!

I can't wait to drool over Dead Space 2 in Surround


----------



## reflex99

Usually HUD fixes are community-released.

Most games support Eyefinity/Surround, but the HUD sits at the very left/right of the screen, which is annoying.


----------



## manu97416

was kinda hard to take the pics :S


----------



## Alatar

welcome aboard!


----------



## toX0rz

Quote:


> Originally Posted by *DyWaN;13170411*
> Hi guys, just got my Gigabyte GTX 590 with some fancy toolbox
> 
> 
> 
> 
> 
> 
> 
> pushed the cores & mems to 675/1350/2000
> 
> The card itself
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The clocks atm
> 
> 
> 
> 
> 
> 
> 
> 
> 
> cheers


Hey, where can I find the wallpaper that you're using in the 2nd pic?

I've been looking for it since I saw it in the official 590 teaser but couldn't find it anywhere.

Any links?


----------



## Masked

I'm still having major issues with scaling, which is interesting to me.

Granted, it's real-world apps that are "beta" atm, but I've never seen these issues to this extreme.

My power supply went out on Friday, both at home and in the office, so I'm getting replacements today and will do some more testing.


----------



## Alatar

Quote:


> Originally Posted by *toX0rz;13187752*
> Hey, where can I find the wallpaper that you're using in the 2nd pic?
> 
> I've been looking for it since I saw it in the official 590 teaser but couldn't find it anywhere.
> 
> Any links?


http://www.nvidia.co.uk/object/cool_stuff_uk.html#/wallpapers/3339

there you go









E: err, seems like the link isn't working properly; just choose the NVIDIA GeForce wallpapers.


----------



## toX0rz

Quote:


> Originally Posted by *Alatar;13187990*
> http://www.nvidia.co.uk/object/cool_stuff_uk.html#/wallpapers/3339
> 
> there you go
> 
> 
> 
> 
> 
> 
> 
> 
> 
> E: err, seems like the link isn't working properly; just choose the NVIDIA GeForce wallpapers.


nice!

thanks


----------



## anand00x

What is the best dedicated PhysX card to pair with my EVGA GTX 590? Will a 460 suffice? I want to stay in the DX11 arena and not go too cheap with a 9800 GTX or 285 series. Someone locally is selling a new EVGA GTX 460 for $100; I might grab it.


----------



## Farih

Quote:



Originally Posted by *anand00x*


What is the best dedicated PhysX card to pair with my EVGA GTX 590? Will a 460 suffice? I want to stay in the DX11 arena and not go too cheap with a 9800 GTX or 285 series. Someone locally is selling a new EVGA GTX 460 for $100; I might grab it.


Atm I am testing with a GTS 450 and it seems to be more than enough; the highest I've seen was 78% usage in 3DMark Vantage.

In Metro 2033 it only uses up to 25%.

I think a GTX 460 should be more than enough for a long time to come.


----------



## Alatar

Quote:



Originally Posted by *Farih*


Atm I am testing with a GTS 450 and it seems to be more than enough; the highest I've seen was 78% usage in 3DMark Vantage.

In Metro 2033 it only uses up to 25%.

I think a GTX 460 should be more than enough for a long time to come.


Try mafia 2? I hear it's heavy on PhysX cards.


----------



## Farih

Anywhere I can DL a demo or bench?

I don't have the game.


----------



## reflex99

Steam had the demo which includes a bench.


----------



## soilentblue

I'd have to say that Mafia 2 is the heaviest PhysX game to date. It's amazing what they did in it; I wish more games with PhysX went that far.


----------



## Masked

Quote:



Originally Posted by *soilentblue*


I'd have to say that Mafia 2 is the heaviest PhysX game to date. It's amazing what they did in it; I wish more games with PhysX went that far.


PhysX isn't exactly the end-all be-all solution to shading/etc...

I have a feeling once MLAA/FXAA get up and going, they'll be replacing PhysX entirely...


----------



## reflex99

PhysX is a stupid NVIDIA gimmick to sell more cards. Competing technologies like OpenCL and Havok are far superior, and 100% less proprietary.

And imo, it looks bad in Mafia.


----------



## Farih

I just ran the Mafia 2 demo benchmark on the highest settings @ 1920x1080,
PhysX on High, and it used a max of 26% of the GTS 450.


----------



## Masked

Quote:



Originally Posted by *reflex99*


PhysX is a stupid NVIDIA gimmick to sell more cards. Competing technologies like OpenCL and Havok are far superior, and 100% less proprietary.

And imo, it looks bad in Mafia.


It's all going to be replaced by MLAA or FXAA anyway...









Never saw the purpose of it, personally...


----------



## RagingCain

Quote:



RagingCain ---- i7 980x/ 4.749 GHz ----- GTX 590 SLI 822 / 1644 / 1800 ----- 150.2 ----- 3783

A *30.48%* overclock over the EVGA 630MHz stock clock, and *35.42%* over any other GTX 590 (since they are all the same.)




















I am now 3rd Place on the Unigine 2.5 benchmark, behind CallSignVega and then Faster who are both in Quad GTX 580.

Overclocking. It can be done.
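The percentages quoted above can be reproduced with a quick calculation. The 630MHz EVGA factory clock is stated in the post; 607MHz is the NVIDIA reference clock, which the "other GTX 590" figure implies (treat that as an assumption on my part):

```python
def oc_percent(oc_mhz: float, stock_mhz: float) -> float:
    """Percentage increase of an overclock relative to a baseline clock."""
    return (oc_mhz / stock_mhz - 1) * 100

print(round(oc_percent(822, 630), 2))  # 30.48 -> vs EVGA's 630MHz factory clock
print(round(oc_percent(822, 607), 2))  # 35.42 -> vs the 607MHz reference clock
```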


----------



## Masked

Quote:



Originally Posted by *RagingCain*


I am now 3rd Place on the Unigine 2.5 benchmark, behind CallSignVega and then Faster who are both in Quad GTX 580.

Overclocking. It can be done.


Welcome to the dark side ~ Huzaa.

The new drivers are actually controlling the overvolting better, but it's still a major OCP-related issue imo.

Good score though


----------



## RagingCain

I am seeing 0% OCP throttling at 1.038v and an 822MHz clock. My memory is very hard to OC without freezing.

This is on driver set 267.84 and Unigine 2.5. Not saying OCP throttling is nonexistent; I just don't see any in Heaven (pun intended.)

The drivers could use some work; overall, very impressed so far.

Sent from my DROID2 using Tapatalk


----------



## ReignsOfPower

Quote:



Originally Posted by *RagingCain*


I am seeing 0% OCP throttling at 1.038v and an 822MHz clock. My memory is very hard to OC without freezing.

This is on driver set 267.84 and Unigine 2.5. Not saying OCP throttling is nonexistent; I just don't see any in Heaven (pun intended.)

The drivers could use some work; overall, very impressed so far.

Sent from my DROID2 using Tapatalk


Nice clocks, bud. Good to see someone not giving a crap about what's been mentioned in the news and just going all out.


----------



## RagingCain

Quote:



Originally Posted by *ReignsOfPower*


Nice clocks, bud. Good to see someone not giving a crap about what's been mentioned in the news and just going all out.


It took me six hours, patience, and a bunch of temperature sensors, so I did give a crap; I just didn't do it stupidly like Sweclockers or W1zzard at TPU =). I stepped up 10 MHz at a time, and I actually stopped at 822 to try out 3DMark11. I also have a max temp of about 45°C, so it's not as scary as it would be for someone on air.

3DMark11 lowered my scores below stock, so there is some weird invisible OCP/throttling going on, despite clocks, voltages, and GPU usage being about the same.

The best combo was around 700~740 @ 0.950v-0.988v.


----------



## Masked

Quote:



Originally Posted by *RagingCain*


It took me six hours, patience, and a bunch of temperature sensors, so I did give a crap; I just didn't do it stupidly like Sweclockers or W1zzard at TPU =). I stepped up 10 MHz at a time, and I actually stopped at 822 to try out 3DMark11. I also have a max temp of about 45°C, so it's not as scary as it would be for someone on air. On top of it, Levesque was nearby, so that should shut him up about the GTX 590s for a.... minute.

3DMark11 lowered my scores below stock, so there is some weird invisible OCP/throttling going on, despite clocks, voltages, and GPU usage being about the same.

The best combo was around 700~740 @ 0.950v-0.988v.



3DMark11 is still having issues with benching, and there's definitely some OCP still in place.

I was told this morning that it would eventually be slowly "lifted", but they'll never fully take it off.

To my understanding, the reason 0.95/0.988 was "perfect" is because even WITH the OCP limiting the voltage to the cores, it was still enough that the cores COULD run at 100% and continue to draw.

You have to realize that 0.95 on these cards is realistically like 0.913... it's being limited 100%.

In CS6 I experience this fairly regularly... one core is at 100% with max V, the other core is at 40% with a controlled reading. Aida gives me a different answer than Precision, so I'm not too sure what/who to believe, but there's definitely some limiting going on there.


----------



## -iceblade^

Please refrain from trolling, or from accusing others of doing the same; it doesn't help clean up the issue either.

If you have issues: report the post, PM a mod, or use the ignore-list function.

Again, please don't do that; it causes more harm than it's worth.


----------



## RagingCain

Quote:


> Originally Posted by *-iceblade^;13215902*
> please refrain from trolling, or accusing others of doing the same. it doesn't help to clean up the issue either.
> 
> if you have issues, report, PM a mod, or use the ignore list function.
> 
> again, please don't do that - it causes more harm than it's worth


I didn't feel I was trolling, but I can drop the subject matter. Sorry.

Best I could get today:


----------



## Abiosis

Quote:



Originally Posted by *RagingCain*


I am now 3rd Place on the Unigine 2.5 benchmark, behind CallSignVega and then Faster who are both in Quad GTX 580.

Overclocking. It can be done.


_Nice job ~ Old man...









Keep the ball rolling~ _


----------



## RagingCain

Quote:



Originally Posted by *Abiosis*


_Nice job ~ Old man...









Keep the ball rolling~ _


Haha, I am not that old... starting to get forehead wrinkles, bah. I am young enough looking to keep going to bars with college students and not look out of place.









I killed a card today... only temporarily =)

I believe I drew more power than my PSU could deliver on my 12v2 rail.

The GPU went down for about 5 minutes, only to bounce right back; I believe it took a minute or two for my PSU to cool down, to be honest.

So two overclocked GTX 590s + a 4.8GHz i7 980X drained my 1200W power supply. What a shame... Anybody want to trade PSUs (1200W for a 1200W) so I can keep benching =)?


----------



## ReignsOfPower

You drew so much power that a 1200W PSU overheated? Amazing. That's a bit ludicrous lol.
Here is a change in my system: I'm thinking of changing my sound card to an ASUS Xonar STX and getting around to installing the Noctua fan; I powder-coated the shroud black. Then I think my system's nice and finished until I decide to upgrade to the 2011 platform.

















The fan looks identical to this (this pic was taken by another user on these forums):


----------



## RagingCain

Quote:



Originally Posted by *ReignsOfPower*


You drew so much power that a 1200W PSU overheated? Amazing. That's a bit ludicrous lol.
Here is a change in my system: I'm thinking of changing my sound card to an ASUS Xonar STX and getting around to installing the Noctua fan; I powder-coated the shroud black. Then I think my system's nice and finished until I decide to upgrade to the 2011 platform.

















Fan looks identical to this (This pic was taken by another user in these forums)










Looks like it would be good.

I have been doing a little reading, and I know I don't have the best PSU, but I think I hit the OCP on the 12V rail inside the PSU; that makes more sense than overloading the PSU at an estimated 900W of usage. The NoiseBlocker in my PSU should easily do better than the old generic fan that was in there. We will see. I benched it hard for two days now, so I am taking some time off from benching, till probably Sunday; one of you troublemakers can take over for me.

3DMark11 sweet spot: 720-730 core @ 0.963v. I started at stock and got all the way to 725 before I went over. I might even pool the 22 tests I did today into a spreadsheet if anyone wants to try their hand at it.

Heaven will take whatever you want to "stably" throw at it. My record was 823/1646/3600 @ 1.038v, although after what I learnt today, memory OC might be useless. I was trying to get as close to stock GTX 580 clocks as I could, but I don't see it happening now, and I ended up hurting my 3DMark tests with anything over 5 MHz.


----------



## rush2049

I can almost guarantee that you are using all your PSU can do. My 590 destroyed my TruePower Quattro 1000W. I upgraded to a single-rail AX1200 and now I am getting super stable everything.... the first drivers that ship a less restrictive OCP mechanism, I will see how high I can get on air.

The sad thing was, when my PSU became taxed to its maximum, it started throwing very, very bad (varying) voltages and fried my mobo.....

So now I am going to be more careful...


----------



## Scytus

Just went out of stock on Newegg again!

I want to pick 2 of these cards up, but wasn't able to grab it in time on Newegg, anyone know how often they reappear in the inventory?


----------



## krazyatom

Quote:


> Originally Posted by *RagingCain;13220898*
> Haha, I am not that old... starting to get forehead wrinkles bah. I am young enough looking to keep going to the bars with college students and not look out of place
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I killed a card today... only temporarily =)
> 
> I believe I drew more power than my PSU could deliver on my 12v2 rail.
> 
> The GPU went down for about 5 minutes, only to bounce right back; I believe it took a minute or two for my PSU to cool down, to be honest.
> 
> So two overclocked GTX 590s + a 4.8GHz i7 980X drained my 1200W power supply. What a shame... Anybody want to trade PSUs (1200W for a 1200W) so I can keep benching =)?


Quote:


> Originally Posted by *rush2049;13225078*
> I can almost guarantee that you are using all your PSU can do. My 590 destroyed my TruePower Quattro 1000W. I upgraded to a single-rail AX1200 and now I am getting super stable everything.... the first drivers that ship a less restrictive OCP mechanism, I will see how high I can get on air.
> 
> The sad thing was, when my PSU became taxed to its maximum, it started throwing very, very bad (varying) voltages and fried my mobo.....
> 
> So now I am going to be more careful...


Wait... so a 1000W PSU is not enough for a single GTX 590?


----------



## rush2049

Well, 1000 watts is enough watts if your power supply is able to deliver all of those watts to whatever needs them. But if you have multiple rails, whether it be 2, 3, 4, 5 or more, then the wattage is split up across certain modular cables, and you may not be able to find a combination that works.

Also something to be aware of: watts are not the only thing you have to watch, as amps are also crucially important. My last power supply, which was a very nice one, was not able to supply enough amps on the 2 rails I could plug the card into; the card just drew too much power.

My system isn't a typical gaming system either; it's a media storage hub as well as running a local DNS cache. So my 6 TB of hard drive space, spread across 5 mechanical hard drives, pulls a decent amount of power when added to the high-end sound card/video card/processor. It all adds up.

This card has a spec of 50A for your system, which is an over-estimate on NVIDIA's part, to be safe. But my previous power supply was only able to supply 25A on a single rail, of which it had 4. So I needed to split my hard drives between rails, because they drew too much for one, and my processor/motherboard needed one... that left the video card with 25A and change... it just didn't work.....
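The amps-vs-watts arithmetic above can be sketched numerically: on a 12V rail, available power is just amps × volts, so a single 25A rail tops out near 300W. The 25A and 50A figures come from the post; the 365W number is the GTX 590's published TDP rating, added here for illustration:

```python
def rail_watts(amps: float, volts: float = 12.0) -> float:
    """Power one rail can deliver: P = I * V."""
    return amps * volts

def amps_needed(watts: float, volts: float = 12.0) -> float:
    """Current a load draws from a 12V rail: I = P / V."""
    return watts / volts

print(rail_watts(25))    # 300.0 W from a single 25A rail -- not much headroom
print(rail_watts(50))    # 600.0 W behind NVIDIA's 50A system guideline
print(amps_needed(365))  # ~30.4 A just for the card's rated 365W TDP, at stock
```

An overclocked card plus transient spikes easily exceeds a lone 25A rail, which is why the single-rail AX1200 solved it.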


----------



## RagingCain

Quote:


> Originally Posted by *krazyatom;13226816*
> Wait... so a 1000W PSU is not enough for a single GTX 590?


I think it is, as long as the amperage is there on your 12V rail(s).

I think I tripped the OCP on my PSU, or it overheated that puppy. Like I said: a heavily overclocked Gulftown, 2x 30%-overclocked GTX 590s, 5 mechanical hard drives, an optical drive, all the peripherals, and a sound card. Plus 3x Ultra Kazes, 3x GentleTyphoons, and 2x not-so-Gentle Typhoons (the 150 CFM ones) have to steal some amperage.


----------



## maur0

The 6990 normally throttles in Metro 2033 and 3DMark Vantage, but not in Heaven.

So it wouldn't be strange if the 590 were throttling in 3DMark 2011;
more aggressively than in other applications, that is.


----------



## Levesque

Quote:


> Originally Posted by *maur0;13228609*
> The 6990 normally throttles in Metro 2033 and 3DMark Vantage, but not in Heaven.


This is true only with one of the 2 BIOSes on the 6990.

The 6990 does not throttle if you are using the AUSUM BIOS; it only throttles with the stock BIOS. The PowerTune limits are different in the two BIOSes.

Anand did a good article on that, talking about the differences between the two BIOSes. They were not able to reach the limit or get any throttling with the AUSUM BIOS, no matter what they tried.


----------



## RagingCain

Quote:


> Originally Posted by *maur0;13228609*
> The 6990 normally throttles in Metro 2033 and 3DMark Vantage, but not in Heaven.
> 
> So it wouldn't be strange if the 590 were throttling in 3DMark 2011;
> more aggressively than in other applications, that is.


Quote:


> Originally Posted by *Levesque;13228711*
> This is true only with one of the 2 BIOS on the 6990.
> 
> The 6990 is not throttling if you are using the AUSUM BIOS. It's only throttling with the stock BIOS. Powertune limits are different in both BIOS.
> 
> Anand did a good article on that, talking about the differences in both BIOS. Anand were not able to reach the limit or get any throttling with the AUSUM BIOS, no matter what they were trying.


9999999999.99% driver issues, give them a chance to mature for both cards. The results look anemic no matter who is doing what to them =/


----------



## Levesque

Quote:


> Originally Posted by *RagingCain;13228886*
> 9999999999.99% driver issues, give them a chance to mature for both cards. The results look anemic no matter who is doing what to them =/


I totally agree. The 6990 is also still running on the beta 11.4 preview drivers. But it's surprisingly good compared to the 5970 launch, which was a mess.

AMD put two BIOSes on the 6990 for a reason. One BIOS has the ''regular'' AMD PowerTune and TDP values, while the other has a much higher PowerTune limit and TDP.

So one BIOS will throttle if you push the card too far, but not the other. So if you use a waterblock and want to push the card hard, you use the AUSUM BIOS to avoid any throttling.

I'm sure next-gen cards from NVIDIA will have something like that, or the same general idea. A BIOS switch with a different PowerTune limit is a great idea for enthusiasts.


----------



## -iceblade^

Quote:


> Originally Posted by *ReignsOfPower;13223509*
> You drew that much power that a 1200W PSU overheated? Amazing. That's a bit ludicrous lol.
> Here is a change in my system. I'm thinking of changing my sound card to an ASUS Xonar STX and getting around to installing the Noctua fan whose shroud I powder-coated black. Then I think my system's nice and finished until I decide to upgrade to the 2011 platform.
> 
> Fan looks identical to this (This pic was taken by another user in these forums)


that's a sexy machine


----------



## RagingCain

Quote:



Originally Posted by *-iceblade^*


that's a sexy machine










Has anyone noticed that water cooling requires like a billion more fans than air cooling setups... or is it just me?

I am looking at my tube-filled, cluttered case, with googols of fans everywhere... I'm totally confused as to why I went liquid cooling.


----------



## reflex99

Because if you used the same number of fans as air cooling, you would get air-cooling-like temps.


----------



## vwmikeyouhoo

Add me? http://www.techpowerup.com/gpuz/dbru9/


----------



## Scytus

To those running a 590, or 590 Quad-SLI, are you experiencing any micro stuttering issues whatsoever?


----------



## rush2049

yes


----------



## Scytus

Quote:


> Originally Posted by *rush2049;13246842*
> yes


I would think your Phenom II would be a bottleneck.

Anyone with the new sandy bridge platform experience micro stuttering?


----------



## ReignsOfPower

Quote:


> Originally Posted by *Scytus;13246508*
> To those running a 590, or 590 Quad-SLI, are you experiencing any micro stuttering issues whatsoever?


None here on a single 590. Definitely noticed some on my 5970.


----------



## Imglidinhere

Quote:


> Originally Posted by *RagingCain;13220898*
> Haha, I am not that old... starting to get forehead wrinkles bah. I am young enough looking to keep going to the bars with college students and not look out of place
> 
> I killed a card today... only temporarily =)
> 
> I believe I drew more power than my PSU could deliver on my 12v2 rail.
> 
> GPU went down for about 5 minutes only to bounce right back. Believe it took a minute or two to cool down my PSU to be honest.
> 
> So two overclocked GTX 590s + 4.8GHz on a i7 980x, drained my 1200W power supply. What a shame... Anybody want to trade PSUs (1200W for a 1200W) so I can keep benching =)?


How many watts are allocated to the CPU power connector? You might have overloaded a combination of things.


----------



## pacho

Does the full backplate on EVGA's GTX 590 help the card run cooler than the reference design?


----------



## Scytus

Quote:


> Originally Posted by *ReignsOfPower;13248499*
> None here on a single 590. Definitely noticed some on my 5970.


Fantastic news on the single; that gives me relief. Very curious about the double, though.

Anyone experience any micro stutters on a 590 Quad-SLI?


----------



## ReignsOfPower

Quote:


> Originally Posted by *pacho;13250016*
> does the full backplate on evga's gtx590 help the card run cooler than the reference design?


I would suspect a negligible difference between the two variations. I personally hate non-back-plated cards. I was really happy when EVGA made their custom one.


----------



## Masked

Quote:


> Originally Posted by *RagingCain;13230459*
> Has anyone noticed that water cooling requires like a billion fans more than air cooling systems... ... or is it just me?
> 
> I am looking at my tube filled clutter case, and googles of fans everywhere.... ...I totally got confused as to why I went liquid cooling.


I have 9 fans in my case and, on one loop, maintain a 7°C delta with a 590 and a 980X... at 45 dB.

I dare anyone on air to say something remotely similar.


----------



## RagingCain

Well, I did a little benchmarking today; I wasn't planning on it, but oh well.

HWBot Submission

Ranked 17th out of 42 (4 GPU Setups)
Ranked 5th out of 18 (4 GPU + H20 Setups)
Ranked 1st out of 1 (GTX 590 Submissions) <- only temporary; the other two make me feel happy.

Of course our own little benchmarking thread for Unigine here on OCN.
Quote:


> Probably last push with this PSU. Personal best:
> 
> RagingCain ---- i7 980x/ 4.853 GHz
> 
> GTX 590 SLI 830 / 1690 / 1728
> 
> 154.4
> 
> 3890


I included a screenshot; Heaven has some issues (45% GPU usage for over 10 seconds, triggered by the benchmark itself) and I anticipate drivers improving GPU usage to a more consistent, higher state.

I have officially beaten a couple of quad GTX 580 setups with this Overclock.

This is on driver set 267.85, as opposed to 267.84 last time.


----------



## Citra

Quote:


> Originally Posted by *DyWaN;13170411*
> Hi guys, just got my Gigabyte GTX 590 with some fancy toolbox
> 
> 
> pushed the cores & mems to 675/1350/2000
> 
> The card itself
> 
> cheers


Do you mind posting some pictures of the toolbox which it came in?


----------



## RagingCain

I am probably going to be down for a little bit (dismantling the computer), but is there anybody who is a BOINCer / Folder who can run their GTX 590 for a bit?

I think there is a flaw with my cards and perhaps all EVGA voltages.

Problem:
Getting computational errors (fewer than last time).

Reason:
Faulty cards OR mismatched voltage. I know my cards, and probably all EVGA cards, have mismatched voltage. It looks like EVGA raised the voltages to accommodate the 630 MHz stock overclock. In doing so, only one of the two GPUs sits at the higher voltage, while both have the permanent clock rate of 630 MHz: GPU 0 is at 0.925v, while GPU 1 is at 0.913v. I have noticed this on both cards, in video games and while folding, all with great temps. While folding, the cards sit around 30% usage with a max temp of 35°C, so there is no visible throttling, and GPU 1 (the 2nd GPU) on both cards has a solid 0.913v, which leads me to believe it is set this way in the BIOS (possibly by accident).

Possible Non-Permanent Solution:
Using a voltage-unlocked driver set, obviously, I manually set my stock clocks and voltage in MSI Afterburner. What this does is force my 3D power-state speed to 630 MHz (no change), but it also forces the corresponding voltage to 0.925v on both GPUs for as long as Afterburner is up and set.

Possible Permanent Solution:
Edit BIOS.

I am going to contact EVGA regarding this, but in the meantime I'd like someone to run BOINC/Folding for the day without adjusting the voltage and see if you get an error.

Why This Is Bad:
This isn't catastrophic; however, this same error can crash a video game, benchmark, or operating system, cause errors in encoding/decoding, or just crash the NVIDIA drivers themselves. It's just generally problematic all around.
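If anyone wants to log their readings and check for the same mismatch programmatically, here's a minimal sketch of the comparison I'm doing by eye. The `voltages_mismatched` helper and the 5 mV tolerance are my own choices for illustration; the sample readings are the 0.925v/0.913v values from my cards.

```python
# Flag a per-GPU core-voltage mismatch at the same performance state.
# The sample readings mirror the 0.925 V / 0.913 V split described above.

def voltages_mismatched(readings_v, tolerance_v=0.005):
    """Return True if any two GPUs differ by more than tolerance_v volts."""
    return max(readings_v) - min(readings_v) > tolerance_v

# GPU 0 and GPU 1 at the permanent 630 MHz 3D clock:
readings_v = [0.925, 0.913]
print(voltages_mismatched(readings_v))  # the 12 mV gap exceeds the 5 mV tolerance
```

Feed it whatever your monitoring tool logs per GPU; a steady gap at the same clock state is the symptom, not momentary fluctuation.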


----------



## rush2049

You, sir, get rep for this...


----------



## Masked

Quote:



Originally Posted by *RagingCain*


I am going to probably be going down for a little bit (dismantling computer), but is there anybody who is a BOINCer / Folder able to run their GTX 590 for a little bit?

I think there is a flaw with my cards and perhaps all EVGA voltages.

Problem:
Getting computational errors (fewer than last time).

Reason:
Faulty cards OR mis-matched voltage. I know my cards, and probably all EVGA, cards have mis-matched voltage. It looks like EVGA raised the voltages to accomodate the 630 stock overclock. In doing so, only one of the two GPUs for me sits at the higher voltage. Both of them have the permanent clock rate of 630 MHz. GPU 0 is at 0.925v, while GPU 1 is at 0.913v. I have noticed this on both cards, in video games, while folding, all while having great temps. While folding, the cards sit around 30% usage, and have a max temp of 35c, so there is no visible throttling, and GPU1 (2nd GPU) in both cases has a solid 0.913v which leads to believe it is set in BIOS this way (possibly by accident.)

Possible Non-Permanent Solution:
Using obviously a voltage unlocked driver set, I manually set my stock clocks and voltage in MSI Afterburner. What this does, it forces my 3D Power speed to 630 MHz (no change) but it forces the corresponding voltage to 0.925v for both GPUs as long as Afterburner is up and set.

Possible Permanent Solution:
Edit BIOS.

I am going to contact EVGA regarding this, but if I can get someone to run BOINC/Folding for the day without adjusting the voltage, and see if you get an error.

Why This Is Bad:
This isn't catastrophic, however, this same error can crash a video game, benchmark, operating system, errors in encoding/decoding, or just crash the nVidia drivers themselves. Its just generally all around problematic.


I've been saying this since the day of release...doesn't anyone read what I say anymore?

The voltage is NOT due to EVGA's overclock, because there isn't one... If you want to run this card at bone stock, it's at bone stock.

The voltage is due to the BIOS (as I've pointed out 5 times; just check my previous posts); it's also due to the drivers attempting to draw more from core 2 vs. core 1.

Core 2 is the limited core, while core 1 receives less voltage.

We have 4 ASUS 590s in the office with the same EXACT problem as the EVGA 590s... Thus, I decided to write my own drivers.

There's even an issue with customizing the drivers, because the fluctuations are actually accounted for at both levels.

If you really want this card to run 100% properly, wait for Nvidia to release a new BIOS... I wouldn't touch the BIOS on these cards if you paid me, and ironically, that's pretty much what I get paid to do.


----------



## rush2049

Masked, I do know you were saying this from day 1; I think it was one of the first things you brought up... RagingCain is just the first to confirm it. I also suspected it, but both of my cores seem to be reporting the same voltage... though perhaps what I am seeing is just what it is set to, not what they are receiving.

Now here is an interesting question: how often does Nvidia release new BIOSes? And how painful is it to flash?


----------



## Masked

Quote:



Originally Posted by *rush2049*


Masked, I do know you were saying this from day 1, I think it was one of the first things you brought up.... RagingCain is the first to confirm it. I also suspected, but both of my cores seem to be reporting the same voltage..... but perhaps what I am seeing is just what it is set to and not what they are receiving.

Now here is an interesting question, how often does nvidia release new bioses? And how painful is it to flash?


Nvidia doesn't often release new BIOSes; they're very stubborn in terms of pride.

If you dissect the BIOS of these cards, you'll find some "code" that sits a bit differently and looks a bit odd, out of place if you will... That's your issue.

The cores are already being limited at the base programming; the OTHER lovely little contributor is the driver control.

Nvidia has linked the drivers into their BIOS, so there is a dual layer of control...

Initially the drivers were open, BUT so many people bricked their cards that a "hard-lock" was enforced.

What we see now is this "lock" in conjunction with them not "trusting" the public to open the cards up again.

If you attempt to change the drivers, it's like a domino effect; you're not sure what controls this or that because of the "unique" BIOS.

It's literally going to take them changing it to fix any of these issues, and they've promised TO change it; the question is... when?

If you had an old version of RivaTuner, or Aida... these are programs that rely on "real data" and give real-time voltages... Thus, from day 1, I've said the cores are different in every aspect...

What's interesting is that until this latest update from MSI, one could not tell any difference... Same goes for Precision and most other utilities.


----------



## RagingCain

Quote:



Originally Posted by *rush2049*


Masked, I do know you were saying this from day 1, I think it was one of the first things you brought up.... RagingCain is the first to confirm it. I also suspected, but both of my cores seem to be reporting the same voltage..... but perhaps what I am seeing is just what it is set to and not what they are receiving.

Now here is an interesting question, how often does nvidia release new bioses? And how painful is it to flash?


Going on 8 hours error-free now.

It's not terrible, but it's good to have a backup GPU, or to know how to "blind" flash in case it goes awry.

If there is a BIOS update, it will come from EVGA, not NVIDIA; or if NVIDIA releases an updated BIOS, it will go through EVGA's channels before it's approved. Let's put it like that.

It depends on how seriously they want to fix this. If it's just me & Masked reporting, we will probably hear crickets. If we get a group of guys reporting this to their respective vendors in a polite manner, we'll get a fix.

http://www.evga.com/forums/tm.aspx?m=979570

That's the EVGA forum thread I started.

Secondly, the temporary fix is working fine and dandy as of now.


----------



## rush2049

Well, keep us updated; I might have to go talk to ASUS... ASUS already released a BIOS update, but the description only said something like 'voltage lockdown', or something similar. So I haven't used it yet...


----------



## Masked

Quote:



Originally Posted by *RagingCain*


Going on 8 hours error free now.

Its not terrible, but its good to have a backup GPU or know how to "blind" flash in case it goes awry.

If there is a BIOS update it will also come from EVGA, not nVidia, or if nVidia release an updated BIOS, it will go through EVGA channels before its approved. Lets put it like that.

It depends on how serious they want to fix this. If its just me & masked reporting, we will probably hear crickets. We get a group of guys reporting this to their respective vendor's in a polite manner and we get a fix.

http://www.evga.com/forums/tm.aspx?m=979570

Thats the EVGA forum thread I started.

Secondly, the temporary fix is working fine and dandy as of now.


I talked to EVGA about this on Friday as well as a contact at Nvidia...

I've personally come to the conclusion that, since this has existed since day 1, it has to be a BIOS issue...

Unfortunately, according to my contact at Nvidia, it has to be them who change their BIOS, and if they don't... then the ball is in the manufacturers' court.

However, and this is a BIG however, they basically lose Nvidia support that way... like ASUS did at launch with their "BIOS update".

We have 3 cards in the office that were flashed to that update, and current drivers DO NOT, I repeat DO NOT, work well with that BIOS.

After dissecting the driver in an attempt to make my own, I found that to be true... There are strings that link the two together, making it virtually impossible to successfully alter either one 100%.

As I said before, according to Nvidia the ball is currently in their court, and as we can see, they're staying behind the wall of "this card is perfect".


----------



## capchaos

I have no fluctuations in voltage across my GPUs. But of course, I changed the BIOS min voltage on all GPUs to 0.963v, and Afterburner reads them all as such on the on-screen display. No problems here.


----------



## Masked

Quote:



Originally Posted by *capchaos*


I have no fluctuations in voltage across my gpus. But of course I changed the bios min voltage on all gpus to .963 and afterburner reads them all as such on the on-screen display. No problems here


With these drivers, I find that highly unlikely.

I'd change to a base monitoring program like Aida or even RivaTuner.

Interns have told me PCWizard works, as does Hmonitor or HWiNFO...

I'm not saying you're wrong; I'm saying that after editing 3 BIOSes and X drivers, I'm STILL having issues according to Aida...

What's going to happen is, when you actually run a bench or stress the card with those voltages, you'll see 1 of 2 things:

1) Your cores will be at 90% and 40%, but Precision or MSI will show them at 90/90; this isn't true monitoring.

2) Your voltages will "throttle" on 1 core down to 0.923/0.895 for the duration of the "stress test" until it's over, and then return to "stock".

If neither happens, I'd love to know what you did and how you edited your BIOS, because a miracle might be upon us!
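The second symptom is easy to confirm from a log if your monitoring tool can dump voltage samples over time. Here's a minimal sketch of that check; the function name, the slack value, and the sample trace are made up for illustration, with the dip values taken from the 0.923/0.895 figures above.

```python
# Detect voltage "throttle" dips in a logged trace: samples that fall
# noticeably below the configured stock voltage during a stress run.

def throttle_events(samples_v, stock_v, slack_v=0.01):
    """Return indices where the logged voltage dips below stock minus slack."""
    return [i for i, v in enumerate(samples_v) if v < stock_v - slack_v]

# Example trace: card set to 0.963 V, dipping mid-bench and recovering after.
trace = [0.963, 0.963, 0.923, 0.895, 0.923, 0.963]
print(throttle_events(trace, stock_v=0.963))  # dips at samples 2, 3, 4
```

A tool that reports a flat stock voltage for the whole run would return an empty list here, which is exactly the disagreement between utilities being argued about.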


----------



## RagingCain

Quote:



Originally Posted by *Masked*


With these drivers, I find that highly unlikely.

I'd change to a base monitoring program like Aida or even Rivatuner.

Interns have told me Pcwizard works, as does Hmonitor or HWinfo...

I'm not saying you're wrong, I'm saying that after editing 3 bios and x-drivers, and I'm STILL having issues according to Aida...

What's going to happen is, when you actually run a bench or stress the card, with those voltages, you'll see 1 of 2 things.

1) Your cores will be at 90% and 40%, precision or MSI will show them at 90/90 this isn't true monitoring.

2) Your voltages will "throttle" on 1 core down to 0.923/0.895 for the duration of the "stress test" until it's over and then will return to "stock".

If neither happens, I'd love to know what you did, personally and how you edited your bios because a miracle might be upon us!


Is anybody else getting 90% GPU usage on one core and low (40%) GPU usage on core 2? I had forgotten that Masked was having this issue, but I haven't seen it once in all my testing.

Quote:



Originally Posted by *capchaos*


I have no fluctuations in voltage across my gpus. But of course I changed the bios min voltage on all gpus to .963 and afterburner reads them all as such on the on-screen display. No problems here


Or he has fixed the problem by setting the values correctly (either on purpose or by accident). I have to ask, though: does it idle/downclock correctly when not in use?


----------



## Masked

Quote:



Originally Posted by *RagingCain*


Is anybody else getting the 90% GPU usage one one and low (40%) GPU usage on core 2? I had forgotten that Masked was having this issue, but I haven't seen it once in all my testing.

Or, he has fixed the problem by setting the values correctly (either on purpose) or accident. I have to ask though, does it idle/downclock correctly when not in use?


None of you have seen it because of the monitoring you're using.

Neither Precision, MSI Afterburner, nor GPU-Z has once shown this issue since release.

Same with the voltages. Check with Afterburner, check with Precision: the voltages have always appeared even; however, they are far from it.

Aida, HWMonitor, and a couple of the other programs I've mentioned have shown it.

When running 3DMark 11 this is very obvious, especially when overclocked... 1 core ramps to max, while the 2nd core is "limited" by draw.

When not in use, it does downclock correctly; however, with Afterburner, Precision and GPU-Z on screen, none of the 3 report a single difference.

I'm not preaching "go get Aida"; it's like $40 a year, and I use it for professional purposes... you don't NEED this program.

What I am preaching is that the monitoring software isn't, and hasn't been, monitoring real-time results, and that's blatantly obvious from your most recent findings.

If this individual, who just joined this month, has indeed edited his BIOS, I would LOVE to see the results and the accompanying BIOS, because we've had 0 success, especially with ASUS's BIOS, BECAUSE of the strings found within the drivers.

So, if his BIOS edit is real and has allowed him to OC, I'd actually pay good money to see what he/she changed.


----------



## RagingCain

Quote:



Originally Posted by *Masked*


None of you have seen it because of the monitoring you're using.

Precision and Msi, nor GPU-Z have never once shown this issue since release.

Same with the voltages. Check with Afterburner, check with Precision, the voltages have always been even however, they are far from.

Aida, HWmonitor, a couple of the programs I've mentioned, have.

When running 3Dmark11 this is very obvious, especially when overclocked...1 core ramps to max, the 2nd core is "limited" by draw.

When not in use, it does downlock correctly however, with Afterburner, Precision and GPU-Z on screen, none of the 3 report a single difference.

I'm not preaching, go get Aida, it's like 40$ a year, I use it for professional purposes...You don't NEED this program.

What I am preaching is, the monitor software isn't and hasn't actually been monitoring real-time results and that's blatantly obvious by your most recent findings.

If this individual, that just joined this month, has indeed edited his bios, I would LOVE to see the results and the adjacent bios because, we've had 0 success, especially with Asus's bios BECAUSE of the strings found within the drivers.


The reason I ask is that I stumbled upon a beta version of AIDA64 that includes a hotfix for GTX 590 monitoring... which hasn't left beta yet...


----------



## capchaos

It idles and downclocks just like normal. The only thing changed was the 3D min voltage; the 2D voltage was left alone. The biggest reason I flashed to a higher min voltage was so I could overclock on the newer drivers, since the driver locks to your BIOS min voltage.


----------



## RagingCain

Quote:



Originally Posted by *capchaos*


It idles and downclocks Jst like normal only thing was changed was the 3d min voltage the 2d voltage was left alone. Biggest reason I flashed to a higher min voltage was so I could be able to overclock on the newer drivers since it locks to ur bios min voltage


Yeah, that's what I figured. Looks like all we need to address the voltage variation at full load is a BIOS tweak. I am sure all he did was edit the vBIOS with NiBiTor and set the minimum 3D clocks. Actually, Cap, did you edit it the same way a GTX 580 would get edited? Or are there 2 sets of values per performance level (one for each GPU)?

Masked:
http://www.techpowerup.com/forums/sh...=132926&page=3

Scroll down to get the latest AIDA64 beta if you don't have it. This is several releases after the GTX 590 fix, but it still has it.


----------



## capchaos

My voltages were not always even with Afterburner. If you go into Afterburner's settings and swap which GPU you want as master, it will show the voltage variation between the GPUs.


----------



## Masked

Quote:



Originally Posted by *RagingCain*


The reason I ask is because I stumbled upon a beta version of AIDA64 and its inclusion of a hotfix of GTX 590 monitoring correction... which hasn't gone un-beta yet...


Yep, been using the beta since it first came out, actually









There are too many coincidences NOT to believe the numbers... especially when my Heaven benches are 30% below a user running stock with an [email protected], yet apparently my card reported 90/90 (Precision) during the entire bench.

I find that HIGHLY suspect... The cause, you ask?

My card was OC'd to 800MHz.

We've seen hundreds of reviews where the above is/was happening since day 1.

I find the same thing true today; it's not as extreme, but even with my core OC'd to 700MHz, it's a bit of a struggle.

I believe you've seen the same results before as well, if I'm not mistaken?

Yet every monitoring program you've listed has claimed 90%/90%...

Do you believe they're working at 90% in conjunction to produce that score?

The logical answer is... no.

Aida has shown they're not; the beta has shown they're not...

Via the voltages, I believe the cores are being strangled when a request is made for more over X... This is what I've found to be true after editing our BIOS and using the ASUS flavor, but to each his own, obviously.

Quote:



Originally Posted by *capchaos*


It idles and downclocks Jst like normal only thing was changed was the 3d min voltage the 2d voltage was left alone. Biggest reason I flashed to a higher min voltage was so I could be able to overclock on the newer drivers since it locks to ur bios min voltage


May we please see a copy of this BIOS?

I'm not trying to be an *******; I'm being very serious, actually... If you've found a way to successfully edit your BIOS, I'd be VERY interested in seeing how it was actually done.

Quote:



Originally Posted by *capchaos*


My voltages were not always even with afterburner. If u would go into after burner settings and swapped between which Gpu u wanted as master it would show the voltage variation between the gpus


I never tried this...Hrmmm...Interesting.


----------



## capchaos

I used the latest NiBiTor, 6.02. The 590 won't be officially supported until 6.03, but it can still be used. At least, that is what is stated in his thread at Guru3D.


----------



## RagingCain

Quote:



Originally Posted by *Masked*


Yep, been using the beta since it first came out, actually









There are too many coincidences to NOT believe the numbers...Especially when my Heaven benches are 30% below a user using stock with an [email protected] yet, apparently my card reported 90/90 (precision) during the entire bench.

I find that HIGHLY suspect...The cause, you ask?

My card was OC'd to 800mhz.

We've seen hundreds of reviews where the above is/was happening since day 1.

I find the same thing true today, it's not as extreme but, even with my core OC'd to 700mhz, it's a bit of a struggle.

I believe you've seen the same results before as well, if I'm not mistaken?

Yet every monitoring program you've yet listed has claimed 90%/90%...

Do you believe they're working at 90% in conjunction to produce that score?

The logical answer is...No.

Aida has shown they're not, the beta has shown they're not...


I am not quite sure what you are going on about. You do know I have the record-holding score for Unigine, right?

I do not see flat 90% GPU usage across the board, yet I still managed to benchmark them well.

My issue is that at stock, the voltage is incorrectly set on GPU 2. There's no need for you to go on another OCP crusade; the bug I found has nothing to do with that. I believe the BIOS is incorrect and fixable. What's worse is that you are going to derail my thread. I want this issue fixed for everybody.


----------



## Masked

Quote:



Originally Posted by *RagingCain*


I am not quite sure what you are going on about? You do know I have the record holding score for Unigine right?

I do not see flat 90% GPU usage across the board, yet I still managed to benchmark them well.

My issue is that @ stock, the voltage is incorrectly set on GPU2. No need for you to go on another OCP crusade. My issue has nothing to do with that. My issue is I believe the BIOS is incorrect and fixable. What is worse is you are going to derail my thread. I want this issue fixed for everybody.










Do you really think they're not aware of this problem?

Do you really think Nvidia isn't?

If so, you're so mistaken it's amazing.

This problem has been pointed out since day 1 and, on the 2nd driver release, became blatantly obvious because of the "lock".

This isn't an OCP "crusade"; this is something you're about to see in your own BIOS.

The issue stands that the cores are being limited and the voltage is being limited, and my stance has NOT changed since my 3rd post.

I had the beta for Aida before it even came out; I don't need your "mystery" link when I've been one of the testers for it.

Precision and Afterburner have NOT been reporting equal and/or decent results...

If you're going to draw a conclusion off of those programs, we might as well go ask Kadafi how good of a leader he is, because it's about the same exact amount of bias.

The only reason I'm calling foul on your thread is, as I mentioned, that every program you've used so far hasn't reported the problem UNTIL now, when there's been an issue since day 1... That's an amazing amount of bias.

Change your monitoring software to something that detected a real difference in earlier drivers, and I won't say a word against that conclusion.

Certain programs have illustrated the voltage differentiation since day 1, and it's been reported since... Nvidia sits on the throne of "fixing" this issue and, as has been mentioned several times, they've promised a fix eventually.

I've found no solution via Guru3D and the several others helping; perhaps we will, but I'd be very interested in seeing one in the meantime that actually works with the current drivers.

Stupid BlackBerry copied instead of cutting... BOOO.


----------



## EXVAS3221

Does it get up to 100°C?! That's F-ing hot! I've seen it run at 120°C.


----------



## RagingCain

Quote:



Originally Posted by *Masked*









do you really think they're not aware of this problem?

Do you really think Nvidia isn't?

If so, you're so mistaken it's amazing.

This problem has been pointed out since day 1 and on the 2nd driver release became blatantly obvious because of the "lock".

This isn't an OCP "crusade" this is something you're about to see in your own bios.

The issue stands that there is an issue with the cores being limited, with voltage being limited and my stance has NOT changed since my 3rd post.

I had the beta for aida before it even came out, I don't need your "mystery" link when I've been one of the testers for it.

Precision and Afterburner have NOT been reporting equally and/or decent results...

If you're going to draw a conclusion off of those programs, we might as well go ask Kadafi how good of a leader he is because it's about the same exact amount of bias.

The only reason why I'm calling foul on your thread, I mentioned, every program you've used so far, hasn't reported the problem UNTIL now when there's been an issue since day 1...That's an amazing amount of bias.

Change your monitoring software to something that detects a real difference in earlier drivers, I won't say a word to that conclusion.

Certain programs have illustrated the voltage differentiations since day 1 and it's been reported since...Nvidia sits on the throne of "fixing" this isse and as has been mentioned several times, they've eventually promised a fix.

I've found no solution via Guru 3d and the several others helping, perhaps we will but, I'd be very interested in seeing 1 in the mean time that actually works in conjunction with the current drivers.

Stupid blackberry copied instead of cutting...BOOO.


You seriously need to stop. You won't win an argument against me, because I only argue facts. I am not even arguing anything to do with OCP.

I have noticed the voltage difference since I have had my cards, and before then, when I made the members list. I already said this. I am the one who made the members list and copied all the data; I saw just about everyone's voltages who posted. Who do you think put all that data together? I just assumed it was some voltage fluctuation similar to a CPU vCore at idle vs. load. I am also using 3 different programs to monitor GPU vCore, including your AIDA64. They all report the exact same numbers all the time.

Secondly, *there was no need for concern until I started having errors in BOINC*. I only started using BOINC with these cards twice: once last week and once last night. Both times I had computational errors.

Thirdly, I have so far solved the issue by forcing the stock voltage on both GPUs. So why would you argue with me and prevent a tiny fix?


----------



## Masked

Quote:



Originally Posted by *RagingCain*


You seriously need to stop. You won't win an argument against me, because I only argue facts. I am not even arguing anything to do with OCP.

I have noticed the voltage difference since I have had my cards, and before then, when I made the members list. I already said this. I am the one who made the members list and copied all the data; I saw just about everyone's voltages who posted. Who do you think put all that data together? I just assumed it was some voltage fluctuation similar to a CPU vCore at idle vs. load. I am also using 3 different programs to monitor GPU vCore, including your AIDA64. They all report the exact same numbers all the time.

Secondly, *there was no need for concern until I started having errors in BOINC*. I only started using BOINC with these cards twice: once last week and once last night. Both times I had computational errors.

Thirdly, I have so far solved the issue by forcing the stock voltage on both GPUs. So why would you argue with me and prevent a tiny fix?


...So, I don't argue facts at all?

The same facts I've stated since day 1, far before your "edit".

The same facts I've been doubted for that you now "confirm"...?

But, they're not "facts"?

Interesting.

I'm not arguing to prevent a tiny fix; I'm saying the powers that be are more aware than you are, and it's not "just a tiny fix", as you claim it is.

After speaking WITH EVGA on Friday and an Nvidia executive, I guess I don't speak "facts", so I will no longer speak.

I wish to withdraw from this "club" please, Alatar.

Good riddance.


----------



## RagingCain

Anyways, it looks like the temporary fix is good so far. I am still at 0 computational errors.

Can someone with "voltage-locked drivers" and the latest copy of Afterburner tell me whether you are indeed able to force "stock" voltage by increasing the clocks to 631 and then applying, while monitoring the GPU vCore before and after?

I want to see if it changes the voltage on the 2nd GPU, or on the GPU with the lower voltage if they are not matching each other.

Edit:
Also, how many people would be interested in an overclocking template spreadsheet showing the power/performance curve of 3DMark11 and Unigine?

I was going to make one, but if nobody is interested, I might not bother.

Edit 2:
To make matters even more tantalizing/interesting: the two BIOSes on my cards are different. Anyone else able to report the same?

GPU 0: 70.10.37.00.90
GPU 1: 70.10.37.00.91
GPU 2: 70.10.37.00.90
GPU 3: 70.10.37.00.91


----------



## rush2049

When I go from 612 mhz and force 630 or 631 mhz the vcore jumps to 
core1: .913 
core2: .912

Then after ~15 seconds falls back to idle voltage of .875 on both cores.

When running furmark with forced 630mhz voltages stay at:
core1: .913 
core2: .912

When running furmark with stock 612mhz voltages stay at:
core1: .913 
core2: .912

I find it interesting that the idle voltage is equal, but the 3D-mode voltage is not... Maybe one of them is a better sample, but chances are the cherry-picked chips were paired to be very close..... and they should be set equal, not with one being starved of power.

edit:
done with msi afterburner v2.2.0 Beta 2


----------



## RagingCain

Quote:


> Originally Posted by *rush2049;13266037*
> When I go from 612 mhz and force 630 or 631 mhz the vcore jumps to
> core1: .913
> core2: .912
> 
> Then after ~15 seconds falls back to idle voltage of .875 on both cores.
> 
> When running furmark with forced 630mhz voltages stay at:
> core1: .913
> core2: .912
> 
> When running furmark with stock 612mhz voltages stay at:
> core1: .913
> core2: .912
> 
> I find it interesting that the idle voltage is equal, but the 3d mode voltage is not... Maybe one of them is a better sample, but chances are that the cherry picked chips were paired to be very close..... and they should be set equal, not having one be suffocated of energy.
> 
> edit:
> done with msi afterburner v2.2.0 Beta 2


Those are exactly the voltages I would expect for an ASUS GTX 590. When overclocking, about every 23~27 MHz requires +12/13 mV for stability. That would put it right in line with EVGA's cards over the stock 607 MHz / 0.913v cores. That's also why ASUS wouldn't need 0.925v. 1 mV of variation is perfectly fine; a 13 mV gap, however, is not just normal variation.
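That rule of thumb can be sketched as a quick calculator. The 607 MHz / 0.913v stock figures are the EVGA numbers discussed in this thread; the linear ~25 MHz per ~12.5 mV scaling is just a ballpark reading of the rule, not anything official:

```python
# Ballpark vCore estimator based on the rule of thumb above:
# every ~23-27 MHz over stock needs roughly +12-13 mV.
# Stock values (607 MHz @ 0.913 V) are the EVGA figures from this thread;
# the linear model itself is only an assumption for illustration.
STOCK_CLOCK_MHZ = 607.0
STOCK_VCORE_V = 0.913
MHZ_PER_STEP = 25.0     # midpoint of the 23-27 MHz range
V_PER_STEP = 0.0125     # midpoint of the 12-13 mV range

def estimate_vcore(target_mhz: float) -> float:
    """Estimate the vCore needed for a target core clock."""
    steps = max(0.0, target_mhz - STOCK_CLOCK_MHZ) / MHZ_PER_STEP
    return STOCK_VCORE_V + steps * V_PER_STEP

print(f"{estimate_vcore(700):.3f} V")  # roughly 0.96 V for a 700 MHz target
```

Obviously every chip is different, so treat this as a starting point, not a promise of stability.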

Are your BIOSes the same in GPU-Z for each GPU?


----------



## rush2049

Wow, you spotted that one quick. I hadn't noticed, but no, they are not....

GPU1 BIOS: 70.10.37.00.02
GPU2 BIOS: 70.10.37.00.01

And this is an ASUS 590 that has not used ASUS's new custom BIOS.....


----------



## RagingCain

Okay.... well I half hoped that it was indicative of an error in EVGA's cards, and that the fix would be to flash both to the same BIOS version. I wouldn't do that, but it's just food for thought; for instance, thinking or assuming .02 is newer than .01. I doubt that's the case now that you have reported the same thing.

It may very well be how they differentiate: .01 is GPU1 and .02 is GPU2, maybe how they know which BIOS to flash to which GPU. That also leads me to the conclusion that nvflash is potentially extremely risky. If nvflash flashes each BIOS to whatever GPU it finds willy-nilly, you could definitely end up with a bricked card. On the other hand, if nvflash ONLY recognizes GPU1 for BIOS .01, and the same for GPU2 with BIOS .02, then it would work with no problems.

I find it strange they would need two different BIOSes, but dual-GPU cards are new territory for me.
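The suffix-matching guard being speculated about would look something like this toy check. To be clear, this is purely hypothetical pseudo-logic, not nvflash's actual code; the version strings are the ones posted earlier in this thread:

```python
# Toy sketch of the speculated guard: only allow flashing an image whose
# version suffix (.01 / .02 / .90 / .91) matches the GPU's current BIOS,
# unless the user explicitly forces it. This is NOT nvflash's real
# implementation -- just the idea.
def can_flash(gpu_bios_version: str, image_version: str, force: bool = False) -> bool:
    gpu_suffix = gpu_bios_version.rsplit(".", 1)[-1]
    image_suffix = image_version.rsplit(".", 1)[-1]
    return force or gpu_suffix == image_suffix

# Version strings reported earlier in the thread:
assert can_flash("70.10.37.00.90", "70.10.37.00.90")       # match: allowed
assert not can_flash("70.10.37.00.90", "70.10.37.00.91")   # mismatch: refused
assert can_flash("70.10.37.00.90", "70.10.37.00.91", force=True)  # forced override
```

If the real tool works this way, forcing a mismatched image past the check is exactly how you'd brick one GPU of the pair.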


----------



## Alatar

Quote:


> Originally Posted by *Masked;13265685*
> I wish to withdraw from this "club" please, Alatar.


If that's what you want. You're welcome back at any time.


----------



## armartins

Quote:


> Originally Posted by *RagingCain;13266252*
> Okay.... well I half hoped that it was indicative of an error in EVGA's cards, and that the fix would be to flash both to the same BIOS version. I wouldn't do that, but it's just food for thought; for instance, thinking or assuming .02 is newer than .01. I doubt that's the case now that you have reported the same thing.
> 
> It may very well be how they differentiate: .01 is GPU1 and .02 is GPU2, maybe how they know which BIOS to flash to which GPU. That also leads me to the conclusion that nvflash is potentially extremely risky. If nvflash flashes each BIOS to whatever GPU it finds willy-nilly, you could definitely end up with a bricked card. On the other hand, if nvflash ONLY recognizes GPU1 for BIOS .01, and the same for GPU2 with BIOS .02, then it would work with no problems.
> 
> I find it strange they would need two different BIOSes, but dual-GPU cards are new territory for me.


This is pretty usual on dual-GPU cards; my 5970 has master and slave BIOSes, and the 295 was that way too...


----------



## RagingCain

Quote:


> Originally Posted by *armartins;13266457*
> This is pretty usual on dual GPU cards, my 5970 has master and slave bios, the 295 was that way too...


Ah, okay, very cool. Thanks for that tidbit, I learn something new every day







I assumed as much.

Spoke to EVGA_Jacob; they are looking into the information I brought up. I asked permission to share that message with you guys.

Just that though, no promises. However, EVGA support is top notch in my book, and if there really is a problem, they can find and fix it. If not, the temporary solution is still working.


----------



## ReignsOfPower

So, Raging, what you're trying to say is that the GPUs are incorrectly volted, and the voltages fluctuate on some cards, hence causing instability? I'd be glad to help out; I'll check the BIOSes and voltages again for you tomorrow, if that's all you need?

And let's stop freaking arguing over these damn cards and start being constructive about how we can milk them for performance.


----------



## RagingCain

Quote:


> Originally Posted by *ReignsOfPower;13266919*
> So, Raging, what you're trying to say is that the GPUs are incorrectly volted, and the voltages fluctuate on some cards, hence causing instability? I'd be glad to help out; I'll check the BIOSes and voltages again for you tomorrow, if that's all you need?
> 
> And let's stop freaking arguing over these damn cards and start being constructive about how we can milk them for performance.


I am a nobody, I am just some guy.

I wholeheartedly believe that even at stock settings there may have been stability issues right out of the box for EVGA users due to this tiny vCore difference. I tested every 10 MHz all the way up to 830. I know what a difference 12~13 mV of vCore can make. I just initially made the mistake of waving off the voltage difference as fluctuation, or as within acceptable parameters, but even so it's really hard to see the errors. It's fairly common for a benchmark to stutter due to what we assumed was just new drivers for a new card.

We probably didn't think anything of it. However, it makes a lot of sense to me now that many people who had crashing benchmarks and performance issues at stock could at least have that fixed. I am sure there will still be quite a few errors for us to diagnose, and we still have the aggressive OCP, but I believe this is a 100% slam-dunk, spot-on bug with a fix.


----------



## capchaos

One BIOS is the master and one is the slave.


----------



## capchaos

Yup, it only recognizes .01 and .02. You can't write the .02 BIOS to the .01 or vice versa, because it won't find a match unless you force it via command, which would be really stupid. .01 can only be flashed to .01.


----------



## RagingCain

Quote:


> Originally Posted by *capchaos;13267571*
> Yup, it only recognizes .01 and .02. You can't write the .02 BIOS to the .01 or vice versa, because it won't find a match unless you force it via command, which would be really stupid. .01 can only be flashed to .01.


Lol, I wouldn't say it's stupid; HOWEVER, I have met programmers who don't even think about stuff like that.

I think my programming team hated working with me as QA, but my boss loved it.

I spent 90% of my time getting "Regarding Bug" emails where the response was "good find, but I don't think that will ever be an issue in the real world." Three days later, a fresh new deployment has to be pulled back into beta because the in-house quality assurance user (who happened to be a vice president at the time) went ahead and did the same thing I did "and found it to be an issue in the real world."

Haha, still, I really enjoyed my job.


----------



## rush2049

I am fairly confident that all competent programmers spend 90% of their time dealing with incompetent programmers, myself included.


----------



## RagingCain

Hey Rush, do you think you could upload your BIOSes in a post? If you don't know how, I can walk you through it; you just save them from GPU-Z, each one saved individually.

I told Mavke from MVKtech.com that I would get him a copy. I also need a copy of the newer BIOS, if you have it.

You can zip everything up and upload it in the post as an attachment. I would really appreciate it if you do it.









Besides, he scratched my back and gave me a copy of the latest NiBiTor so I can make GTX 590 BIOSes.


----------



## rush2049

I will get you the two BIOSes that I currently use, and the one from ASUS's site, as soon as I finish this L4D2 game, lol.....


----------



## RagingCain

Quote:


> Originally Posted by *rush2049;13269171*
> I will get you the two BIOSes that I currently use, and the one from ASUS's site, as soon as I finish this L4D2 game, lol.....


You are awesome









Okay some data I want to share with you all.










Here are the exact specifics of the down-throttle:
*Core: 553.5 MHz ~ 553 MHz ---- Derived from BOX#3
Shader: 1107 MHz ~ 1106 MHz ---- BOX#3
Memory: 801.5 MHz (Effective 1603 MHz ~ 1602/1604) ---- Derived from BOX#5

THESE values are the hidden values not displayed by sensors when you hit max power draw. I believe that because of the way our tools work, they don't register the change and therefore don't display it. The voltage it uses is 0.875v~0.925v; again, the change doesn't display, but the effect is clearly observable.

That's terrible. That's the lowest clock speed you can actually reach in either Precision or Afterburner for 3D clocks. Which leaves me with one question: why not just drop back to stock? Can the power-draw limitation even be reached at stock?*

Here is the good news: as we discovered with Heaven, it's application-specific. Which to me sounds like it's driver-executed as well. There are way too many title names to fit in a BIOS, not to mention they need to be able to update the list often; that's not happening in a BIOS for sure. The driver must be the trigger that recognizes the potential problem application and arms the power-draw limit. When said application reaches the power-draw level, the onboard OCP/power-draw limiter activates.

Here is the crazy part: you can edit it. You can essentially undo the power-draw restriction and make the card work against itself. I could switch the power draw to actually OC the card. From a benchmarking standpoint, though, it would be shady, as whatever you set in PowerDraw won't match what you have in Afterburner/Precision. Unless you are an honest benchmarker like me and would never do that... not even to destroy the only pair of 6990s that are beating me.... omg I want to do this now.

Although I can see it for myself under the clock domains, I found the same info in the GTX 590 BIOS editing thread on Guru3D.

More info on overvolting/overclocking/undervolting can be found here:
http://forums.guru3d.com/showthread.php?t=336117

Nibitor v 6.0.3 tool can be found here:
http://www.mvktech.net <- It's not on public release yet unless you donate, but you can request custom BIOSes for free. He actually wants more people on his forums. Oh, submit your BIOSes too, he loves collecting them. Rush, I will give him yours.

I personally think the links and clocks would make a great entry at the bottom of the first post, a sort of disclaimer. I anticipate the power-draw trigger is a non-changeable variable, hard-coded on the card; if it were dynamic, it could be bypassed. It's either on or off. We should still try to find out where it occurs.
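For what it's worth, the throttle numbers above are internally consistent: GF110's shader domain runs at 2x the core clock, and the "effective" memory figure as quoted here is 2x the actual memory clock. A quick sanity check:

```python
# Verify the down-throttle clocks quoted above are internally consistent.
core_mhz = 553.5
shader_mhz = 1107.0            # GF110 shader domain = 2x core clock
memory_mhz = 801.5
effective_memory_mhz = 1603.0  # "effective" figure as quoted = 2x actual

assert shader_mhz == 2 * core_mhz
assert effective_memory_mhz == 2 * memory_mhz
print("throttle clocks check out")
```

So whatever the limiter is doing, it is dropping whole clock domains together rather than skewing them independently.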


----------



## rush2049

Ok, I attached my two BIOSes.....

and I unzipped the BIOS update from ASUS; I'm not sure which ones are the BIOS files... so I attached the whole contents, take what you will.

A NOTE:

the BIOS ending in .02 is the first chip, aka GF110*A*
the BIOS ending in .01 is the second chip, aka GF110*B*


----------



## RagingCain

Quote:


> Originally Posted by *rush2049;13270425*
> Ok I attached my two bioses.....
> 
> and I unzipped the bios update from asus, not sure which ones are the bios files... so I attached the whole contents, take what you will.
> 
> A NOTE:
> 
> the bios ending in .02 is the first Chip, aka GF110*A*
> the bios ending in .01 is the second Chip, aka GF110*B*


Awesome, I will send them over pronto. Good work.

@Alatar if you want these images for main you can use them:


----------



## capchaos

Good job. Finally someone else willing to do some tinkering around in the bios.

Quote:


> Originally Posted by *RagingCain;13269285*
> You are awesome
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Okay some data I want to share with you all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is the exact specifics of the down throttle:
> *Core: 553.5 MHz ~ 553 MHz ---- Derived from BOX#3
> Shader: 1107 MHz ~ 1106 MHz ---- BOX#3
> Memory: 801.5 MHz (Effective 1603 MHz ~ 1602/1604) ---- Derived from BOX#5
> 
> THESE values are the hidden values not displayed by sensors when you hit max power draw. I believe because of the way our tools work, they don't register the change therefore they do not display the change. The voltage it uses is 0.875v~0.925v, again the change does not display but the effect is obviously observable.
> 
> Thats terrible. Thats lowest clock speed you can actually reach in either Precision or Afterburner for 3D Clocks. Leaves me with one question, why not just go back to stock? Is stock capable of the power draw limitation being reached?*
> 
> Here is the good news: like we have discovered with Heaven, its application specific. Which to me sounds like its driver executed as well. There is way to many title names to fit in a BIOS, not too mention they need to be able to updated it often, thats not happening with a BIOS for sure. The driver must be the trigger that recognizes the potential problem application, and activates power draw. When said application reaches the power draw level, the onboard OCP/PowerDraw limiter activates it.
> 
> Here is the crazy part, you can edit it. You can essentially undo the powerdraw restriction and make the card work against itself. I could switch the powerdraw to actually OC the card. From a benchmarking stand point though, it would be shady as whatever you set in PowerDraw won't match what you have in Afterburner/Precision. Unless you are an honest benchmarker like me and would never do that... not even to destroy the only pair of 6990s that are beating me.... omg I want to do this now.
> 
> Although I can see it for myself under the clock domains, I found the same info on the GTX 590 bios editing thread on Guru3D.
> 
> More info on overvolting/overclock/unervolting can be found here:
> http://forums.guru3d.com/showthread.php?t=336117
> 
> Nibitor v 6.0.3 tool can be found here:
> http://www.mvktech.com <- Its not on public release yet unless you donate, but you can request custom BIOSs for free. He actually wants more people on his forums. OH submit your BIOSs too, he loves collecting them. Rush I will give him yours.
> 
> I personally think the links and clocks would make a great entry at the bottom of first post, a sort of disclaimer. I anticipate the PowerDraw trigger would be a non-changeable variable, hard coded on the card, if it was dynamic... it could be bypassed. Its either on or its off. We should still try and find out where it occurs.


----------



## jcde7ago

Good lord, talk about the undervolting done to these cards!

Nice work, Cain.


----------



## RagingCain

Quote:


> Originally Posted by *capchaos;13270571*
> Good job. Finally someone else willing to do some tinkering around in the bios.


ME? Nah, I am not brave enough for that; besides, I want to hear back from EVGA/Jacob on this BIOS issue.

I already hold the record without a BIOS edit; no point going crazy and editing it.

Besides, I think editing the BIOS is still extremely risky; I need to learn a lot more before I even consider it.
Quote:


> Originally Posted by *jcde7ago;13270576*
> Good lord, talk about the undervolting done to these cards!
> 
> Nice work, Cain.


I am just sick today with a migraine from hell, and I had some time between chores I had to take care of around town. I am actually going to get off the computer; the LCDs are killing my eyeballs.

I anticipate the 3DMark11 scores will be updated one or two more times; I plan on scaling back the CPU clock to get steadier results. However, I will add more core clocks at both ends so we can see the power-draw curve better.

I will do the same for the Unigine Heaven benchmark too.


----------



## capchaos

RagingCain, your link is incorrect; it's mvktech.net


----------



## RagingCain

Quote:



Originally Posted by *capchaos*


RagingCain, your link is incorrect; it's mvktech.net


Thanks I fixed it. Good catch.

I am going to go head butt some pillows for a while.


----------



## ExTrEmE_PeRfOrMaNcE

Hey guys, here are my new GTX 590s,


----------



## RagingCain

Quote:



Originally Posted by *ExTrEmE_PeRfOrMaNcE*


Hey guys, here are my new GTX 590s,














Congrats and welcome!

@Group

We need more peeps on HWBot with these cards.

Someone just stole 8th place out from under quite a few people








http://www.hwbot.org/rankings/benchm.../world_records

Also, this is the first time I ever did anything stable at 5 GHz. A personal record for me










It's also the 3rd fastest QUAD-GPU setup in the world for Unigine DX9. Kick ass








(Select 4x at the top to organize it)
http://www.hwbot.org/rankings/benchm...(dx9)/rankings

Come on guys, I know you can do this too!


----------



## reflex99

bad news guys.

Not going to be joining the 590 crew after all.

TD says it won't be in for a month at least, and I can't wait that long.

I have succumbed to Levesque and will be buying a 6990.

Sorry guys; blame eVGA for their staggered launch.


----------



## RagingCain

Quote:



Originally Posted by *reflex99*


bad news guys.

Not going to be joining the 590 crew after all.

TD says it won't be in for a month at least, and i can't wait that long.

I have succumb to Levesque and will be buying a 6990.

sorry guys, blame eVGA for their staggered launch.


Sorry to see you go









I still think you will have a lot of fun with the 6990, but it just isn't as special, teehee. Make sure you get benching with it on HWBot, after you do a little video gaming of course


----------



## reflex99

Yeah, that is one of the things that swayed me: the benching potential.


----------



## teichu

Quote:



Originally Posted by *reflex99*


bad news guys.

Not going to be joining the 590 crew after all.

TD says it won't be in for a month at least, and i can't wait that long.

I have succumb to Levesque and will be buying a 6990.

sorry guys, blame eVGA for their staggered launch.


lol, I bought two different brands of HD 6990, one XFX and the other Sapphire. I RMA'd both of them because they gave awful performance... I've always been an Nvidia user; this was my first time switching from Nvidia to ATI, and it was such a disappointment... the HD 6990 in my rig ran even worse than my prior single GTX 580. I think my computer just doesn't like ATI, haha... so I just gave up on ATI..... right now I am thinking of going 580 SLI or getting a GTX 590


----------



## reflex99

You, sir, are doing at least something wrong.....

I've had pretty good luck with ATi, so I am not too worried.


----------



## Arizonian

Quote:



Originally Posted by *reflex99*


bad news guys.

Not going to be joining the 590 crew after all.

TD says it won't be in for a month at least, and i can't wait that long.

I have succumb to Levesque and will be buying a 6990.

sorry guys, blame eVGA for their staggered launch.


Sorry to see you go before you got started, but I understand. You've got money burning a hole in your pocket and couldn't wait to make your move. You were patient. Good luck with the AMD 6990; it's a solid dual-GPU card nonetheless.

The only thing that swayed me away from ATI/AMD was driver issues with every card I owned from them. They are slow to get it right, and when they do, another problem eventually rears its ugly head. And no, it's not user error, it's driver issues. Even programmers have driver issues with them.

In that regard, it seems both AMD and Nvidia are slow to fix driver issues for dual GPUs, not just AMD. Mostly because there aren't enough dual-GPU cards out there compared to single GPUs, so they focus on the masses rather than the few.

In any event, good luck, though you won't need it. And congrats on your AMD 6990. See ya around.









Quote:



Originally Posted by *teichu*


> lol, I bought two different brands of HD 6990, one XFX and the other Sapphire. I RMA'd both of them because they gave awful performance... I've always been an Nvidia user; this was my first time switching from Nvidia to ATI, and it was such a disappointment... the HD 6990 in my rig ran even worse than my prior single GTX 580. I think my computer just doesn't like ATI, haha... so I just gave up on ATI..... right now I am thinking of going 580 SLI or getting a GTX 590


You'll get more performance out of two GTX 580s than out of one GTX 590 or AMD 6990.

Your PSU will just meet the mark to SLI the GTX 580, and you're already halfway there with one card. No problems whatsoever with them: they have quality VRMs, they're very overclockable, and the drivers improve performance with each release. The drivers aren't even mature yet.

It's a nice setup to have, even if you go with one GTX 590, especially for 3D gaming, without having to sacrifice much FPS while in 3D mode.


----------



## Arizonian

Quote:



Originally Posted by *ExTrEmE_PeRfOrMaNcE*


Hey guys, here are my new GTX 590s,






















Two GTX 590s FTW! You know we will be awaiting your benchmarks under water. FPS and temps, please.


----------



## krazyatom

Hey guys, do you recommend the GTX 590 on air cooling? I have no plans to water cool.
My 6990 is way too long and won't fit in my case







I might grab a GTX 590 because it's shorter and will fit. I am tired of waiting for the GTX 580 3GB.


----------



## Shogon

Shouldn't be that bad for air as long as you have good internal case airflow and a good mix of intake/exhaust fans.


----------



## remer

New drivers: 270.61, 64-bit and 32-bit. I'm not sure if these just came out; the "release date" was last week, but I don't see anyone running them...

Edit: I see RagingCain already posted a while ago.


----------



## Alatar

updated.
Quote:


> Originally Posted by *RagingCain;13270527*
> @Alatar if you want these images for main you can use them:


I could, but since you seem to have so many links, images, benchies, etc., would you like to gather them into one post that I'll put under the useful links section? Crediting you, ofc.

I could link the pics like that individually but the OP might get a bit cluttered.


----------



## RagingCain

Quote:


> Originally Posted by *Alatar;13277177*
> updated.
> 
> I could but since you seem to have so many links, images benchies, etc. would you like to gather them to one post that I'll put under the useful links section crediting you.
> 
> I could link the pics like that individually but the OP might get a bit cluttered.


I will contact a moderator to see if they will place a second post for all the power draw / miscellaneous stuff.


----------



## Alatar

Quote:


> Originally Posted by *RagingCain;13277213*
> I will contact a moderator to see if they will place a second post for all the power draw / miscellaneous stuff.


Yeah, good. That's a better idea, actually


----------



## Masked

Since I've received 4-5 PMs asking for "clarification", I'll give it.

I don't come in here to be treated like a child that doesn't give "factual information"...As an admin, especially of my company, that's all I offer.

I also don't care about BOINC... I care about the real-time data from a program I helped develop and rely on for an entire office plus an entire server farm... That program has told me exactly what RagingCain claims... and I have said so since day 1.

Examples from 2+ weeks ago.
Quote:


> Originally Posted by *Masked;13085154*
> If you guys use aida, rivatuner, msi, even Dxdiag ~ I'd be interested to know at what voltage your cores are...
> 
> Both stock and while benching ~ Especially anyone over 700mhz ~


Quote:


> Originally Posted by *soilentblue;13085297*
> gpu #1 .925mv
> gpu #2 .913mv
> 
> that's my max for the two. about to try and run crysis to see what the volts get up to


Quote:


> Originally Posted by *Masked;13085650*
> That's interesting...Even at work atm: Core 1 .925v; Core 2 .940v
> 
> This card is stock too...I find that extremely interesting.
> 
> What are your cores VS voltage while running something like Crysis 2 vs Benching?


Immediately after first discovering this "issue", at my weekly meeting with Nvidia, I pointed it out and was given the "this card is perfect" attitude... That was on March 27th... Ironically, I wasn't even the first person to come to them about this issue; it was already "known" and they really didn't care.

I then called EVGA that Friday and then again last week to get an "update".

All I have said in either thread is that it's not just voltages controlling these cards; there are strings in the BIOS and the drivers that you're not addressing, which, as a programmer, Mr. RagingCain should KNOW to be true.

I have no issue assisting to "fix" these cards, and I'm glad it may be simple, but I quite honestly don't think it is that "simple".

I also have no issue with RagingCain spearheading this; quite honestly it benefits us all. But the disrespect has to end.

Regardless of my opinion, I've always treated you all with respect. All I ask is for the same, and I'll have no issue contributing. If it's going to be like it was, where I'm treated like a 5-year-old, I'd literally rather deal with our customers.


----------



## RagingCain

Quote:


> Originally Posted by *Masked;13277539*
> Lots of words.


Look, I am not sure what you want.

You want credit for being the first one to notice it? It's all yours.
You want credit for being convinced there is a physical problem? It's all yours.
You want me to believe there is some nVidia conspiracy-esque issue? Well, I am not going to, but if you want to tell people I agree with you, go ahead.

We had a saying when I was in the military:
If you are not a part of the solution, you are a part of the problem.

As far as your high and mighty stance over me:
You give no conclusive evidence there was a physical problem.
You have given no evidence that nVidia knows about this.
You have given no evidence of anything, essentially.
You also claim this is a part of the OCP. I went into the BIOS, and there is NOTHING relating the OCP to this issue. The voltage simply seems to be fed to the GPU wrong.

One thing I don't tolerate is BS from anyone. I don't care if you are the president of Truth Inc.; I will call it how I see it.

*This is the last I am talking about this though*.

All I have done:
Essentially, I discovered on my own that there was a problem with CUDA-based applications. I traced it back to the voltage on GPU2. I tested a hypothesis. The solution worked. That's it. Very simple. Problem solved so far.

*I am still going strong at nearly 36 hours of continuous GPU computing, with 0 errors. My last error was on the 24th of April*.

You can see for yourself:

Errors:
http://einstein.phys.uwm.edu/results.php?userid=329590&offset=0&show_names=0&state=5

Valid:
http://einstein.phys.uwm.edu/results.php?userid=329590&offset=0&show_names=0&state=3

Processing:
http://einstein.phys.uwm.edu/results.php?userid=329590&offset=0&show_names=0&state=2

Lots of CUDA units failed that day.









So far, the stock clocks set to 0.925v are working perfectly. Rather than accepting that I may have come up with a solution, you suggest I am blind, refer to Khadafi, and make other stupid remarks.

You claim all of my sensors are wrong, yada yada yada, and then I go and prove that your AIDA64 has had a bug that didn't display the correct voltages/GPU usage for the GTX 590. Then you claim that's the one you are using, that you have been using this beta forever, even though it was only released about two weeks ago. So even if all my sensors are wrong, the cards are working top-notch on high-end physics-based computation work units with 0 errors. But that doesn't matter, because you don't care about BOINC, and you are an expert Alienware administrator who builds elite and private CUDA applications and has a private army of GTX 590s at your "disposal."

To be honest, Masked, I don't believe about 95% of the things you claim in just about every post you bring up. I don't have to, and you could be telling the truth. So like I said, I am not talking about this anymore; it's just going to drag on and on. I personally don't like drama, which is why I asked you to stop derailing my EVGA thread with all this over-zealousness.

@Others
I apologize for arguing this way; sometimes you just have to take a stand and pull out some numbers for people who have lost their way. Just like I do when people say the GTX 590 doesn't overclock.

I wanted to add that I saw this fantastic 6990 CFX vs. 590 SLI comparison.

I was shocked that the GTX 590 pulled ahead.
http://www.guru3d.com/article/triple-monitor-gaming-on-geforce-gtx-590-and-radeon-6990/1

@Alatar
I will pm iceblade.
Quote:


> Originally Posted by *krazyatom;13275215*
> Hey guys do you guys recommend gtx 590 in air cooling? I have no plan to go water cool.
> My 6990 is way too long and it won't fit in my case
> 
> 
> 
> 
> 
> 
> 
> I might grab gtx 590 because it's shorter and it will fit. I am tired of waiting for gtx 580 3gb.


Yes, it does moderately well on air; however, you need a case with good airflow. I am not familiar with yours, but if you are able to get a steady flow of air in the lower region, exhausted out the back/roof, there will be little to no heat issues.

@All
Looks like TiN.... has begun. There go my benchmarks...

http://www.kingpincooling.com/forum/showthread.php?t=1163&page=5
Quote:


> Each GPU powered with separate 10 phase VRM, capable to feed up to 2V.


----------



## Masked

Quote:


> Originally Posted by *RagingCain;13278326*
> Look, I am not sure what you are wanting.
> 
> You want credit for being the first one to notice it? It's all yours. *No, obviously I wasn't since the report from Nvidia said they already knew.*
> You want credit for being convinced there is a physical problem? It's all yours. *No, never was this about credit.*
> You want me to believe there is this some nVidia conspiracy-esque issue? Well, I am not going to, but if you want to tell people I agree with you, go ahead. *What conspiracy? There's been an error in Adobe since 6 yet, to Adobe, it's working perfectly, no error...Is Adobe involved in a conspiracy too? Absolutely not...A company will do business as it does business and according to Nvidia, their GPU is #1, and is perfect.*
> 
> We had a saying when I was in the military:
> If you are not a part of the solution, you are a part of the problem. *Absolutely agree and this "solution" has been known since page 3, noted by Pedros and Nvidia has known of the issue since March 27th, EVGA was made aware on April 1st.*
> 
> As far as your high and mighty stance over me:
> You give no conclusive evidence there was a physical problem. *There never had to be, the issue was so widespread and proven through this entire thread it had become common knowledge by the time you chose to "BOINC". All you have to do is start from Page 1.*
> You have given no evidence that nVidia knows about this.*Unfortunately, I can't reveal my sources, it's a violation of my NDA, I'd lose my job and much like you beg EVGA to "tell us, the public", I don't have this option, sorry.*
> You have given no evidence of any thing about anything essentially. *Again, this is a fallacy...Start from page 1, I've proven without a doubt different voltages were going to different cores VIA other members and they have posted their data...Among them were Pedros, Soilentblue and several others.*
> You also claim this is a part of the OCP. I went into the BIOS, and there is NOTHING relating to the OCP on this issue. Voltage is simply just fed wrong to the GPU it seems. *I disagree with this aspect due to the strings that exist with this bios and the drivers...Have you taken a driver apart yet and actually viewed the strings? No? Have you actually dissected the entire bios yet, NOT just through a utility? No? THESE are my issues with your claims...You've done nothing to scientifically or factually prove yourself beyond evidence that's apparently only available to you and a select few.*
> 
> One thing I don't tolerate is BS from anyone. I don't care if you are the president of Truth.inc, I will call it how I see it. *I don't either and unfortunately, you've compounded the issue, let me explain why.
> 
> You've used 2/3 monitoring programs that haven't picked this issue up in 3 weeks when programs, like Aida (Which my office, Alienware Corp, assists in programming and gets our software WEEKS before public "retail" release) saw this on this first week of release. This was also proven Via Rivatuner and several others used between April 1st and the 15th.
> 
> Yet, you choose to use monitoring software that was INCORRECT for 3 weeks while others proved the same exact theory you're now preaching...Suddenly, those monitoring programs are now detecting the issue and NOW it's an issue opposed to 3 weeks ago when multiple members pointed this out.
> 
> In statistics this is called SEVERE BIAS and thus is only counted as an anomaly.
> 
> Again, you're using monitoring programs that failed consistently for 3 weeks and as a programmer, you accept this as fact? I'm sorry but, as an admin, that's bull****.
> 
> If a program had failed on a programmer for 3+ weeks (considering we've established this is in the bios and existed since day 1) you would've dropped them and found other solutions...Your failure to do so is absolutely suspect.*
> 
> *This is the last I am talking about this though*.
> Thank god, all I want is a proper experiment, all I've said since day 1 and you have yet to produce 1.
> 
> All I have done:
> Essentially, I discovered on my own there was a problem with CUDA based application. I traced it back to voltage on GPU2. I tested a hypothesis. Solution worked. That's it. Very simply. Problem solved so far. *Again, my issue with this was the hypotheses existed weeks ago, all one had to do 3 weeks ago was test it on their own...Why now?*
> 
> *I am still going strong for nearly 36 hours continuous GPU computing, with 0 errors. My last error was the 24th of April*.
> 
> You can see for yourself:
> 
> That's awesome, serious points for you but, you have yet to provide me with non-bias data.
> 
> So far the stock clocks set to 0.925v is working perfectly. Rather than accepting that I may have come up with a solution, you suggest I am blind, refer to Khadafi, and other stupid remarks. *Incorrect...I claim common sense.
> 
> You used the word hypothesis...How does one prove something scientifically using failed sensors?
> 
> Your sensors failed for 3 weeks, you've admitted that...Yet, they're still law when everything else worked; again, as a logical person, I find that ridiculous.
> 
> Rivatuner worked, DxDiag was proven to have worked by Soilent yet, nothing mattered until after 3 weeks and 4 days, MSI afterburner showed YOU the problem, Precision showed YOU the problem...Absolutely suspect.*
> 
> You claim all of my sensors are wrong, yada yada yada, and then I go and prove that your AIDA64 has had a bug with it, that didn't display the correct voltages/GPU usage for GTX 590. Then you claim thats the one you are using, you have been using this released beta for-ever, even though it was only released like two weeks ago. So even if all my sensors are wrong, the cards are working top notch on high-end physics based computation work units with 0 errors. But that doesn't matter because you don't care about BOINC, and you are an expert Alienware administrator, who builds elite and private CUDA applications, and has a private army of GTX 590s at your "disposal." *This is the bull**** I was referring to...Educate yourself.
> 
> You claim to use scientific theory and yet you have a failed hypothesis from the start based on flawed sensors...That's a period and end.
> 
> Of course I won't accept that data as fact, I don't accept that from interns, why would I hold you to any other standard?
> 
> All I asked for was ACTUAL data with different sensors.
> 
> Which is interesting because all you've done this entire rant is prove my post from a month ago, to absolutely be 100% correct, with those apparent flawed sensors.
> 
> So, care to explain how I was wrong with what I posted a month ago when all you continue to do is validate it??*
> 
> To be honest Masked, just about every post you bring up, I don't believe about 95% of the things you claim. I don't have to, and you could be telling the truth. So like I said, I am not talking about this anymore, its just going to drag on and on. I personally don't like drama, which is why I asked you to stop derailing my EVGA thread with all this over-zealousness.
> *I'm not over-zealous I just don't accept flawed data, which is all you've provided VIA your sensoring.*


The difference between you and I is one of what we can say and what we cannot say.

You can say whatever you want because of your status/occupation and that's awesome...genuinely, that's fantastic.

I cannot...That doesn't make me 95% full of **** considering you've proven every single "theory" I had correct since day 1.

That means I simply can't post every single fact I find and that's fine with me.

What I am asking is that this unfounded *******-attitude be vanquished, that you man up and provide unbiased data via sensors that WERE NOT broken for over a month...It's a very simple request...And that you respect your peers, something you're obviously incapable of atm.

I don't think I'm asking for much but, unfortunately, I may be.


----------



## teichu

Quote:


> Originally Posted by *Arizonian;13274432*
> Sorry to see you go before you got started but I'd understand. You got the money burning a hole and couldn't make your move. You were patient. Good luck on the AMD 6990 it's a solid dual GPU none the less.
> 
> Only thing that swayed me away from ATI/AMD was driver issues with every card I owned by them. They are slow to get it right and when they do another problem rears it's ugly head eventually. And no, it's not user error, it's driver issues. Even programmers have driver issues with them.
> 
> In that regards seems both AMD and Nvidia dual gpu's are slow to fix driver issues not just AMD. Mostly due to not enough dual gpu's out there compared to the single gpu's so they focus on the masses rather than the few in comparison.
> 
> In any event good luck, though you won't need it. And congrats on your AMD 6990. See ya around.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You'll get more performance out of two GTX 580's over one GTX 590 or AMD 6990.
> 
> Your PSU will just meet the mark to SLI the GTX 580 and your already half way there with one card. No problems what so ever with them, they have qaulity VRM, very over clockable and the drivers just improve performance with each release. The drivers aren't even mature yet.
> 
> Nice set up to have, even if you go with one GTX 590 especially with 3D gaming and not having to sacrafice much in FPS while in 3D mode pushing out the FPS.


Haha, thanks for the suggestion though. I think I will add another 580.


----------



## RagingCain

Quote:


> Originally Posted by *Masked;13279206*
> The difference between you and I is one of what we can say and what we cannot say.
> 
> You can say whatever you want because of your status/occupation and that's awesome...genuinely, that's fantastic.
> 
> I cannot...That doesn't make me 95% full of **** considering you've proven every single "theory" I had correct since day 1.
> 
> That means I simply can't post every single fact I find and that's fine with me.
> 
> What I am asking is that this unfounded *******-attitude be vanquished, that you man up and provide non-bias data via sensors that WERE NOT broken for over a month...It's a very simple request...And that you respect your peers, something you're obviously incapable of atm.
> 
> I don't think I'm asking for much but, unfortunately I may be


You are questioning my data integrity, and you have not posted *anything* backing up a single claim you have made.

I have had both GPUs on both cards at 0.913v / 0.925v the entire time. There was no need for concern, as no visible issues appeared. I have had plenty of GPUs in the past that ran at different voltages despite being identical, most recently a pair of GTX 580s. I didn't care till I started getting CUDA errors. There are plenty of cases of different voltages between the cores of a card, and that alone is not grounds to prove a problem exists.

CUDA errors are indicative of GDDR5 errors, which are caused by one of two things: an unstable overclock on the GPU or the memory. You can either lower clocks or raise the voltage. GDDR5 errors can range from a low spike in FPS, to full-on stutter, all the way to crashing. They can even be as hard to diagnose as lower-than-stock performance. Our GDDR5 is ECC, which makes it tricky to identify issues with common tools, and less likely to cause a crash. These errors affect everybody, not just some nerd running BOINC like myself. In fact, the best GPU stability program is based on CUDA; I don't know if it supports the GTX 590, but it is worth a look (OCCT).

Since the errors were sporadic, it looked like a small adjustment was needed. When I started looking at all my settings, I figured I'd start with voltage. I clicked apply at 0.925v and it forced both GPUs to the same voltage (which should be the stock voltage). Apparently that's all I needed: under the same conditions, for nearly 2 days now, I have had 0 CUDA errors. So, in a sense, I am getting 0 GDDR5 errors.
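For anyone who wants to sanity-check their own card the same way, here's a rough Python sketch of the comparison I've been doing by eye. It assumes you've exported a sensor log to CSV with one vCore column per GPU (most monitoring tools can log to a file); the column names and sample values below are made up for illustration, so adjust them to whatever your tool writes out:

```python
import csv
import io

# Hypothetical CSV export from a monitoring tool: one vCore column per GPU.
# Column names and values below are illustrative only.
SAMPLE_LOG = """\
time,gpu1_vcore,gpu2_vcore
00:00,0.925,0.913
00:01,0.925,0.913
00:02,0.925,0.925
"""

def find_voltage_mismatches(log_text, tolerance=0.001):
    """Return (time, v1, v2) for every sample where the GPUs disagree."""
    mismatches = []
    for row in csv.DictReader(io.StringIO(log_text)):
        v1 = float(row["gpu1_vcore"])
        v2 = float(row["gpu2_vcore"])
        if abs(v1 - v2) > tolerance:
            mismatches.append((row["time"], v1, v2))
    return mismatches

print(find_voltage_mismatches(SAMPLE_LOG))
```

The first two sample rows show the 0.925v / 0.913v split described above; the third shows what it looks like after forcing both GPUs to the same voltage.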

My sensor data from Precision, MSI Afterburner, GPU-Z, and AIDA64 is the same across all platforms, which is what I expect, since the first three are all derived from RivaTuner; AIDA is a mixed bag, but still works.

I have already proven this, if you look back at the screenshot I posted using the "latest" fixed AIDA64 build. All the sensors report the same voltage, overclocked or not.

Show me anything that corroborates anything you have said, Masked. Put up or shut up. I am sick of you cheapening my efforts for everybody.


----------



## -iceblade^

what is it with this thread that causes people to get upset???

please keep it civil, again...


----------



## Masked

Quote:


> Originally Posted by *RagingCain;13279849*
> You are questioning my data integrity, and have not posted *anything* backing up a single claim you have had.
> 
> I have had both GPUs on both cards at 0.913v / 0.925v voltages the entire time. There was no need for concern as no vissible issues appeared. I have plenty of GPUs with different voltages in the past despite being identical. Most recently a pair of GTX 580s. I didn't care till I started getting CUDA errors. There are plenty of cases of different voltages in the cores of cards and is not grounds alone to prove a problem exists.
> 
> CUDA errors are indicative GDDR5 errors. Which are caused by two things, unstable overclock on GPU or Memory. You can either, lower clocks, or up the voltage. GDDR5 errors can range from a low spike in FPS, full on stutter, or all the way to crashing. It can even be as hard diagnose as having lower than stock performance. Our GDDR5 is ECC, which makes it tricky to identify issues with common tools, and less likely to cause a crash. These errors effect everybody. Not just some nerd running BOINC like myself. Infact, the best GPU stability program is based off of CUDA, and I don't know if it supports GTX 590, but worth a look at (OCCT)
> 
> Since, the errors are/were sporadic, it looked like it was a small adjustment, when I started looking at all my settings, I figured lets start with voltage. I clicked apply 0.925v and it forced both GPUs to the same voltage (which should be stock voltage.) Apparently thats all I needed, as under the same conditions for nearly 2 days now, I have had 0 CUDA errors. So in a sense, I am getting 0 GDDR5 errors.
> 
> My sensor data from Precision, MSI Afterburner, GPU-Z, and AIDA64 is the same across all platforms. Which is what I expect, since the first 3 are all derived from Rivatuner, and AIDA is a mixed bag, but still works.
> 
> I have already proven this, if you look back at the screenshot I posted using the "latest" fixed AIDA64 application. All the sensors report the same voltage, overclocked, not overclocked.
> 
> Show me anything that corroborates anything you have said, Masked. Put up or shut up. I am sick of you cheapening my efforts for everybody.


I already have...Soilentblue did, not 2 weeks ago...the exact same voltages you posted.

Theories have been tossed around via Pedros from day 1, myself included.

Many members pointed out the voltage issues and the "problems" associated with such LONG before anyone chose to BOINC.

My point has always remained the same: these issues were pointed out and known LONG before you chose to investigate them, by both the community and the companies involved.

Do you really think Nvidia releases a BIOS without KNOWING absolutely 100% of the entire BIOS?

Aida, since day 1, showed the voltages as what they were; it was commented on MANY times. Do a search of my name with voltages and you'll find 20-30 posts.

Same with Soilent, Pedros, MANY members in this club.

I'm not attempting to cheapen your efforts at all, you found the issue was within the bios, something MANY of us theorized about earlier and you've taken the effort to provide HARD data, that's awesome.

My point STILL has never once changed.

USE monitoring programs that actually worked since day 1 instead of using "tainted and biased" data.

Did Precision pick up this issue the first 5 weeks? NO.

Did Eleet-tuner pick up this issue the first 5 weeks? NO.

Did Afterburner pick up this issue the first 5 weeks? NO.

Did a DXDiag done by Soilentblue pick up this issue in the first 5 weeks? Yes, at 3 weeks.

Did Aida pick up this issue in the first 5 weeks? Yes, at 2 weeks, 3 weeks, 4 weeks and 5 weeks; posts came from the Ultimate beta via Masked, but no screenshots were provided.

Did Rivatuner pick up this issue in the first 5 weeks? Yes, at 3 weeks, via Pedros, who posted such.

I told you to read from day 1, go through the thread, it's not my responsibility to provide you with proper data when you can't even do it for yourself.

Again, I'm not cheapening your results; I'm just not satisfied with your conclusion at all, based on the bias...Nobody should be.

The fact that the fault lies with the BIOS is undisputed...Can't argue that. I actually at first said it was a driver issue which, after looking at both, I'd say it's a combination of the two, but these theories exist within these pages.

You claim science, yet you stab it in the back by claiming flawed data as fact.

When you actually follow scientific protocol, you won't hear an argument from me.


----------



## RagingCain

Does anybody else get why Masked is actually arguing with me about nothing? I believe since day one, everything with you was "Driver Issues & Crap Scaling." You were practically cramming that down everyone's throat too. You still are not providing data for what you proved, a program crashing, or anything at all.

This is a waste of my time, and you are trying overly hard to make yourself important. I don't care anymore; you contribute nothing regardless. There is no reasoning with the unreasonable.

Here is some more "tainted data" that shows the fix in action, including a bug in AIDA64 which, I also found, our voltage setting fixes. The fix for the GPU vCore is confirmed to work until reboot. So set it once, and you are golden for as long as the computer is on.

Fresh Reboot:









After Applying Change:









Showing Proper Idling:









*Going on just shy of 38 hours of being computational-error free*.


----------



## Masked

You know what? LOL

Just got off the phone with EVGA and Pm'd Jacob...Will wait to hear the official response.

Enjoy your tweaking.

But, BTW: Soilent's post and proof, page 84.

Quote:



Originally Posted by *soilentblue*


gpu #1 .925mv
gpu #2 .913mv

that's my max for the two. about to try and run crysis to see what the volts get up to


----------



## RagingCain

Quote:



Originally Posted by *Masked*


You know what? LOL

Just got off the phone with EVGA and Pm'd Jacob...Will wait to hear the official response.

Enjoy your tweaking.

But, BTW: Soilent's post and proof, page 84.


Proof of what? Everybody's card is probably doing this. What have you shown that proves this was causing problems? Where is your proof that it caused problems?

We all thought any issues were driver-related, driver teething. I don't think anyone expected stock clocks to be problematic, and many are probably surprised by the fact that the cards might have an error on them.

Those are the same versions of Afterburner and Precision from 5 weeks ago... so I am not sure why you keep saying they are only now showing the same voltages. They have always been showing this; it just wasn't considered a problem, at least not without any evidence.


----------



## Masked

Quote:



Originally Posted by *RagingCain*


Proof of what? Everybody's card is probably doing this? What have you shown that proves this was causing problems? Where is your proof it caused problems?

We all thought any issues were driver related and driver teething. I don't think anyone expected Stock clocks to be problematic, and many probably are surprised by the fact that the cards might have an error on them.

Those are the same versions of Afterburner and Precision from 5 weeks ago... so I am not sure why you keep saying only now they are showing the same voltages. They have always been showing this, it wasn't considered a problem though. At least not without any evidence.


Obviously the speculation was incorrect; my previous speculation that the drivers/bios were to blame was completely wrong. I apologize...

Had a long discussion about your "attitude" and I agree, I'm never going to win.

So you win, I concede, I was absolutely wrong, I know nothing and that's that.

Let's move forward.


----------



## ExTrEmE_PeRfOrMaNcE

Quote:



Originally Posted by *Arizonian*









Two GTX 590 FTW ! You know we will be awaiting your benchmarks under water. FPS and temps please.










Will do as soon as I finish having fun with them first; I have to try some new games and see how they run, all that good stuff.... Will post some vids while I am testing them out. Still waiting on my 9800 GT for PhysX; it should arrive any day now....


----------



## RagingCain

Quote:



Originally Posted by *ExTrEmE_PeRfOrMaNcE*


Will do as soon as i finish having fun with them first, have to try some new games and how they run all that good stuff.... Will post some vids while i am testing them out, still waiting on my 9800 gt for Physx it should arrive any day noW....










You should visit my R3E BE thread and post pictures of the board!!!

http://www.overclock.net/intel-mothe...k-edition.html


----------



## ReignsOfPower

RagingCain and Masked: if you feel as if you have some constructive evidence of your findings, please post it before stating it. So far I have seen RagingCain's proof of concept, and none from Masked. Masked, bud, if you feel like you have found something, post your findings with proof. Raging is trying his best to prove his point very well. You're out to win the bar fight; there's no glory in winning here. Us owners are simply interested in the truth. Arguments are simply a waste of time; move on if you don't have anything more to add to the table. We get it, you think it's a mix of scaling, OCP and bad drivers. Thanks for the food for thought.









[EDIT] - Oh, and I forgot to mention, I've reproduced Cain's errors/findings almost exactly. One GPU's on 0.9130 instead of his 0.9120, and the other hits 0.9250.


----------



## Masked

Quote:



Originally Posted by *ReignsOfPower*


RagingCain and Masked. If you feel as if you have some constructive evidence of your findings please post them before stating them. So far I have seen RagingCain's proof of concept, and none from Masked. Masked bud, if you feel like you have found something, post your findings with proof. Raging is trying his best to prove his point very well. You're out to win the bar fight. There's no glory in winning here. Us owners are simply interested in the truth. Arguments are simply a waste of time. Move on if you don't have anything more to add to the table. We get it, you think its a mix of scaling, ocp and bad drivers. Thanks for the food for thought.









[EDIT] - Oh and I forgot to mention, I've reproduced Cains errors/findings almost the same. Ones on 0.9130 instead of his 0.9120 and the other hits 0.9250


The irony is, I really can't, bud.

If I post a screenshot of my UI (I've been reprimanded for this once already), I'm in CLEAR violation of my NDA. Any proof I could give you gets me fired; my boss has already stated as much. In fact, he popped back in this morning on his way to HIS headquarters to "remind me".

That being said, not all "flavors" have the same voltages; we've proven this without a doubt. There is a discrepancy between different brands and different voltages.

My request that he broaden his field of "monitoring software" was actually because nobody else has...and I think he'd find something very interesting...Yet, it's thrown in my face like I'm a dumbass.

I also requested that he follow an actual scientific process and get more samples...Something anyone from any statistics OR scientific background would agree with...Yet, it was thrown in my face again like I'm a dumbass.

This bios was something that Nvidia has been aware of since day 1, at hour 0 of release...I can get the exact name of who "alerted" them first but, I'd bet you money it would be Nvidia engineer "Dan" (not giving last names ~ he's a cool ass dude to boot).

Never once did I say I had discovered this; I said it had been mentioned to them SEVERAL times by major companies...I know for a fact 2 of the guys at Falcon did so not 10 minutes after our meeting. -- Nvidia gave the same exact response.

It was also mentioned by several members of this thread...I have no beef with what he's proving...My beef is with the method.

That bios, if you actually look at it beyond the editor, in its script form, you'll see what I'm talking about...yet you stare at it through an editor?

In regards to our "argument", I called EVGA again, to actually discuss something else, had a good laugh, my arguing with him is worthless because I'll always be wrong so, I've moved forward.


----------



## ReignsOfPower

Thank you for clarifying that, Masked. I believe in time we will find the truth. Hopefully the NDA will be lifted shortly; us consumers don't like being left in the dark after spending US$700 on a piece of hardware.


----------



## krazyatom

Hey guys,
I play World of Warcraft mainly, so I decided to sell my MSI 6990 and buy the ASUS GTX 590.
Hopefully someone buys my 6990, so I can get in on the real action!









By krazyatom at 2011-04-21


----------



## RagingCain

Quote:



Originally Posted by *krazyatom*


Hey guys,
I play World of warcraft mainly, so I decided to sell my msi 6990 and buy asus gtx 590.
Hopely someone buy my 6990, so I get get in real action!









By krazyatom at 2011-04-21


Even better news for you: World of Warcraft just lifted the beta on DX11 rendering. It's now an in-game option, so it should be implemented better.


----------



## Masked

Some of the interns this morning (those with 590's) reported that they've crashed repeatedly in WoW due to their drivers failing and being recovered.

To my understanding, this has happened ~12 times this morning alone amongst the 3 of them.

2 are using SLI 590s, the other is using a single.

1 is an Asus setup, other is an EVGA setup and the single is our review sample so, generic.


----------



## Arizonian

Quote:



Originally Posted by *Masked*


Some of the interns this morning (those with 590's) reported that they've crashed repeatedly in WoW due to their drivers failing and being recovered.

To my understanding, this has happened ~12 times this morning alone amongst the 3 of them.

2 are using SLI 590s, the other is using a single.

1 is an Asus setup, other is an EVGA setup and the single is our review sample so, generic.


I want to work somewhere where I can play WOW too......


----------



## krazyatom

Quote:



Originally Posted by *Masked*


Some of the interns this morning (those with 590's) reported that they've crashed repeatedly in WoW due to their drivers failing and being recovered.

To my understanding, this has happened ~12 times this morning alone amongst the 3 of them.

2 are using SLI 590s, the other is using a single.

1 is an Asus setup, other is an EVGA setup and the single is our review sample so, generic.



Quote:



Originally Posted by *Arizonian*


I want to work somewhere where I can play WOW too......










Me too! I don't care if the drivers fail or not, haha.


----------



## Masked

Quote:



Originally Posted by *Arizonian*


I want to work somewhere where I can play WOW too......










Pm me your resume...Hiring a new batch, soon.

Driver issues still exist...I actually "allowed" 3 of them to take their rigs home last night to raid with...

They claimed frequent LDs due to driver issues, especially in regard to anything DX11.

Frequent blowing out of the ground + zones, especially when zoning.

Same went for Crysis last night, as well.


----------



## helifax

Quote:


> Originally Posted by *RagingCain;13280686*
> Does anybody else get why Masked is actually arguing with me about nothing? I believe since day one, everything with you was "Driver Issues & Crap Scaling." You were practically cramming that down everyones throat too. You still are not providing data of what you proved, a program crashing, or anything at all.
> 
> This is a waste of my time, and you are overly trying to make your self important and I don't care anymore, you contribute nothing regardless. There is no reasoning with the unreasonable.
> 
> Here is some more "tainted data" that shows the fix in action, including a bug in AIDA64, which I also found our voltage setting, fixes it. The fix for GPU vCore, is confirmed to work until reboot. So set it once, and you are golden for as long as the computer is on.
> 
> Fresh Reboot:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> After Applying Change:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Showing Proper Idling:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Going on just shy of 38 hours in being computational error free*.


Hello,

Interesting things you have there.
Have you tried running these frequencies: 963mV / 705MHz core / 1410MHz shaders / 1900MHz memory?

I find this set very flexible and stable. I am gaming on 3D Vision Surround, which means 3 x 3D monitors in stereoscopic 3D. As you might know, the toll on the graphics card is higher when playing in stereo 3D (basically the image is rendered twice, once for each eye).
I found out that I can increase my clocks, but apparently when playing in stereo 3D the driver crashes.

I am using 1x ASUS GTX 590 on watercooling. For the tweaks I use MSI Afterburner 2.2.0 Beta 1.
If you want, you could try and run some tests with these frequencies. Maybe they will work well for you too.

Best Regards.

PS: I am using the latest nVidia drivers with a modded bios to allow an increase in voltage up to 0.963V. Also, for the higher value (when editing with NiBiToR), I set 1.02 I believe; I can't remember the exact value.


----------



## RagingCain

Quote:


> Originally Posted by *helifax;13306905*
> Hello,
> 
> Interesting things you have there.
> Have you tried running at these frequencies: @963mV / 705MHz core / 1410MHz shaders / 1900MHz memory?
> 
> I find this set very flexible and stable. I am gaming on 3D Vision Surround, which means 3x 3D monitors in stereoscopic 3D. As you might know, the toll on the graphics card is higher when playing in stereo 3D (basically the image is rendered twice for each screen).
> I found out that I can increase my clocks further, but apparently the driver then crashes when playing in stereo 3D.
> 
> I am using 1x ASUS GTX 590 on water cooling. For the tweaks I use MSI Afterburner 2.2.0 Beta 1.
> If you want, you could try some tests with these frequencies. Maybe they will work well for you too.
> 
> Best Regards.
> 
> PS: I am using the latest nVidia drivers with a modded BIOS to allow an increase in voltage up to 0.963V. Also for the higher value (when editing with NiBiToR I set 1.02 I believe; can't remember the exact value).


I have found the following frequencies/voltages to be stable:
630 @ 0.925
650 @ 0.938
675 @ 0.950~0.951
700 @ 0.951~0.963
725 @ 0.963~0.975 <- 0.963 seems to be the cherry point
730 @ 0.975 <- in my testing this is on the threshold of PDL / OCP
740 @ 0.975
750 @ 0.988
775 @ 1.000~1.013
800 @ 1.026
825 @ 1.050~1.052

I am not actually sure of the last few jumps in voltage, I will definitely refine that list when my computer is rebuilt.

Sent from my DROID2 using Tapatalk
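For anyone scripting their stability testing, the clock/voltage pairs above can be captured in a small lookup table. This is just a sketch: the numbers are RagingCain's reported findings for his cards, not official specs, and where he gave a range the lower bound is used here.

```python
# Clock (MHz) -> lowest voltage (V) reported stable, per RagingCain's post.
# These are one user's empirical values; your cards will differ.
STABLE_POINTS = [
    (630, 0.925), (650, 0.938), (675, 0.950), (700, 0.951),
    (725, 0.963), (730, 0.975), (740, 0.975), (750, 0.988),
    (775, 1.000), (800, 1.026), (825, 1.050),
]

def min_voltage_for(clock_mhz):
    """Return the lowest voltage reported stable at or above clock_mhz."""
    for clock, volts in STABLE_POINTS:
        if clock >= clock_mhz:
            return volts
    raise ValueError("clock beyond tested range (825 MHz)")

print(min_voltage_for(700))  # 0.951
print(min_voltage_for(710))  # next tested point is 725 MHz -> 0.963
```

A target between two tested points just rounds up to the next tested clock, which is the conservative choice when extrapolating someone else's results.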


----------



## helifax

Quote:


> Originally Posted by *RagingCain;13307578*
> I have found the following frequencies/voltages to be stable:
> 630 @ 0.925
> 650 @ 0.938
> 675 @ 0.950~0.951
> 700 @ 0.951~0.963
> 725 @ 0.963~0.975 <- 0.963 seems to be the cherry point
> 730 @ 0.975 <- in my testing this is on the threshold of PDL / OCP
> 740 @ 0.975
> 750 @ 0.988
> 775 @ 1.000~1.013
> 800 @ 1.026
> 825 @ 1.050~1.052
> 
> I am not actually sure of the last few jumps in voltage, I will definitely refine that list when my computer is rebuilt.
> 
> Sent from my DROID2 using Tapatalk


Thank you for the list of stable frequencies/voltages. This might help someone who is trying to OC the card and get the best out of it!

Best Regards.


----------



## Rognin

So umm, is the Watercooled version from EVGA worth getting?

I don't feel like rummaging through 111 pages of text, they became available tonight...

Thanks... =P


----------



## RagingCain

I might have gotten air-cooled ones and put my own blocks on if I were doing it all over again. But at least they are warrantied this way, hassle free.

Sent from my DROID2 using Tapatalk


----------



## Rognin

Quote:



Originally Posted by *RagingCain*


I might have gotten air-cooled ones and put my own blocks on if I were doing it all over again. But at least they are warrantied this way, hassle free.

Sent from my DROID2 using Tapatalk


So that's a yes?


----------



## Shogon

Just wondering whether the EVGA Classified could run on my 650W PSU. I know it says 700W, but I wouldn't plan on overclocking it. If not, I'll get a 750W.


----------



## tonyjones

let's see some pics of 2 x GTX 590 in QUAD SLI!


----------



## kazukun

QuadSLI*@750/1900/1500*

3DMARK VANTAGE
P65828
GPU　60325
CPU　90632








X39209
GPU　38078
CPU　89972









3DMARK11
P16555
Graphics　19368
Physics　12774
Combined　10062








X6444
Graphics　6523
Physics　12962
Combined　4029


----------



## RagingCain

Quote:


> Originally Posted by *kazukun;13319206*
> QuadSLI*@750/1900/1500*
> 
> 3DMARK VANTAGE
> P65828
> GPU　60325
> CPU　90632
> 
> 
> 
> 
> 
> 
> 
> 
> X39209
> GPU　38078
> CPU　89972
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3DMARK11
> P16555
> Graphics　19368
> Physics　12774
> Combined　10062
> 
> 
> 
> 
> 
> 
> 
> 
> X6444
> Graphics　6523
> Physics　12962
> Combined　4029


Wonderful scores. I guess I now have some scores to aim at beating


----------



## armartins

Guys, could you run Vantage with PhysX off so I can get an idea of how this stacks up against 6990 CrossFire?


----------



## Masked

Quote:



Originally Posted by *armartins*


Guys, could you run Vantage with PhysX off so I can get an idea of how this stacks up against 6990 CrossFire?


There are several threads about OC'd 590's (modded bios) taking down 6990's....


----------



## RagingCain

Quote:



Originally Posted by *armartins*


Guys, could you run Vantage with PhysX off so I can get an idea of how this stacks up against 6990 CrossFire?


You can go to www.3dmark.com/results where they have a huge database to compare against, but you might have a hard time finding results without PhysX on. I am sure someone has tried at least one, though.

I am guessing their GPU scores are anywhere from 50k-60k


----------



## exlink

Potential GTX 590 owner here, just waiting for them to restock on Newegg. I was going to pull the trigger yesterday when they were restocked but decided to hold off until I did a bit more research.

I sit on an Antec P180 Mini, so I would like a card that runs cool and quiet and is also dual-GPU since it's mATX. So I thought the GTX 590 would be a good option for me, since the only game I really play now is Bad Company 2, which nVidia excels in (and sometimes WoW). I also think it would do well in Battlefield 3.

Anyway, few questions to GTX 590 owners:

1) Have you experienced any performance increases through driver updates yet?
2) I've read that GTX 590 performance should increase quite a bit more since one of the problems right now in its performance seems to be scaling issues that will be resolved via driver updates; is this true?
3) I run a 3-monitor setup at a resolution of 4320x900, or 4530x900 with bezel compensation. This resolution is essentially the same as 2560x1600 since it just pushes 10k or so fewer pixels, which is nothing. Would the GTX 590 handle this surround resolution well for BC2 and WoW (and hopefully BF3)?

Basically, I am trying to decide between this and the HD 6990. However the HD 6990 runs so hot and loud and is unappealing to me because of that. Also, if the GTX 590 performance will increase quite a bit due to SLI scaling issues being resolved then that would hopefully put it above HD 6990 performance as well.

Would purchasing a GTX 590 seem logical in my situation?
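For what it's worth, the pixel-count claim in point 3 is easy to sanity-check with quick arithmetic; the resolutions below are the ones quoted in the post.

```python
# Quick sanity check of the surround vs. 30-inch pixel counts quoted above.
def pixels(w, h):
    return w * h

surround = pixels(4320, 900)        # 3 monitors, no bezel compensation
surround_bezel = pixels(4530, 900)  # with bezel compensation
single_30in = pixels(2560, 1600)    # a 2560x1600 panel

print(surround, surround_bezel, single_30in)
print(single_30in - surround_bezel)  # 19000 fewer pixels with bezel comp
```

So the bezel-compensated mode is within roughly 19k pixels of 2560x1600 (the non-compensated mode is about 208k fewer); either way the two workloads are close enough that 2560x1600 benchmarks are a fair proxy.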


----------



## RagingCain

Quote:


> Originally Posted by *exlink;13327099*
> Potential GTX 590 owner here, just waiting for them to restock on Newegg. I was going to pull the trigger yesterday when they were restocked but decided to hold off until I did a bit more research.
> 
> I sit on an Antec P180 Mini so I would like a card that runs cool and quiet that is also dual-GPU since its mATX. So I thought that the GTX 590 would be a good option for me since the only game I really play now is Bad Company 2 which nVidia excels in (and sometimes WoW). I also think it would do well in Battlefield 3.
> 
> Anyway, few questions to GTX 590 owners:
> 
> 1) Have you experienced any performance increases through driver updates yet?
> 2) I've read that GTX 590 performance should increase quite a bit more since one of the problems right now in its performance seems to be scaling issues that will be resolved via driver updates; is this true?
> 3) I run a 3 monitor solution that has a resolution of 4320x900 or 4530x900 with bezel compensation. This resolution is essentially the same as 2560x1600 since it just pushes 10k or so less pixels which is nothing. Would the GTX 590 suffice at this surround resolution well for BC2 and WoW (and hopefully BF3)?
> 
> Basically, I am trying to decide between this and the HD 6990. However the HD 6990 runs so hot and loud and is unappealing to me because of that. Also, if the GTX 590 performance will increase quite a bit due to SLI scaling issues being resolved then that would hopefully put it above HD 6990 performance as well.
> 
> Would purchasing a GTX 590 seem logical in my situation?


That's a toughie







Give us some easy questions... just kidding.

As far as cooling goes, it is cooler than a 6990, but both cards are hotter than your typical high-end GPUs, primarily due to the way they are designed to exhaust heat. These cards exhaust heat both out the back of the case AND into the middle of the case. So having good airflow to the GPUs and good exhaust is vital.

Yes, we have seen performance increases; going from 267.85 to 270.51 was a modest ~10% gain. I would expect more in the future. Some CUDA applications, as reported by Masked, are showing very significant scaling issues in GPU usage (one GPU at 90% while the other takes a relaxing 30%). In general (I hate saying always), dual-GPU cards benefit from maturing drivers, more than any other card to boot.

Resolution-wise, this is a fiercely contested item, and quite frankly, we are not used to hitting the VRAM wall before hitting GPU limitations this readily. My advice is: give me a full list of the titles that you play. We can only speculate on Battlefield 3, but as far as Bad Company 2 goes, I believe you will be getting very decent FPS, given that you may on occasion have to sacrifice some AA and possibly a bit of texture quality (different for each game). The most problematic titles are Crysis/Metro 2033/Dragon Age 2/Oblivion with mods, and similar. Just about every DX9 game can be handled without too much sweat. WoW has no issues, although with the release of its DX11 mode some have reported it being problematic/crashing. I am currently looking at a pile of computer parts, so I can't tell you one way or the other.

You are actually in an area where you may have a list of games that lean more toward nVidia, or you may have some ATi/AMD-biased games, in which case you should go that route.

These cards, for the most part, perform identically, so you have to look more at the personal experience you will have and which card will benefit you the most.


----------



## sarojz

Just got my EVGA GTX 590 on Wednesday and I've overclocked it to 771/1540/1812 per GPU, but as soon as I reboot for any reason it returns to its default settings. I made sure I ran OC Scanner for 24 hrs to confirm stability. How can I get the settings to persist after a reboot?


----------



## kazukun

cudaTripper








CPU&GPU MAX usage


----------



## 2010rig

Have you guys seen this yet?
http://www.eteknix.com/news/point-vi...st-retail-910/

Comes pre-overclocked to 691 MHz









Quote:



the card is stupidly expensive at £755.51


----------



## Masked

Quote:



Originally Posted by *2010rig*


Have you guys seen this yet?
http://www.eteknix.com/news/point-vi...st-retail-910/

Comes pre-overclocked to 691 MHZ










All they did was edit the bios and kick up stock clocks...

IMO when the card collaborates with the drivers...You'll get massive underclocking b/c of the strings.

Far from impressed.


----------



## RagingCain

Quote:


> Originally Posted by *2010rig;13332306*
> Have you guys seen this yet?
> http://www.eteknix.com/news/point-view-tgt-release-gtx-590-beast-retail-910/
> 
> Comes pre-overclocked to 691 MHZ


That's a serious waterblock. Definitely looks like business.


----------



## ReignsOfPower

Quote:


> Originally Posted by *RagingCain;13336354*
> Thats a serious waterblock. Definitely looks like business.


EVGA HC Blocks look far far better IMO.


----------



## Nvidia Geforce

GTX 590 will Never Beat Nvidia 6200 A-le


----------



## capchaos

Well, selling one of my ASUS GTX 590s with an EK block on eBay right now.
I got the Palit 3GB cards, so I put them into my main system and moved my 590s to my Acer 27-inch 3D system. Having quad SLI for one screen turned out to be way overkill, so I decided to get rid of one.


----------



## RagingCain

Nice, but sad. I can understand it, but something seems wrong about such a beautiful creation being on eBay and associated with that riffraff









I think I will be stuck with these cards until the 7xx series, and will probably hold out until the 8xx series and go with a pair of dual GPUs again. I couldn't afford to upgrade anything more on my rig without selling body parts


----------



## capchaos

Most games in 3D would only use a max of 2 GPUs and the other 2 would be idle, except Crysis 2 and Metro 2033. Tri or quad SLI does not work with DX9 in 3D, which is most of my library, sadly


----------



## Juggalo23451

Quote:



Originally Posted by *capchaos*


Most games in 3D would only use a max of 2 GPUs and the other 2 would be idle, except Crysis 2 and Metro 2033. Tri or quad SLI does not work with DX9 in 3D, which is most of my library, sadly


Not if you have 3 monitors


----------



## capchaos

Don't think I could handle 3 monitors in 3D Surround; I occasionally get nauseated with just the one monitor in 3D. Plus I don't wanna drop another $1400 for two more monitors.
"R U Down with the clown Juggalo23451"


----------



## EvilClocker

Does anyone have the EVGA GTX 590 BIOS? I flashed mine with the ASUS BIOS to try something and forgot to back up my EVGA BIOS, and now I can't run higher than a 613 clock or my card crashes. I am open to a custom BIOS also

thanks


----------



## RagingCain

Quote:


> Originally Posted by *EvilClocker;13357927*
> Does anyone have the EVGA GTX 590 BIOS? I flashed mine with the ASUS BIOS to try something and forgot to back up my EVGA BIOS, and now I can't run higher than a 613 clock or my card crashes. I am open to a custom BIOS also
> 
> thanks


http://www.mvktech.net/component/option,com_remository/Itemid,0/func,fileinfo/id,3523/

Use at your own risk!


----------



## EvilClocker

What's the best way to flash this...


----------



## EvilClocker

Sorry, I meant: what's the current nvflash command to flash both GPUs on the card?


----------



## ReignsOfPower

Any news on the voltage irregularities etc anyone?


----------



## kazukun

nvflash -5 -6 -i1 (bios).rom
nvflash -5 -6 -i2 (bios).rom
http://blue.ap.teacup.com/kazukun/104.html
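To spell out kazukun's two lines (a sketch, not gospel; verify flag meanings with `nvflash --help` on your version, since they vary between releases): `-i1`/`-i2` select each GPU on the dual-GPU board, and `-5 -6` are override switches commonly used to skip ID-mismatch checks. The command lines can be generated per adapter index like so:

```python
# Build the per-GPU nvflash command lines for a dual-GPU card, following
# the pattern kazukun posted. Flash at your own risk, and back up the
# original BIOS for each GPU first.
def nvflash_commands(bios_rom, gpu_indices=(1, 2)):
    # "-i" selects the adapter index; "-5 -6" are override switches.
    return ["nvflash -5 -6 -i%d %s" % (i, bios_rom) for i in gpu_indices]

for cmd in nvflash_commands("gtx590.rom"):
    print(cmd)
# nvflash -5 -6 -i1 gtx590.rom
# nvflash -5 -6 -i2 gtx590.rom
```

Both GPUs on the card must be flashed, one `-i` index at a time, with the same ROM.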


----------



## Huckleberry

So I have not kept up with this thread for the last 30 pages; what's the word on this card now that it's been out a good month+? Do you guys still recommend it? I see it is "available" on EVGA's site, at least the Hydro Copper version, and I am still interested in getting one. Overclock-wise, I doubt I would do anything major that would require voltage bumps, but I am looking at a res of 2560x1600. Thoughts please?


----------



## RagingCain

Quote:



Originally Posted by *ReignsOfPower*


Any news on the voltage irregularities etc anyone?


No word from EVGA or Jacob to me. I will try and ask for an update.


----------



## ReignsOfPower

Thanks peeps!

P.S. Is anyone running their monitors off DP? I noticed a funny issue. When I have a window open and I power off the monitor on DP, the windows get resized and some of my desktop icons get moved about into a more condensed arrangement. DVI never did this. It seems that when you power off your monitor, DP drops to its minimum resolution, and then when the panel is turned back on it finds the native res again and makes everything bigger.
I also noticed that with the DP connection you can't use the NVIDIA GPU scaling options for resolutions (you know: stretch, maintain aspect ratio, etc.). I can only select 'Display Scaling', which is what's built into my DELL U3011. What if my monitor didn't have a built-in scaler, like some of the LG panels?


----------



## sarojz

I've got my EVGA GTX 590 installed and left it stock for now. I'm very happy with it so far and I've had no problems. Here's my proof: a screenshot with various apps running to show a 590 is installed.


----------



## Masked

My office just ditched all our X58's.

I had ordered 6x 980Xs; every core was a "dud" and fried out the Classified 3s... Ugh, this has been such a headache.

Nobody at EVGA could figure it out and I mean NOBODY, turns out it was the cores.

Thank god that nightmare is over.

So, I upgraded everyone to the P67 2600K via the Asus Sabertooth...

Hopefully the 590's, especially in SLI, will run just as well...I'm waiting to see what happens.

In my woes with these motherboards, I ended up talking to upper level tech about the voltage issues and I believe Cain is correct...

One thing I was told was that, more testing needed to be done (ahem) and that we should expect something "soon" to remedy the current issues or, if you choose to do it yourself, do it yourself.

We "did it" with 2 of our cards and I must say, I see a big difference. HOWEVER, I do see some issues, because certain strings are still in existence and I personally see them as limiting factors.

"We" have yet to determine what the strings are acting on behalf of or vice versa but, IMHO they do exist and are very present...

So tweak at your own risk but, I do see a performance increase.

+Rep to Mr. Cain.


----------



## ReignsOfPower

Ah nice, sounds interesting. Would love to see it officially released soon. Even if it takes its time, we can just refer to mister Raging for a how-to (assuming he is kind enough to grace us with one)

What kind of performance increases are we talking about? Less aggressive OCP or just more consistent frame rates?


----------



## Masked

Quote:



Originally Posted by *ReignsOfPower*


Ah nice, sounds interesting. Would love to see it officially released soon. Even if it takes its time, we can just refer to mister Raging for a how-to (assuming he is kind enough to grace us with one)

What kind of performance increases are we talking about? Less aggressive OCP or just more consistent frame rates?


Well, since day 1 I said that my belief was that it was the OCP.

I still stand by this.

To me, after editing drivers for the past 7 years, I see some new "things".

Among them, I see links between the driver and BIOS. I see some limits set, and if one were to edit the BIOS, I believe whatever those limiting factors are still exist even though the values have changed.

It's almost like changing your multiplier...Even though you've changed the base, certain limiting factors still exist.

In OUR runs, I saw some issues, some performance decreases, etc etc etc.

That may be a circumstance of our chips/motherboards so, as I said earlier, I'll wait to pass judgement.

I do believe it would benefit the community to make a how-to and provide it on both venues BUT, I have heard from EVGA that they "may//will" be doing something soon to [correct] the issues.

I would say at the end of the day, we will have the ability to overclock without being limited at the low end.

Perhaps they'll allow a max of 1.25V or 1.3V, as we have all noticed a +/- 0.2 difference in the cores.

IMO it will happen eventually but, keep in mind this LOCK exists because of people OCing in the first place.


----------



## RagingCain

Quote:



Originally Posted by *Masked*


My office just ditched all our X58's.

I had ordered 6x 980x's every core was a "dud" and fried out the classified 3's...Ugh, this has been such a headache.

Nobody at EVGA could figure it out and I mean NOBODY, turns out it was the cores.

Thank god that nightmare is over.

So, I upgraded everyone up to the P67 2600k via the Asus Sabertooth...

Hopefully the 590's, especially in SLI, will run just as well...I'm waiting to see what happens.

In my woes with these motherboards, I ended up talking to upper level tech about the voltage issues and I believe Cain is correct...

One thing I was told was that, more testing needed to be done (ahem) and that we should expect something "soon" to remedy the current issues or, if you choose to do it yourself, do it yourself.

We "did it" with 2 of our cards and I must say, I see a big difference HOWEVER, I do see some issues because certain strings are still in existance and I personally see them as limiting factors.

"We" have yet to determine what the strings are acting on behalf of or vice versa but, IMHO they do exist and are very present...

So tweak at your own risk but, I do see a performance increase.

+Rep to Mr. Cain.


You didn't have to, I just wanted to help people.







Glad you are getting things setup.

Does this include the memory disappearing issue you reported?

If EVGA does not fix this specific issue, I will take matters into my own hands and test out a fix. I would be glad to aid anybody who wants their own personal BIOS tweaked/fixed. I will not remove the power-draw limit, and I will not adjust clocks above stock; if you want something like that privately, you can just PM me. Too many youngins get on here and just brick their stuff, so I will only publicly provide a fix. They can't deny a warranty like this anyway, as it's technically set to this value in the BIOS.

I have faith in EVGA; from my understanding, stuff like this can't be rushed. I would still like to know what's going on, of course.

As far as power draw goes, I have not tested it, but it appears to be easily overridden. Although I feel that even on water this is not a wise decision. I may just keep benchmarking Heaven until 3DMark11 is no longer on the list.

The driver OCP is the real issue. Anything on that list essentially cripples an overclock over 725 at 0.925/0.938V.

I still need to find a way for people using 270.51/270.61 to "set" the voltage accordingly... I am still thinking on that. Hard to test, though; I have not had a computer since Thursday.... the case is taking toooooooo long to get here.


----------



## Masked

Quote:



Originally Posted by *RagingCain*


You didn't have to, I just wanted to help people.







Glad you are getting things setup.

Does this include the memory disappearing issue you reported?

If EVGA do not fix this specific issue, I will take matters into my own hands and test out a fix. I would be glad to aid anybody who wants their own personal bios tweaked/fixed. I will not remove power draw, and I will not adjust clocks above stock. If you want something like that privately, you can just PM me. Too many youngins get on here and just brick their stuff. So I will only publicly provide a fix. They can't deny a warranty like this anyways as its set to this value technically in BIOS.

I have faith in EVGA, from my understanding, stuff like this can't be rushed. I still would like to know whats going on of course.

As far as PowerDraw, its appears, I have not tested it, but it appears to be easily over-ridden. Although I feel that even on water, this is not a wise decision. I may just keep benchmarking Heaven, until 3DMark11 is no longer on the list.

The driver OCP is the issue really. Anything on that list is essentially crippling an overclock over 725 0.925/0.938v.

I still need to find away for people using 270.51/270.61 to "set" the voltage accordingly... I am still thinking on that. Hard to test though. I have not had a computer since Thursday.... case is taking toooooooo long to get here.


Yes, upper level support and I spent HOURS on the phone trying to figure out the "issues" considering the 980's were brand new.

Jacob chimed in and RMA'd my principal board, blaming a bent pin; however, 6/6 boards replicating the same exact problems via the memory and PCIe slots? I chose to just move on.

When I swapped back to the 950s, MOST of the issues went away, so I returned the 980s to that vendor and was swiftly offered a set of Sabertooths, so I bit... Not the BEST board out there, but I believe it will fit our needs well.

That being said, I agree there are still issues with the power draw...That's something Jacob and I talked a little about but, I really didn't go into detail because the motherboard issue was a bit more pressing.

I believe EVGA will handle it...They're good people...While I do believe this is more of a Nvidia issue, I think EVGA will take charge considering.

I'll look into the voltage issue...It may be something as simple as a vlock=true...Hrm.


----------



## capchaos

New case, nice. What did you get? I just ordered a Mountain Mods Ascension today. Needed more room for the EVGA 4-way SLI board that I just got.
Quote:


> Originally Posted by *RagingCain;13390458*
> You didn't have to, I just wanted to help people.
> 
> 
> 
> 
> 
> 
> 
> 
> Glad you are getting things setup.
> 
> Does this include the memory disappearing issue you reported?
> 
> If EVGA do not fix this specific issue, I will take matters into my own hands and test out a fix. I would be glad to aid anybody who wants their own personal bios tweaked/fixed. I will not remove power draw, and I will not adjust clocks above stock. If you want something like that privately, you can just PM me. Too many youngins get on here and just brick their stuff. So I will only publicly provide a fix. They can't deny a warranty like this anyways as its set to this value technically in BIOS.
> 
> I have faith in EVGA, from my understanding, stuff like this can't be rushed. I still would like to know whats going on of course.
> 
> As far as PowerDraw, its appears, I have not tested it, but it appears to be easily over-ridden. Although I feel that even on water, this is not a wise decision. I may just keep benchmarking Heaven, until 3DMark11 is no longer on the list.
> 
> The driver OCP is the issue really. Anything on that list is essentially crippling an overclock over 725 0.925/0.938v.
> 
> I still need to find away for people using 270.51/270.61 to "set" the voltage accordingly... I am still thinking on that. Hard to test though. I have not had a computer since Thursday.... case is taking toooooooo long to get here.


----------



## RagingCain

Quote:


> Originally Posted by *capchaos;13396635*
> New case, nice. What did you get? I just ordered a Mountain Mods Ascension today. Needed more room for the EVGA 4-way SLI board that I just got.


I bought a custom M10 from CaseLabs, through a vendor here on OCN. I don't know if it's as big as the Ascension, but it won me over so easily









Sent from my DROID2 using Tapatalk


----------



## capchaos

I needed something big enough to fit my MO-RA 2 rad in. Still have not got my other 590 sold yet. Well, it sold, but the guy hasn't paid me yet, so tomorrow I will open an unpaid item case. Haven't decided if I will re-list or not; might just tear down the PC again and put it back in.


----------



## sarojz

So am I in the EVGA GTX 590 club or what? Do I need to wait for Alatar to give the go-ahead? Still waiting...


----------



## RagingCain

Well Sarojz, I can add you to the spreadsheet, but we are collecting data from users, so we need a screenshot of your clocks with GPU-Z open showing that the card is installed, its speed, drivers, etc.

@All
Got a PM from Jacob today, late in the day; he wants to give me a BIOS to try out. I will see if I can share it. It's no good to me until Monday/Tuesday anyway, since I don't have a computer.


----------



## sarojz

I've already posted all that in a previous post a day or so ago.


----------



## Alatar

Quote:



Originally Posted by *sarojz*


So am I in the EVGA GTX 590 club or what? Do I need to wait for Alatar to give the go-ahead? Still waiting...










sorry bout that, sometimes I miss stuff

updating now.


----------



## mistkron

I would like to say thank you to RagingCain for providing valuable information. I read through this whole thread while researching my GTX 590 (I am a proud owner of a new GTX 590), and it did take me quite a while to read it all.

Please update us on the new BIOS once you can get your hands on your computer.

On a side note, I just can't help but comment a little about Masked. I understand you might be a high-profile guy from Alienware, but throughout this whole thread I have seen nothing from you to back up the claims you made, and the fact that you kept pulling high-profile contacts into every one of your replies irks me. People don't need to know who you are meeting or where your boss is heading for a top-secret meeting with nvidia or [insert company], yada yada. People just wanted to see some factual information from you. Oh, and don't pull out of a club just because you are throwing a hissy fit. -_- gosh

Then again, I noticed you have already apologized to RagingCain, but I am still amazed how easily RagingCain forgives after the number of disrespectful replies you gave in this thread while he was trying to educate and give proper solutions to 590 owners. Well, sometimes it must be the ego-demons at work.

@ Alatar , will be posting my screenshots soon so that I can be in the club as well.


----------



## Juggalo23451

Quote:



Originally Posted by *capchaos*


I needed something big enough to fit my mo-ra 2 rad in. Sill have not got my other 590 sold yet. Well it sold but the guy hasnt paid me yet. So tmarrow i will open an unpaid item case. haven't decided if I will re list or not. might just tear down the pc again and put it back in.


The TH10 can fit four 480 rads in push/pull with room to spare; check my sig, I have a review of the case. I am using the 4-way Classified as well. If you haven't paid yet, CaseLabs would be a better buy.


----------



## sarojz

@Alatar:

Thanks for replying back. Looking at the screenshots I uploaded, I just realized they only depict stock settings and not the OC'd ones I claimed. I will upload more convincing ones in a little bit. Can you please update my 'Proof' link as soon as I post the images? Thanks


----------



## capchaos

Better buy, maybe so. But yeah, I already paid and configured the case exactly how I want it.

Quote:



Originally Posted by *Juggalo23451*


The TH10 can fit four 480 rads in push/pull with room to spare; check my sig, I have a review of the case. I am using the 4-way Classified as well. If you haven't paid yet, CaseLabs would be a better buy.


----------



## Alatar

Quote:



Originally Posted by *sarojz*


@Alatar:

Thanks for replying back. Looking at the screenshots I uploaded, I just realized they only depict stock settings and not the OC'd ones I claimed. I will upload more convincing ones in a little bit. Can you please update my "Proof" link as soon as I post the images? Thanks


Yeah I'll update it when I can


----------



## sarojz

OK, this should be readable now. Here's an image that shows my card and specs.


----------



## Masked

Quote:



Originally Posted by *mistkron*


On a side note, I just can't help but to comment a little about Masked, I understand you might/could be a high profile guy from Alienware but throughout this whole thread I have seen nothing from you to back up the claims you had and the fact that you kept pulling in high profile contacts in every of your replies irks me. People don't need to know who you are meeting or where your boss is heading to next for a top secret meeting with nvidia or [ insert company ] yada yada. People just wanted to see some factual informations from you. Oh and don't pull off from a club just because you are throwing a hissy fit. -_- gosh


Actually...I've had all the 590's for sale, VIA a vendor, for a couple weeks, which is actually why I removed myself.

I sold our principal stock... I consider being part of a community a position of honor, thus I removed myself; there was no hissy fit... And I more/less nerd rage when I QQ, btw.

I'm also not in a position to provide data... Right now I'm under 15 NDAs, practically 1 NDA for every icon on this screen, AND my OS is under NDA atm... So by providing you guys with any information, I break MULTIPLE disclosure agreements with every single attempt.

Which I did and was reprimanded for, once already.

It wasn't because of a hissy fit; more/less I've been having compatibility issues out of my ass. I require 100% reliability, and when it's not there, sorry, but that's the end of the line for this office.

With the discovery that our boards have...problems...I decided to give what's left of our personal stock, a second chance; thus I returned...

We'll see what happens but, my providing data = no no.


----------



## Alatar

Quote:



Originally Posted by *sarojz*


OK, this should be readable now. Here's an image that shows my card and specs.

Mind giving a validation link or a screenshot with your OCN name in it?


----------



## RagingCain

Quote:



Originally Posted by *mistkron*


Then again I noticed you have already apologized to RagingCains but still I am amazed how easy RagingCain forgives after the number of disrespectful replies you gave in this whole thread while he is trying to educate and giving proper solution to 590s owner. Well sometimes it must be the ego-demons at work.

@Alatar, I will be posting my screenshots soon so that I can be in the club as well.


It's really no big deal; if I did get upset and talked some trash, it really wouldn't be right or in my character. It would also show that I was full of crap when I said I wanted to help people, namely our little group, which is very true. The fact that EVGA has adjusted the BIOS has led me to believe there really is something wrong. That, and the fact that other people have confirmed it worked for them too, gives it a bit of strength.

However, I actually have a confession to make. As far as this being an issue goes, I think there is enough evidence going around from various people to prove a bit of a hiccup, but....

I, on the other hand..... had a major PSU snafu. Apparently my primary GPU had one of its 6+2 pins semi-disconnected at the PSU. All the plugs were in the GPU, so all 8 pins, but at the PSU only 6 of the 6+2 pins were actually plugged in. It's an easy mistake to make with the amount of modularized cables and cable management I had to deal with, but that's pretty bad. Now, this was only 1 of my 3 (6+2) connectors, but it could easily explain why I was having the power outages. It could even lead to a GPU having errors. The fact that the errors stopped after increasing GPU2's voltage by 13mV suggests there was indeed another legitimate issue, though. On the flip side, I found it pretty awesome that the card ran essentially on a 6-pin and a 6+2 pin all the way up to 830 MHz. I think that's a pretty cool snafu.

Here is my 2nd confession (I sound like Usher now haha): guess who was using a 16x slot with an 8x slot for his 590s? This guy. I only realized it after a complete loop tear-down. We don't officially know how much difference 8x vs. 16x makes for a GTX 590, but we know for a single GTX 580 it's about ~1%. The reason I didn't think anything of the slots I was using is that I had been using 2x GTX 580s in the same slots for four months. Re-using the same perfectly cut water tubing for SLI left me in a diminished capacity, so I didn't notice the big 8x on my Classified's third PCI-E slot. Hopefully when I get back on the horse my benchmarks will be easier to run with the correct power (new PSU too) and 16x/16x, giving me some more numbers to destroy the 2x 6990 score that's beating me.


----------



## rush2049

Lol, well RagingCain. It seems these cards are enough to break anyone's build. They are very good at pointing out weak spots in the rest of your computer. I myself had to replace a little over half my computer.....

HEY NVIDIA, if you read these forums, please get your engineers to remove the artificial limitations in the bios and driver, ok? Thanks!


----------



## krazyatom

hey guys,
Is it overkill to buy a GTX 590 for World of Warcraft?
Should I buy another GTX 580 for SLI?
If you look at the picture below, a single GTX 580 at 2560 x 1600 is barely above 60 fps. I don't think it's future proof, so I need some advice.









By krazyatom at 2011-04-21


----------



## Arizonian

Quote:



Originally Posted by *krazyatom*


hey guys,
Is it overkill to buy a GTX 590 for World of Warcraft?
Should I buy another GTX 580 for SLI?
If you look at the picture below, a single GTX 580 at 2560 x 1600 is barely above 60 fps. I don't think it's future proof, so I need some advice.


It's overkill for one GTX 590, and even more so for GTX 580 SLI. WoW doesn't require more than a GTX 570 at maxed settings.

However, on OCN overkill is usually the norm.

If you're itching for an upgrade regardless, then you're halfway there; just SLI the GTX 580.

You'll be set for years before graphics-intensive games catch up with the pair, and even longer for WoW.


----------



## sarojz

Quote:


> Originally Posted by *Alatar;13402644*
> mind giving a validation link or a screen with your OCN name in it?


Ummmmm, will this do? I'm sorry, I'm new here, so I'm trying to comply.


----------



## Alatar

Quote:


> Originally Posted by *sarojz;13414595*
> Ummmmm, will this do? I'm sorry, I'm new here, so I'm trying to comply.


You can get a validation link from the GPU-Z program itself. After clicking validate, set your name to your OCN name and post the link you get here.









Or, alternatively, take a screenshot with those clocks and have a Notepad window open with your OCN name.

I would much rather use that as proof.

Sorry if you have to go through some trouble. I'll update your clocks and remove the question mark, but the validation link would still be nice for proof.


----------



## Masked

Quote:


> Originally Posted by *Arizonian;13413631*
> It's overkill for one GTX 590, and even more so for GTX 580 SLI. WoW doesn't require more than a GTX 570 at maxed settings.
> 
> However, on OCN overkill is usually the norm.
> 
> If you're itching for an upgrade regardless, then you're halfway there; just SLI the GTX 580.
> 
> You'll be set for years before graphics-intensive games catch up with the pair, and even longer for WoW.


WoW framerates are capped.

Buying another card is NOT going to change that, because of the cap.


----------



## Arizonian

Quote:


> Originally Posted by *Masked;13415727*
> WoW framerates are capped.
> 
> Buying another card is NOT going to change that, because of the cap.


I know; that's why I said the 570 was more than enough and he didn't have to SLI the card or get a 590.

I was implying that if he absolutely had to do some sort of upgrade just because he had an upgrade itch, the better route was to get a second 580, since he already has one, over buying a new 590 and having to sell his 580.


----------



## RagingCain

Quote:


> Originally Posted by *krazyatom;13413539*
> hey guys,
> Is it overkill to buy a GTX 590 for World of Warcraft?
> Should I buy another GTX 580 for SLI?
> If you look at the picture below, a single GTX 580 at 2560 x 1600 is barely above 60 fps. I don't think it's future proof, so I need some advice.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> By krazyatom at 2011-04-21


Yeah, although I disagree with the frame cap being sixty; if I recall it's 300. But SLI actually performs worse than single cards in WoW. It's Fermi and GPU usage that's the problem. The 5870 had the same issue for me, in single, double, and tri setups.

WoW is a terrible game to consider for "future proofing". Although the proper DX11 code is supposed to rectify this, I can't confirm that for you right now, and people say it's still buggy.

I will say that SLI 580s run quite a few games @ 60+ fps, most DX9/DX10 games at 120+. The only games you'd struggle with are Metro 2033, Crysis/Warhead, and DA2 (poorly optimized for nVidia). Having a 590 is a lot more fun in my opinion; it's more of a challenge to get working optimally, ahaha.

The GTX 580 SLI is a pimp setup; tri/quad is really just icing on the cake. GTX 580 SLI is the best setup you can get for 1920x1200, hands down. The only trouble is the VRAM wall at about 2560x1600 with 4xAA or multi-monitor. If you are okay being adaptive with settings in some games, it won't bother you.

Option 3: sell your current card and pick up a pair of 3GB 580s for real future proofing. Or just get 3x 6970s for your resolution.









Although I think we all agree here: most of us are biased toward the GTX 590, single and SLI. It's, in my opinion, the underdog that everyone has written off, and it's fun to let it shine. Nothing like a little: "Oh really? ::benchmark::"

Sent from my DROID2 using Tapatalk


----------



## krazyatom

Quote:


> Originally Posted by *Masked;13415727*
> WoW framerates are capped.
> 
> Buying another card is NOT going to change that, because of the cap.


I don't think a single GTX 580 is enough for me at 2560 x 1600.
Quote:


> Originally Posted by *RagingCain;13416475*
> Yeah, although I disagree with the frame cap being sixty; if I recall it's 300. But SLI actually performs worse than single cards in WoW. It's Fermi and GPU usage that's the problem. The 5870 had the same issue for me, in single, double, and tri setups.
> 
> WoW is a terrible game to consider for "future proofing". Although the proper DX11 code is supposed to rectify this, I can't confirm that for you right now, and people say it's still buggy.
> 
> I will say that SLI 580s run quite a few games @ 60+ fps, most DX9/DX10 games at 120+. The only games you'd struggle with are Metro 2033, Crysis/Warhead, and DA2 (poorly optimized for nVidia). Having a 590 is a lot more fun in my opinion; it's more of a challenge to get working optimally, ahaha.
> 
> The GTX 580 SLI is a pimp setup; tri/quad is really just icing on the cake. GTX 580 SLI is the best setup you can get for 1920x1200, hands down. The only trouble is the VRAM wall at about 2560x1600 with 4xAA or multi-monitor. If you are okay being adaptive with settings in some games, it won't bother you.
> 
> Option 3: sell your current card and pick up a pair of 3GB 580s for real future proofing. Or just get 3x 6970s for your resolution.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Although I think we all agree here: most of us are biased toward the GTX 590, single and SLI. It's, in my opinion, the underdog that everyone has written off, and it's fun to let it shine. Nothing like a little: "Oh really? ::benchmark::"
> 
> Sent from my DROID2 using Tapatalk


Thanks for your comment. I might consider getting an EVGA GTX 580 3GB, but the ASUS GTX 590 is just $100 more









I would like to pay the $100 more and get the GTX 590 lol. What do you think?


----------



## Masked

Quote:


> Originally Posted by *krazyatom;13419039*
> I don't think a single GTX 580 is enough for me at 2560 x 1600.
> 
> Thanks for your comment. I might consider getting an EVGA GTX 580 3GB, but the ASUS GTX 590 is just $100 more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I would like to pay the $100 more and get the GTX 590 lol. What do you think?


Regardless of your resolution, WoW frames are capped, period.

At that resolution, if your primary game was WoW, you'd be more than fine on a GTX 570, as was said before...A 580 is a bit of overkill, but a 590 would be extreme overkill.


----------



## RagingCain

Quote:


> Originally Posted by *krazyatom;13419039*
> I don't think a single GTX 580 is enough for me at 2560 x 1600.
> 
> Thanks for your comment. I might consider getting an EVGA GTX 580 3GB, but the ASUS GTX 590 is just $100 more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I would like to pay the $100 more and get the GTX 590 lol. What do you think?


You know you want to









It's your money/your call. You have a GTX 580 now, so you will see little to no improvement just switching it for a 3GB card. You will probably see +40~50% performance with a single GTX 590 in most games. WoW is a wild card and doesn't follow the normal conventions of 3D performance, as it is heavy on CPU, heavy on network traffic, and heavy on data/system communication, etc. HDDs have been known to hold back WoW, but nothing beats faster GPUs/CPUs.

Whichever route you go, try putting WoW on one of your SSDs; that can help quite a bit with the FPS rising and dropping.


----------



## krazyatom

Ok, just pre-ordered an ASUS GTX 590 lol.
I will let you guys know how it compares to my current GTX 580.


----------



## Masked

Quote:


> Originally Posted by *krazyatom;13419855*
> Ok, just pre-ordered an ASUS GTX 590 lol.
> I will let you guys know how it compares to my current GTX 580.


I wouldn't do that.

My understanding is that the Asus releases are finished and complete.

I'd scope out EVGA's page 2-3 times a day and wait.

My sources/vendors tell me that OUR stock has actually extended their "staggered" release; that's how many are left.

Your best bet, IMO, is EVGA or eBay.


----------



## RagingCain

Quote:


> Originally Posted by *krazyatom;13419855*
> Ok, just pre-ordered asus gtx 590 lol.
> I will let you guys know how it's compared to my current gtx 580.


I am sure many people would appreciate it; try showing a comparison with FRAPS or MSI Afterburner displaying FPS, memory usage, etc.

A single 580 vs. the 590, ya know?
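One cheap way to put numbers on that 580-vs-590 comparison is to boil each run's frametime log down to average and worst-case fps. A minimal sketch, assuming a hypothetical frametimes.csv with one cumulative millisecond timestamp per line (the real FRAPS CSV layout differs, so treat the parsing as illustrative):

```shell
#!/bin/sh
# Build a toy log: ~60 fps pacing with one 50 ms hitch in the middle.
printf '%s\n' 0 16.7 33.4 83.4 100.1 > frametimes.csv

# Reduce cumulative timestamps to average fps and worst single-frame fps.
stats=$(awk '
NR == 1 { first = $1 }
NR > 1 {
    d = $1 - prev            # per-frame delta in ms
    if (d > maxd) maxd = d
}
{ prev = $1; last = $1; n++ }
END {
    printf "avg fps: %.1f\n", 1000 * (n - 1) / (last - first)
    printf "min fps: %.1f\n", 1000 / maxd
}' frametimes.csv)

echo "$stats"
```

For the toy log above this prints avg fps: 40.0 and min fps: 20.0; run it once per card on the same scene and the comparison is two numbers instead of a wall of screenshots.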


----------



## Arizonian

Quote:



Originally Posted by *RagingCain*


I am sure many people would appreciate it, try showing a comparison with FRAPS or MSI Afterburner displaying FPS memory usage etc.

A single 580 vs. 590 ya know?


Two downclocked GTX 580s vs. a single GTX 580 is not a fair comparison. Comparing it to the AMD 5970 or AMD 6990 would be more fair.


----------



## Masked

Quote:



Originally Posted by *Arizonian*


Two downclocked GTX 580s vs. a single GTX 580 is not a fair comparison. Comparing it to the AMD 5970 or AMD 6990 would be more fair.


I believe Cain means more of a base-to-base comparison...

I still don't personally believe the 590 is "ready" for a head-to-head, all things considered...Especially with the voltage issues, but to each their own.

Still no word on a BIOS update going public, I assume, Mr. Cain? There has been none here.


----------



## RagingCain

Quote:



Originally Posted by *Arizonian*


Two downclocked GTX 580s vs. a single GTX 580 is not a fair comparison. Comparing it to the AMD 5970 or AMD 6990 would be more fair.


Yeah, it's just for anybody else who plays WoW and is switching from a 580 to a 590, I guess. It's a lot easier to make a purchase (or not) with data available.

Well, EVGA sent me the BIOS Friday, but also said I should not share it; they want me to test it, and if it's good they will make an official release







I of course have a mountain of parts right now and no working system.

I am of course very adamant about doing the right thing, and if the BIOS update snafus any of your cards I will feel personally responsible.... Of course, I would also potentially be fixing your errors a week or two in advance...

I guess send me PMs stating you won't be mad at me if it kablooeys, or hold me responsible... If you PM me with 2 posts and a join date of... today, no way in hell. I will only consider people who are familiar with PC internals.

I don't know if this will bold:
*I WILL NOT DO IT FOR REP!*

It's a Windows utility flash app too, not nvflash, so I assume that's why it took so long to make.

Sent from my DROID2 using Tapatalk


----------



## Arizonian

Quote:



Originally Posted by *RagingCain*


Yeah, it's just for anybody else who plays WoW and is switching from a 580 to a 590, I guess. It's a lot easier to make a purchase (or not) with data available.

Well, EVGA sent me the BIOS Friday, but also said I should not share it; they want me to test it, and if it's good they will make an official release







I of course have a mountain of parts right now and no working system.

I am of course very adamant about doing the right thing, and if the BIOS update snafus any of your cards I will feel personally responsible.... Of course, I would also potentially be fixing your errors a week or two in advance...

I guess send me PMs stating you won't be mad at me if it kablooeys, or hold me responsible... If you PM me with 2 posts and a join date of... today, no way in hell. I will only consider people who are familiar with PC internals.

I don't know if this will bold:
*I WILL NOT DO IT FOR REP!*

It's a Windows utility flash app too, not nvflash, so I assume that's why it took so long to make.

Sent from my DROID2 using Tapatalk


Duly noted.


----------



## ReignsOfPower

Quote:


> Originally Posted by *RagingCain;13421785*
> Yeah, it's just for anybody else who plays WoW and is switching from a 580 to a 590, I guess. It's a lot easier to make a purchase (or not) with data available.
> 
> Well, EVGA sent me the BIOS Friday, but also said I should not share it; they want me to test it, and if it's good they will make an official release
> 
> 
> 
> 
> 
> 
> 
> I of course have a mountain of parts right now and no working system.
> 
> I am of course very adamant about doing the right thing, and if the BIOS update snafus any of your cards I will feel personally responsible.... Of course, I would also potentially be fixing your errors a week or two in advance...
> 
> I guess send me PMs stating you won't be mad at me if it kablooeys, or hold me responsible... If you PM me with 2 posts and a join date of... today, no way in hell. I will only consider people who are familiar with PC internals.
> 
> I don't know if this will bold:
> *I WILL NOT DO IT FOR REP!*
> 
> It's a Windows utility flash app too, not nvflash, so I assume that's why it took so long to make.
> 
> Sent from my DROID2 using Tapatalk


Sounds good buddy!


----------



## RagingCain

EVGA has released their GTX 580 3GBs.... I so wanted 3 of them.... deeeeeeeep sigh....

http://www.evga.com/products/moreInfo.asp?pn=03G-P3-1584-AR&family=GeForce%20500%20Series%20Family&sw=

Pleasantly surprised nobody has PMed me for the BIOS, keeping me safe and sound


----------



## Alatar

Quote:


> Originally Posted by *RagingCain;13428306*
> EVGA has released their GTX 580 3GBs.... I so wanted 3 of them.... deeeeeeeep sigh....


Reference PCB?


----------



## RagingCain

Quote:


> Originally Posted by *Alatar;13428661*
> Reference PCB?


Yeppppppp, just double capacity/density memory clusters/chips. That means they are waterblock friendly.

Don't get me wrong, I am happy with the GTX 590s, but I am in a position to not only utilize more than 1.5GB but actually benefit from it.

For now, I will just have to switch from Surround to single monitor on the more intense games.


----------



## badatgames18

Do any of you guys experience really bad coil whine when benchmarking or stress testing? I am afraid mine will blow a capacitor or something. It's at stock and I haven't touched the voltage.


----------



## RagingCain

Quote:


> Originally Posted by *badatgames18;13428694*
> Do any of you guys experience really bad coil whine when benchmarking or stress testing? I am afraid mine will blow a capacitor or something. It's at stock and I haven't touched the voltage.


Coming from the GTX 590 or the PSU? A lot of the time it's from the PSU under stress.

I don't know of anybody who has mentioned coil whine on the 590s. At least not yet; it's not like there are thousands of them out there, and there are only a handful here on OCN, so it is possible.

Just no noise here, and I have water cooling, so it technically would be easier to hear something like that.


----------



## badatgames18

hmm.. I'm not sure where it's coming from; I'll check. But I am using a Seasonic X-850, so it's probably not that... might it be coming from something else? The rig I am using it with is in my profile somewhere

EDIT: switched to the sig rig I am using the 590 in


----------



## RagingCain

Quote:


> Originally Posted by *badatgames18;13428740*
> hmm.. I'm not sure where it's coming from; I'll check. But I am using a Seasonic X-850, so it's probably not that... might it be coming from something else? The rig I am using it with is in my profile somewhere
> 
> EDIT: switched to the sig rig I am using the 590 in


I am going to go out on a limb and say I don't believe it's normal, but many people do experience coil whine from GPUs or PSUs and never have an issue. Other than that, it would drive me so freaking crazy I might throw it at a wall.

If you have another GPU, give it a try. If it's the PSU (on the off chance a SeaSonic might be failing), it would be very, very good to catch it now... before it decides to commit hara-kiri with all its friends.


----------



## badatgames18

Quote:


> Originally Posted by *RagingCain;13428948*
> I am going to go out on a limb and say I don't believe it's normal, but many people do experience coil whine from GPUs or PSUs and never have an issue. Other than that, it would drive me so freaking crazy I might throw it at a wall.
> 
> If you have another GPU, give it a try. If it's the PSU (on the off chance a SeaSonic might be failing), it would be very, very good to catch it now... before it decides to commit hara-kiri with all its friends.


Ok, I'll try another GPU... just got it yesterday and started playing with it today... I hope nothing is broken or defective.


----------



## RagingCain

Quote:


> Originally Posted by *badatgames18;13428984*
> Ok, I'll try another GPU... just got it yesterday and started playing with it today... I hope nothing is broken or defective.


One of my launch 590s was DOA; it happens









It really sucked because I had card numbers 47 and 48. Serial twins










I decided to RMA them both, though, and got a fresh pair in return; serial twins again, 144 and 145.


----------



## sarojz

Question: I've successfully overclocked my GTX 590 and it's stable at its current settings, but I can't figure out why, every time I reboot my system for any reason, the GPU falls back to stock settings. What am I missing?


----------



## badatgames18

If you are using MSI Afterburner, click the option to apply the overclock at startup...

That way you won't have to keep reapplying it, but I'd recommend keeping it at stock for web browsing and general non-intensive activities.

Update on the coil whine: I don't know what it was, but I RMA'ed my 590. I hope I get it back within the week.


----------



## sarojz

Sorry, I forgot to mention I am using EVGA's Precision/tester app. I'm an avid PC gamer and enthusiast, so the ability to OC is important.


----------



## badatgames18

Well, I haven't used Precision before, but I take it it should be the same thing as the MSI tool, just with a different GUI...

There should be an option you can choose to automatically apply your settings on boot. At least Afterburner has this option; Precision should too.

EDIT: I haven't heard good things about Precision... I'd recommend switching software and going with Afterburner...

The latest beta is 2.2 beta 2.


----------



## badatgames18

Here is an example; I circled it:
View attachment 209648


----------



## Masked

Quote:


> Originally Posted by *RagingCain;13428306*
> EVGA have released their GTX 580 3Gbs.... I so wanted 3 of them.... deeeeeeeep sigh....
> 
> http://www.evga.com/products/moreInfo.asp?pn=03G-P3-1584-AR&family=GeForce%20500%20Series%20Family&sw=
> 
> Pleasantly surprised nobody has PMed me for the BIOS, keeping me safe and sound


Any update yet?

I heard they had a couple of guys there all weekend, but that's hearsay, obviously...

I really don't want to screw with these cards until it's released by them...BLEH.


----------



## Razi3l

I had one.. but it's dead lol. I'd join once I get the replacement, but I don't have a rig.. ;D


----------



## RagingCain

Quote:


> Originally Posted by *Razi3l;13439212*
> I had one.. but it's dead lol. I'd join once I get the replacement, but I don't have a rig.. ;D


Lol Raz, I thought you were more AMD/ATi?

Happened to me too when I first got them. The guy almost convinced me it was my PSU... almost -.-

@Masked,
Sorry, haven't installed it yet; still no computer. Coming to you live from a netbook.

The Tower of Power should be here in the next hour or so....







<-excited eek.

You guys are more than welcome to visit the build log. My case is very similar to Juggalo's, but with less e-peen and more pew pew


----------



## Masked

Quote:


> Originally Posted by *RagingCain;13439907*
> Lol Raz, I thought you were more AMD/ATi?
> 
> Happened to me too when I first got them. The guy almost convinced me it was my PSU... almost -.-
> 
> @Masked,
> Sorry, haven't installed it yet; still no computer. Coming to you live from a netbook.
> 
> The Tower of Power should be here in the next hour or so....
> 
> 
> 
> 
> 
> 
> 
> <-excited eek.
> 
> You guys are more than welcome to visit the build log. My case is very similar to Juggalo's, but with less e-peen and more pew pew


I'm in the process of doing the same with our entire office...Mine comes last









I have a Sabertooth P67 sitting here for my PC, but with EVGA's ETA being this/next week, it has me teething.

The Sabertooth does offer a 5-year warranty, though, so I've been back and forth...Can't decide.

A 2600K A-batch is even sitting on my desk









Rawr.

On the 590, I'm primarily waiting on EVGA/NVIDIA because of the OCP or whatever is limiting it...It will be interesting to see how they edit the BIOS/drivers...Especially if they're as connected as I believe they are...


----------



## Razi3l

Quote:



Originally Posted by *RagingCain*


Lol Raz, I thought you were more AMD/ATi?

Happened to me too, when I first got them. Guy almost convinced me it was my PSU... almost -.-


Hey.. I like both sides of the fence. NVIDIA has really sucked imo this time around (590 vs 6990), but I hope the next series doesn't disappoint me so much. The real problem is the prices in the UK compared to the Radeons, which are equal or faster, and cheaper. The GTX 400 series is now really cheap and the newer 500s aren't. The same could be said of the AMD cards, but NVIDIA pricing still sucks. And I read that "It's your PSU" post you made back then.


----------



## Masked

Quote:


> Originally Posted by *Razi3l;13440143*
> Hey.. I like both sides of the fence. NVIDIA has really sucked imo this time around (590 vs 6990), but I hope the next series doesn't disappoint me so much. The real problem is the prices in the UK compared to the Radeons, which are equal or faster, and cheaper. The GTX 400 series is now really cheap and the newer 500s aren't. The same could be said of the AMD cards, but NVIDIA pricing still sucks. And I read that "It's your PSU" post you made back then.


I actually disagree with this.

Reasons are:

1) The 6990s had the same problems on release; maybe not the same "quality" issues, but the drivers sucked tremendously...I experienced them personally, as I have 2x XFX 6990s sitting to my right in their boxes.

2) 590s with unlocked/custom drivers etc. are actually out-performing 6990s...This is seen on Guru3D, 3DMark's latest scores...etc.

3) As the 590 matures, it's just going to surpass the 6990 at the retail level even more than some enthusiasts' cards already do...This is more of an observation, but as Cain and several others have noted, it's really already started down that road.


----------



## Razi3l

Quote:



Originally Posted by *Masked*


I actually disagree with this.

Reasons are:

1) The 6990s had the same problems on release; maybe not the same "quality" issues, but the drivers sucked tremendously...I experienced them personally, as I have 2x XFX 6990s sitting to my right in their boxes.

2) 590s with unlocked/custom drivers etc. are actually out-performing 6990s...This is seen on Guru3D, 3DMark's latest scores...etc.

3) As the 590 matures, it's just going to surpass the 6990 at the retail level even more than some enthusiasts' cards already do...This is more of an observation, but as Cain and several others have noted, it's really already started down that road.


I wasn't really talking about performance, I wouldn't care if it was (or is) slower than a 6990. It just sucks because overclocking isn't as "safe".. and whatnot. Kind of spoils the fun for me.


----------



## Masked

Quote:



Originally Posted by *Razi3l*


I wasn't really talking about performance, I wouldn't care if it was (or is) slower than a 6990. It just sucks because overclocking isn't as "safe".. and whatnot. Kind of spoils the fun for me.


Overclocking isn't "safe" because, from what I understand, 1000+ morons OC'd their cards past the safety barrier on release day and required RMAs.

This isn't really a situation where it's 100% their fault...I mean, you could say they could have used higher quality parts, which is true, but this "lock" is due MOSTLY to the public.

Had there not been so many RMAs in the first few weeks, I almost guarantee you we'd be able to overclock at our whimsy, but because of "us", we are not.

Which, if you've noticed, is slowly changing, driver to driver.

There are some other issues, true, but as this card matures it will certainly beat out the 6990.


----------



## RagingCain

I agree with Masked; I feel quite a few people ruined it for the majority. Plus the OCP/PDL, as it is, totally prevents the danger zone from occurring. The VRM voltage is a decent few mV below the GPU voltage anyway. All of the dying-GTX-590 stuff that has occurred has also been on standard heatsinks with the initial drivers, not on waterblocks (which suck to have at the moment due to the voltage locking) and newer drivers.

EVGA could count their RMAs out of the first 200 they sold on 6 fingers, and 2 were mine, lol. Taken with a grain of salt, of course; it's EVGA telling me this, not some third-party investigator, but still, they don't really have a reason to lie. They already had my business, and I am still able to return the cards.

But you don't really need to overclock them, and if you do, you are still able to. I would recommend a waterblock, though, to be safe.


----------



## RagingCain

Yay, the case is finally here! I will be in my build log thread if anybody needs me.

http://www.overclock.net/water-cooli...-nameless.html


----------



## hendrix84

Quote:



Originally Posted by *kazukun*


nvflash -5 -6 -i1 (bios).rom
nvflash -5 -6 -i2 (bios).rom
http://blue.ap.teacup.com/kazukun/104.html


Hello,

Can you give me the command to flash a single graphics card?
A tutorial that explains it would be appreciated, thank you.

Which nvflash version do you use?
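For reference on the quoted commands: kazukun's pair flashes the two GPUs of a single 590 in turn. A dry-run sketch of the same pattern for a one-adapter flash, heavily hedged — the flag meanings (-i<N> selects the adapter, -5/-6 skip PCI ID mismatch checks) are assumptions from common nvflash usage of the era, gtx590.rom is a hypothetical filename, and the script only prints the commands instead of running them:

```shell
#!/bin/sh
# Dry-run sketch: echo the nvflash invocations rather than executing them,
# since a scripted flash against live hardware would be reckless.
# Assumed flag meanings (not verified docs): -i<N> = adapter index,
# -5/-6 = override PCI ID mismatch checks. gtx590.rom is hypothetical.
BIOS="gtx590.rom"

plan_flash() {
    # List adapters first; a dual-GPU 590 shows up as two entries,
    # which is why kazukun's commands use -i1 and -i2.
    echo "nvflash --list"
    # A single card (or a single GPU on the board) is just index 0.
    echo "nvflash -5 -6 -i0 $BIOS"
}

plan_flash
```

Whichever nvflash build you use, list the adapters and back up the existing BIOS first (typically via something like nvflash --save old.rom) before writing anything; flashing the wrong adapter on a dual-GPU board is an easy way to brick it.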


----------



## Masked

Quote:



Originally Posted by *hendrix84*


Hello

You can give me the command to flash a single graphics card?
a tutorial that explains thank you.

what nvflash you use?


This is the primary reason why I'd wait for an "official release".

Rumor is there will be a fix coming VERY soon via either EVGA or Nvidia; I'd suggest waiting.

I'm not going to get into my issues with nvflash/end-user flashing right now, but I'd strongly urge some patience in this matter.

It won't be long; Jacob has given word to more than a few people that a fix will be out soon.


----------



## capchaos

Question for you guys: just closed the case on the fellow that was supposed to buy my 2nd GTX 590.
Trying to decide if I should re-list it or put it back in my PC. What are your opinions?
Thanks


----------



## RagingCain

Quote:


> Originally Posted by *capchaos;13457080*
> Question for you guys: just closed the case on the fellow that was supposed to buy my 2nd GTX 590.
> Trying to decide if I should re-list it or put it back in my PC. What are your opinions?
> Thanks


Well, you do have 4 GTX 580s, yes? I would sell both of the 590s, run that computer on some cheaper AMD cards, and get all your money back on the 580s


----------



## capchaos

Yup, sure do. Already re-listed it. Maybe the next guy will actually pay for it. I am keeping one, however, just for my Acer 27-inch 3D display for 3D Vision. It's rocking on my old AMD board with a Phenom II 965 BE @ 4.0; I was using both 590s for a bit with the SLI hack

Quote:



Originally Posted by *RagingCain*


Well, you do have 4 GTX 580s, yes? I would sell both of the 590s, run that computer on some cheaper AMD cards, and get all your money back on the 580s


----------



## Anthraxinsoup

I'd go with 3 6970s for like 965 or some ****. Best bet, but you've got money to blow.


----------



## Arizonian

Quote:



Originally Posted by *Anthraxinsoup*


I'd go with 3 6970s for like 965 or some ****. Best bet, but you've got money to blow.


This is the *Nvidia* GTX 590 Owners Club; on what thread did you think you were posting? If we want performance and have the money, why would anyone pick AMD and settle for less?


----------



## Anthraxinsoup

Quote:



Originally Posted by *Arizonian*


This is the *Nvidia* GTX 590 Owners Club; on what thread did you think you were posting? If we want performance and have the money, why would anyone pick AMD and settle for less?










I said that's the route if he wants to save money, and said that since he has the money, go Nvidia. Derp.


----------



## capchaos

True that.

Quote:



Originally Posted by *Arizonian*


This is the *Nvidia* GTX 590 Owners Club; on what thread did you think you were posting? If we want performance and have the money, why would anyone pick AMD and settle for less?


----------



## capchaos

I've been ATI before. Had dual 5970s; biggest pain in the ass ever. When they worked they were great. Had to wait months just for AMD to fix the drivers for them. Then had to wait for a new BIOS to fix the cold bug. Never again


----------



## 2010rig

Quote:



Originally Posted by *capchaos*


Been ATI before. Had dual 5970s; biggest pain in the ass ever. When they worked they were great. Had to wait months just for AMD to fix the drivers for them. Then had to wait for a new BIOS to fix the cold bug. Never again.


But they say that AMD driver issues are a myth, and most likely user error.


----------



## Anthraxinsoup

Quote:



Originally Posted by *capchaos*


Been ATI before. Had dual 5970s; biggest pain in the ass ever. When they worked they were great. Had to wait months just for AMD to fix the drivers for them. Then had to wait for a new BIOS to fix the cold bug. Never again.


Hasn't happened on my 6970s, for me or anyone I know of. ATI and Nvidia always trade blows. The 5970s used to be the fastest cards. I like Nvidia, but the 6970s sold me; maybe Kepler will win me back.


----------



## ReignsOfPower

Thanks for the BIOS, RagingCain. I was wondering: do you think that BIOS update is compatible with the GTX 590 100% fan unlocker that I am currently using?

http://www.evga.com/forums/tm.aspx?m=956401


----------



## Masked

Quote:



Originally Posted by *Anthraxinsoup*


Hasn't happened on my 6970s, for me or anyone I know of. ATI and Nvidia always trade blows. The 5970s used to be the fastest cards. I like Nvidia, but the 6970s sold me; maybe Kepler will win me back.


As the owner of 6990s myself, I'm not happy.

I'll let you in on a dirty little secret... When I sold our stock of 590s, I dropped both 6990s back in... Within about 20 seconds I was instantly reminded of why I hated them in the first place... UGH.

Drivers still aren't working, and I even gave them their own Windows setup on a separate SSD... *sigh*.


----------



## RagingCain

Quote:



Originally Posted by *ReignsOfPower*


Thanks for the BIOS, RagingCain. I was wondering: do you think that BIOS update is compatible with the GTX 590 100% fan unlocker that I am currently using?

http://www.evga.com/forums/tm.aspx?m=956401


Good question; if I were you, I would hold off on re-applying it, just in case.

Quote:



Originally Posted by *capchaos*


Been ATI before. Had dual 5970s; biggest pain in the ass ever. When they worked they were great. Had to wait months just for AMD to fix the drivers for them. Then had to wait for a new BIOS to fix the cold bug. Never again.


Sounds like my 5870 experience: three cards, virtually unusable 50% of the time. Nothing like a little ATIKMDAG.SYS driver failure: BSODs, GSODs, flickering, artifacting. The most expensive junk setup I ever bought.

I always say I'm never buying AMD again, but more than likely I will try again in a generation or two, going with an all-AMD build with a 2nd-gen Bulldozer.

Quote:



Originally Posted by *Anthraxinsoup*


I'd go with three 6970s for like $965 or something. Best bet, but you've got money to blow.



Quote:



Originally Posted by *Anthraxinsoup*


I said that in case he wants to save money, and I also said that since he already has it, he should go Nvidia. Derp.


One of these is not like the other. It kind of looked like you said he wasted his money in the first one.

Quote:



Originally Posted by *Anthraxinsoup*


*Hasn't happened on my 6970s, for me or anyone I know of.* ATI and Nvidia always trade blows. The 5970s used to be the fastest cards. I like Nvidia, but the 6970s sold me; maybe Kepler will win me back.


Let me guess: no experience with the 58xx/59xx generation of cards, or do you just like repeating what every 6xxx-gen user keeps saying?


----------



## RagingCain

Double post.


----------



## Alatar

Quote:



Originally Posted by *RagingCain*


5870 experience


crossfire 5870s *shudder*


----------



## [email protected]

My gtx 590


----------



## Anthraxinsoup

The 58xx/59xx series aren't the 69xx; you can't hold that against them.


----------



## Alatar

Updating...

E: wait a sec

@ [email protected] fr

Can you state the card brand? ASUS, EVGA etc.
Quote:


> Originally Posted by *Anthraxinsoup;13479695*
> The 58xx/59xx series aren't the 69xx; you can't hold that against them.


Well, they are made by the same company, and the drivers are designed by the same team. I'm not saying the 6xxx series is as bad, but I still don't think I'm going to get a multi-GPU solution from AMD in the near future.


----------



## Anthraxinsoup

Quote:



Originally Posted by *Alatar*


Updating...

E: wait a sec

@ [email protected] fr

Can you state the card brand? ASUS, EVGA etc.

Well, they are made by the same company, and the drivers are designed by the same team. I'm not saying the 6xxx series is as bad, but I still don't think I'm going to get a multi-GPU solution from AMD in the near future.


The 6970 scales better than the 570 now.


----------



## RagingCain

Quote:


> Originally Posted by *[email protected];13479544*
> My gtx 590


What a beautiful first post. I am guessing ASUS or MSI; probably MSI, de français.


----------



## Alatar

Quote:


> Originally Posted by *Anthraxinsoup;13480034*
> The 6970 scales better than the 570 now.


I was aware of the 5870s scaling badly when I bought them, but that's not why I had a love-hate relationship with them.

My problem was the drivers. And no, it wasn't user error either; I did everything by the book every time I installed or configured something. The cards were just riddled with problems and annoyances.


----------



## [email protected]

My GTX 590 is a Gainward.







The Koolance block is beautiful.
Sorry, my English is bad.


----------



## Alatar

Quote:



Originally Posted by *[email protected]*


My GTX 590 is a Gainward.







The Koolance block is beautiful.
Sorry, my English is bad.


No problem.

And yes, the waterblock looks absolutely awesome on it! Going to add you to the member list in a sec.


----------



## Arizonian

Quote:



Originally Posted by *Anthraxinsoup*


The 6970 scales better than the 570 now.










Why, did you benchmark them yourself? Next time you make a claim, back it up, or it means nothing. Go make the claim, or troll, on another thread.

Plus, don't expect us to like AMD or care about AMD on the Nvidia club threads; don't take offense. You might find a lot of us don't feel the way you do, and it obviously bothers you. I'm sure you have more constructive things to do.

For the most part, if you bothered to read the entire thread, this is about the GTX 590s. These guys sometimes make an off-topic remark, but it's all about the BIOS, benching, and questions about experience with the 590. They rarely give their opinions on other cards; that is, until you interjected an off-topic post. So let them say what they want in their club thread. You don't have anything to prove to us.

*Arizonian goes back to reading quietly in the background as he learns from the 590 members*


----------



## [email protected]

Quote:


> Originally Posted by *Alatar;13485339*
> No problem.
>
> And yes, the waterblock looks absolutely awesome on it! Going to add you to the member list in a sec.


Thank you for adding me.


----------



## [email protected]

Quote:


> Originally Posted by *capchaos;13265115*
> It idles and downclocks just like normal; the only thing that was changed was the 3D min voltage, and the 2D voltage was left alone. The biggest reason I flashed to a higher min voltage was so I could overclock on the newer drivers, since they lock you to your BIOS min voltage.


Hello, which nvflash did you use?


----------



## anand00x

Just received my 590 today after weeks of waiting. Did my first bench, 3DMark06.


Does this score seem right?


----------



## RagingCain

Who is using 270.61? Is anybody magically getting their clocks set to 700 @ 0.925v... and, even stranger, staying stable?

I am trying to figure out how this could have happened on its own. It was not, I repeat, NOT, higher than 630 on a fresh install.
Quote:


> Originally Posted by *anand00x;13499304*
> Just received my 590 today after weeks of waiting. Did my first bench 3dmark06.
> 
> 
> Does this score seem right?


The WEI score looks fine; I can't help on 3DMark06 (not my area of experience), but I am sure one of these salty devils can help you.

P.S. Just for validation, can you post a screenshot with GPU-Z running, so we can see the GTX 590, BIOS version, driver version, etc.?
Quote:


> Originally Posted by *ReignsOfPower;13497975*
> Haha, yeah, you know I used to get paid over 50k doing QA seven years ago, when I was 19, for a big telecommunications company. It wasn't even bug-testing related. If you never let problems slip through your fingers, you are very valuable to the company. You learn a lot in the process, too.
>
> Sucks about the BIOS update; might I bite the bullet and just flash to the fan-unlocked BIOS for now, until it gets sorted? What do you think? I probably should be asking that question back over at the GTX 590 thread, so feel free to reply there before we sidetrack this thread too far.


Well, just a heads up: the flash was a flop. It doesn't recognize my GPU, and I don't know if Reign has tried it; I hope he does, to confirm that it doesn't help him... just so we can suffer together, haha.

I think they just forgot to change the device-ID recognition; probably a very simple fix. I already passed it along to Jacob/EVGA. Sorry if I got anybody's hopes up :|

I also wanted to add that the voltages are the same on every single driver: .51/.84/.85/.91 and 270.51/.61. The only weird thing I noticed, as I first mentioned up above: I just had a fresh boot with a 700 clock, 964 on memory, and 0.925v on the voltages. I had to hit reset in MSI Afterburner to bring it down. I will see if a reboot brings it back; very curious about this. Before anybody asks though, I have no overclocks applied at startup.

Edit: *A reboot brings it back. One more thing to note: I have the EVGA SLI Enhancement installed, but as far as I know it has never adjusted clock speeds before.*


----------



## RagingCain

Edited: Double Post.


----------



## Alatar

Quote:


> Originally Posted by *anand00x;13499304*
> Just received my 590 today after weeks of waiting. Did my first bench 3dmark06.
> 
> pic


Quote:


> Originally Posted by *RagingCain;13500702*
> P.S. Just for validation, can you post a screenshot with GPU-Z running, so we can see the GTX 590, BIOS version, driver version, etc.?


@anand00x

see above. There is nothing in your screenshot to indicate you have a GTX 590.


----------



## ReignsOfPower

Yep, the flash is a dud for me too. Identical problem to what you mentioned. It shouldn't be a tough fix on EVGA's part, though. Let us know if they reveal any new information.


----------



## RagingCain

Can anybody with EVGA test the SLI-Enhancement if they are using the 270.61 drivers???

This clock change is weird!!!


----------



## anand00x

Quote:


> Originally Posted by *Alatar;13501222*
> @anand00x
> 
> see above. There is nothing in your screenshot to indicate you have a GTX 590.


----------



## RagingCain

Well this sucks, new GTX 590 revision?
http://vr-zone.com/articles/revised-geforce-gtx-590-cards-in-june/12222.html


----------



## Alatar

Yeah, I know :/

I might try to get one somehow; if we're lucky, some AIBs or Nvidia might be generous and give us replacements or something. After all, the people buying this card are such a small minority that it probably wouldn't even scratch their profits.

Also updating in a sec.


----------



## Nova.

There will have to be a trade in system or something....


----------



## RagingCain

Quote:


> Originally Posted by *Nova.;13505895*
> There will have to be a trade in system or something....


We can only hope, but let's not forget that a number of us with "faulty" cards hold a few world-ranking spots on HWBOT.

My advice: if you are interested in this, immediately (if not sooner) start contacting manufacturers for more info. The quicker you act, the quicker you can get to the truth, or a replacement.


----------



## Alatar

Quote:


> Originally Posted by *RagingCain;13505965*
> (if not sooner)


Do you have some more useful info, maybe?


----------



## Masked

I don't find this at all surprising -- I mentioned the possibility a while back, actually.

Quite honestly, I don't personally think they have the physical GPUs, unless they did another batch...

But if it is true, there likely won't be a trade-up for them...

Let's be honest... it's not something you can just swap out; it's an entire card. And then... what happens with RMAs? You get back your original card?

It just doesn't work like that.

It will be very interesting to see what happens if it's more than just a rumor.


----------



## RagingCain

Quote:


> Originally Posted by *Alatar;13506009*
> You have some more useful info maybe?


No, not this time. We won't hear anything back till Monday (and it will probably be a "no plans at this time" type of answer), when EVGA gets back into the office. I am still waiting on Jacob / EVGA to fix my BIOS update so I can actually install it, instead of just looking at it on my desktop.

I actually had a laptop replaced by ASUS/nVidia at the start of their mobile-GPU fiasco, back in the 9600M/260M craziness, just because I was consistent in communication and quick to act, instead of joining a stupid class-action suit that's still going on. I believe the replacement laptop now offered for those old laptops is slower than my Droid 2 phone, whereas I got a dollar-for-dollar, model-for-model replacement right there and then.

I suggest all of us (at least the EVGA 590 owners) get over to the EVGA forums, just to chime in:
http://www.evga.com/forums/tm.aspx?high=&m=1013654&mpage=1#1013729

I asked, in a polite manner, what's the dealio / whether it's true.

Masked is probably right, though; even if it is true, EVGA might say tough luck. I have a plan for that though, and it involves a lot of VRMs, RMAs, and lighters. Just kidding hahhahaha... (I hope.)


----------



## anand00x

I cannot edit the voltage in Afterburner. I can edit everything else, but not the voltage. What are some decently stable overclock numbers, without any voltage editing, for:
Core Clock
Shader Clock
Memory Clock


----------



## RagingCain

Quote:


> Originally Posted by *anand00x;13508016*
> I cannot edit the voltage in Afterburner. I can edit everything else, but not the voltage. What are some decently stable overclock numbers, without any voltage editing, for:
> Core Clock
> Shader Clock
> Memory Clock


Stock is pretty much it. Some users have gotten 650 out of it; I was not successful at overclocking on stock voltage. I did run 675 @ 0.938v, though.

To change the voltage you have to use driver 267.84 or 267.85 (use .85).


----------



## Alatar

I was able to go up to 660MHz on the core with stock volts (0.925v). I haven't tried overvolting, so I can't comment on that.


----------



## exlink

Not sure if this has been posted yet, but VR Zone has information on a revised version of the GTX 590 coming out in June.


----------



## remer

If they do come out with a second revision, there will most likely be a flood of original 590 RMAs that all of a sudden mysteriously fry. I'm not really sure how to feel about it. On the one hand I'm upset that there is little to no "voltage tweaking" capability, as ASUS advertised; on the other hand I'm not a fan of people breaking their equipment to take advantage of the RMA system. That crap is bad for everyone in the long run. I can't even think right now. Time for bed.


----------



## Arizonian

To think, I was upset when EVGA didn't allow me to "step up" from the GTX 580 to the 590 within my 90-day period. At the time I didn't realize it, but they did me a favor.









I really hope they save face with current owners and replace the first cards via RMA, if this is in fact true. It's going to be interesting how Nvidia approaches this. I can't imagine there were so many cards out there already that they couldn't take it on the chin and do what's right.

It's also possibly a second shot at taking back the "king of the hill" dual-GPU flagship spot.


----------



## maur0

Quote:


> Originally Posted by *Arizonian;13522242*
> I really hope they save face with current owners and replace the first cards via RMA, if this is in fact true. It's going to be interesting how Nvidia approaches this. I can't imagine there were so many cards out there already that they couldn't take it on the chin and do what's right.


Exchanging the older card for a new revision will be the AIBs' decision, not Nvidia's.

Those who advertised it as an overclocker's card, like ASUS I think, should offer this to their customers. That would be the right and moral thing to do.

But I wouldn't lose much sleep over this, as we can't know whether the new revision will be better than the original.


----------



## Masked

Aight well, here's my opinion.

There are 2 ways to look at this...

1) Nvidia has never publicly accepted that anything was wrong with the 590... They've maintained a hard stance of perfection since day 1, so I find a revision to be kind of BS, but I can also see why they would do it.

A new revision allows Nvidia to release a modified BIOS/driver set without looking like morons... It gives them the ability to sneakily give us the same BIOS that's on that card with a voltage limit of, say, 1.05v ~ which is what Nvidia themselves said was safe to begin with.

2) If they release a revision, I HIGHLY, HIGHLY, MEGA doubt that we'll be able to "step up"... Like I said, if you do the math, unless they did ANOTHER batch of 580 Fermi cores, we're at the end of the rope... the very end of that rope.

There aren't ENOUGH physical cores to do a 2nd revision.

There ARE enough to do a VERY limited release, but it won't cover retail, far from it.

So the issue you now create is...Who gets the cards?

Do I get a card because I'm #30? Do YOU get a card because you're #150?

It causes MAJOR issues.

~~~~~

Again, this is just my opinion, not fact by any means, but you can't release this card without causing problems... There's just no way.


----------



## exlink

This whole limited-release thing I'm finding to be crap. It's been two months since they were initially released, and they are still coming back in and out of stock at various online retailers.

People said that they went out of production a long time ago, and yet we constantly get a new batch every two weeks or so. And now there is a supposed revised version coming out in June? I'm really starting to doubt the whole "limited production" theme with this card, unless this is one of the longest limited-production items I've recently seen stay in production.


----------



## Masked

Quote:



Originally Posted by *exlink*


This whole limited-release thing I'm finding to be crap. It's been two months since they were initially released, and they are still coming back in and out of stock at various online retailers.

People said that they went out of production a long time ago, and yet we constantly get a new batch every two weeks or so. And now there is a supposed revised version coming out in June? I'm really starting to doubt the whole "limited production" theme with this card, unless this is one of the longest limited-production items I've recently seen stay in production.


There were 2,000 units or fewer actually released, give or take (to my knowledge, I actually think it's less)... That's EXTREMELY limited, and they staggered the release, meaning there will be fewer and fewer available each cycle.

This was mostly public knowledge on release, when all retailers, including EVGA, said this was the case.

Cherry-picked GPUs are cherry-picked... limited.

I still don't think there are enough left for a revision... It's just not feasible unless they ate into RMA stock.


----------



## exlink

Quote:



Originally Posted by *Masked*


There were 2,000 units or fewer actually released, give or take (to my knowledge, I actually think it's less)... That's EXTREMELY limited, and they staggered the release, meaning there will be fewer and fewer available each cycle.

This was mostly public knowledge on release, when all retailers, including EVGA, said this was the case.

Cherry-picked GPUs are cherry-picked... limited.

I still don't think there are enough left for a revision... It's just not feasible unless they ate into RMA stock.


So since they were released in March, two months ago, you're saying that fewer than 2,000 units were sold, combined, from every single online and brick-and-mortar retailer?

I have a feeling that Nvidia possibly intended it to be a limited release, but due to surprisingly high demand they've put it into continuous production with sparse releases. That would also explain this possible revision next month.


----------



## Masked

Quote:



Originally Posted by *exlink*


So since they were released in March, two months ago, you're saying that fewer than 2,000 units were sold, combined, from every single online and brick-and-mortar retailer?

I have a feeling that Nvidia possibly intended it to be a limited release, but due to surprisingly high demand they've put it into continuous production with sparse releases. That would also explain this possible revision next month.


There aren't enough GPU cores for that to happen... Period.

That's why I think this whole revision scenario is a load of crap.

My understanding (what they told the company I work for) is that there were only X number of units, and this has ALWAYS been a staggered release of 25-50 units per cycle sent to the Egg and TG... Always 25-50... Period.

We have 6/7/8, something like that, in this office... They're all serial twins except for 2 RMAs.

The 2 RMAs have serials ending in 7xx#s... That leads me to believe that if RMA stock starts below the 1000s, the RETAIL release for each company was LESS than that.

Now, this is 100% speculation and a massive assumption, but do the math.

It was a cherry-picked run that requires 2x 580 GPUs per card... There has been NO production run of 580s since. That leaves you with a depleted source and no room for new cards UNLESS you pull RMA stock OR re-run production.

Considering how close they are on Kepler, I find it HIGHLY unlikely that they'll do another 580 run just to save face.

In all likelihood, these "595s" are rebranded 590s with a new BIOS and new drivers... plus Nvidia will take the opportunity to release new drivers/BIOS with that release and save face.

Personally, considering their past and the 9800 series (the 9800 GX2, to be more specific), I believe strongly this is what they'll do... I've been wrong before, so who knows, but Nvidia has a history of doing just the above.


----------



## goldboy

Just bought an ASUS GTX 590 from Amazon for $600??? It says the estimated shipping time is 1-2 months.

So will I receive revision 2? Because I do not want revision 1.


----------



## Arizonian

Quote:



Originally Posted by *goldboy*


Just bought an ASUS GTX 590 from Amazon for $600??? It says the estimated shipping time is 1-2 months.

So will I receive revision 2? Because I do not want revision 1.


They haven't come out with revision 2, or even announced one officially. So you purchased the first release; so far revision 2 is only a rumor.

Speculation says the new GTX 590 is two watered-down 3GB GTX 580s for a 6GB GTX 690, instead of the 3GB GTX 590.

Yet to be determined.


----------



## ReignsOfPower

Quote:



Originally Posted by *Arizonian*


They haven't come out with revision 2, or even announced one officially. So you purchased the first release; so far revision 2 is only a rumor.

Speculation says the new GTX 590 is two watered-down 3GB GTX 580s for a 6GB GTX 690, instead of the 3GB GTX 590.

Yet to be determined.


I don't see what the density of the memory chips they use on the 3GB version of the 580 has to do with the core itself.


----------



## Arizonian

Quote:



Originally Posted by *ReignsOfPower*


I don't see what the density of the memory chips they use on the 3GB version of the 580 has to do with the core itself.


I did mean 590; typo on the 690. But I was just pointing out to the previous poster that it's not really certain there will be a revision 2 of the original 590. It's speculation; for all we know it's just another batch of the 590 without any change to it. We'll have to sit tight.


----------



## ReignsOfPower

Quote:



Originally Posted by *Arizonian*


I did mean 590; typo on the 690. But I was just pointing out to the previous poster that it's not really certain there will be a revision 2 of the original 590. It's speculation; for all we know it's just another batch of the 590 without any change to it. We'll have to sit tight.


Ah, noted. Very true; only time will tell.

While I'd love a new revision (especially as it's a premium-level product), I don't think it will happen easily. Luckily I purchased an EVGA, and their warranty is second to none, especially in Australia. (If a 580 SC dies and it's EOL, and there's a 680 SC out as a replacement, you get the 680 SC.) Cool, no?


----------



## ilukeberry

http://www.theinquirer.net/inquirer/...designs-nvidia

*No new Geforce GTX 590 board designs says Nvidia*


----------



## Masked

Quote:



Originally Posted by *ilukeberry*


http://www.theinquirer.net/inquirer/...designs-nvidia

*No new Geforce GTX 590 board designs says Nvidia*


Hate to say this but... told you guys...

There just aren't enough GPUs, with Kepler on the rise, to make it feasible to waste time on an old GPU.

Not only that, but Kepler, to my knowledge, is already in production, so they'd literally have to stop, re-produce, and go from there.


----------



## kcuestag

Quote:



Originally Posted by *Masked*


Hate to say this but... told you guys...

There just aren't enough GPUs, with Kepler on the rise, to make it feasible to waste time on an old GPU.

Not only that, but Kepler, to my knowledge, is already in production, so they'd literally have to stop, re-produce, and go from there.


I agree with you, they're probably too busy with the production of Kepler.


----------



## ilukeberry

Anyway, can I join the club? I have a:

Point of View/TGT "Charged" GeForce GTX 590

The card is factory overclocked by TGT @ 668MHz; the default voltage is 0.9620v.

The card runs perfectly; it hasn't blown up yet.








Pictures:

Box:









Card:

















Inside my HAF X:









GPU-Z


----------



## Alatar

Welcome aboard!


----------



## rush2049

New Drivers:

275.27
x64: http://www.nvidia.com/object/win7-wi...nvidiaupdate=1
x86: http://www.nvidia.com/object/win7-wi...nvidiaupdate=1

And an article talking about the release: http://www.geforce.com/#/News/articles/r275-driver

And the release notes themselves: http://us.download.nvidia.com/Window...ease-Notes.pdf

Something to note: besides the performance increases, the release notes have no mention of any issues, resolved or open, with the GTX 590, but they do say it is supported.


----------



## [email protected]

My 3DMark 11 Performance score ^^


----------



## RagingCain

I am on the new drivers and the new BIOS.

Already having an issue :| Perhaps I should just fix the voltage myself?

The new drivers still have the voltage lock.


----------



## [email protected]

My 3DMark Vantage score.


----------



## ReignsOfPower

Getting weird voltages again, Ragin? I haven't touched the BIOS tweak since the first try. I've not put on the fan-unlocked BIOS either, so my card is still on a .90 and a .91 BIOS.


----------



## RagingCain

Quote:


> Originally Posted by *ReignsOfPower;13547673*
> Getting weird voltages again, Ragin? I haven't touched the BIOS tweak since the first try. I've not put on the fan-unlocked BIOS either, so my card is still on a .90 and a .91 BIOS.


Yeah, I got a new BIOS from Jacob yesterday; we'll see what he says about the voltage issue. So far I have had no errors/issues, but with 4 GPUs it's a little harder and takes longer to ensure stability. Plus you really have to stress them, which I haven't had time to do. So we will see.

I will try The Witcher 2 tonight; apparently it's the new Crysis, graphics-crushing-wise, although BOINC may be better at this.


----------



## Masked

Quote:


> Originally Posted by *RagingCain;13548530*
> Yeah, I got a new BIOS from Jacob yesterday; we'll see what he says about the voltage issue. So far I have had no errors/issues, but with 4 GPUs it's a little harder and takes longer to ensure stability. Plus you really have to stress them, which I haven't had time to do. So we will see.
> 
> I will try The Witcher 2 tonight; apparently it's the new Crysis, graphics-crushing-wise, although BOINC may be better at this.


I played all last night without issue... Good game ~ pwns face, IMO.

That being said, it's not surprising, to me anyway, that NVIDIA won't fix the "locking" issue.

They're still claiming this card is perfect, as seen in the press release saying there won't be a revision.

If we want this card "unlocked", it's becoming more and more evident that we're going to have to do it ourselves.


----------



## RagingCain

I bit the bullet and asked TiN and Jacob what we can do about the voltage lockdown.

TiN might have been the better first choice, but I also asked Jacob when I gave him an update on the .92/.93 BIOSes.

I don't think he is going to want to reflash his BIOS for EVERY set of benchmarks, so he must be looking into this as well.


----------



## [email protected]

My GTX 590, rock stable 24/7 at 720/1400/1820.


----------



## Masked

Quote:



Originally Posted by *[email protected]*


My GTX 590, rock stable 24/7 at 720/1400/1820.




Wow, I love that background -- What's it from?


----------



## ReignsOfPower

I would like to get the small voltage irregularities fixed, personally, then give the card a slight overclock. Then we're good. I'm on air, obviously, so I can't push 700+ MHz on the core without some heat issues. If you manage to get a new version of the BIOS, Ragin, CC it to me; I'd love to give it a shot.

I'm still on the official drivers, not the beta.


----------



## grunion

That would be BioShock 2.


----------



## [email protected]

http://imageshack.us/photo/my-images...1920x1200.jpg/


----------



## [email protected]r

Quote:



Originally Posted by *ReignsOfPower*


I would like to get the small voltage irregularities fixed, personally, then give the card a slight overclock. Then we're good. I'm on air, obviously, so I can't push 700+ MHz on the core without some heat issues. If you manage to get a new version of the BIOS, Ragin, CC it to me; I'd love to give it a shot.

I'm still on the official drivers, not the beta.


I use the stock Gainward BIOS; I modified it myself with NiBiTor 6.03.
I can flash your BIOS if you want to send it to me.
Sorry for my English.


----------



## Masked

Quote:


> Originally Posted by *grunion;13555724*
> That would be BioShock2


Quote:


> Originally Posted by *[email protected];13555906*
> http://imageshack.us/photo/my-images/684/bioshock21920x1200.jpg/


My step-brother works for Irrational Games... I have a feeling he'd murder me in my sleep if he knew I just asked that, so let's keep that question on the D/L, aight?

That being said ~

I'm experiencing some major issues in BF2 and Witcher 2, especially if I play them back to back with the new driver.

The old one has a weird issue where, if I watch a movie, Witcher freaks out, but BF2 runs 100%.

*sigh*...

Someone fix this card already...


----------



## exlink

Just purchased an EVGA Classified GTX 590 3GB from Newegg when they popped back into stock for a brief moment, along with a Corsair AX850. I still have to wait before I can get everything up and running, because I have to RMA my P8P67-M Pro: the SATA ports just went out due to the recall. At least it's a free replacement with cross-shipping, but I have to wait to file the RMA because I move next week.


----------



## RagingCain

Welcome, exlink.

Get some pictures of your beauty up.

Well, The Witcher 2 started off strong: Ultra spec with ubersampling, and draw distance upgraded to Far. I was in the 60 fps range with a little stutter here and there... 95% usage across the board while running through the woods... then, while getting beaten up in the jail cell, it dropped down to a flat 25% usage.

Its been a loooooooong time since I have been 7fps hahahaha. It felt surprisingly good, now I know Q-SLI is not a waste.

Two things to note, if you are using ultra high-res desktops i.e. 5760x1080, she won't let the W2 start. Just doesn't run, so thats a glitch.

@Voltage Issues:
The new BIOS makes my cards stable at 700 @ 0.925v, with even "Gimpy" (that's my weak GPU, number 4) using only 0.913v. Looks like my power curve and numbers might need to be redone.

@Strange Boot-Up Overclock:
Managed to figure out what the hell was OCing my cards. The ASUS R3E Black Edition comes with a few operating programs; one of them, which has no interface for me, is called "GPU TweakIt!" and is supposed to be ASUS's Afterburner-esque program, but it autos me to 700/1400 and 924 mem. I am fairly confident it would most certainly have crashed under the old BIOS.

@Reign: I am going to send you the new BIOS. Tinker around with it a bit; it will need to install 4 BIOSes, so pay attention to the on-screen instructions (your best bet is to open a CMD prompt as administrator and install it via CMD). If you need a little help with the old DOS commands or need a quick refresher, let me know.

@Off-Topic:
I restarted college on Monday, so I am taking 3 Calculus classes back to back from morning till 1 pm. I will most likely be slower to respond to things, with not much time for doing benchies and gaming, of course. One thing I realized by day 4: I am still as sharp as ever, but my memory isn't quite as good as when I was a younger buck, so I might actually have to spend a few hours every other night trying to re-learn this stuff. I have also gone back to my original major of Astrophysics, which is why I volunteered to take my Calculus series over again. It's a long story; namely, my wife (ex-wife now) pressured me into changing my major to CompSci so I could graduate faster to pay the (her) bills.


----------



## [email protected]

I tried to flash my Gainward GTX 590 into an ASUS one, but it didn't work...
I took away the protections, but no success.
Do you have any idea how I can flash it?


----------



## EvilClocker

The new BIOS didn't do anything for my EVGA 590. It doesn't overclock any better either. Kinda stinks; Witcher 2 at 1080p gets 20fps on my rig!


----------



## ReignsOfPower

Witcher 2 doesn't drop under 60fps for me @ max everything, 2560x1600, with Ubersampling off.

Epic Lols @ Ragin's post. Look forward to your email


----------



## RagingCain

Quote:


> Originally Posted by *ReignsOfPower;13566344*
> Witcher 2 doesn't drop under 60fps for me @ max everything, 2560x1600, with Ubersampling off.
> 
> Epic Lols @ Ragin's post. Look forward to your email


Please PM me your email.


----------



## [email protected]

disable


----------



## ReignsOfPower

The new voltage BIOS flashed fine, but the voltages still go from 0.888 to 0.9125 on one GPU. The other is a solid 0.925. I'm going back to the fan-unlocked BIOS









That being said, after going back to the fan-unlocked BIOS my voltages are now 0.875V per GPU. I think you need to tell Jacob: close, but no cigar. It would also be a good idea to add the fan-unlocked profile into the voltage-mod BIOS they're working on.


----------



## rush2049

Um I don't know if this is news or old, but I was slightly surprised....

new MSI Afterburner beta, 2.2.0 Beta 3........

available at the usual place....


----------



## RagingCain

Double post.


----------



## RagingCain

Quote:


> Originally Posted by *rush2049;13572793*
> Um I don't know if this is news or old, but I was slightly surprised....
> 
> new MSI Afterburner beta, 2.2.0 Beta 3........
> 
> available at the usual place....


Really cool thanks.
Quote:


> Originally Posted by *[email protected];13567144*
> disable


Thank you very much; I loved ubersampling








Quote:


> Originally Posted by *ReignsOfPower;13571974*
> The new voltage BIOS flashed fine, but the voltages still go from 0.888 to 0.9125 on one GPU. The other is a solid 0.925. I'm going back to the fan-unlocked BIOS
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That being said, after going back to the fan unlocked BIOS my voltages are now 0.875V per GPU. I think you need to tell Jacob, close, but no cigar. Would be a good idea also to add the fan unlocked profile into the voltage mod one they're working on.


Yeah I saw that coming from a mile away.
Quote:


> Originally Posted by *TiN;15320*
> Watercooling will not help, issues are not due temperature.
> I experienced VRM failure under subzero GPU's and +20°C VRM.
> 
> Keep us posted. I will continue on 590's this week.


I have some interesting news from TiN, in response to my wondering whether he could find a way to unlock voltage for those of the overclocking persuasion: TiN had a VRM failure at low temps while overclocking.

Sigh... I don't like that.


----------



## grunion

Quote:


> Originally Posted by *RagingCain;13578675*
> Really cool thanks.
> 
> Thank you very much; I loved ubersampling
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah I saw that coming from a mile away.
> 
> I have some interesting news from TiN, in response to my wondering whether he could find a way to unlock voltage for those of the overclocking persuasion: TiN had a VRM failure at low temps while overclocking.
> 
> Sigh... I don't like that.


Didn't he also use 580 components?
If that's true, the issue runs deeper.


----------



## [email protected]




----------



## [email protected]

[email protected] said:


> stobeur
> 
> cpu i7 2600k @ 5.0GHz
> 
> GTX 590 855/2000/1710
> 1.088V
> driver : 267.52


----------



## RagingCain

Quote:



Originally Posted by *grunion*


Didn't he also use 580 components?
If that's true, the issue runs deeper.


I am not sure, but I think he tried a quick overclock with LN2 before adding the beastly power regulation system. So he could be referring to something he did, or something he found out.

Taken in context, he simply suggests that overclocking the 590 is guaranteed to kill it.

I also played a little video gaming last night. Bad Company 2 is hard-locking my system after about 25~30 minutes. Temps under 50c.

I think I am just going to put them up for sale.

Quote:



Originally Posted by *[email protected]*


















stobeur

cpu i7 2600k @ 5.0GHz

GTX 590 855/2000/1710
1.088V
driver : 267.52


Your scores are pretty good for a single GTX 590.


----------



## fr0st.

[email protected] said:


> Quote:
> 
> 
> 
> Originally Posted by *[email protected]*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> stobeur
> 
> cpu i7 2600k @ 5.0GHz
> 
> GTX 590 855/2000/1710
> 1.088V
> driver : 267.52
> 
> -snip-


Turn off PPU; it's over-inflating your CPU score.

(Options, PPU tick box)


----------



## RagingCain

Quote:



Originally Posted by *fr0st.*


Turn off PPU, it's over inflating your CPU score.

(Options, PPU tick box)


He doesn't have to; just go by GPU score. It's about on par with the two GTX 580s I had.


----------



## [email protected]

Quote:



Originally Posted by *RagingCain*


I am not sure, but I think he tried a quick overclock with LN2 before adding the beastly power regulation system. So he could be referring to something he did, or something he found out.

Taken from context he simply suggests that overclocking the 590 is guaranteed to kill them.

I also played a little video gaming last night. Bad Company 2 is hard-locking my system after about 25~30 minutes. Temps under 50c.

I think I am just going to put them up for sale.

Your scores are pretty good for a single GTX 590.


thanks








I will try to pass 900 MHz with my graphics card


----------



## [email protected]

I think it goes to 1.105v


----------



## [email protected]

Quote:


> Originally Posted by *RagingCain;13578675*
> Really cool thanks.
> 
> Thank you very much; I loved ubersampling
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah I saw that coming from a mile away.
> 
> *I have some interesting news from TiN, in response to my wondering whether he could find a way to unlock voltage for those of the overclocking persuasion: TiN had a VRM failure at low temps while overclocking.
> 
> Sigh... I don't like that.*


What voltage did he use?


----------



## RagingCain

Quote:



Originally Posted by *[email protected]*


What voltage did he use?


I am assuming the max voltage allowable in BIOS, so 1.203v. TiN was under liquid nitrogen, 50~100°C below zero.


----------



## [email protected]

stobeur i7 2600k @ 5.0GHz
GTX 590 875/2000/1750 1.105V
Driver: 267.52
Windows 7 64-bit


----------



## Masked

Quote:


> Originally Posted by *RagingCain;13584587*
> I am not sure, but I think he tried a quick overclock with LN2 before adding the beastly power regulation system. So he could be referring to something he did, or something he found out.
> 
> Taken from context he simply suggests that overclocking the 590 is guaranteed to kill them.
> 
> I also played a little video gaming last night. Bad Company 2 is hard-locking my system after about 25~30 minutes. Temps under 50c.
> 
> I think I am just going to put them up for sale.


It's not the cards...

It's the driver.

I'm having relatively the same exact issue but, on my end...Basically, the game automatically tabs into a window, then when kicked back into full, has a failure and driver "recovers"...I have to re-start the whole game...Bla bla bla.

These drivers are terrible.

I'd wait for the next WHQL to make your decision, personally.


----------



## Timmaigh!

Hi there, new member here and fellow owner of a GTX 590 as well 

I joined to ask a question about overclocking this card, and it seems this is the place to obtain some valuable intel... I read the majority of this thread, but I am still a bit unsure about a few things...

So I bought my GTX 590 (Gainward brand; I do not feel like taking a photo now, hope it's not a prerequisite to post here ) for rendering with Octane Render (a CUDA app), and obviously I would like to overclock it as far as possible (on stock voltage only, though).

I actually already did it: I run it at 670 MHz and it can render for 3 hours, so I guess it's stable. I have Multi-GPU off, as this is how Octane likes it, so I game with one GPU only as well, but that's OK; I play SC2 and Black Ops only.

Now I am thinking about trying even more MHz, 700 and beyond, but I am quite "scared" to do it after reading those horrible stories about dead cards. Is it completely safe now? Can I kill my card if I OC it too much, or is there no chance (bar some really, really bad luck), as OCP will at worst downclock or shut it down completely? A fellow Octane user posted on their forums a screen of his Gainward GTX 590 clocked to 775 MHz on stock volts via the Gainward OC Tool, and apparently it was stable (I do not know for how long, though)... do you think it's safe for me to try it?

Other question: I have a Seasonic S12-D 750W PSU; is it good enough to run the card above 700 MHz? The rest of my system is a 980X @ 3.78 GHz on stock volts, Gigabyte X58A-UD7 rev 1.0, 6x2GB Kingston 1333 CL9 RAM non-OCed (probably 1.5V), 2x Intel 80GB SSDs, 1x WD Caviar Green 2TB, a DVD drive obviously, and 5 fans (3x Fractal Design Silent Series and 2x Noctua NH14 at low voltage settings)...

Thanks in advance for your answers


----------



## RagingCain

Quote:


> Originally Posted by *Timmaigh!;13609722*
> Hi there, new member here and fellow owner of GTX590 as well
> 
> I joined to ask a question about overclocking of this card, and it seems this is the place to obtain some valuable intel... i read majority of this thread, but i am still bit unsure about few things...
> 
> so i bought my GTX590 (Gainward brand, do not feel like making a photo now, hope its not a prerequisite to post here ) for rendering with Octane render (CUDA app) and obviously i would like to overclock it as far as possible (on stock voltage only though).
> 
> I actually already did it, i run it at 670 MHz and it can render for 3 hours, so i guess its stable. I have Multi-GPU off, as this is, how Octane likes it, so i game with one GPU only as well, but thats ok, i play SC2 and Black Ops only.
> 
> Now i am thinking about trying even more MHz, 700 and beyond, but i am quite "scared" to do it after reading those horrible stories about dead cards, is it completely safe now? Can i kill my card if i OC it too much or there is no chance (bar some really really bad luck), as OCP will at worst downclock or shut it down completely? Fellow Octane user posted on their forums screen of his Gainward GTX590 clocked to 775MHz on stock volts via Gainward OC Tool and apparently it was stable - do not know for how long though... do you think its safe for me to try it?
> 
> Other question, i have Seasonic S12-D 750W PSU, is it good enough to run the card above 700 MHz? The rest of my system is 980x @3,78 GHz stock volts, gigabyte x58a ud7 rev 1.0, 6x2GB Kingston 1333 CL9 RAM non-oced probably 1,5V, 2x Intel 80GB SSD, 1x WD Caviar Green 2TB, DVD drive obviously and 5 fans (3x Fractal design silent series and 2x Noctua NH14 at low voltage settings)...
> 
> Thanks in advance for your answers


Welcome to OCN, click on USER CP at the top, and you can "Edit System" to put your system in your signature like we all have.

More than likely you would be good to go for overclocking (PSU-wise), but as it stands, to take it past 700 MHz most people seem to need a voltage bump above 0.925v. Some lucky members get it to 700 MHz; after my BIOS update, I can definitely get 700 MHz with 0.925v.

In order to increase the voltage you either have to use the 267.84/85 driver set or edit the BIOS. Editing the BIOS is a sure way to void your warranty and is of intermediate difficulty. Using the old drivers is fairly simple: download them and install them. Use whatever GPU overclocking tool you like (I am a fan of MSI Afterburner) and increase the voltage and clocks in small increments, testing for stability.

Gainward's warranty and customer service are what you have to consider. EVGA is pretty much straightforward: barring BIOS editing and physical damage (i.e. dropping it), they will replace the card if it dies. There is an increased chance of killing your card, yes, mostly on air/stock cooling from everything I have seen. What chance is it? I don't think it's very high. I have seen one card die at stock, and even that can't be proven. Is it guaranteed to kill your card? Probably not, if you are cautious.

I personally don't see the point of overclocking 2x 590s because I have 4 GPUs; however, with a single GTX 590, I would want a little bit more oomph out of it.

The sweet spot most people have been trying to hit on their OCs is about 720~730 MHz, under 0.963/0.975v, due to an invisible OCP that you will not see except in your FPS/benchmark scores plummeting. I hope that helps a little.
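For anyone following the "small increments" advice above, here is a minimal sketch of planning a clock ladder. The step size is arbitrary and `plan_oc_steps` is a made-up helper, not anything Afterburner exposes; you would apply each step by hand in your OC tool and stability-test it before moving on.

```python
# Sketch of the "small increments" overclocking approach described above.
# The 13 MHz step is an illustrative choice, not a recommendation.

def plan_oc_steps(start_mhz, target_mhz, step_mhz=13):
    """Return the ladder of core clocks to test on the way to the target."""
    steps = []
    clock = start_mhz
    while clock < target_mhz:
        clock = min(clock + step_mhz, target_mhz)
        steps.append(clock)
    return steps

print(plan_oc_steps(630, 700))  # [643, 656, 669, 682, 695, 700]
```

Each value in the list is a clock to set and stability-test before moving on; drop back one step (and consider a small voltage bump) when a run fails.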
Quote:


> Originally Posted by *Masked;13606451*
> It's not the cards...
> 
> It's the driver.
> 
> I'm having relatively the same exact issue but, on my end...Basically, the game automatically tabs into a window, then when kicked back into full, has a failure and driver "recovers"...I have to re-start the whole game...Bla bla bla.
> 
> These drivers are terrible.
> 
> I'd wait for the next WHQL to make your decision, personally.


Probably not getting rid of them; I'm just frustrated when I get a chance to play and they don't want to work. I actually have more issues dealing with Surround than I do with a single monitor, and with the switching back and forth. As of late I am just using my 120 Hz monitor for gaming. I normally game on one monitor and stay productive with 3, and the musical chairs of monitors going on and off, SLI getting disabled then re-enabled, then sometimes it's just 3 GPUs instead of all 4... it just gives me a headache when I want to play for 20 minutes. As a former Software QA, I find it not only disturbing but also insulting that they don't thoroughly test a feature set as big as Surround.

Turns out Bad Company 2 was crashing due to my Thunderbolt. So it also needs new drivers.


----------



## capchaos

Quote:


> Originally Posted by *RagingCain;13611850*
> helps a little.
> Probably not getting rid of them, just frustrated when I get a chance to play and they don't want to work. I actually have more issues dealing with Surround than I do with a single monitor, and the switching back and forth. As of late, I am just using my 120 Hz monitor for gaming. I normally game on one monitor and be productive with 3, and playing musical chairs with the monitors going on and off, and SLI getting disabled, then re-enabled, then sometimes its just 3-GPUs instead of all 4... It just gives me a headache when I want to play for 20 minutes. As a former Software QA, I find it not only disturbing, but also insulting that they don't thoroughly test a feature set as big as Surround.
> 
> Turns out BadCompany 2 was crashing due to my Thunderbolt. So it also needs new drivers.


Why not just use the Windows key + P to switch between single screen and Surround instead of disabling Surround all the time?


----------



## RagingCain

To be honest, I am normally very prone to using Win7 to set up peripherals, but I noticed benchmarks/FPS being lower when set up for Surround while only playing on a single monitor, even when it was set to "Single Display Performance mode." I haven't tried with the latest drivers to see if it is still an issue, but it was on the 267.xx drivers, so I assume it's the same.


----------



## Timmaigh!

Quote:


> Originally Posted by *RagingCain;13611850*
> Welcome to OCN, click on USER CP at the top, and you can "Edit System" to put your system in your signature like we all have.
> 
> More than likely you would be good to go at overclocking (PSU wise), but as it stands to take it past 700 MHz, most people seem to need a voltage bump above 0.925v Some lucky members get it to 700 MHz, after my BIOS update, I definitely can get 700 MHz with 0.925v.
> 
> In order to be able to increase voltage you either have to use 267.84/85 driver set, or edit the BIOS. Editing the BIOS is a sure way to void warranty and it has an intermediate difficulty level. Using old drivers is fairly simple, download them and then install them. Use whatever GPU overclocking tool you like, I am a fan of MSI Afterburner, increase the voltage and clocks in small increments for stability.
> 
> Depending on Gainward's warranty and customer service is what you have to consider. EVGA is pretty much straightforward, barring BIOS editing and physical damage (i.e. dropping it), they will replace the card if it dies. There is an increased chance of killing your card yes, mostly on air / stock cooling from everything I have seen, what chance is it? I don't think its very high. I have seen one card die at stock, even that can't be proven. Is it guaranteed to kill your card? Probably not if you are cautious.
> 
> I personally don't see the point of overclocking 2x 590s because I have 4 GPUS, however using a single GTX 590, I would want a little bit more oompf out of it.
> 
> The sweet spot most people have been trying to get on their OC's is about 720~730 MHz, and under 0.963/0.975v due to an invisible OCP that you will not see except in your FPS / Benchmark scores plummeting. I hope that helps a little.


Thank you for your response; I will take care of my sig later.

As I said, I would rather not touch the voltage and BIOS, not only because of warranty, but because generally I am not that experienced with overclocking and too many things could go wrong in this case... so if it means I won't be able to go beyond 700 MHz, so be it. I can live with that, but I would kill myself if I killed the card.

Still, this does not mean I do not want to OC it at all; I want to, but with the risks eliminated as much as I can. It seems that keeping the voltage at the stock value is the basic step  BTW, you said 0.925; is that supposed to be stock? GPU-Z says 0.912 in my case. Is the difference due to Vdrop/Vdroop like with CPUs? (I am a bit more experienced with OCing those.) Or am I missing something?

Yesterday, for a few minutes, I tried the 700 MHz clock with Octane and everything looked normal; no crash or explosion







So I guess it might work OK at least at this frequency, and there is a performance increase. Octane states in the main window how fast it is rendering in megasamples; on my benchmark scene I get 5.34 at 607, 5.82 at 670 and 6.03 at 700 MHz... so it scales well so far...
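The scaling reported above can be sanity-checked with a quick calculation (the megasample rates and clocks are the figures from the post; the script itself is purely illustrative):

```python
# Check how Octane's render rate tracks the core clock, using the
# numbers reported above: megasamples/sec keyed by core clock in MHz.
rates = {607: 5.34, 670: 5.82, 700: 6.03}

base_clock, base_rate = 607, rates[607]
for clock in sorted(rates):
    speedup = rates[clock] / base_rate       # measured gain over stock
    ideal = clock / base_clock               # gain if perfectly clock-bound
    print(f"{clock} MHz: {speedup:.3f}x actual vs {ideal:.3f}x ideal "
          f"({speedup / ideal:.1%} scaling efficiency)")
```

At those numbers the workload scales almost linearly with clock (roughly 98% efficiency at 700 MHz), which matches the "it scales well so far" observation and suggests the renderer is compute-bound rather than memory-bound.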

I am really tempted to try up to 775, though; the guy on the Refractive Software forums did it, and the Wizzard of TPU did it as well (before killing the card while overvolting)...
I am a bit hopeful this might be my case:

http://forums.nvidia.com/index.php?showtopic=197878

Less power draw = higher clocks, right? Of course under the condition that the cores are cherry-picked enough to be stable on default voltage there...


----------



## Timmaigh!

Ok, addendum:

I tried it with Octane at 723 / 1446 / 3822 MHz for half an hour and it worked! The sampling speed increased from 5.82/5.84 to 6.26 Ms/sec; hopefully the app's built-in counter does not lie 

Here is the proof:










Indeed, it can always crash after 2 hours, but I believe it would be stable. Maybe if I tried it with FurMark/3DMark11 it would crash on these settings within the first minute, but I am not going to do that, as the performance within Octane is what is important to me.

The question now is, should I try even more MHz, given the 90°C temps... it was 88°C max at 670, so assuming it would work at 770, I suppose it could be 92°C there...

Another question: if it is stable within Octane while both cores are working at 95-99 percent at these 723 MHz clocks, is it guaranteed to be stable while gaming with ONE core only? Or not at all?


----------



## vertex

Hey, can I be added? New "green team player". I had 2x 5970s (actually still have them) but thought I'd make the switch... Does anyone have the driver that isn't locked down?


----------



## RagingCain

Quote:


> Originally Posted by *Timmaigh!;13619017*
> Thank you for your response, i will take care of my sig later.
> 
> As i said, i would rather not touch the voltage and BIOS, not only because of warranty, but generally i am not that experienced with the overclocking and too many things could go wrong in this case...so if it means, i wont be able to go beyond 700 MHz, so be it. I can live with that, but i would kill myself if i killed the card.
> 
> Still this does not mean, i do not want to OC it at all, i want to do it, but eliminate the risks as much as i can. It seems that keeping voltage at stock value is the basic step  BTW you said 0,925, is that supposed to be stock? GPU-z says 0,912 in my case, is this the difference due to the Vdrop/vdroop like with CPUs? (i am bit more experienced with OCing those). Or am i missing something?


Sorry, I am so selfish; I often forget there are other brands out there







Gainward is my favorite brand, but tough to get here Stateside. I remember the best of the best GeForce 2s/4s were Gainwards and Chaintechs.

The stock voltage for EVGA is 0.925 with a core clock of 630 MHz. The memory I believe is all stock, with the exception of PoV (or was it Palit?), as they have an aggressively clocked card.
Quote:


> Yesterday for a few minutes i tried the 700 MHz clock with Octane and everything looked normal, no crash or explosion
> 
> 
> 
> 
> 
> 
> 
> so i guess it might work at least at this frequency OK and there is a performance increase. Octane states in the main window, how fast is it rendering in Megasamples, on my benchmark scene i get 5,34 on 607, 5,82 on 670 and 6,03 on 700... so it scales well so far...
> 
> i am really tempted to try up to 775 though, the guy on the Refractive Software forums did it and the Wizzard of TPU did it as well (before killing the card while overvoltaging)...
> 
> I am bit hopeful this might be my case:
> http://forums.nvidia.com/index.php?showtopic=197878


I have seen almost every dead card related to overvoltage. The power regulation system has been called into question numerous times, but the truth of the matter is that it seems to run fine at stock and to have plenty of overhead for a bit of overclocking with stability. I achieved 830 MHz @ 1.038~1.05v, stable enough for 4, maybe 5 runs of the Heaven benchmark. I am also on water, so take that with a grain of salt, as every dead card I have seen has not been on water.
Quote:


> Less powerdraw = higher clocks, right? OFC under condition, the cores are cherry picked enough to be stable on default volage there...


Less power draw = less power draw. A higher clock will produce more heat and require more power. That being said, there are many variables to consider, especially the voltage regulation, and it may seem like you are not drawing any more power just because the numbers change; the voltage is simply reaching whatever cap it is set to. It seems a bit confusing, but for example: you set it to 0.913v and it reads 0.913v, but the VRM is actually at 0.900v and will provide enough voltage to run the GPU up to 0.913v.

You set it to 0.925v and it reads 0.925v, but the VRMs might still read 0.900v because you haven't increased the power draw through the GPU. The VRMs allow enough juice through for up to 0.925v, but they gate the current, and it doesn't have to be 0.925v. On these newer GeForce cards, though, I have noticed that if you set the value to 0.925v it immediately gives it that current, and the VRMs compensate and fluctuate above and below the value. In essence, the gap has greatly decreased. Think of it as Load-Line Calibration in a sense; however, it's sometimes hard to read without a voltmeter. Our third-party sensors aren't as accurate as we like to believe, and many of the values we see are off by as much as +/- 15%, but for the most part they are good enough to get the job done.

That is my current understanding of how these more complicated VRMs work. I may be off slightly, but I think I have the general idea. Sometimes the system can provide extra performance with no added juice; sometimes you need a nudge of extra voltage to increase the speed just a fraction, but by doing so you give yourself even more headroom to go even faster.
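A toy model of the cap behaviour described above may help. All numbers are illustrative, and `delivered_voltage` is a made-up helper, not anything the card or its sensors expose; the point is only that the set voltage acts as a ceiling, not a constant supply.

```python
# Toy model of the voltage-cap behaviour described above: the value you set
# is a ceiling, and the VRM only supplies what the current load demands,
# up to that ceiling. Numbers are illustrative, not measured.

def delivered_voltage(v_cap, load_fraction, v_idle=0.875):
    """Voltage the VRM actually supplies at a given load (0.0 to 1.0)."""
    demanded = v_idle + load_fraction * (v_cap - v_idle)
    return min(demanded, v_cap)

print(delivered_voltage(0.925, 0.5))  # light load: reads well under the cap
print(delivered_voltage(0.925, 1.0))  # full load: rides the 0.925 ceiling
```

This is why a monitoring tool can report less than the value you set: nothing is wrong, the GPU simply isn't drawing enough to need the full cap.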
Quote:


> Indeed, it can always crash after 2 hours, but i believe it would be stable. Maybe if i tried it with Furmark/3dMark11, it would crash on these settings within first minute, but i am not going to do it, as the performance within Octane is what is important to me.


3DMark11 is your best bet; FurMark is really only good for seeing max temperatures. 3DMark11 also has another constraint on it, the PDL. The Power Draw Limit appears to be a non-hardware type of throttling (slowing down) of the GPU. We worked out that as the power draw increases, a performance-limiting bouncing effect occurs: the core jumps between the clocks you have set and the PDL "safe" underclock, which I think is 553 MHz.

It's easy to think of this as an oscillating function that converges to a specific benchmark score; for those that want a visual, a converging alternating series would suffice.

Core @ Time: 0.00s = 775 MHz
Core @ Time: 0.15s = 553 MHz
Core @ Time: 0.30s = 775 MHz
etc, etc.

I believe the only way it can happen "without being seen" is that it is happening so fast that the GPU monitoring program isn't "polling" the change, so we are talking really fast up-and-down frequency changes.

We found that to avoid this invisible limitation, you should aim for 725 MHz with a max voltage of 0.980v if at all possible, or even lower at 0.968v if you have what you called a "cherry GPU."
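The bouncing effect can be sketched as a duty-cycle average. The 50% throttled fraction below is an assumption for illustration only; 553 MHz is the safe clock mentioned above, and `effective_clock` is a made-up helper, not a real measurement.

```python
# Illustration of the PDL "bouncing" described above: the core flips between
# the set clock and the ~553 MHz safe clock faster than monitoring tools
# poll, so what you effectively get is the duty-cycle average.

def effective_clock(set_mhz, safe_mhz=553, throttled_fraction=0.5):
    """Average core clock when the PDL throttles for a fraction of the time."""
    return set_mhz * (1 - throttled_fraction) + safe_mhz * throttled_fraction

# A 775 MHz setting that is throttled half the time performs like ~664 MHz,
# worse than a steady 725 MHz that never trips the power-draw limit.
print(effective_clock(775))                          # 664.0
print(effective_clock(725, throttled_fraction=0.0))  # 725.0
```

This is why the benchmark score converges on a value well below what the set clock would suggest, and why staying under the limit beats a higher nominal clock.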
Quote:


> The question now is, should i try even more MHz, given the 90C temps... it was 88C max on 670, so assuming it would work at 770, i suppose it could be 92C there...Maybe if i tried it with Furmark/3dMark11, it would crash on these settings as well, but i am not going to do it, as the performance within OCtane is what is important to me.
> 
> Another question, if its stable within Octane while both cores are working at 95-99percent on these 723Mhz clocks, is it granted to be stable while gaming with ONE core only? Or not at all?


To be honest, the GTX 580 TJMax is, I believe, 105°C. I wouldn't recommend keeping a GTX 580 GPU any higher than 90°C if you expect a decent life span of 2-3+ years.

For a GTX 590, I would guess that to be safe you should keep it below 85°C, as it's a more delicate flower.

The only way to know if it is gaming stable is to test it


----------



## ilukeberry

Quote:


> Originally Posted by *RagingCain;13626082*
> The stock voltages for EVGA are 0.925 with a core clock of 630 MHz on Core. The memory I believe is all stock with exception of what PoV or was Palit as they have an agressively clocked card.


That would be PoV cards


----------



## HHawk

Quick question;

I currently have 2x GTX 470s in SLI @ 750 MHz. Will upgrading to a single GTX 590 give better performance, similar, or worse?

I tried looking up GTX 470 SLI vs GTX 590, but couldn't find anything, only a single GTX 470 vs the GTX 590.

Thanks in advance, guys.


----------



## Masked

Quote:


> Originally Posted by *HHawk;13629623*
> Quick question;
> 
> I currently have 2x GTX 470s in SLI @ 750 MHz. Will upgrading to a single GTX 590 give better performance, similar, or worse?
> 
> I tried looking up GTX 470 SLI vs GTX 590, but couldn't find anything, only a single GTX 470 vs the GTX 590.
> 
> Thanks in advance guys.


I swapped from 3x 480s and I see an increase in... everything... So I would assume that if you have 2x 470s, then yes, this will be an upgrade.

@Ragin, I know with their MB fiasco things are a bit... hectic. Has there been any word on the BIOS front from EVGA?


----------



## RagingCain

Quote:


> Originally Posted by *Masked;13630192*
> I swapped from 3x480's and I see an increase in...everything...So, I would assume if you have 2x470's, then yes, this will be an upgrade.
> 
> @Ragin, I know with their MB fiasco things are a bit...Hectic. Has/is there any word on the bios front from EVGA?


They have just left me with BIOS version 2, or the 92/93 BIOS, so far.

Although my voltage is lower on GPU 4, I won't really be able to tell whether it's error-free at stock till I run BOINC for a while, but honestly the games that do run, run really smoothly.

I had Bad Company 2 at 150+ fps and WoW between 70-250 fps, both at 1920x1080 max settings. The girlfriend tells me Sims 3 plays great as well, though with facial glitches on portraits.

Maybe ReignsOfPower can give a better account of the experience; I have barely had time to play.

Just PM either of us if you want the BIOS update.


----------



## Alatar

New members added; please PM or post if I missed you.


----------



## Masked

Quote:


> Originally Posted by *RagingCain;13630393*
> They have just left me with BIOS version 2 or 92/93 BIOS so far.
> 
> Although my voltage is lower on GPU4, I won't be really able to tell till I run BOINC for a while if its stock error free, but honestly the games that do run, run really smooth.
> 
> I had Bad Company 2 at 150+fps and WoW between 70-250 fps, both on 1920x1080 max settings. The girlfriend was telling me Sims 3 plays great as well, with facial glitches on portraits though.
> 
> Maybe Reign of Power can give better experience, I have barely had time to play.
> 
> Just PM either of us if you want the BIOS update.


At home, I've been encountering some issues that I can't quite put my finger on...Same goes for work.

It's frustrating because if I swap drivers I lose compatibility with 2/3 programs I'm currently using, and if I don't... I seem to randomly crash.

The Witcher 2 isn't that big of a deal anymore because I save often... but the BC2 glitch has become extremely predictable.

In server select, the screen automatically reduces to a window and forces me back into Windows... Upon maximizing and loading a server, once playing, the game then freezes, both screens go white and flicker... and it's actually somewhat difficult to exit the window... Then I get a Windows scheme message asking if I wish to change my Windows theme, etc... All of the above happens every single time, like clockwork.

I have a strong feeling it's driver related but, due to sheer "stability", I've had to keep the 6990s in... My office literally bakes.

I'm hoping they address it in this WHQL release, and hopefully EVGA will come out with a "fixed" BIOS as well.


----------



## CrashPush

Just took the plunge and ordered my GTX 590. Was going back and forth between the 590 and 6990 for two weeks.


----------



## exlink

Quote:


> Originally Posted by *CrashPush;13631489*
> Just took the plunge and ordered my GTX 590. Was going back and forth between the 590 and 6990 for two weeks.


2 weeks? That's it? I'd been debating it since they released in late March, up until about a week ago.


----------



## Arizonian

Whoops, posted in the wrong thread; removed. Next.


----------



## Timmaigh!

Quote:


> Originally Posted by *RagingCain;13626082*
> The stock voltages for EVGA are 0.925 with a core clock of 630 MHz. The memory I believe is all stock, with the exception of PoV (or was it Palit?), as they have an aggressively clocked card.


Yeah, my card has a 607 MHz default clock, and GPU-Z says VDDC is 0.912 at load and 0.875 at idle.

Quote:


> Originally Posted by *RagingCain;13626082*
> Less power draw = less power draw. A higher achieved clock ratio will produce more heat and require more power to draw upon. That said, there are many variables to consider, especially the voltage regulation, and it may seem like you are not drawing any more power just because the numbers change; it is just reaching the voltage cap, whatever it is set to. It seems a bit confusing, but for example: you set it to 0.913v, it reads 0.913v, but the VRM is actually at 0.900v and will provide enough voltage to run the GPU up to 0.913v.
> 
> You set it to 0.925v and it reads 0.925v, but the VRMs might still read 0.900v because you haven't increased the power draw through the GPU. The VRMs will let enough juice through for up to 0.925v, gating the current, but it doesn't have to be 0.925v. On these newer GeForce cards, though, I have noticed that if you set the value to 0.925v, it immediately gives it that current and the VRMs compensate, fluctuating above and below the value. In essence, the gap has greatly decreased. Think of it as Load-Line Calibration in a sense, though it's sometimes hard to read without a voltmeter. Our third-party sensors aren't as accurate as we like to believe, and many of the values we see are off by as much as +/- 15%, but for the most part they are good enough to get the job done.
> 
> That is my current understanding of how these more complicated VRMs work, and I may be off slightly, but I think I've got the general idea. Sometimes the system can provide extra performance with no added juice. Sometimes you need a nudge of extra voltage to increase the speed just a fraction, but by doing so you give yourself even more headroom to go faster still.


Oof, all this stuff with voltages and currents kills me







Anyway, my point was: in the case of the GTX 590, the issue is actually the insufficient power cascade and aggressive OCP, not the cores being unable to hit clocks under 800 MHz... so if my CUDA app has a lower power draw than games/benchmarks/stress tests, I can clock it higher before I hit the OCP wall, right? Provided, of course, that the cores can run stable at the given clocks and volts.
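
That reasoning can be sanity-checked with the textbook dynamic-power approximation, P ≈ f × V² × utilization. This is only a rough sketch: the utilization figures below are made-up assumptions for illustration, not measurements of Octane or Furmark.

```python
# Rough dynamic-power model for reasoning about OCP headroom.
# Uses the textbook approximation P_dynamic ~ f * V^2 * utilization;
# real GPU power also has a static (leakage) part, so treat this as a sketch.

def relative_power(freq_mhz: float, volts: float, utilization: float = 1.0) -> float:
    """Power relative to an arbitrary baseline (unitless)."""
    return freq_mhz * volts ** 2 * utilization

baseline = relative_power(607, 0.912)        # stock clock at stock VDDC
octane = relative_power(772, 0.912, 0.85)    # assumed lighter CUDA workload
furmark = relative_power(670, 0.912, 1.00)   # assumed worst-case workload

# A lighter workload can run a higher clock inside the same power budget,
# which is why a CUDA app may clear 772 MHz while a stress test trips OCP at 670.
print(octane < furmark)  # True
```

The 0.85 utilization number is invented, of course; the point is only that OCP responds to draw, not to the clock setting itself.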

Quote:


> Originally Posted by *RagingCain;13626082*
> 
> 3DMark11 is your best bet; Furmark is really only good for seeing max temperatures. 3DMark11 also has another constraint on it called the PDL. The Power Draw Limit appears to be a non-hardware type of throttling (slowing down) of the GPU. We worked out that as the power draw increases, a performance-limiting bouncing effect occurs, jumping between the core clocks you have set and the PDL "safe" underclock, which I think is 553 MHz.


Yeah, 3DMark and Furmark are surely very stressful apps; the question is, does Octane stress the GPU as much as these apps? I do not need my card to not crash in Furmark, I need it to run Octane at the given clocks...
The thing is, I read a lot of reviews of the card, and most of them claimed they were not able to OC the card higher than 670-710 MHz on stock volts... I reckon that means the 3DMark/Furmark/whatever they used crashed or threw errors at higher clocks. But then you have the W1zzard fella at TPU, who managed to run his card at 772 on stock volts... can the difference in ability between the cores on different cards be that huge? One can run stable at only 670, but another at 770?
Anyway, what I am trying to say is: going with the majority, my card should probably be stable at 0.912 up to, let's say, 690 MHz... I suppose a few minutes to half an hour of 3DMark/Furmark/Heaven/OCCT etc. would pass and it could be considered stable. Now, I have already run it for half an hour at 723 MHz rendering with Octane, so either I have a good card or Octane does not stress the GPU / does not draw as much power as those benchmarks, despite both cores being loaded at 90-100 percent and temps of up to 90°C... if the latter is true, I fail to see the point of these benchmarks...

One more question: is there any app that would let me make OC profiles for my GPU? I mean, I wish to run it at 723 with Octane, but I am OK with only 670 while gaming... so I do not have to set it manually each time I want to run a specific app...


----------



## Timmaigh!

OK, I tried 772 MHz on the core and 4000 on the RAM










As you can see under the rendered image, it rendered for half an hour and produced 4677 samples, compared to 4411 with the previous 723 MHz overclock. 4677/4411 and 772/723 are both a 1.06 ratio, so the performance increase is as expected, no throttling or anything... The max temp was 90°C, the same as at 723 MHz...
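
For anyone who wants to repeat the throttle check, it's just a comparison of two ratios (numbers taken from the two runs above):

```python
# Sanity check that the Octane render scaled linearly with the core clock,
# i.e. no throttling occurred. Sample counts are from the two 30-minute runs.

samples_723, samples_772 = 4411, 4677     # Octane samples at each clock
clock_ratio = 772 / 723                   # ~1.068
sample_ratio = samples_772 / samples_723  # ~1.060

# The two ratios agree to within about 1%, so the card is clock-bound
# and not being pulled down to a PDL/OCP "safe" clock mid-run.
print(abs(clock_ratio - sample_ratio) < 0.02)  # True
```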

I would indeed need to render something for 2-3 hours to be definitely sure it is stable, but if so, I am fairly satisfied, as it's 2x GTX 580 performance for 300-400 euros less.

Anyway, I played SC2 with multi-GPU off and it crashed... well, it kept freezing at 670 MHz before as well, but every time it unfroze after approximately 30 seconds to a minute... and it let me go back to Windows and shut the game down via Task Manager... this time I could not do anything, it crashed for good... although, as I was playing with a buddy, we were on Skype while playing and Skype continued to work, so obviously it was down to the graphics...
It's still weird with SC2, this whole freezing/unfreezing. I googled it a bit, and it seems I am not the only one with this issue... I need to run the game at default GPU clocks (never done so far) to see if there is any difference... it might ultimately be connected to the overclock.

TL;DR: I can run Octane at 772 MHz with both cores loaded, but it would crash SC2 at the same frequency while playing on one core only.


----------



## exlink

Correct me if I'm wrong, but I think Dual-GPU cards are much more unstable when they are running only on one GPU.


----------



## [email protected]

stobeur
i7 2600K @ 5.2 GHz
CPU: watercooling
Maximus IV Extreme
2x2 RipjawsX 2133 CAS 9
GTX 590 890/1010/1780
Graphics card (3D): Koolance watercooling
Driver: 267.52
Windows: Seven 64-bit

P score: 12313
Graphics score: 12843
PhysX score: 12084
Combined score: 9612

stobeur
i7 2600K @ 5.2 GHz
CPU: watercooling
GTX 590 884/2020/1768
Driver: 267.52
Windows: Seven 64-bit


----------



## Timmaigh!

Quote:


> Originally Posted by *exlink;13636297*
> Correct me if I'm wrong, but I think Dual-GPU cards are much more unstable when they are running only on one GPU.










Did not know that. Any explanation for such behavior?


----------



## [email protected]

Shift 2 works on a single GPU, no problem for me


----------



## ReignsOfPower

So have EVGA managed to solve the voltage problem yet? Shouldn't be far off considering the time frame. They should take your BIOS, Ragin, and their fan BIOS, and merge the two. That being said, any updates, Ragin? I'm just using the fan unlock BIOS now. Seems to work fine, but I'd like some more permanent increases to my speeds. Preferably around 680 MHz core.


----------



## Hawk777th

Quote:


> Originally Posted by *exlink;13636297*
> Correct me if I'm wrong, but I think Dual-GPU cards are much more unstable when they are running only on one GPU.


Coming from a guy who ran 2x 295s: I never experienced that when running only one GPU with SLI off.


----------



## RagingCain

Quote:


> Originally Posted by *ReignsOfPower;13648141*
> So have EVGA managed to solve the voltage problem yet? Shouldn't be far off considering the time frame. They should take your BIOS, Ragin, and their fan BIOS, and merge the two. That being said, any updates, Ragin? I'm just using the fan unlock BIOS now. Seems to work fine, but I'd like some more permanent increases to my speeds. Preferably around 680 MHz core.


No updates. I haven't pushed for one either, as it's not causing any perceivable problems; I have just been so busy lately. I will have to get BOINC up and running and leave it overnight.


----------



## armartins

Quote:


> Originally Posted by *exlink;13636297*
> Correct me if I'm wrong, but I think Dual-GPU cards are much more unstable when they are running only on one GPU.


Not necessarily. I play WoW with the DX11 API in Eyefinity windowed full screen (the only way to use Eyefinity + DX11 with ATI atm, shame on ATI/Blizzard), and my 5970, highly OC'd to 1000 core / 1250 mem, works fine running only one core. It's nice; I use Firefox 4 with a plugin that allows me to pin Firefox windows above all others, including WoW, so I use half of my right monitor for Wowhead/Facebook/Gmail/Windows Live tabs.


----------



## taveston

I have 2x 590 SLI, but I get the same 3DMark11 score whether I have one or two cards in the case - as far as I can tell SLI is active and 3DMark recognises that there are two cards. Any idea what I am doing wrong?

Build:

Silverstone FT03 case
Intel 2600k 3.4 Ghz @ 4.7 Ghz with Corsair H70
Asus P8P67-M Pro
Silverstone Strider 1200w PSU
8 GB DDR 3 1600 Mhz Corsair Vengeance
Intel 520 series SSD 250 GB
1 TB Western Digital 7200 rpm 8MB Cache
2 x Geforce 590 sli
Apple 27" Cinema Display
Windows 7 64 bit


----------



## taveston

Quote:


> Originally Posted by *taveston;13663828*
> I have 2x 590 SLI, but I get the same 3DMark11 score whether I have one or two cards in the case - as far as I can tell SLI is active and 3DMark recognises that there are two cards. Any idea what I am doing wrong?
> 
> Build:
> 
> Silverstone FT03 case
> Intel 2600k 3.4 Ghz @ 4.7 Ghz with Corsair H70
> Asus P8P67-M Pro
> Silverstone Strider 1200w PSU
> 8 GB DDR 3 1600 Mhz Corsair Vengeance
> Intel 520 series SSD 250 GB
> 1 TB Western Digital 7200 rpm 8MB Cache
> 2 x Geforce 590 sli
> Apple 27" Cinema Display
> Windows 7 64 bit


Actually 3DMark recognises there are 2 cards, but shouldn't it recognise 4 GPUs with 2x 590? Anyone else have this problem and know how to fix it? I am using the new beta drivers, 272.27, as the most recent certified ones didn't work either....


----------



## raizooor3

Quote:


> Originally Posted by *taveston;13663828*
> I have 2x 590 SLI, but I get the same 3DMark11 score whether I have one or two cards in the case - as far as I can tell SLI is active and 3DMark recognises that there are two cards. Any idea what I am doing wrong?
> 
> Build:
> 
> Silverstone FT03 case
> Intel 2600k 3.4 Ghz @ 4.7 Ghz with Corsair H70
> Asus P8P67-M Pro
> Silverstone Strider 1200w PSU
> 8 GB DDR 3 1600 Mhz Corsair Vengeance
> Intel 520 series SSD 250 GB
> 1 TB Western Digital 7200 rpm 8MB Cache
> 2 x Geforce 590 sli
> Apple 27" Cinema Display
> Windows 7 64 bit


Forget about the benchmarks


----------



## taveston

I am using benchmarks to test whether everything is set up properly more than anything else.

Here is the issue I have: both cards are recognised by Device Manager, but the NVIDIA Control Panel will only allow me to select multi-GPU card rather than SLI. It also tells me to install an SLI bridge. I have tried two separate SLI bridges and multiple driver versions.

Is anyone else having this issue? I heard it might be something to do with a Realtek LAN conflict, but I have disabled this in the BIOS and uninstalled all the drivers, and still nothing.

I am currently on the 275.27 drivers...

Here's my system - any help most appreciated:

Silverstone FT03 case
Intel 2600k 3.4 Ghz @ 4.7 Ghz with Corsair H70 
Asus P8P67-M Pro
Silverstone Strider 1200w PSU
8 GB DDR 3 1600 Mhz Corsair Vengeance
Intel 520 series SSD 250 GB
1 TB Western Digital 7200 rpm 8MB Cache
2 x Geforce 590 sli
Apple 27" Cinema Display 
Windows 7 64 bit


----------



## grunion

Double-check the BIOS; make sure the secondary PCIe slot is set to auto.


----------



## raizooor3

Quote:



Originally Posted by *taveston*


I am using benchmarks to test whether everything is set up properly more than anything else.

Here is the issue I have: both cards are recognised by Device Manager, but the NVIDIA Control Panel will only allow me to select multi-GPU card rather than SLI. It also tells me to install an SLI bridge. I have tried two separate SLI bridges and multiple driver versions.

Is anyone else having this issue? I heard it might be something to do with a Realtek LAN conflict, but I have disabled this in the BIOS and uninstalled all the drivers, and still nothing.

I am currently on the 275.27 drivers...

Here's my system - any help most appreciated:

Silverstone FT03 case
Intel 2600k 3.4 Ghz @ 4.7 Ghz with Corsair H70 
Asus P8P67-M Pro
Silverstone Strider 1200w PSU
8 GB DDR 3 1600 Mhz Corsair Vengeance
Intel 520 series SSD 250 GB
1 TB Western Digital 7200 rpm 8MB Cache
2 x Geforce 590 sli
Apple 27" Cinema Display 
Windows 7 64 bit



Check with GPU-Z whether both cards are on.

Use the 3-Way SLI Bridge.


----------



## taveston

Quote:


> Originally Posted by *raizooor3;13692985*
> Check with GPU-Z whether both cards are on.
> 
> Use the 3-Way SLI Bridge.


Both cards are definitely on - they are shown in Device Manager, NVIDIA Control Panel, and in GPU-Z

Why would a 3-Way SLI Bridge be better? I only have two cards...

Thanks for your advice


----------



## taveston

Quote:


> Originally Posted by *grunion;13692906*
> Double-check the BIOS; make sure the secondary PCIe slot is set to auto.


I'll do this tonight, thanks


----------



## RagingCain

Quote:


> Originally Posted by *taveston;13691629*
> I am using benchmarks to test whether everything is set up properly more than anything else.
> 
> Here is the issue I have: both cards are recognised by Device Manager, but the NVIDIA Control Panel will only allow me to select multi-GPU card rather than SLI. It also tells me to install an SLI bridge. I have tried two separate SLI bridges and multiple driver versions.
> 
> Is anyone else having this issue? I heard it might be something to do with a Realtek LAN conflict, but I have disabled this in the BIOS and uninstalled all the drivers, and still nothing.
> 
> I am currently on the 275.27 drivers...
> 
> Here's my system - any help most appreciated:
> 
> Silverstone FT03 case
> Intel 2600k 3.4 Ghz @ 4.7 Ghz with Corsair H70
> Asus P8P67-M Pro
> Silverstone Strider 1200w PSU
> 8 GB DDR 3 1600 Mhz Corsair Vengeance
> Intel 520 series SSD 250 GB
> 1 TB Western Digital 7200 rpm 8MB Cache
> 2 x Geforce 590 sli
> Apple 27" Cinema Display
> Windows 7 64 bit


Try switching the monitor to a different DVI socket, preferably the one next to the DisplayPort on the top card. The drivers are picky after the release drivers.

Post powered by DROID X2


----------



## taveston

Thanks everyone for the continued suggestions. I have tried the following now and still no luck:

1. Different SLI bridges
2. Various GPU drivers
3. Various Realtek LAN drivers
4. CMOS reset
5. Checked BIOS for PCI options - can change mode on PCIe 3, but that is unused, as PCIe 1 covers it when I install the GPU
6. Swapping the cards in the PCIe slots
7. Swapping the port the monitor is plugged in to (I am using the mini DisplayPort)
8. Complete clean install of Windows with chipset drivers and GPU drivers only

Once I got a bubble saying SLI wasn't activated, but then when I opened the NVIDIA CP, all the "maximise 3D performance" option did was enable multi-GPU-card SLI as usual.

Maybe there is a hardware fault, but how can I tell if it's the MB or one of the GPUs? Both GPUs work and are recognised, but could one have a faulty SLI connector or something?

One thing I did notice was that the NVIDIA Control Panel recognises all 4 GPUs as being on an 8x bus, whereas GPU-Z has them all at 16x... strange

Really at my wits' end here!


----------



## grunion

Quote:



Originally Posted by *taveston*


Thanks everyone for the continued suggestions. I have tried the following now and still no luck:

1. Different SLI bridges
2. Various GPU drivers
3. Various Realtek LAN drivers
4. CMOS reset
5. Checked BIOS for PCI options - can change mode on PCIe 3, but that is unused, as PCIe 1 covers it when I install the GPU
6. Swapping the cards in the PCIe slots
7. Swapping the port the monitor is plugged in to (I am using the mini DisplayPort)
8. Complete clean install of Windows with chipset drivers and GPU drivers only

Once I got a bubble saying SLI wasn't activated, but then when I opened the NVIDIA CP, all the "maximise 3D performance" option did was enable multi-GPU-card SLI as usual.

Maybe there is a hardware fault, but how can I tell if it's the MB or one of the GPUs? Both GPUs work and are recognised, but could one have a faulty SLI connector or something?

One thing I did notice was that the NVIDIA Control Panel recognises all 4 GPUs as being on an 8x bus, whereas GPU-Z has them all at 16x... strange

Really at my wits' end here!


OK

The NVCP gives you the option for multi-GPU SLI?
That's what you want.

Does GPU-Z show SLI with 4 GPUs?


----------



## RagingCain

Quote:


> Originally Posted by *taveston;13698260*
> Thanks everyone for the continued suggestions. I have tried the following now and still no luck:
> 
> 1. Different Sli bridges
> 2. Various GPU drivers
> 3. Various Realtek LAN drivers
> 4. CMOS reset
> 5. Checked BIOS for PCI options - can change mode on PCI 3 but that is unused as PCI 1 covers it when I install the GPU
> 5. Swapping the cards in the PCI slots
> 6. Swapping the port the monitor is plugged in to (I am using the mini display port)
> 7. Complete clean install of Windows with chipset drivers and GPU drivers only
> 
> Once I got a bubble saying Sli wasn't activated but then when I opened nvidia CP all the maximise 3d performance option did was enable multi gpu card sli as usual
> 
> Maybe there is a hardware fault but how can I tell if it's the MB or one of the GPUs? Both GPUs work and are recognised but could one have a faulty Sli connector or something?
> 
> One thing I did notice was that nvidia control panel recongises all 4 GPUs as being on a 8x bus whereas GPU-z have them all at 16x...strange
> 
> Really at my wits end here!


Wait - you think SLI isn't working because of the error bubble? I get that all the time. Just make sure all the GPUs are selected in the diagram under Maximum 3D Performance / SLI. If the NVCPL is showing all 4 in "Quad SLI", you are good to go.

The PCIe lanes are tricky to explain. One GPU on each card has 16x. The other GPU will display whatever the motherboard gives it. GPU-Z for me always says 16x, and the NVCPL says whatever else you are getting.

Post powered by DROID X2


----------



## taveston

Quote:



Originally Posted by *RagingCain*


Wait. You think Sli isn't working because of the error bubble? I get that all the time. Just make sure all the GPUs are selected in the image/diagram on Maximum 3D Performance / SLI. If NVCPL is showing all 4 in "Quad SLI" you are good to go.

The PCI-e lanes are tricky to explain. One GPU on each card has 16x. The other GPU will display whatever the motherboard gives it. GPU-z for me always says 16x and NVCPL says whatever else you are getting.

Post powered by DROID X2


The NVCP shows one card in SLI across the two chips on the single card; it does not show 4 in Quad SLI.

Each time I install a new driver set, it tells me once to connect an SLI bridge. I have tried 2 different bridges in different orientations.

All "maximum 3D performance" does is enable SLI on the single card. There is no option for Quad SLI.


----------



## capchaos

Quote:


> Originally Posted by *taveston;13703357*
> The NVCP shows one card in SLI across the two chips on the single card; it does not show 4 in Quad SLI.
> 
> Each time I install a new driver set, it tells me once to connect an SLI bridge. I have tried 2 different bridges in different orientations.
> 
> All "maximum 3D performance" does is enable SLI on the single card. There is no option for Quad SLI.


What type of motherboard do you have? There have been reports of Quad SLI not working if your board does not have an NF200 chip.


----------



## taveston

Quote:


> Originally Posted by *capchaos;13704001*
> What type of motherboard do you have? There have been reports of Quad SLI not working if your board does not have an NF200 chip.


I have an ASUS P8P67-M PRO; it is Quad SLI compatible

http://www.asus.com/Motherboards/Intel_Socket_1155/P8P67M_PRO/


----------



## capchaos

I don't know what it could be then


----------



## RagingCain

Good luck with the new 275.33 drivers. Epic system failure. Nothing works except AERO and Flash movies. So basically... porn. Good to see that nVidia Driver team has their priorities straight.


----------



## Arizonian

Quote:


> Originally Posted by *RagingCain;13719713*
> Good luck with the new 275.33 drivers. Epic system failure. *Nothing works except AERO and Flash movies. So basically... porn.* Good to see that nVidia Driver team has their priorities straight.


I think I speak for all GTX 590 owners when I say 'PHEW'.









Sorry to hear that, truthfully. Let's hope they are addressing the issues; I'd have thought they'd have ironed them out prior to release. On my GTX 580 I'm doing just fine.


----------



## layikblackmore

I really am not good with PCs and I hope I do not offend anyone by posting in this section. I'm in a bit of a bind and hoping to find some friendly help. I own a Dell XPS 8300 that I purchased two weeks ago. I really want to buy a GTX 590 and an 850-watt power supply, but I know nothing about specs, so I can't figure out whether I can put this in my PC. To simplify: I am not looking to go crazy or try to overclock or anything; I just want a top-of-the-line graphics card that will compete for the next few years so I can enjoy my PC gaming. The PSU is a Corsair Professional Series Gold High Performance 850-Watt Power Supply (CMPSU-850AX), and of course the EVGA GTX 590. If anyone could help me in this matter, please email me at [email protected]. Your knowledge is greatly appreciated, thanks.


----------



## ilukeberry

Quote:



Originally Posted by *RagingCain*


Good luck with the new 275.33 drivers. Epic system failure. Nothing works except AERO and Flash movies. So basically... porn. Good to see that nVidia Driver team has their priorities straight.


I installed those drivers yesterday; I don't have any problems, it works fine.


----------



## Masked

Quote:


> Originally Posted by *taveston;13704155*
> I have an ASUS P8P67-M PRO; it is Quad SLI compatible
> 
> http://www.asus.com/Motherboards/Intel_Socket_1155/P8P67M_PRO/


Whether it's "Quad SLI compatible" or not, the absence of the NF200 will hurt you tremendously in that regard.

The highway just isn't big enough for all those cars...If you can understand that.

New drivers are terrible btw, bigger problems than the old ones


----------



## RagingCain

Quote:


> Originally Posted by *layikblackmore;13721336*
> I really am not good with PCs and I hope I do not offend anyone by posting in this section. I'm in a bit of a bind and hoping to find some friendly help. I own a Dell XPS 8300 that I purchased two weeks ago. I really want to buy a GTX 590 and an 850-watt power supply, but I know nothing about specs, so I can't figure out whether I can put this in my PC. To simplify: I am not looking to go crazy or try to overclock or anything; I just want a top-of-the-line graphics card that will compete for the next few years so I can enjoy my PC gaming. The PSU is a Corsair Professional Series Gold High Performance 850-Watt Power Supply (CMPSU-850AX), and of course the EVGA GTX 590. If anyone could help me in this matter, please email me at [email protected]. Your knowledge is greatly appreciated, thanks.


How old is the Dell you have? I ask because they use the same base model number for years but with different components. Don't worry, this is the best spot to ask any 590 questions. We might also suggest a 580, as it can be a headache to set the 590 up right. What games do you play or plan to play?
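
On the PSU side of the question, a back-of-envelope budget suggests the AX850 has plenty of headroom. The 365 W figure is NVIDIA's rated TDP for the GTX 590; the other numbers are assumed allowances, not measurements of that Dell:

```python
# Rough PSU headroom estimate for putting a GTX 590 in a prebuilt system.
# The 365 W figure is NVIDIA's rated TDP for the GTX 590; the other
# numbers are illustrative assumptions, not measurements of this Dell.

loads_w = {
    "gtx_590": 365,
    "cpu_sandy_bridge_i7": 95,
    "board_ram_drives_fans": 75,   # rough allowance for everything else
}

total_w = sum(loads_w.values())
psu_w = 850                        # Corsair AX850

print(total_w, psu_w - total_w)    # 535 W estimated draw, 315 W spare
```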

Post powered by DROID X2


----------



## layikblackmore

The Dell was bought just 2 weeks ago, and the manufacturing date shows it was built in April. I want to play Battlefield 3, Rage, games like that.


----------



## layikblackmore

Oh, I also wanted to state that I bought Black Tie service from Best Buy, so all my installs are free for the next year from the Geek Squad, even if I didn't buy the components from them.


----------



## RagingCain

Quote:


> Originally Posted by *layikblackmore;13724597*
> Oh, I also wanted to state that I bought Black Tie service from Best Buy, so all my installs are free for the next year from the Geek Squad, even if I didn't buy the components from them.


Okay, whats inside it right now, CPU etc.?

Post powered by DROID X2


----------



## layikblackmore

I'm not sure how big the PSU is, but the graphics card is a Radeon HD 5700 series 1GB.
Is there any way I can find out which PSU it has? I know the guy at Best Buy told me I definitely would have to get a bigger PSU. If you want an idea of my case, just Google-image the Dell XPS 8300. I guess I just wish I knew the specs better. It's a sleek black case; I could take a pic of it and email it to you?


----------



## RagingCain

Quote:


> Originally Posted by *layikblackmore;13724707*
> I'm not sure how big the PSU is, but the graphics card is a Radeon HD 5700 series 1GB.
> Is there any way I can find out which PSU it has? I know the guy at Best Buy told me I definitely would have to get a bigger PSU. If you want an idea of my case, just Google-image the Dell XPS 8300. I guess I just wish I knew the specs better. It's a sleek black case; I could take a pic of it and email it to you?


I will have a look when I get on a computer. Or someone else will have a peek.

Post powered by DROID X2


----------



## layikblackmore




----------



## layikblackmore




----------



## layikblackmore




----------



## layikblackmore

Alright, here are the pics of the case


----------



## GTDanny

Hi guys, just built my new rig. Used my trusty old Antec 900 case; will probably get something bigger ASAP.

Glad to see some other 590 users.


----------



## Alatar

Quote:


> Originally Posted by *GTDanny;13737135*
> Hi guys, just built my new rig. Used my trusty old Antec 900 case; will probably get something bigger ASAP.
> 
> Glad to see some other 590 users.


post some pics?


----------



## GTDanny

Not happy with the cable management, but I will be sorting that out. Sorry for the pic quality as well - iPhone 4.


----------



## rush2049

At least with the 900 you get nice airflow for the card and psu from that vent......


----------



## exlink

Finally got my computer up and running along with my new EVGA Classified GTX 590!


----------



## Alatar

added


----------



## manidu

Hey guys,

finished my completely watercooled new system a few days ago, containing 2 EVGA GTX 590s on an ASUS Rampage III Extreme with a 990X (@ 4.15 GHz) and 24GB of RAM...










Btw, could someone tell me the most stable clock settings for the stock 925mV?


----------



## exlink

Quote:


> Originally Posted by *manidu;13749903*
> Hey guys,
> 
> finished my completely watercooled new system a few days ago, containing 2 EVGA GTX 590s on an ASUS Rampage III Extreme with a 990X (@ 4.15 GHz) and 24GB of RAM...
> 
> _*IMAGE*_
> 
> Btw, could someone tell me the most stable clock settings for the stock 925mV?


Each GTX 590 overclocks differently, especially since you're on watercooling as well, so it's hard to say what the most stable clocks would be for you.

Nice computer though, but is that all on a single loop...?


----------



## ReignsOfPower

Quote:



Originally Posted by *exlink*


Finally got my computer up and running along with my new EVGA Classified GTX 590!











Haha welcome Mr. Same-ish system.







Recently changed over to a Vertex 3 128GB MAX IOPS and a Xonar STX - a far better audio card than my old Prelude. Also have a spare OCZ Vertex 2 atm; tossing up whether to put it in my system or do something else with it. Might stick it in my server, in a laptop... or might do something with it in the current system... not sure what (or sell it) *shrugs*


----------



## manidu

Quote:


> Originally Posted by *exlink;13753919*
> Each GTX 590 overclocks differently, especially since you're on watercooling as well. So its hard to tell you what the most stable clocks would be for you.


Hm, I'll try to figure it out then...
Quote:


> Originally Posted by *exlink;13753919*
> Nice computer though, but is that all on a single loop...?


Yeah it is; do you think that's a problem?

I recently ordered two flow meters and thermometers for the loop; I'm curious to see their data. It looks like the pump can handle the whole loop, but maybe I should have gone with 13/10 tubing instead of the 10/8? Also, due to space issues I went with a Koolance Exos 2.5 external cooling kit, which has the tubes just going below the radiator instead of through it - looks a bit ineffective...


----------



## exlink

Quote:


> Originally Posted by *manidu;13764002*
> Yeah it is; do you think that's a problem?
> 
> I recently ordered two flow meters and thermometers for the loop; I'm curious to see their data. It looks like the pump can handle the whole loop, but maybe I should have gone with 13/10 tubing instead of the 10/8? Also, due to space issues I went with a Koolance Exos 2.5 external cooling kit, which has the tubes just going below the radiator instead of through it - looks a bit ineffective...


Personally, I would've done two separate loops, no questions asked. The size of the tubes wouldn't make as big of an impact as just creating two separate loops.

Quad-SLI GTX 590s NEED to be on their own loop, and I wouldn't go with anything less than a 360 radiator just for that loop if you want great results.

Then I'd put the 990X, NB, and RAM on a separate 240 radiator loop (not a big fan of watercooling RAM since it's overkill IMO; I'd take it out of the loop and just have good sinks on it if it were my system).

Just my 2 cents.


----------



## exlink

Quote:


> Originally Posted by *ReignsOfPower;13763535*
> Haha welcome Mr. Same-ish system.
> 
> 
> 
> 
> 
> 
> 
> Recently changed over to a Vertex 3 128GB MAX IOPS and a Xonar STX. Far better audio card than my old prelude. Also have a spare OCZ Vertex 2 atm, tossing up putting it in my system or doing something else with it. Might stick it in my server, in a laptop....or might do something with it in the current system...not sure what (or sell it) *shrugs*


What a coincidence, I'm looking to buy myself a standalone sound card and SSD as my next purchase. However, I think I'm going to just stick with a Sound Blaster Titanium X-Fi Pro or HD since I only game on a Logitech X-230 2.1 system or my Sennheiser headphones.

I'm probably only going to get a 60GB or so SSD so I can install Windows on there and some of my more common applications/games for better load times.


----------



## XXXfire

Quote:


> Originally Posted by *exlink;13765990*
> Personally, I would've done two separate loops, no questions asked. The size of the tubes wouldn't make as big an impact as just creating two separate loops.
> 
> Quad-Fire GTX 590's NEED to be on their own loop and I wouldn't go with anything less than a 360 radiator just for that loop if you want to get great results.
> 
> Then I'd put the 990X, NB, and RAM on a separate 240 radiator loop (not a big fan of watercooling RAM since it's overkill IMO; I'd take it out of the loop and just put good sinks on it if it were my system).
> 
> Just my 2 cents.


Why do these cards require two separate loops? I'm running a proverbial shift-ton of aggressively overclocked components on a single loop (fully supported by a solitary pump) with very solid results. As it stands, I've built around a pair of 6990s volted/clocked high enough to draw 550 watts each, with the PowerTune settings boosted to allow it. Under those conditions, load temperatures peak at 54 (ranging between 48-52 per GPU @ 99% load across the board, evaluated over a duration of hours). Besides that, I've got a stubborn 2600K (stubborn in that higher clocks require high voltage) running 5.25GHz with HT engaged. All idle temperatures are within 5-7 degrees of ambient (30-32 Celsius on the CPU, 33-35 on the GPUs).

Not to besmirch dual loops, but I'm confident in saying they are overall less efficient at fully utilizing a radiator's optimum capability. A dual loop is certainly not essential.


----------



## XXXfire

Quote:


> Originally Posted by *manidu;13749903*
> Hey guys,
> 
> finished my completely watercooled new system a few days ago, containing 2 EVGA GTX 590's on an ASUS Rampage III Extreme with a 990X (@4.15GHz) and 24GB of RAM...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Btw, could someone tell me the most stable clock settings for the stock 925mV?


BTW, those blocks & the color scheme are effin' sick man. Super tight.


----------



## Masked

Quote:


> Originally Posted by *exlink;13765990*
> Personally, I would've done two separate loops, no questions asked. The size of the tubes wouldn't make as big an impact as just creating two separate loops.
> 
> Quad-Fire GTX 590's NEED to be on their own loop and I wouldn't go with anything less than a 360 radiator just for that loop if you want to get great results.
> 
> Then I'd put the 990X, NB, and RAM on a separate 240 radiator loop (not a big fan of watercooling RAM since it's overkill IMO; I'd take it out of the loop and just put good sinks on it if it were my system).
> 
> Just my 2 cents.


They really don't.

One of the interns in our office has 2x 590's + his OC'd CPU all on one loop.

30-40°C is great, but as long as it's under a 15° delta, I don't think it matters.

His max temps on the cards are 60°C...

He's using an XSPC 360 mounted on top of his HAF X with 3 push / 3 pull ~

If you're smart about it, there's absolutely no need for separate loops, especially when using some of the new reservoirs like DD's Monsoon, etc.
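The "15° delta" rule of thumb above is just coolant (or component) temperature minus ambient; a minimal sketch of the arithmetic, assuming a ~25°C room (the ambient figure is an assumption, not stated in the post):

```python
# Rule-of-thumb check from the post above: a loop is doing fine
# as long as its temperature delta over ambient stays under ~15°C.

def loop_delta(temp_c, ambient_c):
    """Temperature delta over ambient, in degrees C."""
    return temp_c - ambient_c

def within_rule_of_thumb(temp_c, ambient_c, limit=15):
    """True when the delta stays under the forum rule-of-thumb limit."""
    return loop_delta(temp_c, ambient_c) <= limit

# The intern's cards at 30-40C in an assumed ~25C room:
print(loop_delta(38, 25))            # 13C delta
print(within_rule_of_thumb(38, 25))  # True: inside the 15-degree guideline
```

The point being that absolute temperature matters less than how far the loop sits above room temperature, since the radiator can only ever cool toward ambient.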


----------



## ReignsOfPower

Just finished upgrading the components in my PC; using the new NVIDIA drivers now. My Xonar STX came with two free op-amp upgrades (LME49720's). Didn't even know; apparently it's a special-edition thing they're doing for a short time. The STX ****s on the Prelude I have, if anyone cares lol. Especially with the new op-amps, they suit my cans perfectly. Also running an OCZ Vertex 3 MAX IOPS now; this baby is FAST.

Anyways, I'll start some tweakage and some benchmarkage. I'll try to beat some of my old scores and upload some pictures tomorrow.


----------



## Juggalo23451

GTX 590 voltage lock bypass


----------



## ReignsOfPower

Pardon the quality of the pics.... plus I REALLY need to install my sprayed Noctua fan. So damn lazy!

Hope you guys like


----------



## RagingCain

Cool video Jugg, side note,

For any old, beta, Quadro, modded, or new drivers, go to LaptopVideo2Go.Com (heard your hard-to-find driver comments.)

The EVGA BIOS version 2 (92/93) is running pretty solidly despite one GPU being at only 0.913V.


----------



## ReignsOfPower

Quote:


> Originally Posted by *RagingCain;13777377*
> Cool video Jugg, side note,
> 
> For any old, beta, Quadro, modded, or new drivers, go to LaptopVideo2Go.Com (heard your hard-to-find driver comments.)
> 
> The EVGA BIOS version 2 (92/93) is running pretty solidly despite one GPU being at only 0.913V.


Any ETA on BIOS version 3? I'd like 0.925 on both GPUs because I've got OCD haha. Just using the unlocked-fan BIOS myself atm. Seems to run perfectly fine at stock settings.


----------



## RagingCain

Quote:


> Originally Posted by *ReignsOfPower;13777663*
> Any ETA on BIOS version 3? I'd like 0.925 on both GPUs because I've got OCD haha. Just using the unlocked-fan BIOS myself atm. Seems to run perfectly fine at stock settings.


Just wrote a very annoyed PM to Jacob, although I know it's not his fault.

Severe BOINC errors and BSODs, nothing I can do. I may bite the bullet and fix it myself. It's getting fixed in the next two weeks or I am getting a refund.

Do not use my BIOS. It may be interfering with the latest drivers too, the ones that I told people to stay off of. That may have been the BIOS all along.


----------



## XXXfire

What kind of clocks are you fellas getting on your watercooled 590s? Not especially apt for crawling through expired forum pages :] For the folks implementing the work-around-voltage-restriction custom BIOSes, has there been a generally accepted range that's safe to operate within? (Or was all the conjecture about likely VRM failure just exaggerated hysteria?) Since the wave of contemporary dual-GPU competition has heated up (to the tune of 400-500 watts, after all), I've found great interest in the design choices, compromises, et al. of both companies. If anyone's informed enough to provide some insight, it'd be gratefully welcomed.







Especially looking forward to the inevitable postings of users lucking into paramount samples of low-volt/super-high-clock GF110s. Also wondering how the load temps are @ stock or cranked up, or in SLI tandem with one of its brothers. While I'm blathering, I'm also quite curious about clock-for-clock comparisons between a 580 locked up in SLI and its freshly minted flagship successor; is the on-board NF200 bridge beneficial, negligible, or latency-inducing.. ever so slightly, enough to substantiate a measured distinction that is utterly irrelevant? I've always been a fan of their played-out PCIe bridge/switch, though in practice I've seen its benefit most significant in triple/quad configs.


----------



## Masked

Quote:



Originally Posted by *RagingCain*


Just wrote a very annoyed PM to Jacob, although I know it's not his fault.

Severe BOINC errors and BSODs, nothing I can do. I may bite the bullet and fix it myself. It's getting fixed in the next two weeks or I am getting a refund.

Do not use my BIOS. It may be interfering with the latest drivers too, the ones that I told people to stay off of. That may have been the BIOS all along.


I'll jump on that boat, actually... with my personal card.

The new drivers are horrible and NVIDIA isn't doing its job.

I realize EVGA is not COMPLETELY to blame but, at the very least, they could be assisting with some of the issues that are present.


----------



## Levesque

Quote:


> Originally Posted by *Arizonian;13781731*
> I'm not a 590 owner but this video shows two 590's stock clock vs two 6990's Extreme BIOS on a 3D Mark 11 bench done 2 months ago.


Still posting your usual useless 2-month-old Tributar benchmarks, I see.









There was a bug in AVP with only one core of the 6990 working when he made those, and that is now fixed. That's why the 6990 gets half the FPS of the 590. Useless.

And 2x 6990s are also spanking 2x 590s in 3DMark 11. Go have a look in the OCN 3DMark 11 thread. And with those 6990s OCed to 1100/500 out there, a single 6990 can take two OCed 580s head-on easily.

AMD is on a roll, and the latest 11.5b hotfix is crazy good. I can play The Witcher 2 with Uber ON, while Nvidia users still can't.









Please stop posting useless 2-month-old benchmarks with beta drivers and bugs that are now fixed.









The 590 is a dead end, while the 6990 is improving with every new driver.


----------



## Ooimo

Quote:


> Originally Posted by *jy360;13038247*
> this thread makes me sad


Same


----------



## Shinobi Jedi

Hello fellow GTX-590 Jedi,

I've been an owner for about a month and have been following this thread. I need to get an image-hosting site set up so I can get some pics up for official membership.









However, I just wanted to chime in and say that the current drivers are working flawlessly for me at 1080p on a 60" plasma, if that helps at all....

Also, I really love this card. So much so that I'm considering getting another one before they're all gone for good, as overkill as that is for my current display; I'd rather have it for later, when I step up to a 2560x1600 or Surround gaming display.

But I would have to get a new PSU as well...

My current plan is just to roll with the one card and then see what Kepler brings in the fall. But if Kepler isn't going to work with X58 chipsets and needs a new mobo/CPU, then I'd rather go Quad SLI with two 590's in my current rig and stretch the CPU and mobo out longer.

I'm also worried that 1.5GB of VRAM per GPU may not be enough when jumping into high-resolution Surround gaming.

So my plan is to either wait for Kepler, or wait and see if this rumored revised 590 ever turns out to be real (I don't believe it will), and then possibly/probably qualify for it through EVGA's Step-Up if it happens within the next 60 days..

Or, if Kepler turns out to need a new mobo or CPU, then I'd rather try two 590's SLI'd with a new PSU and be ready for when I get a new display, stretching this current rig out until I can afford a real legit new one..

(I'm holding off on a higher res or multi monitor display setup due to spacing issues)

If anyone has any ideas or input, it's greatly appreciated.

I've still got one day to return the card and exchange it for a GTX 580 3GB, which I could SLI with another, if anyone thinks that's a better option. Though I'm hesitant to do it, since I don't know when I'd be able to get a second card, and I don't feel that unknown wait is worth the significant performance hit I'd be taking.

(I know, because I originally had one 580 3GB that I exchanged for the 590 when I saw that the 580 3GB hardly did anything better than the original 570 HD I first bought and tried.)

Almost every game has been a locked 60fps (my display's refresh rate is 60Hz).

Although I am getting some barely perceptible frame dips at 1080p in The Witcher 2, it's only dipped into the 50's in terms of FPS. It's got me worried I won't be ready for BF3 in the fall, and if I'm not, that it'll be impossible to find a second GTX 590 due to its boutique limited production run..

Also, as shallow or inconsequential as it may be, I really dig the light-up GeForce logo LEDs built into the card and how it looks in my case. I know it's inconsequential to gaming, but if you appreciate aesthetics or take the time to build rigs like we all do here, then it's definitely a nice touch.

One thing's for sure: I've gotten almost as many compliments on that small bit of bling as on the card's performance itself..









Thanks to all who take any time and consideration to offer some input.

To be clear, I'm 90% leaning toward keeping the card and 10% considering exchanging it for a 580 3GB SLI setup. But knowing I get more than enough performance with one 590, and that a major chip revision with Kepler is around the corner, I'm heavily leaning towards staying with what I have and seeing what Kepler brings to the table for performance.

If it ends up that, when Kepler debuts, two 590's in Quad SLI are my best option for BF3 without having to build/buy a whole new rig, and they're all gone, I'm gonna be so bummed...









Thanks again for any insight-

I hope to have my pics up soon! Cheers!


----------



## exlink

Got some time to play Bad Company 2 last night and it's crazy smooth. I used to have two HD 5870 2GB's in Crossfire, and while I was also getting high FPS, it still seemed to lag now that I compare it to my GTX 590. Maybe there was a slight hint of microstutter with my Crossfire set-up before, because I've cranked up the settings for my GTX 590 (4320x900, 16xQ CSAA / 16x AF, max settings, HBAO on), get the same FPS (~70-80), and it's much smoother.

Enjoying the green side again. AMD's got a good generation of cards, especially for the price, but I was never really a fan of their drivers. Nor was I going to pay $700+ for an HD 6990 that sounds like a high-powered vacuum and performs roughly the same as, or slightly better than, the much quieter GTX 590.

Going to overclock it this weekend. Not going to do any voltage changes until it's safe(r). Hoping for 680MHz on the core; a 50MHz overclock wouldn't be too bad on stock volts.


----------



## exlink

Quote:


> Originally Posted by *Shinobi Jedi;13784615*
> Hello fellow GTX-590 Jedi,
> 
> If it ends up that, when Kepler debuts, two 590's in Quad SLI are my best option for BF3 without having to build/buy a whole new rig, and they're all gone, I'm gonna be so bummed...


Are you playing at 1080p? Because just so you know... a *SINGLE* GTX 580 has been running the gameplay videos (an unoptimized pre-alpha build at high settings) released prior to E3 (I'm not sure if they're still on a single GTX 580 at E3 as well). For 1080p, the GTX 590 will be ample power for the final, optimized build of the game. They are building the game from the ground up for every platform, including PC, so expect it to be a LOT more optimized than Bad Company 2. EA/DICE have already stated that if you can play Bad Company 2 well, then you'll play Battlefield 3 well.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *exlink;13784777*
> Are you playing at 1080p? Because just so you know... a *SINGLE* GTX 580 has been running the gameplay videos (an unoptimized pre-alpha build at high settings) released prior to E3 (I'm not sure if they're still on a single GTX 580 at E3 as well). For 1080p, the GTX 590 will be ample power for the final, optimized build of the game. They are building the game from the ground up for every platform, including PC, so expect it to be a LOT more optimized than Bad Company 2. EA/DICE have already stated that if you can play Bad Company 2 well, then you'll play Battlefield 3 well.


Nice!! Thanks for the heads up, that takes a load off my mind..

+rep!


----------



## Cyanotical

Would like to add my name to that list:
http://valid.canardpc.com/show_oc.php?id=1795955


----------



## Shinobi Jedi

Oh man, I am so tempted to get a second EVGA GTX 590 Classified LE card even though it is absolute overkill for my 1080p display...

Although I still get some frame rate drops under 60fps in some areas with Metro 2033 and Witcher 2. Though nothing unplayable... for sure..

My plan so far is to wait on Kepler and see how much that changes the game. But the 590 cards were in stock at a couple of local stores and online at EVGA yesterday (might be gone by now...),
so I'm tempted to pull the trigger before they're gone for good.

And if Kepler does need a whole new socket/mobo architecture, I've thought about going Quad SLI in my current rig with two 590's. Then, when that system stops running games full blast, I figure I'd get a whole new rig from scratch and use my current one as a second machine: a guest box, a backup, a server, or even a loaner to some of my good gaming buddies who can't afford PC gaming..

I'd also have to get a new PSU. Everyone on the EVGA forums is swearing by the Corsair AX1200...

So I'm looking at dropping another $1100 on my current X58 based setup to game on a 60" 1080p Plasma display.

Although I think I may have figured out a way to fit a desk and 3 monitors for Surround Gaming..

And if I get 3 that can do 2560x1600, will 2 590's be enough? Or will I "definitely" need cards with 3gb of VRAM?

So - Getting a 2nd GTX 590*...

Good Idea?

Bad Idea?

or

Focus my upgrade $$ on getting 3 display's together first?

Or maybe one single 120hz display? I hear gaming at 120fps on a 120hz display is like no other. (Though I'm sure surround is better)

Talk me into or out of it!

While I know it's overkill for this display, the "emotional incentive", for lack of a better term, is to do it knowing this is the most I can max this system out with this mobo/CPU/GPU combo..

If Kepler works fine on the X58 mobo's then I'll happily wait until then and see what's up before I upgrade.

Last question: Will upgrading my RAM, or increasing the amount, do anything to help gaming performance, or is that just a waste of time? Because other than storage, I don't see where else I can add to increase performance besides RAM. But if it's a waste of time, then I won't bother.

I was also tempted to get an SSD for games until I heard about data rot, so I figure I'm holding off on that.

So... that all leads me back to my temptation to get a second 590.

Talk me into, or out of it! This is the only place on the web where you can get "non-hater" input on the GTX 590! Any input is appreciated!

*While I'm always down for OC'ing if it brings much-needed or exceptional game performance, I don't have any interest in being able, or not able, to OC the 590's. Even at these "low" clocks, games are getting destroyed by this card, so not being able to overclock, even for future-proofing, is not a warranted reason for me to skip a second one. Hitting stability or performance problems because 1.5GB of VRAM per GPU isn't enough for Surround gaming: that is a valid reason to reconsider. Though to be fair, a guy on the EVGA forums who had Tri-SLI 580 3GB cards was still hitting performance caps at 8100x900 or whatever it is...

Thanks for any input...

Let the "debate" begin!


----------



## ReignsOfPower

Well, I like to look at it this way: is 1100 bucks worth playing these games slightly smoother?

1. Metro 2033 - no
2. Crysis - no
3. The Witcher 2 - well, without ubersampling I run it at 60fps+ constantly on a 2560x1600 screen, so no. Even though I love that game to bits.
4. Dragon Age II with DX11 tessellation - no. Just turned that crap off, finished the game once, never played it again. DA1 was so much better








5. Every other game on max = NP.

Wait for a Dual GPU Kepler. That's what I'm doing. Either way, next upgrade for me is a 2011 Socket mATX Overclocking mobo, hexacore Ivy bridge and a new set of ram. Kepler will come after that.









Ragin- Any RE: concerning BIOS 3?


----------



## RagingCain

Quote:



Originally Posted by *Cyanotical*


Would like to add my name to that list:
http://valid.canardpc.com/show_oc.php?id=1795955











Love the color scheme, welcome!

Quote:



Originally Posted by *ReignsOfPower*


Well, I like to look at it this way: is 1100 bucks worth playing these games slightly smoother?

1. Metro 2033 - no
2. Crysis - no
3. The Witcher 2 - well, without ubersampling I run it at 60fps+ constantly on a 2560x1600 screen, so no. Even though I love that game to bits.
4. Dragon Age II with DX11 tessellation - no. Just turned that crap off, finished the game once, never played it again. DA1 was so much better








5. Every other game on max = NP.

Wait for a Dual GPU Kepler. That's what I'm doing. Either way, next upgrade for me is a 2011 Socket mATX Overclocking mobo, hexacore Ivy bridge and a new set of ram. Kepler will come after that.









Ragin- Any RE: concerning BIOS 3?










RagingCain is on RagingCain's BIOS 3: voltage unlocked to 0.975v, stock clocks. I have been ignored for about a week. They are probably tired of me finding issues with their BIOS, or they are swamped. Either way, I can't wait forever for post-sales support; rocking my own BIOS solved the issue. Min voltage is 0.975 with the latest drivers; I simply undervolt to 0.925v for stock clocks. No point in unlocking higher since PDL affects just about everything.

All GPUs (4 of them) fixed.

Quote:



Originally Posted by *Shinobi Jedi*


Oh man, I am so tempted to get a second EVGA GTX 590 Classified LE card even though it is absolute overkill for my 1080p display...


I disagree with this; the more experienced members all know there are plenty of ways to make 1920x1080 bring computers to their knees, especially in Crysis and Metro 2033. It's also a good way to future-proof.

Quote:



Although I still get some frame rate drops under 60fps in some areas with Metro 2033 and Witcher 2. Though nothing unplayable... for sure..



Quote:



My plan so far, is to wait on Kepler and see how much that changes the game. But the 590 cards are in stock at a couple local stores and online at EVGA yesterday (Might be gone by now...)
And so I'm tempted to pull the trigger before they're gone for good.


It's really about today, and then tomorrow, for me. I always plan my system out for what I have not thought of. Sure enough, it's a little bit easier since you can always buy a second or third video card down the road if you built your system correctly. With the 590, that option will not be there in the future, so it's really a now-or-never type deal.

Quote:



And if Kepler does need a whole new socket/mobo architecture, I thought about going Quad SLI in my current rig with 2 590's and then when that system stops running games full blast, then figure I'd get a whole new rig from scratch and use my current one as a second machine for a guest or backup or server or even as a loaner to some of my good gaming buddies who can't afford PC gaming..


I am always leery of new architectures. After this last battle royale with drivers, I am seriously not impressed. I will definitely wait for not just the new architecture but a minimum of 6~12 months of mature drivers.

Quote:



I'd also have to get a new PSU. Everyone on the EVGA forums are swearing by the Corsair AX1200w...


I hear really good things about the AX1200 and it should be able to handle quad 580s, or two 590s.
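That headroom claim can be sanity-checked with back-of-the-envelope wattage math. A minimal sketch: the 365W figure is the GTX 590's stock TDP, while the CPU and "everything else" numbers are rough assumptions of mine (and overvolting, as discussed in this thread, can push real draw well past TDP):

```python
# Back-of-the-envelope PSU headroom check for a dual GTX 590 build.
# TDP figures are nominal; overclocking/overvolting can exceed them.

def psu_headroom(psu_watts, loads, margin=0.9):
    """Return (total_draw, fits): fits is True when the summed load
    stays under `margin` (90% here) of the PSU's rated output."""
    total = sum(loads.values())
    return total, total <= psu_watts * margin

loads = {
    "gtx_590_a": 365,              # stock TDP per card
    "gtx_590_b": 365,
    "cpu_overclocked": 200,        # rough guess for an OC'd X58 chip
    "board_ram_drives_fans": 100,  # rough guess for everything else
}

total, fits = psu_headroom(1200, loads)
print(total, fits)  # 1030 True: ~1030W, inside 90% of 1200W
```

Under those assumptions an AX1200 has roughly 170W of slack at stock TDPs, which is why heavy overvolting on both cards starts to look tight.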

Quote:



So I'm looking at dropping another $1100 on my current X58 based setup to game on a 60" 1080p Plasma display.


I am not sure why you would get a 1080p plasma when a 30" 2560x1600 monitor with low input lag would be a much better choice for the price.

Quote:



Although I think I may have figured out a way to fit a desk and 3 monitors for Surround Gaming..


I went multi-monitor instead of a single 30" 2560x1600 monitor. I totally wish I had gone the other route gaming-wise, but for productivity like Photoshop or Visual Studio, nothing beats 3 monitors.

Quote:



And if I get 3 that can do 2560x1600, will 2 590's be enough? Or will I "definitely" need cards with 3gb of VRAM?


It all depends on the video game AND your AA levels. For instance, some games have a small memory footprint, while others like Crysis / Metro 2033 have tons of VRAM usage even at lower 1920x1080/1200 resolutions. Most games should be very playable even with AA on at 2560x1600.

Quote:



So - Getting a 2nd GTX 590*...

Good Idea?

Bad Idea?


There is an extra level of difficulty in getting Quad SLI running smoothly in a lot of games; the drivers can be chaotic at times and can really leave you... pissed off (for lack of a more elegant way of saying it.)

But when it does run well.... oh, does it run well. My first title to play was Crysis, max settings, 5760x1080 with 4xAA... boy, did it run beautifully.

Quote:



Focus my upgrade $$ on getting 3 display's together first?


We can't decide this for you. An additional headache to consider is the fact that the 590 will not always be available to buy new.

Quote:



Or maybe one single 120hz display? I hear gaming at 120fps on a 120hz display is like no other. (Though I'm sure surround is better)


I have my center monitor at 120Hz, because I knew some games would suck in multi-monitor, and in my opinion there is nothing that will make you a better FPS player than a 120Hz monitor. It's so smooth and really unlocks even more potential in players.

Quote:



Talk me into or out of it!

While I know it's overkill for this display, the "emotional incentive" for lack of a better term - is to do it knowing this is the most I can max this system out with this mobo and CPU combo and GPU..

If Kepler works fine on the X58 mobo's then I'll happily wait until then and see what's up before I upgrade.


It's really up to you at the end of the day; there are serious pros and cons both to getting and to not getting a second 590. More so, you can always wait for something more powerful to come out. This is always true in computing: something bigger and shinier is always around the corner. Sometimes it's prudent to wait, though sometimes waiting ends up letting people down too, especially when real-world performance is way lower than the white-paper specs.

Quote:



Last quesiton: Will upgrading my ram, or increasing the amount, do anything to help with gaming performance - or is that just a waste of time? Because other than storage, I don't see where else I can add to, to increase performance besides Ram. But if it's a waste of time, then I won't bother.


Fast RAM rarely increases gaming performance. Ever. Is it good to have? Sure. Is it a game changer? No.

Quote:



I was also tempted to get an SSD for games, until I heard about data rot and so I figure I'm holding off on that.


SSDs are best for operating systems and games like WoW that constantly read from disk. RAID arrays are still the best bang for the buck and performance when it comes to games.

Quote:



*SNIP*

Let the "debate" begin!


It's all about what you want. Any one of us could talk you into buying a 590, and a few of us could talk you out of buying one too.

I personally have a new headache every other day but, like I said, when it works.... oh, it's nice (big grin.)


----------



## exlink

I'm still debating whether or not to get rid of my 3x19" LED set-up (4320x900) and get a single 27" LED that is 1920x1080. I've basically looked at it this way:

*3x 19" LED Surround*
4320x900 (3.9MP)
*Pros:*
Larger FOV in games
More screen space
Easier multi-tasking
*Cons:*
Large resolution - requires more horsepower
Small monitors if used individually
Takes up a lot of space
Bezels sometimes, but rarely, get in the way

*Single 27" LED Monitor*
1920x1080 (2.1MP)
*Pros:*
Requires a lot less processing power
Huge uniform screen
More space on individual screen
Takes up less space
*Cons:*
Loss of some FOV in games
Loss of desktop space
Less of an "immersion" effect in games

Making the swap would pretty much cost me nothing, since I'd get about the same amount for my 3 monitors as it would cost me to buy the 27" monitor. I really game about 70% of the time I spend on the computer. Wish I had the money to just invest in 3x 27" monitors and two GTX 580 3GB's and have the best of both worlds. Too bad money doesn't grow on trees.
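The megapixel figures in the comparison above fall straight out of the resolutions; a quick sketch of the arithmetic:

```python
# Pixel counts behind the surround vs. single-monitor comparison.
surround = 4320 * 900    # 3x 19" panels at 1440x900 side by side
single = 1920 * 1080     # one 27" 1080p panel

print(surround / 1e6)    # 3.888 -> the "3.9MP" figure above
print(single / 1e6)      # 2.0736 -> the "2.1MP" figure above
print(surround / single) # 1.875: surround pushes ~88% more pixels
```

That 1.875x pixel load is the whole "requires more horsepower" con in one number: the GPUs render nearly twice as many pixels per frame in surround.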









It's a hard decision to make.


----------



## Shinobi Jedi

@ RagingCain

Awesome info and insight. I read you on everything you're saying, and you've given me a lot to think about...

Question: How much do you think driver maturity will iron out the kinks you're having with the card? Enough to make the difference? Or not at all?
My Metro bench runs went up 10fps when switching from the 270 drivers to the 275 betas and now the WHQLs. On the 270 drivers I'd get the same results in the bench program as the reviews: 48fps. Now I get 60-61 on all three runs of the test. So I was hoping that better drivers might work out some of these Quad SLI kinks. But if not, then maybe it's not worth it? I guess there's really no way of knowing ahead of time.

To answer your question about my display: I'm a cinephile, so we bought it mainly for home theater, Blu-ray, and console gaming, before I got back into PC gaming. Pioneer no longer makes displays, and their Kuro line of plasmas is still a contender for best picture in a home theater setup, though new technology should eclipse that soon.

Since I still use it a lot for movie watching and it's ISF-calibrated, I want to keep it. Until there is an LED or LCD that can produce the same deep blacks, it's hard to let go of this display, especially since many are worth more than MSRP now that they're discontinued (though that's sure to change soon as LED tech finally catches up.)

So since I don't have anywhere to store it and it's too nice to just sell, I'm holding onto it. Plus, I can't go from 60" to 30" for movie watching just for 2560x1600. And I live in a one-bedroom apartment with nowhere to store the display to make room for a Surround setup.

So those are the main reasons, though I do agree that for PC gaming a higher resolution and smaller screen can be a stronger choice.

However, buying one kick-arse 30" 120Hz 2560x1600 panel is a very good idea and would fit even better in the space I'm thinking of.

I hope that answers your question!

Thanks again!


----------



## Shinobi Jedi

Quote:


> Originally Posted by *ReignsOfPower;13801672*
> Well I like to look at it this way. Is 1100 bucks worth playing these games slightly smoother?
> 
> 1. Metro 2033 - no
> 2. Crysis - no
> 3. The Witcher 2 - well, without ubersampling I run it at 60fps+ constantly on a 2560x1600 screen, so no. Even though I love that game to bits.
> 4. Dragon Age II with DX11 tessellation - no. Just turned that crap off, finished the game once, never played it again. DA1 was so much better
> 
> 
> 
> 
> 
> 
> 
> 
> 5. Every other game on max = NP.
> 
> Wait for a Dual GPU Kepler. That's what I'm doing. Either way, next upgrade for me is a 2011 Socket mATX Overclocking mobo, hexacore Ivy bridge and a new set of ram. Kepler will come after that.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ragin- Any RE: concerning BIOS 3?


Reins,

This is pretty much my logic. Thanks for the affirmation!

I really see both sides to this.

I think I'm just going to leave it up to the fates and see how my money situation turns out and if the card's still in stock. If things work out where I can buy a second card and can do so, I probably will. If not, I'm not going to lose any sleep.

Thanks for the responses. I'd love to hear any other insight if anyone has any.

Cheers!


----------



## RagingCain

Quote:


> Originally Posted by *Shinobi Jedi;13806094*
> @ RagingCain
> 
> Awesome info and insight. I read you on everything you're saying, and you've given me a lot to think about...


No problem, that's what we're here for








Quote:


> Question: How much do you think driver maturity will iron out the kinks you're having with the card? Enough to make the difference? Or not at all?


Here is the thing with drivers: although nobody in the nVidia GPU camp really wants to admit it, their QA has gone downhill over the last year or so, in my opinion. They used to be known for releasing solid, tank-like drivers and testing new things out on the beta front. Now it seems the drivers are just put out willy-nilly; they make minor changes ONLY to the most popular titles OR those that are being actively complained about in the community. It's not a terrible approach; after all, there are a billion titles and most of them do work adequately. However, when you are paying $500 or $700 for a video card, that's not the support you want to see. You want your card to work screaming right out of the box, and that is it in a nutshell. There is nobody doing this right now. Not AMD/ATi, not nVidia, nobody. The primary reason, I believe, is that QA is one of the first places (if not THE first place) budget cuts affect (hehe, I am biased, being a recently dismissed QA person.)

So right now, WE have become their QA. Keep that in mind when I say driver maturity. Driver maturity is a combination of in-house testing AND response to the community. Responses to the community can take anywhere from a week to months to produce a fix. A patch in one title may affect others as well, hence the general increase in stability. HOWEVER, a lot of the time EACH glitch or bug has to be proactively reproduced, debugged, and then fixed (usually within that specific game's profile, or in a generic sub-set of profiles such as "DX11 Titles"). Then of course new title upon new title is released, and I've got to tell you, it's not easy beta testing new games (especially when your job is to play through the game and find the glitches yourself). Then you have to consider what should be a game-engine fix versus a driver fix, which requires back and forth between the dev and nVidia. It's a very time-consuming process, and because of a lack of standardization (even with the DX9/10/11/OGL APIs), there are many headaches that both we and they have to put up with.

The best thing is to be a pro-active consumer. We may not want to be QA, but we are, and doing nothing about it only sucks for us, they have our money already. In theory, everything should get better with time... in theory.
Quote:


> My Metro bench runs went up 10fps when switching from the 270 drivers to the 275 betas and now the WHQLs. On the 270 drivers I'd get the same results in the bench program as reviews: 48fps. Now I get 60-61 on all three runs of the test. So I was hoping that better drivers might work out some of these Quad SLI kinks. But if not, then maybe it's not worth it? I guess there's no real way of knowing ahead of time.


Quad-SLI is definitely dependent on peeps like me bitc... formally discussing the issues we are having with titles, either on nVidia's forums or through bug submission tickets.
Quote:


> To answer your question about my display, I'm a cinephile, so we bought it mainly for home theater, Blu-ray and console gaming. I bought it before I got back into PC gaming. Pioneer no longer makes displays, but their Kuro line of plasmas are still contenders for best picture in a home theater setup, though new technology should eclipse that soon.


I must have misread; I thought you were buying a plasma to game on. If you already have one, then use the heck out of it.

Quote:


> Since I still use it a lot for movie watching and it's ISF calibrated, I want to keep it. Until there is an LED or LCD that can produce the same deep level of blacks, it's hard to let go of this display. Especially since many are worth more now than MSRP, since they're discontinued (though that's sure to change soon, as LED tech is finally catching up.)


I don't think LED-backlit LCDs will ever reach your level of.... black.
Quote:


> So since I don't have anywhere to store it and it's too nice to just sell, I'm holding onto it. Plus, I can't go from 60" to 30" for movie watching just for 2560x1600, and I live in a one-bedroom apartment with nowhere to store the display to make room for a Surround setup.
> 
> So those are the main reasons, though I do agree that for PC gaming a higher resolution and smaller screen can be a stronger choice.


Wait till you have seen either a video game or movie footage in 1600p. The difference between 1600p and 1080p is akin to seeing 640x480 vs 1080p. It will rock your socks, and then leave you broken and battered.... till you buy one.... (personal experience.) Still, I am happy with my Surround from time to time, and nothing beats three monitors for productivity, for me. Consider getting a similar-size screen for the center monitor that is also 120 Hz, like mine. When you game on just the center monitor, you then have the option of 3D, or gaming at a solid 120 Hz refresh rate where everything is definitely liquid smooth.
Quote:


> However, buying one kick arse 30" 120hz 2560x1600 panel is a very good idea and would even be better in the space I'm thinking of.
> 
> I hope that answers your question!
> 
> Thanks again!


It would be great, but right now flat-panel technology (and possibly the adapters) limits us to 1920x1080 at 120 Hz. I am sure we will see a 1200p 120 Hz monitor any day now, but I don't think many adapters (maybe a 570/580/590) could support such a refresh rate at 1600p. The most you could hope for right now is 1600p at 60 Hz.


----------



## Arizonian

Quote:


> Originally Posted by *RagingCain;13808511*
> They used to be known for releasing solid, tank-like drivers and testing new things out on the beta front. Now it seems the drivers are just put out willy-nilly, with minor changes ONLY to the most popular titles, or to those being actively complained about in the community. It's not a terrible approach; after all, there are a billion titles and most of them do work adequately. However, when you are paying $500 or $700 for a video card, that's not the support you want to see. You want your card to work screaming fast out of the box, and that's it in a nutshell. Nobody is doing this right now. Not AMD/ATi, not nVidia, nobody. The primary reason, I believe, is that QA is one of the first places (if not THE first place) budget cuts affect (hehe, I am biased, being a recently dismissed QA person.)


I'd have to disagree; the last two drivers focused mostly on the 560 and 580 cards. I can understand your frustration, because the 590 hasn't received the attention it currently needs and its issues haven't been addressed. They have also improved 3D support at the game, software, and hardware level. They are improving their bread-and-butter, most popular cards first. I don't have, and haven't had, any issues with any of the drivers, and I keep up to date with each release.


----------



## RagingCain

Quote:


> Originally Posted by *Arizonian;13809476*
> I'd have to disagree; the last two drivers focused mostly on the 560 and 580 cards. I can understand your frustration, because the 590 hasn't received the attention it currently needs and its issues haven't been addressed. They have also improved 3D support at the game, software, and hardware level. They are improving their bread-and-butter, most popular cards first. I don't have, and haven't had, any issues with any of the drivers, and I keep up to date with each release.


Both the 560 and 580 do get a lot of focus because, going by observations, the 560 has to be the most popular mid-range card, and the 580 is the oldest of the newest generation. It's finally going to mature.

This still points to primarily responses to largest crowds which is not how it used to be. Every card used to get full love and support.

The 3D has been getting numerous updates, true. Multi-monitor support still has numerous glitches in the control panel alone, let alone in some games, where it just causes the game to crash or flicker. It feels like beta testing a cheap or open-source program half of the time. Many of the profile changes fail to save on reboot, or even fail to get applied at all.

There are plenty of issues still needing to be fixed that have been on my 580s since day one. Low GPU usage comes to mind as the worst offender. DA2 still being unplayable at times on very high, due to artifacting shadows on a 5xx card, is another example.

Throw in driver failures/recoveries and BSODs, and we are at ATI's 58xx/5900 crappy-driver levels. That's how low we have gotten.

Post powered by DROID X2


----------



## Arizonian

Quote:



Originally Posted by *RagingCain*


This still points to primarily responses to largest crowds which is not how it used to be. Every card used to get full love and support.

The 3d has been getting numerous updates, true. Multi-monitor support still has numerous glitches just in the control panel alone, let alone in some games it just causes the game to crash or flicker. It feels like beta testing a cheap or open source program half of the time. Many of the profile changes often fail to save on reboot or even failed to get applied.


Point taken. Duly noted.

What would you say about the SLI enhancements? How would you rate SLI performance in the drivers lately?

I'm thinking about going SLI instead of changing out a 580 when Kepler comes out, which would take me easily to Maxwell.

I'm figuring Kepler, being a first-run 28nm fabrication, will be like first-run Fermi. I'd wait it out, and by Maxwell the bugs will be ironed out.


----------



## yellowtoblerone

Quote:



Originally Posted by *ReignsOfPower*


Well I like to look at it this way. Is 1100 bucks worth playing these games slightly smoother?

Ragin- Any RE: concerning BIOS 3?


You shouldn't be talking about "worth it" with nvidia


----------



## RagingCain

Quote:



Originally Posted by *Arizonian*


Point taken. Duly noted.

What would you say about the SLI enchancment? How would you rate SLI performance in drivers as of lately?

I'm thinking about going SLI instead of changing a 580 when Kepler comes out which will take me easily to Maxwell.

I'm figuring Kepler being first run 28nm fabrication will be like first run Fermi. I'd wait it out and by Maxwell the bugs will be ironed out.


Do you mean EVGA's SLI enhancement patcher? Because that is usually 100% worth it. The pain of it is they rarely tweak it for the beta drivers. Installing old enhancements over newer drivers will work, but oftentimes the game glitches fixed in the newer drivers get unfixed.

I am with you 100% on waiting for Maxwell, and maybe going dual-GPU again. Hopefully with more robust VRMs. If Maxwell follows the road map, we may see a tri-GPU card, given PCI-e 3.0's power delivery and bandwidth and the efficient Maxwell design.

Post powered by DROID X2


----------



## EvilClocker

RagingCain, can you send me your BIOS with the voltage tweak?


----------



## RagingCain

Quote:


> Originally Posted by *EvilClocker;13814093*
> RagingCain, can you send me your BIOS with the voltage tweak?


Sent. Might have to just post them on the forums. I gave EVGA plenty of time to fix their BIOS.

Edit: Attached them.

backa.rom - Plain EVGA GTX 590 GPU0 - BIOS 92
backb.rom - Plain EVGA GTX 590 GPU1 - BIOS 93
GTX590A.ROM - Modded BIOS 92, stock voltage 0.975v, correct voltage display.
GTX590B.ROM - Modded BIOS 93, stock voltage 0.975v, correct voltage display.

All properly down-volt/downclock at idle; the stock clocks are 630 MHz (standard EVGA).

Confirmed working on the 275.33 drivers; I've had zero issues since flashing (previously I could barely even install them.)

You can also down-volt to your heart's content, but keep an eye on Afterburner... for some reason, 2.2.0 Beta 3 is not adjusting the voltages on the other GPUs even if you have it synchronize settings for identical GPUs, so you have to manually adjust them lower on each one.

GTX590A-2.ROM is a special ROM, which I have enabled the nVidia OEM message saying "Your running RagingCain's Modded 92/93 BIOS."

Please don't use these if you don't know what you are doing. I take no responsibility for any screw-ups or burnt cards. You can always attempt a blind flash if it goes horribly, hence the backa/backb ROMs.

To flash, you will need to read a guide and create a bootable USB drive, then put nvflash and its associated CP.exe, along with the ROMs, in the root directory.

Once booted into C:\ (prompt) the command to begin is:
nvflash -4 -5 -6 GTX590A.ROM

It will display a lot of text/data then ask you (Y)es, (A)bort, (S)kip. If you want to continue press Y.

Once back on C:\ command line type:
nvflash -4 -5 -6 GTX590B.ROM

Y again for the win.

Note: If you have two 590s (you have two GPU0s and two GPU1s), then after you flash one GPU0, nvflash will scan for any more GPU0s and ask if you want to flash that one too. Y is the answer here as well.

So if you have two 590s, you will have to press Y twice when asked, to successfully flash both GPU0s, and twice more for the GPU1s after nvflash is run again.
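To recap, the whole procedure boils down to two nvflash invocations typed at the bootable USB's DOS prompt. The sketch below only assembles and prints those command lines so you can double-check the switches and filenames; it does not flash anything, and the `flash_cmd` helper name is just for illustration:

```shell
# Sketch only: builds the nvflash command lines described above.
# The real commands are typed at the bootable USB's DOS prompt.
# -4 -5 -6 are the override switches used in this guide.
flash_cmd() {
  echo "nvflash -4 -5 -6 $1"
}

flash_cmd GTX590A.ROM   # GPU0 ROM first; answer Y at the (Y)es/(A)bort/(S)kip prompt
flash_cmd GTX590B.ROM   # then the GPU1 ROM; answer Y again
```

With two 590s, each invocation will prompt you twice (once per matching GPU), per the note above.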


----------



## yellowtoblerone

Another Asus 590 has come in stock on newegg. I was pondering pulling the trigger today again. I decided against it since I still don't know if they're coming out with rev 2 or not.


----------



## Arizonian

Quote:


> Originally Posted by *yellowtoblerone;13818957*
> Another Asus 590 has come in stock on newegg. I was pondering pulling the trigger today again. I decided against it since I still don't know if they're coming out with rev 2 or not.


I haven't heard of any revision 2. It would mean they admit there was a problem.

I have heard ASUS is putting out a dual-GPU GTX 580x2: a non-reference PCB with two fully loaded 580s, no holding back. It might be rumor.


----------



## Masked

There will be NO revision 2.

The run of 590's is done, there will be no more staggered releases.

I actually discovered how many 590's we have as a vendor, and I'm very surprised.

The ONLY upcoming dual-GPU card is the Asus 580x2, which will be priced at over $1000 with 1.5GB.

The Asus model isn't that impressive... Nor am I currently impressed with my 590's, thus the 6990's, but in truth I'm even less impressed with the 6990's, so let's not go there altogether.

Stock is out, it won't be back in, if you find a 590 and truly want one, get it while you can.


----------



## RagingCain

Nothing like Kingpin to restore some faith in this card with his Zombie creation:
http://kingpincooling.com/forum/showthread.php?t=1279

THE fastest single video card in the world. His Vantage benchmark score was better than my SLI 580s @ 965 MHz.


----------



## Booty Warrior

Quote:


> Originally Posted by *RagingCain;13826689*
> Nothing like Kingpin to restore some faith in this card with his Zombie creation:
> http://kingpincooling.com/forum/showthread.php?t=1279
> 
> THE fastest single video card in the world. His vantage benchmark was better than my SLI 580s @ 965 MHz.


Well... damn.


----------



## RagingCain

Quote:


> Originally Posted by *Booty Warrior;13826859*
> Well... damn.


Best part, according to him, he isn't even done tweaking it, and has more voltage mods to do for the Memory.

On another note, request for a 1.0v bios was made, so I attached it here.

Please use responsibly, and if you are not overclocking, lower the voltage.

I also want to add, I haven't had one gosh darn problem since flashing. Looks like I am stable again.


----------



## Masked

Quote:


> Originally Posted by *RagingCain;13826880*
> Best part, according to him, he isn't even done tweaking it, and has more voltage mods to do for the Memory.
> 
> On another note, request for a 1.0v bios was made, so I attached it here.
> 
> Please use responsibly, and if you are not overclocking, lower the voltage.
> 
> I also want to add, I haven't had one gosh darn problem since flashing. Looks like I am stable again.


Stable with the new WHQL as well?


----------



## RagingCain

Quote:


> Originally Posted by *Masked;13828024*
> Stable with the new WHQL as well?


Yep, been playing Black Ops and Bad Company 2 with no problems, just did Boincers Gone Bonkers without a hitch, and the female friend has been enjoying the Sims 3 in Surround.


----------



## Krazeswift

Does anyone have a modded Asus 590 bios to hand?

I had some really nice stable clocks when I could ramp it up to 963mV.

Cheers guys.


----------



## RagingCain

Quote:


> Originally Posted by *Krazeswift;13831862*
> Does anyone have a modded Asus 590 bios to hand?
> 
> I had some really nice stable clocks when I could ramp it up to 963mV.
> 
> Cheers guys.


If anybody wants me to mod a BIOS, just attach your personal GPU0/GPU1 BIOSes in a .zip or .rar to the post and give me the settings. To get your own BIOS, open up GPU-Z and select the specific GPU with the drop-down box at the bottom; next to the BIOS version there is a tiny little "BIOS chip" icon, which actually saves a dump of your BIOS.

If you have SLI, I only need the first card's BIOSes. If they are not identical BIOS versions, then definitely give me all 4, and you can flash 4 times.

I *WILL NOT* change clock speeds, and *the max voltage I will unlock is 1.05v*.


----------



## Shinobi Jedi

I'm kinda bummin.

It looks like my PSU may be going out, so now that I know that I have to upgrade that, I really want to add a second GTX 590 for Quad SLI.

I was going to come into the money next week to get one too, but the 2 that were in stock at the Fry's in the area are gone as well as the stock at EVGA's own store.

I'm hoping someone is going to read all the bad hype online about how the card won't overclock and return one to a Fry's. But my expectations are pretty well managed... as in, I expect not to end up being able to find a second card to SLI with my current one.

It is what it is, I guess..

Games are getting crushed already, but I liked RagingCain's advice of going big with the 590's and using them through Kepler until they make sure the new architecture has all the kinks worked out.

My plan was to get a second card, and then get the Dell 2560x1600 monitor everyone raves about, or three of them. But with only one card, I don't know if it'll be worth it. I like to turn "everything" up to 11 when I game.
But I can't really complain or be bummed, I guess. Just on this one card, I am still getting an average of 80-120fps in BFBC2 with Vsync turned off and everything turned up, including in-game AA at 32x (or whatever it exactly is), on my 60" plasma.

Well, if I somehow land another one, I'll be stoked. But if not, is it worth going to a 2560x1600 display (or three) with just one 590, if you 'have' to turn everything up to "11"?

The Spinal Tap metaphor hits perfectly how I feel about the GTX 590..

Q: "Why would you buy a card with clocks so low that you can't overvolt it into a small generator...?"

A:"Yeah but..... This one goes to 11."


----------



## Arizonian

Quote:


> Originally Posted by *Shinobi Jedi;13833548*
> 
> I was going to come into the money next week to get one too, but the 2 that were in stock at the Fry's in the area are gone, as well as the stock at EVGA's own store. It is what it is, I guess.
> 
> It looks like my PSU may be going out, so now that I know that I have to upgrade that, I really want to add a second GTX 590 for Quad SLI.
> 
> A:"Yeah but..... This one goes to 11."


The GTX 590 was a limited run, and it may be hard to find now. I've been watching, and none of the Fry's in Arizona ever have any in stock any longer. Newegg gets one once in a blue moon, and it's scooped up the same night.

A Corsair AX1200 is an awesome PSU, and most of the EVGA members would concur, if you're thinking about doing it right.

"Spinal Tap"


----------



## Pipeman597

Quote:


> Originally Posted by *RagingCain;13814423*
> Sent. Might have to just post them on the forums. I gave EVGA plenty of time to fix their BIOS.
> 
> Edit: Attached them.
> 
> backa.rom - Plain EVGA GTX 590 GPU0 - BIOS 92
> backb.rom - Plain EVGA GTX 590 GPU1 - BIOS 93
> GTX590A.ROM - Modded BIOS 92, stock voltage 0.975v, correct voltage display.
> GTX590B.ROM - Modded BIOS 93, stock voltage 0.975v, correct voltage display.
> 
> All properly down volt/downclock on idle, the stock clocks are 630 MHz (standard EVGA).
> 
> Confirmed that on 275.33 drivers, having zero issues since fixed (the ones I previously could barely install.)
> 
> You can also down volt to your hearts content, but keep an eye on Afterburner... for some reason on 2.2.0 Beta 3, its not adjusting the voltages on the other GPUs even if you have it synchronize settings for identical GPUs so you have to manually adjust them lower on each one.
> 
> GTX590A-2.ROM is a special ROM, which I have enabled the nVidia OEM message saying "Your running RagingCain's Modded 92/93 BIOS."
> 
> Please don't use if you don't know what you are doing. I take no responsibility for any screw ups or burnt cards. You can always attempt a blind flash if it goes horrible, hence the backa/backb.roms.
> 
> To flash, you will need to read a guide and create a bootable USB drive, put nvflash and associated CP.exe with the ROMs in the root directory.
> 
> Once booted into C:\ (prompt) the command to begin is:
> nvflash -4 -5 -6 GTX590A.ROM
> 
> It will display a lot of text/data then ask you (Y)es, (A)bort, (S)kip. If you want to continue press Y.
> 
> Once back on C:\ command line type:
> nvflash -4 -5 -6 GTX590B.ROM
> 
> Y again for the win.
> 
> Note: If you have two 590s, after you flash one GPU0 (you have two GPU0s, and two GPU1s), it will scan for anymore GPU0s, and ask if you want to flash it too. Y is the same answer here.
> 
> So if you have 2 590s, you will have to press Y twice when its asked to successfully flash both GPU0s and twice more for GPU1s after nvflash is ran again.


Well, I have flashed both of your BIOS revisions to my EVGA GTX 590, but the max voltage I can set in Afterburner is 0.963. Is this because the drivers are limiting it? I use 275.33.


----------



## RagingCain

Quote:


> Originally Posted by *Pipeman597;13833661*
> Well I have flashed both of your bios revisions to my EVGA GTX-590 but the max volts I can set per Afterburner is 0.963. Is this because the drivers are limiting it? I use 275.33.


Evil Clocker was saying the same thing; it's really weird.

I am reading mine as 0.975v

What are you guys using for voltage monitoring/unlocking?

I am using Afterburner 2.2.0 Beta 3, with Voltage Monitoring, and Unlocked voltage.

Try these. If they don't work, you are going to have to give me a different BIOS to work with, such as a stock EVGA BIOS.


----------



## Krazeswift

Quote:


> Originally Posted by *RagingCain;13833241*
> If anybody wants me to mod any bios, just attach their personal GPU0/GPU1 bios in a .zip or .rar to the post and give me the settings. To get your own BIOS, open up GPU-z and select the specific GPU at the bottom with the drop down box, and next to BIOS version there is a tiny little "BIOS Chip Image" this is actually to save a dump of your BIOS.
> 
> If you have SLI, I only need the first card. IF they are not identical BIOS versions, then definitely give me all 4 and you can flash 4 times.
> 
> I *WILL NOT* change clock speeds, and *the max voltage I will unlock is 1.05v*.


Thanks RagingCain, if you're able to mod the attached BIOS for me that would be brilliant!

Thanks again


----------



## RagingCain

Quote:


> Originally Posted by *Krazeswift;13834440*
> Thanks RagingCain, if your able to mod the attached bios for me that would be brilliant!
> 
> Thanks again


Here ya go. Let me know if it's working.

The other users using the EVGA BIOSes seem to be locked to 0.963v.

Let me know if yours is unlocked all the way to 1.05v.

@All
If anybody wants a BIOS modded with the fan speeds unlocked, let me know; I can try 10-100%. Actually, I won't lower the speed at all, but I might be able to unlock up to 100% if you want.


----------



## Krazeswift

Quote:


> Originally Posted by *RagingCain;13835157*
> Here ya go. Let me know if its working.
> 
> The other users using the EVGA Bios's seem to be locked to 0.963v.
> 
> Let me know if yours is unlocked all the way to 1.05v.
> 
> @All
> If anybody wants a BIOS modded with the fan speeds unlocked, let me know; I can try 10-100%. Actually, I won't lower the speed at all, but I might be able to unlock up to 100% if you want.


Awesome, thanks for that!

I'll let you know how it goes, but tbh it's the first time I've ever flashed a card, let alone a dual one! I'm reading up on how to do it as we speak, but if you or anyone could give me any pointers that would be great!


----------



## Pipeman597

Quote:


> Originally Posted by *RagingCain;13834095*
> Evil Clocker was saying the same thing, its really weird.
> 
> I am reading mine as 0.975v
> 
> What are you guys using for voltage monitoring/unlocking?
> 
> I am using Afterburner 2.2.0 Beta 3, with Voltage Monitoring, and Unlocked voltage.
> 
> Try these. If they don't work, you are going to have to give me a different BIOS to work with so a stock EVGA bios.


Thanks for your help. This BIOS gave the same result, 0.963 volts, with the same Afterburner version as you. I do like the way your custom BIOS makes both GPUs the same voltage. I overclock to 690/1380/1800; 700 gave black screens in the Heaven benchmark.


----------



## RagingCain

Quote:


> Originally Posted by *Pipeman597;13836599*
> Thanks for your help. This bios gave the same result 0.963 volts with same Afterburner as you. I do like the way your custom bios makes both gpu's the same voltage. I overclock to 690 1380 1800. 700 gave black screens Heaven Benchmark.


If someone has the stock 90/91 EVGA BIOSes and shares them, I will try editing them. Perhaps something special was done to EVGA's BIOS 92/93 to limit the voltage. Not sure why mine was showing 0.975, but I rebooted today and now my max is 963mV.
Quote:


> Originally Posted by *Krazeswift;13835811*
> Awsome thanks for that!
> 
> Ill let you know how it goes, but tbh its the first time ive ever flash a card let alone a dual one! Im reading up on how to do it as we speak, but if you or anyone could give me any pointers that would be great!


Check this out for basic tips; you will need to learn to make a bootable USB drive and use the nvflash program.


----------



## RagingCain

Double Post.


----------



## Krazeswift

Quote:


> Originally Posted by *RagingCain;13835157*
> Here ya go. Let me know if its working.
> 
> The other users using the EVGA Bios's seem to be locked to 0.963v.
> 
> Let me know if yours is unlocked all the way to 1.05v.


Flash successful, but mine is also locked at 963mV. As I'm not watercooled, this isn't a big deal.

Thanks again for your help.


----------



## Recipe7

Anyone with a 590 running on the latest drivers care to defend this review?

It does not seem too reliable; it shows almost a 30-50% increase in fps for some titles.

http://www.overclock3d.net/reviews/gpu_displays/zotac_gtx590_sli_5760x1080_nvidia_surround_review/1


----------



## Arizonian

Quote:


> Originally Posted by *Recipe7;13842100*
> Anyone with a 590 running on the latest drivers care to defend this review?
> 
> It does not seem too reliable; it shows almost a 30-50% increase in fps for some titles.
> 
> http://www.overclock3d.net/reviews/gpu_displays/zotac_gtx590_sli_5760x1080_nvidia_surround_review/1


Why are you planning on buying a 590?


----------



## Recipe7

Do you mean.

Why, are you planning on buying a 590?

or

Why are you planning on buying a 590?

I just want to shed some light on the reviews.


----------



## Arizonian

Quote:


> Originally Posted by *Recipe7;13842165*
> Do you mean.
> 
> Why, are you planning on buying a 590?
> 
> or
> 
> Why are you planning on buying a 590?
> 
> I just want to shed some light on the reviews.


Why, are you planning on buying a 590? Was what I meant.

I personally can't understand how the 590 (two down-clocked 580's) beat the 580's in SLI? Something is amiss, but I might be wrong.

I also thought the 6990 was slightly better than the 590, but I'm sure the review was done at stock clocks for both. I know that overclocked, the 6990 would surpass the 590 as it currently stands. I'm also optimistic that drivers will improve and allow better overclocks, as RagingCain has proven with his modded BIOS. I've seen other reviews from two months back that showed a pretty tight race as well. But when I posted the results we got flamed, and those posts, along with the culprit who came over to flame the thread, got deleted.


----------



## Recipe7

Yes, I am actually planning on a 590, but also considering a 580 3gb.

I'll be upgrading from a 5870, and based on plenty of reviews, I would basically have a 100% increase in average FPS if I went the 590 route. However, I would only have a 50-60% increase with the 580.

I was amazed by the tests in the link I posted, but I just can't believe them until someone can really prove those numbers.


----------



## Pipeman597

So far I am pleased going from an HD5970 to a GTX 590. It's not an OMG difference, but a nice speed boost. The 275.33 drivers seem very stable, and I like them better than the latest ATI drivers. I am watercooled with an XSPC Razor waterblock, and temps never go over 45C no matter what I throw at it. I don't have a huge overclock: 690/1380/1800 at 0.963 volts.

I do use Surround mode, with three 1920x1200 monitors of different brands. For some reason I can't go over 5760x1080 resolution; setting a custom resolution of 5760x1200 fails. That wasn't a problem for the HD5970, which would run at the full 5760x1200 with no tweaks. Two of the monitors show as generic PnP in Device Manager. Is this normal, or is there a workaround? 5760x1080 looks good, but it's not native to my monitors.


----------



## Manac0r

Just want to introduce myself. I was lucky enough to get a free upgrade from my 580 SLI to EVGA 590 Quad SLI, due to a stocking error.

I have read this thread from the very start, and will post pictures of my setup as proof when it arrives on Tuesday. Between Masked's comments about needing mature drivers (choking, not throttling) and RagingCain's work on editing the BIOS so both cores have the same voltage, this has been a goldmine of information. I wonder if anyone can clear up the following points:

1) If the second core by default has less voltage, is it running less effectively? Would flashing the BIOS increase performance even without OC'ing any clock speeds?

2) Have the latest drivers improved scaling? Are they maturing steadily, or is a lot of work still needed?

3) I always use D3DOverrider, as I hate tearing. Providing I get a constant 60fps I'm happy; I assume these cards will provide that experience at stock? Also, my i980X is OC'd to 4.1GHz; I assume this won't bottleneck the system? 12GB of Corsair 1600MHz on a Rampage III Extreme. I only have a single 360 radiator, but two pumps, with the cards on one loop and the CPU on the other. Obsidian 800D case with an extra case fan.

Thanks for the input, and sorry for a long first post, but as you can imagine I have been reading and researching a lot, and these are some niggles. I plan on using the first BIOS RagingCain posted on my EVGA 590s; will that bring all voltages level, or could I just use the Afterburner workaround?

TIA


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Recipe7;13842100*
> Anyone with a 590 running on the latest drivers care to defend this review?
> 
> It does not seem too reliable; it shows almost a 30-50% increase in fps for some titles.
> 
> http://www.overclock3d.net/reviews/gpu_displays/zotac_gtx590_sli_5760x1080_nvidia_surround_review/1


I don't know if I can defend it to your satisfaction, but I can say I did get a 10+fps increase in my Metro 2033 bench at Very High settings with AAA and tessellation. On the 270 drivers it averaged 48fps, just like most reviews. Now the bench averages 61fps.

In game, my frame rates are almost always over 80fps, and often into the hundreds, like the review. There are a few spots where it dips all the way to 30, but it lasts maybe a second or two before going back up into the 90s, so it seems more like poor optimization.

I hope this review can be defended, because it sure makes me feel better about only having a single 590, since I wanted to get a second one and the stock seems to have disappeared.

Either way, I am really loving my card


----------



## exlink

Quick Question:

I currently run a 3x 1440x900 (4320x900) Surround setup on my GTX 590 and am planning on adding a 4th "monitor" to the mix. The monitor is really a Vizio 26" LED LCD HDTV (1080p) that I would put above my Surround setup. The Vizio would primarily serve as my bedroom TV, and second as a 1080p monitor for my system. I know the GTX 590 can run 4 monitors (at least according to EVGA's description), and I was wondering if there is an easy way to switch between gaming on the TV (1080p) and on my Surround setup (4320x900)?

Thanks in advance!


----------



## Manac0r

I'm not sure I fully understand, but in the NVIDIA Control Panel, under multiple displays, you can uncheck the box for each display; you could uncheck your HDTV, and vice versa, to stop signals being sent to it. You can also do this by right-clicking the desktop and going to display settings. Not sure if I'm stating the obvious, but hope it helps.


----------



## Masked

Quote:


> Originally Posted by *Arizonian;13842221*
> Why, are you planning on buying a 590? Was what I meant.
> 
> I personally can't understand how the 590 (two down-clocked 580s) beat the 580s in SLI? Something is amiss, but I might be wrong.
> 
> I also thought the 6990 was slightly better than the 590 but I'm sure the review was done at stock clocks for both. I know over clocked the 6990 would surpass the 590 as it currently stands. I'm also optimistic that drivers will improve and allow better over clocks as Ragincain has proven with his modded BIOS. I've seen other reviews two months back that shown a pretty tight race as well. But when I posted the results we got flamed and those posts along with the culprit who came over to flame the thread got deleted.


Nay.

The 590 is only under-clocked currently because it's "locked" via drivers.

The first week of release, several of us proved that the card was capable of 750+ MHz with a MAX voltage of 1.05V (1.02V for safety).

With the cards OC'd to that point on the original drivers, the 590 outperformed the 6990.

That being said, the 6990 CAN OC to a higher stable 900+ MHz, so eventually the 6990 does beat out the 590; but game-wise, the 590 actually has a better response than the 6990.

I can attest to this.

Right now, I'm running 2x 6990s ~ I find the drivers to be appalling, to say the least, and scaling is absolutely terrible.

Although I'm finding some serious errors with my 590s at the moment, I have faith that custom fixes and/or NVIDIA will eventually address them.

Also, don't post your findings "in the wild"; here works just fine. Unfortunately the majority of OCN is very close-minded and childish, and it's not worth the argument with someone who can't rightly grasp the concepts at hand.


----------



## [email protected]

The 275.33 Kingpin driver uses 1.2V, so there is a solution. The recent driver has an OCP restriction that limits voltage increases.


----------



## ReignsOfPower

Quote:


> Originally Posted by *Masked;13853724*
> .....
> 
> Right now, I'm running 2x6990's ~ I find the drivers to be...appalling to say the least and scaling is absolutely terrible.
> 
> [snip]
> 
> Also, don't post your findings "in the wild", here works just fine...Unfortunately the majority of OCN is very close-minded and childish...It's not worth the argument with someone who can't rightly grasp the concepts at hand.


NVIDIA drivers are half the reason why I switched to green after a 3870X2, 4870X2 and a 5970. I got sick and tired of Catalyst. I wanted console-like ease of use when I play my games, i.e. install a set of drivers, launch the game, and it runs without a hitch. Particularly my OLD games, as I still play them quite a bit (I have a large Steam and gog.com library). The other half was the noise (I'm not going to watercool unless it's as easy as an H70-type arrangement); I upgrade far too often to have to worry about waterblocks and disconnecting loops every 6-12 months.

Re: those people who harp on about Tri-Fire costing the same as 580 SLI at 1000 bucks, blah blah. If I had a dollar for each of those posts I'd be rocking 4x 3GB 580s by now.


----------



## Draygonn

Quote:


> Originally Posted by *exlink;13851011*
> Quick Question:
> 
> I currently run a 3x 1440x900 (4320x900) surround set-up on my GTX 590 and am planning on adding a 4th "monitor" to the mix. The monitor is really a Vizio 26" LED LCD HDTV (1080p) that I would put above my surround set-up. The Vizio would obviously serve as my bedroom TV and then second as a 1080p monitor for my system. I know that the GTX 590 can run 4 monitors (at least according to EVGA's description) and was wondering if there was an easy way that I could possibly switch off if I wanted to game on the TV (1080P) or on my surround set up (4320x900)?
> 
> Thanks in advance!


I use an HDMI connection for my Sony Bravia 32EX500. It's set as the default #1 display. Normally the HDMI is not connected and I use my U2711 or AW2310. When I want to watch Blu-Rays on the EX500 (in another room) all I have to do is connect the HDMI and it automatically switches to the EX500 as the main display. I don't have to pull up NVIDIA Control Panel or Win 7 Screen Resolution to switch displays. When I am done watching movies I unplug the HDMI and it automatically switches back to the computer monitor.


----------



## Canis-X

I would like to get some opinions from the GTX 590 crowd, if you would be so kind as to give me a moment of your time. I am truly undecided.

What to do, your thoughts? Two GTX 590s or not two GTX 590s...


----------



## Shinobi Jedi

If you care about Bench scores then maybe go with the 6990's.

If you care about Performance and Image Quality, then you want to stick with the 590's..

Personally, I'm a gamer so I care more about Performance and Image Quality...

If you check Guru3D.com's reviews of the GTX 590 and similar cards, you'll see a tiny blurb where they mention that ATI/AMD employs optimizations in their drivers purposely to inflate bench scores at the cost of image quality.

Combined with the fact that ATI/AMD's drivers are not often very reliable or stable, then the choice gets even easier.

If you care about having a huge E-Peen and random numbers and bar graphs, then you are going to feel disappointment with the 590's compared to the 6990's.

But - I guarantee that if you put a rig with a 590 in it next to a rig with a 6990, same game, same monitor, same image - the image on the 590 will be significantly superior with much more stable drivers.

Also, RagingCain is working with EVGA on a new BIOS that will allow more OC headroom. And the new 275 drivers improved my Metro 2033 bench score by 10+ fps, bringing it up to 61 fps.

Me? I'm all for overclocking when it brings needed performance in a game, but I don't need to OC at all with my 590. And I only have one! Personally, I am still considering adding another, but it would require a new PSU, and I'm dreading that switch-out and having to redo all my cable management, especially since I'm not sure I need that much for the single display I'm rocking right now.

One thing I love about the card not really needing to be OC'd is that I don't spend my few precious hours of free time running benches, but rather playing games. Which is why I'm into rigs in the first place.

If you check the Guru3D reviews, you'll see the AMD bench optimizations at the expense of image quality mentioned. NVIDIA wants more exposure on this, but many sites won't cover it for fear of pissing off AMD/ATI and being banned from getting hardware to test.

But it is there and it is real. AMD/ATI makes good products; I use their 5870M in my ASUS ROG G73JH-A1. But my experience with the drivers for that card is what made me choose the 590 over the 6990. That, and the noise factor: I have an awesome 7.1 surround system in my home, and I don't need it drowned out by a huge blower fan just for an extra 2-12 fps.

If you care about Bench numbers over image, go with: 6990

If you care about image quality and stability and noise, go with: 590

I hope that helps you with what is definitely a tough decision, or as I like to think of it a "good" problem.

Cheers!


----------



## Jobotoo

I am seriously considering getting one, or two, GTX 590 cards. I want to water-cool them for sure and wanted to know your opinions regarding EVGA's Hydro Copper versions versus a different water-block. If different, which card and water-blocks would you recommend?

I play on one 30" monitor at 2560x1600. I also have a smaller monitor on the side that I use to monitor temps and voice comms while gaming.

What kind of performance increase would I see with one card, or two cards, compared to my current GTX 465?

I also work in photography (Photoshop, Lightroom, etc.) and I want to know if I would have any compatibility issues, or should things work just like they do now, albeit with better performance?


----------



## Robitussin

Should be getting my Hydro Copper today, so I will let you know my personal experience. I'm most interested in whether or not I need to flash the other BIOS. Also, if EVGA comes out with a rev 2 with more memory/better VRMs, will it be available for Step-Up/RMA if I just bought this card?


----------



## RagingCain

Quote:


> Originally Posted by *Robitussin;13879311*
> should be getting my hydro copper today so I will let you know my personal experience. I'm most interested personally in whether or not I need to flash the other bios or not? Also if EVGA comes out with a rev2 with more memory/better VRMs will it be available for stepup/RMA if I just bought this card?


There is no revision 2, nor would you be able to step up if there were one, but they may RMA you a different version if you demand it or they run out of old stock.

If you want to be 100% error free you should flash a 0.925V BIOS to GPU0 and GPU1. Some people's GPUs are running "fine", although if they did the level of game testing and BOINC usage I did, they might find a crash/error or two. So they aren't really fine, but it's good enough.
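For anyone unsure what that flash involves, here is a hedged sketch of the nvflash commands (the ROM filename is a placeholder, not a real file, and the commands are echoed as a dry run rather than executed, so nothing is flashed until you deliberately remove the guard):

```shell
#!/bin/sh
# Hypothetical nvflash sequence for a dual-GPU GTX 590.
# "gtx590_0925v.rom" is a placeholder for the modded 0.925V BIOS file.
# RUN=echo makes this a dry run: commands are printed, not executed.
RUN=echo

# Back up the stock BIOS of each GPU first.
$RUN nvflash --index=0 --save gpu0_backup.rom
$RUN nvflash --index=1 --save gpu1_backup.rom

# Flash the 0.925V BIOS to GPU0 and GPU1.
$RUN nvflash --index=0 gtx590_0925v.rom
$RUN nvflash --index=1 gtx590_0925v.rom
```

Only clear the `RUN=echo` guard once you are sure the ROM and GPU index are right; flashing the wrong BIOS can brick a card.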
Quote:


> Originally Posted by *Jobotoo;13872568*
> I am highly considering getting one, or two, GTX 590 cards. I want to water-cool them for sure and wanted to know what your opinions were regarding with EVGA's Hydro Coper versions, or with a different water-block. If different, which card and water-blocks would you recommend?
> 
> I play on one 30" monitor with a resolution of 2560x1600. I also have a smaller monitor on the side that I use to monitor temps and voice comma while gaming.
> 
> What kind if performance increase would I see with one card, or two cards, compared to my current GTX 465?
> 
> I also work in photography (Photoshop, Lightroom, etc.) and I want to know if I would have any compatibility issues, or should things work just like they do know, albeit with better performance?


Very high performance difference over a single 465: on the order of a 100-150% increase in FPS.
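To put that range in concrete numbers (the 40 fps baseline below is purely illustrative, not a measured benchmark):

```python
# Illustrative math for a 100-150% FPS increase over a baseline.
def scaled_fps(baseline_fps: float, pct_increase: float) -> float:
    """Frame rate after a given percentage increase."""
    return baseline_fps * (1 + pct_increase / 100)

baseline = 40.0
low = scaled_fps(baseline, 100)   # +100% doubles the frame rate
high = scaled_fps(baseline, 150)  # +150% is 2.5x the frame rate
print(f"{baseline:.0f} fps -> {low:.0f}-{high:.0f} fps")  # 40 fps -> 80-100 fps
```

In other words, a scene that crawled at 40 fps on the 465 would land somewhere around 80-100 fps.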

The HydroCopper block made by Swiftech is very restrictive in flow rate, but it really does keep the GPUs cooler. As far as I know, Swiftech is a little on the low end quality-wise for blocks, but it gets the job done; I just don't know if they sell them separately.

EK blocks that use nickel are currently in a halted-production state due to the plated nickel reacting with distilled water, kill coils, and copper sulfate (biocides). EK is offering a full replacement on their products from now until the end of June, and I think I have to sign up for both my EK Supreme and EVGA E770 water block.

I have had good results with DangerDen; they made an excellent GTX 580 block. Their 590/6990 block, I am told, looks like male genitalia, so that would throw me off a little (the 6990 block especially).

For the aquacomputer brand, there is a member here with that block; you should go through the thread, look for his post, and PM him, as I believe he is the only one here with the AquaFX 590 block.

That leaves two more brands: HeatKiller and Koolance. The Koolance one looks great, but I have no experience with their brand. The HeatKiller GPU block is extremely high quality and looks like serious business, and one of our members has it.

No issues with Photoshop CS5.1 Extended for me, but that's with the modded BIOS. I don't use Lightroom, but you can Fold/BOINC with no problem, so CUDA-accelerated applications should be just fine.


----------



## Masked

Quote:


> Originally Posted by *RagingCain;13879681*
> No issues with Photoshop CS5.1 Extended by me but thats with the modded BIOS, and I don't use Lightroom, but you can Fold / BOINC with no problem, so the CUDA enriched applications should be just fine.


Lightroom runs well, but there are some issues with occasional "glitching".

Alatar, could we please get Cain's modified BIOS on the front page, considering it's "proven" at this point that it works? Might as well include a guide on how to flash GPUs as well.

Unfortunately, the mentality of most OCN users seems to be 'read the first page and nothing else, TL;DR' ~ so we might as well make it easier for them.


----------



## Canis-X

Quote:


> Originally Posted by *Shinobi Jedi;13872166*
> If you care about Bench scores then maybe go with the 6990's.
> 
> If you care about Performance and Image Quality, then you want to stick with the 590's..
> 
> Personally, I'm a gamer so I care more about Performance and Image Quality...
> 
> If you check Guru3D.com's reviews of the GTX 590 and similar cards, you'll see a tiny blurb where they mention that ATI/AMD employs optimizations in their drivers purposely to inflate bench scores at the cost of image quality.
> 
> Combined with the fact that ATI/AMD's drivers are not often very reliable or stable, then the choice gets even easier.
> 
> If you care about having a huge E-Peen and random numbers and bar graphs, then you are going to feel disappointment with the 590's compared to the 6990's.
> 
> But - I guarantee that if you put a rig with a 590 in it next to a rig with a 6990, same game, same monitor, same image - the image on the 590 will be significantly superior with much more stable drivers.
> 
> Also RagingCain is working with EVGA on new bios that will allow more OC headroom. And the new 275 drivers improved my Metro 2033 bench scores by 10fps+ upping it into 61fps.
> 
> Me? I'm all for overclocking when it brings needed performance in a game. But I don't need to OC at all with my 590. And I only have one! Personally, I am still considering adding another. But it also would require a new PSU and I'm dreading that switchout and having to redo all my cable management. Especially, if I'm not sure if I need that much for a single display like I'm rocking right now.
> 
> One thing I love about the card not really needing to be OC'd or being limited in it, is that I don't spend my few precious hours of free time running benches - but rather playing games. Which is why I'm into rigs in the first place.
> 
> If you check the Guru3D reviews, you'll see the AMD bench optimizations at the expense of image quality mentioned. Nvidia wants more exposure on this. Many sites won't, for fear of pissing off AMD/ATI and being banned from getting Hardware to test.
> 
> But it is there and it is real. AMD/ATI makes good products. I use their 5870m in my Asus ROG G73JH-A1. But my experience with their drivers from that card is what made me choose the 590 over the 6990. That and the noise factor. I have an awesome 7.1 surround system in my home. I don't need to waste it by having it be drowned out by a huge fan blower just for an extra 2-12fps.
> 
> If you care about Bench numbers over image, go with: 6990
> 
> If you care about image quality and stability and noise, go with: 590
> 
> I hope that helps you with what is definitely a tough decision, or as I like to think of it a "good" problem.
> 
> Cheers!


Thanks for the input!! + Rep for the time and effort!!


----------



## 1spike

Was wondering if I would get a good performance increase going from 2x MSI Cyclone 460 1GBs to an EVGA GTX 590 Classified card. Thanks!


----------



## Manac0r

Quote:


> Originally Posted by *Jobotoo;13872568*
> I am highly considering getting one, or two, GTX 590 cards. I want to water-cool them for sure and wanted to know what your opinions were regarding with EVGA's Hydro Coper versions, or with a different water-block. If different, which card and water-blocks would you recommend?
> 
> I play on one 30" monitor with a resolution of 2560x1600. I also have a smaller monitor on the side that I use to monitor temps and voice comma while gaming.
> 
> What kind if performance increase would I see with one card, or two cards, compared to my current GTX 465?
> 
> I also work in photography (Photoshop, Lightroom, etc.) and I want to know if I would have any compatibility issues, or should things work just like they do know, albeit with better performance?


Well, I went from 580 SLI to a 590 Quad. I haven't played with voltage, but with a slight overclock these two cards handle anything you throw at them. I'm a gooning gamer; benchmarks are not my thing. These cards run so much quieter and cooler than my 580s, and the performance is great. I have a dual monitor setup like you, and I use D3DOverrider to maintain 60fps. At high res on a single monitor, quad SLI is amazing: Battlefield 2, STALKER CoP, Witcher 2 (slightly stuttery, but mature drivers will iron this out). Personally, two of these cards is the way to go. While waiting for them to arrive I read all the posts etc. and was slightly worried, but Nvidia have pulled out a card that performs yet is quiet and cool. For 60fps they are impressive. Double up, as my pappy used to say. Quad SLI for sure.


----------



## Pipeman597

Quote:


> Originally Posted by *RagingCain;13879681*
> There is no revision 2 nor will you be able to step-up if they had one, but they may RMA you different version if you demand it, or they run out of old stock.
> 
> If you want to be 100% error free you should flash a 925v bios to GPU0 and GPU1. Some peoples GPUs are running "fine". Although if they did the level of game testing and BOINC usage I did, they may find a crash/error or two.) So they aren't really fine but its good enough.
> 
> Very high performance difference over a single 465. In the magnitude of 100~150% increase in FPS.
> 
> The HydroCopper block made by Swiftech is very restrictive in flow rate, but it really does keep the GPUs cooler. As far as I know, Swiftech is a little on the low end of quality block wise, but it gets the job done. I just don't know if they sell them separately. EK Blocks that use Nickel are currently in halted-production state due to a manufacturing process of plated nickel reacting with Distilled Water, Kill Coil, and CopperSulfate (biocides.) EK is offering the chance to get a full replacement on their products from now till the end of June, and I think I have to sign up for both my EK Supreme and EVGA E770 Water Block. I have had good results with DangerDen, they made an excellent GTX 580 block. The 590/6990 block I am told looks like a male genitalia, so that would throw me off a little (6990 block especially). The aquacomputer brand, there is a member here with that block, you should go through the thread and look for his post and PM him as I believe he is the only one here with the AquaFX 590 block. That would leave two more brands: Heatkiller and Koolance. The Koolance one looks great, but I have no experience with their brand. So that leaves HeatKiller which its GPU block is extremely high quality, and looks like serious business, and one of our members has it.
> 
> No issues with Photoshop CS5.1 Extended by me but thats with the modded BIOS, and I don't use Lightroom, but you can Fold / BOINC with no problem, so the CUDA enriched applications should be just fine.


The block I use is the XSPC Razor GTX 590. Only $100, looks good, thin profile, and easy to install with pre-cut pads. Not very restrictive, yet temps are only 2-3°C higher than EK's. I am very pleased with this reasonably priced block.


----------



## 2010rig

What's up peeps, I haven't checked this thread in a while.

Have there been any recent developments with the 590, positive/negative?

I see the Egg has a few in stock.
http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=gtx+590&x=0&y=0

Quote:


> Originally Posted by *Shinobi Jedi;13872166*
> If you care about Bench scores then maybe go with the 6990's.
> 
> If you care about Performance and Image Quality, then you want to stick with the 590's..
> 
> Personally, I'm a gamer so I care more about Performance and Image Quality...
> 
> If you check Guru3D.com's reviews of the GTX 590 and similar cards, you'll see a tiny blurb where they mention that ATI/AMD employs optimizations in their drivers purposely to inflate bench scores at the cost of image quality.
> 
> Combined with the fact that ATI/AMD's drivers are not often very reliable or stable, then the choice gets even easier.


I mentioned this before, and an OCN Mod basically made fun of me and deleted my post.

I always wondered if AMD fixed that.


----------



## Masked

Quote:


> Originally Posted by *2010rig;13890062*
> what's up peeps, I haven't checked this thread in a while.
> 
> has there been any recent developments with the 590, positive / negative?
> 
> I see the Egg has a few in stock.
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=gtx+590&x=0&y=0
> 
> I mentioned this before, and an OCN Mod basically made fun of me and deleted my post.
> 
> I always wondered if AMD fixed that.


User-end improvements, yes...NVIDIA-end improvements aren't something I'd hold my breath for.

The 6990s do overcompensate during benches; they always have...And IMO there is a sacrifice in quality 'for the score' as well.


----------



## Robitussin

Hey guys after a long list of annoyances I have the card installed.









So add me to the list:yessir:


----------



## RagingCain

Friendly OCN Competition: 590 SLI vs 6990 CFX
http://www.overclock.net/graphics-cards-general/1043957-friendly-ocn-competition-590-sli-vs.html#post13900881

Looking for 590 boys to help out, water and air cooled are welcomed.


----------



## ReignsOfPower

I'd help out but I'm not on water and without your volt mod


----------



## RagingCain

Quote:


> Originally Posted by *ReignsOfPower;13901287*
> I'd help out but I'm not on water and without your volt mod


Mentioned it in the 6990 thread; I might do a lightweight AIR-only group (includes CPU), interested? You can always OC (with caution) on the 267.85 drivers.


----------



## Alatar

Updated.

And subbed to the competition thread, can't wait to see how that goes


----------



## ReignsOfPower

Ok I'll play Ragin







I'll use your BIOS fix. I hope it doesn't cause fireworks








Is your BIOS a modification of EVGA's unlocked fan BIOS?


----------



## evilmustang66


View attachment 215738
evilmustang66 my pics of my system


----------



## RagingCain

Quote:


> Originally Posted by *ReignsOfPower;13904379*
> Ok I'll play Ragin
> 
> 
> 
> 
> 
> 
> 
> I'll use your BIOS fix. I hope it doesn't cause fireworks
> 
> 
> 
> 
> 
> 
> 
> 
> Is your BIOS a modification of EVGA's unlocked fan BIOS?


I can make one for you with modded fan % up to 100%

Post powered by DROID X2


----------



## Canis-X

RagingCain:

Are these issues present in the ASUS-flavored version of the 590? If so, could you possibly hook me up with a corrected BIOS for the voltage and fan issues? I've never done a BIOS mod, so I'm a little freaked out about doing it....if you don't mind helping me out, I would greatly appreciate it.


----------



## ReignsOfPower

Quote:


> Originally Posted by *RagingCain;13907269*
> I can make one for you with modded fan % up to 100%
> 
> Post powered by DROID X2


Sure thing! Do you need me to make a backup of the BIOS currently installed? Here is the link from EVGA; it allows between 30-100% fan. Your mod brings them up to 0.925V a pop, right? ftp://ftp.evga.com/BIOS/1598_fanunlock_1.0.zip


----------



## XXXfire

Quote:


> Originally Posted by *Robitussin;13898362*
> Hey guys after a long list of annoyances I have the card installed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So add me to the list:yessir:


That looks way tight dude. Those Laing D5 pumps are the shiz.

Are a lot of you fellas running Surround? I'm curious. Is there any word on NVIDIA expanding their software to include variations of 3x1?


----------



## Robitussin

Thanks for the vote of confidence. I can say the arrangement will most likely change; I'm just waiting on an RMA from PPCs for an incorrect order of fittings (got the wrong OD). The card was a pain to bleed; I moved it every which way but I still have bubbles. I've been working on an OC now, and my weekend is coming up, so lots of time to get it all right.

Have the voltage variations been fixed? I pulled up GPU-Z and didn't notice a difference between the two; will post screens when I get home. I've been working with EVGA Precision and getting decent numbers with the little time I have been able to put into it so far, and will try to get preliminary posts up tonight as well if everything goes OK. I left my PC on, backing up and creating a new system disc, before I left for work.
















*EDIT
Did some quick settings with EVGA Precision, gonna post some screenshots and such:
performance test 2
extreme test 2 *looks like it didn't get the OC in that time* *restarted and tried again, same thing; guess it's too much of something. Don't feel like messing with it tonight, so I'm done.
*Test 1*















*Test 2*


----------



## [email protected]

Which driver do you use?


----------



## RagingCain

Quote:


> Originally Posted by *XXXfire;13912033*
> That looks way tight dude. Those Laing D5 pumps are the shiz.
> 
> Are a lot of you fellas running Surround? I'm curious. Is there any word on Nvidia expanding their software to include variations of 3x1 ?


A couple of us are, do you mean, running portrait vs. landscape mode? Or more monitors such as 4/5/6x1 ?


----------



## Shinobi Jedi

RagingCain,

Sorry, can you clarify which vBIOS to grab if I'm running EVGA's 92/93 unlocked-fan BIOS?

Also:

I'm still debating between getting a 120hz display or an IPS 2560x1600 display.

Has anyone tried 3D Vision with a single 590? Does it bring it to its knees? I hear it can be really cool with games that are optimized for it.

If it runs well with a 590, that might be the deciding factor..

I'm torn. I want a 120Hz display for FPS games and BF, but they all seem to come in only 24". The 27" made by Acer seems to have disappeared, and I can't find much info about any new 120Hz panels coming down the pipe except Samsung's new one. But it's not compatible with NVIDIA 3D Vision.

And then 2560x1600 IPS panels seem to only come in 30", but have awesome eye candy and resolution..

I also figured I could do 3D Surround with 3 24" 120hz panels, but I don't know if it'll be worth it over my 60" Plasma.

Anybody running one or the other that has any input, it's greatly appreciated.

If I go with 1 or 3 24" displays, I was thinking of the Asus 23" model. I know the PQ won't be great, but it seems to be the best in that class. Anyone have any experience with this?

Thanks again.

Cheers


----------



## Anglis

Quote:


> Originally Posted by *Shinobi Jedi;13930547*
> RagingCain,
> 
> Sorry, can you clarify which vBios to grab if I'm running EVGA's 92/93 unlocked fan bios?
> 
> Also:
> 
> I'm still debating between getting a 120hz display or an IPS 2560x1600 display.
> 
> Has anyone tried the 3D Vision with a single 590? Does it bring it to it's knees? I hear it can be really cool with games that are optimized for it.
> 
> If it runs well with a 590, that might be the deciding factor..
> 
> I'm torn. I want a 120hz for FPS and BF. But they all seem to come in only 24". The 27" made by Acer seems to have disappeared and I can't find much info about any new 120hz panels coming down the pipe except Samsung's new one. But it's not compatible with NVidia 3D Vision..
> 
> And then 2560x1600 IPS panels seem to only come in 30", but have awesome eye candy and resolution..
> 
> I also figured I could do 3D Surround with 3 24" 120hz panels, but I don't know if it'll be worth it over my 60" Plasma.
> 
> Anybody running one or the other that has any input, it's greatly appreciated.
> 
> If I go with 1 or 3 24" displays, I was thinking of the Asus 23" model. I know the PQ won't be great, but it seems to be the best in that class. Anyone have any experience with this?
> 
> Thanks again.
> 
> Cheers


You'll get a bunch of preference answers. I'm in the same boat as you: 3x ASUS VG236H or a U3011. There are positives for each. The U3011 will run every game at 2560 resolution without any problems and the PQ is great, but for gaming I hear the AG (anti-glare) coating can be really off-putting, as it gives a grainy look.

The ASUS will still look good if you mount them to an Ergotech and angle them right. 3D gaming does look amazing, but 3D Surround still has some problems; you would probably end up gaming in 3D on one monitor for a lot of the games. Right now I'm leaning toward 3 ASUS monitors, but if I change my mind, it will be a U2711 and one ASUS VG236H.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Anglis;13931448*
> You'll get a bunch of preference answers. I'm in the same boat as you. 3 x Asus VG236H or a u3011. You get positives for each. The u3011 will run every game at 2560 resolution without any problems. The PQ is great, but for gaming I hear the AG (anti glare coating) can be really off putting as it gives a grainy look.
> 
> The Asus will still look good if you mount them to a ergotech and angle them right. 3D gaming does look amazing, but 3D surround still has some problems. You would probably end up gaming 3D on one monitor for a lot of the games. Right now i'm leaning toward 3 asus monitors, but if I change my mind, it will be a u2711 and one Asus vg236h.


I'm with you. I think I'm gonna do the same


----------



## taveston

Quote:


> Originally Posted by *Shinobi Jedi;13931657*
> I'm with you. I think I'm gonna do the same


I returned the U3011 as, for me, the anti glare coating was truly horrific and also gave me headaches - but this is a highly subjective view as others do not even notice it.

I bought the Apple Cinema Display 27" instead and am very pleased with it. Note that Quad SLI is not supported through the Mini DisplayPort, though. I bought an Atlona DP400 and ran it through DVI-D instead; Quad SLI worked, but I got green/pink distortions.

I have now returned the DP400 and will sell one 590 and live with dual SLI instead. Two cards were probably overkill anyway and were pretty loud! One card runs everything at max...


----------



## Masked

I run 3x Samsung 2253BWs at work and at home.

I realize I'm kind of "stuck" in the past at 1680x1050, but at 8000x1 it was hard to argue with when I originally got the trio.

When I run the 2x 6990s ~ I always have issues with the third monitor...No matter what it is, the drivers just give me absolute hell.

On the 590s ~ running 3 monitors is surprisingly easy; I run them as individuals...I've had a very good experience.

Running 1 card as well, I have NO troubles whatsoever doing 3 monitors...

We also have a Sharp Aquos in the office that the 590s have absolutely no problems interacting with...It actually looks pretty damn sexy.


----------



## JVLajingoku

Mister Cain, I'm from Holland and I would like to enter your OC contest, but could you please help me out with my BIOS? I'm going to flash my first card first, then I'm going to do the other.


----------



## Jobotoo

I'm on a 30" monitor @ 2560x1600. What benefit would I get going with two cards versus one card?

(Besides back in the 3dfx Voodoo days, I have not done SLI since.)


----------



## ReignsOfPower

Hi Ragin,

Here are my BIOSes, attached. Both are the fan-unlocked BIOSes from EVGA.


----------



## Canis-X

So, FedEx FINALLY got my cards to me yesterday. The jerk said they tried to get someone at the house but no one was home.......right! I have twin 18 m/o daughters that are NOT quiet......EVER...LOL. Anyway, I installed only one, as my mobo does not support SLI, and played a game for a few hours. I like it a lot!! Thanks for convincing me to keep them!!







I'll post a few pics of my cards tonight as I am going to install the second one and test it out....not taking pics of it installed in my case though, I am getting ready to move my rig into a new case and it looks pretty bad ATM, well to me it does. ASUS is shipping out my Crosshair V Formula today so hopefully this weekend the wife/kids will let me have some time to rebuild then I will take the nice shots.









May I join the club??


----------



## capchaos

Quote:


> Originally Posted by *Canis-X;13950246*
> So, FedEx FINALLY got my cards to me yesterday. Jerk said that they tried to get someone at the house but no-one was home.......right!, I have twin 18 m/o daughters that are NOT quiet......EVER...LOL Anyway, installed only one as my mobo does not support SLi and played a game for a few hours. I like it a lot!! Thanks for convincing me to keep them!!
> 
> 
> 
> 
> 
> 
> 
> I'll post a few pics of my cards tonight as I am going to install the second one and test it out....not taking pics of it installed in my case though, I am getting ready to move my rig into a new case and it looks pretty bad ATM, well to me it does. ASUS is shipping out my Crosshair V Formula today so hopefully this weekend the wife/kids will let me have some time to rebuild then I will take the nice shots.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> May I join the club??


You should try the SLI patch. It worked on my AMD motherboard when I had my dual 590's.


----------



## XXXfire

Quote:



Originally Posted by *RagingCain*


A couple of us are, do you mean, running portrait vs. landscape mode? Or more monitors such as 4/5/6x1 ?


Sorry, I was terribly unclear. I didn't mean variations of 3x1, but variations of Surround extending a larger array of monitors. 4x1, 5x1, et al.

I've heard very positive reports about Nvidia's implementation, especially in relation to AMD's notoriously clunky first-generation alternative. I'm wondering how they plan to expand their support for it. Fast as these GPUs are, there's a relatively narrow scope of applications doing justice to their performance capability. Between 3D and massive resolutions, I'm beginning to think the latter is gaining momentum toward popular prominence. I'm wondering when they'll implement the feature for single-GPU cards, myself.


----------



## Canis-X

Quote:



Originally Posted by *capchaos*


You should try the SLI patch. It worked on my AMD motherboard when I had my dual 590's.


I would definitely if the mobo wasn't so close to being shipped out (shipping today). I'm not in any big hurry. I have lots of work to do on this new build....and stuff to purchase to get it the way I want it. It is going to take me some time as I figure out what needs to be purchased first.


----------



## RagingCain

Quote:



Originally Posted by *XXXfire*


Sorry, I was terribly unclear. I didn't mean variations of 3x1, but variations of Surround extending a larger array of monitors. 4x1, 5x1, et al.

I've heard very positive reports about Nvidia's implementation, especially in relationship to AMD's notoriously clunky 1st generation alternative. I'm wondering how they plan to expand their support for it. Fast as these GPUs are, there's a relatively narrow scope of application doing justice to their performance capability. Between 3D & massive resolutions, I'm beginning to think the latter is gaining momentum toward popular prominence. I'm wondering when they'll implement the feature for single GPU cards, myself.


I personally hate the support 3D is getting, primarily because I don't use it. I could if I had the ridiculous glasses, but I am already half blind with astigmatism as it is, so glasses on glasses would just... suck, comfort-wise. I am actually surprised it's getting so much attention while new games have glaring bugs and issues through various sets of drivers. Their focus is 100% 3D right now, which is not fair to the rest of us. I wouldn't even mind a 50/50 split, but let's face it, we are only getting the odd bug fixed that is sent in by the community, so they aren't even actively testing their own drivers. That is a trend I faced with my 5870 cards, and it really pissed me off; at least nVidia has been delivering on some of the biggest bugs out there.

The Surround feature, as I have mentioned before, has to be the Achilles heel of nVidia's drivers. It just feels tacked on without much thought and without much testing. For users like me who are constantly switching back and forth it has gotten much, much, much better with the latest few drivers, but before that it was ridiculously bad. I would tell it to go to Surround from SLI / 3D performance mode, and it would just look at me like a puppy in training, totally clueless.

I don't believe there is 4x1 or 5x1 support in Surround, but I do believe that with 4 GPUs you can run a monitor on each GPU. That's not terrible, but it's not what you were after, and it's a bit ridiculous, as a single 5830/5850 could run 3 monitors by itself without much of an issue. Although I remember it not getting along well with CrossfireX and two monitors at times.

Although AMD have really been pushing CrossfireX and improved the scaling immensely, I keep seeing the technology on nVidia being kept... the same. It's not getting better, it's not getting worse, it just stagnates. New GPU, old SLI. I am of course not expecting miracles or magic performance, but come on, let's increase the bandwidth, let's increase the scaling factor or something.

I am hoping we see an impressively changed architecture in Maxwell. I don't doubt Kepler will be good, but the performance increases in this last generation were just... not as high as I believe everyone was expecting.

I also don't see the need for such high performance either. Don't get me wrong, we have plenty of overkill setups possible, but I don't think that should stop progress. I think eventually we are going to see co-processors on board GPUs, and we are going to see a new SLI architecture, maybe something as simple as two bridges between cards, or something new altogether, especially as PCI-E 3.0 unleashes more power consumption and (I know, not necessary) more bandwidth. When we start seeing that, then I bet we will see crazy multi-monitor support from nVidia.


----------



## armartins

Cain/XXXFire... that is one question that doesn't let me sleep... when the hell will Nvidia implement support for at least 3 monitors on a single card? I've played a lot of BFBC2, and I've used Eyefinity since March 2010 (Catalyst 10.3) with my 5970... honestly I've had fewer issues than I expected. I've played Just Cause 2, Dirt 2, DAO, Anno (forgot the year, lol, 1404?), GTA 4 was installed by my nephew too... and Crossfire has been rocking ever since. But, a HUGE but: if you break my play time down by percentage, it will be 75% WoW, 20% BFBC2 and 5% all those others. And man, at WoW AMD sucks really hard... Using DX11, Crossfire won't work (now I've made some progress using RadeonPro to try forcing it, despite being windowed). If you use DX11 the Eyefinity resolution disappears, and the only way to play the game in DX11 is windowed fullscreen; then it will run at the Eyefinity resolution, but Crossfire is disabled. And if I play using the DX9 API (Crossfire works, but scaling is poor) I'll be almost always VRAM capped; 1GB per GPU just isn't enough, and DX11 manages VRAM much better. So the options are:
1) Use DX11 and play with ultra settings at an average 30fps in cities/high-detail areas (WoW is playable at 30fps).
2) Use DX9 and play at 120fps average, but when you enter a city or a high-detail area you can go as low as 1-5fps and even crash.

SLI works fine with WoW in DX11 and the overall performance, single or SLI, is MUCH better... but I'm definitely not buying another video card with a framebuffer smaller than 1.5GB per GPU (the 570, for instance). I find 580 SLI overkill, since I can't stand WoW with AA (yes, WoW is VRAM limited at 1GB per GPU without AA at 5304x1050, 3x Dell [email protected]), overkill and overpriced. I'm sure that ONE 580, watercooled and well overclocked, could do this job better than any other GPU, but Nvidia requires SLI... lame.


----------



## CSHawkeye

Is this decent??

http://3dmark.com/3dm11/1412593;jsessionid=dcr7nf62khm5?show_ads=true&page=%2F3dm11%2F1412593%3Fkey%3DmxKJfxmXjBC2uTCE3QyNa5epHjW3d5


----------



## Canis-X

of ownership!


----------



## Shinobi Jedi

Question for single GTX 590 and 3D Vision Surround owners:

If you have all 3 DVI-D ports on your card running to 3 120hz panels, how are you getting Blu-Ray HD Surround sound? My understanding is those codecs need an HDMI connection, as my sound card needs an HDMI in from the video card and then runs HDMI out to my receiver to run 7.1 Blu-Ray HD sound.

However, these 120hz displays only do 120hz with a DVI-D connection, so using HDMI even with a DVI-D adapter caps the resolution @ 60hz.

My only options are the digital 2.1 SPDIF running to my receiver with a simulated surround codec, or headphones with a 7.1 Dolby adapter.

Are there any other sound cards that can get Dolby TrueHD and DTS-HD from the mobo without having to be connected to the 590?

This little issue is really bumming me out on 3D Surround. How can I play FPS games well if I don't have real positional sound via my home theater to play off of?

Any ideas for a solution?

In terms of a 120hz 3D monitor, I initially went and bought the Asus VG236H with the bundled Nvidia 3D Vision kit.

Now, I'm a big fan of Asus gaming notebooks. My G73JH-A1 still rocks my world (AMD 5870m in that baby, so no brand bias here)..

However, their monitor left something to be desired. As soon as I fired it up - Huge dead pixel in bottom left center of screen. I took it back, and they had already sold the other one they had. They already had another return on the shelf from the same thing. The PC sales guys said that Asus rocks when it comes to notebooks, but when it comes to PC displays, they're often a disappointment riddled with pixel issues.

So I exchanged it for the 24" BenQ xl2410t. I didn't find many reviews on it, but knew it was LED over LCD and I used to sell BenQ products when working for a high end Home Theater dealer, so I know they're a pretty good brand. Though I read a few QC issues with this monitor with light bleeding, etc. Personally I can deal with light bleed more than a dead pixel sticking out against a moving picture. When that happens my eyes get OCD and just stare at the pixel.. I hate dead or stuck pixels..

And the BenQ? Not only is it a pixel-perfect screen, but there is absolutely no light bleed whatsoever on the display. It also sports some of the best blacks I've ever seen for a TN panel. I haven't gotten the chance to really put it through its paces yet, but so far I'm much happier with this one than the Asus.

Cons to the monitor? Out of the box, the calibration is so bad you almost think the display is a POS. But you can get ICC profiles for the display online through Google and once applied, it is a fantastic picture.

The other thing I'm kinda on the fence about, that many others love, is the matte screen. I usually prefer Glossy as I sit in a darkened room and would rather have colors pop, than be slightly dulled out. Plus I don't like the screen door effect some bad matte jobs can have. However, this matte has to be the best one I've ever used as colors do not seem diminished, and only the most faint hints of the screen door effect can be seen against only the blackest of backgrounds when looking at it from an off angle.

It is, however, still matte. So I'm gonna game on it for a while and see how I like it. But again, I'd rather have a good matte job on a pixel-perfect screen with great blacks for a TN, than a glossy that may or may not have a dead pixel on purchase.

Depending on how I can work out the HD sound/DVI-D 120hz issue, and how much better FPS games play at 24" @ 120hz with 2.0 sound vs. a 60" plasma @ 60hz with 7.1 sound, I'll decide whether I keep this panel or all 3.

I am liking the idea more of just maybe having one 120hz 3D panel and one IPS 2560x1600 panel.

Thanks to all those that get through this book I just wrote and can help me find a solution!

Cheers


----------



## SVKOverclocker

Quote:



Originally Posted by *sarojz*


I've got my Evga gtx590 installed and left it stock for now. I'm very happy with it so far and I've had no problems. Here's my proof, it's a screen shot with various apps running to show an 590 is installed.



Hi sarojz I want to ask you how many fans you have on your computer ?


----------



## rush2049

Hey everyone, been tinkering with my card a bit.... trying to test individual overclock settings to push the max stable I can get on stock volts....

So far I have stable 612(stock) with a stable 1850 memory, but I am still pushing the memory.....

Will post screenshots when I combine them later.

edit:

not stable: 612(stock) and 2004 mem
stable: 612 and 1950 mem

double edit:

I slaved my old GTX 275 for PhysX.... in Batman: Arkham Asylum I am getting 20-30 more fps... and no dips in fps ever, a rock solid 80.....


----------



## taveston

That's around what I get with the same GPU, CPU and overclock

Quote:



Originally Posted by *CSHawkeye*


Is this decent??

http://3dmark.com/3dm11/1412593;jses...3QyNa5epHjW3d5


----------



## rush2049

quick update.....

Reference specs for comparison:

| Model | Core (MHz) | Shader (MHz) | Memory (MHz) | Pixel fill (GP/s) | Texture fill (GT/s) | Bandwidth (GB/s) | Memory | Bus | GFLOPS | TDP (W) | Price |
|---|---|---|---|---|---|---|---|---|---|---|---|
| GeForce GTX 580 | 772 | 1544 | 1002 | 37.06 | 49.41 | 192.4 | GDDR5 | 384-bit | 1581.1 | 244 | $499 / £399 |
| GeForce GTX 590 | 607 | 1215 | 855 | 58.75 | 77.73 | 327.7 | GDDR5 | 2x384-bit | 2488.3 | 365 | $699 / £570 |

My test log so far ("(x2)" = effective memory clock):

| Test | Core (MHz) | Shader (MHz) | Memory (MHz, x2) | Result |
|---|---|---|---|---|
| Half way | 689.5 | 1379.5 | 1857 | Crashed |
| Test 2 | 660 | - | 1850 | Crashed |
| Test 3 | 612 | 1225 | 1850 | Pass |
| Test 4 | 612 | 1225 | 2004 | Fail w/ recovery (mem corruption) |
| Test 5 | 612 | 1225 | 1950 | Pass |
| Test 6 | 650 | 1300 | 1950 | Fail w/ recovery (mem corruption) |
| Test 7 | 650 | 1300 | 1850 | Fail w/ recovery (mem corruption; stable for a while) |

Reviewers' claimed overclocks:

| Config | Core (MHz) | Shader (MHz) | Memory (MHz, x2) |
|---|---|---|---|
| With volt @ 1050 mV | 809 | 1618 | 2120 |
| Stock volts | 678 | - | 1850 (x4: 3700) |
| Stock volts (1 core) | 687 | - | 1957 |
| Stock volts (2 core) | 657 | - | 1857 |


----------



## rush2049

update 2:

Reference specs for comparison:

| Model | Core (MHz) | Shader (MHz) | Memory (MHz) | Pixel fill (GP/s) | Texture fill (GT/s) | Bandwidth (GB/s) | Memory | Bus | GFLOPS | TDP (W) | Price |
|---|---|---|---|---|---|---|---|---|---|---|---|
| GeForce GTX 580 | 772 | 1544 | 1002 | 37.06 | 49.41 | 192.4 | GDDR5 | 384-bit | 1581.1 | 244 | $499 / £399 |
| GeForce GTX 590 | 607 | 1215 | 855 | 58.75 | 77.73 | 327.7 | GDDR5 | 2x384-bit | 2488.3 | 365 | $699 / £570 |

All testing done with Heaven 2.5, DX11, shaders: high, tessellation: normal, aniso: 4x, AA: off, windowed, 1680x1050.

| Test | Core (MHz) | Shader (MHz) | Memory (MHz, x2) | Result |
|---|---|---|---|---|
| Half way | 689.5 | 1379.5 | 1857 | Crashed |
| Test 2 | 660 | - | 1850 | Crashed |
| Test 3 | 612 | 1225 | 1850 | Pass |
| Test 4 | 612 | 1225 | 2004 | Fail w/ recovery (mem corruption) |
| Test 5 | 612 | 1225 | 1950 | Pass |
| Test 6 | 650 | 1300 | 1950 | Fail w/ recovery (mem corruption) |
| Test 7 | 650 | 1300 | 1850 | Fail w/ recovery (mem corruption; stable for a while) |
| Test 8 | 650 | 1300 | 1830 | Fail w/ recovery (mem corruption; stable for a while+) |
| Test 9 | 650 | 1300 | 1710 | Pass (very stable, long test) |
| Test 10 | 670 | 1340 | 1710 | Pass |
| Test 11 | 690 | 1340 | 1710 | Fail w/ recovery (black screen) |
| Test 12 | 680 | 1340 | 1710 | Fail w/ recovery (black screen) |
| Test 13 | 670 | 1340 | 1820 | Fail w/ recovery (mem corruption near end of bench) |
| Test 14 | 670 | 1340 | 1800 | Fail w/ recovery (black screen) |
| Test 15 | 660 | 1340 | 1800 | STABLE |

Reviewers' claimed overclocks:

| Config | Core (MHz) | Shader (MHz) | Memory (MHz, x2) |
|---|---|---|---|
| With volt @ 1050 mV | 809 | 1618 | 2120 |
| Stock volts | 678 | - | 1850 (x4: 3700) |
| Stock volts (1 core) | 687 | - | 1957 |
| Stock volts (2 core) | 657 | - | 1857 |
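The narrowing procedure in the log above (start at the halfway point, then close in on the highest passing memory clock) is essentially a manual bisection. A minimal sketch of that idea; `run_bench` is a hypothetical stand-in for a Heaven pass/fail run, which in practice you judge by eye:

```python
def find_max_stable_mem(core, lo, hi, run_bench, step=10):
    """Bisect the (effective) memory clock between a known-good 'lo'
    and a known-failing 'hi' until the window shrinks to 'step' MHz."""
    while hi - lo > step:
        mid = (lo + hi) // 2
        if run_bench(core, mid):
            lo = mid  # mid passed: raise the stable floor
        else:
            hi = mid  # mid failed: lower the ceiling
    return lo  # highest clock observed to pass
```

With the numbers above (stable at 1710, failing at 2004, core held at 660), this converges in five or six benchmark runs instead of guessing clock steps by feel.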


----------



## rush2049

Take these at a glance, I wasn't trying for high scores (was super multi-tasking at the same time):

All @ Test 15 settings: core 660, shader 1340, memory (x2) 1800 - STABLE

GPUZ: http://www.techpowerup.com/gpuz/vs7xp/









Heaven 2.5:









3Dmark Vantage: P24392 http://3dmark.com/3dmv/3265349

3Dmark11: P8237 http://3dmark.com/3dm11/1413672


----------



## CSHawkeye

Quote:



Originally Posted by *taveston*


That's around what I get with the same GPU, CPU and overclock



Thanks. Think I am going to stick with this now. It's powering my U3011 while I have a GT 520 powering my dual 20-inch monitors for 2D use.


----------



## dvanderslice

Just purchased an eVGA GTX 590 Classified to replace my GTX 275s in SLI. Running stock clocks right now while I get used to what this card can do. Ordered a new monitor too, as what I have at 1680x1050 doesn't do the card justice. Here are some proof pictures I just took, along with a question; I'm not sure this was answered, as I couldn't find it in the search, and I figured this was the best place to ask...

I noticed that when using Precision or MSI Afterburner or Rivatuner etc. that the fan speed is always recognized as 0 for GPU1 but not for GPU2. I went to upgrade the bios as was recommended by eVGA to unlock the fans. This made no difference. I noted that the bios versions are different for each GPU, note the attached photo _capture.jpg_ (second to last thumbnail) that I took of Precision's GPU info. Is this normal? How do I have it so it recognizes that GPU1 has the same fan settings as GPU2? Is this possible? And is it normal to have different bios versions for each GPU? This strikes me as odd as well. Is there a way to update the bios so it is the same for each GPU in the card? I've tried plugging the monitor into one gpu or the other while updating the bios, but that doesn't make a difference.

Really, that's the only issue that I have run into. I had a GTX 295 and a 9800GX2 through the years and never had this issue with monitoring fans or BIOS versions being different, so I'm unsure of what the problem is. I notice I'm not the only one. Could you guys give me a hand?

Great card regardless; I'm getting mucho performance out of it so far and it runs much cooler than the 275s that I had in SLI. Very pleased.


----------



## Masked

After all of the conjecture about using a secondary card for Physx, I'm wondering if it's really worth it.

Those of you that have, have you seen that big of a difference?

I've heard it "helps" with CS5 and other work-based apps...Obviously it makes a big difference in games...But, is it difference enough to warrant a 550ti?

I've used a separate card before but, from what I've heard, it's a bit different with the 590...Any comments as such?


----------



## rush2049

I have a 275 as a physx slave card right now. You are right, in games that use it, it does keep the fps higher and much more steady with less dips in intensive scenes.

In programs like adobe photoshop I can't say as I haven't tested yet, but I have a lot of video footage to edit here soon so I will report back on adobe premiere rendering times with and without when I am finished. (I own CS5, not 5.5 or whatever is current now)


----------



## taveston

Quote:


> Originally Posted by *dvanderslice;13965715*
> Just purchased a eVGA GTX 590 Classified to replace my GTX 275s in SLI. Running stock clocks right now while I get used to what this can do. Ordered a new monitor too as what I have at 1680x1050 doesn't do the card justice. Here's some proof pictures I just took along with a question I have, I'm not sure this was answered as I couldn't find it in the search, I figured this was the best place to ask...
> 
> I noticed that when using Precision or MSI Afterburner or Rivatuner etc. that the fan speed is always recognized as 0 for GPU1 but not for GPU2. I went to upgrade the bios as was recommended by eVGA to unlock the fans. This made no difference. I noted that the bios versions are different for each GPU, note the attached photo _capture.jpg_ (second to last thumbnail) that I took of Precision's GPU info. Is this normal? How do I have it so it recognizes that GPU1 has the same fan settings as GPU2? Is this possible? And is it normal to have different bios versions for each GPU? This strikes me as odd as well. Is there a way to update the bios so it is the same for each GPU in the card? I've tried plugging the monitor into one gpu or the other while updating the bios, but that doesn't make a difference.
> 
> Really thats the only issue that I have run into. I had a GTX 295 and a 9800GX2 through the years and never had this issue with monitoring fans or bios versions being different, so I'm unsure of what the problem is. I notice i'm not the only one. Could you guys give me a hand?
> 
> Great card regardless i'm getting mucho performance out of it so far and it runs much cooler than the 275s that I had in SLI. Very pleased.


There is only one fan, I think, so one being reported at 0% is normal - same with my card.

I also have 2 different BIOS versions, one per GPU, so I think that is normal too.


----------



## Canis-X

Mine are also different....odd.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *rush2049;13968163*
> I have a 275 as a physx slave card right now. You are right, in games that use it, it does keep the fps higher and much more steady with less dips in intensive scenes.
> 
> In programs like adobe photoshop I can't say as I haven't tested yet, but I have a lot of video footage to edit here soon so I will report back on adobe premiere rendering times with and without when I am finished. (I own CS5, not 5.5 or whatever is current now)


When you do this, do you need to use an SLI Bridge on both cards? Or just stick it in a PCI slot and just select it for PhysX in the Nvidia control panel?

I've got my 275's, and I'm interested in trying it, but I'm only running a 800w PSU and so I'm assuming it can't handle both, right?

If so, I need to upgrade my PSU and try this.


----------



## rush2049

I have heard of people running a 590 on a 600 watt PSU; it really depends on what all is in your system and how far overclocked it is..... but as far as my recommendation, I would say no, not enough.....

You cannot SLI cards from different generations.... SLI is when they work together to render, but PhysX is not rendering....

But honestly, unless you do CUDA programming or play a ton of PhysX-enabled games.... you won't see a difference. (Results pending on CUDA in Adobe products, testing that this week.)

(forgive spelling and grammer, I am very tired when writing this post)


----------



## rush2049

Hey Alatar, if I could get my gpu speed in the op changed to the current I would appreciate it.

24 hours Folding Stable:
core: 660 mhz
shader: 1320 mhz
memory: 1800 mhz

Proof: http://www.techpowerup.com/gpuz/vs7xp/


----------



## Alatar

yeah np

gonna change it in a second.


----------



## Saeid

you are all rich bastards >-)


----------



## Shinobi Jedi

Quote:


> Originally Posted by *rush2049;13974245*
> I have heard of people running a 590 on a 600 watt, it really depends on what all is in your system and how far overclocked it is..... but as far as my recommendation, I would say no not enough.....
> 
> you cannot sli cards from different generations.... sli is if they work together to render, but physx is not rendering....
> 
> but honestly unless you do cuda programming, or play a ton of physx enabled games.... you won't see a different, (results pending on cuda in adobe products, testing that this week)
> 
> (forgive spelling and grammer, I am very tired when writing this post)


That's what I thought. Thanks for the response! +rep









Last side note on monitors:

I am in love with this BenQ XL2410T panel. Great picture quality after calibration, amazing 120hz performance.

And I have to say, if your eyes can handle it, 3D gaming with Nvidia's 3D Vision is no joke. It's the best 3D I've ever seen; it blows away Avatar or anything else playing at the cinemas. BFBC2, being 3D Vision Ready, takes on a whole different experience.

And what I love about the 590, is, I'm getting a constant 60fps in 3D at stock clocks. 60fps in 3D!

Any other kind of 3D is BS as far as I'm concerned compared to this. Even PS3 gaming, as it's only 720p and not true 120hz. Nvidia 3D Vision is the only way to game or watch movies in 3D in true 1080p. Any 590 or 500-series owner that already has a 120hz panel should really consider it. You can get wired glasses for $99, and IR ones for a little more online. The fact that Batman: AA, BFBC2 and Metro 2033 all come 3D Vision Ready makes it worth the extra $99 alone.

And if you're shopping for a 120hz display for FPS, I can't recommend this BenQ enough. The picture looks like arse out of the box, but you can find calibration profiles online and with a little tweaking, get an LCD that produces some of the deepest blacks I've ever seen for that kind of display. And one of the best Anti-Gloss matte jobs I've ever seen. I usually prefer Glossy for how it makes the colors pop, but this one really is perfect. And pixel perfect too. The 590 just hums and purrs with this display while it demolishes games. 3D or 2D 120hz. As a gamer who doesn't care about benches, I can't stop being impressed with this card.

The 590 is definitely Super-Fly Pimp


----------



## rush2049

LOL, thanks for the rep and the nice 'review'...... but you're preaching to the choir, we all love our cards....


----------



## bughole5

The 590 isn't fully utilizing its power; I think future drivers will fix this since there have only been literally 3 drivers out so far. Check out the crazy GPU usage spikes I get.....










I get 60 fps in GTA IV with ENBSeries on and everything maxed at 1920x1080, but I'll get 30 fps in Splinter Cell: Conviction, and in other DX9 games I'll get 40-50 fps..... but in a DX11 game like Battlefield: Bad Company 2 I'll get 80-90 fps, same with Dragon Age II... Mafia II with PhysX I'll get 29 fps....lol, god I can't wait for these drivers to come out to fix these dumb issues...
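For anyone who wants numbers instead of eyeballing the Afterburner graph: on drivers recent enough to ship nvidia-smi's query mode, you can log per-GPU utilization from a short script. A sketch only; the helper names are mine, though the query flags are standard nvidia-smi options:

```python
import subprocess
import time

def parse_util(cell):
    """Parse one 'utilization.gpu' CSV cell, e.g. '43 %' -> 43."""
    return int(cell.strip().rstrip('%').strip())

def poll_gpu_usage(samples=10, interval=1.0):
    """Return a list of samples; each sample has one reading per GPU."""
    history = []
    for _ in range(samples):
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader"], text=True)
        history.append([parse_util(line)
                        for line in out.splitlines() if line.strip()])
        time.sleep(interval)
    return history
```

A 590 shows up as two GPUs, so each sample is two numbers; one GPU pinned at 90%+ while the other idles around 40% would hint at SLI scaling rather than a pure CPU bottleneck.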


----------



## rush2049

You have what they call bottlenecking, I get it too. It is because of the Phenom architecture.

Solution: overclock HT Link as much as possible, or wait for bulldozer.


----------



## bughole5

Quote:


> Originally Posted by *rush2049;13986140*
> You have what they call bottlenecking, I get it too. It is because of the Phenom architecture.
> 
> Solution: overclock HT Link as much as possible, or wait for bulldozer.


bottleneck? I overclocked this thing from 3.6 to 4.0....what the f


----------



## rush2049

You overclocked the processor speed, not the rate at which it can supply commands to the PCI-Express lanes. That's what the HT Link speed controls.

The higher the HT Link, the faster it can talk to the card, but don't let the HT Link exceed the NB clock speed, as the NB sets the total bandwidth the CPU has for talking to the rest of the computer.... you wouldn't want the PCI-Express lanes requesting more than there was to offer.....
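As a back-of-the-envelope check on that rule (function names are mine and purely illustrative; nothing here queries real hardware):

```python
def ht_link_mhz(ref_clock_mhz, ht_multiplier):
    # Effective HT Link speed = reference clock x HT multiplier
    return ref_clock_mhz * ht_multiplier

def ht_link_ok(ht_link, nb_clock_mhz):
    # Keep the HT Link at or below the NB clock so the PCI-E lanes
    # can't request more bandwidth than the northbridge can supply
    return ht_link <= nb_clock_mhz

# Example: 200 MHz reference clock with a x10 HT multiplier -> 2000 MHz,
# which is fine against a 2000 MHz NB clock
link = ht_link_mhz(200, 10)
print(link, ht_link_ok(link, 2000))
```

Raising the reference clock to chase CPU speed raises the HT Link with it, which is why you may need to drop the HT multiplier to stay under the NB clock.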


----------



## bughole5

what should i change, some things are locked as you can see


----------



## rush2049

Mess with the HT Multiplier and the HT ref. Clock......
but you really should be doing this from the bios and not in windows.....
also for any increase in the HT Link speed, make sure it doesn't exceed the NB speed......


----------



## bughole5

Quote:


> Originally Posted by *rush2049;13986218*
> Mess with the HT Multiplier and the HT ref. Clock......
> but you really should be doing this from the bios and not in windows.....
> also for any increase in the HT Link speed, make sure it doesn't exceed the NB speed......


HT is locked and I can't change anything in the BIOS because the options aren't there.

I can only slide the HT Multiplier down......btw, are you getting the same issues as me?


----------



## rush2049

Well then I am sorry for your situation, you are going to have to deal with it.

I have to ask though, is your system prime95 stable?


----------



## bughole5

Quote:


> Originally Posted by *rush2049;13986246*
> Well then I am sorry for your situation, you are going to have to deal with it.
> 
> I have to ask though, is your system prime95 stable?


I didn't try any stress programs, but I've been playing games like Crysis and the system is running stable, with load CPU temperature hitting 58 degrees, and that's only when I play Crysis 2; in other games the load temp hits 50-55.

Are you getting the same issues as me with awful GPU usage? Are you going to upgrade to Intel?


----------



## bughole5

This guy: http://forums.nvidia.com/index.php?showtopic=202670

has an i7 and he's having similar issues, where less demanding games are not performing as well as they should.


----------



## grunion

Quote:


> Originally Posted by *bughole5;13986122*
> The 590 isnt fully utilizing its power, i think future drivers will fix this since there has only been literally 3 drivers out so far. Check out the crazy gpu usage spikes I get.....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I get 60 fps with GTA IV with enbseries on with everything maxed on 1920x1080, but Ill get 30 fps on Splinter Cell Conviction and other dx9 games, ill get 40-50 fps.....but in a dx11 game like battlefield bad company 2 ill get 80-90 fps same with dragon age II...mafia II with physx ill get 29 fps....lol god i cant wait for these drivers to come out to fix these dumb issues...


Are the clock speeds or voltages dropping out also?


----------



## bughole5

Quote:


> Originally Posted by *grunion;13986440*
> Are the clock speeds or voltages dropping out also?


nope, the clock speeds and voltage are constant at the max level


----------



## bughole5

this is when im playing gta IV


----------



## bughole5

What I don't understand is this: if it's a bottleneck, why does the GPU usage stay constant at 90%+ when I play Dragon Age II and I get awesome fps, but when I'm playing a game like Splinter Cell: Conviction the GPU usage sits at 40% and spikes up to 70% at the most.... the f is that, sounds like driver instability to me, homies.


----------



## bughole5

this is what usage looks like when im playing dragon age 2....


----------



## bughole5

this is for bulletstorm


----------



## ReignsOfPower

Are you using Vsync? If your computer can render 60 fps with only 70% GPU use, no big deal. What's more, not to insult your intelligence, but some games have their engines FPS-limited to a specific number unless you unlock it yourself; I know the Source engine is limited to 250fps. If you're getting 30fps in Splinter Cell where others are not, well, it could very well be a CPU bottleneck, as already mentioned.


----------



## bughole5

Lost Planet 2 in DX11 is utilizing the GPU the way it should and I'm getting great fps.......and now I don't understand why all these heavy games are running so well, but a game like Splinter Cell... check this out......










In a game like Splinter Cell: Conviction I'm getting 30 fps, come on bro....*** drivers


----------



## bughole5

Quote:


> Originally Posted by *ReignsOfPower;13986750*
> Are you using Vsynch? If your computer can render 60 fps with only 70% GPU use, no big deal. What's more, as to not insult your intelligence, some games have their engines FPS limited to a specific number unless you unlock it yourself. I know Source engine is limited to 250fps. If you're getting 30fps in splinter cell where others are not well it could very well be CPU bottleneck as already mentioned.


It just doesn't make sense to me though, bro.... why is the CPU able to communicate well with the GPU while playing heavy games like Lost Planet 2, Crysis 2 and Dragon Age II, but can't do it while playing Splinter Cell.... Bulletstorm has its fps capped, that's why it's at a constant 62 fps.


----------



## Masked

It's the drivers.

Period.

If you want a good set of drivers for the 590 right now, you basically have to write your own.

Just like the bios situation, it's going to take a while for anything "official" to be released that will even address some of these issues.

Patience, is a virtue.


----------



## bughole5

Quote:



Originally Posted by *Masked*


It's the drivers.

Period.

If you want a good set of drivers for the 590 right now, you basically have to write your own.

Just like the bios situation, it's going to take a while for anything "official" to be released that will even address some of these issues.

Patience, is a virtue.



Exactly. When I upgraded from the previous driver to the current 275.33, all of a sudden I was able to reach above 50% GPU usage in games; with future drivers the GPU will become stable. It's not a bottleneck, because a person with an i7 is having the same issues.


----------



## Smo

I've just paid for my EVGA GTX 590 Classified Edition - can't wait for it to arrive! I ordered it from the States though (I'm based in the UK), so I could be waiting a week or more.


----------



## RagingCain

Quote:



Originally Posted by *bughole5*


it just dosnt make sense to me though bro....why is the cpu being able to communicate well with the gpu while playing heavy games like lost planet 2, crysis 2, dragon age II, but can't do it while playing splinter cell....bulletstorm has its fps capped thats why its at 62 fps constant



Quote:



Originally Posted by *Masked*


It's the drivers.

Period.

If you want a good set of drivers for the 590 right now, you basically have to write your own.

Just like the bios situation, it's going to take a while for anything "official" to be released that will even address some of these issues.

Patience, is a virtue.



Quote:



Originally Posted by *bughole5*


exactly, when i upgraded from the previous driver to the current 275.33 all of a sudden i was able to reach above 50% gpu usage in games, in the future drivers the gpu will become stable. It's not a bottleneck because a person with an i7 is having the same issues.


It's 70-90% drivers. Also, please edit your posts to add more data; try to stop the double/triple/quadruple posting when you want to add more.

Secondly, you should give nVidia feedback on their forums. I rarely see anybody with a 590 posting complaints that their cards aren't running at full power, and I feel like the only one at times submitting 590 tickets.


----------



## bughole5

Quote:



Originally Posted by *RagingCain*


Its 70~90% drivers, and please edit your posts to add more data, try to stop the double/triple/quadruple posting when you want to add more data.

Secondly, you should give nVidia feedback on their forums, I rarely see anybody with a 590 posting complaints their cards aren't working full power, and I feel like the only one at times submitting 590 tickets.



I have posted on the nVidia forums; there are a couple of people with i7 CPUs who have posted about low FPS issues as well.


----------



## rush2049

Too bad there isn't a way to crash your video card, generate a dump/statistics, and send them to Nvidia.

If there was, I would do it repeatedly for benchmarks that use <100% of the GPU and CPU... aka... not running correctly...


----------



## bughole5

Quote:



Originally Posted by *rush2049*


Too bad there isn't a way to crash your video card generate a dump / statistics and send them to Nvidia.

If there was I would do it repeatedly for benchmarks that use <100% of the gpu and cpu..... aka.... not running correctly.....


Muhaha, the kid with the i7 can't get the FPS he should be getting.


----------



## RagingCain

Just a heads up, I may be getting banned soon. I have already had two infractions in the last hour for no reason, or more correctly, infractions that we disagree on severely. You can find me on EVGA, Nvidia, or Kingpin Cooling, all under the same name.

Post powered by DROID X2


----------



## rush2049

Oh, sad to see you go...... hope it doesn't happen.

In other news, I put my X6 up to 4GHz... still fiddling with voltages, but I got a quick 3DMark 11 run in: P8768 3DMarks http://3dmark.com/3dm11/1430757


----------



## bughole5

Quote:


> Originally Posted by *rush2049;14007686*
> Oh, sad to see you go...... hope it doesn't happen.
> 
> In other news, put my x6 up to 4ghz.... still fiddling with voltages, but I got a quick 3dmark 11 run in: P8768 3DMarks http://3dmark.com/3dm11/1430757


do you have lost planet 2? if so what fps do you get on benchmark B with everything maxed on dx11?


----------



## shane_p

I've been reading this thread for a long time, and finally decided to post.
Here is my 3DMark 11 score - not too shabby, I think.

Still playing around with the overclock though; I hope to beat this soon.

Hey, what FPS are you guys with a 590 getting in Red Faction: Armageddon?
I am getting lower FPS in it than in Metro 2033!!!
I mean, it's still playable - it just seems that graphically demanding?


----------



## rush2049

Quote:


> Originally Posted by *bughole5;14007996*
> do you have lost planet 2? if so what fps do you get on benchmark B with everything maxed on dx11?


I do not.


----------



## shane_p

Quote:



Originally Posted by *bughole5*


do you have lost planet 2? if so what fps do you get on benchmark B with everything maxed on dx11?


Here is my run with benchmark B
DX11 everything maxed


----------



## rush2049

New beta of MSI Afterburner:

http://event.msi.com/vga/afterburner/download.htm


----------



## RagingCain

Quote:



Originally Posted by *rush2049*


New beta of MSI Afterburner:

http://event.msi.com/vga/afterburner/download.htm


Go here for the latest 2.2.0 Beta 4.

http://forums.guru3d.com/showthread.php?t=346506


----------



## rush2049

That is what I linked to... though not a forum post... it was beta 3; now it is beta 4...


----------



## RagingCain

Quote:



Originally Posted by *rush2049*


That is what I linked to.... but not a forum post.... it was beta3, now it is beta4....


Sorry - it was linked to the main page of Afterburner, which sports the 2.1.0 download. I didn't want anyone confused, so I added the forum post, which is the official Afterburner thread; it's just not hosted by MSI.


----------



## bughole5

Quote:



Originally Posted by *shane_p*


Here is my run with benchmark B
DX11 everything maxed





You have a 1000 dollar CPU and a 750 dollar GPU and you're only getting 80 fps... I'm getting 50.4 fps with a 180 dollar CPU and a 750 dollar GPU.


----------



## bughole5

What are some stable overclocks for the GTX 590 without changing voltage?


----------



## CSHawkeye

Is this decent:

http://3dmark.com/3dm11/1435279


----------



## RagingCain

Quote:



Originally Posted by *bughole5*


you have a 1000 dollar cpu and a 750 dollar gpu and you're only getting 80 fps.....im getting 50.4 fps for a 180 dollar cpu and 750 dollar gpu


94.5 fps != 80 fps for starters, and if your number is accurate (50.4 fps) then his score is 87.5% better than yours.

Secondly, not all of us use our computer to just play video games.
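For anyone checking that math, the percentage is just the ratio of the two scores minus one - a throwaway sketch using nothing beyond the two FPS numbers from the posts:

```python
# How much faster result a is than result b, in percent.
def percent_faster(a, b):
    return (a / b - 1) * 100

# 94.5 fps vs 50.4 fps
print(round(percent_faster(94.5, 50.4), 1))  # 87.5
```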

Quote:



Originally Posted by *bughole5*


what are some stable overclocks for the gtx 590 without changing voltage


Again, refrain from double posting; edit your posts to ask new questions if nobody has responded since you posted last.

If you want information, it's not a good idea to try and insult people's purchases.


----------



## bughole5

Quote:



Originally Posted by *CSHawkeye*


Is this decent:

http://3dmark.com/3dm11/1435279



Hey, can you play a DX9 game and post a picture of your GPU usage while in-game, please?


----------



## shane_p

Quote:


> Originally Posted by *bughole5;14013376*
> you have a 1000 dollar cpu and a 750 dollar gpu and you're only getting 80 fps.....im getting 50.4 fps for a 180 dollar cpu and 750 dollar gpu


I play everything in 3D
I need every FPS I can get (and I run with everything on max quality for Nvidia Control Panel: supersample, Ambient Occlusion, etc...)

Quote:


> Originally Posted by *bughole5;14013388*
> what are some stable overclocks for the gtx 590 without changing voltage


I have my overclock set to what I want in the BIOS (as the stock clocks), so I don't have to overclock every time I change drivers or OS; my overclock is permanent (well, until I can get a little higher stable).

Anyway, for now my settings are:
700MHz core, 3,600MHz RAM (1,400MHz shader)

*working on a higher overclock though*


----------



## Yokes29

Hey 590 owners, got a question regarding the EVGA GTX 590 Hydro Copper Classified:
Does anyone have an idea if this card is on a limited production run? Meaning, will we see more 590 HC's later down the road, by September or October? Just curious, because I'm thinking of picking one up and then a 2nd 590 later (PS: this is not to replace the 480 SLI in my 1st rig).
Also, I'm gaming at 2560 x 1440 - one 590 should be enough to play most games on max with low AA/AF settings, yes? (except Crysis, Metro, Crysis 2 DX11, and possibly Witcher 2)
Any input would be much appreciated. Thanks


----------



## Robitussin

Hey Yokes29 - yes, you should be fine with most games at those settings. As for the 590 HC's being a limited run, I believe they are not. You can still find the other dual-GPU HC's, so I can't imagine these would be different; also, you can generally pick up a second one used pretty easily. GL with the purchase


----------



## Yokes29

Thanks for the input, good to hear! Thanks Robitussin, +1 Rep!


----------



## Masked

Quote:



Originally Posted by *Robitussin*


Hey Yokes29, Yes you should be fine with most games with those settings. *As far as the 590 HC's being a limited run, I believe they are not. *You can still find the other dual GPU HC's so I can't imagine they would be different, also you can always pick up a second one used pretty easily generally. GL with the purchase










Incorrect.

Do the math.

They cherry-picked GPUs, and Nvidia moved on to Kepler in December.

So in other words, I'm asking you how you could POSSIBLY think the 590 wasn't limited when the die has changed and Kepler has been in production for the past 5 months.

Is it the shoe fairies? Could it be leprechauns? Are they making these cards at night while the coremakers sleep?!?!? OMG CONSPIRACY!!!

The 590 was a one-run shot with a staggered release... It is limited... There won't be any more.

EVGA's release is staggered and the HC's are extremely limited.

My company as a VENDOR got about 1/20th of what we normally get because of how limited this run was. (I can't give actual numbers, sorry)


----------



## Smo

Quote:



Originally Posted by *Masked*


Incorrect.

Do the math.

They cherry picked GPU's and Nvidia moved on to Kepler in December.

So in other words, I'm asking you how you could POSSIBLY think the 590 wasn't limited when the dye has changed and Kepler has been under production for the past 5 months.

Is it the the shoe fairies? Could it be leprechauns? Are they making these cards at night while the coremakers sleep?!?!? OMG CONSPIRACY!!!

The 590 was a 1 run shot with a staggered release...It is limited...There won't be any more.

EVGA's release is staggered and the HC's are extremely limited.

My company as a VENDOR got about 1/20th of what we normally get because of how limited this run was. (I can't give actual numbers, sorry)


Bugger - I'd better start saving for a second one ASAP then (I'm really anal and won't want to run two cards that look different in SLi).


----------



## Yokes29

Quote:


> Originally Posted by *Masked;14033228*
> Incorrect.
> 
> Do the math.
> 
> They cherry picked GPU's and Nvidia moved on to Kepler in December.
> 
> So in other words, I'm asking you how you could POSSIBLY think the 590 wasn't limited when the dye has changed and Kepler has been under production for the past 5 months.
> 
> Is it the the shoe fairies? Could it be leprechauns? Are they making these cards at night while the coremakers sleep?!?!? OMG CONSPIRACY!!!
> 
> The 590 was a 1 run shot with a staggered release...It is limited...There won't be any more.
> 
> EVGA's release is staggered and the HC's are extremely limited.
> 
> My company as a VENDOR got about 1/20th of what we normally get because of how limited this run was. (I can't give actual numbers, sorry)


I see...

So I guess whatever is in stock now is most likely the last batch then.
Hehehe, everyone go buy their 2nd 590 noaw!!


----------



## RagingCain

I have posted my 590s up for sale, going to have to bench them off though very soon.

http://www.overclock.net/main-components/1053428-asus-r3e-black-edition-24gb-ddr3.html


----------



## Smo

Why are you selling up bud?


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14038724*
> Why are you selling up bud?


Even moar powa

Going to be looking at either 3GB or 2GB VRAM cards for a possible 3x 2560x1600 monitor setup.


----------



## Smo

More power than quad-SLI 590s!? Not going red are we

And please forgive my ignorance (I'm new to all this) - I thought the 590 has 3GB of VRAM? Or are VRAM and RAM not the same thing?


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked;14033228*
> Incorrect.
> 
> Do the math.
> 
> They cherry picked GPU's and Nvidia moved on to Kepler in December.
> 
> So in other words, I'm asking you how you could POSSIBLY think the 590 wasn't limited when the dye has changed and Kepler has been under production for the past 5 months.
> 
> Is it the the shoe fairies? Could it be leprechauns? Are they making these cards at night while the coremakers sleep?!?!? OMG CONSPIRACY!!!
> 
> The 590 was a 1 run shot with a staggered release...It is limited...There won't be any more.
> 
> EVGA's release is staggered and the HC's are extremely limited.
> 
> My company as a VENDOR got about 1/20th of what we normally get because of how limited this run was. (I can't give actual numbers, sorry)


According to the EVGA reps on their forums, there are still small quantities of 590's being manufactured, and because of that they are still hard to find.

They have shown up twice in the EVGA store and sold out again pretty fast. Have them put you on their notify list and email you when they come back in. That's what I did, and like I just said, they've shown up back in stock at the EVGA store twice since it was said all remaining stock had been released to market.

I believe the Hydros are easier to find than the regular cards, too.

I'm not saying you're wrong and they're right - in fact you probably are right. I'm just reporting that the EVGA reps are saying different and that more 590's are being made, though they could just be full of it and referring to the same staggered release you were referring to.


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14038916*
> More power than quad SLi 590s!? Not going red are we
> 
> And please forgive my ignorance (I'm new to all this) - I thought the 590 has 3GB of VRAM? Or is VRAM and RAM not the same thing?


It has 3GB total, evenly distributed per GPU, so effectively 1.5GB.

No, I am really thinking 3x Gainward Phantom 3GB 580s if I can get them stateside, or something similar with 3GB.
Quote:


> Originally Posted by *Shinobi Jedi;14038949*
> According to the EVGA reps at their forums, there are still small quantities of 590's being made and manufactured, and because so, are still hard to find.
> 
> They have shown up twice in the EVGA store and sold out again pretty fast. Have them put you on their notice and email you when they come back in. That's what I did, and like I just said, they've shown up back in stock at the EVGA store twice since it's been said all remaining stock has been released to market.
> 
> I believe the hydro's are easier to find than the regular cards, too.
> 
> I'm not saying you're wrong and they're right, in fact you probably are right. I'm just reporting that the Evga reps are saying different and that more 590's are being made, though they could just be full of it and referring to the same staggered release you were referring to


I think it was Jacob, or John the product manager, I saw this too, stating that they have kept up production of the 590s for the time being.


----------



## Smo

Quote:


> Originally Posted by *RagingCain;14039072*
> It has 3GB total, evenly distributed per GPU, so effectively 1.5GB.
> 
> No, I am really thinking 3x Gainward Phantoms 3GB 580s if I can get them stateside, or something similar 3GB.


Interesting - I never thought of it that way! Do you think perhaps that could be why it lags behind the 6990 in some benchmarks? I always assumed it was really gaming resolution that determined how important VRAM was.


----------



## RagingCain

Quote:



Originally Posted by *Smo*


Interesting - I never thought of it that way! Do you think perhaps that could be the cause it lags behind the 6990 in some benchmarks? I always assumed that really it was gaming resolution that determined how important RAM was.


Well, most of what fills up your VRAM is textures, plus the frames of the rendered images, so quite a bit of it is down to your resolution.

Your max resolution is determined by memory capacity, and your FPS potential (meaning max FPS) can also be limited by VRAM, though that's not guaranteed.

In general, the more VRAM you have the better, but very, very few games use up to 1.5GB at 1920x1080/1200 resolution.
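To put rough numbers on that, here is a back-of-envelope sketch of the frame-buffer cost alone (my own illustration, not from the thread - the buffer count and 32-bit format are assumptions, and real VRAM use is dominated by textures):

```python
# Back-of-envelope VRAM cost of raw frame buffers alone.
# Assumes 32 bits per pixel and 3 buffers (front + back + depth) - textures
# and render targets on top of this are what actually eat most of the 1.5GB.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers / 1024**2

print(round(framebuffer_mb(1920, 1080), 1))  # ~23.7 MB at 1080p
print(round(framebuffer_mb(7680, 1600), 1))  # ~140.6 MB for 3x 2560x1600 surround
```

Even triple-wide surround only costs on the order of 100-150MB in raw buffers, which is why textures and AA, not the frame buffer itself, are what push past 1.5GB at high resolutions.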


----------



## Masked

Quote:



Originally Posted by *Shinobi Jedi*


According to the EVGA reps at their forums, there are still small quantities of 590's being made and manufactured, and because so, are still hard to find.

They have shown up twice in the EVGA store and sold out again pretty fast. Have them put you on their notice and email you when they come back in. That's what I did, and like I just said, they've shown up back in stock at the EVGA store twice since it's been said all remaining stock has been released to market.

I believe the hydro's are easier to find than the regular cards, too.

I'm not saying you're wrong and they're right, in fact you probably are right. I'm just reporting that the Evga reps are saying different and that more 590's are being made, though they could just be full of it and referring to the same staggered release you were referring to


I talk to their marketing department like...Weekly.

The chip itself is no longer being made; they stopped processing Fermi on the production line in July/August (this is what my sources say).

Considering this is a cherry-picked GPU, and given what exists sample-wise atm (can't give detail), production on Kepler has been going since November, with full production in December.

EVGA got X # of cores... Asus got X # of cores... If EVGA fell so far behind that they're still assembling their PCBs, well... that's their own problem, and it still doesn't change the fact that only X # exist.

Asus, I've been told, held back on their stock in order to create the MARS edition... which is why they still have not been releasing more.

Knowing EVGA, they 50/50'd their stock, and there are certainly fewer people that WC on the market, so the HC will be easier to find.

This is still an extremely limited card, with only X #'s released... That's a hard line and not one that's going to change.

I don't mean to sound arrogant or assuming, or well, I really want to say something else but I'll get in trouble... Anyway, only X # of cores exist... When I said only X were "released", the #'s I see are cores, not the actual full release numbers... so in that regard, it could be possible. They told us it was staggered, which obviously there's evidence of.


----------



## brettjv

Thread cleaned. Let's have no more of _that_ talk.


----------



## RagingCain

I guess I missed an argument.

Anybody else with 590 SLI having low GPU usage in Crysis 2 DX11?

Post powered by DROID X2


----------



## Masked

Quote:


> Originally Posted by *RagingCain;14048346*
> I guess I missed an argument.
> 
> Anybody else with 590 sli having low GPU usage in Crysis 2 Dx11?
> 
> Post powered by DROID X2


Yes, I logged in from work to screw around a bit.

I'm getting 70-80% on 1 core, 0 on the other using the most recent WHQL w/stock Bios.

You didn't miss much, just drama per the usual.


----------



## RagingCain

Quote:


> Originally Posted by *Masked;14048881*
> Yes, I logged in from work to screw around a bit.
> 
> I'm getting 70-80% on 1 core, 0 on the other using the most recent WHQL w/stock Bios.
> 
> You didn't miss much, just drama per the usual.


It's only in DX11 - again. Always DX11...


----------



## rush2049

I must've missed it as well... oh well...

As for Crysis 2, I can't even play because I have a second card for PhysX, which puts me on their known bug list. I get the insane flickering because of my third card, and even if I disable it I still get the flickering. Nvidia says they are working on a new driver to fix it...

Sigh...

Got a question for y'all: how cheap can I get a GTX 580? Would it be possible to sell a single GTX 590 to switch? I got the impression a 590 was cheaper...

My AI programming really started working a ton faster because of the extra CUDA cores, but I hit the VRAM wall... lol... I will never be satisfied...


----------



## Masked

Quote:



Originally Posted by *RagingCain*


Its only in DX11 again, always DX11....


I'm starting to see this more and more in some of our beta apps.

Do you feel that it's a driver issue or purely one of DX11 not utilizing SLI?


----------



## RagingCain

Quote:



Originally Posted by *Masked*


I'm starting to see this more and more in some of our beta apps.

Do you feel that it's a driver issue or purely one of DX11 not utilizing SLI?


I have a far-out theory.

DX11's biggest unleashed weapon is making games "forced" multi-threaded. I.e., code your game with DX11, and DX11 will handle multi-core/multi-thread CPUs. Civilization V is the only title I know of that has it activated, and it was activated recently too. I can use all 12 threads on my CPU at 50-75% in Civ.

I think somewhere along the way there was a small brush into DX11 multi-threading for GPUs. 99% of games would already have profiles, and we see this. But new DX11 games would have no profile until nVidia and ATI make them, etc.

For video card drivers that have no "override" profile for the game, i.e. an SLI profile / CAPs, DX11 attempts to multi-thread the code through the GPUs natively through the OS. It's just not as efficient or effective as SLI or CFX.

Like I said, it's a far-out theory. Seeing 40% GPU usage that is in sync on every card just doesn't make sense to me. I would expect one GPU at 99% while the others are at 0% if there were a multi-GPU issue.

Alternatively, nVidia's default SLI profile is just so generic and sucky that they can claim these massive 300% benefits with a new driver release. It could be some weird, crazy marketing gimmick, but why target primarily DX11 titles?


----------



## ReignsOfPower

I've not noticed any problems with Crysis 2 in DX11 in terms of GPU use. I have noticed though that when I shoot something from close range, all the dust and particles that get thrown up in the air make my rig go from a smooth frame rate to single digits until it clears up.


----------



## Smo

Guys - would you happen to know if my PSU (NZXT Hale90 850W Gold) would be able to support two GTX 590s in Quad-SLi? Seems pretty tight to me, I read a great review where someone managed 1088w out of his Hale90 but I'm not massively confident in doing it and just 'hoping for the best'.


----------



## RagingCain

Okay, probably not the best time to do a write-up - I am totally drunk. I wrote this up for someone specific, but I figured I would share how to flash our 590s. This is really just getting the USB flash drive set up, not actually executing nvflash.exe with commands, so you will still need to know how to use NVFlash. I will probably add that to this later if you guys want.

I have all the necessary files except the BIOS included in this file setup on MediaFire.
LINK

If you already have all the files, that's great, but if you have any issues... that's probably why.

This is really simple to do: 27% of the time mistakes are from formatting, and the other 72% of the time it's human error. The other 1% means NVFlash needs an update.

1.) You need a flash drive you can format. Don't use anything above 16GB - in fact, people have had issues with 16GB, though I never have. So a good 16MB flash drive can work, or a 4GB one.

2.) Format it to FAT32 with the HP Disk Storage Format Utility, so you need to install that first. Don't waste your time doing it with Windows, as there are more steps.

3.) Now run the HP Disk Storage Format Utility, and it will look like this:

The reason I selected BLUE over Quick Format is that some people do have issues using Quick Format instead of a full-fledged format. Keep in mind: try with Quick first, but if you have issues, try again without Quick Format. If you still have issues, get on here and ask.

4.) One of the files I included is the set of basic Win I/O and boot sector files, called USB IMAGE, which you need to set up as such before executing START. What you are doing is creating a DOS startup disk as shown in the images; that folder has everything ready. Just select the folder and, once everything else is set up, click START.

P.S. For now, don't give the USB drive a name - DOS hates that, and if you're getting an error and aren't sure why, it's probably because the name is longer than (I think) 6 characters.

So go to where you downloaded your files and set it up like the image shows:
5.) Once it's completed (no errors) you should have a blank USB drive. Don't worry, it's done correctly if it's blank. To verify 100% that it's set up as a boot drive, change Folder Options to full visibility - no hidden files, no hidden system files - and the files should show up.

6.) Now copy NVFLASH.EXE and its associated 32.sys and 64.sys files onto the root of the USB drive. There should be nothing else visible if system files are hidden in your Windows 7.

7.) Now boot from the flash drive, which is quite easy to do. If you are terrible at this kind of thing, just post which motherboard you have. If everything worked correctly, you should see C:\ (a DOS prompt). There are a few commands available, and the one you just added is nvflash.

8.) Learn the NVFLASH commands for flashing BIOSes, put the BIOS files in the root of your USB, boot from the USB, and you can flash them. nvflash GTX590A.ROM is the most basic command, and for the most part it works on its own, with few exceptions.

P.S. For ASUS, you have to press F10 during POST to bring up the temporary boot order, so you can do a quick one-time boot from the USB drive (or CD-ROM etc.). I think for EVGA it's F8 or DEL.

P.P.S. Reign, I know I owe you BIOSes. I am drunk; tomorrow, after an exam, I will make them for you.


----------



## RagingCain

Quote:



Originally Posted by *Smo*


Guys - would you happen to know if my PSU (NZXT Hale90 850W Gold) would be able to support two GTX 590s in Quad-SLi? Seems pretty tight to me, I read a great review where someone managed 1088W out of his Hale90 but I'm not massively confident in doing it and just 'hoping for the best'.


nooo don't do it!

It isn't enough, the cards need more amperage.


----------



## brettjv

Quote:


> Originally Posted by *RagingCain;14051923*
> nooo don't do it!
> 
> It isn't enough, the cards need more amperage.


Small point of clarification... the amperage from a PSU will never come up short and compromise your cards' operation without you knowing.

Well, if you try to draw too much, you may get ripple in your current (which degrades components over time, but has to be absurdly bad in order to screw up how your cards operate in the short term), or your PSU may shut off due to tripping over-current protection. But... if it isn't shutting down, it will deliver the requested amperage.

850W may be close for quad-SLI, but my bet is that without any serious overvolting, and as long as you don't try to (like) run Super PI and FurMark at the same time, that PSU would actually work fine.
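For a rough sanity check on the quad-SLI budget, here is a back-of-envelope sketch (my own numbers, not from the thread: NVIDIA's quoted 365W TDP per GTX 590, and an assumed ballpark 200W for the rest of the system):

```python
# Rough quad-SLI power budget. The TDP and rest-of-system figures are
# assumptions for illustration, not measurements.
GTX_590_TDP = 365      # W, NVIDIA's quoted board power for one GTX 590
REST_OF_SYSTEM = 200   # W, ballpark for CPU, motherboard, drives, fans

def psu_headroom(psu_watts, cards=2):
    # Headroom left if every card draws full TDP at once (worst case).
    return psu_watts - (cards * GTX_590_TDP + REST_OF_SYSTEM)

print(psu_headroom(850))   # -80: over budget if both cards hit full TDP
print(psu_headroom(1200))  # 270: comfortable margin
```

On paper an 850W unit is about 80W short at full TDP on both cards, while typical gaming draw sits well below TDP - which is why "it might work, but it's tight" is the honest answer.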


----------



## zerounleashednl

Hi guys, finally the cards are here... after some hesitating between dual ASUS MATRIX GTX580's (triple slot x2 = pfff...) or these beauties... I decided to just go for the QUAD SLI solution!

So, if you could add me to the memberlist plz!


----------



## Smo

Porn


----------



## Alatar

Quote:


> Originally Posted by *zerounleashednl;14067732*
> Hi guys, finally the cards are here... after some hesitating between dual ASUS MATRIX GTX580's (triple slot x2 = pfff...) or these beauties... I decided to just go for the QUAD SLI solution!
> 
> So, if you could add me to the memberlist plz!


Nice! Adding you in a minute.

E: I'm at an airport in Trondheim, Norway at the moment and I'm not sure if OCN is working properly - the internet connection here is kind of quirky.


----------



## zerounleashednl

Quote:


> nice! adding you in a minute
> 
> E: I'm at an airport in Trondheim, Norway currently and I'm not sure if OCN is working properly, the internet connection here is kind of quirky.


Great! Oh btw they are at stock clocks... (for now)


----------



## Smo

Just got a call from the Mrs - my EVGA Classified 590 is at home waiting for me

I so wish I wasn't at work right now - I'll post pictures up later.


----------



## Arizonian

Quote:



Originally Posted by *Smo*


Just got a call from the Mrs - My EVGA Classified 590 is at home waiting for me

So wish I wasn't at work right now - I'll post pictures up later.


Ah, the anticipation - it's a great feeling. Just think, you're hours away from crazy fun FPS.


----------



## Smo

Quote:



Originally Posted by *Arizonian*


Ah the anticipation, it's a great feeling, just think your hrs from crazy fun FPS.


I'm like a kid at Christmas!


----------



## Masked

Quote:



Originally Posted by *RagingCain*


I have a far out theory.

DX11's biggest unleashed weapon is making games "forced" multi-threaded. I.e., code your game with DX11, and DX11 will handle multi-core/thread CPUs. Civilization V is the only title I know of that has it activated, and it was activated recently too. I can use all 12 threads on my CPU in the 50-75% on Civ.

I think some where along the way there was a small brush into DX11 multi-threading for GPUs. 99% of games would already have profiles, and we see this. But new DX11 games would have no profile, till nVidia and ATI make them etc.

For video card drivers that have no "over-ride" profile for the game i.e. SLI Profile / CAPs, DX11 attempts to multi-thread the code through the GPUs natively through the OS. Its just not as efficient or effective as SLI or CFX.

Like I said, its a far out theory. Seeing 40% GPU usage, that is in sync on every card just doesn't make sense to me. I would expect one GPU at 99% while the others are at 0% if there was multi-GPU issue.

Alternatively, nVidia's default SLI profile is just so generic and sucky, they can claim these massive 300% benefits with new driver release. It could be a some weird crazy marketing gimmick, but why target primarily DX11 titles?


I'm actually in agreement except I'm seeing it in almost every game we currently have...

Not sure why ATM but looking into it.


----------



## Xtri

Hello, I just got my Gainward GTX 590 today, and when I check GPU-Z one of the GPUs is always at full speed while the other is not. The fans are running at 1500rpm and it's quite loud - is this normal when it is idle?

Also, GPU 1 usage is high in, for example, Crysis and Bad Company 2, but GPU 2 runs at max 37% usage. Is this normal?


----------



## Shinobi Jedi

Quote:


> Originally Posted by *RagingCain;14038841*
> Even moar powa
> 
> Going to be looking at either 3GB or 2GB VRAM cards for possible 3x 2560x1600 monitor.


Hmm...

Just as I was about to pull the trigger on a second 590, you and some of the members of the EVGA forum have me thinking otherwise. Some there are claiming that 1.5GB of VRAM isn't even enough for 5760x1080.

Is this true? If so, I may go for a pair of 3GB 580s.

Though I'd be slightly bummed to lose my little LED bling on the 590 card. That light-up logo is like the hood ornament when it comes to the aesthetics of my rig. It's what makes it Pimp...


----------



## xFyre

Quote:


> Originally Posted by *Shinobi Jedi;14076836*
> Hmm...
> 
> Just as I was about to pull the trigger on a second 590, you and some of the members of the EVGA forum have me thinking otherwise. Some there are claiming that 1.5gb of VRam isn't even enough for 5760x1080
> 
> Is this true? If so, I may go for a pair of 3gb 580's
> 
> Though I'd be slightly bummed to lose my little LED bling on the 590 card. That light up logo is like the hood ornament when it comes to the aesthetics of my rig. It's what makes it Pimp...


Yes, 3GB cards would be better suited for surround. If you want aesthetics, why don't you check out the 3GB Lightning XE 580s?


----------



## Arizonian

Quote:


> Originally Posted by *Xtri;14076050*
> Hello, I just got my Gainward GTX 590 today. And when I check GPU-Z one of the GPU's are always at full speed while the other is not. The fans are running at 1500rpm, and it's quite loud, is this normal when it is idle?
> 
> Also, GPU 1 usage is high in for example Crysis and Bad Company 2, but GPU 2 runs at max 37 % usage. Is this normal?


No, it's not normal; you should be at full throttle (99%). You're bottlenecking. Overclocking your i7 920 will help.

Have you set your fan to manual? Does Gainward come with GPU software like Afterburner or Precision?

You want to enter the settings, set your fan control to manual, and choose what fan percentage to run at depending on how hot your card gets. The auto fan settings don't work well on any of these cards.


----------



## Xtri

I'm running my i7 920 at 4GHz. I tried clocking it back to 2.6GHz and it didn't make a difference.


----------



## Arizonian

Quote:


> Originally Posted by *Xtri;14078378*
> Stop with the bottleneck stuff. I'm running my i7 920 at 4GHz; bottleneck discussions are overrated. I tried clocking it back to 2.6GHz and it didn't make a difference.


I didn't see any OC on it. Then what would keep a card from reaching 99%, anyone? A PCIe lane going bad? Just trying to help. Have you checked GPU-Z to verify you're @ x16?


----------



## Xtri

Sorry for the negative comment, just a bit tired from staying up all night trying to get stuff sorted out









Anyway, it reaches 98% on GPU 1, but GPU 2 isn't really hitting its full potential.

And yeah, GPU-Z is also where I monitor it. The max it has reached on GPU 2 is 37%.
And yes, it is x16.

An OCed i7 920 should be enough though, shouldn't it? Or do I have to get into higher resolutions to reach the card's full potential? I'm running a single screen at 1920x1080; the games I've tested are mainly Crysis and Battlefield: Bad Company 2.


----------



## delavan

When I had my GTX 590 Classified, I noticed that the GPU clocks were downclocking (throttling) if I wasn't asking enough of it...

Try raising the eye candy to the max... shot in the dark...


----------



## kevink82

Wondering what temps you guys get with Crysis 2 DX11? Just got two 590s and they're reaching 90°C, which is a little worrying for me.


----------



## Smo

Quote:



Originally Posted by *kevink82*


Wondering what temps you guys get with Crysis 2 DX11? Just got two 590s and they're reaching 90°C, which is a little worrying for me.










90°C is nothing to worry about; in most of the benchmarks I've seen they're hitting 86/87°C.


----------



## kevink82

Glad to hear that. I've made a custom profile to lower the temps further. Here are some pics of the new toys; they're both Gigabyte brand and a new batch, according to the supplier.

Love the GeForce logo that shines, keke.


----------



## xjonathanvx

Very clean build, kevink82. Looks amazing!


----------



## xjonathanvx

Here are some older pics of my rig that I have access to here at work. I'll edit this post and add some updated pics with a few less dangling cables... enjoy! The cards are Classy Hydro Copper LEs from EVGA.


----------



## dboythagr8

Quote:



Originally Posted by *xFyre*


Yes, 3GB cards would be more adequate for surround. If you want aesthetics, why don't you check out the 3GB Lightning XE 580s?


Where can you get the Lightning XE 580s? I can't find them anywhere. I'm trying to decide between them and 590s.


----------



## bughole5

Some games still get 40FPS minimums even with the 590; I think people looking for FPS should get either a 590 or 2x 580s... spend the dough, man. Also, get the i5 2500K.


----------



## xFyre

Quote:



Originally Posted by *dboythagr8*


Where can you get the Lightning XE 580s? I can't find them anywhere? I'm trying to decide between them and 590s


MSI is currently making more, I don't think there's any store that has them in stock right now. I asked an MSI rep on the OCUK forums about it and he said that they would probably be shipped to retailers halfway through this month. I assume it will be the same for Newegg, NCIX, etc.

Keep an eye out for them


----------



## capchaos

For users with dual 590s, aka Quad SLI: here are my results with Crysis 2. As you can see, Quad SLI does not scale at all.


----------



## xFyre

Quote:


> Originally Posted by *xjonathanvx;14081383*
> Here are some older pics of my rig I have access to here at work. I'll edit this post and get some of my updated pics with a little less dangling cables....enjoy! The cards are Classy Hydro Copper LE's from EVGA.


What SLI connector are you using? It looks awesome! They look like these, but the guy selling them says they're not flexible... Maybe he's wrong? Because they don't look all that rigid to me.


----------



## Shinobi Jedi

Is anyone running Crysis 2 through Steam able to get the benchmark to work? I can't for the life of me get it going.

I'm really close to pulling the trigger on another 590. Maybe as early as tomorrow.

Are there any dual 590 owners running theirs at 5760x1080 with three Full HD/3D 120Hz panels and the latest drivers? All the reviews I can find use old drivers. If so, I have a few questions:

How's the performance? Are you still hitting caps because of the 1.5GB of VRAM per GPU at that resolution?

Will an i7 920 OC'd at 4.0GHz be too much of a bottleneck to bother with two 590s in Quad SLI and those panels?

While I know 3D Vision Surround is nearly impossible with any of these cards, even the 3GB ones (if I understand things correctly), how about watching 3D movies? Could I run a 3D film on each panel at the same time if, say, I wanted to have a few friends over, divide them up, and have a few watch the film on each display? I'm assuming that's not possible no matter what setup I have, and 3D movies can only run on one panel at a time, yes?

(Interestingly, I picked up a second pair of Nvidia 3D Vision glasses to see if the screen would be big enough for one person to watch a 3D game while another plays it. Not only does it totally work, but even at 10' back, this 24" BenQ puts out a vivid 3D picture to both my wife's eyes and mine at the same time. Admittedly, the screen is a little small from that distance, but not enough to take you out of the movie. Nothing anyone who grew up in the 70s or 80s wouldn't be used to. After all, most TVs were 19" back in those days.)

If I could get the card at a place like Fry's, where I got my first card, instead of online, it'd be a no-brainer. I'd go pick it up and try it out. If it wasn't worth it, or wasn't working right, I'd have 30 days to return it for a refund. But I'll have to buy online, so I'm not sure if I should pull the trigger on a second one or wait for Kepler. Seeing some respected owners on here already dumping their cards for 3GB VRAM ones is a little disconcerting as well.

Like I said on here before, if the card wasn't such a limited, boutique production run, I'd just wait it out and see how what's coming up performs. And if the next-gen stuff is disappointing or full of bugs, then I'd get a second 590, if it can do what I'm looking for.

Any input from dual 590 owners running three Full HD/3D 120Hz panels on the latest drivers, and whether you're still hitting caps with the 1.5GB VRAM, would be appreciated!

To be clear, I understand that you definitely will with a 2560x1600 IPS panel, but until those can do 120Hz, I'll stick with the 1080p ones that can.









I bought the BenQ XL2410T and I can't say enough good about it. I love this panel. I'm planning on getting two more, and I don't want to buy another 590 just to find that I'm still going to hit performance caps because of VRAM.

If that's the case, then I'd rather follow others' lead and dump my 590 for two 3GB 580s, or even just hold off on the two panels and another card altogether and go with what I've got until Kepler comes and we see if it's a worthy upgrade, or a new architecture that's going to give every new owner the early-adopter blues...

In which case, if I waited to find out, and then saw that a second 590 would give me what I want and be the way to go, by then they'd all be gone.

Right now, my rig is performing pretty Pimp with just the one 120Hz panel for sure, so I'm open to waiting it out for two more and a rig upgrade if it ends up being worth it.

So, I'm really trying to find out: are two 590s enough to run three 120Hz panels, play games at full settings in that kind of surround setup, and run a 3D film concurrently on all three panels (or in surround), if that's even possible? And lastly, is my i7 920 @ 4.05 enough for that setup, or will it still be a bottleneck that makes that kind of investment pointless without a new CPU/mobo?

Also, game-wise, I'm interested in doing this for BF3. They say that if you can run BFBC2 well, you can run BF3 well. And I can; I get almost 100FPS with everything turned up and 32QSAA. But I noticed that BFBC2 is one of the few games that really seemed to get a huge performance boost from a second 590, and I'd really love to be able to get a constant 120FPS with everything turned up when BF3 comes out in October.

But I'm also not sure if it's worth the investment in a second 590 and a new PSU to run it, if Kepler or whatever else is going to blow that performance out of the water.

(Sigh) None of this would be an issue if the cards weren't such a limited run yet so high in demand... But I guess that goes with the territory with video cards at this level.

The MSI 3GB 580 Lightnings look like fantastic cards, but another set of colored fans just spinning on the card doesn't have the same panache as the GeForce LED logo.

But of course, performance is what matters in the end.

Thanks to all who read my whole rant and any input to help me decide!

Cheers!


----------



## zerounleashednl

I read the whole rant and got some answers, but not to all your questions...









Quote:



Are there any dual 590 owners running their's with at 5760x1080 with three Full HD/3D 120hz panels and the latest drivers? All the reviews I can find are with old drivers. If so, I have a few questions:


Got a Quad SLI setup here with 3 screens, two 1920x1200 and one 1920x1080 (120Hz/3D), and I can't seem to get them to work together. If I finally do, I'll have some answers on the 5760x1080 1.5GB VRAM usage. For now I only have some answers on VRAM usage on a single screen, with and without 3D Vision enabled (read on...)

Quote:



To be clear, I understand that you definitely will with a 2560x1600 IPS panel, but until they can do 120hz, I'll stick with the 1080p ones that can.


From my understanding, the limitations are in the Dual Link DVI and HDMI cables (max [email protected]) and not in the 2560x1600 screens.
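The cable-bandwidth point can be sanity-checked with rough numbers. This is a minimal sketch: the ~330 MHz dual-link DVI pixel-clock ceiling (two TMDS links at ~165 MHz) and the ~8% blanking overhead are approximate assumptions, not exact spec values for any particular monitor.

```python
# Rough pixel-clock estimate: why 2560x1600 panels were stuck at 60Hz
# over dual-link DVI. The 330 MHz ceiling and the ~8% blanking
# overhead (roughly CVT reduced blanking) are approximate assumptions.

DL_DVI_MAX_MHZ = 330  # two TMDS links at ~165 MHz each

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.08):
    """Approximate pixel clock needed for a given mode, in MHz."""
    return width * height * refresh_hz * blanking / 1e6

for hz in (60, 120):
    clk = pixel_clock_mhz(2560, 1600, hz)
    verdict = "fits" if clk <= DL_DVI_MAX_MHZ else "exceeds"
    print(f"2560x1600 @ {hz}Hz needs ~{clk:.0f} MHz ({verdict} dual-link DVI)")
```

By this estimate, 60Hz needs roughly 265 MHz and squeezes under the dual-link limit, while 120Hz needs over 500 MHz, so the cable rather than the panel is the wall.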

Quote:



Also, game wise, I'm interested in doing this for BF3. They say that if you can run BFBC2 well, you can run BF3 well. And I can. I get almost 100fps with everything turned up and 32QSAA. But I noticed that BFBC2 is one of the few games that really seemed to get a huge performance boost from a second 590, and I'd really love to be able to get 120fps constant with everything turned up when BF3 comes out in Oct.


 An interesting one. So I fired up BFBC2 and recorded some MSI Afterburner readings (see below) including CPU and VRAM usage.

Quote:



The MSI 3Gb 580 Lightning's look like fantastic cards, but another set of colored fans, just running in the card, doesn't have the same panache as the GeForce LED logo.


I know exactly what you mean, and I don't like the blue/white theme on the MSI... Love the GeForce LED logo on the 590s though! Haha... but that's a matter of taste...









Quote:



So, I'm really trying to find out if 2 590's is enough to run 3 120hz panels, play games at full settings in that kind of a surround setup, and run a 3D film concurrently to all 3 panels (or in surround) - if even possible, and lastly, if my I7 920 @ 4.05 is enough for that setup or will it still be a bottleneck that makes that kind of investment pointless without a new CPU/Mobo?


 Also recorded the CPU usage on my i7 2600K @ 4.5 GHz (see screens below).









By zerounleashednl at 2011-07-03








By zerounleashednl at 2011-07-03

Conclusions:

* Quad SLI in BFBC2 scales very well (4x 80%+ and 70%+ readings)
* Non-3D: 200+ FPS; 3D Vision: +/- 60 FPS (no V-Sync enabled)
* For some reason the GPUs never go to 100%, not even on one GPU... Could it be a CPU bottleneck (even when the CPU is between 40% and 70%)? Or is it driver related?
* VRAM usage on one 1080p screen is >1041 MB (non-3D) and >1265 MB (3D Vision).
* With 3 screens you will definitely hit the 1500MB VRAM wall, and I don't know what the impact is on user experience... (will you even notice?) For example, Crysis 2 on a single 1080p screen: after the High Res Texture update, VRAM is ALWAYS at 1500MB, but the FPS never dips below 50 on Quad SLI, even with 3D Vision enabled, so the impact on user experience is minimal. Maybe I will do some VRAM testing with Crysis 2 also...
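To get a feel for why three screens blow past the 1.5GB wall so much faster than one, here is a back-of-the-envelope sketch of render-target memory alone. The bytes-per-pixel and buffer counts are illustrative assumptions; real VRAM usage is dominated by textures, so these numbers are only lower bounds.

```python
# Back-of-the-envelope render-target memory at different resolutions.
# 4 B colour + 4 B depth/stencil per sample, 4x MSAA, a few buffers
# deep -- all illustrative assumptions; textures add far more on top.

def render_targets_mb(width, height, msaa=4, buffers=3):
    bytes_per_pixel = (4 + 4) * msaa
    return width * height * bytes_per_pixel * buffers / 2**20

for label, (w, h) in {"1080p single": (1920, 1080),
                      "5760x1080 surround": (5760, 1080)}.items():
    print(f"{label}: ~{render_targets_mb(w, h):.0f} MB in render targets")
```

Surround triples the render-target footprint before a single texture is loaded, so a texture set that fits comfortably at 1080p can push a 1.5GB card over the wall at 5760x1080.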









Hope this input will help you.


----------



## Shinobi Jedi

That is super helpful. Plus rep!

Hmm.. It's an interesting quandary.

I'm not entirely sure if I'll get 3 screens or not, because of where to put them. So if I stick with one screen, then a second 590 would really make a difference.

If I go with 3 screens, it sounds like I'll hit the VRAM wall, in which case it'd have to be two 3GB 580s, or wait for Kepler. Which I'd probably do rather than go through the hassle of dumping my 590 and getting two 3GB 580s.

I might not do anything at this point, and just keep rocking what I have until Kepler, or until BF3 launches and we see what it really needs to play it.


----------



## xFyre

Quote:



Originally Posted by *Shinobi Jedi*


That is super helpful. Plus rep!

Hmm.. It's an interesting quandary.

I'm not entirely sure if I'll get 3 screens or not, because of where to put them. So if I stick with one screen, then a second 590 would really make a difference.


If you're getting 2 590s, do consider getting a 120Hz panel, even if you don't care about 3D. It'd be a waste to run 2x590s on 60Hz.


----------



## Shinobi Jedi

Quote:



Originally Posted by *xFyre*


If you're getting 2 590s, do consider getting a 120Hz panel, even if you don't care about 3D. It'd be a waste to run 2x590s on 60Hz.


I have one! I just got it









The BenQ XL2410T. I can't say enough good about it. I really, really love the panel. Though PQ is horrible out of the box and needs a calibration.

I want to get two more. But I don't know if I'll have the space. That's my issue.

But yeah, 120hz is where it's at. I used to have my rig hooked up to my 60" Plasma. But I had no problem losing that nice huge screen for 120hz smoothness.









3D is really cool too. I've been playing BFBC2 with it and it is so much fun.


----------



## xFyre

Quote:



Originally Posted by *Shinobi Jedi*


I have one! I just got it









The BenQ XL2410T. I can't say enough good about it. I really, really love the panel. Though PQ is horrible out of the box and needs a calibration.

I want to get two more. But I don't know if I'll have the space. That's my issue.

But yeah, 120hz is where it's at. I used to have my rig hooked up to my 60" Plasma. But I had no problem losing that nice huge screen for 120hz smoothness.










Awesome then!

Yup, 120Hz gaming is really awesome. I was going to go with a 30" 2560x1600 monitor, but after I tried out a 120Hz panel at a friend's house, I gave up on the larger monitor for the higher refresh rate.


----------



## kevink82

I thought LCD panel refresh rate was different from CRT refresh rate and made absolutely no difference unless you game in 3D? But I dislike the backlight bleeding on the BenQ and Acer monitors...

But if you're going to take on surround gaming, then the 590 isn't for you, with its limited VRAM, if you want to turn the eye candy up.


----------



## xFyre

Quote:


> Originally Posted by *kevink82;14090897*
> I thought LCD panel refresh rate was different from CRT refresh rate and made absolutely no difference unless you game in 3D?


Wrong. With a 120Hz monitor you can either play in 2D @ 120Hz (up to 120FPS can be displayed, versus 60FPS on a normal monitor), or, you can play in 3D @ 60FPS per eye.

120Hz monitors offer a much smoother experience. Even when you're dragging windows across your desktop, the difference is visible.
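The smoothness argument can be made concrete with a simplified v-sync model: a frame that misses a refresh window waits for the next one, so a higher refresh rate shrinks the penalty for missing. The uniform-frame-time model below is a deliberate simplification, not how any specific driver actually paces frames.

```python
import math

# Simplified v-sync model: the effective frame time is the render time
# rounded UP to the next refresh slot. Real drivers and games vary.

def displayed_fps(render_ms, refresh_hz):
    slot_ms = 1000 / refresh_hz
    return 1000 / (math.ceil(render_ms / slot_ms) * slot_ms)

for hz in (60, 120):
    print(f"{hz}Hz: 8 ms frames display at {displayed_fps(8, hz):.0f} FPS")
```

Under this model, an 8 ms frame is capped at 60FPS on a 60Hz panel but runs at the full 120FPS on a 120Hz one, which matches the "much smoother" impression even outside 3D.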


----------



## xjonathanvx

Quote:



Originally Posted by *zerounleashednl*


I read the whole rant and got some answers, but not to all your questions...









Got a Quad SLI setup here with 3 screens, two 1920x1200 and one 1920x1080 (120Hz/3D), and I can't seem to get them to work together. If I finally do, I'll have some answers on the 5760x1080 1.5GB VRAM usage. For now I only have some answers on VRAM usage on a single screen, with and without 3D Vision enabled (read on...)



Unfortunately, you won't be able to get those monitors to work together. All three monitors must have the same native resolution, refresh rate, and sync polarity.

Edit: The Nvidia FAQ shows that they can have different native resolutions, but they must share a common resolution (the lowest one, it seems).


----------



## xjonathanvx

Quote:


> Originally Posted by *kevink82;14090897*
> I thought LCD panel refresh rate was different from CRT refresh rate and made absolutely no difference unless you game in 3D? But I dislike the backlight bleeding on the BenQ and Acer monitors...
> 
> But if you're going to take on surround gaming, then the 590 isn't for you, with its limited VRAM, if you want to turn the eye candy up.


The 590s are fine for surround. I have to switch to a single screen in only two games I currently own: Crysis with DX11/Hi-Res Textures @ Ultra, and Metro 2033 @ max settings. Not bad, IMO.

Sure, 3GB would have been great, but to be honest, there just aren't that many demanding games around. My surround performance, to me, isn't worth selling my cards off to put towards a tri-SLI 3GB 580 build.


----------



## Shinobi Jedi

Quote:



Originally Posted by *kevink82*


I thought LCD panel refresh rate was different from CRT refresh rate and made absolutely no difference unless you game in 3D? But I dislike the backlight bleeding on the BenQ and Acer monitors...

But if you're going to take on surround gaming, then the 590 isn't for you, with its limited VRAM, if you want to turn the eye candy up.


No backlight bleed whatsoever on my BenQ









The firmware update (V.5) fixed that issue right up on all the XL2410Ts. It is gone, like yesterday's donut


----------



## kevink82

I don't know if I'm going to replace my Dell for 3 surround monitors; I don't see many games running at 120FPS with MSAA.


----------



## Smo




----------



## Arizonian

Congrats SMO


----------



## Smo

Feels good









(but I want another one







)


----------



## zerounleashednl

Quote:


> Originally Posted by *Smo;14092702*
> Feels good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (but I want another one
> 
> 
> 
> 
> 
> 
> 
> )


Go for it!


----------



## Arizonian

Quote:


> Originally Posted by *zerounleashednl;14092931*
> Go for it!


You'd have to upgrade your 850W power supply. I say try it out and see if it's even really needed. Enjoy.


----------



## Smo

Quote:


> Originally Posted by *zerounleashednl;14092931*
> Go for it!


I'm saving up, but it'll take a few months








Quote:


> Originally Posted by *Arizonian;14092961*
> You'd have to upgrade your 850W power supply. I say try it out and see if it's even really needed. Enjoy.


Can I do harm to any of my components if I throw in another and there's not enough power?


----------



## Arizonian

Quote:


> Originally Posted by *Smo;14092997*
> I'm saving up, but it'll take a few months
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can I do harm to any of my components if I throw in another and there's not enough power?


It won't hurt components, but it will black-screen as soon as games start, or possibly not boot up at all. 850 watts on a good quality PSU is enough to run two GTX 580s but not three. Essentially, with two GTX 590s you'd be running four downclocked GTX 580s.

Especially since you're only running one monitor, you have way more than enough GPU power. I think you'll find yourself very happily satisfied with the one GTX 590.
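As a rough sanity check on the wattage claim, here is a simple power-budget sketch. The 365W board power per GTX 590 is the nominal spec; the CPU and "rest of system" figures and the 20% headroom are assumptions for illustration, and real transient loads can spike higher.

```python
# Rough PSU sizing for one or two GTX 590s. 365 W per card is the
# nominal board power; the CPU/rest-of-system draws and the 20%
# headroom factor are illustrative assumptions.

GTX590_TDP_W = 365

def psu_needed_w(cards, cpu_w=150, rest_w=100, headroom=1.2):
    return (cards * GTX590_TDP_W + cpu_w + rest_w) * headroom

for n in (1, 2):
    print(f"{n}x GTX 590: aim for a ~{psu_needed_w(n):.0f} W PSU")
```

By this estimate a quality 850W unit is comfortable for one card but well short for two, which lines up with the advice to upgrade the PSU before adding a second 590.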


----------



## Smo

Quote:


> Originally Posted by *Arizonian;14093436*
> It won't hurt components, but it will black-screen as soon as games start, or possibly not boot up at all. 850 watts on a good quality PSU is enough to run two GTX 580s but not three. Essentially, with two GTX 590s you'd be running four downclocked GTX 580s.
> 
> Especially since you're only running one monitor, you have way more than enough GPU power. I think you'll find yourself very happily satisfied with the one GTX 590.


So far, yes! It's very good. I wasn't blown away by it, but the performance certainly is something!

I'm not really interested in a 3-monitor setup; it's more than I need. I think you're right, maybe one is enough... but then again... is it?


----------



## Alatar

Quote:


> Originally Posted by *Smo;14092219*
> snip


added, enjoy the card!


----------



## tlr3715

I have a pair of EVGA Classified 590s on the way. Newegg had them in stock for like 2 hours on Friday so I jumped on it. I am having a lot of issues with my 3 way SLI setup, which the 590s should correct. I'll try to provide "proof" once I have it set up.

How is everyone's experience with the 590s so far? What is the highest clock speed that provides a stable experience on air?


----------



## xjonathanvx

Go back 164 pages and look at the list with stable clocks/proof. ; )
You won't need to worry about OCing the card at all; it runs most games just fine at stock clocks. Those OCs were done with the drivers that still allowed voltage control, before they were locked, because people thought that since it had the GF110 it would surely OC like a 580... and then the cards went *pop*.


----------



## kevink82

The only problem I get now is that DiRT 3 randomly crashes, flickers and shows weird textures, compared to my HD 6970, which sometimes crashed on exiting the game...

Other than that, my 590s are rocking fine ATM with other games like Crysis 2, Black Ops, etc.


----------



## Masked

The latest drivers are crashing me even in Doom 3...Imagine that...

Rolling forward to the new beta...Let's see if that helps.


----------



## RagingCain

Well guys, it's been fun working with you (beating our heads against the wall), but this is my last shebang with the 590s. I will be doing the GTX 590 vs 6990 bench-off, but after that, my GTX 590s will have been sold to a private buyer.

Hopefully we can get a few more of you guys participating with us, but we all know it can be a pain benching/clocking these cards, so no pressure.


----------



## Masked

Quote:


> Originally Posted by *RagingCain;14127224*
> Well guys, it's been fun working with you (beating our heads against the wall), but this is my last shebang with the 590s. I will be doing the GTX 590 vs 6990 bench-off, but after that, my GTX 590s will have been sold to a private buyer.
> 
> Hopefully we can get a few more of you guys participating with us, but we all know it can be a pain benching/clocking these cards, so no pressure.


The only reason I haven't sold mine yet is b/c of the surround at home.

I'm getting extremely close...

Might just end up going for 2x580 2gb at this point.

Not sure yet.


----------



## Smo

Does anyone else get low FPS in The Witcher 2?

I'm getting like 25FPS with everything fully maxed out at 1920x1200 on my sig rig. Could this be down to drivers?


----------



## Alatar

You have übersampling on and haven't done the SLI fix?

See here; it worked OK for me. Better perf, but I got some weird lighting issues sometimes.


----------



## Smo

Quote:


> Originally Posted by *Alatar;14129441*
> You have übersampling on and haven't done the SLI fix?
> 
> See here worked ok for me. Better perf but I got some weird lighting issues sometimes.


Thanks for the link, really appreciate it!

However I don't have a 'Compatibility' section or 'SLI Compatibility bits' value.


----------



## Shinobi Jedi

Quote:



Originally Posted by *RagingCain*


Well guys, it's been fun working with you (beating our heads against the wall), but this is my last shebang with the 590s. I will be doing the GTX 590 vs 6990 bench-off, but after that, my GTX 590s will have been sold to a private buyer.

Hopefully we can get a few more of you guys participating with us, but we all know it can be a pain benching/clocking these cards, so no pressure.


Thanks for all your help RagingCain!

Make sure to let us know what you end up with and how it compares to your 590 Quad setup!

I was ready to buy a second 590 myself, but they're all Ghandi.


----------



## ReignsOfPower

Quote:



Originally Posted by *RagingCain*


Well guys, it's been fun working with you (beating our heads against the wall), but this is my last shebang with the 590s. I will be doing the GTX 590 vs 6990 bench-off, but after that, my GTX 590s will have been sold to a private buyer.

Hopefully we can get a few more of you guys participating with us, but we all know it can be a pain benching/clocking these cards, so no pressure.


Thanks for all your input, bud. You'd better go quad 3GB EVGA GTX 580s on water!

I'm waiting for the Kepler GPUs. I might consider water cooling at that stage, as the 2011 sockets will be about and it's pretty much a full rebuild anyway.


----------



## tlr3715

Why are so many users talking about getting rid of their 590s? Is there something wrong with them? I have a 590 SLI setup coming in the mail this week, since I had trouble with heat in my 3-way SLI setup.

Any reason I should try to build a different 3-way SLI setup instead of the Quad SLI 590 setup I'm getting?


----------



## Smo

Quote:



Originally Posted by *tlr3715*


Why are so many users talking about getting rid of their 590s? Is there something wrong with them? I have a 590 SLI setup coming in the mail this week, since I had trouble with heat in my 3-way SLI setup.

Any reason I should try to build a different 3-way SLI setup instead of the Quad SLI 590 setup I'm getting?


I think it's a combination of the VRAM per GPU (1.5GB, compared to 3GB on some GTX 580s) having an effect on people using 3-way surround, poor overclocking potential (possibly driver-related, but time will tell), and Quad SLI driver support.


----------



## shane_p

Quote:


> Originally Posted by *tlr3715;14134425*
> Why are so many users talking about getting rid of their 590s? Is there something wrong with them? I have a 590 SLI setup coming in the mail this week, since I had trouble with heat in my 3-way SLI setup.
> 
> Any reason I should try to build a different 3-way SLI setup instead of the Quad SLI 590 setup I'm getting?


Well, I think they must be running multiple monitors at high res?
I have found my single 590 to be just perfect; great performance, overkill actually (everything maxed and FPS as high as my monitor allows).
Mind you, I only have one monitor at a lower res (all games in 3D though).

I think the 1.5GB RAM limit only affects very high res. In Crysis 2 with DX11 and 3D Vision running, I only see 1GB of VRAM actually used
(again, at a lower res of 1680x1050).


----------



## Smo

Quote:


> Originally Posted by *shane_p;14137031*
> well I think they must be running multiple monitors at high res?
> as I have found my single 590 to be just perfect, and great performance overkill actually (everything maxed and FPS as high as my monitor allows)
> mind you I only have 1x monitor at a lower res (all games in 3D though)
> 
> I think the 1.5GB of ram limit only affects very high res, CRYSIS 2 with DX11 and 3D Vision running I only see 1GB of VRAM actually used
> (again lower res at 1680X1050)


Yeah, you're absolutely right - it's a monster on a single monitor (aside from driver issues with some titles, but that's to be expected with a new card anyway). However, for surround gamers it's more beneficial to run the 3GB version of the GTX 580, which is currently (IMO) the best possible card for high-res triple-monitor gaming.


----------



## L1eutenant

Um, I have one; I'm not going to try to pull it out of the case.

I haven't touched it, so it would be stock.

The brand is Nvidia?


----------



## Semedar

Add me, please.









Asus running at stock settings

http://www.techpowerup.com/gpuz/gwds7/

Pictures:

Semedar's RIG Picasa Album


----------



## helifax

Quote:


> Originally Posted by *zerounleashednl;14090254*
> I read the whole rant and got some
> Conclusions:
> 
> * With 3 screens you will definitely hit the 1500MB VRAM wall, and I don't know what the impact is on user experience... (will you even notice?) For example, Crysis 2 on a single 1080p screen: after the High Res Texture update, VRAM is ALWAYS at 1500MB, but the FPS never dips below 50 on Quad SLI, even with 3D Vision enabled, so the impact on user experience is minimal. Maybe I will do some VRAM testing with Crysis 2 also...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope this input will help you.


I am running 3D Vision Surround on a single Asus GTX 590 (OCed).
Running Crysis 2 with Patch 1.9 + DX11 Patch + High Texture Patch, everything on ultra settings. In 3D Surround you get framerates from 30-60FPS depending on the scene. The VRAM hits 1.5GB very fast, but...

The only problem I am experiencing is after a load, meaning there are 2-3 secs of laggy movement, after which the framerate goes up. I haven't experienced any lag spikes (AT ALL) during gameplay, and everything is very fluid...

As for 2 GTX 590s in SLI, correct me if I'm wrong, but... I know the RAM doesn't SUM UP, only the processing power... which means you will basically still have 3GB (1.5GB per GPU) of usable video memory...

Conclusion: one GTX 590 is enough to power 3 displays in 3D. GPU1 is always at 99% and GPU2 fluctuates between 80-99%. Because of this, extra cooling is needed, as it heats up like hell. On watercooling, after 1h or so of usage at 99%, the temp reaches 75°C (normal operating temp is around 60-65°C and idle is 38-42°C).

Hope this helps


----------



## Alatar

Quote:



Originally Posted by *L1eutenant*


Um, i have one, im not going to try to pull it out of the case.

I haven't touched it so it would be stock.

The brand is Nvidia?


I doubt the card is OEM. What's the AIB? ASUS? EVGA? Zotac, etc.? I'll add you when you have that info









updating in a sec


----------



## L1eutenant

Quote:



Originally Posted by *Alatar*


I doubt the card is OEM. What's the aib? Asus? EVGA? zotac? etc. I'll add you when you have that info









updating in a sec


Palit?


----------



## Alatar

Quote:



Originally Posted by *L1eutenant*


Palit?


Yep.

Want to provide a screenshot of MSI Afterburner, GPU-Z, etc.?


----------



## L1eutenant

Quote:



Originally Posted by *Alatar*


Yep.

Want to provide a screenshot of msi afterburner, gpu-z etc.?










Will a pic of the box do?

I'm trying to avoid downloading programs that I don't use.

Will this do?


----------



## Smo

Quote:



Originally Posted by *L1eutenant*


will pic of the box do?

As im trying to avoid downloading programs that i don't use

Will this do?


It's not really proof though, bud. Could you not upload a photograph of it with your name present somewhere?


----------



## th3illusiveman

So how high do these usually go on stock volts?


----------



## Shinobi Jedi

Quote:


> Originally Posted by *helifax;14146711*
> I am running 3D Vision Surround on one single Asus GTX 590 (Oced).
> Running Crysis 2 with Patch 1.9 + DX11 Patch + High Texture Patch, everything on ultra settings. In 3D Surround you get framerates from 30-60fps depending on the scene. The VRAM fills to 1.5GB very fast but...
> 
> The only problem I'm experiencing is right after loading: there are 2-3 seconds of laggy movement, after which the framerate goes back up. I haven't experienced any lag spikes (AT ALL) during gameplay, and everything is very fluid...
> 
> As for 2 GTX 590s in SLI: correct me if I'm wrong, but I know the RAM doesn't SUM UP, only the processing power... which means you will basically still have 3GB (1.5GB per GPU) of usable video memory...
> 
> Conclusion: one GTX 590 is enough to power 3 displays in 3D. GPU1 is always at 99% and GPU2 fluctuates between 80-99%. Because of this, extra cooling is needed (it heats up like hell). On watercooling, after 1h or so of usage at 99% the temp reaches 75°C (normal operating temp is around 60-65°C, and idle is 38-42°C).
> 
> Hope this helps


Very much!

I really want to add 2 more monitors and take the plunge, but if I do, it probably means adding a desk somewhere long enough for them. My current desk is just a little over 60" long, and my monitor is 24", so I'm assuming I'm going to come up short. Plus it's also where we have our 60" plasma, so while it's not a pain to move one display back and forth so as not to block the plasma, three is a bit of a dilemma, much less where to put them when not in use.

Plus rep though! I wasn't sure about doing it with just one card, but now I am and will, once I get my spacing issues figured out!


----------



## Robitussin

Shinobi Jedi, I have been debating moving to 120Hz from my big TV as well. I like the massive screen, but I feel like I'm missing out on something when I read all the stellar reviews of 120Hz gaming, 3D, etc. I may just have to pull the trigger and see for myself. Do games look a lot better since the pixels are so much smaller? I have been playing on a large TV for a while now (this is my third setup), and I just don't want to drop $700 on a smaller screen for sub-par looks. I have been hoping someone with a similar setup who had done it would come along to see what they say, but you're saying go for it, eh?









thanks in advance +rep


----------



## RagingCain

Quote:


> Originally Posted by *Robitussin;14156299*
> Shinobi Jedi, I have been debating moving to 120Hz from my big TV as well. I like the massive screen, but I feel like I'm missing out on something when I read all the stellar reviews of 120Hz gaming, 3D, etc. I may just have to pull the trigger and see for myself. Do games look a lot better since the pixels are so much smaller? I have been playing on a large TV for a while now (this is my third setup), and I just don't want to drop $700 on a smaller screen for sub-par looks. I have been hoping someone with a similar setup who had done it would come along to see what they say, but you're saying go for it, eh?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> thanks in advance +rep


Hey guys, I just wanted to drop in and mention that there is a huge difference between a TV and a PC monitor in terms of input latency; you boys know that, right? The difference is night and day.

Secondly, if you are going with a 120Hz monitor (not a TV!), try to get the ASUS. I have an Acer 120Hz monitor that is 23.6", and the ASUS is much superior in terms of color saturation and actual response time. Mine is indeed beautiful, but the ASUS looks more natural with better brightness, whereas the Acer is fairly bright but slightly cartoonish in its colors: really red reds, really blue blues, and faded colors in between.


----------



## L1eutenant

Quote:


> Originally Posted by *Smo;14156020*
> It's not really proof though bud - could you not upload a photograph of it with your name present somewhere?


Will this do?


----------



## Arizonian

When I was learning about 3D vision I compiled a thread of info and found reviews, info etc if anyone is interested in some monitors, the ASUS included.

3D Monitors w/120hz Refresh Rate & Nvidia Kit Q & A

I'm running 3D on one GTX 580 and I know the GTX 590 would do great with 3D vision.


----------



## Shinobi Jedi

Quote:



Originally Posted by *RagingCain*


Hey guys, I just wanted to drop in and mention that there is a huge difference between a TV and a PC monitor in terms of input latency; you boys know that, right? The difference is night and day.

Secondly, if you are going with a 120Hz monitor (not a TV!), try to get the ASUS. I have an Acer 120Hz monitor that is 23.6", and the ASUS is much superior in terms of color saturation and actual response time. Mine is indeed beautiful, but the ASUS looks more natural with better brightness, whereas the Acer is fairly bright but slightly cartoonish in its colors: really red reds, really blue blues, and faded colors in between.


I have to disagree about the Asus, RagingCain. I went through two of them and they both had dead pixels right out of the box. And the Fry's store had three more open-box returns due to the same issue. I like Asus a lot as a company. I love my G73JH-A1 ROG notebook. But according to the Fry's guys (who are by no means experts) and their experience with returns, Asus makes awesome notebooks and mobos, but the QC at their display division is lacking, I guess.

As I've said on here before, I really, really love my BenQ XL2410T. I was hesitant about it, due to a lack of online reviews, and a run of complaints about light bleed. But mine has no light bleed whatsoever. And after researching the web, I found that the light bleed problem was fixed with a firmware update, and when I checked mine in the service menu it was running the same firmware update.

Admittedly, the PQ is pretty bad out of the box. But whether you do a true ISF/Spyder calibration yourself, or do what I did and get an ICC profile from tftreview, once calibrated and/or the profile is applied, it has some of the deepest blacks I've ever seen on an LCD. At the sRGB setting with calibration it hits 6500K almost on the dot, and greyscale is great.

I'm usually not a fan of matte/AG screens as I usually like bright colors that pop off the screen like on a Glossy. Hence initially going with Asus as their panel is a glossy screen.

But then I found out glossy isn't good for 3D Surround gaming, as you'll have images from one screen reflecting off another, which screws up the 3D. That talked me into trying the BenQ with the AG coating. And regardless, I have to say this is the best AG job I've ever seen. No window/screen-door effect. Colors are still bright. The picture is not distorted. I am very pleased with the matte job.

_Robitussin_ - Thanks for the rep! Let me earn it by trying to answer your question:

As for going from 60" to 24" 120hz/Full HD is it worth it? 
Hell yes! RagingCain explained it perfectly. FPS is so buttery smooth. This is why I'm still trying to get a second 590: I know two 590s in Quad SLI scale well in BFBC2 (and ideally BF3), putting the game at an average of over 200fps on one 1080p/120Hz panel. I'm sure it won't be constant, but I like to think that with figures like that I could get a solid 120fps with Vsync. Though I've yet to see any picture tearing anyway.

But you know what really tips the favor to the 24" over my 60"? 
3D Gaming.

I bought the glasses on a lark, really getting the panel for the 120hz/fps goodness. But when I tried it - DAAAYYYUM! I'm a filmmaker who loves tech but hasn't really gotten a hard on for 3D in cinemas yet. Not when shot with the cameras, or done in post. But 3D Gaming? Wow. Totally a new experience. Crazy immersion. I think the difference that makes it better than movies is the simple fact of having the viewer also be the player and interacting with what's on the screen. Moving around a concrete wall in 3D just "feels" different even though the input is the same. The depth of field can help with play too. Especially with long distance fire fights.

There is definitely some light Ghosting on this BenQ during some 3D. Often it can be adjusted through contrast or the IR emitter to almost nothing. But knowing that Ghosting on 3D Games is more due to the nature of the Nvidia/Shutter Glasses technology than the display and just a nature of the beast, makes it easier to ignore when playing - whereas a dead or stuck pixel draws my attention and drives me crazy no matter what.

Also, 24" isn't that small. Is it 60"? No. But I'm 37 and can still remember my days as a kid when the average CRT television set size was 19" and it was normal to view a screen that small from as far away as 10' back.

And you know what? I tried watching some Tron 3D and Avatar 3D from my couch which is 11' back from the display and my normal viewing distance for my 60" Kuro. And not only was the screen plenty big to get and stay lost in the movie, but in a way, the 3D looked better, or more distinct, because the frame of the monitor/picture is in your field of view to do exactly that - frame the image, which lets you take in the whole image, and in doing so - get a better sense of the depth of field in the image.

The BenQ XL2410T runs $389 US, and is a little more expensive than the Asus or Acer as it's an LED monitor. I want to get three, and eventually will, but am living with just one for the time being due to the oft-mentioned spacing issues. And I'm plenty satisfied with just one for right now. Maybe that's what you should do? Just get one for now and see how you like it?

As of now, it seems that if you want to do 1080p 3D gaming, your only option is Nvidia's 3D Vision, and in effect one of these 24" panels. Unless you want to spend a few grand on a compatible DLP HDTV which so far is only made by Mitsubishi. And while DLP might be great for 3D, it's not great for normal HD content as it has very limited viewing angles.

So knowing that - until they come out with a 27-30" 1080p/120Hz panel (Acer had a limited release of one that was snapped up quickly and seems to be unavailable now, I guess due to the tsunami/earthquake disaster),

or until they can make an IPS 2560x1600 panel do 120Hz Full HD/3D,

one's selection is limited to these 24" panels if they want to buy now - I can't recommend the BenQ XL2410T enough. It will look like a$$ picture-wise out of the box, but a true calibration, or a calibration ICC profile from online, will yield one of the best pictures I've ever seen on any LCD panel. I will definitely buy two more at some point, unless better tech is released at a competitive price before then.

I give it, and 24" Full HD and 3D gaming my highest recommendation.

And with owning both, if you ever miss the big screen, it just means a switch of cables in the back, right?

Hope this helps!

Cheers!


----------



## helifax

Hello again!

I am using 3 x Samsung SyncMaster 2233RZ 120Hz 3D monitors. I know... they are quite old and not full HD (1680x1050 native resolution).
I feel the need to say the following:

The GOOD:

- They are VERY GOOD MONITORS!
- I bought them at 3 different moments in time (2009 and 2011).
- Not one of them has dead pixels (maybe I just got lucky with 3 out of 3).
- Color is good.
- Small input latency.
- The 20000:1 contrast is enough.
- The brightness is ok. (could be a little better in 3D but is OK)
- In 3D ghosting is kept to a minimum.
- They "warm-up" pretty fast.
- Nice design.

The BAD:

- not FULL HD.
- Supports only dual-link DVI.
- 16:10 aspect ratio and 22" size.
- Now the tricky part: there are 2 versions!!! The Nov 2009 version and the Nov 2010 version. There is a difference in TINT between the 2 revisions. I suggest getting 3 of the same model and MAKE (this goes for all monitors, not only these ones).
- No longer produced/manufactured.
- In 3D mode you cannot make any monitor adjustments: brightness, contrast or colors (this applies to any monitor used with 3D Vision).

Now some downsides for the 3D Vision:
- You CANNOT calibrate the color on each monitor in Surround (2D & 3D), like you can in Eyefinity - HUGE MINUS. So ensure you have 3 monitors of the same MODEL AND MAKE!
- No monitor control of brightness, contrast and colors. Everything is LOCKED up.
- If you are experiencing GHOSTING, try this as it might help (especially for AMD CPU users): http://forums.nvidia.com/index.php?showtopic=198407 The latest drivers provided a fix, but I still find less ghosting with this workaround.

As for the good sides of 3D Vision... everything that is not mentioned here.


----------



## Tept

Figured it out


----------



## tlr3715

I finally got my GTX 590 SLI setup going this week. I noticed that in The Witcher 2 and Shogun 2 the framerates were better in the 3-way SLI setup I had last month, the one with the overheating problems.

I had a feeling it would not perform as well, but I did not expect it to be as noticeable as it is.

I might try overclocking the cards to see if I can improve framerates.

At least they are running cool. Looks like 85C under load, as opposed to 97C+.


----------



## Tept

Asus GTX 590 Bios @ Raging Cain

Not exactly sure what settings you want; all I want is the card at 1.0v.


----------



## Robitussin

Thanks Shinobi Jedi and RagingCain for the replies. I ended up picking up an ASUS VG236H. I almost grabbed the 27" Acer, but after reading a ton of reviews I felt more comfortable with the ASUS. I will see how it goes; I got it on Newegg so I can always send it back if I don't like it. Hopefully I won't have to. I was going to buy 3 in Surround and switch my 590 for some 580 3GBs in SLI, but the awesome deal I thought I had fell through at the last minute.







But I figure it's all fine; I will wait till the next-gen cards come out and be happy with 1920x1080 and 3D for now.







thanks again guys for the posts +rep

EDIT: Arizonian, I'm going through your post right now; it looks very informative. Thanks for the link!


----------



## RagingCain

The GTX 590 Overclocking & Flashing Thread:
http://www.overclock.net/nvidia/1063263-gtx-590-flashing-overclocking-thread.html


----------



## max883

Can someone send me a Gainward GTX 590 BIOS at 0.962v, or post it here? I'll be very happy.


----------



## RagingCain

Quote:


> Originally Posted by *max883;14179278*
> Can someone send me a Gainward GTX 590 BIOS at 0.962v, or post it here? I'll be very happy.


You have to post the original, and someone on here can modify it.


----------



## max883

I don't have the original ;( I flashed my Gainward GTX 590 to your EVGA GTX 590 0.972v BIOS.
But there are some slowdowns in games and in 3DMark 11 (10,700 3DMarks, using NVIDIA driver 275.33; GTX 590 at GPU core 720, shader 1440, mem 1730). The 3DMark 11 score is OK, but it could be better if the card didn't have the slowdowns! Could I have the wrong BIOS for my card?


----------



## Juggalo23451

Quote:



Originally Posted by *max883*


I don't have the original ;( I flashed my Gainward GTX 590 to your EVGA GTX 590 0.972v BIOS.
But there are some slowdowns in games and in 3DMark 11 (10,700 3DMarks, using NVIDIA driver 275.33; GTX 590 at GPU core 720, shader 1440, mem 1730). The 3DMark 11 score is OK, but it could be better if the card didn't have the slowdowns! Could I have the wrong BIOS for my card?


Any time you flash a card you need to keep a backup of the original BIOS file.


----------



## RagingCain

Quote:



Originally Posted by *max883*


I don't have the original ;( I flashed my Gainward GTX 590 to your EVGA GTX 590 0.972v BIOS.
But there are some slowdowns in games and in 3DMark 11 (10,700 3DMarks, using NVIDIA driver 275.33; GTX 590 at GPU core 720, shader 1440, mem 1730). The 3DMark 11 score is OK, but it could be better if the card didn't have the slowdowns! Could I have the wrong BIOS for my card?



Quote:



Originally Posted by *Juggalo23451*


Any time you flash a card you need to keep a backup of the original BIOS file.


^ Juggalo hit the nail on the head there.

I suspect the slowdowns have nothing to do with the BIOS and your GPU clock is simply too high. Are you water cooling the GTX 590? Otherwise that clock might be a little too high for the stock cooling, and the slowdowns you are experiencing are the card throttling itself to cool down.

You should be at the peak of the PDL, so you shouldn't be experiencing any other throttling.


----------



## max883

GTX 590 stock cooler. I didn't have the slowdowns with the standard Gainward GTX 590 BIOS at 0.926v, running GPU 700, shader 1400, mem 1730.

With the EVGA GTX 590 BIOS I tested GPU 680, shader 1390, mem 1730 - same problem, slowdowns ;(
Temps are normal, so there must be something about the EVGA BIOS that doesn't fit the Gainward GTX 590?

I just hope someone has a Gainward GTX 590 0.972v BIOS so I can test it.


----------



## RagingCain

Quote:



Originally Posted by *max883*


GTX 590 stock cooler. I didn't have the slowdowns with the standard Gainward GTX 590 BIOS at 0.926v, running GPU 700, shader 1400, mem 1730.

With the EVGA GTX 590 BIOS I tested GPU 680, shader 1390, mem 1730 - same problem, slowdowns ;(
Temps are normal, so there must be something about the EVGA BIOS that doesn't fit the Gainward GTX 590?

I just hope someone has a Gainward GTX 590 0.972v BIOS so I can test it.











Is your voltage still 0.972v? You need to lower that back to stock. If the issue is gone at stock, then it's not the BIOS, which points more toward your card overheating.


----------



## shane_p

Anyway, 590 BIOS editing.

Here is the sweet spot I found (I edited my post for a permanent overclock):


Uploaded with ImageShack.us

At 0.988v. Any more and I found it would downclock; any less and I would crash sometimes.
If anyone wants, I can post my BIOS.


----------



## shane_p

Anyway, 590 BIOS editing.

Here is the sweet spot I found (I edited my post for a permanent overclock, and a slight overvolt):


Uploaded with ImageShack.us

At 0.988v. Any more and I found it would downclock; any less volts and I would crash sometimes.
If anyone wants, I can post my custom BIOS.

Sorry, the double post was a mistake.


----------



## Alatar

Quote:


> Originally Posted by *L1eutenant;14157211*
> Will this do?


Yes.

And sorry for the long time without an update; I was occupied with something else.

@ Raging I'll add your flashing thread to the OP







Looks very informative.


----------



## max883

Had to lower the vcore to 0.938v, GPU 700, shader 1400, mem 1728. Now it's OK.

I wish the card could run 720 on the GPU, but it's throttling at those speeds ;(


----------



## Shinobi Jedi

I went ahead and snatched a second one yesterday while it was in stock at EVGA.

I'll get pictures posted of the two when I get the second one in.

I'm not expecting a huge performance increase, or anything close to double of what I'm getting now.

I got it mainly for solid 60fps for 3D gaming and I know BFBC2, and ideally BF3, scales really well with Quad SLI and was adding an extra 50fps in some benches. I'm gonna be sticking with a single 1080p 120hz monitor for now too, so I'm not tripping on the VRam limit. If I go bigger on the displays and hit a wall, I'll consider moving them then and getting into whatever is worthy at that time.

I'm psyched.

Thanks to all that read my lengthy diatribes and helped me with a decision!


----------



## Smo

Quote:


> Originally Posted by *Shinobi Jedi;14187575*
> I went ahead and snatched a second one yesterday while it was in stock at EVGA.
> 
> I'll get pictures posted of the two when I get the second one in.
> 
> I'm not expecting a huge performance increase, or anything close to double of what I'm getting now.
> 
> I got it mainly for solid 60fps for 3D gaming and I know BFBC2, and ideally BF3, scales really well with Quad SLI and was adding an extra 50fps in some benches. I'm gonna be sticking with a single 1080p 120hz monitor for now too, so I'm not tripping on the VRam limit. If I go bigger on the displays and hit a wall, I'll consider moving them then and getting into whatever is worthy at that time.
> 
> I'm psyched.
> 
> Thanks to all that read my lengthy diatribes and helped me with a decision!


Exactly what I'm planning to do - looking forward to seeing how it turns out for you.


----------



## RagingCain

Quote:


> Originally Posted by *shane_p;14184974*
> Anyway, 590 BIOS editing.
> 
> Here is the sweet spot I found (I edited my post for a permanent overclock, and a slight overvolt):
> 
> 
> Uploaded with ImageShack.us
> 
> At 0.988v. Any more and I found it would downclock; any less volts and I would crash sometimes.
> If anyone wants, I can post my custom BIOS.
> 
> Sorry, the double post was a mistake.


Yes, the 275.xx drivers seem to have lowered the max overclock; it's now about 700 @ 0.963v where it used to be 725 @ 0.963v.

Post powered by DROID X2


----------



## helifax

Quote:


> Originally Posted by *RagingCain;14188038*
> Yes, the 275.xx drivers seem to have lowered the max overclock; it's now about 700 @ 0.963v where it used to be 725 @ 0.963v.
> 
> Post powered by DROID X2


Using the 275.33 & 275.50 drivers I clocked my card at 0.963V with these (stable) clocks:
Core: 705MHz
Shaders: 1410MHz
Memory: 1900MHz

The card is being used in 3D Vision Surround only, and I have NOT once had any problems (driver crashes, system halts or even video glitches). I keep it on water and it reaches 75 degrees after long periods of 99% GPU usage (2 hours or more).

I managed to get higher clocks in 2D but when switching to 3D the drivers crash.

Hope this helps.
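On the GTX 590's GF110 GPUs the shader domain runs locked at exactly twice the core clock, which is why every clock set in this thread (700/1400, 705/1410, 720/1440) keeps a 1:2 ratio. A quick sanity check for a proposed combo; the helper name and the 5% memory margin are my own illustrative choices, not an NVIDIA rule:

```python
STOCK_MEM_MHZ = 1728  # GTX 590 stock memory clock, as quoted in this thread

def check_gtx590_clocks(core_mhz: int, shader_mhz: int, mem_mhz: int) -> list:
    """Return a list of warnings for a proposed core/shader/memory combo."""
    warnings = []
    if shader_mhz != 2 * core_mhz:
        warnings.append("shader domain must run at exactly 2x core on Fermi")
    if mem_mhz > STOCK_MEM_MHZ * 1.05:  # arbitrary 5% margin for illustration
        warnings.append("memory well over stock; watch for GDDR5 error correction")
    return warnings

# 705/1410 keeps the ratio, but 1900 MHz memory trips the margin check.
print(check_gtx590_clocks(705, 1410, 1900))
```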


----------



## RagingCain

Quote:


> Originally Posted by *helifax;14191009*
> Using the 275.33 & 275.50 drivers I clocked my card at 0.963V with these (stable) clocks:
> Core: 705MHz
> Shaders: 1410MHz
> Memory: 1900MHz
> 
> The card is being used in 3D Vision Surround only, and I have NOT once had any problems (driver crashes, system halts or even video glitches). I keep it on water and it reaches 75 degrees after long periods of 99% GPU usage (2 hours or more).
> 
> I managed to get higher clocks in 2D but when switching to 3D the drivers crash.
> 
> Hope this helps.


Did you increase the memory from the start, or after you hit 705 core?

If you want to experiment, you may want to lower the memory speed; running it too high can limit your core overclock, or even lower benchmark scores by triggering GDDR5 error correction.
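GDDR5 error correction doesn't crash the card; it silently retries failed transfers, so past a point a higher memory clock can deliver less real bandwidth than a lower one. A toy model of that effect (the 384-bit bus matches each GF110, but the retry fractions are invented for illustration):

```python
def effective_bandwidth_gbs(mem_mhz: float, retry_fraction: float) -> float:
    """Toy model: raw GDDR5 bandwidth scales with clock, but every
    retried transfer is wasted bus time, eating the overclock's gains."""
    raw = mem_mhz * 2 * 384 / 8 / 1000  # GB/s: DDR transfers on a 384-bit bus
    return raw * (1.0 - retry_fraction)

stable = effective_bandwidth_gbs(1728, 0.00)   # stock clock, no retries
pushed = effective_bandwidth_gbs(1900, 0.15)   # overclocked past stability
# Under this model the "faster" 1900 MHz setting delivers less real bandwidth.
```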


----------



## Manac0r

Politely asking to get these ASUS BIOSes modded for a slight voltage OC. My 590s run in quad SLI on water. 0.963v is what I've heard is safest for these cards.

I meant to upload the EVGA BIOS; my brain is a bit frazzled tonight.


----------



## RagingCain

Quote:


> Originally Posted by *Manac0r;14192257*
> Politely asking to get these ASUS bioses, modded for a slight voltage oc. My 590's run in quad sli on Water. 0.963v is what I've heard is safest for these cards.
> 
> Much obliged


You should use the other thread for this; I'd really like to see more people asking for their BIOSes to be edited over there!

BIOS attached!

Edit:
Link added for the other thread one more time!
http://www.overclock.net/nvidia/1063263-gtx-590-flashing-overclocking-thread.html

It's also now in my cleaned signature.

Manac0r, why did you give me an ASUS BIOS when you are using an EVGA card?

I have deleted my upload; I don't want people cross-brand flashing. There is no need, and it immediately destroys warranties. I should have seen that much faster than I did.


----------



## Manac0r

I posted an update in the other link. Thanks for spotting that, Raging; it's been a long day. I've swapped so many cards around today. I have the EVGA in here now. It was an honest mistake. Sorry.


----------



## RagingCain

Quote:



Originally Posted by *Manac0r*


I posted an update in the other link. Thanks for spotting that, Raging; it's been a long day. I've swapped so many cards around today. I have the EVGA in here now. It was an honest mistake. Sorry.


It's okay, I am quite tired and grumpy too!

I have your BIOS uploaded to the BIOS Archive in the other thread!

@ALL
I really could use some love from our European brothers!

I need the MSI and Gainward A & B BIOSes!!!

Feel free to attach them in the other thread and I will use them!


----------



## helifax

Quote:


> Originally Posted by *RagingCain;14191192*
> Did you increase the memory from the Start or after you hit 705 core?
> 
> If you want to try, you may want to lower memory speed, it can limit your core overclock or even benchmark scores entering the GDDR5 Error Correction activation.


Hi,

The memory speed was the last thing I increased, from 1850 to 1900. In benchmarks and games there is a slight increase in performance; that is why I put it @ 1900MHz.


----------



## RagingCain

Quote:


> Originally Posted by *helifax;14198103*
> Hi,
> 
> The memory speed was the last thing I increased, from 1850 to 1900. In benchmarks and games there is a slight increase in performance; that is why I put it @ 1900MHz.


Memory seems way too high; it should be 1728 to start with. Did you lower everything to stock to see if you still get slowdowns? You need to test stock.

Your memory speed looks high enough to cause error correction to kick in, which significantly lowers performance while active. It could also just be an unstable overclock on the GPU core, though. Test stock.

Post powered by DROID X2


----------



## max883

Is it possible to disable my ASUS GTX 590 BIOS protection? And how?


----------



## max883

Reformatting the USB stick did the trick for getting rid of the General Protection fault.
















Booted into DOS on the USB stick, and before I did anything I did a "nvflash --save backup.rom" just in case (still have it).

After that, "nvflash -r" to turn off write protection, followed by "nvflash -4 -5 -6 <romname>". Answered y/yes to all prompts. After it was done I rebooted using Ctrl+Alt+Del.

Used same process both times.





















Now using the Patched EVGA GTX590 bios


----------



## RagingCain

Quote:



Originally Posted by *max883*


Reformatting the USB stick did the trick for getting rid of the General Protection fault.
















Booted into DOS on the USB stick, and before I did anything I did a "nvflash --save backup.rom" just in case (still have it).

After that, "nvflash -r" to turn off write protection, followed by "nvflash -4 -5 -6 <romname>". Answered y/yes to all prompts. After it was done I rebooted using Ctrl+Alt+Del.

Used same process both times.





















Now using the Patched EVGA GTX590 bios

















Seriously, there was no point in doing this.

There is no point whatsoever in flashing a different brand.

You just lost your entire warranty for no reason, since I provided ASUS BIOSes.

Post powered by DROID X2


----------



## Jeppzer

If this isn't proof enough, tell me.


----------



## ranerX3

^ Card pics, maybe?
And what's inside that huge box from GB?


----------



## Jeppzer

Shazam.

The cards are bundled with the GB M8000X gaming mouse..
And my three screens were bundled with Zowie EC1 mice.. So I have 5 new mice. :/


----------



## helifax

Quote:



Originally Posted by *RagingCain*


Memory seems way too high. It should be 1728 to start with. Did you lower everything to stock to see if you still get slow downs? You need to test stock.

Your memory speed looks high enough to cause Error Correction to occur which significantly lowers performance while on. It can also be just an unstable overclock on the GPU core too though. Test stock.

Post powered by DROID X2


No no no, you misunderstood me... I am not getting any slowdowns at these speeds. I get slowdowns or even driver crashes IF I increase the frequency beyond 705MHz while USING 3D Vision (in 2D it works).
The idea of my initial post was to let people know my clocks & voltages for a GTX 590 running successfully in 3D Vision Surround. (You can get better clocks in 2D, or when using 3D Vision on one monitor.)

I followed your advice and lowered the memory frequency, and found some loss in performance (nothing in real-world scenarios, only in benchmarks).

Hope this makes the original intention of my post clear.
PS: Thanks for the tips


----------



## RagingCain

Quote:



Originally Posted by *helifax*


No no no, you misunderstood me... I am not getting any slowdowns at these speeds. I get slowdowns or even driver crashes IF I increase the frequency beyond 705MHz while USING 3D Vision (in 2D it works).
The idea of my initial post was to let people know my clocks & voltages for a GTX 590 running successfully in 3D Vision Surround. (You can get better clocks in 2D, or when using 3D Vision on one monitor.)

I followed your advice and lowered the memory frequency, and found some loss in performance (nothing in real-world scenarios, only in benchmarks).

Hope this makes the original intention of my post clear.
PS: Thanks for the tips










Well, I was just pointing out that under more stress, i.e. using 3D, your card was showing that its stability was lost. Meaning that even if it's working in 2D, it's not actually stable. Something like CUDA or a very GPU-intensive game like Metro 2033 will more than likely crash your drivers/GPU at some point as well.

Most DX9 titles don't stress the card, but some of the newer DX11 titles do. Most notably, DX11 in Crysis 2 is sniffing out a lot of instability in people's cards, which of course gets translated as 'the latest patch in Crysis 2 is bugged'.

Quote:



Originally Posted by *Jeppzer*











Shazam.

The cards are bundled with the GB M8000X gaming mouse..
And my three screens were bundled with Zowie EC1 mice.. So I have 5 new mice. :/


Please send me your BIOS!!!!

Edit: One from each GPU on your 590!

PM me if you need to know how to do this, or visit my GTX 590 Flashing & Overclocking thread for a quick BIOS Grabbing guide.


----------



## Jeppzer

Quote:



Originally Posted by *RagingCain*


Please send me your BIOS!!!!

Edit: One from each GPU on your 590!

PM me if you need to know how to do this, or visit my GTX 590 Flashing & Overclocking thread for a quick BIOS Grabbing guide.


I'll get back to you with the four GPU BIOSes as soon as I stick those sweet cards of heavenly honey in my case and get things running.


----------



## Fallendreams

Here's a GPU-Z shot; I will post a picture tomorrow with the GTX 590 in the case.


----------



## tlr3715

Please add me to the club. I have my cards running at stock settings at the moment.


----------



## Shinobi Jedi

Finally got my second card! Here's the proof! Please add me to the list!

I'm a little concerned about my temps as my X58 mobo is a first generation so I had to put the cards in the slots right next to each other to enable Quad SLI.

Of course it's the first card in Slot one I'm worried about. As the fan is pushed right up against card two.

Temps Idle:
GPU 1: 41-51c (depending if card is downclocked or not)
GPU 2: 41-51c
GPU 3: 37-41c
GPU 4: 37-43c

Temps at Full Load*:
GPU 1: 89c
GPU 2: 90c
GPU 3: 76c
GPU 4: 79c

*This was "stress tested" by putting Metro 2033 on 'Pause' with all 4 GPUs at 95%. All 4 GPUs stayed at 95% as I let the game sit on the pause screen for 45 minutes. Fans were set to Auto through EVGA Precision, with a monitoring profile set to bump the fans up to 100% should the GPUs hit 80C. These numbers are the highest points reached during that period. The temps hovered for a long time between 84-87C on GPUs 1 & 2, with both seeming almost stable at 88-89C. The 90C on GPU 2 didn't happen until near the end of the test period and only lasted a few seconds, when GPU 3 & 4's fan kicked in at 100%. That knocked GPU 1 & 2's temps right back down to 88C.
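The monitoring profile described above (auto fan curve, forced to 100% once a GPU reaches 80C) reduces to a simple threshold rule. Here's a hypothetical sketch of that logic, not EVGA Precision's actual implementation:

```python
def fan_target_pct(temp_c: float, auto_curve_pct: float,
                   panic_temp_c: float = 80.0) -> float:
    """Follow the auto fan curve, but force 100% duty at or above
    the panic temperature (80C in the profile described above)."""
    return 100.0 if temp_c >= panic_temp_c else auto_curve_pct

# Below 80C the auto curve rules; at 80C and above the fan pins to 100%.
```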

To be clear, "in game" the temps on the first card have never gone above 81C, not in Metro 2033 or BFBC2 after at least an hour of play. (I've only tested with those games as I know they scale well and really use all 4 GPUs.)

So, if I need a new mobo, is it worth it to get a new CPU as well? Is it worth it to get Sandy Bridge yet? Or should I wait for Ivy Bridge? Or should I just get another X58 mobo like an EVGA X58 Classified FTW?

Or do nothing, as I'm fine?

I also noticed one of my DIMM slots is dead, knocking me down to 8GB of RAM. I can RMA it as I have a lifetime warranty, but if I'm gonna yank it out, I'd rather put something new back in. So if it's worth waiting until after Sandy Bridge and just rolling with what I have, I'm okay with just 8GB of RAM for gaming, right?

And if the temps on my top card stay around 78-81c in game, I'm fine with the two cards next to each other like that until I upgrade, right?

As for the performance, it's beyond what I expected. I'm really happy with it. For example, the Prologue Level on Metro 2033 usually would chug at some point. After installing this second card, that level averaged 70fps! It was a whole new play experience.

I'm gonna do some more testing in game. I look forward to feedback on my temps and if it's critical I do a mobo (and maybe CPU) change or not.

Thanks for reading! Please add me to the list!

Cheers!


----------



## XXXfire

For all you 590 users with waterblocks, what kind of temperatures are you seeing on the cores and voltage regulators? I'm especially interested in Quad SLI figures. As a follow-up: has keeping these beefy GPUs at a nice, chilly operating temperature substantially improved your peak (stable) overclocks?


----------



## RagingCain

Quote:



Originally Posted by *Shinobi Jedi*


Finally got my second card! Here's the proof! Please add me to the list!


Congrats!

Quote:



I'm a little concerned about my temps: my X58 mobo is a first-generation board, so I had to put the cards in the slots right next to each other to enable Quad SLI.


Not going to lie, your temps are indeed on the high side.

Quote:



Of course it's the first card in slot one I'm worried about, as its fan is pushed right up against card two.

Temps Idle:
GPU 1: 41-51c (depending if card is downclocked or not)
GPU 2: 41-51c
GPU 3: 37-41c
GPU 4: 37-43c

Temps at Full Load*:
GPU 1: 89c
GPU 2: 90c
GPU 3: 76c
GPU 4: 79c

*This was stress tested by putting Metro 2033 on 'Pause' with all 4 GPUs at 95%. All 4 GPUs stayed at 95% as I let the game sit on the pause screen for 45 minutes. Fans were set to Auto through EVGA Precision, with a monitoring profile set to bump the fans up to 100% should the GPUs hit 80c. These numbers are the very highest points reached during that period. The temps hovered for a long time between 84-87c on GPUs 1 & 2, with both seeming almost stable at 88-89c. The 90c on GPU 2 didn't happen until near the end of the test period and only lasted a few seconds, when GPU 3 & 4's fan kicked in at 100%. That knocked GPU 1 & 2's temps right back down to 88c.


Due to spacing requirements, adjusting the fan profile is about all you can do. Normally I would also suggest undervolting, but that is something you want to avoid on the 590: despite running cooler, many cards are already prone to running at too low a voltage.

It would also help to know the room temperature. If it is above 75f (~23.9c), that will obviously have a big impact on the whole system's cooling; if the room was cooler than that, then it is definitely something you are going to have to work out.
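As a quick sanity check on that threshold (this is just the standard conversion formula, nothing 590-specific):

```python
def f_to_c(temp_f):
    """Fahrenheit to Celsius."""
    return (temp_f - 32) * 5.0 / 9.0

print(round(f_to_c(75), 1))  # the 75f room-temp threshold -> 23.9c
```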

I will tell you this: spacing out the cards will help in the 5-10c range, but you may also want to consider a different case or water cooling.

Quote:



To be clear, in-game the temps on the first card have never gone above 81c, not in Metro 2033 or BFBC2 after at least an hour of play. (I've only tested those games, as I know they scale well and really use all 4 GPUs.)

So, if I need a new mobo, is it worth it to get a new CPU as well? Is it worth it to get Sandy Bridge yet? Or should I wait for Ivy Bridge? Or should I just get another X58 mobo like an EVGA X58 Classified FTW?

Or do nothing, as I'm fine?


If you go Sandy Bridge, you need two PCI-E slots running at a full x16. The 590 needs as much PCI-E bandwidth as it can possibly gobble up. I have already tested this: I saw in the realm of 25-50% performance lost per card when running at x8.
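The bandwidth halving is easy to see from the PCI-E 2.0 numbers of that era: 5 GT/s per lane with 8b/10b encoding works out to roughly 500 MB/s of usable bandwidth per lane. A sketch (the 500 MB/s figure is the usual theoretical approximation, not a measured value):

```python
PCIE2_MB_PER_LANE = 500  # approx. usable MB/s per PCIe 2.0 lane (8b/10b)

def pcie2_bandwidth_mb(lanes):
    """Theoretical usable bandwidth for a PCIe 2.0 link."""
    return lanes * PCIE2_MB_PER_LANE

print(pcie2_bandwidth_mb(16))  # full x16 slot: ~8000 MB/s
print(pcie2_bandwidth_mb(8))   # x8: half the bus bandwidth
```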

Also of note, I have an EVGA X58 E770 Classified board with a water block installed. So if you are going water with a new board, that will definitely help.









Quote:



I also noticed one of my DIMM slots is dead, knocking me down to 8GB of RAM. I can RMA it as I have a lifetime warranty, but if I'm gonna yank it out, I'd rather put something new back in. So if it's worth waiting until after Sandy Bridge and just rolling with what I have, I'm okay with just 8GB of RAM for gaming, right?


Should be enough!

Quote:



And if the temps on my top card stay around 78-81c in game, I'm fine with the two cards next to each other like that until I upgrade, right?

As for the performance, it's beyond what I expected. I'm really happy with it. For example, the Prologue Level on Metro 2033 usually would chug at some point. After installing this second card, that level averaged 70fps! It was a whole new play experience.


Yes, as long as you always keep your eyes on the temperature. There is nothing wrong with up to 85c (some people argue 90c too), but I don't go above 85c on air cooling.

Quote:



I'm gonna do some more testing in game. I look forward to feedback on my temps and if it's critical I do a mobo (and maybe CPU) change or not.

Thanks for reading! Please add me to the list!

Cheers!


I want to see benchmarks!


----------



## Smo

I thought circa 90c was the norm for an air-cooled card? I'm running a single 590 in a Raven RV02-EW (with the uprated AP181 fans) and my stock card hits 88c when under maximum load for long periods of time. I hope this isn't something to be concerned about!


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14217833*
> I thought circa 90c was the norm for an air cooled card? I'm running a single 590 in a Raven RV02-EW (with the uprated AP181 fans) and my stock card hits 88c when under maximum load for long periods of time. I hope this isn't something to be concerned about!


I don't know what the thermal max is for the 590, but for the 580 you want it under 95-97c. So according to NVIDIA, anything under that is no harm, no foul. I just personally keep it under 85c for the longevity of the card.

It's a personal habit of mine; don't let me scare you. Officially it's okay, but I wouldn't want to run it that way.









Although I love the Raven, I still think the 590 dumps air into the case, which sort of defeats the purpose of the 90-degree motherboard rotation. The faster you can push that air out, the better.

Of course water cooling is always an option.


----------



## PannonOC

add me please to the club


----------



## kevink82

Quote:


> Originally Posted by *XXXfire;14212137*
> For all you 590 users with waterblocks, what kind of temperatures are you seeing on the cores and voltage regulators? I'm especially interested in Quad SLI figures. As a follow-up: has keeping these beefy GPUs at a nice, chilly operating temperature substantially improved your peak (stable) overclocks?


Mine idle at 35c on two Gigabyte GTX 590s; the max I got them to running Crysis 2 was 51c with everything maxed out at 2560x1440. It doesn't improve overclocking: without a voltage mod I still hit the wall at around 630 on the core, as I don't want to increase my voltage just yet. Not sure about the VRM temps, as I didn't check.


----------



## RagingCain

Quote:


> Originally Posted by *PannonOC;14218524*
> add me please to the club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ~Images~


They are gorgeous! Welcome!

@All
*GREAT NEWS: YOU CAN ADJUST THE POWER DRAW LIMITATION SETTINGS WITH NVIDIA INSPECTOR (1.9.5.5)!!! No need for my experimental BIOSes!!!*

I have to go now though.


----------



## Jeppzer

Quote:


> Originally Posted by *PannonOC;14218524*
> add me please to the club


Is that a MagiCool xtreme nova 1080 radiator I spot in the reflection?


----------



## sarojz

Ooooo soooo shiny!! Looks good, dude. Are those EVGA's water blocks or another third party's product? Welcome to the club!!


----------



## Shinobi Jedi

Well, I've gone ahead and done some more testing in game. The highest the GPU has hit so far is 83c, in the BFBC2 SP campaign, but that was only for a brief moment. Temps usually stay in the high 70s or at most 80-81c. So I think I'm cool.

When the temps were recorded in my last post it was actually 78f in my apartment, so maybe that had an effect too. The only things that have pushed the temps above 85c are looping the 3DMark11 demo that stresses all the GPUs to the max, or the "stress test" of leaving Metro 2033 on the pause screen for 45 minutes with all four GPUs hovering at 95%.

Of course the temps are even lower in games that don't scale across all four GPUs fully, like Crysis 2. And if I enable VSync, the GPUs don't seem to need to run at 90%+ all the time; they sit around 60% and temps are pretty nominal in that case.

How long should I play in game to get an accurate read on temps? 30 minutes? An hour?

I'm a little hesitant to do water cooling, as I don't know if I have the time for the upkeep and maintenance and the once-or-twice-a-year tear-down and cleaning of hoses, etc. Or it would be my luck to be one of those unfortunate few who have their WC system burst and fry their components.

I think spacing the cards out may be all I need. But if I were going to get a new mobo, should I get another X58 and run my i7 920 Bloomfield on it? Or go for Sandy Bridge? Or stick with what I have and wait for Ivy Bridge?

Here's some 3DMark 11 Benches with the 275.50 Beta driver:




If there's any other benches I should run, I'm down to do it.

So my temps seem to be staying in the high 70s to 81c at most, with only one brief moment so far of GPU 2 hitting 83c during heavy gaming.

If that stays the case, I should be fine with my cards next to each other in my current setup for a while, correct?

I figure I'm not overclocking them and they have a lifetime warranty, so if they die earlier than they should, I can RMA them. And I always have EVGA Precision running on my G13 or G19, so I can keep an eye on FPS and temps anyway.

I know EVGA tech support says up to 97c is within spec, as that's when the voltage protection kicks in, and they also said the card will shut itself down at 105c to avoid frying.

So with those factors in mind, I should be good to go, right? I'd hate to not be able to relax and enjoy the rig 100% right now because in the back of my head I'm tripping on temps. But if I'm not hitting more than 83c in game, then I don't think I have to worry about it. Interestingly, my temps at idle (42-46c) and at full load seem to match or better any 590 Quad SLI review I can find. My temps so far have been better than those in the Guru3D review, and they thought 43c idle and 85c full load were fantastic.

So, unless anyone is catching something I'm not, I can table this dilemma for now, right? If not, what should I do for a mobo switch? Get another X58 and keep my Bloomfield? Go Sandy Bridge? (What's a mobo that has the full bandwidth needed? Is the EVGA P67 FTW one? And is there a certain type of SB CPU I want? Aren't some locked while others aren't?) Or stick with what I have and wait for Ivy Bridge or better?

Thanks for everyone's feedback, especially RagingCain! +Rep on the last response!

Cheers!


----------



## Smo

Quote:



Originally Posted by *RagingCain*


I don't know what the thermal max is for the 590, but for the 580 you want it under 95-97c. So according to NVIDIA, anything under that is no harm, no foul. I just personally keep it under 85c for the longevity of the card.

It's a personal habit of mine; don't let me scare you. Officially it's okay, but I wouldn't want to run it that way.









Although I love the Raven, I still think the 590 dumps air into the case, which sort of defeats the purpose of the 90-degree motherboard rotation. The faster you can push that air out, the better.

Of course water cooling is always an option.


I'm afraid for the moment that watercooling is not an option for me - I'm not really interested in it. I may consider it for a future build but for now I'll give it a miss!

I had a quick reconfigure of the fan profiles using Precision (by default the maximum the 590 uses is 40%) and essentially upped the fan speeds by 10-15% across the board, so now at full load the fan speed sits at 65% and my GPUs have hit a maximum of 77c, no higher.

Happy with that; it seems a happy medium for me regarding temps/noise.


----------



## NoDoz

I am jealous of you all.


----------



## Jeppzer

Quote:



Originally Posted by *NoDoz*


I am jealous of you all.


No need for that! Join us instead!


----------



## Smo

Quote:



Originally Posted by *Jeppzer*


No need for that! Join us instead!


Agreed


----------



## PannonOC

Quote:


> Originally Posted by *Jeppzer;14219656*
> Is that a MagiCool xtreme nova 1080 radiator I spot in the reflection?


It's a Phobya Xtreme NOVA 1080 with 9 Phobya G-Silent RED fans.


----------



## Jeppzer

Quote:



Originally Posted by *PannonOC*


It's a Phobya Xtreme NOVA 1080 with 9 Phobya G-Silent RED fans.


You sir, are my hero.
Where have you mounted it?


----------



## Alatar

Wow, so many new 590s since the last update










Going to take a min!


----------



## Alatar

Quote:



Originally Posted by *PannonOC*


add me please to the club



















I'm sorry if I missed it somehow but would you mind listing the brand of these beauties?







(asus, evga etc.)


----------



## Jeppzer

This post states that he has Gigabytes.


----------



## Alatar

Quote:



Originally Posted by *Jeppzer*


This post states that he has Gigabytes.










Nice!


----------



## Jeppzer

The Gigabyte userbase is growing.









Also, I ran into a problem with my build...
An electrician is coming over sometime next week to install new wiring in my game room. The old wiring couldn't handle the load.









So, BIOSes and benchies will have to wait a while longer, unless I move everything to the kitchen, which has better wiring... Hmm.


----------



## tlr3715

Something about water and food grease just does not mix with a GTX 590 PC... unless you are planning on gutting the fridge and using it as a case.


----------



## Jeppzer

The condensation scares me too much to even think about it.
Oh well, at least my case has wheels, so I can use it as a Bobby Car until I have an outlet outside of the kitchen that can handle the 9-18 Amp load.
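For scale, 9-18 A is a serious draw. Assuming ~230 V European mains (an assumption about Jeppzer's wiring, not stated in the post), that works out to roughly 2-4 kW:

```python
def load_watts(volts, amps):
    """P = V * I for a mains load."""
    return volts * amps

# 9-18 A at an assumed 230 V: enough to overload older household wiring.
print(load_watts(230, 9))   # low end of the quoted range
print(load_watts(230, 18))  # high end
```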


----------



## Arizonian

Quote:


> Originally Posted by *Alatar;14233473*
> Wow, so many new 590s since the last update
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Going to take a min!


I was declined a Step-Up from EVGA to go from single GPU to dual GPU, or I'd be an owner too, as it came out about two and a half months later. Would have loved one. Now I have to wait till tax refund time before I can SLI my current 580.

Ah well.


----------



## tlr3715

Anyone notice that Newegg has a new version of the EVGA GTX 590 Classified going for $20 less than before? They were in stock for a few hours this week.

Model is 03G-P3-1596-AR

The model I bought was 03G-P3-1598-AR. The pictures look identical, and the clock speeds are the same.

Does anyone know what the difference is?


----------



## hammer32261

http://i714.photobucket.com/albums/ww141/HAmmer32261/IMG_0843.jpg
EVGA
Stock clocks


----------



## Manac0r

I have my water-cooled Quad SLI 590 setup (EVGA) OC'd to 725MHz core / 1856 memory at 0.963v. I've stress tested under FurMark and all seems well. Is it okay to leave this OC for gaming? TIA


----------



## H4rd5tyl3

Quote:



Originally Posted by *tlr3715*


Anyone notice that Newegg has a new version of the EVGA GTX 590 Classified going for $20 less than before? They were in stock for a few hours this week.

Model is 03G-P3-1596-AR

The model I bought was 03G-P3-1598-AR. The pictures look identical, and the clock speeds are the same.

Does anyone know what the difference is?


Yes, I bought two the other day. I looked at the specs and they are exactly the same. The only difference I can tell is that they don't come with the shirt, mousepad, and all that other stuff.


----------



## Juggalo23451

Here is proof of my OC: 700 core (linked with shader), 1900 memory, 0.963 volts. 3DMark11 score as well.
I am going to find out if I am stable at 705 soon, then 710 if I am lucky.


----------



## RagingCain

Quote:



Originally Posted by *Juggalo23451*


Here is proof of my OC: 700 core (linked with shader), 1900 memory, 0.963 volts. 3DMark11 score as well.
I am going to find out if I am stable at 705 soon, then 710 if I am lucky.






Nice. Wait till you mess with the PDL; it's worth an extra 400-500 GPU score.

Quote:



Originally Posted by *H4rd5tyl3*


Yes, I bought two the other day. I looked at specs and they are exactly the same. The only difference I can tell is that they don't come with the shirt, mousepad, and all that other stuff.


Hey, send me the BIOS when you get a chance. I think it's the silent 590 revision...

Post powered by DROID X2


----------



## Tept

Using the ASUS 590 BIOS provided by RagingCain. GPU2 no longer goes into power-saving mode; it sits at my OC'd speed permanently. GPU1, however, downclocks to 405MHz while idle. Looking for help figuring out how to get GPU2 to join GPU1 in sleepy time like it's supposed to.


----------



## RagingCain

Quote:



Originally Posted by *Tept*


Using the ASUS 590 BIOS provided by RagingCain. GPU2 no longer goes into power-saving mode; it sits at my OC'd speed permanently. GPU1, however, downclocks to 405MHz while idle. Looking for help figuring out how to get GPU2 to join GPU1 in sleepy time like it's supposed to.


I released a new BIOS. Try flashing it. What have you already tried?

Post powered by DROID X2


----------



## Tept

Reinstalled the drivers twice using a clean install, 275.50 and 275.33. Looked through the NVIDIA Control Panel twice and tried rebooting with power management on Adaptive and on Prefer Maximum Performance, rebooting each time. Looked through NVIDIA Inspector and don't have a clue what to screw with in there.


----------



## RagingCain

Quote:



Originally Posted by *Tept*


Reinstalled the drivers twice using a clean install, 275.50 and 275.33. Looked through the NVIDIA Control Panel twice and tried rebooting with power management on Adaptive and on Prefer Maximum Performance, rebooting each time. Looked through NVIDIA Inspector and don't have a clue what to screw with in there.


Did you re-flash?


----------



## Tept

Yep, using the North American BIOS for ASUS that you have in the other thread.


----------



## RagingCain

Quote:



Originally Posted by *Tept*


Yep, using the North American BIOS for ASUS that you have in the other thread.


Just you and Manac0r. So weird. Flash the stock BIOS and see if it idles again. Have you played with NVIDIA Inspector at all?

Post powered by DROID X2


----------



## Shinobi Jedi

Hey all, I could use some more advice:

So I'm wondering if my CPU is bottle necking my Quad setup?

Because when I set my BIOS to run my CPU at 4.2, it has to run the RAM at 2:8/1067 speeds to be stable at that setting.

Yet when I set the RAM to run at the mobo's specified speed of 2:10/1333, the max I can OC my CPU to and keep it stable is 3.3GHz. And there is only a 100-point difference in 3DMark11 Extreme between the 4.2/1067 clock and the 3.3/1333 clock.
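One way to judge whether that 100-point gap matters is to express it as a fraction of the total score. Against a 3DMark11 score in the ~14,000 range (the ballpark posted later in this thread; the exact baseline below is just illustrative), 100 points is under one percent:

```python
def relative_gap_percent(score_a, score_b):
    """Spread between two benchmark runs, as a percentage of the
    higher score."""
    hi, lo = max(score_a, score_b), min(score_a, score_b)
    return (hi - lo) / hi * 100.0

print(relative_gap_percent(14300, 14200))  # ~0.7% -- run-to-run noise
```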

So, is that a bottleneck? It's not like I'm getting bad performance; quite the contrary. The only games that seem to not run full blast are DX9 console ports. They all seem to run at a capped 60fps, which I'm sure is because of the games' configs and not a GPU scaling issue.

I'm just curious.

Temps in game still seem to be good. So far the highest I've hit on the top card is 83c, but I'm still considering a mobo swap just for peace of mind if anything.

I know the new Sandy Bridge mobos and CPUs can limit the card if they don't have the right slot setup to do a full x16, but what about this revised one? Would this mobo and a properly OC'd i7 2600K be enough to really open the cards up?

http://www.evga.com/products/moreInf...%20Family&sw=5

And if not, and I still want to stick with the X58 chipset, does that mean I can just upgrade the CPU to open the cards up? Or should I consider a new mobo as well?

Or should I stick with what I have and wait for Ivy Bridge, because temps aside it's not an issue performance-wise?

Also, my second card was purchased a week ago last Friday, just a couple of hours after becoming available in the EVGA online store, and arrived last Wednesday. It has the same part number as my first one. I doubt the different part number on Newegg signifies a secret revision.

As I found when shopping for my ASUS ROG notebook, Newegg often has a different part number for products delegated to their site, as they often differ slightly from the standard product.

For example, my ASUS ROG notebook's model is G73JH-A1. Newegg sells the exact same notebook under the model G73JH-X1, the difference being that Newegg's doesn't come with the backpack, gaming mouse, or Blu-ray player that mine did. Much like here, where the Newegg version of the 590 doesn't come with the mousepad and T-shirt and is just the card. I really hope I'm wrong; I would love to see revised cards. But considering the timing of my second card's availability in EVGA's own store, its having the same part number as the first one, and knowing that Newegg changes part numbers on other products the same way, I feel pretty confident it's the Newegg thing and not revised cards.

Of course we won't really know until RagingCain gets the BIOS from that part number, and I really hope I'm wrong; I'd happily eat crow.

Thanks to any and all for input on whether my CPU is holding back my 590 Quad setup, and what to do if so.

Cheers!


----------



## Tept

Quote:



Originally Posted by *RagingCain*


Just you and Manc0r. So weird. Flash stock BIOS see if it idles again. Have you played with Nvidia inspector at all?

Post powered by DROID X2


I haven't changed anything from the factory settings in NVIDIA Inspector. It did idle on the stock BIOS. I remember my idle temps prior to the change, and they definitely were not 20c apart as they are now.


----------



## Smo

Guys, I have an interesting problem. I started up Crysis today and for some reason my GPU usage doesn't go above ~50% anymore, leaving me with low FPS in some sequences. My card is the EVGA Classified on air and is completely unmodified, running the beta drivers (275.50). It was perfectly fine yesterday.









*Edit:* I swapped back to my 2407WFP and it's perfectly fine again...

*shrugs*










*Re-Edit:* Nope. Couple of minutes of gameplay and usage dropped back down to ~50% on each GPU. Tested Crysis 2 and it appears fine.


----------



## Tept

@Smo: Is Prefer Maximum Performance set for Power Management in the NVIDIA Control Panel? If not, do that; I actually noticed a "smoothness" difference in my performance when I set it.

@RagingCain
I've figured out that GPU2 not going into P8 is related to multiple monitors. When I disable my secondary monitor it drops into P8; turn it back on and it goes back into P0. The secondary is an LG 32" LED TV connected by (PC)DVI-D>VGA---VGA(TV). I'm going to try switching to DVI-D>HDMI----HDMI when I get back home.
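The observed rule can be written down as a one-liner (a toy model of the behaviour described above, not the driver's actual logic):

```python
def idle_pstate(secondary_display_active):
    """With a second display attached the driver holds the GPU in the
    P0 performance state; with a single display it can drop to P8."""
    return "P0" if secondary_display_active else "P8"

print(idle_pstate(True))   # TV attached: no power saving on GPU2
print(idle_pstate(False))  # TV disabled: GPU2 idles down
```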


----------



## Tept

On a side note, I'm currently stable at 745MHz core / 1490 shader, 1830MHz memory at 0.963v.
Over 1830 on the memory I start to see performance loss, probably memory-error garbage.

I'm gunna take a stab at 750 core when I get back.


----------



## Smo

Quote:



Originally Posted by *Tept*


@Smo: Is Prefer Maximum Performance set for Power Management in the NVIDIA Control Panel? If not, do that; I actually noticed a "smoothness" difference in my performance when I set it.

@RagingCain
I've figured out that GPU2 not going into P8 is related to multiple monitors. When I disable my secondary monitor it drops into P8; turn it back on and it goes back into P0. The secondary is an LG 32" LED TV connected by (PC)DVI-D>VGA---VGA(TV). I'm going to try switching to DVI-D>HDMI----HDMI when I get back home.


Cheers for the suggestion, but I've never been able to open the NVIDIA Control Panel. God knows why, but it doesn't actually load up.


----------



## Tept

Odd. That might be something you wanna fix. Those drivers like to keep your power consumption as low as possible while still being able to power the GPUs, and that causes issues in high-demand situations in games. I believe that setting is critical.


----------



## Smo

Quote:



Originally Posted by *Tept*


Odd. That might be something you wanna fix. Those drivers like to keep your power consumption as low as possible while still being able to power the GPUs, and that causes issues in high-demand situations in games. I believe that setting is critical.


I decided to have a go at regressing to the 275.33 drivers, and during installation it moaned about nvvsc.exe -install (or something like that). I continued with the installation and rebooted.

The Control Panel still wouldn't open, so I checked Services and found that it was set to Automatic but 'Stopped'. I clicked Start and it gave me this:










I thought 'to hell with it' and double-clicked the Control Panel icon, and it loaded up, but the only thing I have access to is:

Stereoscopic 3D
- Set up stereoscopic 3D
- View compatibility with games

*Edit:* Even though I'm back on 275.33 I'm still getting erratic GPU usage. It never used to do this! I'm a bit baffled.


----------



## Tept

Def need to troubleshoot the Control Panel; I'd say it's vital.


----------



## RagingCain

Quote:



Originally Posted by *Smo*


I decided to have a go at regressing to the 275.33 drivers, and during installation it moaned about nvvsc.exe -install (or something like that). I continued with the installation and rebooted.

The Control Panel still wouldn't open, so I checked Services and found that it was set to Automatic but 'Stopped'. I clicked Start and it gave me this:










I thought 'to hell with it' and double-clicked the Control Panel icon, and it loaded up, but the only thing I have access to is:

Stereoscopic 3D
- Set up stereoscopic 3D
- View compatibility with games

*Edit:* Even though I'm back on 275.33 I'm still getting erratic GPU usage. It never used to do this! I'm a bit baffled.

Image


Hey Smo, try increasing the CPU vcore voltage a little; it may be slightly unstable (not unstable enough to crash, but enough to lower performance). If there's no change, then lower your CPU vcore.

Also try a different program. In every other game (except 3DMark11) I get the same crappy GPU usage.


----------



## xtnod

Thinking about getting this card since no 6990 is available anywhere. Do you guys recommend the ASUS or the EVGA?

The Asus is available at my microcenter...


----------



## Tept

Yeah, can't get it to go into P8 while the second monitor is active.


----------



## tlr3715

The ASUS works well enough, but with EVGA you get higher clocks and a better warranty.


----------



## Smo

Quote:


> Originally Posted by *RagingCain;14262405*
> Hey Smo, try increasing the CPU vcore voltage a little; it may be slightly unstable (not unstable enough to crash, but enough to lower performance). If there's no change, then lower your CPU vcore.
> 
> Also try a different program. In every other game (except 3DMark11) I get the same crappy GPU usage.


Thanks for the suggestion - unfortunately I've just started the commute to work so can't get to my PC for another 13 hours or so.

Currently my 2500K is at 4.4GHz, with the only BIOS mods being a x44 multiplier and the VRM frequency set to 350. I'm assuming that because I want to keep the idle downclock (on the CPU), I would have to increase the offset voltage? If so, I'm thinking change it to -0.010 and see if the GPU usage in Crysis is still erratic?

I played Crysis 2 to see if there was any difference and it was hitting 98% GPU usage and behaving as normal. Same with Metro 2033.

Do you think slight instability in the chip could just rear its head randomly? After all, I've been playing the original Crysis flawlessly, and all of a sudden last night my GPU usage decided to be pants!

I just can't fathom what seems to have changed.


----------



## RagingCain

Quote:



Originally Posted by *Smo*


Thanks for the suggestion - unfortunately I've just started the commute to work so can't get to my PC for another 13 hours or so.

Currently my 2500K is at 4.4GHz, with the only BIOS mods being a x44 multiplier and the VRM frequency set to 350. I'm assuming that because I want to keep the idle downclock (on the CPU), I would have to increase the offset voltage? If so, I'm thinking change it to -0.010 and see if the GPU usage in Crysis is still erratic?

I played Crysis 2 to see if there was any difference and it was hitting 98% GPU usage and behaving as normal. Same with Metro 2033.

Do you think slight instability in the chip could just rear its head randomly? After all, I've been playing the original Crysis flawlessly, and all of a sudden last night my GPU usage decided to be pants!

I just can't fathom what seems to have changed.


Sounds like the rest of us, waiting for drivers...

Most DX9 titles and AAA DX11 titles work perfectly. Anything else is a whole other question. It sucks, but that's the way it is.

My first card was dead, so I arrived about a week and a half late to the scene, but pretty much since release we have been waiting for drivers.

I have heard from other forums, the NVIDIA ones especially, to use driver set 267.91 (or .85) if you like overclocking. Really big performance gains; most titles sit at 99% usage.

What I think happened is that the 267.xx team was different from the 27x.xx team, so obviously a lot of game fixes probably didn't make it into each generation.

Rumor has it that 280 is just around the corner. I personally don't understand their numbering convention at all, but I will take it in stride and remember the good driver sets so far.


----------



## Tept

Okay, maybe someone in here can help me understand *** is up with this. A dude with 2x EVGA GTX 480s OC'd to (I think it's) 850/2200 scored a P47667 on 3DMark Vantage with GPU PhysX on.
I've finally gotten myself to tap slightly into the 48Ks. Now here's where I'm confused: on the Jane Nash and New Calico tests we both scored nearly identical FPS. However, on Texture Fill, Color Fill, Pixel Shader, Stream Out, GPU Particles, and Perlin Noise he scored way over mine, in every test. For example, in Perlin Noise I do about 170 while he scored 230.

I mean, am I doing something wrong? It would just stump me if two 480s could beat "two 580s" by that magnitude. CPU and PhysX have no bearing here, because my CPU and PhysX scores are significantly higher than his.

On another note: on my PSU I have four 8-pin female ports specifically labeled for PCI-E and four 8-pin female ports labeled for pretty much everything else. There is also an 8-pin PCI-E cable coming out of the main harness of the PSU.

Being that the 590 calls for two 8-pins, I have the 8-pin coming directly out of the main harness of the PSU and one 8-pin going into the PCI-E-specific ports plugged into the 590. Can anyone tell me if there's any good reason why I should unplug the main-harness PCI-E and run the 590 directly off two of the PCI-E plug-ins?










^ In case my explanation is garbage. Red is PCI-E-specific.


----------



## Shinobi Jedi

Clocked my cards up to 1340/1728 just for kicks and got a new 3DMark11 Performance score of - 14311P

http://3dmark.com/3dm11/1542312

Not bad. Not sure how stable it is though, or whether it's really worth it for performance when everything runs at 120fps+ maxed out in just about every game I have.

While I'm sure people are hitting VRAM limits at 2560x1600 or in Surround, on one 120Hz Full HD/3D 1080p display, one or two of these cards runs like a dream.

The 590's are so good to me right now that every time I quit a game, I feel like I want to have a cigarette and I don't even smoke.


----------



## zimnydave

Hi there.

Is there any possibility of buying a double set of the GTX 590 HC from EVGA's online store here: http://www.evga.com/products/moreInfo.asp?pn=03G-P3-1599-A2&family=GeForce%20500%20Series%20Family&sw= with delivery to Poland?

Do I have any chance of getting it sent to Poland? I see it's not available right now, but I wonder when it will be. It seems the "Notify me" option doesn't work properly (at least in my case).

Thanks for the reply.

PS. Of course here in Poland I can't get any HC card (590 or even 580).

David.


----------



## RagingCain

Quote:



Originally Posted by *Tept*


Ok maybe someone in here can help me understand *** is with this. Dood with 2x EVGA gtx 480's OC'd to i think its 850/2200 scored a P47667 on 3dmark vantage GPU physx on.
I've finally gotten myself to tap slightly into the 48k's. Now heres where I'm confused. On Jane Nash and New Calico tests we both scored nearly identical scores on FPS. However on Texture Fill, Color Fill, Pixel Shader, Stream Out, GPU Particles, and Perlin Noise he scored way over mine, in every test. For example in Perlin Noise I do about 170 while he scored 230.

I mean am I doing something wrong because it would just stump me if 2 480's can defeat "2 580's" at that magnitude. CPU and PhysX have no bearing here because my CPU and PhysX scores are significantly higher than his.

On another note on my PSU I have 4 8pin female ports specifically labeled for PCI-E and 4 8pin female ports specifically labelled for pretty much everything else. There is also an 8pin PCI-E cable coming out of the main harness of the PSU.

Being that the 590 calls for 2 8pins, I have the 8pin coming directly out of the main harness of the PSU and 1 8pin going into the PCI-E specific ports on the PSU plugged into the 590. Can anyone tell me if theres any good reason why I should unplug the main harness PCI-E and run the 590 directly off 2 of the PCI-E plug ins?


I don't understand. Are 2x 480s faster than a 590?

The 480 is barely slower than the 580; I think the difference was about 10-15% at most.

The 590 is about as powerful as 2x 570s at stock, and 2x 480s would be on par with, or slightly better than, 2x 570s. Especially if overclocked, the 480s would easily be more powerful.

So the 590 is about 1.66x a single 580, whereas SLI 480s are about 1.75~1.80x a single 580.

Quite simply, the 590 is not as powerful as 2 580s; it's more like one and a half 580s.
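As a back-of-envelope check, the rough estimates above can be written out numerically (all values are this post's approximations, normalized to a single GTX 580 = 1.0; none of them are measured benchmarks):

```python
# Rough relative-performance estimates, normalized to one GTX 580 = 1.0.
gtx580 = 1.0
gtx480 = gtx580 / 1.125          # "about 10~15% at most" slower, taking the midpoint
gtx590 = 1.66                    # "about 1.66x" a single 580
sli_480 = (1.75 + 1.80) / 2      # "about 1.75~1.80x" a single 580

# The estimates imply overclocked SLI 480s edge out a stock 590,
# and each 590 GPU behaves like roughly 83% of a full 580.
print(sli_480 > gtx590)          # True
print(round(gtx590 / 2, 2))      # 0.83
```

Which matches the conclusion: a 590 sits between one and two 580s, closer to one and a half.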


----------



## Wogga

here is my Palit GTX590 (NE5X590012F7)


----------



## Jeppzer

Holy crapper.
I've been working for almost two hours to get quad SLI to work.
Now it turns out the SLI bridge was on backwards... Faaantastic that they can release something that can be connected several ways, when every other mobo connector is keyed one way.









Now I've got another problem though. Surround vision is working, BUT when I try to reassign monitor positions, all three go black, then one comes back on with a "Signal out of range" message.

No idea what causes that..

Anyway, I hit the reset button after several minutes of nothing happening.

(I just moved my monitors, the physical ones that is, to match what the nvidia control panel said...)
I'd still be interested in a software solution to that issue though.

Anyhow, someone here wanted my bioses? I'll search and pm you now!

Edit:
Also, I solved my power issue by disconnecting my freezer and running an extension cord from that outlet to power my computer instead.
(So I got to eat a lot of ice cream as a reward. Not bad for a home fix until the electricians are back from vacation.)

Edit 2:


----------



## Smo

I took your advice and played around with my voltages - made no difference so I've set them back to stock. I can only assume it's driver related.

I don't know how they can release a product like this with such inadequate drivers. It's like a new type of engine with no fuel to power it. Beggars belief.


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14285823*
> I took your advice and played around with my voltages - made no difference so I've set them back to stock. I can only assume it's driver related.
> 
> I don't know how they can release a product like this with such inadequate drivers. It's like a new type of engine with no fuel to power it. Beggars belief.


It's definitely strange, but the fact that most games run, and still run above 60 fps, I am okay with.... some days.

The alternative being the ATI drivers from hell I had to endure. This is technically a stroll in the park in comparison, but I still feel like I paid way too much for card performance to be "trickled" out month after month.

I think it just comes back to SLI profiles needing optimization, and months between releases suck. I would rather have nightly builds, to be honest; I'd then update once a week or so.


----------



## Jeppzer

OC'd a bit.

CPU to 3.8GHz, 590s to 650MHz, all three on stock voltage so far.
Folding on everything to check for instability; it's been going for two hours, everything's still up, temperatures OK but not great.

Edit: Oh yeah, forgot to say, the CPU hasn't gone over 57°C even after two hours of 100% load on all cores.


----------



## Tept

Jeppzer, you can go significantly further with your OC and still stay plenty cool. At my OC I hit 75-77°C at 1.415v on a maximum IntelBurnTest run with 8 threads, and I have more or less the same CPU cooling you have. Are you running a push-pull fan setup with the fans blowing out the back?

PS: I only hit 65-67°C during real-world use, and that's only if the application was CPU-heavy.


----------



## RagingCain

If anyone is interested, I am posting on nVidia's forums trying to get nVidia to give us information about our cards: specifications not disclosed to the public after the point of sale. If any of you wanted to chime in to help me out, that would be great.

Thanks Jug for commenting.

http://forums.nvidia.com/index.php?showtopic=205833&st=0&gopid=1268382entry1268382


----------



## hectorco

Hi Abiosis, I saw your new video card - the best one from EVGA, as everyone in the forums is saying - and I have one on backorder right now, waiting to receive it soon via UPS. From your experience, have you played Crysis 2 and run into any graphics problems, black screens, or a frozen screen? And which driver are you using to run this strong video card? I hope your suggestions can help me with running this game (take a look at the crysis2.com forums). Thanks for your help!


----------



## Shinobi Jedi

Quote:



Originally Posted by *RagingCain*


If anyone is interested, I am posting on nVidia's forums, I am trying to get nVidia to give us information about our cards, specifications not disclosed to the public after the point of sale. If any of you wanted to chime in to help me out, that would be great.

Thanks Jug for commenting.

http://forums.nvidia.com/index.php?s...&#entry1268382


I left a post for you! I went a little harsher than I really feel, but they need to take notice..

Let's see - if this thread doesn't get you the info/tools you need, then maybe it will at least have them put more considerable effort into single cards with dual GPUs, or into quad SLI support in the drivers!

However, I typed it in a hurry, posted, and then realized it has some incomplete sentences - and there's no edit function! Oh well. I trust they'll still get the point.

Cheers


----------



## vertex

FYI - I'm pretty well done with these cards. A company's failure to acknowledge its responsibility to its consumers' demands is destined for doom.... See my post on the Nvidia forum below..
_____________________________________________________________________
Dear Nvidia,
I have spent many dollars on your GTX 590s only to feel like the development ball was dropped and I'm the one paying for it. This is my first ever Nvidia card setup (I was previously an ATI guy) and this experience isn't engaging. There's a shroud of secrecy around what the issues are, and it's only disadvantaging you in my opinion as a consumer. Having spent $2K+ on your product, I do feel as though things need to be made right. Obviously I can't do anything against an 800lb gorilla of a company with a market value in the billions. All I can do as a consumer is choose not to do business with you and advise my friends of my experience. In the end I realize you'll choose a path and walk it, but failing to address the issues your customers have is likely to have negative long-term effects, in my opinion. That is, unless this segment of the market isn't where you're headed. Your choice, my choice.

Regards,
Vertex


----------



## Smo

Are these artifacts?



















I certainly hope not.


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14309823*
> Are these artifacts?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I certainly hope not.


No, it looks like game glitching/drivers. Crysis 1 is the game, but is it modded at all, or are you playing a custom level?

Artifacting is much more specific: dots/speckles that look similar to dead pixels, wrong pixel colors, etc.

GPU memory errors tend to be specific textures (like the floor) going invisible, and/or parts of models/objects going missing or stretching infinitely in a specific direction.

This looks like a driver glitch. Do you get the same issue at stock (no overclocks)? If you do, then try playing in DX9 (see if it's API-related). After that it's worth trying different drivers, after you re-install those.


----------



## Smo

Quote:



Originally Posted by *RagingCain*


No, it looks like game glitching/drivers. Crysis 1 is the game, but is it modded at all or playing a custom level?

Artifacting is much more specific (dots/speckles) that seem to emulate similar to dead pixels. Wrong pixel color etc.

GPU Memory errors tend to be specific textures (like the floor) going invisible and or models/objects' parts gone missing, or stretched infinitely in a specific direction.

This looks like a driver glitch. To you get the same issue on stock (non-overclocks)? If you do, then try playing in DX9 (see if its API related). After that its worth trying different drivers after you re-install those.


As always I can count on you









It's Crysis, yeah - I'm using Silent's High Resolution Foliage, Rygel's High Resolution Textures, the Level 6 CUUDATS autoconfig, Lemar's HD Suit and Xigmatek's XFX 2.

Interestingly, I've seen one of the GPU memory errors you mentioned when playing BF:BC2 - I noticed it in one level, shut the game down immediately, and it hasn't occurred since.

I've also seen what appear to be dead pixels; they occurred when hovering the mouse over the 'X' at the top of a window, but they're not doing it right now.

Could it be unstable drivers? As mentioned previously, I've not actually experienced a flawless install - the last couple of times (275.50 and 275.33) I've received an error during install, but they completed anyway.

I'm thinking NVIDIA dropped the ball here. The card is an awesome product, but its creators aren't supporting it the way they should. Almost as if they don't care - but it's the most powerful card they make. How could they not care?


----------



## RagingCain

Quote:



Originally Posted by *Smo*


As always I can count on you









It's Crysis yeah - I'm using Silents High Resolution Foliage, Rygels High Resolution Textures, Level 6 CUUDATS autoconfig, Lemars HD Suit and Xigmateks XFX 2.

Interestingly I've seen one of the GPU memory errors you mentioned - when playing BF:BC2 I noticed that in one level, shut the game down immediately and it hasn't occured since.

I've also seen what appear to be dead pixels, they occured when hovering the mouse over the 'X' on the top of a window, but they're not doing it right now.

Could it be unstable drivers? As mentioned previously I've not actually experienced a flawless install - the last couple of times (275.50 and 275.33) I've received an error during install but they completed anyway.

I'm thinking NVIDIA dropped the ball here. The card is an awesome product, but it's creators aren't supporting it the way they should. Almost as if they don't care - but it's the most powerful card they make. How could they not care?


I am not sure why. Looks like greed is winning out, perhaps?

Anyway, I have been running 275.50 for the better part of, what, two weeks? Moving the story on: I was playing around in the profiles (or needed to, more specifically), and this is what I came across:










Back to 275.33 for me, maybe even 270.61 if I see the same thing.

The profiles were heavily Chinese/Taiwanese titles and language. Bad Company 2 and Half-Life were still recognizable .exes, but I can't help wondering whether regional drivers exist for a reason. It could explain a variety of issues we have been having.

What is up with nVidia??? I don't get it. I know the main team is working on the 28x.xx drivers, but still.

I would also anticipate that it may be a single mod interfering with Crysis. I don't know enough about Cry-modding, just the texture packs and ToD.

Perhaps someone else with more experience will help. But to answer your question: yes, these "GPU errors" can definitely be driver-related. But because you are also in modded territory, that adds a few more things it could be - a botched install (not necessarily your fault), or overwritten files that should not have been overwritten, for example.


----------



## Shinobi Jedi

Wow...

So, I started playing Witcher 2 tonight with the new patch, and that game now maxes out all four GPUs in quad SLI almost constantly. The game runs fantastic, with fps often in the triple digits, but with my GPUs at 90%+ non-stop, it wasn't long before GPU 2 was hitting 90°C..

So far this is the only game that does this.

It looks like I'm gonna have to drop $320 on an EVGA X58 Classified FTW, since it seems to be the best one for spacing the cards and keeping them both at a full 16x.

I want to stick with EVGA because they hooked me up pretty well RMA-wise before. I like the step-up program, 24/7 tech support, and warranty (especially considering I have a dead DIMM slot on my current 2-year-old X58) - and because their corporate office and warehouse are in my area, I pretty much get one-day shipping no matter what, whether it's a purchase or an RMA, so I'm not stuck with long downtimes.

So that being the case, I figure X58 Classified FTW, right?

Or should I just wait for Socket 2011 at the end of the year and not play Witcher 2?

I'm thinking I may do both. Get the Classified X58 now while it's on sale and give my new 590's some much needed space asap, and then go for the socket 2011 when it's out and proven.

But if anyone has any other suggestions, I'm all ears.

Oh, and for those Quad SLI users who want to give all four of their GPU's a workout constantly in a game? Witcher 2 with the new 1.3 patch...

Impressive. Most impressive.

Late!


----------



## RagingCain

Quote:


> Originally Posted by *Shinobi Jedi;14316209*
> Wow...
> 
> So, I started playing Witcher 2 tonight with the new patch and that game now max's out all four GPU's in Quad SLI almost constantly. Game runs fantastic with fps often in the triple digits, but with my GPU's going %90+ non stop, it wasn't long before GPU 2 was hitting 90c..
> 
> So far this is the only game that does this.
> 
> It looks like I'm gonna have to drop $320 on an EVGA X58 Classified FTW since it seems to be the best one for spacing the cards and keeping them both a full 16x.
> 
> I want to stick with EVGA because they hooked me up pretty well RMA wise before, I like the step-up program and 24/7 tech support and warranty (especially considering I have a dead dimm slot on my current 2yr old X58) - and because their corporate office and warehouse is in my area, I pretty much get one day shipping no matter what, whether it's a purchase or RMA so I'm not stuck with long down times.
> 
> So that being the case, I figure X58 Classified FTW, right?
> 
> Or should I just wait for Socket 2011 at the end of the year and not play Witcher 2?
> 
> I'm thinking I may do both. Get the Classified X58 now while it's on sale and give my new 590's some much needed space asap, and then go for the socket 2011 when it's out and proven.
> 
> But if anyone has any other suggestions, I'm all ears.
> 
> Oh, and for those Quad SLI users who want to give all four of their GPU's a workout constantly in a game? Witcher 2 with the new 1.3 patch...
> 
> Impressive. Most impressive.
> 
> Late!


Thanks for that; I have purposely held off from playing because of this.


----------



## Jeppzer

Guys/girls and its, what are your cards idle temps when you have your system on 24/7?


----------



## RagingCain

Quote:


> Originally Posted by *Jeppzer;14319697*
> Guys/girls and its, what are your cards idle temps when you have your system on 24/7?


32~37c for me (on water)

Why what are you getting?


----------



## Jeppzer

~40°C, 20 above ambient. CPU, MB and RAM idle around 30°C.

But if you almost hit 40 on water.. Then I'm not going to worry about it.


----------



## Tept

Shinobi, are you watercooling the cards?

The reason I ask is that when I play Witcher 2 on 1.3, I hit about 82°C on all-max settings with the fan at 90%, and I'm at 0.963v @ 725/1450/1850.

I've seen it tap 84°C, but only for seconds; then it runs back down to 82°C.

Ok, I see you're on air now, lol. That's pretty high. I'd suggest getting a good number of high-CFM fans on the side of the case. I run my comp in a room that stays about 77°F consistently and I NEVER see those kinds of temps, and atm my side fan is a weak, weak, weak 230mm fan that's like 90 CFM, if that. I'm getting ready to throw 4 110 CFM 120mm fans on the side. I'll let ya know if I see noticeable drops in temps in Witcher.


----------



## HOOPAJOO

Newegg has the 590's in stock again as of this moment. I just bought the EVGA version.

I'll be joining the club shortly.


----------



## Jeppzer

Quote:



Originally Posted by *HOOPAJOO*


Newegg has the 590's in stock again as of this moment. I just bought the EVGA version.

I'll be joining the club shortly.










Welcome to the green side!


----------



## RagingCain

Hey all, just giving you an early heads-up: I will be selling my R3E Black Edition instead of going to water cooling on it.

Giving you guys an early chance at taking it off my hands; it's been brand new for only a week or two since the RMA. (Saves me from making a for-sale thread.)


----------



## Robitussin

Quote:



Originally Posted by *RagingCain*


Hey all, just give you an early heads up, I will be selling my R3E Black Edition, instead of going to water cooling on it.

Give you guys an early chance at taking it off my hands, had a brand new one or two weeks after RMA. (saving me from making a for sale thread.)


What did you decide to go with instead? Or are you moving to a different socket? I'm RMA'ing my SLI LE. What kind of price are you looking for? With the Thunderbolt/NIC? GLWS


----------



## RagingCain

Quote:



Originally Posted by *Robitussin*


What did you decide to go with instead? Or are you moving to a different socket? I'm rmai'ng my sli le what kinda price r u looking for? With the thunderbolt/nic? Glws


Actually, I am going to be looking for a flat $400, going back to my WC EVGA E770 for the time being. No socket upgrade. I am not touching SB with a 10ft pole, especially since the new SB-Es are rumored to be on socket 2011; any IB or SB-E will kill SB - more so IB with the 3D transistors. But I will be getting the second-gen IBs, not the initial batch, so the IB-Es.

Yes, I am including the Thunderbolt/Killer NIC as well as all accessories; they are all included.


----------



## kevink82

Only Sandy Bridge-E is on socket 2011; Ivy Bridge is using 1155, and Gigabyte has already added BIOS support for Ivy Bridge on their boards. Here's a link to coolaler's ES Ivy Bridge CPU, and it ran fine with his Z68 UD3P.

http://forum.coolaler.com/showthread.php?t=269530


----------



## Robitussin

Quote:



Originally Posted by *RagingCain*


Actually, I am going to be looking for a flat 400$, going back to my WC EVGA E770 for the time being. No socket upgrade. I am not touching SB with a 10ft pole, especially since the new SB-Es are rumored to be on socket 2011, any IB or SB-E will kill the SB. More so the IB with the 3D transistors, but I will be getting second gen IBs, not the initial batch, so the IB-Es.

Yes I am including the Thunderbolt/Killer NIC as well as all accessories, they are all included.


I wonder, if I may, what your reasoning is behind such an assumption? Not to be frank, as I'm sure, based on your posts and rep, that you have been doing this longer than I have, but why did you decide to wait for IB-E rather than going SB-E/2011 as soon as it comes out? If you were in a predicament like mine, would you do something similar? As in, stay with 1366 rather than move to a socket with a longer perceived lifespan? Or are you saying you're comfortable skipping a generation based on your current system, in effect skipping 2011's first run, since the difference may or may not be negligible?

I for one am somewhat disappointed with buying into 1366 when I did; I only got a little use out of the socket compared to 1155 owners, but that's the way of things, I guess. Although quad-channel memory is intriguing, I think atm 1155 is the way to go for users looking for a new system. I would be interested in your opinion.

Also, great price on a top-of-the-line mobo. If someone were, as I may or may not be, in the market for a 1366 mobo, this will sell quickly with or without a FS thread. Again, GLWS.


----------



## vwmikeyouhoo

Finally flashed the BIOS on my 590 and did some overclocking, which was long overdue. So far so good @ 0.963v.


----------



## vwmikeyouhoo

Update on my 3DMark11 score. I'm pretty satisfied.


----------



## Fallendreams

Quote:


> Originally Posted by *RagingCain;14332255*
> Actually, I am going to be looking for a flat 400$, going back to my WC EVGA E770 for the time being. No socket upgrade. I am not touching SB with a 10ft pole, especially since the new SB-Es are rumored to be on socket 2011, any IB or SB-E will kill the SB. More so the IB with the 3D transistors, but I will be getting second gen IBs, not the initial batch, so the IB-Es.
> 
> Yes I am including the Thunderbold/Killer NIC as well as all accessories, they are all included.


Agree with you there. Good price. Only reason i went from 1366 (I7 930) to sandy bridge was because i7 2600k was only 140 dollars from Intel retail program. Couldn't pass that up.


----------



## Jeppzer

URL to results

3dmark vantage results


----------



## vwmikeyouhoo

This thing feels like it wants to keep going. Still only at 0.963v.


----------



## Uncivilised

I know this has been asked around a lot, but I may have a chance to purchase one of these and I am skeptical of the VRMs. How bad are they really? Will they blow at stock? How much is too much voltage for these cards?


----------



## Smo

Quote:


> Originally Posted by *Uncivilised;14349548*
> I know this has been asked around a lot but i may have a chance to purchase one of these but i am skeptical of the vrm's. How bad are they really? Will they blow on stock? How much is too much voltage for these cards


They're nothing like as bad as people are making out - and they most certainly won't blow at stock. RagingCain is doing his very best to provide us with modified and safely overclockable BIOSes in the flashing thread. It's a fantastic card (more so for single-screen gaming), just with currently poor driver support. On top of that, NVIDIA won't release the information required to push the cards to their maximum safely. Time will get us there though.


----------



## RagingCain

Quote:


> Originally Posted by *Robitussin;14332833*
> I wonder if I may what your reasoning is behind such an assumption? Not to be frank as I'm sure based on your posts and rep that you have been doing this longer then I have, but why did you decide to wait for IB-e rather then going sb-e/2011 as soon as it comes out? Would you if you were in such predicament as mine do something similar?


Yes, absolutely. Frankly, for you I wouldn't go for SB-E; I would get a 2500K (or a 2600K if you want 8 threads) and a mid-range motherboard to tide you over. I wouldn't stress too much.
Quote:


> As in stay with 1366 rather then move on to a socket with a longer perceived lifespan, or are you saying your comfortable skipping a generation, based on your current system, in effect skipping 2011's first run since the difference may or may not be negligible.


That's exactly why I don't need to upgrade: my current system. I am also waiting for a second-generation non-Fermi GPU so I can do a package deal all at once. I have been completely unimpressed by Fermi, and if it weren't for nVidia's driver support, I might have been lured back to ATI, although I am still pissed at them too.
Quote:


> I for one am somewhat dissapointed buying into 1366 when I did, I only got a little use out of the socket compared to 1155 owners but that's the way of things I guess. Although 4channel memory is intriguing I think atm 1155 is the way to go for users looking for a new system I would be interested in your opinion.


1155 is a dead technology, and no matter what SB users try to shove down your X58 throat, their socket is more than likely not getting any new CPUs, nor will the chipset handle the future i5/i7 CPUs, so there is no point in buying one now unless you really want one. Until recently, not only was the X58 CPU selection still great (meaning i7), but the chipset was superior, had more PCI-E lanes than you could shake a stick at, and triple-channel memory, which can come in handy if you like speed. These simple things hold the SB platform back.
Quote:


> Also great price on a top of the line mobo, if someone were, as I may/may not be, in the market for a 1366 mobo this will sell quickly with or without a FS thread. Again GLWS


I may or may not sell it now; ASUS personally contacted me and set up a T3 support call for today, for about 2 hours. I am supposed to list every single issue I have discovered with it. If that produces results, great; if not, this overpriced mobo is gone.

Not a single update in 3 months.
Quote:


> Originally Posted by *Uncivilised;14349548*
> I know this has been asked around a lot but i may have a chance to purchase one of these but i am skeptical of the vrm's. How bad are they really? Will they blow on stock? How much is too much voltage for these cards


Right now, as it stands, they seem to be holding up well. If you are very worried, get a card with a very good warranty, or avoid it altogether.

I was apprehensive for the first month, but I also benchmarked the hell out of it that month. The VRMs are an issue when doing extreme benchmarking/overclocking, not with your standard run-of-the-mill 10% overclock.

The highest voltage I managed to get was 1.063v, and that was at a frequency of 830~835 MHz, I believe. It was a fairly large overclock: 200 MHz over EVGA stock, or 223 MHz over stock 590s.


----------



## Uncivilised

Quote:


> Originally Posted by *RagingCain;14351257*
> Highest voltage I managed to get was 1.063v, and that was at a frequency of 830~835 MHz I believe. It was a fairly large overclock. 200 MHz over EVGA stock or 223 MHz over stock 590s.


Hmm, 1.063v actually isn't too low. 1.05v is usually what I would stick to anyway. Do all GTX 590s do 1.063v safely?


----------



## RagingCain

Quote:


> Originally Posted by *Uncivilised;14358196*
> Hmm 1.063v isnt actually too low. 1.05v is usually what i wud stick to anyways. Do all gtx 590's do 1.063v safely?


Well, I am on water cooling, so I felt like being a bit "edgy."

1.063v is the maximum allowable voltage on non-ASUS cards. I just used that as a frame of reference.

I believe a stock 580 is at 1.037v. It was a fun, albeit time-consuming, journey getting up there. I tried every 12/13mV step from the stock 0.913/0.925v all the way up to 1.063v, and every 5 MHz frequency step from 630 to 830 MHz.

This card has a habit of hard-crashing the computer without a BSOD, so it even took an OS re-install at one point.
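For a sense of the search space, the sweep above can be sketched as a simple grid (a minimal sketch only; the uniform 12.5 mV step approximates the alternating 12/13 mV steps described, and nothing here talks to an actual card):

```python
# Enumerate the (voltage, frequency) grid described above.
# 12.5 mV approximates the alternating 12/13 mV VID steps mentioned.
voltages = [round(0.913 + 0.0125 * i, 4) for i in range(13)]  # 0.913 .. 1.063 V
freqs = list(range(630, 835, 5))                              # 630 .. 830 MHz

print(len(voltages), voltages[-1])  # 13 voltage steps, ending at 1.063
print(len(freqs), freqs[-1])        # 41 frequency steps, ending at 830
print(len(voltages) * len(freqs))   # 533 combinations in the full grid
```

No wonder it was a timely journey: even testing only a fraction of those 500+ combinations for stability adds up fast.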


----------



## Masked

Nvidia also doesn't actually warranty the cards past 1.05v ~ that was in 3 or 4 press releases and repeated MANY times ad nauseam.

So if you intend on going past 1.05v, just remember your second core is actually at 1.07v, and that voltage has been proven to actually "blow your card up".

I've had 2 customers do it recently who actually got replacements, but they didn't realize core 2 was getting more juice... There is ALWAYS 1 core on this card that will get 0.01 - 0.02 more V.


----------



## Kosire

How is the 590 right now, with drivers and stuff...?
Thinking of getting it, since it's a tad quieter and cooler than the 6990.

Not planning on overclocking these at all; it's just for gaming maxed out @ 2560x1600.


----------



## Smo

Quote:


> Originally Posted by *Kosire;14376035*
> How is 590 right now? With drivers and stuff...?
> Thinking of getting it, since it's a tad quiter and cooler than 6990.
> 
> Not planning on overclocking these at all, just for gaming with max @ 2560x1600


It's not perfect, I must admit, but the only issues I've had are a couple of particle effects in Crysis (only when inside the alien ship) and one weird graphics glitch in BC2.

Apart from that, the performance on one screen is astounding. I don't regret buying it at all.

Plus, the new 28x.xx series of drivers is just around the corner.


----------



## kazukun

280.19-beta
Voltage adjustment is possible








http://3dmark.com/3dm11/1583539


----------



## rush2049

Downloading the beta now!!!!!!!

Voltage adjustment - OMG, yes......

But you know what would make me really happy? If they made it so that having a dedicated PhysX card doesn't cause Crysis 2 to flicker...... I will test that first and report back - if it works you might not hear from me for a while... lol


----------



## Wogga

*vwmikeyouhoo*, you're lucky =) mine doesn't want to work at [email protected] =( and can't even manage [email protected] =((((

280.19 seems to give a little performance boost


----------



## Smo

Thank god for that - can't wait to get home and upgrade!


----------



## Wogga

Damn! I even got +100 marks in Heaven.
FurMark no longer throttles GPU1 (I had that problem at any frequency besides stock)


----------



## Smo

Somebody please shed some light on this! Every time I try to install NVIDIA drivers, during the installation I get this;










The installation continues when I click OK, but the NVIDIA Control Panel doesn't work. If I try to run or start it I get this;










And whenever I try to uninstall I get this;










It's really starting to aggravate me - I want full functionality of my drivers!


----------



## Canis-X

Driver sweeper in safe mode maybe??


----------



## Smo

Quote:



Originally Posted by *Canis-X*


Driver sweeper in safe mode maybe??


Appreciate the reply mate - but I've already tried it.


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14380459*
> Somebody please shed some light on this! Every time I try to install NVIDIA drivers, during the installation I get this;
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The installation continues when I click OK, but the NVIDIA Control Panel doesn't work. If I try to run or start it I get this;
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And whenever I try to uninstall I get this;
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's really starting to aggravate me - I want full functionality of my drivers!


Easy. Do a complete uninstall from Windows and then run Driver Sweeper in safe mode, searching for NVIDIA drivers. Then, when everything is gone, reinstall the drivers with the Custom option and enable "clean install" again. I have a sneaky way of making it work if that fails.

@all
I received an anonymous tip yesterday that NVIDIA legal has informed corporate that their post-sale changes were in fact against consumer rights, and somebody (an unnamed business partner) has already filed suit privately in US litigation to undo the changes. NVIDIA has the right to make such changes, but must make them public and for cause.

I then saw the rumor that the 280 beta was due out yesterday and may have been pushed out early in response.

Again, please be careful. I have plenty of good OC guidelines, but I will need to redo the math all over again.

How does the PDL act so far?

Post powered by DROID X2


----------



## Smo

Quote:



Originally Posted by *RagingCain*


Easy. Do a complete uninstall from Windows and then run driversweep in safe mode searching for nvidia drivers. Then when everything is gone, re-install drivers with Custom choices and enable "clean" install again. I have a sneaky way of making it work if that fails.


I did as you said mate - when reinstalling (clean install) after running the sweeper I still get this;










Pops up when installing the graphics driver.


----------



## RagingCain

Quote:



Originally Posted by *Smo*


I did as you said mate - when reinstalling (clean install) after running the sweeper I still get this;










Pops up when installing the graphics driver.


Okay the sneaky way I mentioned of installing it is like this, and not guaranteed to be bug free.

You download the 275.50 drivers, then you download the 280.19 drivers, and copy the PhysX, Auto Update, Display.Driver... all the folders except the control panel folder INTO the 275.50 folder. All this means is that you will use the older control panel, but the newer drivers. Once everything is copied in and overwritten, you run the setup from the 275.50 folder. This is about the only way that will work, unless you try re-downloading the International version (English language choice).
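For anyone who wants to script that folder merge rather than do it by hand, a rough sketch in Python. The control-panel folder name (`NVCPL` here) and the package paths are assumptions — check the actual names in your extracted driver packages before running anything like this:

```python
import shutil
from pathlib import Path

def merge_driver_packages(new_pkg, old_pkg, skip="NVCPL"):
    """Copy every component folder from the new driver package into the
    old one (overwriting), except the control-panel folder named `skip`,
    so the older control panel ships with the newer driver files.
    Returns the names of the folders that were copied."""
    copied = []
    for folder in sorted(Path(new_pkg).iterdir()):
        if not folder.is_dir() or folder.name == skip:
            continue  # skip loose files and the control panel folder
        shutil.copytree(folder, Path(old_pkg) / folder.name, dirs_exist_ok=True)
        copied.append(folder.name)
    return copied
```

After the merge you would still run the installer from the old (275.50) package folder, exactly as described above.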

@All - the 590 is completely unmentioned in the release notes. If you want to see the serious number of changes in this driver, I have included it in a PDF file, or if you want a direct link (for you paranoid types) it's here:
http://us.download.nvidia.com/Window...ease-Notes.pdf

Edit 2:
Something is seriously wrong with these drivers! Unable to leave 540 MHz @ 0.88v. Anybody else monitoring voltage/clock changes in Afterburner? It's not going to 630 MHz... just 540 from idle.

Edit 3:
Okay, 3D applications use whatever it's set to, so it just won't go to P3 unless a 3D application is running. That's okay, I suppose.

Edit 4:
OMG, no more brute force PDL?! Whadafudge! Very exciting!


----------



## Canis-X

Well, at least they are addressing something that would help me out....LOL

Quote:



NVIDIA SLI
Enabled SLI technology on SLI‐certified motherboards with AMD chipsets: 990FX,
990X, and 970.


----------



## RagingCain

Quote:



Originally Posted by *Canis-X*


Well, at least they are addressing something that would help me out....LOL


Yay!


----------



## Smo

Quote:



Originally Posted by *RagingCain*


Okay the sneaky way I mentioned of installing it is like this, and not guaranteed to be bug free.

You download the 275.50 drivers, then you download the 280.19 drivers, and copy in the Physx, Auto Update, Display.Driver... all the folders except the control panel folder INTO the 275.50 folder. All this means is that you will use the older control panel, but the newer drivers. Once everything is copied in and overwritten, you then run the setup from the 275.50 folder. This is about the only way that will work, unless you try re-downloading the International version (English language choice.)


Thank you for trying to help, but I'm afraid that didn't work for me either


----------



## RagingCain

Quote:



Originally Posted by *Smo*


Thank you for trying to help, but I'm afraid that didn't work for me either










Go ahead and PM me everything you have tried so far, I will do my best to come up with other tricks.

Quote:



Originally Posted by *kazukun*


280.19-beta
Voltage adjustment is possible








http://3dmark.com/3dm11/1583539


Can't believe I didn't see this score... congratulations! That's unbelievable. 267.85 overclocking with 275.50 3DMark 11 performance enhancements.


----------



## kazukun

0.963v min - 1.05v max

The BIOS I used is the one you gave me


----------



## Tept

280.19 is being extremely picky. .963v @ 720MHz runs great: full performance, full time, even in Witcher 2, which is the most intensive game I've seen yet. 280.19 comes out and I try to flick the voltage to just .975, and after about 15 seconds in FurMark my FPS in the stability test drops from 51 to 40 and just sits there, and that's NOT adjusting the core clocks AT ALL, ONLY the core voltage from .963 to .975. At .988v the drop to 40 occurs in less time; at 1.0v+ it's always at 40, no matter what I set the core to.

Yes, I did adjust both core voltages in Afterburner, since it doesn't sync identical cards properly after a new version comes out. I'm very much hoping this is just an issue with MSI compatibility with the new version; however, I did look in NVIDIA Inspector and my cores and clocks are at the values I assign in Afterburner.

I do know my card is capable of more than .963 with a performance gain, because I rolled back to the 267.85 drivers, went all the way to 1.05v for kicks, and saw noticeable improvements on core clocks up to 805MHz.


----------



## Wogga

Oh, 51 fps in FurMark? I only got 34... am I doing something wrong?
Anyway, I was watching the voltage and frequency in Afterburner. I didn't notice any unnatural behaviour, except that my card wants way too much voltage, I think. 750 is barely stable @ 1.0v =(


----------



## Tept

FurMark is seriously picky. If I turn off my second monitor using Windows key+P, I drop into the 20s; as soon as I turn the second monitor back on, it jumps back up to 48-51. Also, I'm not feeling FurMark is worth a damn as a benchmark. On a setting where FurMark put me at 40fps, I turned around and tried the Heaven benchmark and scored noticeably higher than with the setting that was getting me 51fps in FurMark. I'm currently @ 755MHz @ 1.0v and I'm done tweaking for tonight.


----------



## Tept

So, anybody know if it's just the way it works or if I'm doing something wrong? No matter how high I get my core stable, my highs get higher but my lows stay right about the same. For example, in parts of 3DMark Vantage, if I go with a low core, say 665MHz, my peak fps at a particular scene is, say, 160fps, and my lowest at a specific point is 67. Now if I crank up to 720MHz, the peak fps in that one scene goes up to 180, but my lowest is still 67.

Particular symptom of something or is it just how it goes?


----------



## Robitussin

I don't read this thread enough anymore. I may have to check out these beta drivers if they are allowing voltage control again, although I may just wait for the full release, as long as it doesn't disappear again :/


----------



## Opp47

Here's some pics of my Hydro Coppers... hope you guys like them
PROOF:


----------



## RagingCain

Lol I love that system, reminds me of someone else's....


----------



## Opp47

Quote:


> Originally Posted by *RagingCain;14398541*
> Lol I love that system, reminds me of someone elses....


----------



## ReignsOfPower

280.18 + BIOS Fix @ 720Core, 1900 Memory









Original Bench 270.51's @ Stock









Things certainly evolved quite a bit with some tweaking (overclocking on the 270.51s with the funny voltages pretty much got me nowhere but crashes back then): a ~17% increase in performance.


----------



## Disturbed.TV

so good....


----------



## Tept

So anybody else getting wildly erratic FPS and Stuttering or lunging, not sure what to call it, during benchmarks at anything over .975v?


----------



## RagingCain

Quote:


> Originally Posted by *Tept;14408367*
> So anybody else getting wildly erratic FPS and Stuttering or lunging, not sure what to call it, during benchmarks at anything over .975v?


That would be the Power Draw Limiter activating.

Millisecond-scale changes between your OC and 540 MHz.
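If you log core clocks (Afterburner's logging, for example), those millisecond dips can be spotted after the fact. A minimal sketch, assuming a list of sampled clocks in MHz and treating anything near 540 MHz as the throttle state — the clock values and tolerance are illustrative, not measured:

```python
def pdl_throttle_events(clock_samples_mhz, throttle_clock=540, tol=5):
    """Count drops into the ~540 MHz throttle state in a series of
    sampled core clocks. Returns (event_count, fraction_throttled)."""
    events = 0
    throttled_samples = 0
    prev_throttled = False
    for mhz in clock_samples_mhz:
        throttled = abs(mhz - throttle_clock) <= tol
        if throttled:
            throttled_samples += 1
            if not prev_throttled:
                events += 1  # new dip started on this sample
        prev_throttled = throttled
    frac = throttled_samples / len(clock_samples_mhz) if clock_samples_mhz else 0.0
    return events, frac
```

A high event count with a low throttled fraction would match the "millisecond changes" behaviour described above, as opposed to the card being pinned at 540.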


----------



## Tept

That's what I assumed, but neither MSI Afterburner nor Inspector showed a change in my power state, so I wasn't sure if it was just happening too quickly or if NVIDIA was trying to trick us lol.


----------



## HOOPAJOO

Sign me up.









My mid tower cable management looks atrocious so I don't want to take a pic of that. I've got a Lian-Li A77F full tower in the mail though.









So far I'm pretty impressed with the card. Build quality is top notch. The card is very quiet... it doesn't seem any louder than the Radeon 5770 it replaced. It gets quite hot with the default fan settings, but I've enabled a custom fan profile in EVGA Precision that keeps it a bit cooler. I've only played BF2 and a Source engine game (Fortress Forever) on it so far, but frame rates are of course extremely high with everything maxed. I bought this thing in preparation for BF3, so... I'm sure it won't let me down.
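For what it's worth, a custom fan profile like the one in EVGA Precision is just a piecewise-linear temperature-to-duty-cycle curve. A minimal sketch of the idea — the curve points here are made-up examples, not Precision's defaults:

```python
def fan_speed(temp_c, curve=((40, 40), (60, 55), (75, 70), (85, 100))):
    """Map GPU temperature (C) to a fan duty cycle (%) by linearly
    interpolating between (temp, fan%) points, clamped at both ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]
```

Steeper segments at the hot end of the curve trade noise for headroom, which is essentially what a "keeps it a bit cooler" profile does.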


----------



## RagingCain

Quote:


> Originally Posted by *Tept;14409307*
> That's what I assumed but MSI nor inspector showed a change in my power state so wasn't sure if it was just happening too quickly or if nvidia was trying to trick us lol.


Both









Believe it or not the PDL doesn't seem that aggressive to me.


----------



## Tept

I'm considering trying watercooling soon, for the hell of it, because this is so irritating to me. The 3DMark11 and Heaven benchmarks see an almost 25fps bounce when the PDL is at work, which is just insane to me, as I've gotten my card's temps down to 72C under the hardest benchmarks.

Doing 700MHz @ .950v is good enough for me lol. Picking up a triple rad, Laing pump, tubing and clamps for $80. Somehow I don't think cooling it is going to help, because if I let the card cool at 0% load under a 90% fan and then load it up with .988v @ 740MHz, it starts lunging at particular scenes whether the temps are 50C or 70C. I can then bake the damn card with Witcher 2 to 74C, to make sure the heat is nice and soaked through the card, with it at .950v @ 700MHz, and come out without a single lunge or stutter.

I'm gonna just deal with it, probably, and cross my fingers very firmly in hopes the GTX 600s won't be as much of a disappointment as these were.


----------



## RagingCain

Quote:


> Originally Posted by *Tept;14418176*
> I'm considering trying watercooling here soon for the hell of it because its so irritating to me. 3dmark11 and heaven benchmarks see an almost 25fps bounce when the PDL is at work. Which is just insane to me as I've gotten my temps to my card down to 72c under the hardest benchmarks.
> 
> Doing 700mhz @ .950v is good enough for me lol. Picking up a triple rad, laing pump, tubing and clamps for $80. Somehow I don't think cooling it is going to help because if I let the card cool at 0% under 90% fan and then load it up with .988v @ 740mhz, it starts lunging at particular scenes whether the temps are 50c or 70c. I can then bake the damn card with Witcher 2 to 74c to make sure the heat is nice and soaked through the card with it at .950v @ 700mhz and come out and get not a single lunge or stutter.
> 
> I'm gunna just deal with it probably and cross my fingers very firmly in hopes GTX 600's won't be as much of a disappointment as these were.


I have a feeling the same driver issues and the PDL are the way of the future. I am not suggesting the red team, but at least wait till after release for sure. Don't beat yourself up; none of us had any idea they were going to do this, and many people still *don't* know.


----------



## Kvjavs

Is anyone running this card with an AMD processor? If so, how significant is the bottleneck, if any?


----------



## Tept

And the 590 takes a dump overnight for no apparent reason. .975v, 710MHz; I was watching a movie last night, did a shutdown, went to bed, and got up today, powered it up, and a red light was on over the PCI part of the motherboard. Switched the card to the second PCI-E x16 slot to no avail. So, just to make sure it wasn't my PSU or mobo, I went and picked up a 560 from Best Buy and plugged it in, and here I am. Yay!


----------



## XXXfire

Quote:



Originally Posted by *Tept*


And the 590 takes a dump overnight for no apparent reason. .975v 710mhz, was watching a movie last night, did a Shut down, went to bed and got up today and powered it up and a red light was on the PCI part of the motherboard. Switched card to second pci-e 16 slot to no avail. So just to make sure it wasnt my PSU or mobo I went and picked up a 560 from best buy and plugged it in and here I am. yay!.


Dang man, my condolences. Get involved in a fast RMA buddy, maybe time your WC block with expected arrival of your replacement board & hopefully you won't suffer too much downtime.


----------



## Tept

LoL. Sure wish the 590 could get the MHz:mV ratio this 560 is getting. Wouldn't that be an amazing card lol.

Currently @ 1050MHz @ 1.050v, stable.

Putting a 590 @ 975MHz @ .975v would cause me to need to clean my keyboard.

Edit:
Correction: had to back it down to 1000MHz. It seemed stable in FurMark; however, going into SC2 it seemed to wanna have a breakdown. But my point still stands, [email protected] would be plenty fine =P


----------



## Smo

Quote:



Originally Posted by *Tept*


And the 590 takes a dump overnight for no apparent reason. .975v 710mhz, was watching a movie last night, did a Shut down, went to bed and got up today and powered it up and a red light was on the PCI part of the motherboard. Switched card to second pci-e 16 slot to no avail. So just to make sure it wasnt my PSU or mobo I went and picked up a 560 from best buy and plugged it in and here I am. yay!.


Ouch, sorry to hear it mate!


----------



## Masked

Quote:



Originally Posted by *Tept*


And the 590 takes a dump overnight for no apparent reason. .975v 710mhz, was watching a movie last night, did a Shut down, went to bed and got up today and powered it up and a red light was on the PCI part of the motherboard. Switched card to second pci-e 16 slot to no avail. So just to make sure it wasnt my PSU or mobo I went and picked up a 560 from best buy and plugged it in and here I am. yay!.


And it's those situations where I thank god I bought EVGA!


----------



## RagingCain

590 vs 6990 has begun!

http://www.overclock.net/graphics-cards-general/1043957-friendly-ocn-competition-590-vs-6990-a-5.html#post14435791


----------



## Tept

Quote:


> Originally Posted by *Masked;14435789*
> And it's those situations where I thank god I bought EVGA!


It's warrantied. Same difference to me.


----------



## Canis-X

Good luck with the RMA process. I dealt with them for roughly 6-7 months.

Bits of advice that I learned:

Call their customer care dept (Direct line --> 510-818-4877) not their support group.
Ask them to fully bench and stress test the card that they will be sending you first, trust me.
ASUS USA's support team consists of the following members:

Channel (supervisor)
Eric
Trinity
Aaron
Martin
I talked to all of them; they are all very nice people, but make sure that you contact them at least daily to keep them on track, otherwise it will take forever, and they are terrible at returning calls.


----------



## Fallendreams

Quote:


> Originally Posted by *HOOPAJOO;14411530*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sign me up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My mid tower cable management looks atrocious so I don't want to take a pic of that. I've got a Lian-Li A77F full tower in the mail though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So far I'm pretty impressed with the card. Build quality is top notch. The card is very quiet... Doesn't seem any louder than the Radeon 5770 it replaced. It gets quite hot with the default fan settings, but I've enabled a custom fan profile in EVGA precision that keeps it a bit cooler. I've only played bf2 and a source engine game (fortress forever) on it so far, but frame rates are of course extremely high with everything maxed. *I bought this thing in preparation for BF3 so... I'm sure it wont let me down though.*


Well, while playing the alpha last week with both the 275 and 280 drivers, with only one core (not SLI) at 1080p with high settings, I was running a minimum of 70-80fps. That's with 16x AF and 1x AA, but with only one core running at 630MHz, that's freaking awesome I think.


----------



## vwmikeyouhoo

finally broke 11k graphics score with 1 card







I am pretty satisfied. Tempted to buy another one...










Shall I post this in the 6990 vs 590 thread?


----------



## rush2049

yes


----------



## ecnelitsep

Just ordered an EVGA 590, guys. What's the best waterblock to go with? Anyone on water?


----------



## vwmikeyouhoo

Quote:


> Originally Posted by *ecnelitsep;14439286*
> Just ordered an EVGA 590 guys. Whats the best waterblock to go with? Anyone on water?


I currently have an EK block, no issues here, although there have been some problems with some other EK blocks for other cards. Koolance makes a good block for these as well.


----------



## ecnelitsep

Quote:


> Originally Posted by *vwmikeyouhoo;14439527*
> I currently have a EK block, no issues here although there has been some problems with some other ek blocks for other cards. Koolance makes a good block for these as well.


What about the Heatkiller block?


----------



## vwmikeyouhoo

Quote:


> Originally Posted by *ecnelitsep;14439548*
> What about the Heatkiller block?


I can't imagine a heatkiller block being subpar. Either of these make great choices and will improve cooling by a ton.


----------



## ecnelitsep

Quote:


> Originally Posted by *vwmikeyouhoo;14439626*
> I can't imagine a heatkiller block being subpar. Either of these make great choices and will improve cooling by a ton.


Thanks mikey. I have till 6pm to order from Frozencpu so it'll get here tomorrow so i'll search and decide.


----------



## Juggalo23451

Quote:



Originally Posted by *ecnelitsep*


Thanks mikey. I have till 6pm to order from Frozencpu so it'll get here tomorrow so i'll search and decide.


Koolance water blocks would be the best choice


----------



## RagingCain

Quote:



Originally Posted by *Juggalo23451*


Koolance water blocks would be the best choice


He is biased! Go for those sexy Heatkillers! They ooze WIN all over them!

P.S. Just got my voice mail. My GPUs are at 0.975 too; it looks like a lot of people got voltage increases, and not just 590s: HeavyHemi started a thread on EVGA about his 580s going up 12mV too. Our BIOS says 0.963v. There must have been some issue noticed by NVIDIA.


----------



## ecnelitsep

Quote:


> Originally Posted by *Juggalo23451;14440254*
> Koolance water blocks would be the best choice


Based on performance or opinion?


----------



## Fallendreams

Another ManuelG update
Quote:


> Quick update on the changes in the way the voltages are being reported in GPU utilities. This is not a bug and actually corrects the way the voltage is communicated to 3rd party GPU utilities. The older drivers (pre-R280) had a couple bugs which didn't round the voltages in the Perf and CVB tables to the board values in the SW caches of the Perf and CVB tables. When those values were actually applied to the HW, they were rounded up to the correct board values. The new 280.19 driver fixed that issue to make sure that the code only cached the true board voltages. However, in both cases the values which are applied to the GPU are the same so the only difference is the voltage that is displayed to the user. The actual GPU voltage has not changed from 275.50 to 280.19. We have verified this ourselves.


http://forums.nvidia.com/index.php?showtopic=206522&view=findpost&p=1273743
Quote:


> After discussing the changes in the GPU performance mode transitions with the software team, we have determined the behavior of the NVIDA GPUs with the latest R280.19 drivers to be working as intended. NVIDIA display drivers before R280.19 were slow to respond to GPU utilization changes and as a result would negatively affect GPU performance. With the changes in the new R280.19 drivers, the GPU will now be able to react faster to any sudden changes in GPU usage which can be seen in more frequent increases in GPU clock speeds and faster decreases to a lower level GPU clock speed. Throttling of the GPU clock speeds is similar to the changing of the CPU clock speeds based on CPU usage that users have been accustomed to for years in their PCs. If any of you notices a drop in GPU performance as a result of these changes, please let me know so that I can ask our software team to investigate your issue. I've mentioned one users complaint about it causing the fan to spin up and down on a laptop and software has agreed to make some changes to address this but this will likely not come until the September driver release. Also, the behavior The Hunter is seeing on his system after waking up from sleep is not expected behavior and we have not been able to reproduce it in house. We are getting further info from The Hunter.


http://forums.nvidia.com/index.php?showtopic=206522&view=findpost&p=1274087


----------



## Tept

Quote:



Originally Posted by *Canis-X*


Good luck with the RMA process. I dealt with them for roughly 6-7 months.

Bits of advice that I learned:

Call their customer care dept (Direct line --> 510-818-4877) not their support group.
Ask them to fully bench and stress test the card that they will be sending you first, trust me.
ASUS USA support team consists of the following memebers:
Channel (supervisor)
Eric
Trinity
Aaron
Martin
I talked to all of them, they are all very nice people but make sure that you contact them at least daily to keep them on track, otherwise it will take forever and they are terrible for returning calls.


Fortunately I went through a vendor, so I don't have to worry about dealing with ASUS. Unfortunately this vendor doesn't really offer the stress/bench before sending; however, if for some reason I dislike the card or have issues, it's only a 48-hour process to have another sent out, and they do Advance RMA, where they send the new card to me before I send the old card back. So, zero downtime.


----------



## max883

P12140 in 3DMark 11. The new drivers for the GTX 590 are sweet









http://3dmark.com/3dm11/1487610;jses...AkZL6yBq5WkHxL

Has anyone got a better score than me with one card?


----------



## vwmikeyouhoo

Quote:



Originally Posted by *max883*


P12140 3dmark the new drivers for GTX-590 are sweeat









http://3dmark.com/3dm11/1487610;jses...AkZL6yBq5WkHxL

someone got a better score than me with one card?


Result not found.


----------



## Tept

So, I plugged the 590 back in with the 560 so I could attempt to flash the stock BIOS back onto it and............... the card is working again. LOL.

Edit:
Ok, upon plugging my 32" and 27" into the 590, it began doing its emo crap again. However, if I unplug the 32" from the card and just boot up with the 27", the card does work. Just a faulty card, or is it possible this 32" LG strains the card or something?

Edit 2:
Upon more investigation I found it's the specific DVI port the 32" was plugged into. I moved it to the other available port and it's now working with dual monitors again.

Definitely still an obvious RMA, but I'm wondering if there's any possibility that this 32" is the reason the port fried. Possible? Likely?


----------



## Juggalo23451

Quote:



Originally Posted by *ecnelitsep*


Based on performance or opinion?










Performance, and advice from a mod on waterblocks. Koolance generally cools the VRMs better.


----------



## ecnelitsep

Quote:



Originally Posted by *Juggalo23451*


Performce and advice from a mod on waterblocks. Koolance has generally cools the vrms better.


Glad I followed your advice. I also went with the Koolance.


----------



## RagingCain

Hey Kazukun,

How high do you think you can get your 3DMark11 score... thinking about joining my 590 vs 6990 thread?

#1 Pair of 590s in 3DMark 11:
http://3dmark.com/3dm11/1612522








It's about time; there's no point running an OC thread if you've got no meat behind your words.

Apparently it was Kaz, so a late congrats to him. I know I don't follow it like I should, so Kaz definitely deserves recognition!


----------



## rush2049

I am pushing my overclock with voltages now.
(running the 280.19 drivers, unmodded bios)

Since I am only on air my temps are high... getting like 78-81 degrees while running Heaven 2.5 on a loop for 30 minutes.

So far I am at 690mhz on the core, but I am keeping the mem at stock for now.

edit: 950 mV: 720, 725, 730 MHz unstable.
edit: It took 950 mV to get 710 MHz stable.
It took 950 mV to get 690 MHz stable.
It took 938 mV to get 670 MHz stable.
It took 925 mV to get 650 MHz stable.

I am still pushing it... maybe I can get to 725? But the temperature limit I am setting for myself is 85C under a stress test... I don't want to get any hotter, because it will also go higher when I am taxing everything instead of just the video card.


----------



## rush2049

I don't know if this has been discussed, but the voltage adjustment seems to snap to certain positions now....

925, 938, 950, 963 mV. I think this is what was addressed in the changes the 280 family made to voltage adjustment and reporting.
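The snapping described above (925, 938, 950, 963 mV) is consistent with a ~12.5 mV grid reported in whole millivolts, which would also explain the alternating 12/13 mV gaps. A sketch of that quantization — the range limits and step size here are assumptions for illustration, not values from NVIDIA:

```python
def snap_voltage(requested_mv, vmin=925, vmax=1050, step_uv=12500):
    """Snap a requested core voltage (mV) to the nearest step on an
    assumed 12.5 mV grid, clamped to the adjustable range, and report
    it in whole millivolts (producing the 12/13 mV alternation)."""
    requested_uv = max(vmin * 1000, min(vmax * 1000, requested_mv * 1000))
    steps = round((requested_uv - vmin * 1000) / step_uv)
    snapped_uv = vmin * 1000 + steps * step_uv
    return int(snapped_uv / 1000 + 0.5)  # round half up to whole mV
```

Under these assumptions, requesting any value between two steps simply lands on the nearest one, which matches sliders that "snap to certain positions".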


----------



## RagingCain

Quote:



Originally Posted by *rush2049*


I don't know if this has been discussed, but the voltage adjustment seems to snap to certain positions now....

925, 938, 950, 963 mV's I think this is what was addressed in the changes that the 280 family made to voltage adjustment and reporting.


It's always done this. It's either 12 or 13mV increments. Same as the 580s.


----------



## rush2049

Oh, well then pardon me for being ignorant of the fact till now. The last time I overclocked an NVIDIA part was the 7800 GTX. I ran, and still do run, my GTX 275 at stock.


----------



## RagingCain

Quote:


> Originally Posted by *rush2049;14455975*
> Oh, well then pardon me for being ignorant of the fact till now. Last time I overclocked an nvidia part was the 7800 gtx. I ran and still do run my 275 gtx at stock.


Sorry, I was in a hurry and didn't mean to sound mean. I meant to say that the 590's voltage has always increased this way.









Post powered by DROID X2


----------



## Wogga

Yesterday, running 3DMark11 at different voltages and frequencies made me sad =( Even at 733 the PDL ruins the graphics score, and raising the mem frequency above 1900 leads to instability.
The highest graphics score was 11195 for one card at 730/1460/[email protected]. Maybe today I'll try to downclock the CPU from 5.0 to something like x44 but with a higher BCLK.
I wanted to participate in the 590 vs 6990 competition but found that my 3DMark11 doesn't want to verify because of problems with the reg key =) If the Basic Edition can, I'll try it.


----------



## RagingCain

Quote:


> Originally Posted by *Wogga;14457237*
> yesterday running 3DMark11 on different voltages and freq have made me sad =( even at 733 PDL ruins graphics score and rising mem freq above 1900 lead to unstability.
> the highest graphics score was 11195 for one card with 730/1460/[email protected] maybe today i'll try to downclock CPU from 5.0 to something like x44 but with higher BCLK.
> i wanted to participate in 590vs6990 competition but found that my 3DMark11 dont want to verify because of problems with reg key =) if basic edition can, i'll try it


The AMD users are saying that 3DMark11 is bugged (which I think is just people on Sandy Bridge having lower scores than 980X/990Xs), so I might be swapping 3DMark11 for Crysis and Metro 2033. I really wanted to do Heaven, as we didn't have PDL issues back in the 267.85 days.


----------



## Cociotfp

Is this where I need to post to get into the GTX 590 club?

http://imageshack.us/photo/my-images...590resize.jpg/

http://imageshack.us/photo/my-images/717/gtx590.jpg/


----------



## Canis-X

^ YEEEEeeeeeeEEEEP!


----------



## Smo

Quote:



Originally Posted by *Canis-X*


^ YEEEEeeeeeeEEEEP!










That's what I thought!


----------



## RagingCain

Well, it looks like Heaven lets you go past 725 MHz, if any of you are interested in benchmarking with 590s. My computer is exhausted. I could not reach 800 MHz with 100% stability. I assume these drivers (280.19) are more GPU intensive, in a good way. I do not doubt that the 775~790 MHz I was able to achieve is much more stable/powerful than 830 MHz on 267.85.


----------



## Smo

I've noticed that my second GPU is constantly sitting at 630MHz core - is this normal? The first GPU is at 51MHz.


----------



## RagingCain

Quote:



Originally Posted by *Smo*


I've noticed that my second GPU is constantly sitting at 630MHz core - is this normal? The first GPU is at 51Mhz.


It might be; it depends on what application is running (websites with Flash, or GPU-Z/NVIDIA Inspector). Sometimes something is open and you don't know it.

I would think it's not normal.


----------



## Smo

Quote:



Originally Posted by *RagingCain*


It might be, depends on what application is running (websites with flash or GPU-z/NInspector) sometimes its something open you don't know it.

I would think it was not normal.


Truth be told mate, I'm a bit pissed off with this card, and NVIDIA. I should have bought a 6990.


----------



## Fallendreams

Quote:


> Originally Posted by *Smo;14465769*
> Truth be told mate, I'm a bit pissed off with this card, and NVIDIA. I should have bought a 6990.


You would have the same problem. I had a 6970 going in and out of 3D clocks while just sitting at the desktop, or in Firefox with hardware acceleration off. Here, check this thread: http://www.overclock.net/amd-ati/1051386-new-amd-got.html.

Sent from my Liberty using Tapatalk


----------



## Tept

I get 200 points less in 3DMark11 @ 725MHz @ .975v than I do @ 720MHz @ .963v. Something just seems wrong with that. Frustrating. I even see the stutter in Heaven; it just doesn't seem quite as bad.


----------



## Tept

On another interesting note, which I'm kind of happy about: the vendor I got my ASUS card from is out of ASUS 590s, so they are giving me an EVGA instead lol. As a replacement for my burnt-out card, that is.


----------



## Masked

Quote:



Originally Posted by *Smo*


Truth be told mate, I'm a bit pissed off with this card, and NVIDIA. I should have bought a 6990.


I have significantly more problems with my 6990's than I do my 590's.

At the end of the day, the 590 will eventually be addressed and the choke hold will be released.

I customized my drivers this go and while I wouldn't exactly call myself happy, I'm content.

That's a hell of a lot better than having an issue and AMD telling you "our driver is perfect".

Trust me, the 590 will grow into a good investment.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked;14473035*
> I have significantly more problems with my 6990's than I do my 590's.
> 
> At the end of the day, the 590 will eventually be addressed and the choke hold will be released.
> 
> I customized my drivers this go and while I wouldn't exactly call myself happy, I'm content.
> 
> That's a hell of a lot better than having an issue and AMD telling you "our driver is perfect".
> 
> Trust me, the 590 will grow into a good investment.


Nice!

That makes me even happier that I pulled the trigger on a second one


----------



## Smo

Quote:


> Originally Posted by *Masked;14473035*
> I have significantly more problems with my 6990's than I do my 590's.
> 
> At the end of the day, the 590 will eventually be addressed and the choke hold will be released.
> 
> I customized my drivers this go and while I wouldn't exactly call myself happy, I'm content.
> 
> That's a hell of a lot better than having an issue and AMD telling you "our driver is perfect".
> 
> Trust me, the 590 will grow into a good investment.


I really hope you're right bud - I'm getting towards the end of my patience here.

I still don't quite know how to sort my second GPU being at maximum clock speed all the time - it seems to have happened ever since plugging in my second XL2410T.


----------



## Tept

Quote:


> Originally Posted by *Smo;14475230*
> I really hope you're right bud - I'm getting towards the end of my patience here.
> 
> I still don't quite know how to sort my second GPU being at maximum clock speed all the time - it seems to have happened ever since plugging in my second XL2410T.


It happens with dual monitors: GPU 2 will never go idle if you have two monitors. It will always be in power state P0.

Try flipping your monitors around between the DVI ports on the card - instead of having them both plugged into A, try A and B, then A1 and B, then A2 and B. Certain combinations will leave GPU 1 stuck in P0 all the time instead of GPU 2. It's just the way it works.
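Side note: if you'd rather not eyeball Afterburner to see which GPU is stuck at full clocks, driver bundles that include `nvidia-smi` can report the performance state per GPU. Below is a rough sketch that parses its CSV output; the sample string and the exact P-state values shown are made up for illustration, so treat them as assumptions about what your system would report:

```python
# Rough sketch: read per-GPU performance states from nvidia-smi's CSV output
# to see which GPU is held in P0. The sample string below is hypothetical;
# on a real system the subprocess call is used instead.
import subprocess

def query_pstates(sample_output=None):
    if sample_output is None:
        # query GPU index and current pstate as headerless CSV
        sample_output = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=index,pstate", "--format=csv,noheader"],
            text=True,
        )
    states = {}
    for line in sample_output.strip().splitlines():
        index, pstate = (field.strip() for field in line.split(","))
        states[int(index)] = pstate
    return states

# Hypothetical output for a GTX 590 with the second GPU stuck at full clocks:
sample = "0, P12\n1, P0"
print(query_pstates(sample))  # {0: 'P12', 1: 'P0'}
```

Any GPU reporting `P0` at idle is the one being kept awake by a monitor.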


----------



## Smo

Quote:


> Originally Posted by *Tept;14476451*
> It happens with dual monitors: GPU 2 will never go idle if you have two monitors. It will always be in power state P0.
> 
> Try flipping your monitors around between the DVI ports on the card - instead of having them both plugged into A, try A and B, then A1 and B, then A2 and B. Certain combinations will leave GPU 1 stuck in P0 all the time instead of GPU 2. It's just the way it works.


Oh right! I had no idea, please forgive my ignorance. + rep


----------



## Tept

Quote:


> Originally Posted by *Smo;14476967*
> Oh right! I had no idea, please forgive my ignorance. + rep


Don't blame ya for being irritated with it, it's silly IMO. If one GPU can run one monitor at a lower power state, why can't two GPUs run two monitors at a lower power state?


----------



## RagingCain

I didn't know that about dual monitors, hmmm. Good to know Surround idles, which seems unfair for you guys.

Just so you guys know, I have already started posting Heaven/3DMark 11 benches in the 590 vs 6990 thread. Metro 2033 is up next (perhaps this evening); any of you are more than welcome to enter - all the instructions are in the main post of that thread. All I ask is that you submit benches in the same format as mine. I don't know why, but the 6990 guys seem really reluctant, even though they are going to win anyway.

Post powered by DROID X2


----------



## Tept

P63,310 3D Vantage http://3dmark.com/3dmv/3372045


----------



## Recipe7

Got my gtx590 in a few days ago:


----------



## Tept

Anyone looking to pick up a 590 (or another 590): MobilePC.com has an open-box EVGA Classified with warranty for $600. I'm about a mm from picking it up myself.


----------



## ecnelitsep

Quote:


> Originally Posted by *Tept;14486990*
> Anyone looking to pick up a 590 (or another 590): MobilePC.com has an open-box EVGA Classified with warranty for $600. I'm about a mm from picking it up myself.


If it was a serial-based warranty like ASUS's, I would buy it in a heartbeat, but with EVGA it's already been registered, so the next owner is SOL.


----------



## RagingCain

I am done benchmarking Metro2033 for the most part, I thought you guys would like to know this:

Metro2033 Run
CPU 4.4 / QPI 3.6
DRAM 1600 / Uncore 3.6 / PCI-E 115
GTX 590 SLI - *870-1740-927*
GPU Voltage: 1.050
Driver Version: 280.19
Avg. FPS: 85.67
Proof:

















I actually think I overclocked the memory too high; the 870/875 core was rocking pretty nicely. See what our cards can do with no PDL or decent cooling?



----------



## Smo

Nice scores mate!


----------



## 2010rig

Quote:


> Originally Posted by *RagingCain;14489761*
> I am done benchmarking Metro2033 for the most part, I thought you guys would like to know this:


Good job RC, that's a crazy overclock you managed there.









Now I hope we can get some submissions from 6990 users to see how these cards truly compare, rather than the gimped-down cards used in the reviews.


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14489981*
> Nice scores mate!


Quote:


> Originally Posted by *2010rig;14491022*
> Good job RC, that's a crazy overclock you managed there.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now I hope we can get some submissions from 6990 users to see how these cards truly compare, rather than the gimped-down cards used in the reviews.


I appreciate it guys. I really don't feel like doing Crysis next... I feel like doing something fun and new, I am just not sure what yet though.

Someone requested Dirt 3, I am very tempted to test Crysis 2...

I guess I will ask in that thread.


----------



## 2010rig

Quote:


> Originally Posted by *RagingCain;14491206*
> I appreciate it guys. I really don't feel like doing Crysis next... I feel like doing something fun and new, I am just not sure what yet though.
> 
> Someone requested Dirt 3, I am very tempted to test Crysis 2...
> 
> I guess I will ask in that thread.


Do whichever one you want; you're pretty much the only one submitting right now anyway.


----------



## Jeppzer

Oh man, max temp of 54, that's really impressive!


----------



## RagingCain

Quote:


> Originally Posted by *2010rig;14491315*
> Do whichever one you want; you're pretty much the only one submitting right now anyway.


Very tempted to do Dirt 3 after setting it up. I ran it for the first time since I bought it in the July sales. The game looks gorgeous.

Crysis/Warhead/2, Dirt 3, and Aliens vs. Predator are already installed/patched ready to be benched.
Quote:


> Originally Posted by *Jeppzer;14491331*
> Oh man, max temp of 54, that's really impressive!


Yep, that's water cooling. She's a whore to set up right, but it really lets you stand back and go... gee, that was kind of worth it. I think I need another 360 radiator installed to keep temperatures down for extended benchmarking and bring the maximum temperature to sub-45°C.


----------



## Smo

Apologies for my crap videos - I had to prop my phone up with an HDMI-DVI adapter! I wanted to show you guys what's happening with my cards in the hope that I'm not an isolated case!

This first video is roughly a 2 and a half minute clip of MSI afterburner while playing a short section of Crysis v1.21. It shows the lack of uniformity when it comes to my GPU usage - notice at 1:51 it drops to 0%!





http://www.youtube.com/watch?v=r8PmHmy6Q1A

This second video is of what I see while playing - granted it's difficult to see when viewed on Youtube, but the gameplay is sluggish and choppy, totally ruining the experience. I suggest bumping it up to 1080p full screen and noting 18-30 seconds and 59-1:03 where it is much more obvious. I also apologise for my first kill-fail













http://www.youtube.com/watch?v=VL1hYwRr9fw

Is anyone else experiencing this?


----------



## Tept

Quote:


> Originally Posted by *Smo;14494302*
> Apologies for my crap videos - I had to prop my phone up with an HDMI-DVI adapter! I wanted to show you guys what's happening with my cards in the hope that I'm not an isolated case!
> 
> This first video is roughly a 2 and a half minute clip of MSI afterburner while playing a short section of Crysis v1.21. It shows the lack of uniformity when it comes to my GPU usage - notice at 1:51 it drops to 0%!
> 
> 
> 
> 
> 
> 
> This second video is of what I see while playing - granted it's difficult to see when viewed on Youtube, but the gameplay is sluggish and choppy, totally ruining the experience. I suggest bumping it up to 1080p full screen and noting 18-30 seconds and 59-1:03 where it is much more obvious. I also apologise for my first kill-fail
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is anyone else experiencing this?


I'm not sure about Crysis's efficiency with SLI, so I can't comment on that, but one thing I noticed is that you should get on MSI Afterburner 2.2.0 Beta 5. Also, is it set to Prefer Maximum Performance in the NVIDIA Control Panel?


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Smo;14494302*
> Apologies for my crap videos - I had to prop my phone up with an HDMI-DVI adapter! I wanted to show you guys what's happening with my cards in the hope that I'm not an isolated case!
> 
> This first video is roughly a 2 and a half minute clip of MSI afterburner while playing a short section of Crysis v1.21. It shows the lack of uniformity when it comes to my GPU usage - notice at 1:51 it drops to 0%!
> 
> 
> 
> 
> 
> 
> This second video is of what I see while playing - granted it's difficult to see when viewed on Youtube, but the gameplay is sluggish and choppy, totally ruining the experience. I suggest bumping it up to 1080p full screen and noting 18-30 seconds and 59-1:03 where it is much more obvious. I also apologise for my first kill-fail
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is anyone else experiencing this?


Crysis 1 and Crysis Warhead, while being fun and pretty games, are both horribly optimized. No matter how good your hardware is, that game is going to give you issues.

I would not use it at all as a barometer of GPU or system performance. There are just too many gremlins in the code - one never really knows if it's their system or, as in most cases, the code.

I would try a different game like Metro 2033, The Witcher 2, or even Crysis 2 over Crysis 1 and Warhead.

I'd put money on it in Vegas that the issues you're seeing are due more to the game's poor optimization than to your Jedi hardware.

Believe it









(BTW, how are you liking your XL2410's? I really love mine after I got it calibrated. But I can't decide if I want to get two more, or, just get the 27" 3D Acer that is finally readily available. But even if I do, I'll keep this panel for something. I like it that much.)


----------



## Smo

Quote:


> Originally Posted by *Tept;14494405*
> I'm not sure about Crysis's efficiency with SLI, so I can't comment on that, but one thing I noticed is that you should get on MSI Afterburner 2.2.0 Beta 5. Also, is it set to Prefer Maximum Performance in the NVIDIA Control Panel?


Thanks for the suggestion - I've just updated to Afterburner 2.2.0 Beta 5. I have also made sure my Control Panel is set to PMP. I'll quickly fire up Metro 2033 as suggested by Shinobi and will post back in a bit.
Quote:


> Originally Posted by *Shinobi Jedi;14494670*
> Crysis 1 and Crysis Warhead, while being fun and pretty games, are both horribly optimized. No matter how good your hardware is, that game is going to give you issues.
> 
> I would not use it at all as a barometer of GPU or system performance. There are just too many gremlins in the code - one never really knows if it's their system or, as in most cases, the code.
> 
> I would try a different game like Metro 2033, The Witcher 2, or even Crysis 2 over Crysis 1 and Warhead.
> 
> I'd put money on it in Vegas that the issues you're seeing are due more to the game's poor optimization than to your Jedi hardware.
> 
> Believe it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (BTW, how are you liking your XL2410's? I really love mine after I got it calibrated. But I can't decide if I want to get two more, or, just get the 27" 3D Acer that is finally readily available. But even if I do, I'll keep this panel for something. I like it that much.)


Thanks for the input mate - I just wondered if it could be voltage related (you never know, with all the throttling etc.), so I quickly overclocked my card to 670/1340 @ 0.938v and enabled 'Prefer Maximum Performance' in the Control Panel as advised by Tept. It's stable according to Kombustor - so I'm happy with that.

First thing I did was fire up Dead Space II (all options fully maxed out at 1920x1080) for a quick test and this is what Afterburner showed me;










Not great!

I'm just about to fire up Metro 2033 (same as above, maxed out at 1920x1080 with all DX11 options enabled) and will post back with what happens in a few minutes.

As for the XL2410T - I loved it so much I bought a second, and on payday next month I'm very tempted to buy a third! Admittedly, out of the box they both looked awful, but nothing a quick Google couldn't sort out. I love the response time, and playing at 120Hz over 60Hz is like night and day - I didn't expect the difference to be that noticeable, but it's so silky smooth. I love it









*Edit:* Here's my Metro 2033 usage;










Pants









*Edit 2:* Crysis 2;


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14497394*
> Thanks for the suggestion - I've just updated to Afterburner 2.2.0 Beta 5. I have also made sure my Control Panel is set to PMP. I'll quickly fire up Metro 2033 as suggested by Shinobi and will post back in a bit.
> 
> Thanks for the input mate - I just quickly wondered if it could be voltage related (you never know, with all the throttling etc) so I quickly overclocked my card to 670/1340 @ 0.938v and enabled 'Prefer Maximum Performance' in my Control Panel as advised by Tept. It is stable according to Kombustor - so I'm happy with that.
> 
> First thing I did was fire up Dead Space II (all options fully maxed out at 1920x1080) for a quick test and this is what Afterburner showed me;
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not great!
> 
> I'm just about to fire up Metro 2033 (same as above, maxed out at 1920x1080 with all DX11 options enabled) and will post back with what happens in a few minutes.
> 
> As for the XL2410T - I loved it so much I bought a second. And on payday next month, I'm very tempted to buy a third! Admittedly, out of the box they both looked awful, but nothing a quick Google couldn't sort out. I love the response time, and playing at 120Hz over 60Hz is like night and day, I didn't expect the difference to be that noticeable but it's so silky smooth, I love it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Edit:* Here's my Metro 2033 usage;
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Pants


Dead Space II is not a good choice - one 560 Ti is overkill for it; I had over 300 FPS in DS2 on one 580. It's not what you want to hear, but you might need to test more games. My Metro 2033 usage is also terrible.

The only program I know of that gets a flat 95-99% usage is 3DMark 11 (for Quad SLI), with Heaven a close second, though with GPU usage hiccups.

What CPU usage are you seeing?

Post powered by DROID X2


----------



## Smo

Quote:


> Originally Posted by *RagingCain;14497735*
> Dead Space II is not a good choice - one 560 Ti is overkill for it; I had over 300 FPS in DS2 on one 580. It's not what you want to hear, but you might need to test more games. My Metro 2033 usage is also terrible.
> 
> The only program I know of that gets a flat 95-99% usage is 3DMark 11 (for Quad SLI), with Heaven a close second, though with GPU usage hiccups.
> 
> What CPU usage are you seeing?
> 
> Post powered by DROID X2


I don't mind testing more games if I need to! I also wouldn't mind erratic usage but it's making my games jerky and I can't stand for that, it ruins the experience!

Afterburner doesn't display CPU usage right? How would I go about viewing it?


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14497959*
> I don't mind testing more games if I need to! I also wouldn't mind erratic usage but it's making my games jerky and I can't stand for that, it ruins the experience!
> 
> Afterburner doesn't display CPU usage right? How would I go about viewing it?


Open up Task Manager, either with Ctrl+Alt+Delete or by right-clicking the taskbar in Windows 7.

Switch to the Performance tab to monitor usage. Although it doesn't keep as long a history as Afterburner, you can see if for some reason you are maxing out a core.

Don't forget that non-multithreaded applications have to run everything through just one core/thread. So while your operating system may spread its own work around, the one thread running the game (doing all the math and loading it needs) is also the thread running the drivers and pushing all the data to the GPU - from memory to the GPU, and from the hard disk to memory.

This is what sucks about console ports: they usually use only one or two cores.

Other things that may cause this:
Unstable overclock on either CPU / RAM.
Unstable overclock on GPU core.
Unstable overclock on the GPU vram.

I see that your voltage is borderline for 670MHz; I would test it at 0.950v and see if the overall usage increases. It might be GDDR5 error correction kicking in - it's not just a memory overclock that can trigger ECC.
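To make the single-maxed-thread pattern concrete: if you jot down rough per-core averages from Task Manager's Performance tab, one core pinned while the rest sit mostly idle is the signature of a single-threaded bottleneck. A toy check (the thresholds and sample numbers here are made up for illustration, not tuned values):

```python
# Toy heuristic for spotting a single-threaded bottleneck from average
# per-core usage percentages (the kind you eyeball in Task Manager).
# The hot/cool thresholds are arbitrary illustrations.
def looks_single_thread_bound(per_core_usage, hot=90.0, cool=40.0):
    hot_cores = [u for u in per_core_usage if u >= hot]
    cool_cores = [u for u in per_core_usage if u <= cool]
    # exactly one pinned core, everything else near idle
    return len(hot_cores) == 1 and len(cool_cores) == len(per_core_usage) - 1

# Hypothetical quad-core samples: a console port vs. a well-threaded game
print(looks_single_thread_bound([98.0, 22.0, 15.0, 30.0]))  # True
print(looks_single_thread_bound([70.0, 65.0, 72.0, 68.0]))  # False
```

When the first pattern shows up, the GPUs starve no matter how fast they are - the feed thread is the ceiling.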


----------



## Smo

Quote:


> Originally Posted by *RagingCain;14498074*
> Open up Task Manager, either with Ctrl+Alt+Delete or by right-clicking the taskbar in Windows 7.
> 
> Switch to the Performance tab to monitor usage. Although it doesn't keep as long a history as Afterburner, you can see if for some reason you are maxing out a core.
> 
> Don't forget that non-multithreaded applications have to run everything through just one core/thread. So while your operating system may spread its own work around, the one thread running the game (doing all the math and loading it needs) is also the thread running the drivers and pushing all the data to the GPU - from memory to the GPU, and from the hard disk to memory.
> 
> This is what sucks about console ports: they usually use only one or two cores.
> 
> Other things that may cause this:
> Unstable overclock on either CPU / RAM.
> Unstable overclock on GPU core.
> Unstable overclock on the GPU vram.
> 
> I see that your voltage is borderline for 670MHz; I would test it at 0.950v and see if the overall usage increases. It might be GDDR5 error correction kicking in - it's not just a memory overclock that can trigger ECC.


Thanks for that - as soon as I saw the word 'Task' in your reply I facepalmed! Haha, can't believe I forgot something as basic as that. Would it be worth testing the CPU usage in Metro?

I suppose it could be an unstable overclock - however I would rule out the GPU as it has been doing this since before I tried overclocking the 590 (the GPU has only been overclocked the last hour or so).

My i5 is running at 4.4GHz, and the only settings I have changed in my BIOS are the multiplier to 44 and the VRM Frequency to 350. Everything else is stock. Right now (with Dead Space 2 minimized) CPU-Z is telling me my core speed is 4327MHz and the voltage is 1.288v-1.304v (it's constantly changing) - could this be the instability?

*Edit :* This is my CPU usage when playing a bit of Dead Space II (sorry, I know - but it's open!). I can try something like Metro 2033 if it helps.


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14498153*
> Thanks for that - as soon as I saw the word 'Task' in your reply I facepalmed! Haha, can't believe I forgot something as basic as that. Would it be worth testing the CPU usage in Metro?
> 
> I suppose it could be an unstable overclock - however I would rule out the GPU as it has been doing this since before I tried overclocking the 590 (the GPU has only been overclocked the last hour or so).
> 
> My i5 is running at 4.4GHz and the only settings I have changed in my BIOS is multiplier to 44 and VRM Frequency to 350. Everything else is stock. Right now (with Dead Space 2 minimized) CPU-Z is telling me my Core Speed is 4327MHz and voltage is 1.288v-1.304v (it's constantly changing) - could this be in the instability?
> 
> *Edit :* This is my CPU usage when playing a bit of Dead Space II (sorry, I know - but it's open!). I can try something like Metro 2033 if it helps.


That is definitely a maxed-out thread; I would anticipate that this is technically a bottleneck in Dead Space II. Despite it being a gorgeous game, it's only DX9, and that specific engine is single-core. The voltage fluctuation is indeed normal, but have you ever run Prime95 for a few hours to see if it's stable? Let me know if you need to know more about Prime95/setup.

Essentially, it's similar to tripping over the floor as you're running down a hallway. A CPU error doesn't always crash the system; it just "trips" over its own error. Now, humans have been known to land on sharp objects and die from a simple trip - that's the equivalent of a total crash on the CPU. For the most part, though, we trip, fall down, bruise a knee, and get right back up.

That could be what your CPU is doing: it gets an error, but it's not severe enough to crash the system, so the operating system handles it and tries to get back up and keep running.

Now think: you are dealing with frequencies of 4.4GHz, or 4.4 billion operations a second... per core. How often you are tripping determines system performance.


----------



## Smo

Quote:


> Originally Posted by *RagingCain;14498371*
> That is definitely a maxed-out thread; I would anticipate that this is technically a bottleneck in Dead Space II. Despite it being a gorgeous game, it's only DX9, and that specific engine is single-core. The voltage fluctuation is indeed normal, but have you ever run Prime95 for a few hours to see if it's stable? Let me know if you need to know more about Prime95/setup.
> 
> Essentially, it's similar to tripping over the floor as you're running down a hallway. A CPU error doesn't always crash the system; it just "trips" over its own error. Now, humans have been known to land on sharp objects and die from a simple trip - that's the equivalent of a total crash on the CPU. For the most part, though, we trip, fall down, bruise a knee, and get right back up.
> 
> That could be what your CPU is doing: it gets an error, but it's not severe enough to crash the system, so the operating system handles it and tries to get back up and keep running.
> 
> Now think: you are dealing with frequencies of 4.4GHz, or 4.4 billion operations a second... per core. How often you are tripping determines system performance.


I really appreciate the help here, especially the descriptions to try and help me understand what's going on. I've never run Prime95, no - maybe it's something I should look into.

I think I might pop into the BIOS and drop my multiplier down to 40 and see how my rig fares at 4GHz rather than 4.4Ghz.


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14498391*
> I really appreciate the help here, especially the descriptions to try and help me understand what's going on. I've never run Prime95, no - maybe it's something I should look into.
> 
> I think I might pop into the BIOS and drop my multiplier down to 40 and see how my rig fares at 4GHz rather than 4.4Ghz.


That's okay, we all have to start somewhere







Overclocking is actually relatively new to me. I had done it when I was, I think, 14 or 15, but then I went 10 years without ever doing it again. I had to relearn everything, starting about last October









All Prime95 does is run all four of your cores at full speed with algorithms for finding prime numbers. A specific test called Small FFTs keeps all the data in the CPU and its cache and tests whether you get rounding errors, which translate into CPU logic and/or calculation errors. These can appear as BSODs, or less severely as system degradation. Essentially everything will run fine, appear fine, and then the next day you won't be able to boot into Windows.

Most people who don't stress test usually get that second situation in varying degrees - Windows won't boot, or "I can't play video games", or "I have no audio", etc. They then of course get on here and expect one of us to tell them what's wrong. Usually the simple stuff gets suggested first, like drivers; then someone like me in the troubleshooting crowd suggests the CPU overclock, and the person at fault usually defends it until his dying breath, because it's been "solid" for days/weeks/months, and refuses to check the issue.

Every time I have a crash, it usually means I need to up the voltage and retest. Once I am sure I am 100% stable, THEN I usually reformat and reinstall Windows, as it's more than likely that there was damage being caused in the background. All of those crazy benchmarks I have been running have already taken a toll on my Windows (because of the crashing). I know I am looking at a reformat within a week.
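The rounding-error idea is easy to demo. Prime95's real Small FFT test is far more sophisticated, but the principle - repeat a computation whose exact answer is known and flag any run where the floating-point result drifts past a tolerance - can be sketched like this (a toy illustration only, not Prime95's actual algorithm):

```python
# Toy torture-test loop: hammer the FPU with a computation whose exact result
# is known (the sum 1..n has the closed form n*(n+1)/2) and flag any run
# whose round-off error exceeds a tolerance. On healthy hardware the error
# here is exactly 0; an unstable CPU could occasionally get it wrong.
def torture_iteration(n=10000):
    acc = 0.0
    for i in range(1, n + 1):
        acc += float(i)
    exact = n * (n + 1) / 2
    return abs(acc - exact)

def run_torture(rounds=50, tolerance=1e-6):
    worst = max(torture_iteration() for _ in range(rounds))
    return worst <= tolerance, worst

ok, worst = run_torture()
print("stable" if ok else "ROUNDING ERROR", worst)  # prints: stable 0.0
```

Prime95 does the same thing at vastly larger scale, which is why a marginal overclock that "feels fine" on the desktop fails a worker within minutes.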


----------



## Smo

Quote:


> Originally Posted by *RagingCain;14498629*
> That's okay, we all have to start somewhere
> 
> 
> 
> 
> 
> 
> 
> Overclocking is actually relatively new to me. I had done it when I was, I think, 14 or 15, but then I went 10 years without ever doing it again. I had to relearn everything, starting about last October
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All Prime95 does is run all four cores at full speed with algorithms for finding prime numbers. A specific test called Small FFTs keeps all the data in the CPU and its cache and tests whether you get rounding errors, which translate into CPU logic and/or calculation errors. These can appear as BSODs, or less severely as system degradation. Essentially everything will run fine, appear fine, and then the next day you won't be able to boot into Windows.
> 
> Most people who don't stress test usually get that second situation in varying degrees, and then of course get on here and expect one of us to tell them what's wrong. Usually someone in the troubleshooting crowd suggests the CPU overclock, and the person at fault usually defends it until his dying breath, because it's been "solid" for days/weeks/months, and refuses to check the issue.
> 
> Every time I have a crash, it usually means I need to up the voltage and retest. Once I am sure I am stable, THEN I usually reformat and reinstall Windows, as it's more than likely that there was damage being caused in the background.


So in theory then an unstable overclock (while not manifesting as BSODs) can still essentially corrupt Windows?


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14498659*
> So in theory then an unstable overclock (while not manifesting as BSODs) can still essentially corrupt Windows?


Absolutely.


----------



## Smo

Quote:


> Originally Posted by *RagingCain;14498712*
> Absolutely.


Interesting - but I still can't understand why all of a sudden my games aren't running smoothly any more. Unfortunately, while trying to get the Control Panel installed, I went through several different driver versions and have essentially lost track of what changed and where.

However, I reinstalled Windows and then performed a clean install of 280.19. My BIOS was left unchanged (still OC'd to 4.4GHz). Perhaps I should have backed up what I wanted, cleared my CMOS, and then booted into a Windows install?

Or are your games behaving the same way? (Most likely not, I tend to have **** luck).


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14498769*
> Interesting - but still I can't understand why all of a sudden my games aren't running smoothly any more. Unfortunately when trying to install the Control Panel I went through several different driver versions and have essentially lost track of what changed and where.
> 
> However, I reinstalled windows and then performed a clean install of 280.19. My BIOS was left unchanged (still OC'd to 4.4GHz). Perhaps I should have backed up what I wanted, cleared my CMOS and then booted into a Windows install?
> 
> Or are your games behaving the same way? (Most likely not, I tend to have **** luck).


I was just suggesting the possibility of the CPU not being able to deliver the performance you would expect, creating an artificial bottleneck like what we are seeing - but alas, GPU usage is a common problem for the whole Fermi architecture.

It's one of the reasons I am waiting until after Maxwell is released before upgrading again. I am never again buying a GPU that will not stay at 99% in-game, or in SLI.

It's incredibly frustrating and lame.

I may not understand the complexities of the architecture or the driver level of communications or the DX API, but I do know when I buy a video card, I expect it to work as advertised. I didn't know that 99% operation or full speed was only achievable in certain applications & conditions, not disclosed to me at the time of purchase.


----------



## Tept

Smo, 4.4GHz @ 1.3v seems low; I very much doubt that it's going to be Prime stable, and as Raging said, that will definitely cause problems. Keep in mind that instability in the CPU over time can also cause damage and reduce life expectancy. Say that when you first powered your CPU up you could run, say, 4.6GHz stable @ 1.39v. After a fair amount of time running a 4.4GHz overclock at an unstable voltage, you could lose the ability to overclock to 4.6GHz @ 1.39v; now it will require 1.395v or 1.4v because of degradation to the CPU.

I very much suggest a Prime95 Blend test for at least 2 hours. If you have done any tweaking to memory, you need to run a memtest to at least 30% coverage, and a lot of people would tell you to do 100%.

Stability is key, to the point where you need to change your BIOS settings to fixed values rather than leave them on Auto. Set your CPU vcore to manual, put it at say 1.31v, go into Windows, and run Prime for 2 hours or until a worker fails. If a worker fails, go into the BIOS and up the vcore by 0.005 until workers stop failing. I'm not 100% sure of the i5's voltage safe zone, but if it's anything like the i7-2600K I would stay under 1.43v for daily use, and that of course is reliant on your temps. If you see high 80s during Prime, I would suggest against sticking with that overclock for 24/7; aim for 83-84°C as max temps during Prime if you want your CPU to have a long life.
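That step-up procedure amounts to a simple search loop. A sketch, with everything hypothetical: `is_prime_stable` stands in for an actual two-hour Prime95 Blend run at that voltage, and the start/ceiling values just echo the numbers mentioned above rather than recommendations for any particular chip:

```python
# Sketch of "set vcore, run Prime95, bump by 0.005 V on a worker failure".
# Works in millivolts to avoid float drift; is_prime_stable is a stand-in
# for a real multi-hour stress run, and all voltages are illustrative only.
def find_stable_vcore(is_prime_stable, start_mv=1310, step_mv=5, ceiling_mv=1430):
    mv = start_mv
    while mv <= ceiling_mv:
        if is_prime_stable(mv / 1000.0):
            return mv / 1000.0  # first voltage that passes the stress test
        mv += step_mv
    return None  # hit the safe ceiling without passing: back the clocks off instead

# Pretend this particular chip happens to need 1.325 V for its target clock:
print(find_stable_vcore(lambda v: v >= 1.325))  # 1.325
print(find_stable_vcore(lambda v: False))       # None
```

The `None` branch is the important part: if you reach the safe-voltage ceiling without passing, the answer is a lower multiplier, not more volts.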

BTW Smo,
My truck: 



http://www.youtube.com/watch?v=x7qxdFl7_qw

650hp 1200tq - estimate based on similar trucks with the same tune on dyno


----------



## Smo

Quote:


> Originally Posted by *RagingCain;14498907*
> I was just suggesting the possibility of the CPU not being able to deliver the performance you would expect and create an artificial bottleneck kind of what we are seeing but alas, GPU usage is a common problem for every Fermi architecture.
> 
> One of the reasons I am waiting till after Maxwell is released before upgrading again. I am not buying another GPU ever again that will not stay at 99% in game, or in SLI.
> 
> Its incredibly frustrating and lame.
> 
> I may not understand the complexities of the architecture or the driver level of communications or the DX API, but I do know when I buy a video card, I expect it to work as advertised. I didn't know that 99% operation or full speed was only achievable in certain applications & conditions, not disclosed to me at the time of purchase.


I feel the same way mate - I don't care if my hardware can produce ~50fps at 50% usage; I want it to provide the absolute maximum FPS it can, at maximum usage, at all times. Demanding maximum performance doesn't seem outrageous to me - after all, isn't that why people like you and me pay a premium for the highest-spec hardware?

As for the juddery games, perhaps it's a mix of several things and I'm just blaming the card out of ignorance about how a PC really works. After all, this is the first PC I've ever built, so I should have expected teething problems I guess!
Quote:


> Originally Posted by *Tept;14500102*
> Smo, 4.4ghz @ 1.3v seems low, I very much doubt that its going to be Prime stable and as Raging said that will definately cause problems. Keep in mind instability in the CPU over time can also cause damage and life expectancy reduction. Say that when you 1st powered your CPU up you could run say 4.6ghz stable @ 1.39v. Well, after a fair amount of time running a 4.4ghz overclock at an unstable voltage you could lose the ability to overclock to 4.6ghz @ 1.39, now it will require 1.395v or 1.4v because of degradation to the CPU.
> 
> I very much suggest a Prime95 Blend test for at least 2 hours. If you have done any tweaking to memory you need to run a memtest to at least 30% coverage, and a lot of people would tell you to do 100%.
> 
> Stability is key, to the point where you need to change your settings in the BIOS to firm values rather than leave them on Auto. Set your CPU vcore to manual and put it @ say 1.31v, go into Windows and run Prime for 2 hours or until a worker fails. If a worker fails, go into the BIOS and up the vcore by 0.005v until workers stop failing. I'm not 100% sure of the i5's voltage safe zone, but if it's anything like the i7-2600K I would stay under 1.43v for daily use, and that of course is reliant on your temps. If you see high 80s during Prime I would suggest against sticking with that overclock for 24/7; aim for 83-84 as max temps during Prime if you want your CPU to have a long life.
> 
> BTW Smo,
> My truck:
> 
> 
> 
> 
> 650hp 1200tq - estimate based on similar trucks with the same tune on dyno


Thanks very much for taking the time to reply in such detail. I'm more than willing to try a Blend test for a couple of hours.

I do wonder though, would setting the vcore to 1.31v @ 4.4GHz keep the voltage at 1.31v 100% of the time, or only at maximum usage? I'd like to keep the 1.6GHz idle speed. Essentially I'm asking whether, when downclocked, the vcore will remain at 1.31v or drop back to the standard ~0.9v.

I reinstalled windows about a week ago, with the 4.4GHz overclock still in effect so I'm considering doing another format, clearing the CMOS and starting all over again...

If after manually setting my vcore and hitting 2+ hours Prime95 stable at 4.4GHz my games are still a bit juddery, I suppose I have no choice!

The only reason I've kicked up a fuss about this is because when I swapped from my 6950 2GB to the GTX 590 all my games ran as smooth as butter, and now they don't. It's really annoying! In all fairness I'm still considering swapping out my GTX 590 for 2x 6970s in Crossfire (or 2x shader unlocked 6950s).

As for your Truck - do you have a spec list or engine bay pics? I'd like to see more


----------



## Tept

It will still idle down; 1.31v is more of a "max" value.

No spec list lol. It's a diesel; it has a DrewTech DashDAQ gauge panel/tuner, came pre-loaded with a 250hp Spartan Diesel tune, and I have an S&B cold air intake. Of course I also took off all the EPA garbage they pile on these trucks now <--- that alone is almost a 100hp increase on these things, because the exhaust runs through a cat AND a diesel particulate filter system which, man, I should take a pic of it compared to, say, a damn dog. Between the cat, the DPF, and the stock piping it weighs 80-90lbs, and that's only from downpipe to axle, not including the tailpipe. But that's it, that's the mod list. These trucks come with 350hp/650tq stock @ around 30psi; the Spartan tune hits 43psi in the last 2 gears (5-spd auto). Full weight is 8440lbs with me sitting in it on a half tank (15gal). At that weight I sat door to door with a 13.0sec Mustang from a 40mph to 130mph roll.

And if you think my numbers are ridiculous for an almost-stock diesel, go YouTube "340 Spartan tuned 6.4L Powerstroke". Your jaw will fall off. Guys are doing almost 2000tq on a 340 tune with spray.










Just in case there are doubts of ownership =P.

vvvvv This is the toy I miss though. I kick myself 2 to 3 times daily over selling this.


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14506349*
> I feel the same way mate - I don't care if my hardware can produce ~50fps at 50% usage, I want it to provide me with the absolute maximum FPS it can, at maximum usage, at all times. Demanding maximum performance doesn't seem outrageous to me - after all, isn't that why people like you and me pay a premium for the highest spec hardware?
> 
> As for the juddery games, perhaps it's a mix of several things and I'm just blaming the card because of my ignorance when it comes to how a PC really works. After all, this is the first PC I've ever built so I should have expected teething problems I guess!
> 
> Thanks very much for taking the time to reply in such detail. I'm more than willing to try a Blend test for a couple of hours.
> 
> I do wonder though, would setting the vcore to 1.31v @ 4.4GHz keep the voltage at 1.31v 100% of the time, or only at maximum usage? I'd like to keep the 1.6GHz idle speed. Essentially I'm asking whether, when downclocked, the vcore will remain at 1.31v or drop back to the standard ~0.9v.
> 
> I reinstalled windows about a week ago, with the 4.4GHz overclock still in effect so I'm considering doing another format, clearing the CMOS and starting all over again...
> 
> If after manually setting my vcore and hitting 2+ hours Prime95 stable at 4.4GHz my games are still a bit juddery, I suppose I have no choice!
> 
> The only reason I've kicked up a fuss about this is because when I swapped from my 6950 2GB to the GTX 590 all my games ran as smooth as butter, and now they don't. It's really annoying! In all fairness I'm still considering swapping out my GTX 590 for 2x 6970s in Crossfire (or 2x shader unlocked 6950s).
> 
> As for your Truck - do you have a spec list or engine bay pics? I'd like to see more


If the CPU seems stable, then I would suggest figuring out what voltages are needed for 4.8GHz and seeing if that makes a difference. I also suggest you get a better air cooler/heatsink than stock to go higher (temperatures need to be taken care of), though the stock one's not terrible for the 2500K/2600K.

A lot of people think that the raw speed of Sandy Bridge is so much better than X58 processors, but in reality on X58 we can overclock things like QPI and Uncore, whereas on Sandy Bridge, if I am not mistaken, those values are either locked or integrated into the main CPU communication frequencies.

I don't know what voltage you would need for 4.8GHz, as I have no i5 2500K / i7 2600K processors, but I am sure someone can help you out.

I have a feeling that will smooth out performance for you, because of this:
http://www.hardocp.com/article/2011/05/03/nvidia_3way_sli_amd_trifire_redux/

Read the whole article. They *re-did* all their tests after a bunch of us suggested that the CPU / Chipset was seriously bottlenecking the 580s.

I think they even had one 580 in an x4 slot, for craps sake... anyways, they re-did the testing and boy were we right.

I know that 580s are not 590s, but the architecture responds very well to a higher CPU overclock, something I am not happy about because you shouldn't need a 5GHz CPU to utilize these products. It seems nVidia SLI needs ridiculous overclocks, and I can't help but believe the 590 and Quad SLI show this in the form of lower-than-expected GPU usage or just stuttery performance in general.


----------



## Smo

Quote:


> Originally Posted by *Tept;14506787*
> It will still idle down; 1.31v is more of a "max" value.
> 
> No spec list lol. It's a diesel; it has a DrewTech DashDAQ gauge panel/tuner, came pre-loaded with a 250hp Spartan Diesel tune, and I have an S&B cold air intake. Of course I also took off all the EPA garbage they pile on these trucks now <--- that alone is almost a 100hp increase on these things, because the exhaust runs through a cat AND a diesel particulate filter system which, man, I should take a pic of it compared to, say, a damn dog. Between the cat, the DPF, and the stock piping it weighs 80-90lbs, and that's only from downpipe to axle, not including the tailpipe. But that's it, that's the mod list. These trucks come with 350hp/650tq stock @ around 30psi; the Spartan tune hits 43psi in the last 2 gears (5-spd auto). Full weight is 8440lbs with me sitting in it on a half tank (15gal). At that weight I sat door to door with a 13.0sec Mustang from a 40mph to 130mph roll.
> 
> And if you think my numbers are ridiculous for an almost-stock diesel, go YouTube "340 Spartan tuned 6.4L Powerstroke". Your jaw will fall off. Guys are doing almost 2000tq on a 340 tune with spray.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just in case there are doubts of ownership =P.
> 
> vvvvv This is the toy I miss though. I kick myself 2 to 3 times daily over selling this.


Nice dude - any idea what sort of turbo those things have as standard? A 100% bhp increase is almost unheard of, especially for something that comes out of the factory as a production vehicle.

Love the bike by the way, great excuse for getting yourself killed! I want one








Quote:


> Originally Posted by *RagingCain;14507662*
> If the CPU seems stable, then I would suggest figuring out what voltages are needed for 4.8GHz and seeing if that makes a difference. I also suggest you get a better air cooler/heatsink than stock to go higher (temperatures need to be taken care of), though the stock one's not terrible for the 2500K/2600K.
> 
> A lot of people think that the raw speed of Sandy Bridge is so much better than X58 processors, but in reality on X58 we can overclock things like QPI and Uncore, whereas on Sandy Bridge, if I am not mistaken, those values are either locked or integrated into the main CPU communication frequencies.
> 
> I don't know what voltage you would need for 4.8GHz, as I have no i5 2500K / i7 2600K processors, but I am sure someone can help you out.
> 
> I have a feeling that will smooth out performance for you, because of this:
> http://www.hardocp.com/article/2011/05/03/nvidia_3way_sli_amd_trifire_redux/
> 
> Read the whole article. They *re-did* all their tests after a bunch of us suggested that the CPU / Chipset was seriously bottlenecking the 580s.
> 
> I think they even had one 580 in an x4 slot, for craps sake... anyways, they re-did the testing and boy were we right.
> 
> I know that 580s are not 590s, but the architecture responds very well to a higher CPU overclock, something I am not happy about because you shouldn't need a 5GHz CPU to utilize these products. It seems nVidia SLI needs ridiculous overclocks, and I can't help but believe the 590 and Quad SLI show this in the form of lower-than-expected GPU usage or just stuttery performance in general.


Thanks for posting that - I've just had a read through and I must say, I'm stunned. To think that to utilise the full potential of my out-of-the-box NVIDIA hardware I might have to overclock a quad-core CPU to 4.8GHz is mad.

Good job I'm up for trying it then









I have an Antec Kuhler H2O 920 CPU cooler, so temperatures shouldn't be a problem. I'm currently at 4GHz, as last night I downclocked to look into my voltage. What I'll do is run some benchmarks as-is (with my 590 @ 670MHz), then do a quick forum search on taking the 2500K to 4.8GHz.

I'll work out an overclock that will hopefully prove Prime95 stable and re-run my benchmarks, see what happens. This is also my first in-depth overclock so you're popping my cherry here Cain!

Stay tuned


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14508632*
> Nice dude - any idea what sort of turbo those things have as standard? A 100% bhp increase is almost unheard of, especially for something that comes out of the factory as a production vehicle.
> 
> Love the bike by the way, great excuse for getting yourself killed! I want one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for posting that - I've just had a read through and I must say, I'm stunned. To think that to utilise the full potential of my out-of-the-box NVIDIA hardware I might have to overclock a quad-core CPU to 4.8GHz is mad.
> 
> Good job I'm up for trying it then
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have an Antec Kuhler H2O 920 CPU cooler, so temperatures shouldn't be a problem. I'm currently at 4GHz, as last night I downclocked to look into my voltage. What I'll do is run some benchmarks as-is (with my 590 @ 670MHz), then do a quick forum search on taking the 2500K to 4.8GHz.
> 
> I'll work out an overclock that will hopefully prove Prime95 stable and re-run my benchmarks, see what happens. This is also my first in-depth overclock so you're popping my cherry here Cain!
> 
> Stay tuned


Kinky









That's how most of us do it: we raise clocks, estimate the voltage needed, then run Prime95 to test it out.

If it goes for more than 3-24 hours right off the bat, there's a good chance you got lucky and may even be able to lower the voltage. I like to "trim the fat" on voltage, then use one increment higher than what it took me to be stable.

So for 4.4GHz (what I run 24/7) it took exactly 1.389v to stabilize. That is just under my maximum safe voltage for my CPU. I went ahead and set it to 1.4v, even though 1.389v is Prime95 stable, just as a rule of habit. I have been stable on Prime95 and IBT, and yet every once in a while I'll have a game crash for no apparent reason... I consider it to be some type of dip in voltage, and usually just a small nudge up ensures complete and total stability.

A lot of these errors can occur if you have too much voltage as well. Going to overclocks beyond what your chip is capable of requires more voltage, but more voltage does not guarantee more frequency or stability. So if you are starting low, which is how I overclock, this may help you figure out how much voltage you need to add/adjust.

Here is the ladder of instability I go by when overclocking just the CPU, though it's not all-encompassing.

*Usually needs a large voltage increase*: +0.050v

Computer fails to enter POST: Really Unstable, CPU Voltage Too Low
Computer freezes during POST or in BIOS: Really Unstable, Voltage Too Low
Computer freezes after POST but before entering Windows: Really Unstable, Voltage Too Low
*Needs a moderate voltage increase*: +0.025v

Computer gets through POST and the Windows boot, but freezes on the desktop: Needs moar voltage... still pretty unstable.
Computer POSTs/Windows loads, weird application errors come up: Needs moar voltage... still pretty unstable.
Computer boots to desktop, can't get a specific program to load: Needs moar voltage... still pretty unstable.
Computer boots to desktop, everything looks okay, but applications seem sluggish to load or fail to load entirely, though not necessarily with errors: Needs moar voltage... still pretty unstable.
Computer boots to desktop, everything looks okay, programs load, but there are a few strange errors in the Event Viewer logs (drivers & media applications seem to have a lot of warnings/errors): Needs moar voltage... still pretty unstable.
*Needs fine tuning via voltage increases*: +0.010v

Everything boots up and everything loads, no visible signs of errors, but I randomly get BSODs with error codes that involve drivers, for example 0x0000037 (which I believe is an nVidia driver; people blame the graphics card and not the CPU), or CPU errors in the 120~129 range, and/or IRQL_NOT_LESS_OR_EQUAL errors. Any form of crash that is unexpected is unstable. Up ze voltagez!
Everything boots up and everything loads, no visible signs of errors, but video games are not playable (crash): Any form of crash that is unexpected is unstable. Up ze voltagez!

Everything boots up and everything loads, no visible signs of errors, maybe even video games are playable: *This is where most people STOP upping the voltage*, because X Y Z game is playable. In reality, not la-la land, it's STRESS TESTING TIME!

*Needs fine tuning via voltage increases*: +0.0025~0.0050v

Prime95 runs for about 5 minutes and the whole program crashes (not Windows): this is slightly unstable, but not enough to bring down the system. You have made progress, but still need to up the voltage.
Prime95 (Small FFTs test) runs for about 1 minute and an error pops up: this is slightly more stable than the whole program crashing (no matter how long Prime95 ran before crashing), and it's even less likely to bring down the system. You have made progress, but still need more voltage.
Prime95 (Small FFTs test) runs for about 30 minutes and an error pops up: this is much more stable; in fact you have just entered the upper levels of stability, even though you had an error, and it's even less likely to bring down the system. You have made progress, but still need more voltage.
Prime95 (Small FFTs test) runs for about 3 hours and an error pops up: this is very stable; in fact you have entered what some consider the highest levels of stability. Even though you had an error, it's very unlikely to bring down the system. You have made progress, but in all honesty still need slightly more voltage.
Prime95 (Small FFTs test) runs for more than 8 hours with no errors: this is enough for me to claim stability, BUT it's not as long as some people run it, especially people who Fold/BOINC and need 100% accuracy from their CPU cycles. I encourage letting it run for at least 24 hours, but at this point I use my own logic/experience: I let it run for 8 hours, and *then I know I am going to increase the voltage by one or two increments after it's done regardless*, to account for the unknown stability. It achieves essentially the same stability in less time, for the least amount of voltage required (efficient).
Once the CPU is "stable", switch testing to Prime95 Blend if you have about 4GB of RAM, or run Custom if you have a large amount of RAM (8/12/16/24GB, like me) and just increase memory usage to 80~90% of your system RAM. This ensures the CPU, communication, and system RAM are also error free. I just like Prime95 because I am used to it and it's been around almost as long as I have been playing with computers.
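The ladder above boils down to a simple rule: the worse the symptom, the bigger the voltage bump. Here's a minimal sketch of that heuristic in Python; the symptom names, the helper function, and the 1.40v safety cap are my own illustration (the cap mirrors the "stay under ~1.4v" habit discussed earlier in the thread), not a real tool.

```python
# Symptom -> suggested vcore increment (volts), per the ladder above.
SYMPTOM_STEPS = {
    "no_post": 0.050,               # fails to POST / freezes in BIOS
    "freeze_before_windows": 0.050,
    "desktop_freeze": 0.025,        # boots but freezes/errors on desktop
    "app_errors": 0.025,
    "random_bsod": 0.010,           # driver/IRQL BSODs, unexpected game crashes
    "prime95_error": 0.005,         # worker error during stress testing
    "stable_8h": 0.005,             # final bump "to account for the unknown"
}

def next_vcore(current_vcore: float, symptom: str, vmax: float = 1.40) -> float:
    """Return the suggested next vcore, capped at a safety ceiling."""
    step = SYMPTOM_STEPS[symptom]
    return round(min(current_vcore + step, vmax), 3)

# e.g. a machine that hangs at the Windows logo at 1.310v:
print(next_vcore(1.310, "freeze_before_windows"))  # 1.36
```

The cap matters: once a symptom would push you past your safe ceiling, the answer is a lower multiplier, not more voltage.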

This is how I think others should test for stability. I will tell you, though, A LOT of people on this forum don't do this... primarily out of laziness, heat concerns, or just because they don't understand how it works. I like to overclock, but I NEED a stable system; that's what's very important to me. I think overclocking has gotten so easy that anyone can now go in and just up multipliers without having to struggle with it, so there is nothing learned nor anything really risked.

A lot of people also use the excuse "nothing will run your hardware that hard like Prime95 or IBT", but that is not what the programs do. All those programs do is find an error in the mathematics. An error or instability can occur under ANY level of difficulty or stress on the CPU. They illogically think that only stress produces errors. This is simply incorrect and misinformed. It also allows them to continue being lazy and encourages others to do the same. It actually pisses me off severely. I don't mind people doing what they want with their hardware, but they are essentially spreading lies to the new guys, who then learn it this way AND show up months later when their entire computer doesn't boot. Argh... anyways, enough ranting.

I guarantee 50% of the errors on this forum in the form of a game crash, driver crash, media crash, or x y z problem occurred because of a lack of proper overclock testing. One other extremely useful tool is Windows' Event Viewer, which you can get to via Control Panel -> Small Icons (top right) -> Administrative Tools -> Event Viewer; check out Windows Logs, specifically Application, System, and Setup (if you are installing a program that crashes).

It gives you the equivalent of the information found in a BSOD for each application crash. On an unstable system, though, many background services/tasks/file transfers/updates are corrupted by instability, giving you absolutely no notice of it. This is how the system eventually goes down: you get enough of these and it's bye-bye booting. Don't panic if you see errors, though; you have to analyze what type of error it is, and if it's the same device/same application it may be a faulty application. Getting errors here is a good way to know something is not working the way it should, and many average users never peek in here.
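The "repeat offender" check described above (same device or application erroring again and again) can also be done on an exported log instead of eyeballing the GUI: Event Viewer can save a log as XML. A minimal sketch of tallying it, assuming a hand-simplified export (a real export wraps events in the `http://schemas.microsoft.com/win/2004/08/events/event` namespace, which you'd need to account for):

```python
# Tally (provider, severity) pairs from a simplified Event Viewer XML
# export so repeat offenders stand out. SAMPLE is hand-made test data.
import xml.etree.ElementTree as ET
from collections import Counter

SAMPLE = """
<Events>
  <Event><System><Provider Name="nvlddmkm"/><Level>2</Level></System></Event>
  <Event><System><Provider Name="Application Error"/><Level>2</Level></System></Event>
  <Event><System><Provider Name="nvlddmkm"/><Level>3</Level></System></Event>
</Events>
"""

LEVELS = {"1": "Critical", "2": "Error", "3": "Warning"}  # Event Log level codes

def tally(xml_text: str) -> Counter:
    """Count (provider name, severity) pairs across all events."""
    root = ET.fromstring(xml_text)
    return Counter(
        (ev.find("System/Provider").get("Name"),
         LEVELS.get(ev.findtext("System/Level"), "Info"))
        for ev in root.iter("Event")
    )

for (provider, severity), count in tally(SAMPLE).most_common():
    print(f"{provider}: {count} x {severity}")
```

Seeing the same provider (here a GPU driver) at the top of such a tally is exactly the kind of pattern the post says to look for before blaming a single application.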


----------



## Smo

Quote:


> Originally Posted by *RagingCain;14509157*
> Kinky
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's how most of us do it: we raise clocks, estimate the voltage needed, then run Prime95 to test it out.
> 
> If it goes for more than 3-24 hours right off the bat, there's a good chance you got lucky and may even be able to lower the voltage. I like to "trim the fat" on voltage, then use one increment higher than what it took me to be stable.
> 
> So for 4.4GHz (what I run 24/7) it took exactly 1.389v to stabilize. That is just under my maximum safe voltage for my CPU. I went ahead and set it to 1.4v, even though 1.389v is Prime95 stable, just as a rule of habit. I have been stable on Prime95 and IBT, and yet every once in a while I'll have a game crash for no apparent reason... I consider it to be some type of dip in voltage, and usually just a small nudge up ensures complete and total stability.
> 
> A lot of these errors can occur if you have too much voltage as well. Going to overclocks beyond what your chip is capable of requires more voltage, but more voltage does not guarantee more frequency or stability. So if you are starting low, which is how I overclock, this may help you figure out how much voltage you need to add/adjust.
> 
> Here is the ladder of instability I go by when overclocking just the CPU, though it's not all-encompassing.
> 
> *Usually needs a large voltage increase*: +0.050v
>
> Computer fails to enter POST: Really Unstable, CPU Voltage Too Low
> Computer freezes during POST or in BIOS: Really Unstable, Voltage Too Low
> Computer freezes after POST but before entering Windows: Really Unstable, Voltage Too Low
> *Needs a moderate voltage increase*: +0.025v
>
> Computer gets through POST and the Windows boot, but freezes on the desktop: Needs moar voltage... still pretty unstable.
> Computer POSTs/Windows loads, weird application errors come up: Needs moar voltage... still pretty unstable.
> Computer boots to desktop, can't get a specific program to load: Needs moar voltage... still pretty unstable.
> Computer boots to desktop, everything looks okay, but applications seem sluggish to load or fail to load entirely, though not necessarily with errors: Needs moar voltage... still pretty unstable.
> Computer boots to desktop, everything looks okay, programs load, but there are a few strange errors in the Event Viewer logs (drivers & media applications seem to have a lot of warnings/errors): Needs moar voltage... still pretty unstable.
> *Needs fine tuning via voltage increases*: +0.010v
>
> Everything boots up and everything loads, no visible signs of errors, but I randomly get BSODs with error codes that involve drivers, for example 0x0000037 (which I believe is an nVidia driver; people blame the graphics card and not the CPU), or CPU errors in the 120~129 range, and/or IRQL_NOT_LESS_OR_EQUAL errors. Any form of crash that is unexpected is unstable. Up ze voltagez!
> Everything boots up and everything loads, no visible signs of errors, but video games are not playable (crash): Any form of crash that is unexpected is unstable. Up ze voltagez!
>
> Everything boots up and everything loads, no visible signs of errors, maybe even video games are playable: *This is where most people STOP upping the voltage*, because X Y Z game is playable. STRESS TESTING TIME!
> 
> *Needs fine tuning via voltage increases*: +0.0025~0.0050v
> 
> Prime95 runs for about 5 minutes and the whole program crashes (not Windows): this is slightly unstable, but not enough to bring down the system. You have made progress, but still need to up the voltage.
> Prime95 (Small FFTs test) runs for about 1 minute and an error pops up: this is slightly more stable than the whole program crashing (no matter how long Prime95 ran before crashing), and it's even less likely to bring down the system. You have made progress, but still need more voltage.
> Prime95 (Small FFTs test) runs for about 30 minutes and an error pops up: this is much more stable; in fact you have just entered the upper levels of stability, even though you had an error, and it's even less likely to bring down the system. You have made progress, but still need more voltage.
> Prime95 (Small FFTs test) runs for about 3 hours and an error pops up: this is very stable; in fact you have entered what some consider the highest levels of stability. Even though you had an error, it's very unlikely to bring down the system. You have made progress, but in all honesty still need slightly more voltage.
> Prime95 (Small FFTs test) runs for more than 8 hours with no errors: this is enough for me to claim stability, BUT it's not as long as some people run it, especially people who Fold/BOINC and need 100% accuracy from their CPU cycles. I encourage letting it run for at least 24 hours, but at this point I use my own logic/experience: I let it run for 8 hours, and *then I know I am going to increase the voltage by one or two increments after it's done regardless*, to account for the unknown stability. It achieves essentially the same stability in less time, for the least amount of voltage required (efficient).
> Once the CPU is "stable", switch testing to Prime95 Blend if you have about 4GB of RAM, or run Custom if you have a large amount of RAM (8/12/16/24GB, like me) and just increase memory usage to 80~90% of your system RAM. This ensures the CPU, communication, and system RAM are also error free. I just like Prime95 because I am used to it and it's been around almost as long as I have been playing with computers.
> 
> This is how I think others should test for stability. I will tell you, though, A LOT of people on this forum don't do this... primarily out of laziness, heat concerns, or just because they don't understand how it works. I like to overclock, but I NEED a stable system; that's what's very important to me. I think overclocking has gotten so easy that anyone can now go in and just up multipliers without having to struggle with it, so there is nothing learned nor anything really risked.
> 
> A lot of people also use the excuse "nothing will run your hardware that hard like Prime95 or IBT", but that is not what the programs do. All those programs do is find an error in the mathematics. An error or instability can occur under ANY level of difficulty or stress on the CPU. They illogically think that only stress produces errors. This is simply incorrect and misinformed. It also allows them to continue being lazy and encourages others to do the same. It actually pisses me off severely. I don't mind people doing what they want with their hardware, but they are essentially spreading lies to the new guys, who then learn it this way AND show up months later when their entire computer doesn't boot. Argh... anyways, enough ranting.
> 
> I guarantee 50% of the errors on this forum in the form of a game crash, driver crash, media crash, or x y z problem occurred because of a lack of proper overclock testing. One other extremely useful tool is Windows' Event Viewer, which you can get to via Control Panel -> Small Icons (top right) -> Administrative Tools -> Event Viewer; check out Windows Logs, specifically Application, System, and Setup (if you are installing a program that crashes).
> 
> It gives you the equivalent of the information found in a BSOD for each application crash. On an unstable system, though, many background services/tasks/file transfers/updates are corrupted by instability, giving you absolutely no notice of it. This is how the system eventually goes down: you get enough of these and it's bye-bye booting. Don't panic if you see errors, though; you have to analyze what type of error it is, and if it's the same device/same application it may be a faulty application. Getting errors here is a good way to know something is not working the way it should, and many average users never peek in here.


Awesome post and very informative! Thanks again for that, it's hugely appreciated.

I must admit, I am very nervous about trying to push for 4.8GHz on my first Overclock, especially considering my huge lack of experience. However - I have found this post by TwoCables, who seems to definitely know his stuff;
Quote:


> Originally Posted by *TwoCables;14349107*
> I recommend trying these settings and then tweaking forward from there:
> 
> *Ai Tweaker*
> 
> *Ai Overclock Tuner:* Manual
> *BLCK/PCIE Frequency:* 100.0
> *Turbo Ratio:* By All Cores
> *By All Cores:* 48
> *Internal PLL Voltage:* Disabled
> *Memory Frequency:* use the rated speed for your memory
> 
> *DRAM Timing Control:* use the rated timings for your memory
> 
> *EPU Power Saving MODE:* Disabled
> 
> *Ai Tweaker\ CPU Power Management >*
> 
> *CPU Ratio:* Auto
> *Enhanced Intel SpeedStep Technology:* Enabled
> *Turbo Mode:* Enabled
> *Long Duration Power Limit:* Auto
> *Long Duration Maintained:* Auto
> *Short Duration Power Limit:* Auto
> *Additional Turbo Voltage:* Auto
> *Primary Plane Current Limit:* Auto
> 
> *Ai Tweaker* (in the DIGI+ VRM section)
> 
> *Load-Line Calibration:* Ultra High
> *VRM Frequency:* Manual
> *VRM Fixed Frequency Mode:* 350
> *Phase Control:* Extreme
> *Duty Control:* Extreme
> *CPU Current Capability:* 140% (it's _supposed_ to turn red in the UEFI)
> 
> *CPU Voltage:* Offset Mode
> *Offset Mode Sign:* +
> *CPU Offset Voltage:* 0.040V
> *DRAM Voltage:* use the rated voltage for your memory
> 
> *VCCSA Voltage:* Auto
> *VCCIO Voltage:* 1.15625V (using VCCIO Voltages in between 1.100V and 1.200V may help enable you to lower the vCore while maintaining stability)
> 
> *CPU PLL Voltage:* Auto
> *PCH Voltage:* Auto
> *CPU Spread Spectrum:* Enabled
> 
> *Advanced\ CPU Configuration >*
> 
> *CPU Ratio:* Auto
> *Intel Adaptive Thermal Monitor:* Enabled
> *Active Processor Cores:* All
> *Limit CPUID Maximum:* Disabled
> *Execute Disable Bit:* Enabled
> *Intel Virtualization Technology:* Disabled
> *Enhanced Intel SpeedStep Technology:* Enabled
> *Turbo Mode:* Enabled
> *CPU C1E:* Enabled
> *CPU C3 Report:* Auto
> *CPU C6 Report:* Auto
> 
> *Note:* that Offset Voltage of 0.040V gives me a Core Voltage in CPU-Z of about 1.384V - 1.392V while under full load in Prime95's Blend. So be cautious and adjust accordingly.
> 
> If these settings don't work for you, then try an Offset Voltage of 0.045V.


I think I will use this as a starting point. Once I've finished this next GTA IV benchmark I'm going to boot into the BIOS and enter the settings TwoCables provided above, and bump the voltage as necessary to run Prime95 for 2 hours minimum.
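For what it's worth, the arithmetic behind offset mode in those settings can be sketched like this. The VID figure below is a hypothetical, chip-specific value that I've back-calculated from TwoCables' note (his +0.040V offset landing around 1.384-1.392V under load), not something stated in his post:

```python
# Offset mode: the board adds a fixed offset to whatever voltage the
# chip's VID table requests at a given turbo bin. VID varies per chip.
VID_AT_48X = 1.350   # ASSUMED chip-specific request at the 48x bin, volts
OFFSET     = 0.040   # "CPU Offset Voltage" from the quoted settings

load_vcore = VID_AT_48X + OFFSET
print(f"~{load_vcore:.3f} V under load")  # lands in the 1.384-1.392V range he reports
```

This is why the same +0.040V offset gives different load voltages on different chips, and why he says to adjust accordingly rather than copy the value blindly.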


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14509573*
> Awesome post and very informative! Thanks again for that, it's hugely appreciated.
> 
> I must admit, I am very nervous about trying to push for 4.8GHz on my first Overclock, especially considering my huge lack of experience. However - I have found this post by TwoCables, who seems to definitely know his stuff;
> 
> I think I will use this as a starting point. Once I've finished this next GTA IV benchmark I'm going to boot into the BIOS and enter the settings TwoCables provided above, and bump the voltage as necessary to run Prime95 for 2 hours minimum.


I like TwoCables; he puts out reliable stuff/info. I would say give it a whirl.


----------



## Smo

Quote:


> Originally Posted by *RagingCain;14509681*
> I like TwoCables; he puts out reliable stuff/info. I would say give it a whirl.


Yeah he seems fairly knowledgeable!

Benchmarks complete, so here they are;
Quote:


> *Metro 2033 @ 4GHz / 590 @ 630MHz*
> 
> Average Framerate: 30.33
> Max. Framerate: 178.03
> Min. Framerate: 2.74
> 
> *Metro 2033 @ 4GHz / 590 @ 670MHz*
> 
> Average Framerate: 33.00
> Max. Framerate: 221.63
> Min. Framerate: 6.70
> 
> *Crysis @ 4GHz / 590 @ 630MHz*
> 
> Average Framerate: 54.20
> Max. Framerate: 59.97
> Min. Framerate: 23.24
> 
> *Crysis @ 4GHz / 590 @ 670MHz*
> 
> Average Framerate: 56.04
> Max. Framerate: 62.17
> Min. Framerate: 31.55
> 
> *Mafia II @ 4GHz / 590 @ 630MHz*
> 
> Average Framerate: 100.8
> Max. Framerate: -
> Min. Framerate: -
> 
> *Mafia II @ 4GHz / 590 @ 670MHz*
> 
> Average Framerate: 107.6
> Max. Framerate: -
> Min. Framerate: -
> 
> *GTA IV @ 4GHz / 590 @ 630MHz*
> 
> Average Framerate: 62.8
> CPU Usage: 56%
> 
> *GTA IV @ 4GHz / 590 @ 670MHz*
> 
> Average Framerate: 62.1
> CPU Usage: 55%


Now I'm going to quickly jot down the Overclock settings as above and see what happens...

*Edit:* Well no luck there! I copied the settings exactly and my machine just wouldn't boot with the multiplier set at 48 - it kept freezing on the Windows logo. I kept creeping the offset up until I reached +0.070, but it still wouldn't start up, and I didn't really want to push any further until I know I can do so safely!

It wouldn't do x47 either, but it booted OK at x46 - though for some reason the core speed is staying at 4600.0MHz @ 1.392-1.400v. I don't particularly want to run the voltage this high if I don't have to (unless it is safe to do so - I hear people saying 1.44v is acceptable for 24/7). I also don't understand why it won't downclock at idle any more?


----------



## RagingCain

Quote:


> Originally Posted by *Smo;14509838*
> Yeah he seems fairly knowledgeable!
> 
> Benchmarks complete, so here they are;
> 
> Now I'm going to quickly jot down the Overclock settings as above and see what happens...
> 
> *Edit:* Well no luck there! I copied the settings exactly and my machine just wouldn't boot with the multiplier set at 48 - it kept freezing on the Windows logo. I kept creeping the offset up until I reached +0.070, but it still wouldn't start up, and I didn't really want to push any further until I know I can do so safely!
> 
> It wouldn't do x47 either, but it booted OK at x46 - though for some reason the core speed is staying at 4600.0MHz @ 1.392-1.400v. I don't particularly want to run the voltage this high if I don't have to (unless it is safe to do so - I hear people saying 1.44v is acceptable for 24/7). I also don't understand why it won't downclock at idle any more?


You have something called Load Line Calibration set to High/Ultra (if it matches TwoCables' settings). On many motherboards this disables C1E/SpeedStep/C-states, which are what control idling.

It looks like people do go much higher, 1.55v etc. to get 5.0GHz, and some as low as 1.40v. I was under the impression that the 32nm max voltage was about 1.4v like mine, but it may be a more refined architecture than mine. Unfortunately, Sandy Bridge is not a CPU I have tamed, and thus I would have a hard time recommending safe courses of action.

Something tells me official max voltage by Intel is the same as mine.

You also have 4 GHz results, try testing 4.6 GHz and see if there is a noticeable difference.


----------



## Smo

Quote:


> Originally Posted by *RagingCain;14510800*
> You have something called Load Line Calibration set to High/Ultra (if it matches TwoCables' settings). On many motherboards this disables C1E/SpeedStep/C-states, which are what control idling.
> 
> It looks like people do go much higher, 1.55v etc. to get 5.0GHz, and some as low as 1.40v. I was under the impression that the 32nm max voltage was about 1.4v like mine, but it may be a more refined architecture than mine. Unfortunately, Sandy Bridge is not a CPU I have tamed, and thus I would have a hard time recommending safe courses of action.
> 
> Something tells me official max voltage by Intel is the same as mine.
> 
> You also have 4 GHz results, try testing 4.6 GHz and see if there is a noticeable difference.


I've just downloaded and updated my BIOS to the latest version (1850) and reset optimized defaults. I then attempted the previous settings listed by TwoCables, but it wouldn't boot at a 46 multiplier like before, which was odd, so I've reverted to the only changes being x44 and a VRM Freq. of 350 to get me booted. I've posted in the P67 owners thread in the hope that someone there can help me out too.

One thing I have noticed though (back on previous overclock) is that with only the multiplier and vrm frequency changed, at 4.4GHz my vcore (according to CPU-Z) is hitting as high as 1.380v! That seems FAR too high to me for 4.4, or am I tripping?


----------



## Tept

This could give you a baseline. I've seen a dozen different recommended safe voltages for these chips, and I decided on a middle ground with a slight step to the left. 1.5v was the limit for max safe 24/7 voltage suggested by someone who wrote a very lengthy Sandy Bridge overclocking guide; that was the highest I found for safe daily use, and one of the lowest I found was 1.4v. So I took a middle ground between those, with the step, and chose 1.43v as my safe max; of course this depends on temps. You won't lose system idle by upping Load Line Calibration. It might fluctuate more than before, but it won't just sit at max frequency the whole time.

A lot of CPUs don't like high multipliers; a 44-45 multi plus tweaks to your BCLK is what you want to aim for. BCLK seems to require less voltage than a straight multiplier increase: a 48 multi for me requires around 1.46v, while I'm doing 4.77GHz right now @ 1.425v thanks to the high BCLK. And mine is stable through about 10 runs of Vantage, 5 of Heaven, several games of SC2, 25 runs of LinX, 10 runs of IBT @ Very High, and 3 hours of Prime95 Blend.
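The trade described above boils down to simple arithmetic: core clock = multiplier x BCLK. A quick sketch, purely illustrative (the multiplier and BCLK ranges below are assumptions; Sandy Bridge boards only tolerate small BCLK deviations from 100MHz, so check your own board's limits):

```python
def core_mhz(multiplier, bclk_mhz=100.0):
    """Sandy Bridge core clock is simply multiplier x BCLK (default 100MHz)."""
    return multiplier * bclk_mhz

def combos_near(target_mhz, tolerance=50.0):
    """List (multiplier, BCLK, MHz) combos landing near a target core clock.
    Ranges here are illustrative, not guaranteed safe on any given board."""
    results = []
    for mult in range(44, 49):
        for bclk_tenths in range(980, 1081):  # 98.0 .. 108.0 MHz in 0.1 steps
            bclk = bclk_tenths / 10.0
            mhz = core_mhz(mult, bclk)
            if abs(mhz - target_mhz) <= tolerance:
                results.append((mult, bclk, mhz))
    return results
```

For example, `core_mhz(45, 106.0)` gives 4770MHz, matching the 4.77GHz mentioned above, while `core_mhz(48)` is the straight 4.8GHz multiplier route that needed more voltage.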


----------



## Wogga

I'm running 24/7 at 5GHz @ 1.488v, and some people can't even get higher than 4.5GHz @ 1.45v - Sandy OC is all about luck with the CPU.
Btw, Internal PLL Overvoltage helps to get higher clocks; I couldn't boot with a multiplier higher than x47 without it.


----------



## RagingCain

Just a heads up 280.26 WHQL drivers have been released:
http://www.nvidia.com/object/win7-winvista-64bit-280.26-whql-driver.html

I don't think performance has changed a great deal, but I will be switching in a few hours; *I will mention anything regarding voltage being unlocked*.

According to the release notes, all previous fixes from the 280.19 drivers are included, with minor tweaking, mostly for extra 3D profiles. Emphasis has been placed on Crysis 2 and its 3D support.

Claims to have Crysis 2 flickering in dual/tri SLI setups fixed. No mention of Quad, but then I didn't have flickering...

Looks like this isn't a major change, but has some tweaking. Definitely can't be bad.


----------



## Wogga

Installed 280.26 and now I can't change the voltage... it's fixed at 0.975v (btw, not even stock voltage).


----------



## RagingCain

Quote:


> Originally Posted by *Wogga;14519748*
> Installed 280.26 and now I can't change the voltage... it's fixed at 0.975v (btw, not even stock voltage).


Confirmed, voltage is locked down again. However, 0.963v is no longer the maximum; my voltage still reads 0.975v like Wogga said. So I don't know if it's something in my BIOS, or whether Nvidia locked it to 0.975v and left the default alone, i.e. changed the maximum voltage to 0.975v. I will play around with a BIOS flash later. I recommend sticking with 280.19 unless you want to play Crysis 2... I might test out a BIOS with the default voltage at 1.05v to see if it will allow going that high.

I wouldn't get upset just yet; it could just be a compromise between catering to enthusiasts and maintaining product safety. Most of us know that the sweet spot is about 725MHz @ 0.975v anyway.

@Smo, Tept
I didn't want to leave it in the dark, so I hunted down the product specification sheet from Intel, and much to my surprise, Intel does say up to 1.52v - and by all estimates Intel is usually on the conservative side.

I included the proof:

















Since there were notes, I included them too.

If you want a direct link:
http://www.intel.com/content/www/us/en/processors/core/2nd-gen-core-desktop-vol-1-datasheet.html


----------



## Robitussin

Do they still have voltage unlocked? It's only with the 2.2 beta of Afterburner, right? SB overclocking is definitely more luck of the draw; as you can see, it took a good amount of vcore for me to get to 4.5GHz, but I'm happy just to have a rig up and running again. From what I understand, some chips just plain won't go past a certain multi.


----------



## rush2049

The voltage is locked with 280.26

I can only go to 0.913v now, instead of the 0.950v I need to be stable at 710MHz.

I am going to test the crysis 2 fix claims, if it works I will stay on this one, if not I am going to revert for the overclock.

(I have a gtx 275 for physx and it causes flickering in crysis 2)


----------



## Wogga

Hmmm... could I let PhysX be done by the CPU in 280.19? I just found this option and didn't notice that it was there before.


----------



## rush2049

Quote:



The voltage is locked with 280.26

I can only go to 0.913v now, instead of the 0.950v I need to be stable at 710MHz.

I am going to test the crysis 2 fix claims, if it works I will stay on this one, if not I am going to revert for the overclock.

(I have a gtx 275 for physx and it causes flickering in crysis 2)


Confirmed: Crysis 2 flickering is gone, regardless of PhysX setting/setup.

Quote:



Hmmm... could I let PhysX be done by the CPU in 280.19? I just found this option and didn't notice that it was there before.


You could let your CPU do the PhysX computation, but it is highly inefficient at it... It is much better to set one of the GPUs in an SLI rig to do it, or, if you have a spare card, to set it on that one.


----------



## Canis-X

Would a 9800GT severely hamper PhysX performance if I used it as a dedicated PhysX card in this setup? It really helped my previous quad CrossFire setup when I had two 5970s; not sure if the same can be said for my current GPU setup.


----------



## RagingCain

Quote:



Originally Posted by *Canis-X*


Would a 9800GT severely hamper PhysX performance if I used it as a dedicated PhysX card in this setup? It really helped my previous quad CrossFire setup when I had two 5970s; not sure if the same can be said for my current GPU setup.


In 590 SLI, my opinion is that it's worse. The 590 GPU is more than capable of handling all the data going around plus PhysX.

Using the 9800GT would be a waste of power, a waste of PCI-E bandwidth, and a waste of CPU cycles for little or no benefit over using one of the 590 GPUs.

I don't have any data on this; it's just an observational opinion.


----------



## Canis-X

That's what I was thinking, but wanted to be sure......thank you sir!!


----------



## Smo

Unfortunately my chip couldn't manage 4.8GHz at a voltage I considered acceptable... however, I have achieved a stable overclock at 4.7GHz, which I'm very happy with! It has currently survived 45 minutes (when I started typing this post; now over 80 minutes) of turrican's extreme settings in the Prime95 Blend test at 1.432v, which I am content with for 24/7 use.

Max temps are below, with my CPU fan speed locked at 75%;

*Core 1 - 67c
Core 2 - 70c
Core 3 - 69c
Core 4 - 69c*

Prime95 is still going and temps are sat at 64-66c across the board. I'd like to extend a big thanks to everyone who offered their advice and helped me get this far.

Now, onto the interesting stuff - you may remember the whole reason I attempted the overclock was to test game performance. I ran 4 benchmarks with each game. These benchmarks were;

1. *CPU @ 4GHz, GTX 590 @ Stock (630MHz, 0.913v)*
2. *CPU @ 4GHz, GTX 590 Overclocked (670MHz, 0.950v)*
3. *CPU @ 4.7GHz, GTX 590 @ Stock (630MHz, 0.913v)*
4. *CPU @ 4.7GHz, GTX 590 Overclocked (670MHz, 0.950v)*

It is important to note that the drivers used for this test were 280.19, and the spec of the system tested was as follows;

- i5 2500k @ 4GHz & 4.7GHz
- Antec Kuhler H2O 920 CPU Cooler (Coolermaster Sickleflows P/P @ 75%)
- ASUS P8P67 Pro B3 Motherboard
- 8GB (2x4GB) G.SKill Ripjaws X RAM @ 8-8-8-24/1.5v
- EVGA Classified Edition GTX 590 @ 630MHz & 670MHz
- NZXT Hale90 850w Gold Power Supply

They were all tested on a *Benq XL2410T* monitor at *1920x1080/120Hz*.

Here are my results;

*Metro 2033*

*CPU @ 4GHz / 590 @ 630MHz*

*Average Framerate: 30.33*
Max. Framerate: 178.03
Min. Framerate: 2.74

*CPU @ 4.7GHz / 590 @ 630MHz*

*Average Framerate: 32.33*
Max. Framerate: 192.20
Min. Framerate: 6.17

*CPU @ 4GHz / 590 @ 670MHz*

*Average Framerate: 33.00*
Max. Framerate: 221.63
Min. Framerate: 6.70

*CPU @ 4.7GHz / 590 @ 670MHz*

*Average Framerate: 32.00*
Max. Framerate: 190.15
Min. Framerate: 5.30

*Crysis*

*CPU @ 4GHz / 590 @ 630MHz*

*Average Framerate: 54.20*
Max. Framerate: 59.97
Min. Framerate: 23.24

*CPU @ 4.7GHz / 590 @ 630MHz*

*Average Framerate: 55.22*
Max. Framerate: 61.74
Min. Framerate: 28.96

*CPU @ 4GHz / 590 @ 670MHz*

*Average Framerate: 56.04*
Max. Framerate: 62.17
Min. Framerate: 31.55

*CPU @ 4.7GHz / 590 @ 670MHz*

*Average Framerate: 73.70*
Max. Framerate: 81.56
Min. Framerate: 36.47

*Mafia II*

*CPU @ 4GHz / 590 @ 630MHz*

*Average Framerate: 100.8*
Max. Framerate: -
Min. Framerate: -

*CPU @ 4.7GHz / 590 @ 630MHz*

*Average Framerate: 106.9*
Max. Framerate: -
Min. Framerate: -

*CPU @ 4GHz / 590 @ 670MHz*

*Average Framerate: 107.6*
Max. Framerate: -
Min. Framerate: -

*CPU @ 4.7GHz / 590 @ 670MHz*

*Average Framerate: 109*
Max. Framerate: -
Min. Framerate: -

*GTA IV*

*CPU @ 4GHz / 590 @ 630MHz*

*Average Framerate: 62.8*
CPU Usage: 56%

*CPU @ 4.7GHz / 590 @ 630MHz*

*Average Framerate: 63.97*
CPU Usage: 50%

*CPU @ 4GHz / 590 @ 670MHz*

*Average Framerate: 62.1*
CPU Usage: 55%

*CPU @ 4.7GHz / 590 @ 670MHz*

*Average Framerate: 64.65*
CPU Usage: 53%

Some interesting results to say the least! I must say though, that throughout these tests my GPU usage was all over the place - in all the games. It was barely, if ever, at a solid 90%+ usage for any length of time.
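For anyone wanting to compare the four configurations at a glance, the relative gains are easy to compute; this is just a scratch calculation on the averages copied straight from the results above, not part of any benchmark tool:

```python
# Average framerates from the results above, in the order:
# (4GHz/630MHz, 4.7GHz/630MHz, 4GHz/670MHz, 4.7GHz/670MHz)
averages = {
    "Metro 2033": (30.33, 32.33, 33.00, 32.00),
    "Crysis":     (54.20, 55.22, 56.04, 73.70),
    "Mafia II":   (100.8, 106.9, 107.6, 109.0),
    "GTA IV":     (62.8,  63.97, 62.1,  64.65),
}

def gain_pct(base, oc):
    """Percent FPS change relative to the fully-stock run."""
    return (oc - base) / base * 100.0

for game, (stock, cpu_oc, gpu_oc, both) in averages.items():
    print(f"{game}: CPU OC {gain_pct(stock, cpu_oc):+.1f}%, "
          f"GPU OC {gain_pct(stock, gpu_oc):+.1f}%, "
          f"both {gain_pct(stock, both):+.1f}%")
```

It makes the outlier obvious: the Crysis run with both overclocks gains roughly 36% over stock, while every other combination moves single digits.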


----------



## Tept

Quote:



Originally Posted by *Smo*


*[Smo's full benchmark post quoted; snipped for length, see the post above]*


Pick up The Witcher 2 if you wanna test if your gpu's run like they are supposed to.


----------



## Smo

Quote:



Originally Posted by *Tept*


Pick up The Witcher 2 if you wanna test if your gpu's run like they are supposed to.


I do have it actually - I haven't reinstalled it since I formatted my PC a couple of weeks ago. I could chuck it back on and benchmark it with Fraps I guess!

However - it's extremely poor that my GPUs would run at max in only one game...


----------



## RagingCain

Quote:



Originally Posted by *Smo*


*[Smo's full benchmark post quoted; snipped for length, see the post above]*


Good job sticking with it; you probably learned how to "drive" your system a little better, which is good for anyone. Hell, I am usually surprised every other day by something that greatly alters performance, for good or bad.

It's what I thought: we are essentially at the mercy of nVidia's driver team, but it was worth a good try. I will say this though, it can only get better... take that for what you will.

You should definitely jump in my 590 vs 6990 thread, I don't have another soul benchmarking.

Just read the main post (link is in my sig), and you will be in your own winner's circle of single 590s


----------



## RagingCain

630 MHz 590s vs 690 MHz 590s - DiRT 3 1920x1080x4/8AA

Stock









OC


----------



## MKHunt

Ohai, figured I should join. Currently stock, running 280.19 drivers, will OC when water cooling fittings arrive/when I get a game that can challenge this card.



I have a Koolance block I'll install when the fittings arrive. It's ballin.









ETA:
I would have included a 3dMark11 bench but it dislikes the beta drivers, and I can't be arsed to do 280.26 because my internet BLOWS.
3DMark Stock @ 4ghz: Here.
3DMark Overclocked @ 4Ghz: Here.
3DMark Stock @ 3.4Ghz: Here.
3DMark Overclocked @ 3.4Ghz: Here.

Heaven at 1080


Heaven 1050


----------



## HOOPAJOO

I got my full tower case in. Lian-Li A77F. I am no longer ashamed lol.

This is the rig after several hours of cable management:


















And here's my desk setup:


----------



## Smo

Quote:


> Originally Posted by *RagingCain;14524598*
> Good job sticking with it, you probably learned how to "drive" your system a little better, which is good for anyone. Hell, I am usually surprised every other day with something that greatly alters performance both good or bad.
> 
> Its what I thought, we are essentially at the mercy of Driver Team nVidia, but it was worth a good try. I will say this though, it can only get better... take that for what you will.
> 
> You should definitely jump in my 590 vs 6990 thread, I don't have another soul benchmarking.
> 
> Just read the main post (link is in my sig), and you will be in your own winner's circle of single 590s


I've wanted to get involved in that thread for a while, but I thought something to do with my rig was faulty because of the erratic GPU usage - I haven't really seen anybody else complaining about it. It's very obvious that the GTX 590 is performing way below its capabilities.

*Edit:* I couldn't see your DiRT 3 screens earlier as I was on Android Tapatalk (which blows). I'm amazed at how your overclock on the 590s has affected the GPU usage - the difference is tremendous (even if FPS didn't change much).


----------



## RagingCain

Quote:



Originally Posted by *Smo*


I've wanted to get involved in that thread for a while, but I thought something to do with my rig was faulty because of the erratic GPU usage - I haven't really seen anybody else complaining about it. It's very obvious that the GTX 590 is performing way below its capabilities.

*Edit: *I couldn't see your DiRT 3 screens earlier as I was on Android Tapatalk (which blows). I'm amazed at how your overclock on the 590s has affected the GPU usage - the difference is tremendous (even if FPS didn't change much).


Thanks. Believe it or not, every time the average fps goes up by about 5, it's considerably smoother and faster overall. I have some amazing results for today. Of course I have no idea how amazing they are compared to other 590s or 6990s SINCE NO ONE IS BENCHING





































I just wanted to make an announcement to overclockers: do not go up to 1.05v. There must be some trigger in the drivers that just crashes your cards. Instead, try doing everything you can from 1.038v. Just saved myself countless future hours of tweaking my OCs.

825MHz @ 1.05v - Completely unstable, tested 5 times (involving a reboot each time.)
830MHz @ 1.038v - Completely stable below 55c, tested 6 times (crashed at 55c+).

Also, the cards seem much more stable below 55c. I watched my temps hit 57c today after about 3 hours of benchmarking (DiRT 3), most of it actually in the benchmark, and I went from stable to crash on 3 separate occasions. I let the cards cool down to an idle 29/30c, and the same overclock/overvoltage was stable for 7 runs x 3 minutes each; the max temperature was 47c. After 5 more runs, I hit 55c on 3 of the GPUs, and then it just crashed halfway through the next run (I couldn't see temps before the crash.) Again, same overclock/voltage.
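A rough sketch of how one could automate that temperature check during a bench session. This is only an illustration: the 55c threshold comes from the observations above, and the `nvidia-smi --query-gpu` flags assume a driver recent enough to support them (drivers from this era may not):

```python
import subprocess

CRASH_THRESHOLD_C = 55  # observed stability ceiling for this particular OC


def is_safe(temps_c, threshold=CRASH_THRESHOLD_C):
    """True only if every GPU is below the temperature where crashes began."""
    return all(t < threshold for t in temps_c)


def read_gpu_temps():
    """Read per-GPU core temps via nvidia-smi (one integer per GPU)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"], text=True)
    return [int(line) for line in out.splitlines() if line.strip()]
```

Pausing the run and letting the cards fall back to idle whenever `is_safe()` goes false mirrors what was done manually above.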


----------



## Smo

Quote:



Originally Posted by *RagingCain*


Thanks. Believe it or not, every time the average fps goes up by about 5, it's considerably smoother and faster overall. I have some amazing results for today.

*[screenshots snipped]*

I just wanted to make an announcement to overclockers: do not go up to 1.05v. There must be some trigger in the drivers that just crashes your cards. Instead, try doing everything you can from 1.038v.

*[stability details snipped; see the post above]*


The voltage on these cards is a mystery!

I think I made a breakthrough with my GPU usage just now though - I sat and thought about exactly what's been changing with my PC recently when it hit me - my monitors. I'm now on BenQ XL2410Ts rather than the Dell 2407WFP, so the issue must be something to do with VSync and refresh rates.

I disabled VSync in game and my GPU usage shot up to 98% and hovered about there with the occasional dip (as is expected). However, I still get awful tearing, as Crysis runs in DX10, which for some reason appears to be locked at 60Hz. So it's a bit of a double-edged sword in that respect: I can either have erratic GPU usage and poor FPS, or an average of 70+ FPS and horrible screen tearing. Awesome!

In other news it also appears that the 590 sits at the maximum allowed voltage if you're running two or more monitors from one card due to the power state. It also stops the core clocks from idling. I'm currently at 690MHz Core and 1380MHz shader @ 0.963v, but due to the power state, my card is constantly at 0.963v rather than dropping to the standard voltage of 0.913v.

I hate this personally, as I like to think that my components can 'rest' when I'm not using them and therefore live longer. This may not be factually correct, but it makes me feel better


----------



## RagingCain

Quote:



Originally Posted by *Smo*


The voltage on these cards is a mystery!

I think I made a breakthrough with my GPU usage just now though - I sat and thought about exactly what's been changing with my PC recently when it hit me - my monitors. I'm now on BenQ XL2410Ts rather than the Dell 2407WFP, so the issue must be something to do with VSync and refresh rates.

I disabled VSync in game and my GPU usage shot up to 98% and hovered about there with the occasional dip (as is expected). However, I still get awful tearing, as Crysis runs in DX10, which for some reason appears to be locked at 60Hz. So it's a bit of a double-edged sword in that respect: I can either have erratic GPU usage and poor FPS, or an average of 70+ FPS and horrible screen tearing. Awesome!

In other news it also appears that the 590 sits at the maximum allowed voltage if you're running two or more monitors from one card due to the power state. It also stops the core clocks from idling. I'm currently at 690MHz Core and 1380MHz shader @ 0.963v, but due to the power state, my card is constantly at 0.963v rather than dropping to the standard voltage of 0.913v.

I hate this personally, as I like to think that my components can 'rest' when I'm not using them and therefore live longer. This may not be factually correct, but it makes me feel better










Suggest getting a 3rd monitor (and another 590); that crap is ridiculous. I have 3 monitors and all 4 GPUs are allowed to idle.

VSync almost always lowers GPU usage, but I noticed that it lowers GPU usage even when it's not hitting your refresh rate (120Hz). So I agree, VSync definitely could use some work.

Considering how far multi-monitor support has come since launch with these cards, I can't really complain. Day one for me was excruciating; I didn't have 4 usable GPUs till the second week, it was stuck at 3 GPUs haha.

P.S. Benchmark!


----------



## Smo

Quote:



Originally Posted by *RagingCain*


Suggest getting a 3rd monitor (and another 590), that crap is ridiculous, I have 3 monitors and all 4 GPUs are allowed to idle.

Vsync almost always lowers GPU usage, but I noticed that Vsync lowers GPU usage even if its not hitting your refresh rate (120 Hz). So, I agree Vsync definitely could use some work.

Considering how far multi-monitor support has come since launch with these cards, I can't really complain. Day one for me was excruciating. I didn't have 4 usable GPUs till the second week, it was stuck in 3 GPUs haha.

P.S. Benchmark!


I wish - I don't have the money for another 590 just now, nor will I for some time (plus I don't think I actually want one, considering scaling, VRAM etc.). It's my Mum's birthday this week, my Dad's two weeks after that, then my Mrs' a month after that! No money for me









I'll pop over to the benchmark thread and take a quick look at what's going on.

I also reinstalled The Witcher 2 and patch 1.3, with everything fully maxed minus ubersampling. It was pushing the card extremely hard, but it was still delivering over 60fps in fight scenes, 120+ indoors and 70-100 roaming. With the fan set at 95% the GPUs still hit 83c - talk about stress!


----------



## HOOPAJOO

For me, at stock speeds, Crysis 1 is completely playable at 2560x1600 with high settings and no AA, but not on enthusiast settings. With high settings and no AA it gets between 40-80 fps, mostly averaging around 60, with 95% GPU utilization.


----------



## Manac0r

Been a while; I was away and came back to the 280.26 WHQL. I installed them and, lo and behold, I could raise my voltage to 0.975v. I could lower it too, but before I was stuck at 0.963v.

Well, with this increase in voltage I decided to see how it would affect my overclock.

Before, I was completely stable at 0.963v, 700MHz core and 1800MHz mem. Temps never passed the high 60s.

With the tweak in voltage I got a stable OC at 0.975v, 720MHz core, 1850MHz mem. However, under heavy loads temps crept up to 83 DEGREES







These are water cooled as well.

Am I best off sticking with the original overclock? I'm not too happy with the temps on my second OC. I do notice a great performance increase in Crysis 2 and The Witcher 2, but I'm not sure the temp payoff is worth it.

Not sure if this counts as proof, but here are my 3DMark11 results from my first OC; I will upload the results from the second after a wee bit of tweaking.

I uploaded my second results as a PDF; an increase of about 190.
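As a rough sanity check on that voltage/clock bump: dynamic power in a GPU scales roughly with frequency times voltage squared, so the expected extra heat can be estimated. This is a back-of-the-envelope rule of thumb, not an exact model of the 590:

```python
# Back-of-the-envelope estimate of the extra heat from the OC bump above.
# Rule of thumb: dynamic power scales ~ f * V^2 (rough approximation only).

def power_ratio(f_old_mhz, v_old, f_new_mhz, v_new):
    """Relative dynamic power of the new operating point vs the old one."""
    return (f_new_mhz / f_old_mhz) * (v_new / v_old) ** 2

ratio = power_ratio(700, 0.963, 720, 0.975)
print(f"Estimated power increase: {(ratio - 1) * 100:.1f}%")  # ~5.4%
```

A ~5% power increase doesn't explain a jump from the high 60s to 83c on its own, which is why the radiator and ambient-temperature questions are the right ones to ask.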


----------



## RagingCain

Quote:


> Originally Posted by *Manac0r;14550334*
> Been a while - I was away and came back to the 280.26 WHQL drivers. I installed them and, lo and behold, I could raise my voltage to 0.975v. I could lower it too, whereas before I was stuck at 0.963v.
> 
> With this increase in voltage I decided to see how it would affect my overclock.
> 
> Before I was completely stable at 0.963v 700Mhz and 1800Mhz Mem. Temps never passed high 60's.
> 
> With the tweak in voltage I got a stable OC at 0.975v, 720MHz core and 1850MHz mem. However, under heavy loads temps crept up to 83 DEGREES
> 
> 
> 
> 
> 
> 
> 
> These are water cooled as well.
> 
> Am I best off sticking with the original overclock? I'm not too happy with the temps on my second OC. I do notice a great performance increase in Crysis 2 and The Witcher 2, but I'm not sure the temp payoff is worth it.
> 
> Not sure if this counts as proof, but here are my 3DMark11 results from my first OC; I will upload the results from the second after a wee bit of tweaking.
> 
> I uploaded my second results as a PDF, increase of about 190


How many radiators do you have, and what is your water cooling setup? That temperature is completely unacceptable.

And what is ambient (room) temperature... this isn't a desert is it?


----------



## Manac0r

Quote:



Originally Posted by *RagingCain*


How many radiators do you have, and what is your water cooling setup? That temperature is completely unacceptable.

And what is ambient (room) temperature... this isn't a desert is it?


Room temp is about 26 degrees.

I have two 360mm radiators - the cards on one loop, the CPU on another.

The 80s seemed to be a one-off, but when I increase the clock speed past 700MHz I do tend to reach the 70s.

Idle temps are mid 40s.


----------



## capchaos

Those temps are high for two 360 rads


----------



## RagingCain

Very high - mine run warm, idling at 29c with a max temp of 59c over 4 hours. I am using 2x 360s right now.

I need another 360; with it, idle temps should be about 25c and max around 45c.

Is it just one GPU / one card?

Does your CPU or whatever you water cool get hot too?


----------



## Manac0r

Quote:



Originally Posted by *RagingCain*


Very high, mine are warm and idle at 29 max temp of 4 hours at 59c. I am using 2x360s right now.

I need another 360, idle temp will be about 25, max temp around 45c.

Is it just one GPU / one card?

Does your CPU or whatever you water cool get hot too?


CPU hits 60 degrees under heavy load.

At stock speeds the cards idle at around 30 and go up to the high 50s. I only get this increase in temp when overclocking.

At a 700MHz clock I get 65-67 degrees max.
At a 720MHz clock I get 75-77 degrees.

Question: is it a good or bad idea to have the water tubing tied up? To maximize airflow in my case I tied up several of my water cooling tubes - might this be causing it?

I have a very bad pic attached - camera issues at the moment - but if you squint real hard you will see what I mean.

I should add the PC has been on all day, doing some Adobe work in After Effects/Photoshop, followed by 6 hours of intense gaming.

E: Seems to have stabled out - staying in the high 50s now at 700MHz. Thanks for the input guys. Think I've just been burning the midnight oil, literally.


----------



## HOOPAJOO

Are you running antifreeze instead of water?


----------



## Smo

Those are crazy temps, at 700MHz my card is only 5c hotter than yours (mine hits 73c @ 95% fan) and I'm on air. Something is definitely wrong - what sort of fans are on your rad?


----------



## HOOPAJOO

Maybe your pc has low self esteem. "Look, we're not on air anymore. You don't have to run so hot anymore."


----------



## Manac0r

Quote:


> Originally Posted by *Smo;14554678*
> Those are crazy temps, at 700MHz my card is only 5c hotter than yours (mine hits 73c @ 95% fan) and I'm on air. Something is definitely wrong - what sort of fans are on your rad?


By the way, if you want that clock to idle correctly on a multi-monitor setup: in NVIDIA Inspector, if you right-click on Overclocking you'll see Multi Display Power Saver. You can set it up so it will drop that clock for you, and you can specify when to idle - might be useful to you.

Even with 2x 590s and my monitors plugged into separate cards, I still get one card's clocks constantly at full speed. Only disabling SLI makes them idle correctly.


----------



## RagingCain

Quote:


> Originally Posted by *Manac0r;14556714*
> By the way, if you want that clock to idle correctly on a multi-monitor setup: in NVIDIA Inspector, if you right-click on Overclocking you'll see Multi Display Power Saver. You can set it up so it will drop that clock for you, and you can specify when to idle - might be useful to you.
> 
> Even with 2x 590s and my monitors plugged into separate cards, I still get one card's clocks constantly at full speed. Only disabling SLI makes them idle correctly.


Are you on two monitors or three?










Like I mentioned earlier, if 3 monitors can idle, I don't see why 2 can't.

Edit:
Oh wow, apparently this is a HUGE deal for all nVidia users! I had no idea. I just looked it all up and Manac0r is right - all you have to do is enable Multi-Monitor power saving. Just be careful using Photoshop with GPU/CUDA acceleration enabled, or playing an HD movie across two monitors.

Edit:
http://forums.anandtech.com/showpost.php?p=31634894&postcount=28
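For anyone wondering what Multi Display Power Saver actually does: roughly, it watches GPU utilization and forces the card into a lower performance state whenever usage stays below a user-set threshold. A simplified sketch of that decision logic (the threshold and P-state names here are illustrative, not NVIDIA Inspector's actual internals):

```python
# Simplified sketch of multi-display power-saver logic: pick a low P-state
# when GPU utilization is under a user-set threshold, full clocks otherwise.
# Threshold and state names are illustrative, not Inspector's actual values.

def choose_pstate(utilization_pct, threshold_pct=30):
    """Return 'P12' (2D idle clocks) when load is light, else 'P0' (full 3D clocks)."""
    return "P12" if utilization_pct < threshold_pct else "P0"

samples = [5, 8, 12, 95, 60, 7]
print([choose_pstate(u) for u in samples])  # ['P12', 'P12', 'P12', 'P0', 'P0', 'P12']
```

The caveat mentioned above follows directly from this: GPU-accelerated Photoshop or HD video playback can sit just under the threshold, so the card gets forced to idle clocks while it actually needs them.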


----------



## Smo

Thanks for the suggestion but that REALLY didn't work for me - it turned my displays into a sea of god knows what. Looked like the graphics card was dead!

Turned it back off


----------



## MKHunt

So I did some benchmarks with RagingCain's help (total bench noob) and here are the results:

3DMark Stock 630/1728 @ 3.4Ghz: P9458 Gfx 9951
Here.

3DMark Stock 630/1728 @ 4ghz: P9671 Gfx 9990
Here.

3DMark Overclocked 670/1758 @ 3.4Ghz: P9887 Gfx 10525
Here.

3DMark Overclocked 670/1758 @ 4Ghz: P10096 Gfx 10514
Here.

Oh man benchmarking can be fun. Now I really can't wait for water. Raging Cain you can bet I'll throw my 590 in your benching thread once it's under water!
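Taking the four scores above, a quick sketch separates what the CPU bump and the GPU overclock each contributed (just arithmetic on the posted numbers):

```python
# Percentage gains from the 3DMark11 runs posted above.
runs = {
    ("stock", 3.4): (9458, 9951),    # (P score, Graphics score)
    ("stock", 4.0): (9671, 9990),
    ("oc",    3.4): (9887, 10525),
    ("oc",    4.0): (10096, 10514),
}

def pct_gain(new, old):
    """Percentage improvement of `new` over `old`."""
    return (new / old - 1) * 100

gpu_gain = pct_gain(runs[("oc", 3.4)][1], runs[("stock", 3.4)][1])
cpu_gain = pct_gain(runs[("stock", 4.0)][0], runs[("stock", 3.4)][0])
print(f"GPU OC graphics gain at 3.4GHz:      {gpu_gain:.1f}%")  # ~5.8%
print(f"CPU 3.4->4.0GHz P-score gain, stock: {cpu_gain:.1f}%")  # ~2.3%
```

Notice the graphics score barely moves with CPU clock - the GPU overclock is what drives it, which is the expected pattern for a GPU-bound benchmark.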


----------



## RagingCain

Quote:



Originally Posted by *Smo*


Thanks for the suggestion but that REALLY didn't work for me - it turned my displays into a sea of god knows what. Looked like the graphics card was dead!

Turned it back off










I am sorry it didn't work









I am not a huge fan of using nVidia Inspector (most of the time it doesn't do whatever I'm trying to do), but I like having it to experiment with.

Quote:



Originally Posted by *MKHunt*


So I did some benchmarks with RagingCain's help (total bench noob) and here are the results:

3DMark Stock 630/1728 @ 3.4Ghz: P9458 Gfx 9951
Here.

3DMark Stock 630/1728 @ 4ghz: P9671 Gfx 9990
Here.

3DMark Overclocked 670/1758 @ 3.4Ghz: P9887 Gfx 10525
Here.

3DMark Overclocked 670/1758 @ 4Ghz: P10096 Gfx 10514
Here.

Oh man benchmarking can be fun. Now I really can't wait for water. Raging Cain you can bet I'll throw my 590 in your benching thread once it's under water!










Would be awesome! Benching is totally addictive, but I have to remember to actually play my games once in a while or I burn out.


----------



## ilukeberry

Does anyone else have this problem? The fan randomly spins up (to 95%) on my GTX 590. It happens during desktop use or in game; it happened twice today. I've had this card since April and it never happened before. I'm currently using the 280.26 drivers.


----------



## Smo

Quote:



Originally Posted by *ilukeberry*


Does anyone else have this problem? The fan randomly spins up (to 95%) on my GTX 590. It happens during desktop use or in game; it happened twice today. I've had this card since April and it never happened before. I'm currently using the 280.26 drivers.


The new 280.26 drivers are very buggy - on the GeForce forums there are a lot of unhappy people (one in particular I remember mentioning a similar problem). I would suggest reverting to 280.19 beta for a short while just to see if it happens again.

That said, are you using custom fan profiles in a program such as Afterburner or Precision?


----------



## ilukeberry

No, I'm using the default auto profile. I'm going back to 275.33


----------



## emett

Yeah, I also had this happen a couple of times after I changed drivers, but I'm not too worried, as both times I immediately checked the GPU temps and they were normal.


----------



## ilukeberry

GPU temps were fine on my card too - I checked in Afterburner. It was just the fan spinning up to 95% and then slowly going back to 40%. Now I'm back on 275.33 and so far I haven't had this issue.


----------



## emett

Yeah, same issue - it's bloody loud when it happens. I ripped the side off my PC so I could tell which fan was making the noise.


----------



## Manac0r

Never mind. Fixed it.


----------



## Jeppzer

I just saw this behemoth.
I'm in love.










ROG Mars II

Anyone wanna buy two gtx 590's?








If I can find a mobo that can hold two mars II in SLI...









----------



## RagingCain

Quote:


> Originally Posted by *Jeppzer;14583584*
> I just saw this behemoth.
> I'm in love.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ROG Mars II
> 
> Anyone wanna buy two gtx 590's?
> 
> 
> 
> 
> 
> 
> 
> 
> If I can find a mobo that can hold two mars II in SLI...
> 
> 
> 
> 
> 
> 
> 
> .


I really, really, really want 2 of these cards. I am hoping all my hard work on making the 590s more awesome will earn me enough karma points that two MARS IIs show up in the mail - with a cherry on top, addressed to a Steve Jobs. Getting his cards would make me non-cynical... for a week.


----------



## Jeppzer

A PCI-E extender and I'd have enough room on my mobo for two of those...

Now I need to find a buyer for my 590's.








And a big hope to god that they'd still have two of those left whenever I place an order.

Though, I read somewhere that ASUS is close to revealing an ARES MARS GTX 595...

Maybe I'll wait for that and then blow my savings on an upgrade.


----------



## RagingCain

Quote:


> Originally Posted by *Jeppzer;14585138*
> A PCI-E extender and I'd have enough room on my mobo for two of those...
> 
> Now I need to find a buyer for my 590's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And a big hope to god that they'd still have two of those left whenever I place an order.
> 
> Though, I read somewhere that ASUS is close to revealing an ARES MARS GTX 595...
> 
> Maybe I'll wait for that and then blow my savings on an upgrade.


This is what the rumors of a 595 were about. This card is really what we hoped the 590 would be.


----------



## MKHunt

I thought the MARS 2 looked interesting, but suddenly, this.

Also, it's only 3GB of memory - not that much of an upgrade over the 590, but it costs twice as much as an EVGA HydroCopper. Yikes. I'd rather wait for the flagship 600 series


----------



## Opp47

Wow.. that thing looks awesome...









Kinda pricey tho.. Dontcha think..









*BOTH* my HydroCoppers cost just a tad more than one of those MARS monsters..
it just doesn't seem to be worth the extra coin IMO..


----------



## jcde7ago

I'd definitely stick with a pair of 590s OC'd to 700-720 core each over a pair of these. Not going to lie, the MARS II is a beast of a card, but it's really not worth the dough considering you can score a pair of 590s for the price of ONE of these... yeah, quad-SLI scaling isn't that great, but it's still a lot more power under the hood.

Also, once I move to a Sandy Bridge/U3011 setup, I'll be looking at Southern Islands/Kepler since 1.5GB of VRAM won't be nearly enough...


----------



## Wogga

All the news about the Ares 595 is about 3 months old... didn't they change it to the Mars II?


----------



## Jeppzer

Quote:


> The beast supports 4-way SLI


Now we're talking expensive, 4 mars 2 cards..
I'm not going to lie, I am drooling right now.


----------



## Smo

Quote:


> Originally Posted by *Jeppzer;14588231*
> Now we're talking expensive, 4 mars 2 cards..
> I'm not going to lie, I am drooling right now.


Not 4 mate, just 2. Remember they're two GPUs per PCB.


----------



## Wogga

cool, but no watercooling -_-


----------



## Masked

We have 2 of the MARS II cards in the office and they're really not worth it... Loud, too.

That aside, in a meeting this morning w/NVIDIA a birdie told me that in a few months the cards will be opened up a bit more... allowing for a bit more OC'ing room.

Considering I can hit 750MHz at 0.96v I'm somewhat happy.

Still very... unimpressed by this series of cards... especially since mine isn't performing exactly up to "par".

I find myself using the damn 6990's more than I do the 590's these days.


----------



## CSHawkeye

Just wondering, as I might go back to a GTX 590 until the new stuff comes out - how well does this card do in World of Tanks?


----------



## Chobbit

Hi, maybe you'se could help me to join this club.

I have an opportunity to buy a Zotac 590 which is classed as faulty but still works in games. The fault is described by someone who doesn't understand hardware that well, but apparently someone else has told them that "only one of the two chips is showing as working in its properties".

This sounds to me like both cores might be working fine, but only one is being displayed in either 'Device Manager' or 'GPU-ID' etc., and it could just be a driver issue or the card might need flashing.

Does anyone else think this could be the case or have had this issue?

Also, what do you'se think would be a good price to pay for a second-hand GTX 590 where, in the end, only one chip may be working, and which probably wouldn't have any warranty, seeing as it's being offered to me and not returned?

Thanks


----------



## Opp47

Quote:



Originally Posted by *Chobbit*


Hi, maybe you'se could help me to join this club.

I have an opportunity to buy a Zotac 590 which is classed as faulty but still works in games. The fault is described by someone who doesn't understand hardware that well, but apparently someone else has told them that "only one of the two chips is showing as working in its properties".

This sounds to me like both cores might be working fine, but only one is being displayed in either 'Device Manager' or 'GPU-ID' etc., and it could just be a driver issue or the card might need flashing.

Does anyone else think this could be the case or have had this issue?

Also, what do you'se think would be a good price to pay for a second-hand GTX 590 where, in the end, only one chip may be working, and which probably wouldn't have any warranty, seeing as it's being offered to me and not returned?

Thanks


I think drivers or a new flash might do the trick... but...
make sure you don't pay more than 200 or 300 for it, in case it ends up really not working


----------



## Masked

Quote:



Originally Posted by *Chobbit*


Hi, maybe you'se could help me to join this club.

I have an opportunity to buy a Zotac 590 which is classed as faulty but still works in games. The fault is described by someone who doesn't understand hardware that well, but apparently someone else has told them that "only one of the two chips is showing as working in its properties".

This sounds to me like both cores might be working fine, but only one is being displayed in either 'Device Manager' or 'GPU-ID' etc., and it could just be a driver issue or the card might need flashing.

Does anyone else think this could be the case or have had this issue?

Also, what do you'se think would be a good price to pay for a second-hand GTX 590 where, in the end, only one chip may be working, and which probably wouldn't have any warranty, seeing as it's being offered to me and not returned?

Thanks


That actually happens frequently in most games...By far not a faulty card...

If one of the chips didn't work, the card, itself, wouldn't work.


----------



## RagingCain

Quote:



Originally Posted by *Masked*


That actually happens frequently in most games...By far not a faulty card...

If one of the chips didn't work, the card, itself, wouldn't work.


That's what I would say. Perhaps it's just a driver/game issue too. Have the heart to tell the poor guy









Post powered by DROID X2


----------



## Mixan

Have been following this nice thread for some time now and I would like to join









I have just started to overclock my GTX 590 and wondered if I need to mod and flash my BIOS to be able to use 1v or more for my GPU cores?

I'm using the 280.19 drivers and I'm currently locked to 0.988v in Afterburner, since setting 1v doesn't give me any actual boost while benching with FurMark - so I feel limited while temps are still good


----------



## Smo

Welcome to the club mate - the 590 actually overclocks fairly well, considering how widely people claim they're totally awful and will explode. There are people running up to and over 200MHz more than stock at 1.05/1.063v and their cards are alright.

I'm currently overclocked to 700/1400 @ 0.950v on an EVGA 590 (stock 630/1260 @ 0.913v) and the card is perfectly fine. I've limited myself to this overclock due to my temperatures hitting ~73c at full load. I could go higher, but I like that temp!

Strangely though, the 280.19 drivers should allow you to reach 1.05v; I think 1.063v requires one of Cain's modified BIOSes (available through the link in his sig).

Let us know how you get on


----------



## Masked

Quote:



Originally Posted by *Smo*


Welcome to the club mate - the 590 actually overclocks fairly well, considering how widely people claim they're totally awful and will explode. There are people running up to and over 200MHz more than stock at 1.05/1.063v and their cards are alright.

I'm currently overclocked to 700/1400 @ 0.950v on an EVGA 590 (stock 630/1260 @ 0.913v) and the card is perfectly fine. I've limited myself to this overclock due to my temperatures hitting ~73c at full load. I could go higher, but I like that temp!

Strangely though, the 280.19 drivers should allow you to reach 1.05v; I think 1.063v requires one of Cain's modified BIOSes (available through the link in his sig).

Let us know how you get on










You know... one of the interns came in this morning and actually repeated this, which is the only reason I'm commenting.

If you take your card above 1.05v on either core, the other raises automatically to 1.063v etc. to compensate.

The only instances of the GTX 590 "blowing up" have been over 1.05v, which is why your warranty (if you read the 3/4 press releases) only covers you up to 1.04/1.03v (that includes the differential; 1.05v is your absolute wall).

If you have AIDA or something equally in-depth, it will give you the precise voltage reading of core 1 and core 2... there will always be a 0.01-0.015v differential.

So essentially what I'm saying is: if you take this card above 1.05v, you do so at your own risk, and if it blows up it's your fault, because they've told you... they will NOT cover your warranty.

I actually had to refuse 2 or 3 warranties this past week... one was because a customer took his card to 1.15v and, when the 590 fried, expected me to send him a new one... not only was he a righteous prick, but NVIDIA has changed their policy via vendors and the warranty of this card (again, why I said 3/4 press releases).

Another customer took his card to 1.10v and I had to tell him, "Sorry, I sent you the press releases in our newsletters; if you didn't read them, it was your responsibility... and I'm sorry, but at this time I cannot honor your warranty per NVIDIA."

As long as you stay under 1.05v on BOTH cores, your warranty is still covered by the vast majority of manufacturers.

I'm not saying all manufacturers will deny you the warranty... I'm saying they've warned you several times and given you documentation... if you take it beyond that, don't be surprised when a vendor refuses to honor your warranty.


----------



## Smo

Quote:


> Originally Posted by *Masked;14600459*
> You know... one of the interns came in this morning and actually repeated this, which is the only reason I'm commenting.
> 
> If you take your card above 1.05v on either core, the other raises automatically to 1.063v etc. to compensate.
> 
> The only instances of the GTX 590 "blowing up" have been over 1.05v, which is why your warranty (if you read the 3/4 press releases) only covers you up to 1.04/1.03v (that includes the differential; 1.05v is your absolute wall).
> 
> If you have AIDA or something equally in-depth, it will give you the precise voltage reading of core 1 and core 2... there will always be a 0.01-0.015v differential.
> 
> So essentially what I'm saying is: if you take this card above 1.05v, you do so at your own risk, and if it blows up it's your fault, because they've told you... they will NOT cover your warranty.
> 
> I actually had to refuse 2 or 3 warranties this past week... one was because a customer took his card to 1.15v and, when the 590 fried, expected me to send him a new one... not only was he a righteous prick, but NVIDIA has changed their policy via vendors and the warranty of this card (again, why I said 3/4 press releases).
> 
> Another customer took his card to 1.10v and I had to tell him, "Sorry, I sent you the press releases in our newsletters; if you didn't read them, it was your responsibility... and I'm sorry, but at this time I cannot honor your warranty per NVIDIA."
> 
> As long as you stay under 1.05v on BOTH cores, your warranty is still covered by the vast majority of manufacturers.
> 
> I'm not saying all manufacturers will deny you the warranty... I'm saying they've warned you several times and given you documentation... if you take it beyond that, don't be surprised when a vendor refuses to honor your warranty.


Thank you very much for that - I'm glad to know it now! Fortunately I don't really want to increase my voltage further due to my operating temperatures; not only that, but my warranty is important to me. Looks like I'm staying at 700MHz for the moment, and the max voltage I'd be comfortable with would be more like 0.988v









Once again, cheers. + rep.


----------



## CSHawkeye

Might be trying my card again. Smo, any suggestions on drivers and settings? Which app did you use to OC the card?


----------



## Smo

Quote:



Originally Posted by *CSHawkeye*


Might be trying my card again. Smo, any suggestions on drivers and settings? Which app did you use to OC the card?


I'd go with driver set 280.19 and overclock with MSI Afterburner (although Precision should be fine). I've set my core voltage to 0.950v (950) for both cores and the core clock to 700MHz. I then ran Kombustor for 15 minutes, followed by the 3DMark11 and Heaven 2.5 benchmarks, to check maximum temps and stability. My fan profile is set to kick up to 95% over 60c, which keeps my max temp at ~73c.
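A fan profile like that is just a temperature-to-duty-cycle curve. Here's a sketch of how an Afterburner-style piecewise-linear profile maps out - the breakpoints below are illustrative, chosen to match the "95% over 60c" behaviour described above:

```python
# Sketch of an Afterburner-style fan curve: piecewise-linear map from GPU
# temperature (celsius) to fan duty cycle (%). Breakpoints are illustrative.
CURVE = [(0, 40), (50, 50), (60, 95), (100, 95)]  # (temp_c, fan_pct)

def fan_speed(temp_c):
    """Linearly interpolate the fan % for a given temperature."""
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # clamp above the last breakpoint

print(fan_speed(73))  # 95.0 - pinned at max, matching the "95% over 60c" profile
```

The steep 50c-to-60c segment is what makes the fan "kick in" - a flatter curve in that region trades noise for a few degrees.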


----------



## Jobotoo

Is there any probability of seeing a 6GB version of the 590? I would like to get a 590, but it does not have quite enough VRAM for my resolution.


----------



## 2010rig

Quote:



Originally Posted by *Jobotoo*


Is there any probability of seeing a 6GB version of the 590? I would like to get a 590, but it does not have quite enough VRAM for my resolution.


I think the closest bet would've been the ASUS Mars II, but that one is 1.5GB / GPU.

I haven't heard of any other manufacturers working on a refresh.


----------



## CSHawkeye

Quote:



Originally Posted by *Jobotoo*


Is there any probability of seeing a 6GB version of the 590? I would like to get a 590, but it does not have quite enough VRAM for my resolution.


I have the same monitor, will let you know how it goes..


----------



## Smo

Quote:



Originally Posted by *Jobotoo*


Is there any probability of seeing a 6GB version of the 590? I would like to get a 590, but it does not have quite enough VRAM for my resolution.


There hasn't been any indication as yet - NVIDIA wanted to keep the PCB of the 590 as small as they possibly could, which unfortunately is why the first revision of the 590 (the 501) has 'cost effective' (read: not particularly good) VRMs. In all fairness, I doubt they will ever release a 6GB model, especially with Kepler just around the corner.

I'm also 100% certain that you will hit the VRAM wall at that resolution - I max out my memory at 1080p in Crysis 2 with the High-Res pack and DX11 patches.
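For context on where the VRAM actually goes: the framebuffers themselves are a small slice of it even at 2560x1600 - textures and render targets dominate. A rough back-of-the-envelope (assuming 32-bit color and a 32-bit depth/stencil buffer; real usage varies by game and AA mode):

```python
# Rough framebuffer-size math for 2560x1600. Assumes 32-bit color and a
# 32-bit depth/stencil buffer; real usage varies by game and AA mode.
def buffer_mb(width, height, bytes_per_pixel=4):
    """Size of one full-screen buffer in MiB."""
    return width * height * bytes_per_pixel / 1024 / 1024

w, h = 2560, 1600
color = buffer_mb(w, h)                 # one color buffer
total = color * 3 + buffer_mb(w, h)     # triple-buffered color + depth/stencil
print(f"One color buffer: {color:.1f} MB")  # ~15.6 MB
print(f"Buffers total:    {total:.1f} MB")  # ~62.5 MB
```

Even with MSAA multiplying those buffers, the bulk of a 1.5GB card's memory goes to textures - which is why high-res texture packs, rather than resolution alone, are what push a 590 toward its VRAM limit.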


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Smo;14603424*
> There hasn't been any indication as yet - NVIDIA wanted to keep the PCB of the 590 as small as they possibly could, which unfortunately is why the first revision of the 590 (the 501) has 'cost effective' (read as; not particularly good) VRMs. In all fairness I doubt that they will ever release a 6GB model, especially with Kepler being just around the corner.
> 
> I'm also 100% certain that you will hit the VRAM wall at that resolution, I max out my memory at 1080p on Crysis 2 with the High-Res pack and DX11 patches.


There is a post here somewhere by RagingCain that explains that this is not the case.

I can't remember the specifics, but essentially he was explaining, with a lot better clarity than I can now, that the game isn't really hitting a VRAM limit but is just scaling its memory usage, so it looks as if it's always maxed.

If you're getting good framerates most of the time at the resolution you're worried about, then you're probably not hitting a VRam wall.

But what about micro stuttering, one might ask, in regards to the VRAM wall?

Check your HD, I say. Unless you're running an SSD or a VelociRaptor, often that micro stuttering is your HD not being fast enough (even at 7200rpm) to keep up with the cards/games, and you see jitters or frame rate stutters as the HD slows the game down while feeding it bits of info.

But SMO, maybe in your case, you are hitting the wall running two XL2410T (so Jedi!)

Yet, if running one monitor, even an IPS at 2560x1600, I'd be surprised if walls were being hit especially with Quad SLI.

But I'll be the first to admit, I'm a gamer first and not a programmer or Hardware specialist, so I could be wrong on all of this.

I'm sure RagingCain can clarify when he pops on here.

Personally, I don't think one, or even two 590's is going to hit the same limitations that people were complaining about last May with the 280.19 driver.

But maybe I'm just being biased? Because unlike a few, I freaking love my cards. I don't overclock them, I just game, game, game. Often at 120-200fps+ no matter the game.

If you're an overclocker or system pusher 'only' then I can see why one may have problems with these cards. But if you bought them solely to play games, and think you're still going to have issues, then let me send you to rehab. Because that is some serious crack you're smoking.

For me, these cards just keep getting better and better


----------



## Smo

Quote:


> Originally Posted by *Shinobi Jedi;14603958*
> There is a post here somewhere by RagingCain that explains that this is not the case.
> 
> I can't remember the specifics, but essentially he was explaining with a lot better clarity than I am now, that the game isn't really hitting a VRam limit, but is just scaling memory, so it looks as if it's always maxed.
> 
> If you're getting good framerates most of the time at the resolution you're worried about, then you're probably not hitting a VRam wall.
> 
> But what about micro stuttering, one might ask, in regards to the VRAM wall?
> 
> Check your HD, I say. Unless you're running an SSD or a Velociraptor, often that micro stuttering is your HD not being fast enough (even at 7200rpm) to keep up with the cards/games and one sees jitters or frame rate stutters as the HD slows down the game to feed it bits of info.
> 
> I probably have this all backwards or wrong, but I'm sure RagingCain can clarify when he pops on here.
> 
> Personally, I don't think one, or even two 590's is going to hit the same limitations that people were complaining about last May with the 280.19 driver.
> 
> But maybe I'm just being biased? Because unlike a few, I freaking love my cards. I don't overclock them, I just game, game, game. Often at 120-200fps+ no matter the game.
> 
> If you're an overclocker or system pusher 'only' then I can see why one may have problems with these cards. But if you bought them solely to play games, and think you're still going to have issues, then let me send you to rehab. Because that is some serious crack you're smoking.
> 
> For me, these cards just keep getting better and better


Interesting! I'll have a look through his posts and see if I can find it. Admittedly I am still getting ~80-100fps in Crysis 2 with the DX11 patches and Hi-Res textures so you may well be right! MSI Afterburner has been lying to me, haha.


----------



## Jobotoo

Quote:


> Originally Posted by *jcde7ago;14586616*
> you can score a pair of 590s for ONE of these cards.


A pair of 590s for a grand? Where? I might get that!


----------



## Jobotoo

Quote:


> Originally Posted by *Shinobi Jedi;14603958*
> There is a post here somewhere by RagingCain that explains that this is not the case.
> 
> I can't remember the specifics, but essentially he was explaining with a lot better clarity than I am now, that the game isn't really hitting a VRam limit, but is just scaling memory, so it looks as if it's always maxed.
> 
> If you're getting good framerates most of the time at the resolution you're worried about, then you're probably not hitting a VRam wall.
> 
> But what about micro stuttering, one might ask, in regards to the VRAM wall?
> 
> Check your HD, I say. Unless you're running an SSD or a Velociraptor, often that micro stuttering is your HD not being fast enough (even at 7200rpm) to keep up with the cards/games and one sees jitters or frame rate stutters as the HD slows down the game to feed it bits of info.
> 
> But SMO, maybe in your case, you are hitting the wall running two XL2410T (so Jedi!)
> 
> Yet, if running one monitor, even an IPS at 2560x1600, I'd be surprised if walls were being hit especially with Quad SLI.
> 
> But I'll be the first to admit, I'm a gamer first and not a programmer or Hardware specialist, so I could be wrong on all of this.
> 
> I'm sure RagingCain can clarify when he pops on here.
> 
> Personally, I don't think one, or even two 590's is going to hit the same limitations that people were complaining about last May with the 280.19 driver.
> 
> But maybe I'm just being biased? Because unlike a few, I freaking love my cards. I don't overclock them, I just game, game, game. Often at 120-200fps+ no matter the game.
> 
> If you're an overclocker or system pusher 'only' then I can see why one may have problems with these cards. But if you bought them solely to play games, and think you're still going to have issues, then let me send you to rehab. Because that is some serious crack you're smoking.
> 
> For me, these cards just keep getting better and better


That is interesting. How can I know for sure? There is basically only one game I want to play, probably for a few years, and at the moment Precision indicates that all my VRAM (1GB) is being used up. For the most part I get 40-60FPS with shadows turned off. Shadows are not critical to me, but clipping and view distance are, so I want to be able to crank everything all the way up and still run smoothly at 60+FPS. I am guessing that a GTX 580 would give me better FPS (not sure how much more), and I'm not sure 1.5GB of VRAM is going to cut it. So I am debating 2x GTX 580 1.5GB, a GTX 580 3GB, or a GTX 590 3GB. I understand that the first and third options are 1.5GB x 2, not really 3GB.

2 x GTX590 sounds awesome, but not sure how much I would benefit.

Any help steering me in the right direction would be appreciated. And thanks for the help so far.


----------



## Jobotoo

Quote:


> Originally Posted by *CSHawkeye;14603387*
> I have the same monitor, will let you know how it goes..


Thanks! How is the GTX580SLI working out? I'm guessing those are 1.5GB?


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Jobotoo;14605506*
> That is interesting. How can I know for sure. There is basically only one game that I want to play, probably for a few years, and at the moment precision indicates that all my VRAM, 1GB, is being used up. For the most part I get 40-60FPS, with shadows turned off. Shadows are not critical to me, but clipping and view distance is, so I want to be able to crank it up all the way and it still run smooth at 60+FPS. I am guessing that a GTX580 would allow me to get better FPS, not sure how much more though, and not sure if the 1.5GB of VRAM are going to cut it. So I am debating 2xGTX580 1.5GB, GTX580 3GB, or GTX590 3GB. I understand that the first and third option are 1.5GB x 2, and not really 3GB.
> 
> 2 x GTX590 sounds awesome, but not sure how much I would benefit.
> 
> Any help steering me in the right direction would be appreciated. And thanks for the help so far.


Which game are you talking about? BF3? I was in the Alpha, and there wasn't any customization other than low, med, and high settings. But at high settings I was getting around 90-200fps. The variance I attribute to the code, but I'm sure that one 590 will be able to play the game at max settings at 2560x1600. I may be wrong in the end, but it seems that way to me. I'll have a better idea, as will many others, when the beta opens up next month.

Also, remember: while SP campaigns have eye candy that can crush systems, usually some of the more crippling features are turned off by default to improve performance or remove lag in MP. So, even more so, one 590 should be enough.

But, if that's not the game you're thinking of, then I'm not sure.


----------



## Jobotoo

I would think that BF3 will be more demanding than the game I'm playing, which is actually SWTOR. Everyone has told me that my current vidcard is overkill, but in reality it's not.


----------



## Masked

Quote:


> Originally Posted by *Jobotoo;14606651*
> I would think that BF3 will be more demanding than the game I'm playing, which is actually SWTOR. Everyone has told me that my current vidcard is overkill, but in reality its not.


It actually is.

When/if they get the hardware profiles working, the video card is overkill.

The issue is that in the latest drivers, the core usage isn't being properly reported...You can actually "tell" because the 2nd core is just within range, heat-wise, of the first core...The heat wouldn't transfer that fluidly across the PCB, especially if you're on water, so it's in use.

In earlier drivers, the card is/was overkill, the same will be true once it pulls out of this "slump".

In SWTOR, the others with the 590 and I report 40-50FPS at the lowest, and in the BF3 Alpha on high, the lowest so far has been 80.

SWTOR is a bit different, though, because it's processing a metric ton of people, environment etc...

So it actually //IS// overkill; all of Fermi is...BF3 actually recommends 1x 580, and SWTOR's entire system is virtually based around "the average user", à la a GTX 560.


----------



## emett

Mate, as of right now there are no official system requirements out for BF3. So how can you say they recommend a 580? Even if that were the recommended setup, it won't normally run the game on max settings.


----------



## Masked

Quote:


> Originally Posted by *emett;14614056*
> Mate as of right now there are no offical system requirements out for bf3. So how can you say they recommend a 580? Even if this was the recommended setup won't normally run a game on max setting.


...









My entire office has been in BF3 since day 1, and we were in the beta of BF2...Let's just say we have a very open stream of communication.

The game is "developed" for the top end of what's available, that's your high setting...What's the most top end, commonly available card atm? The 580...

As anyone in the alpha noticed, HIGH on the 580 wasn't "maxing out" the card, as you toss out there...It was still getting 60-80FPS, which is a quarter over what's "accepted" as capable.

For ultra, the 580 will be getting ~50fps, and considering we're all 590 owners, apparently...There won't be any problems.

Have been in SWTOR since Alpha as well and like I said, the only issue in an MMORPG is that detail is immediately processed, not buffered like it is in BF3.

P.S. "Mate", as a vendor, I see the system requirements and comment on them FAR before the retail public does...We even make a few suggestions, occasionally.


----------



## Smo

Quote:


> Originally Posted by *Masked;14614185*
> ...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My entire office has been in BF3 since day 1, were in Beta of BF2...Let's say we have a very open stream of communication.
> 
> The game is "developed" for the top end of what's available, that's your high setting...What's the most top end, commonly available card atm? The 580...
> 
> As anyone in alpha noticed, HIGH on the 580 in the alpha wasn't "maxing out" as you so toss out there...It was still getting 60-80FPS which is 1/4 over what's "accepted" as capable.
> 
> For ultra, the 580 will be getting @50fps, which considering we're all 590 owners, apparently...There won't be any problems.
> 
> Have been in SWTOR since Alpha as well and like I said, the only issue in an MMORPG is that detail is immediately processed, not buffered like it is in BF3.
> 
> P.S. "Mate", as a vendor, I see the system requirements and comment on them FAR before the retail public does...We even make a few suggestions, occasionally.


If I may, where is it that you work exactly?


----------



## Masked

Quote:


> Originally Posted by *Smo;14614325*
> If I may, where is it that you work exactly?


Alienware; I'm the Admin of the "beta office"...We operate/test all potential builds, beta test an incredible amount, and test every AW system currently on the market...We're also considered a vendor. My sig is actually my work PC...

As I've stressed from day 1 of this thread, there are driver issues, until Nvidia pulls their heads out of their *ahems*, we're all along for the ride.

That does //NOT// mean this card isn't capable...Nor that it will not perform...It still performs at 1.6x a 580...Hell, at 800mhz+, when I had it there, it was out-performing 2 580s at stock...

Saying it's not a capable platform is absolutely incorrect.

Fermi as a platform //IS// overkill for most of what's on the market, because OpenGL and DX aren't perfect "bases" for graphics...They're simply what works best...and what works best isn't pushing the ceiling of what's currently out there.

In fact, I can think of only 1 game, out of EVERYTHING we've yet seen, that's even remotely put the 590 to the test, and that was Mafia 2...That's it.

For BF3, the developers have constantly stressed that if you can run BF2 on ultra, you'll be able to run BF3 "more than adequately"...

The 590 will handle BF3, accordingly.


----------



## max883

http://www.youtube.com/watch?v=m8S_eEv_A5k - Battlefield 3 64-player map









This is the reason I have a GTX 590


----------



## Sir_Gawain

Quote:


> Originally Posted by *Masked;14613736*
> In SWTOR, myself and the others with the 590, report 40-50FPS at the lowest and in BF3 Alpha on high, the lowest so far has been 80.


Long-time lurker. Although not a member here, I'd like to report that running the BF3 Alpha I was only able to achieve 30-50 FPS and averaged 35-40 90% of the time (low, med, or high). Only one GPU was being utilized, and I was on 280.19.









System specs:
Alienware Aurora R3, Intel i7 2600K - 4.1GHz - OC, 875W PSU, GeForce GTX 590, 8GB Ripjaws X 1866MHz, 24" U2410 UltraSharp


----------



## Masked

Quote:


> Originally Posted by *Sir_Gawain;14615022*
> Long time lurker. Although not a member here id like to report that running the BF3 Alpha, I was only able to achieve 30-50 FPS and averaged 35-40 90% of the time (low,med or high). Only one GPU was being utilized and I was on 280.19
> 
> 
> 
> 
> 
> 
> 
> 
> 
> System specs:
> Alienware Aurora R3, Intel i7 2600K - 4.1GHz - OC, 875W PSU, GeForce GTX 590, 8GB Ripjaws X 1866MHz, 24" U2410 UltraSharp


+1 to you, sir.

See, THIS is good information...It tells you BF3 on 280.19 is only actually utilizing 1 core while the 2nd idles...Again, that is a beta driver, so I wouldn't expect a miracle, but that's solid info.

I'm honestly still using 275.33 at the moment...And on 1 card, (after Ragin's Bios "tweak") I'm seeing 80-90fps.

Will try the new WHQL a bit later and let you all know how it goes.

Just for future ref: alpha/beta with beta drivers doesn't typically end well unless you tweak them a bit for yourself...The profiles aren't 100%, and if you SLI, it's going to present some issues.


----------



## RagingCain

Quote:


> Originally Posted by *Sir_Gawain;14615022*
> Long time lurker. Although not a member here id like to report that running the BF3 Alpha, I was only able to achieve 30-50 FPS and averaged 35-40 90% of the time (low,med or high). Only one GPU was being utilized and I was on 280.19
> 
> 
> 
> 
> 
> 
> 
> 
> 
> System specs:
> Alienware Aurora R3, Intel i7 2600K - 4.1GHz - OC, 875W PSU, GeForce GTX 590, 8GB Ripjaws X 1866MHz, 24" U2410 UltraSharp


That's probably a missing SLI profile. Try renaming the .exe to match Bad Company 2, but I don't believe that works anymore; SLI profiles are activated by Windows memory ID / process ID too now, so simply renaming things isn't expected to always work. Expect ~50% more FPS with the second GPU working... and optimized drivers.

@All
The VRMs on the 590 are better quality than the 570's, and although different from the exact 580 VRMs, they are of equal performance (they aren't cheap.)

The number of VRMs controls the maximum safe voltage, clean power delivery, and power stability. ANY card can be "blown up" if over-volted too high. The BIOS should never allow a maximum voltage that exceeds the VRM capacity's "safe zone." It is my firm belief that 5 VRMs support a 1.063v load and are still within the safe zone, provided adequate cooling.

Keep in mind, guys, that 1.05v is the maximum setting because the second GPU's VRMs get an over-volt of 13mV, so the second GPU sees 1.063v.

Also note that the maximum overclocks are all lower because the drivers added an additional 13mV to the reported numbers, but it's not really there:

0.925 = 0.913v
1.050 = 1.038v

This is why maximum overclocks are lower than on 267.85. It's not the only reason, though. With more GPU usage (full load) through optimized drivers, the card draws more current and is stressed even more, so a higher overclock, while obtainable on 267.85, was only about as beneficial as some of these lower overclocks on 280.19.

Ex: 50% GPU usage @ 830 MHz vs. 85% GPU usage @ 775 MHz: the 775 MHz is a great deal faster, simply because of the higher GPU usage.
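The two effects described above can be sketched as plain arithmetic. This is illustrative only, not any driver API; the 12 mV reporting offset is inferred from the 0.925 -> 0.913 / 1.050 -> 1.038 table quoted in the post (the text quotes ~13 mV for the second GPU's over-volt).

```python
# Illustrative arithmetic only (no driver API involved): the
# reported-vs-actual voltage offset inferred from the table above, and a
# rough proxy for delivered work as core clock scaled by GPU utilization.

REPORT_OFFSET_V = 0.012  # 0.925 -> 0.913 and 1.050 -> 1.038, per the table

def actual_voltage(reported_v):
    """Voltage actually applied, given what the driver reports."""
    return round(reported_v - REPORT_OFFSET_V, 3)

def effective_rate(clock_mhz, gpu_usage):
    """Rough throughput proxy: core clock scaled by utilization."""
    return clock_mhz * gpu_usage

print(actual_voltage(1.050))        # 1.038
print(effective_rate(830, 0.50))    # 415.0
print(effective_rate(775, 0.85))    # ~658.7 - the lower clock wins
```

So the 775 MHz / 85% usage case delivers roughly 60% more effective work than 830 MHz at 50% usage, which matches the post's point.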

Regarding the other V word, VRAM: VRAM usage is entirely dependent on the game.
DiRT 3 can run at 5760x1080 with 8xAA at 70 fps and uses 1350 MB of VRAM at maximum.

Dragon Age II can run at 1920x1080 with 8xAA at 40 FPS and uses 1450 MB of VRAM with all the textures and eye candy enabled...and it's horribly coded.

Crysis 1 is the same with lower AA, about 2xAA: you get 50 FPS and it uses 1520 MB of VRAM (so as much as you have.) Crysis 2 does the exact same thing, only it's quite playable at 5760x1080. I would guess that 1500 MB was the sweet spot, but because of competition and people moaning, a second-gen 580 was released with 3GB, which is super overkill in a majority of games at every resolution save 7680x1600, and even that needs 4x/8x AA to get into the upper 2500MB.

nVidia drivers don't use conservative VRAM usage techniques. They pre-cache a lot of stuff into memory, like textures. That's how you see Crysis 2 and other titles at 2000+ MB of VRAM at 1920x1200 and similarly low resolutions (for the VRAM usage). IT'S JUST CACHING STUFF TO MEMORY. In my opinion that's a good thing: if you have it, you might as well use it. IT'S NOT MANDATORY. It's like having 24GB of RAM instead of 12GB. You might use up to 12GB in Photoshop on a workstation doing high-res picture editing, but that doesn't leave much room for caching, so adding another 12GB doesn't improve the editing speed at all, since 12GB was sufficient; BUT it does improve things like pop-in reduction and loading times. The analogy translates directly to games: the extra caching doesn't slow rendering down at all, as that's entirely in the realm of the card's capabilities.

That being said, some games are poorly coded, or simply place huge demands on video memory, or both. It's entirely game-dependent.
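As a back-of-envelope check on the caching point above, the render targets themselves are tiny next to the totals quoted: a minimal sketch, assuming 32-bit color and triple buffering and ignoring AA and depth/stencil, so the remainder of those 1500+ MB figures is textures and cached data.

```python
# Rough framebuffer footprint: width x height x 4 bytes per pixel,
# triple buffered. MSAA and depth/stencil are ignored, so real figures
# run higher, but still nowhere near the 1500+ MB totals quoted above.

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers / 1024**2

print(round(framebuffer_mb(1920, 1080), 1))  # 23.7
print(round(framebuffer_mb(5760, 1080), 1))  # 71.2
```

Even the 5760x1080 surround framebuffer is on the order of 70 MB, so the bulk of reported VRAM usage really is texture caching.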


----------



## Sir_Gawain

Quote:


> Originally Posted by *Masked;14615191*
> Just for future ref, alpha/beta with beta drivers, doesn't typically end well unless you tweak them a bit for yourself...The profiles aren't 100% and if you SLI, it's going to present some issues.


Thanks. I was on 280.19 simply because Nvidia recommended it as best for the BF3 Alpha....lol!

Thanks for all the good work you guys do over at Alienware. Despite what all the haters say, I couldn't be happier with my R3...Powerful and priced great. I really couldn't justify doing my own build with pricing coming out how it did (with my negotiated price) and getting the warranty I did (5yr).

My only true concerns with the system are whether the 875W PSU is truly adequate under all types of loads with the 590, and that the 590 dumps all the hot air off GPU B into the case, causing the card to run hotter than I'd prefer. I've toyed with the idea of water cooling it, but hey...that's what the 5yr is for!


----------



## Sir_Gawain

Quote:


> Originally Posted by *RagingCain;14615362*
> That would be SLI Profile probably missing. Try renaming the .exe to match Bad Company 2, but I don't believe that will work anymore, the SLI Profiles are activated by Windows Memory ID / Process ID too now, so simply renaming things isn't expected to always work. Expect FPS 50% more with the second GPU working... and optimized drivers.


I figured as much. I'd give it a try, but the Alpha's closed now for like 2 weeks, no? Unless you know a way in that I don't...


----------



## Masked

Quote:


> Originally Posted by *RagingCain;14615362*
> That would be SLI Profile probably missing. Try renaming the .exe to match Bad Company 2, but I don't believe that will work anymore, the SLI Profiles are activated by Windows Memory ID / Process ID too now, so simply renaming things isn't expected to always work. Expect FPS 50% more with the second GPU working... and optimized drivers.


Quote:


> Originally Posted by *Sir_Gawain;14615433*
> Thanks. I was on 280.19 simply because Nvidia recommended it as best for the BF3 Alpha....lol!
> 
> Thanks for all the good work you guys do over at Alienware. Despite what all the haters say, I couldnt be happier with my R3...Powerful and priced great. I really couldnt justify doing my own build with pricing coming out how it did (with my negotiated price) and getting the warranty I did (5yr).
> 
> My only true concerns with the system is if the 875W PSU is truly adequate under all types of loads with the 590, and that the 590 dumps all the hot air off GPU B into the case causing the card to run hotter than i'd prefer. I've toyed with the idea of water cooling it but hey...thats what the 5yr is for!


This, for your situation, Gawain, is true.

We get updates, profiles, etc. as a package, and because of the way the BF3 "Alpha/open alpha" went down, you may not actually have the SLI profile; I didn't even think about that...Cain gets a +1 as well.

If you have any idea how to mess with drivers, you can actually "force" your own SLI profile...I believe that's what the last "update" did because I know some 580's were experiencing some issues as well.

Our office is a bit different in that we have no real "link" to the retail world, or actually most of the company (I thank god for that) but, thanks much...If you ever have an issue with it, feel free to beam me a PM and I can put your exact machine together, we'll have a go at your problem.

@Cain, in your opinion, do you foresee SLI profiles being enabled when "surround" is actually in use?

They seem to be dodging that question quite well so, figured I'd "toss the ball" around with someone else...See what your opinion is.


----------



## Sir_Gawain

Quote:


> Originally Posted by *Masked;14615523*
> Our office is a bit different in that we have no real "link" to the retail world, or actually most of the company (I thank god for that) but, thanks much...If you ever have an issue with it, feel free to beam me a PM and I can put your exact machine together, we'll have a go at your problem.


Probably best that way LOL!
Thanks for the offer as well, +1


----------



## RagingCain

Sorry, I didn't know the Alpha was down; I never get invited to anything... otherwise they would actually have to do their jobs and fix bugs, or I would take their jobs








Quote:


> Originally Posted by *Masked;14615523*
> If you have any idea how to mess with drivers, you can actually "force" your own SLI profile...I believe that's what the last "update" did because I know some 580's were experiencing some issues as well.


You can always try messing with the hex bits in nVidia Inspector. I even suspect 280.26 may have added a BF3 profile.
Quote:


> @Cain, in your opinion, do you foresee SLI profiles being enabled when "surround" is actually in use?
> 
> They seem to be dodging that question quite well so, figured I'd "toss the ball" around with someone else...See what your opinion is.


No, actually, I wouldn't expect activating Surround by default to add performance or GPU usage... at least if there is no SLI profile at all (no entry for the title). Usually if the SLI profile isn't there, nothing else is there for the game, and you get standard run-of-the-mill GPU usage due to driver-team priority (although this isn't always the case, as explained below.) Think of it like default mode: no performance optimizations, no tweaks, just brute-force GPU usage. Sometimes GPU usage is only 40% because the card just isn't told to use more (theory).

However, sometimes SLI profiles have a bug and have been disabled while Surround is fully operational. Usually if the SLI profile is just NOT there, there is no Surround either, as that is secondary to SLI.

Priority of Driver Team (By Observation):
*Physical Install & Software Errors*
*Single GPU Bugs & Glitches*
*Single GPU Game Profile Optimization - Quality vs. Performance* -> Finding the perfect balance to render the specific game engine.

AS OF NOW: *3D Vision* is high on the totem pole, second only to standard single gpu & single monitor.

*SLI Compatibility* <- 99% possible; the remaining 1% are don't-care/can't-be-done titles like Minecraft, Sims 3, etc.
*SLI Bugs & Issues* <- Anything acknowledged by nVidia will be fixed; everything else: no guarantees

*2D Surround Compatibility* <- If possible; NOT guaranteed
*2D Surround Bugs & Issues* <- Not much support for older titles; at the mercy of user submissions (no active QA, it seems.)

*3D Surround Compatibility* <- Rising in popularity, thus more man-hours required.
*3D Surround Bugs & Issues* <- Not many users have this setup, but it is supported. Most issues seem to be either just 3D or just Surround; again, at the mercy of user submissions, but this does have active QA.

This is the trickle-down concept, and it makes sense. 3D is the only one I don't use, nor do I like it, because it steals QA/bug attention from technologies I do use and benefits only people with 3D Vision. See, if someone uses a single GPU and I use SLI, and they fix an issue where performance was lower on one card, often that same bug was in SLI too, and it gets fixed there as well; improved performance in single GPU usually means more performance in SLI, etc. It's not always the case, but it makes sense.

So with that in mind, the priorities are there, and sometimes they skip a task or bug, or different teams work on different priorities, or even on different bugs at the same priority level. Hence a driver release might fix the 3D profile for a game while no 2D Surround issues were adjusted. Or 3D Surround compatibility was added, but 2D Surround issues remain. So SLI may be broken or disabled in a driver release while 2D Surround is fully operational.

That's how they come back with these "87% improvement in Dragon Age II in SLI 580s" numbers. No der, Sherlock: you had SLI crippled. Like Chris Rock said: "They be braggin' about **** they SUPPOSED to do."


----------



## 2010rig

Benchmarks of the Mars II are available now!
http://www.overclock.net/hardware-news/1093987-various-asus-reveals-republic-gamers-mars.html



Mars II - 100%
GTX 590 - 86.86%
6990 - 86.68%

6990 doesn't seem to be the world's fastest anymore.

What's the highest OC that anyone has achieved on a 590 so far?

Anyone get close to 782?


----------



## Sir_Gawain

Quote:


> Originally Posted by *2010rig;14615813*
> 6990 doesn't seem to be the world's fastest anymore.


That's been apparent ever since the GTX 590 came out, as shown in 22 of the 36 examples in front of us LOL.


----------



## RagingCain

Quote:


> Originally Posted by *2010rig;14615813*
> What's the highest OC that anyone has achieved on a 590 so far?
> 
> Anyone get close to 782?


TiN and I.

TiN has gone up to 1300 MHz on LN2 with a single 590 card + mods.

After that it's me on H2O with 830 MHz on 590 SLI; that's on HWBot. Non-HWBot is 875 MHz in the Metro 2033 benchmark, with proof at 870 MHz. I don't think I have an 875 MHz screenshot.

It's not just the GPU though; we're running a 400 MHz underclock on the memory







The most we can do is close the gap to about a 200 MHz underclock.


----------



## 2010rig

Quote:


> Originally Posted by *RagingCain;14616054*
> TiN and I.
> 
> TiN has LN2 up to 1300 MHz with a single 590 card + mods.
> 
> After that its me on H20 with 830 MHz on all 590 SLI thats on HwBot, non-HWBot is 875 MHz in Metro2033 benchmark, with 870 MHz + Proof. I don't think I have a 875 MHz screenshot.
> 
> Its not just GPU though, we have a 400 MHz underclock on memoriez
> 
> 
> 
> 
> 
> 
> 
> Most we can do is close the gap to about 200 MHz underclock.


Damn, that's awesome.

Wonder if that's why there's a lack of submissions on your battle royale thread.


----------



## RagingCain

Quote:


> Originally Posted by *2010rig;14616180*
> Damn, that's awesome.
> 
> Wonder if that's why there's a lack of submissions on your battle royale thread.


Hah, who knows, I will give it some more time. Fall is around the corner, hopefully temps and other issues will be gone.

@All
Possible Tessellation as a source for bottleneck in DX11 Games:
http://www.overclock.net/video-game-news/1094239-techreport-crysis-2-tessellation-too-much.html

While it's talking about AMD, it clearly means this is possible. From my understanding, quite a few of us discovered games performing well below expectation, Crysis 2 among them. According to this, that may very well be a condition of the hardware, and we may never see our 590s (especially in SLI) truly unleashed without game developers using DX11 efficiently and nVidia driver support. We may never see the day of 100% usage on all 4 GPUs consistently.









Very sad.


----------



## saulin

Quote:


> Originally Posted by *RagingCain;14616054*
> TiN and I.
> 
> TiN has LN2 up to 1300 MHz with a single 590 card + mods.
> 
> *After that its me on H20 with 830 MHz on all 590 SLI thats on HwBot*, non-HWBot is 875 MHz in Metro2033 benchmark, with 870 MHz + Proof. I don't think I have a 875 MHz screenshot.
> 
> Its not just GPU though, we have a 400 MHz underclock on memoriez
> 
> 
> 
> 
> 
> 
> 
> Most we can do is close the gap to about 200 MHz underclock.


But all the AMD/ATI fanboys said it wasn't possible. That even on water, 700MHz was about the best these cards could do without exploding


----------



## RagingCain

Quote:



Originally Posted by *saulin*


But all the AMD/ATI fanboys said, it wasn't possible. That even on water getting 700Mhz was about the best these cards could do without exploding

















Every bad thing you hear about the 590s outside of this thread, even from Phaedrus and many other esteemed members, is 100% talk, or based on the very arrogant & loud first-hand reviewers who killed their cards.

However, the 590 is nowhere near perfect; we do have very real issues:
Lack of driver focus / emphasis by nVidia.
Voltage lock downs / Power Draw Limitations.
Low GPU usage in either single or SLI Configuration.
Quad SLI issues.

Now add all the regular nVidia issues such as title compatibility to the mix.

I am glad that people are getting to see the real 590. I can't believe people still have the nerve to call it a failure with no experience with the actual card. The truth is, I could beat every 6990 user in this forum in the 590 vs 6990 benchmark thread, and people would say it's just RagingCain doing it, or even better, that it's still a failure.


----------



## dboythagr8

What kind of temps are you guys seeing with your 590s? I have an FT02 case and am currently thinking about scooping up a 590 to replace the card in my sig. I get such good temps with the DCII, but if I can go SLI and only take up 2 slots instead of 3, that leaves room for Quad SLI if I decide to go all in.


----------



## Wogga

While running Heaven at [email protected] (submission for the 590 vs 6990 battle, single card) I was getting 38-39C under load and never reached more than 40C, but that's with water cooling. Without it, on the stock cooler, load temps were 80-83C


----------



## MKHunt

Quote:


> Originally Posted by *Wogga;14624631*
> while running heaven at [email protected] (submission for 590vs6990 battle, single card) i was getting 38-39C under load. and never reached more than 40C, but thats with watercooling. w/o it on stock cooling load temps were 80-83C


Your OCs are so insanely high. You might be the person who pushes me past the 1V barrier. Also, how much rad does your 590 have? I'm working with 600mm of rad, shared with a 2600K, but I hit 55C after 3-ish hours of full load in The Witcher 2. Also, is one of your cores about 8-9C higher than the other under load? I had the same problem when running air, but I suspect it might have been due to partial contact of the second heatsink.

590s are actually pretty good little cards. I used to believe the talk until I read RC's overclocking thread. I also realized that after the initial purchase, the chances of buying a second card were slim, so I might as well get a single 590 and outperform a single 580.









After messing around more with this card, you get the sense that it's really trying hard.


----------



## Wogga

Mora3 LT (9x120) for the 2600K, 590, and Sabertooth MOSFET waterblock. The Mora is cooled by 9 Scythe GentleTyphoons at 1850rpm.

GPU1 is always 1C hotter than GPU2, so it's like 26C/25C idle and 39C/38C under load.
800MHz is the highest stable clock I can reach =( There are much better cards than mine. Also my 590 has...a scar =) It works without the C784 capacitor - maybe that's the reason for the bad OCing.


----------



## Smo

Quote:


> Originally Posted by *dboythagr8;14623308*
> What type of temps are you guys seeing with your 590's? I have a FT02 case, and am currently thinking about scooping up a 590 to replace the card in my sig. I get such good temps with the DCii though but if I can go SLI and only take up 2 slots instead of 3, that leaves room for Quad SLI if I decide to go all in.


We essentially have the same case so this could be of interest to you.

My motherboard (P8P67 Pro R.3) has two x16 slots. The one nearest the CPU has the bottom of the card venting directly onto the centre of one of the AP181 fans. The other leaves the bottom of the card in direct air flow (so the fan is essentially forcing air back up into the card).

I have my 590 overclocked to 700MHz core @ 0.950v. After running 3DMark 11 in performance mode (with the base of the card blocked in slot 1), my GPUs topped out at 68C. When moved into the direct flow of the fans in slot 2, they topped out at 73C.

It appears that direct flow forcing air back into the card causes turbulence and harms the cooling capacity of the card.

Please bear in mind that my card uses a custom fan profile where 95% fan speed is used over 60C.
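For reference, a profile like that is just a temperature-to-duty mapping. A minimal sketch: only the 60C threshold and 95% duty come from the post; the idle floor and the linear ramp in between are assumptions for illustration.

```python
# Hypothetical fan profile: quiet idle duty below the floor, a linear
# ramp up to the 60C knee, then the 95% duty quoted above. Only the
# 60C / 95% numbers come from the post; the rest is assumed.

IDLE_DUTY = 40   # % (assumed idle duty)
KNEE_C = 60      # above this, run near full speed (from the post)
MAX_DUTY = 95    # % (from the post)

def fan_duty(temp_c, floor_c=40):
    """Return fan duty (%) for a given GPU temperature in Celsius."""
    if temp_c >= KNEE_C:
        return MAX_DUTY
    if temp_c <= floor_c:
        return IDLE_DUTY
    frac = (temp_c - floor_c) / (KNEE_C - floor_c)
    return IDLE_DUTY + (MAX_DUTY - IDLE_DUTY) * frac

print(fan_duty(70))  # 95
print(fan_duty(50))  # 67.5
```

Tools like MSI Afterburner let you draw a curve like this instead of coding it; the point is just that the fan jumps to near-full speed once the card crosses the knee.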


----------



## Masked

Quote:


> Originally Posted by *MKHunt;14624863*
> Your OC's are so insanely high. You might be the person who pushes me past the 1V barrier. Also, how much rad does your 590 have? I'm working with 600mm of rad and it shares it with a 2600K but I hit 55C after 3ish hours of full loading with Witcher 2. Also, is one of your cores about 8-9C higher than the other under load? I had the same problem when running air, but I suspect it might have been due to the partial contact of the second HS.
> 
> 590's are actually pretty good little cards. I used to believe the talk until i read RC's overclocking thread. I also realized that after initial purchase, chances of buying a second card were slim, so I might as well get a single 590 and outperform a single 580.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> After messing around more with this card, you get the sense that it's really trying hard.


I've tested this on 5/6 systems very recently, so I believe I can address this with a certain level of certainty.

You don't have enough rad.

The setups I've been using with the 2600K, a 990X, and a 980X (OC'd to hell and back) include an XSPC 360, a HAF X case, Monsoon reservoirs, a Swiftech DC pump, and 3/4in tubing.

The XSPC has 6x Cooler Master R4 fans @ 2000rpm.

The highest temp I've seen at 100/100 is 55C...

On average I see high 30s, only low 40s if gaming and benching; at worst, as I just said, 50-55.

With 2 590s that changes a lot, because the wattage dissipated just about doubled AND you've added 2 new pressure points to your loop...

If you're going to push that card to 1V, your delta is going to change dramatically, because you're pushing the same rad.

My suggestion: at those temps you may be okay; it might peak at 65C/75C (even those are acceptable), but I would perhaps get a beefier rad and look into dissipation.

Skinnee's lab has all the tests on the latest batch of rads; I'd check those out and definitely invest in one.


----------



## MKHunt

Quote:


> Originally Posted by *Masked;14626646*
> I've tested this on 5/6 systems very recently so, I believe I can actually address this with a certain level of certainty.
> 
> You don't have enough rad.
> 
> The setups I've been using w/the 2600k, a 990x and a 980x (oc'd to hell and back) include an XSPC 360, HAFx case, Monsoon resi's, Swifttech DC pump and 3/4in hosing.
> 
> The XSPC has 6x Coolermaster R4 fans @2000rpm
> 
> The highest temp I've seen at 100/100 is 55c...
> 
> On average I see high 30's only low 40's if gaming and benching, as I just said, 50-55.
> 
> With 2 590's that changes a lot because wattage dissapated just nearly doubled AND you've added 2 new pressure locations to your loop...
> 
> If you're going to push that card to 1V, your delta is going to change almost dramatically because you're pushing the same rad.
> 
> My suggestion to you, at those temps, you may be okay, it might peak into 65c/75c (even those are acceptable) but, I would perhaps get a beefier rad and look into dissipation.
> 
> Skinnee's lab has all the new tests on the latest batch of RADs, I'd check those out and definitely invest in 1.


Argh, I have the beefiest rads my case can handle without going to an external mount: RX120 at the back (which actually is external, since it won't fit inside), RX240 on the bottom, RS240 on the top, all equipped with GELID Wing 12 PLs running at 1900RPM. They're rated for 1750, so maybe I should add a little resistance (slow them) to achieve better static pressure? I have three 2000rpm Sickleflows (R4) on my shelf, but they don't push or pull _nearly_ as hard as these GELIDs do. At 100/100 my CPU hits 58 (the mobo overvolts to 3.771 @4.4ghz no matter what I do) and my GFX hits 46 on core 2 and 53 on core 1.

I don't know what to make of the delta between the cores, as it was there on air as well. Water-cooling the card has taken it down from 11°C at max load to 7-8°C at max load. I think there's still an air pocket as well: when it gets to 43+ on both cores, my frustrating vortex reappears between the GFX card and the RX240. I thought I had bled that out, but apparently not. I should also probably change the pump speed, because right now the 35X is going full tilt in a loop without much restriction, which I've heard actually hurts temps.

ETA: I DO NOT have two 590s, just one. When I talk about cores 1 and 2, that's core 1 and core 2 of card 1.

It sounds like the systems you reference had a single RX360. I have the equivalent of that plus an extra RS240, and the 2600K has a lower TDP than any Intel Extreme CPU, so theoretically I should be fine? Sorry if I misled you. I don't want anyone to look at my scores and think, "wow, that's two 590s? ... Really!?" It wouldn't help the case of the 590 at all.









I would NEVER run 1V 24/7. I generally OC for benches, then back down to stock voltage or 0.925V for games, since when I played BFBC2 at 740MHz I only saw a ~4fps increase, which is useless when everything already runs faster than my monitor.

Last night I did go to the big 1V for a couple of hours to bench. PDL kicked in super hard in 3DMark, so I went to Heaven. I could only get Heaven to run at 785/1570/864 @ 1.000V. I will never catch up to Wogga at this rate.







Temps never went above 44°C. I'll admit I was hoping for more, since the card went to 740 at 0.975V with ease. At 0.975V the max is about 743, but I don't like ugly numbers. It _seems_ like I have _enough_ rad, but I do want lower temps, considering this loop ran ~$570 before fans, which were $19 each.


----------



## Wogga

My card is broken: it works without one capacitor (I tore it out) and yet had no problems running @ 1.05V for hours... there should be no fear.
Now I'll try to find the courage to flash the MARS II BIOS onto my poor 590.


----------



## RagingCain

Quote:


> Originally Posted by *Wogga;14628797*
> My card is broken: it works without one capacitor (I tore it out) and yet had no problems running @ 1.05V for hours... there should be no fear.
> Now I'll try to find the courage to flash the MARS II BIOS onto my poor 590.


Make sure you know how to do a blind flash and a force flash, do it outside of Windows, and have a backup card in case you need to reflash the old BIOS using the other card for display. Don't do it the easy way in Windows.

Prepping for a crazy flash like this:
Uninstall all drivers; disable SLI / Surround.
Delete all NVIDIA files in safe mode.
Attempt it from DOS, with backup BIOSes on the same drive under extremely simple names, e.g. backa.rom / backb.rom.
Pray.
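The "extremely simple names" advice matters because a plain DOS boot environment only accepts 8.3 filenames (up to 8 characters, plus an optional 3-character extension). As a quick sanity check before copying backup ROMs to the boot drive, something like this could verify the names; the `is_dos_83` helper is my own sketch, not part of nvflash:

```python
import re

# DOS 8.3 rule: up to 8 chars for the name, optional extension of up
# to 3 chars. Kept conservative (letters/digits only) for old loaders.
DOS_83 = re.compile(r"^[A-Za-z0-9]{1,8}(\.[A-Za-z0-9]{1,3})?$")

def is_dos_83(filename: str) -> bool:
    """Return True if the filename is safe for a plain DOS boot disk."""
    return DOS_83.match(filename) is not None

for name in ["backa.rom", "backb.rom", "my backup bios (old).rom"]:
    print(name, "->", "OK" if is_dos_83(name) else "rename it")
```

Anything with spaces, parentheses, or a long name should be renamed before you boot to DOS, or nvflash won't be able to find it when the screen is blank.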

Regarding water-cooling temps: I was not satisfied with my max temp, so I added a third 120.3 radiator. This should not affect idle temps (around 30-34°C), but it should lower the max temp by about 10-15°C, so hopefully max temp will now be 45°C.

Post powered by DROID X2


----------



## Wogga

Umm... those MARS II BIOSes are .bin, not .rom... I need some time to read about all this stuff and find a way to convert them to .rom.
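For what it's worth, .bin and .rom are usually just different extensions on the same image: a VGA BIOS is a PCI option ROM, which starts with the signature bytes 0x55 0xAA and stores its length (in 512-byte blocks) at offset 2. A rough sketch to check a dump before simply renaming it (the helper and the synthetic data here are mine, for illustration):

```python
def looks_like_option_rom(data: bytes) -> bool:
    """Heuristic check that a byte blob starts like a PCI option ROM."""
    # PCI/VGA option ROMs begin with the 0x55 0xAA signature;
    # byte 2 is the image length in 512-byte blocks.
    return len(data) >= 3 and data[0] == 0x55 and data[1] == 0xAA and data[2] > 0

# Example with a synthetic 64 KiB image (128 blocks of 512 bytes):
fake_rom = bytes([0x55, 0xAA, 128]) + bytes(64 * 1024 - 3)
print(looks_like_option_rom(fake_rom))     # True: likely fine to rename .bin -> .rom
print(looks_like_option_rom(b"\x00\x00"))  # False: not a ROM image
```

If the signature is there, the flashing tool generally doesn't care about the extension; if it isn't, no amount of renaming will make it flashable.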


----------



## Smo

Quote:



Originally Posted by *Wogga*


Umm... those MARS II BIOSes are .bin, not .rom... I need some time to read about all this stuff and find a way to convert them to .rom.


Here you go, mate:

http://www.softpedia.com/get/Tweak/V...S-Editor.shtml


----------



## Tept

What kind of outcome is expected with this Mars II bios flashing as far as possible damage to the card?

And wouldn't this still be useless due to PDL? Won't the NVIDIA drivers still recognize it as a 590 and put it under the same garbage limitations?


----------



## Wogga

No more PDL, I think =)

Smo, I already have NiBiTor, but it won't recognize the Mars II BIOS -_-


----------



## Tept

Well, let us know how it goes, Wogga. I'm curious whether this is potentially the thing that redeems the 590.

PS: I flashed mine from Cain's modified BIOS back to stock using .bin files, Wogga. I used the Windows nvflash method though, not sure if that matters.


----------



## Wogga

I tried just dropping those .bins onto nvflash.exe, and all I got were several beeps and an nvflash window that closed within a second.

Anyways, I've already posted somewhere here (or maybe in the flashing and OC thread) about a guy who flashed a 580 BIOS and his card didn't blow up or anything like that... it was unstable but still worked, so I think it won't be painful to flash the Mars BIOS. As you can see, the highest voltage in the Mars BIOS is 1.1V:

Performance Level 3
Core Clk: 782.00 MHz
Mem Clk: 1002.00 MHz
Shader Clk: 1564.00 MHz
Voltage: by ASIC (1.0375 V - 1.1000 V)

Don't think it will burn my card.


----------



## Tept

Wogga, your BIOS is probably write-protected; just run nvflash.exe by itself and it'll tell you the command to disable the write protection.


----------



## Shinobi Jedi

So...

How many of you are eyeing that EVGA 580 Classified FTW? I'm curious to see some benches on it compared to the 590 and the Mars II.

Have to say though, in terms of gaming performance increases, the Mars II does not seem worth the extra coin over the 590. Maybe it'd be worth it for benches or bragging rights, but when the biggest gap seems to be about 9-10fps, that just doesn't seem worth it to a gamer.


----------



## Smo

The EVGA 580 Classified FTW looks so, so nasty. I know it's a childish reason not to buy something, but I'm not sure I could bring myself to want to open the box!


----------



## Shinobi Jedi

Quote:



Originally Posted by *Smo*


The EVGA 580 Classified FTW looks so, so nasty. I know it's a childish reason not to buy something, but I'm not sure I could bring myself to want to open the box!


Ha! That's awesome SMO!

Aesthetics are something I care about too; that's one of the things that sold me on the 590s! As fluffy and unnecessary as the built-in GeForce/brand-name LEDs on the card are, I love having them. Before, any non-gamer or non-PC-enthusiast would look at my rig and just think of it as a regular PC.

Now, with those little LED's shining through the window, my rig gets questions and compliments from those same people like it never did before.

Aesthetics do matter. I think there may be some blinging LEDs on the 580 Classified FTW as well, so if the part that's visible through the case window looks cool, and the benches are incredibly superior, then I may have to take a look at it.









But yeah, otherwise, it is kinda ugly looking.


----------



## Tept

If that thing is as incredibly overclockable as it looks like it's going to be, I would have waited and SLI'd two of those instead of getting a 590.


----------



## emett

Masked, my friend, if when I pick up my copy of BF3 the side of the box says "recommended: GTX 580", I'll buy you a beer.
Anyway, as you say, we'll all be fine with our cards, so arguing over recommended settings seems trivial.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *emett;14633972*
> Masked, my friend, if when I pick up my copy of BF3 the side of the box says "recommended: GTX 580", I'll buy you a beer.
> Anyway, as you say, we'll all be fine with our cards, so arguing over recommended settings seems trivial.


I think Masked is pretty dead on.

All the demos at E3 and other such events that showed real live BF3 gameplay with everything turned up were, as far as I know, run on a single 580.

Kepler GPUs are not going to be out by Oct. 25th, and unless BF3 gets a last-minute delay (something I wouldn't mind, based on my experience with the alpha), the 580 will still be the highest-end GPU available, so what Masked said is pretty much a sure thing.

Now that I'm hearing Kepler may be more of an evolutionary upgrade than the revolutionary one originally expected, I'm even more psyched about owning the 590s. Of course, that may be wrong too, and Kepler may be all the hits and more.

But we do know that Kepler won't be out by Oct. 25th, BF3's release date.

So knowing that, if you let your skepticism outweigh your logic, it's going to cost you a beer. But you are right too, in that something will definitely come out sooner or later that eclipses the 590s. That's the rub with PC gaming hardware: you can't really buy for the future, because there's always something better coming down the line. You have to buy for the moment. Of course, with so many games being console ports, the hardware upgrade cycle for gaming is not as short as it used to be, and a good rig should be able to play all or most games at full blast for at least a couple of years, I'd think...

Bottom line: it is widely reported that all the game footage demoed at E3 and similar events that has wowed the media and the industry was run on a single 580.

I went for 590 SLI in the hope of being able to maintain 120fps at full settings. Whether or not this will happen, I'm not sure. But I am sure that even a single 590 will get you 60fps+ in BF3. (Doesn't mean I'm right, though.)


----------



## emett

You're missing my point. I just meant that if a GTX 580 can run it sweet on max, then that will not be the "recommended" card in the system requirements. Normally the "recommended" card will not run the game on max settings.
Take, for example, the recommended system requirements for Crysis 2 in DX11 with the ultra texture pack: an ASUS Radeon HD 6950 2GB is what they state. With my GTX 590 @ 1080p on ultra settings and DX11, I've seen the frame rate at 40.
Do you see what I'm getting at?
They do this so as not to scare off buyers of games.


----------



## Arizonian

Quote:


> Originally Posted by *emett;14636951*
> You're missing my point. I just meant that if a GTX 580 can run it sweet on max, then that will not be the "recommended" card in the system requirements. Normally the "recommended" card will not run the game on max settings.
> Take, for example, the recommended system requirements for Crysis 2 in DX11 with the ultra texture pack: an ASUS Radeon HD 6950 2GB is what they state. With my GTX 590 @ 1080p on ultra settings and DX11, I've seen the frame rate at 40.
> Do you see what I'm getting at?
> They do this so as not to scare off buyers of games.


I understand what you're saying. The game makers take the lowest system and graphics card that can run the game at the lowest settings and call that the 'minimum requirements', so as not to alienate potential buyers who can get away with playing the game with the settings turned all the way down.

In fact there are two statements on games: one is 'minimum requirements' and the other is 'recommended requirements', for better results.

BTW emett, it's a great idea to list your specs in your bio ('edit system'). OCN members like to see what we've got under the hood, and it helps us know your hardware when giving advice.

And - WELCOME to OCN!


----------



## emett

Done


----------



## Masked

Quote:


> Originally Posted by *Masked;14614185*
> ...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My entire office has been in BF3 since day 1; we were in the beta of BF2... Let's say we have a very open stream of communication.
> 
> The game is "developed" for the top end of what's available, that's your high setting...What's the most top end, commonly available card atm? The 580...
> 
> As anyone in the alpha noticed, HIGH on the 580 in the alpha wasn't "maxing out" as you toss out there... It was still getting 60-80fps, which is 1/4 over what's "accepted" as capable.
> 
> For ultra, the 580 will be getting ~50fps, and considering we're all 590 owners, apparently... there won't be any problems.
> 
> Have been in SWTOR since Alpha as well and like I said, the only issue in an MMORPG is that detail is immediately processed, not buffered like it is in BF3.
> 
> P.S. "Mate", as a vendor, I see the system requirements and comment on them FAR before the retail public does...We even make a few suggestions, occasionally.


Quote:


> Originally Posted by *emett;14633972*
> Masked my friend if when i pick up my copy of bf3 the side of the box says recommended gtx 580 i'll buy you a beer.
> Any way as you say we'll all be fine with our cards so to argue over recommended settings seems trivial.


Please read what I //actually// said.

I'll paraphrase.

For the high end, the game is being engineered for the 580 because that IS the top end card that will be out.

When game makers develop a game there is a high/mid/low... They typically set "recommended" a notch or two ABOVE the low, for the sake of enjoyment, and they ROOF the game based on what's currently out.

That's your high setting, absolutely... The game will be made to mostly max out on a single 580... That doesn't mean that's the recommended card to play the game.

If you're hardcore about it, is that what I/ME/PERSONALLY recommend? Hell yes, game looks absolutely disgusting on a 580/590...

What I said, to any gamer in the world (in BOTH situations), is this: if you want to max out this game, you'll get a 580; if you want to blow it out, you'll SLI 580s or get a 590. If you just want to play it, you'll keep your 480/560/470, etc.

What I actually stressed in //ALL// of my statements was, and is, the fact that the 590 is absolutely a platform capable of running BF3 and/or any game on the market... and it absolutely is.

The 590, right now, IS THE ROOF... I hate to break it to you, but I've got two situations in this office where the 590 actually outperforms 580s in SLI... Granted, it's at specific resolutions and in a water-cooled environment, but it DOES happen... Developers LOOK at that data and say: this is our max.

Fermi as an architecture has surpassed what our software can actually achieve, so right now there is no case for next-gen or for going beyond what the 590 can do, because as a single unit it is the best and most capable card out right now, period.

Even Kepler is going back to the drawing board, so the first PCBs that are released will actually be 560s/570s/580s, for OEMs.

In English: the 590 is going to be king for a while... even while it limps... I'd suggest you get used to the roof, because we're going to be here for at least the next 6 months.


----------



## ReignsOfPower

The GTX 580 Classified looks nice, but it's way too late in the lifecycle. There's no point spending a premium on a product that will be trumped by a reference version of the next-gen card in <5 months. I also think the ASUS Mars II is a load of junk: 1.5GB per GPU and twice the price, without a waterblock? BAH! IMO the best dual-GPU setup today is two 3GB EVGA GTX 580 HC2s. Can't go wrong: heaps of VRAM, a great power-delivery setup, a single-slot cooler. So much win.


----------



## Sir_Gawain

Perhaps you could finally shed light onto what brand of GTX 590 is being put into Alienware systems???

[URL=http://imageshack.us/photo/my-images/199/gpuzkd.gif/] Uploaded with ImageShack.us


----------



## Masked

Quote:



Originally Posted by *Sir_Gawain*


Perhaps you could finally shed light onto what brand of GTX 590 is being put into Alienware systems???


I knew that question would be coming one day, and I apologize, but I can neither comment nor speculate.

It is a reference design, but saying anything beyond that would cause me extreme grief.


----------



## Sir_Gawain

Quote:


> Originally Posted by *Masked;14640498*
> I knew that question would be coming one day, and I apologize, but I can neither comment nor speculate.
> 
> It is a reference design, but saying anything beyond that would cause me extreme grief.


LOL worth a shot I guess!


----------



## Masked

Quote:



Originally Posted by *Sir_Gawain*


LOL worth a shot I guess!


The issue is that some vendors are still very bitter about the 590 stock situation being what it is... and if I answer that, I start the "blame game", which actually comes back to hurt basically everyone in the long run. But that won't stop anyone from QQ'ing.

It is a quality product, and if you take it out and give it a gander (you're a smart boy), I'm sure you can figure out where it came from







...Beyond your observations, I just can't answer.


----------



## Smo

Quote:



Originally Posted by *Masked*


The issue is that some vendors are still very bitter about the 590 stock situation being what it is... and if I answer that, I start the "blame game", which actually comes back to hurt basically everyone in the long run. But that won't stop anyone from QQ'ing.

It is a quality product, and if you take it out and give it a gander (you're a smart boy), I'm sure you can figure out where it came from







...Beyond your observations, I just can't answer.


Sometimes I hate working in a job where I'm legally bound not to answer people's questions.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Smo;14640684*
> Sometimes I hate working in a job where I'm actually legally bound not to answer their questions.


Just out of curiosity, SMO, would that be Mi-5? Or Mi-6?


----------



## Smo

Quote:



Originally Posted by *Shinobi Jedi*


Just out of curiosity, SMO, would that be Mi-5? Or Mi-6?

















MI6 obviously, I work with Bond









In all seriousness though, I work at a game developer. All my mates want to know what games we're working on and what sort of new technology is on the way, but I had to sign a legal contract thicker than my arm, so unless information has already been officially released to the press I'm not at liberty to tell anyone anything, or I'll get my ass sued off!

It sucks sometimes; for instance, we're working on a huge title now which is looking amazing, but I can't so much as tell my mates what it's called!


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Smo;14650270*
> MI6 obviously, I work with Bond
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In all seriousness though I work at a game developer - all my mates want to know what games we're working on and what sort of new technology is on the way. I had to sign a legal contract thicker than my arm so unless information has already been officially released to the press I'm not at liberty to tell anyone anything or I'll get my ass sued off!
> 
> It sucks sometimes - for instance we're working on a huge title now which is looking amazing but I can't so much as tell my mates what it's called!


That's awesome! I spent some time many years ago as a lowly beta tester at Activision before they went evil.

Hopefully that huge title will be out for PC too! Though I'm not a biased gamer; I own consoles to play exclusives like Halo or the Uncharted series. If the game is good, I don't care about the platform.

And btw, working for a "game developer" seems to be a perfect cover for MI6


----------



## Smo

Quote:


> Originally Posted by *Shinobi Jedi;14656499*
> That's awesome! I spent some time many years ago as a lowly beta tester at Activision before they went evil.
> 
> Hopefully, that huge title will be out for PC's too! Though I'm not a biased gamer. I own consoles to play the exclusives like Halo or the Uncharted series. If the game is good, I don't care about the platform.
> 
> And btw, working for a "game developer" seems to be a perfect cover for MI6


I'm the same way; I have a 360 and a PS3 too. A good game is a good game regardless of what it's been built on. For instance, I can't wait for the new Uncharted and the latest Gears of War.

I'm pretty sure I'll get away with telling you that it will be released on all platforms, but that's about as far as it goes lol.

Personally, I can't wait to start playing it (I'm finishing work on another project first which only has about a month left).


----------



## RagingCain

Okay guys, I am going down for an extended period of time. The system is being torn apart for the 5th time in 4 weeks, and I frankly don't feel like putting it back together.

I will be selling my motherboards, memory, and CPUs.

I will be running any help/BIOS ops from my laptop.


----------



## Smo

Quote:


> Originally Posted by *RagingCain;14664124*
> Okay Guys, I am going down for an extended period of time. System is being torn apart for the 5th time in 4 weeks, and I frankly don't feel like putting it back together.
> 
> I will be selling my motherboards, memory, and CPUs.
> 
> I will be running any help/BIOS ops from my laptop.


That sucks dude, what happened?


----------



## Opp47

Quote:


> Originally Posted by *RagingCain;14664124*
> Okay Guys, I am going down for an extended period of time. System is being torn apart for the 5th time in 4 weeks, and I frankly don't feel like putting it back together.
> 
> I will be selling my motherboards, memory, and CPUs.
> 
> I will be running any help/BIOS ops from my laptop.


NOOOOO!!!!

What happened man???


----------



## RagingCain

Two things happened.

First:
3 bent pins on the CPU socket of a motherboard that was sitting in a protective box in my closet, last touched in April..... 3. Bent. A protective box on all sides, inside a packaging box, ready to ship if it ever sold. I have a guy who says he can fix it, but I trust him at beer pong, not so much with delicate stuff.

Second thing:
Engine Light
4th time this year
2006 VW Passat 2.0T -> Class 5 type to insure or repair.

Note: Class 5 is more expensive than Benz/Lexus









Edit: Third thing happened:
My ASUS G74SX just died. The one I bought 3 weeks ago. It's dead: no boot-up display, i.e. no GPU action. I've troubleshot everything on it except the CPU and GPU (no replacements). 99% sure the CPU is defect-free, as they usually are. 100% sure it's the cheap second-class GTX 560M, which is not the same one advertised on their website and is sold as the cheap Best Buy model.

P.S. There are two GPUs with the exact same name and marketing: GTX 560M. By the way, don't buy the Best Buy model; it's not the same as the one advertised on usa.asus.com.

SO RAGING CAIN is... slightly...... annoyed.

I'm actually looking at an Alienware M17x just to have something I can get fixed locally; we have a Dell repair depot here, which, by the way, rejected my employment application on the grounds of not having a recent A+ certification. Mine is over 12 years old (I got it when I was 15), and back then they didn't require a new one every 12 months.

So anywho....

Post powered by Droid X2


----------



## Masked

Quote:



Originally Posted by *Smo*


I'm the same way - I have a 360 and PS3 too. A good game is a good game regardless of what 's been built on. For instance I can't wait for the new Uncharted and the latest Gears of War.

I'm pretty sure I'll get away with telling you that it will be released on all platforms, but that's about as far as it goes lol.

Personally, I can't wait to start playing it (I'm finishing work on another project first which only has about a month left).


What sucks for me/my office is that we don't only alpha/beta the hardware, we also alpha/beta the software.

I once took a screenshot in here and it actually cost me 2 weeks of hardcore whining from the "powers that be" and I got "spanked" for it (They sent me out into the field to wire a building...bleh) so, with windows 8 on the horizon (cough, cough), I can't really participate which, is what's frustrating.

I can point and say "I think that's the problem" all day, but if I actually come out and say "bla bla bla is the actual problem", I violate whatever confidentiality/NDA I'm currently under, and it's like 2 weeks of paperwork back and forth whenever it happens, so I've resigned myself to whistling AND pointing profusely...

Quote:



Originally Posted by *RagingCain*


Two things happened.

First:
3 bent pins, cpu socket, motherboard, that was sitting in a protective box, in my closet, last touched in April..... 3. Bent. Protective box on all sides, inside a packaging box, ready to ship if it ever sold. I have a guy who says he can do it, but I trust him at beer pong, not so much delicate stuff.

Second thing:
Engine Light
4th time this year
2006 VW Passat 2.0T -> Class 5 type to insure or repair.

Note: Class 5 is more expensive than Benz/Lexus









Edit: Third thing happened:
ASUS G74SX just died. The one I bought 3 weeks ago. Its dead no boot up display i.e. no GPU action. Troubleshooted everything on it except CPU and GPU (no replacements). 99% sure CPU is defect free as they usually are. 100% Sure its the cheap second class GTX 560M which is not the same one advertised on their website that is sold as the cheap Best Buy model.

P.S. There are two GPUs with the exact same name and marketing: GTX 560M. By the way, don't buy the BestBuy model, not the same advertised on usa.asus.com in their models.

SO RAGING CAIN is... slightly...... annoyed.

Actually looking at an M17X Alienware just to have something I can get fixed locally, we have a Dell repair depot here, whom by the way rejected my employment application on the grounds of not having an A+ certification thats recent. Mine is over 12 years old, I got when I was 15, and they didn't require a new one every 12 months.

So anywho....

Post powered by Droid X2


Your first situation sucks, but technically you do have some rights in the form of an RMA, depending on the company... Look into it; you'd be surprised.

Your second situation blows... I just sold my turbo Malibu (yes, you can turbo a Malibu) that my friends and I had been working on for 2 years, because I felt the transmission starting to slip/go hardcore... That's a crazy expensive fix, so I traded it in for a new zoom-zoom plus payments (a 2011 Jeep Liberty Jet Edition).

The M17x is a nice buy... I'd definitely stand by them... I keep telling "the powers that be" that the only things that really matter anymore are that it looks pretty, that it functions, and, most of all, the warranty... People these days want their PCs fixed NOW, not in 2 or 3 weeks, NOW... So put out a better warranty!

I think you'll enjoy it...

I'm kinda bummed about my 590s lately... I was once so proud, and now it's kinda like bleh; I'd rather have the 6990s in for 90% of the work we're doing lately...

Hopefully something changes soon and be sure to make that call about the warranty, you'd be surprised what some companies would say, even after a year or 2.


----------



## Canis-X

Finally got my 590s blocked!! Idle temps are 30°C, and playing BLOPS the highest I get is 38°C (all stock). Looking good; now it's time for some OC'ing and benching.

Hope that you like it.




























Cheers!


----------



## RagingCain

Quote:


> Originally Posted by *Masked;14672371*
> What sucks for me/my office is that we don't only alpha/beta the hardware, we also alpha/beta the software.
> 
> I once took a screenshot in here and it actually cost me 2 weeks of hardcore whining from the "powers that be" and I got "spanked" for it (They sent me out into the field to wire a building...bleh) so, with windows 8 on the horizon (cough, cough), I can't really participate which, is what's frustrating.
> 
> I can point and say, "I think that's the problem all day" but, if I actually come out and say "Bla bla bla bla bla is the actual problem", I violate whatever confidentiality/NDA I'm currently under and it's like 2 weeks of paperwork back and forth whenever it happens so, I've resigned to whistling AND pointing profusely...
> 
> Your first situation sucks but, technically you do have some rights in the form of RMA depending on the company...Look into it, you'd be surprised.
> 
> Your second situation blows...I just sold my turbo malibu...(Yes, you can turbo a malipu) That my friends and I were working on for 2 years because I felt the transmission starting to slip/go hardcore...Is a crazy expensive fix so, traded it in for a new zoom zoom + payments (Jeep Liberty 2011 Jet Edition).
> 
> The M17x's are a nice buy...Would definitely stand by them...I keep telling "the powers that be" the only thing that really matters anymore is that it looks pretty, it functions but, most of all, the warranty...People these days want their PC's fixed and NOW, not in 2 weeks or 3 weeks, NOW...So, put out a better warranty!
> 
> I think you'll enjoy it...
> 
> I'm kinda bummed about my 590's lately...I was once so proud and now it's kinda like bleh, I'd rather have the 6990's in for 90% of the work we're doing lately...
> 
> Hopefully something changes soon and be sure to make that call about the warranty, you'd be surprised what some companies would say, even after a year or 2.


I will give them a call and see what they say about the board.

The engine light was a broken hose clamp on one of the fuel lines: it was leaking vapors because it had been knocked ever so slightly loose, so the VW's ECU detected the loss of pressure and flashed me the light. It was only $95 to diagnose, the part was free, and I got a full free detail and an air-to-nitrogen tire switch-over while I waited.

I am honestly not disappointed with the 590s. I totally was for a while, but then I did the benchmark thread. After seeing how much I could get out of them, all that's left is for me to clean up the OC thread, and maybe add some OC tips I go by as well. More than anything, we need better GPU usage/drivers. The ball is in NVIDIA's court, and as much as they still "suck", they have come a long, long way since the release drivers.

Edit:
Best Buy gave me a 100% refund including taxes, an apology, and a choice of a $50 gift card or a free printer. I took the gift card and got some Blu-rays. I spoke to the manager the whole time; he said he would immediately tell corporate about the advertising issue, felt truly bad, and gave me a free 1L Aquafina, my favorite water / carcinogenic bottle.

First day back on campus, and first day of a new job on campus... which I had to leave early because of my car issues. Probably not the best impression. What the hell.

P.S. I hate freshmen. Despite a bagillion of them, I had a good day.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked;14672371*
> What sucks for me/my office is that we don't only alpha/beta the hardware, we also alpha/beta the software.
> 
> I once took a screenshot in here and it actually cost me 2 weeks of hardcore whining from the "powers that be" and I got "spanked" for it (They sent me out into the field to wire a building...bleh) so, with windows 8 on the horizon (cough, cough), I can't really participate which, is what's frustrating.
> 
> I can point and say, "I think that's the problem all day" but, if I actually come out and say "Bla bla bla bla bla is the actual problem", I violate whatever confidentiality/NDA I'm currently under and it's like 2 weeks of paperwork back and forth whenever it happens so, I've resigned to whistling AND pointing profusely...
> 
> Your first situation sucks but, technically you do have some rights in the form of RMA depending on the company...Look into it, you'd be surprised.
> 
> Your second situation blows...I just sold my turbo malibu...(Yes, you can turbo a malipu) That my friends and I were working on for 2 years because I felt the transmission starting to slip/go hardcore...Is a crazy expensive fix so, traded it in for a new zoom zoom + payments (Jeep Liberty 2011 Jet Edition).
> 
> The M17x's are a nice buy...Would definitely stand by them...I keep telling "the powers that be" the only thing that really matters anymore is that it looks pretty, it functions but, most of all, the warranty...People these days want their PC's fixed and NOW, not in 2 weeks or 3 weeks, NOW...So, put out a better warranty!
> 
> I think you'll enjoy it...
> 
> I'm kinda bummed about my 590's lately...I was once so proud and now it's kinda like bleh, I'd rather have the 6990's in for 90% of the work we're doing lately...
> 
> Hopefully something changes soon and be sure to make that call about the warranty, you'd be surprised what some companies would say, even after a year or 2.


What's got you bummed about them? Just a few days ago you were saying the 590s were the roof. I like AMD fine; my 5870M in my ASUS G73JH-A1 rocks. I guess the 6990s might seem preferable if you only use them at work and don't have to pay for them, but for me noise really was the deciding factor against them. They're just too freakin' loud for the fps advantage they get over the 590 in the few titles where they do.

Unless there's something about the 6990's I'm missing?

As for Alienware, I love my M11x-R2. Love it more than my Asus G73JH even though it is way beefier.

An M17x would rock. But make sure to buy an extended warranty for it at time of purchase, as Dell ridiculously hikes the price up if you want to buy it later. I think it's always good to get the extended warranty on notebooks. They're small, lots of things can go wrong.

And you definitely want it with Dell. The M11x had a nasty hinge issue that resulted in an inordinate number of owners with display panels cracked or broken off from the keyboard. Dell had the same issue with their M15x notebook, and after those owners raised a stink, they got free replacements if the hinge broke. Yet, knowing of the problem, Dell went with the same design for the M11x and wasn't taking any action on it, until we M11x owners had to raise hell so that it made the front page of tech/web sites, and now we have the same policy for free hinge replacements. To be fair, my hinge is still going strong. Hinge issue aside, I think the quality of Alienware is awesome, and any quality issues one may get really come down to whoever assembled your notebook. Some people put them together better than others.

Yet it should be said, that in my experience as an Alienware owner, Dell has given me some of the best customer service I've had. My next notebook will most likely be an Alienware, unless the competition comes out with something markedly better by time of purchase.

I love my Asus notebook too. But you do have to be careful where you buy them as Best Buy models and even some on Newegg have inferior panels and I guess inferior GPU's.

I hope you get your desktop up and running soon, RagingCain! I went through a gaming notebook phase last year and that's how I ended up with my M11x and Asus G73JH-A1. And while they're amazing if you're traveling and need to get your game on, you can see where I'm back at now...

Though I can totally understand being sick of having to deal with tear downs and rebuilds on desktops so that it drives you to a notebook.

I trust you'll get it worked out soon!


----------



## RagingCain

Quote:


> Originally Posted by *Shinobi Jedi;14680528*
> What's got you bummed on them? It was just a few days ago you were saying the 590's were the roof. I like AMD fine. My 5870m in my Asus G73JH-A1 rocks. I guess the 6990's might seem more preferable if you only use them at work and didn't have to pay for them. But for me, noise really was the deciding factor against them. They're just too freakin' loud for the fps difference it gets over the 590 in the few titles it does.
> 
> Unless there's something about the 6990's I'm missing?
> 
> As for Alienware, I love my M11x-R2. Love it more than my Asus G73JH even though it is way beefier.
> 
> An M17x would rock. But make sure to buy an extended warranty for it at time of purchase, as Dell ridiculously hikes the price up if you want to buy it later. I think it's always good to get the extended warranty on notebooks. They're small, lots of things can go wrong.
> 
> And you definitely want it with Dell. The M11x had a nasty hinge issue that resulted in an inordinate number of owners with display panels cracked or broken off from the keyboard. Dell had the same issue with their M15x notebook, and after those owners raised a stink, they got free replacements if the hinge broke. Yet, knowing of the problem, Dell went with the same design for the M11x and wasn't taking any action on it, until we M11x owners had to raise hell so that it made the front page of tech/web sites, and now we have the same policy for free hinge replacements. To be fair, my hinge is still going strong. Hinge issue aside, I think the quality of Alienware is awesome, and any quality issues one may get really come down to whoever assembled your notebook. Some people put them together better than others.
> 
> Yet it should be said, that in my experience as an Alienware owner, Dell has given me some of the best customer service I've had. My next notebook will most likely be an Alienware, unless the competition comes out with something markedly better by time of purchase.
> 
> I love my Asus notebook too. But you do have to be careful where you buy them as Best Buy models and even some on Newegg have inferior panels and I guess inferior GPU's.
> 
> I hope you get your desktop up and running soon, RagingCain! I went through a gaming notebook phase last year and that's how I ended up with my M11x and Asus G73JH-A1. And while they're amazing if you're traveling and need to get your game on, you can see where I'm back at now...
> 
> Though I can totally understand being sick of having to deal with tear downs and rebuilds on desktops so that it drives you to a notebook.
> 
> I trust you'll get it worked out soon!


Yay! MOBO IS DEAD!

Post powered by DROID X2


----------



## MKHunt

Quote:



Originally Posted by *RagingCain*


Yay! MOBO IS DEAD!

Post powered by DROID X2


Ugh can we switch MOBO statuses? I want to RMA this board so badly. I started updating my build log with overclocking but it just turned into a long rant. Posted it anyway as a warning.

I'm quite satisfied with my 590. It seems to overclock pretty decently. Too bad my motherboard and RAM are holding it back.


----------



## L D4WG

Can't find any GTX 590's in stock online anywhere in Australia. What's the deal, and why doesn't EVGA.com.au have the GTX 590 up at all?

My GF is going to Vegas soon; I'll see if EVGA USA can ship to her hotel.

EDIT: EVGA US is out of stock... Where are they all?


----------



## Tonza

I just ordered a GTX 590. There was a very nice special offer at a Finnish retailer (Asus GTX 590 for 495 euros!). Can't wait to get it; my AX750 should be fine at stock voltage.


----------



## rush2049

590's are getting harder and harder to find new..... might I suggest the 'buy from a dissatisfied owner' route? (eBay, Amazon, overclock.net, etc.)


----------



## Wogga

Over the last week the price rose from $715 to $890 in our snowy country =)
Glad that I got my second Palit for $715.
They're in stock, and even new shops are starting to sell them, but prices grow like a teenager's acne.
There are a lot of Palits, some Zotac cards, Gigabyte ones, PoV, and even a couple of Gainwards,
and a "Point of View GeForce GTX 590 691Mhz PCI-E 2.0" that is out of stock O_O Those are awesome stock clocks.

And I'm still confused: will my Sabertooth P67 accept two 590s? There is a quad-SLI certification in the mobo spec on asus.com (although there are only two PCI-E x8/x16 slots onboard, so I think they meant two 590s), and the Sabertooth is on the "supported motherboards" list on geforce.com, but there is plenty of info in sticky threads on different forums saying P67 doesn't even support SLI itself...
And if there are only 16 lanes physically, and one 590 splits those 16 lanes between its 2 GPUs (x8/x8), will two 590s run at x8/x8/x8/x8 or x4/x4/x4/x4? As I understand it, one slot is x16 and the other is x8. If I use only the first one, it works as x16 for a 580 (single-GPU example) and as a virtual x16/x16 for a 590 (physically x8/x8). If I use both slots, they each work as x8, so SLI 580s would run @ x8/x8 (physical) and SLI 590s would run at a virtual x8/x8 + x8/x8 but a physical x4/x4 + x4/x4?
And if it is x4/x4 + x4/x4, won't performance suck?


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi;14680528*
> What's got you bummed on them? It was just a few days ago you were saying the 590's were the roof. I like AMD fine. My 5870m in my Asus G73JH-A1 rocks. I guess the 6990's might seem more preferable if you only use them at work and didn't have to pay for them. But for me, noise really was the deciding factor against them. They're just too freakin' loud for the fps difference it gets over the 590 in the few titles it does.


Everything I run, is always full water because I beat the living crap out of my hardware...Like I said, beta office









It's disappointing because in SLI the profiles just aren't there to do anything with both cards...and only 1 core is active in 3/4 of the programs/etc. I choose to use so, essentially, I'm limited to 1 core in any activity I do.

Regardless of whether it's the roof or not, if I can only run my 590's at 100% in Heaven and not in games or real-time functions (they JUST added functionality for CS5), then it's really pointless for my office.

If I want the functionality, I have to rip a driver apart, edit it completely and even then there are issues because the custom driver doesn't mesh with the programs/games I have to use on a daily basis.

From the perspective of an enthusiast, I would say that the 590's are disappointing because even though the beef is there, the functionality is absolutely not.

I loathe the 6990's to be absolutely honest...Whenever I have to put them back in to do a bench, I groan because they're obnoxious even on water...Hot, finicky, just a total and complete ex-girlfriend IMO; however, I never have a problem getting 2/2 or 4/4 on anything we use...

So from my perspective, they aren't all they could be, and I understand their potential, which makes it even worse.
Quote:


> Originally Posted by *Wogga;14683018*
> And I'm still confused: will my Sabertooth P67 accept two 590s? There is a quad-SLI certification in the mobo spec on asus.com (although there are only two PCI-E x8/x16 slots onboard, so I think they meant two 590s), and the Sabertooth is on the "supported motherboards" list on geforce.com, but there is plenty of info in sticky threads on different forums saying P67 doesn't even support SLI itself...
> And if there are only 16 lanes physically, and one 590 splits those 16 lanes between its 2 GPUs (x8/x8), will two 590s run at x8/x8/x8/x8 or x4/x4/x4/x4? As I understand it, one slot is x16 and the other is x8. If I use only the first one, it works as x16 for a 580 (single-GPU example) and as a virtual x16/x16 for a 590 (physically x8/x8). If I use both slots, they each work as x8, so SLI 580s would run @ x8/x8 (physical) and SLI 590s would run at a virtual x8/x8 + x8/x8 but a physical x4/x4 + x4/x4?
> And if it is x4/x4 + x4/x4, won't performance suck?


Sabertooths should accept SLI in dual slots but, in regards to the "lanes", it doesn't matter much anymore...The difference on previous MBs was shown to be only 2/3/4% in terms of processing so, realistically, performance in any slot is absolutely possible.

I don't think anyone has done any performance tests on the P67 but, I'd ask around if I were you.


----------



## Wogga

I already found one review of 590 SLI and 6990 CF on a variety of motherboards that are certified for quad, and they said (and the graphs "said") that there is really not much performance loss (2-3%, as you pointed out).
For now I'm OK with it. If Asus makes a Sabertooth XXX for the next-gen processors with native x16/x16, then I'll think about replacing my current stuff.


----------



## L D4WG

Quote:


> Originally Posted by *rush2049;14682584*
> 590's are getting harder and harder to find new..... might I suggest the 'buy from a dissatisfied owner' route? (eBay, Amazon, overclock.net, etc.)


Haha, not a chance. If I'm going to drop that much coin on a GPU, it's going to be new with a warranty.

I just don't understand why they have none in stock??


----------



## Masked

Quote:


> Originally Posted by *L D4WG;14683914*
> Haha, not a chance. If I'm going to drop that much coin on a GPU, it's going to be new with a warranty.
> 
> I just don't understand why they have none in stock??


The 590's were a limited release of X-units.

I can't give out the X but, they're almost all out of stock.

I may be selling my personal 590 hydro soon and they are definitely getting harder and harder to find!


----------



## bah73

Hi, does anyone know if this cable http://www.pccasegear.com/index.php?main_page=product_info&cPath=19_1155&products_id=14918 is the same as the power cable for the 8-pin connections on the Asus GTX 590 card?


----------



## Forty-two

"The 590's were a limited release of X-units."

So what do you think will be NVIDIA's next step after these are all sold? They can't just leave a vacuum.


----------



## Levesque

Quote:


> Originally Posted by *Masked;14683685*
> I loathe the 6990's to be absolutely honest...Whenever I have to put them back in to do a bench, I groan because they're obnoxious even on water...Hot, finicky, just a total and complete ex-girlfriend IMO; however, I never have a problem getting 2/2 or 4/4 on anything we use...


I'm just curious. I see you posting this a lot about the 6990 in a lot of different threads. What are your ''problems'' with the 6990? You're always vague, and never give any real-world examples.

And when watercooled, the 6990 runs really cool. I'm idling at 30 Celsius and never go over 42-43 Celsius even under really high loads, even with a modded BIOS raising the voltage to 1.25v.

So if you are always watercooling your cards, why are you saying that the 6990 is ''hot''? They are definitely not. Look at my temps.

And I don't have a single problem with my Quad-Fire set-up with every game I have (over 100 in my Steam account), and with every piece of software I use. Rock-stable. So what ''problems'' are you having? PEBKAC?

Plain curiosity. So don't go ballistic. Just explain those ''problems'' with real examples.


----------



## Opp47

Quote:


> Originally Posted by *Forty-two;14685126*
> "The 590's were a limited release of X-units."
> 
> So what do you think will be NVIDIA's next step after these are all sold? They can't just leave a vacuum.


Sure they can... that's what LIMITED means.

The classifieds are COMPLETELY gone..


----------



## MKHunt

Quote:


> Originally Posted by *Wogga;14683018*
> And I'm still confused: will my Sabertooth P67 accept two 590s? There is a quad-SLI certification in the mobo spec on asus.com (although there are only two PCI-E x8/x16 slots onboard, so I think they meant two 590s), and the Sabertooth is on the "supported motherboards" list on geforce.com, but there is plenty of info in sticky threads on different forums saying P67 doesn't even support SLI itself...
> And if there are only 16 lanes physically, and one 590 splits those 16 lanes between its 2 GPUs (x8/x8), will two 590s run at x8/x8/x8/x8 or x4/x4/x4/x4? As I understand it, one slot is x16 and the other is x8. If I use only the first one, it works as x16 for a 580 (single-GPU example) and as a virtual x16/x16 for a 590 (physically x8/x8). If I use both slots, they each work as x8, so SLI 580s would run @ x8/x8 (physical) and SLI 590s would run at a virtual x8/x8 + x8/x8 but a physical x4/x4 + x4/x4?
> And if it is x4/x4 + x4/x4, won't performance suck?


While the Sabertooth (and all Sandy Bridge platforms w/o an NF200) is limited to x8/x8, your cards will not run at x4/x4/x4/x4, because the 590 itself has an NForce 200 chip on it. It's the same chip that the very expensive SB motherboards use to do x16/x16 rather than x8/x8. If you open GPU-Z right now you'll see that your card reports x16/x16 even though your SB processor only has 16 lanes. The NF200 artificially splits the lanes in two and assigns similar workloads to each split lane.

The NF200 is cool, but it does add latency into the mix. It's nothing you would ever notice, but it is there. Where it gets confusing (and possibly noticeable) is when you have an NF200-equipped motherboard like the Asus P67 WS Revolution or Gigabyte P67-UD7: your PCIe lanes are split in two by the NF200 on the motherboard, then split again by the NF200 on the 590s. It could be interesting to study the effects of stacking NF200 chips.

But yes, your 590's will run x8/x8/x8/x8. I hope you run benches at the same voltages and speeds as RC did so we can see the difference between quad on SB vs quad on 1366
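If you want to double-check what link width each GPU actually negotiated (rather than what's presented through the NF200), `sudo lspci -vv` on Linux prints it on the `LnkSta` line. A minimal sketch of pulling the width out; the `LnkSta` string below is sample text of the kind `lspci -vv` prints, not live output, and the bus address in the comment is hypothetical:

```python
import re

# On a live system you would grab the line with something like:
#   sudo lspci -vv -s 01:00.0 | grep LnkSta   (01:00.0 is a hypothetical bus address)
# Here we parse a sample LnkSta line instead:
sample = "LnkSta: Speed 5GT/s, Width x16, TrErr- Train- SlotClk+"

def link_width(lnksta_line: str) -> int:
    """Pull the negotiated PCIe link width out of an lspci LnkSta line."""
    m = re.search(r"Width x(\d+)", lnksta_line)
    if m is None:
        raise ValueError("no Width field found")
    return int(m.group(1))

print(f"negotiated width: x{link_width(sample)}")
```

Running it against each GPU's `LnkSta` line is a quick way to see whether you're actually at x8 upstream regardless of the x16 the tools report downstream of the NF200.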









ETA:
Quote:


> Originally Posted by *Opp47;14685745*
> Sure they can... that's what LIMITED means.
> 
> The classifieds are COMPLETELY gone..


I'm SO glad I got mine before they dried up!


----------



## Opp47

Quote:


> Originally Posted by *MKHunt;14685854*
> I'm SO glad I got mine before they dried up!










ME TOOOO!!!!!!

Anyone got any good ideas for a set of SLI fittings for use on my 590s in my RIIIE BE mobo?? I have 'em slotted right next to each other, so no spacing.

Red or black only, please.


----------



## kainwalker

Quote:


> Originally Posted by *Tonza;14682000*
> I just ordered GTX 590
> 
> 
> 
> 
> 
> 
> 
> There was very nice special offer in finnish retailer (Asus GTX 590 for 495 euros!). Cant wait to get it, my AX 750 should be fine on stock voltage.


Hi, is there any chance they ship internationally? Thanks!


----------



## Masked

Quote:


> Originally Posted by *Levesque;14685255*
> I'm just curious. I see you posting this a lot about the 6990 in a lot of different threads. What are your ''problems'' with the 6990? You're always vague, and never give any real-world examples.
> 
> And when watercooled, the 6990 runs really cool. I'm idling at 30 Celsius and never go over 42-43 Celsius even under really high loads, even with a modded BIOS raising the voltage to 1.25v.
> 
> So if you are always watercooling your cards, why are you saying that the 6990 is ''hot''? They are definitely not. Look at my temps.
> 
> And I don't have a single problem with my Quad-Fire set-up with every game I have (over 100 in my Steam account), and with every piece of software I use. Rock-stable. So what ''problems'' are you having? PEBKAC?
> 
> Plain curiosity. So don't go ballistic. Just explain those ''problems'' with real examples.


Levesque, you read my posts...Where do I operate from? A BETA office...That means I basically have the opinion of a hardcore, 24/7/365 enthusiast...I actually run my hardware HARDER than the average enthusiast, and I never back down, because that would defeat the purpose of being in this office...So no offense, but I don't care what temps you see...It's irrelevant to me.

There's a reason I don't tread around the 6990 forums...90% of the information is childish and irrelevant.

I run the 6990's at 90-100%, in 24hr operation...They do run hot, in every single beta application I run...In fact, they heat my office in the winter, which happens to be a nice bonus considering how cheap Dell is...But the fact is, both of mine are at 65C atm with an 8C delta...Yeah, that's right...They have their own loop with an XSPC 360 in push/pull...

I also fold, which kicks up the heat immensely...

I find the size awkward for getting into any case...I have many issues with AMD's driver releases but, my biggest issue of all is that if I have an issue with an AMD driver, I'm stuck with that issue.

Crossfire is NOW actually working 90%-ish, finally! When those comments were made, it had been noted on the AMD forums that for MOST users it was operating @ 60-70%, and sometimes CF wasn't even responding...Mine were operating at 50% for almost every beta app I had.

Another +1 to AMD would be that the drivers actually scale and handle most beta apps that are "in line" with their predecessors very well. If they're not...you might as well get a new job because it's not-a-changin'.

By the same token, my 590's bring a constant love/hate relationship...I LOVE the fact that if I have a driver issue, I CAN FIX IT MOST of the time...I'm not locked into that problem until AMD decides to fix it...I HATE the fact that if I don't spend 2-3 hours customizing my drivers, I only see 1-core performance...However, I do see better performance than the 6990's ONCE that's done.

At 100% load on the 590's, I see MAX 40c, the 590 is most definitely a cooler card...

Just adding to the above: on the 590, in ~80% of my applications, when/if I get both cores to work, I get better performance from the 590 than I do from the 6990's...At a cooler temperature, too...Most benches ACTUALLY support me on that little tidbit.

Let's not forget to mention that I've been through six 6990's and have yet to RMA my 590's in the same time frame...

My preference is the 590's but, I'm still disappointed in the driver support and the current "stance" NVIDIA has on the platform...It's not perfect, it's far from perfect, and until I can straight-up download a working driver and see all 4 cores perform in UNISON, I'm going to be dissatisfied...

Oh, P.S. in your case, it seems to be an ID-10T...


----------



## Wogga

Quote:


> Originally Posted by *MKHunt;14685854*
> While the Sabertooth (and all Sandy Bridge platforms w/o an NF200) is limited to x8/x8, your cards will not run at x4/x4/x4/x4, because the 590 itself has an NForce 200 chip on it. It's the same chip that the very expensive SB motherboards use to do x16/x16 rather than x8/x8. If you open GPU-Z right now you'll see that your card reports x16/x16 even though your SB processor only has 16 lanes. The NF200 artificially splits the lanes in two and assigns similar workloads to each split lane.
> 
> The NF200 is cool, but it does add latency into the mix. It's nothing you would ever notice, but it is there. Where it gets confusing (and possibly noticeable) is when you have an NF200-equipped motherboard like the Asus P67 WS Revolution or Gigabyte P67-UD7: your PCIe lanes are split in two by the NF200 on the motherboard, then split again by the NF200 on the 590s. It could be interesting to study the effects of stacking NF200 chips.
> 
> But yes, your 590's will run x8/x8/x8/x8. I hope you run benches at the same voltages and speeds as RC did so we can see the difference between quad on SB vs quad on 1366
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ETA:
> 
> I'm SO glad I got mine before they dried up!


Yes, I know, but physically there will still be x4 for each if they all run @ 100%.

Damn...ordering in Russia =) We have a lot of 590's but no money to buy them

(except EVGA ones, but that is understandable)


----------



## Arizonian

@Levesque

Hey Levesque, while I have your attention on the 590 thread.

We are missing your 6990 benchmarks in the Team Green vs. Team Red - The Friendly OCN Battle Royale thread. Originally the 6990 vs 590 bench-off. There aren't any 6990 quads being represented atm.

That's all....just wanted to size them up quad vs quad against the 590's. Ragin has benches he's done. Also some others with single 590's posted.

Take it easy.


----------



## L D4WG

Quote:



Originally Posted by *Masked*


The 590's were a limited release of X-units.

I can't give out the X but, they're almost all out of stock.

I may be selling my personal 590 hydro soon and they are definitely getting harder and harder to find!


Really, I'm so disappointed.

The EVGA cards always look the best. I don't understand why other companies put ugly stickers all over them...

I found a Palit GTX 590; I don't know anything about the Palit brand, are they OK?

If you were selling a non-hydro I would consider it, but my system isn't water cooled.


----------



## MKHunt

Soooo, I put this to you knowledgeable gentlemen. Witcher 2 ran flawlessly on the 280.19 drivers, even OC'ed to 740. What is happening? PDL? Thermal? (Though it tops out at 53?) I am so confused.


----------



## L D4WG

Quote:



Originally Posted by *bah73*


Hi, does anyone know if this cable http://www.pccasegear.com/index.php?...ducts_id=14918 is the same as the power cable for the 8-pin connections on the Asus GTX 590 card?


Hey buddy, I looked into this a while back, and I'm pretty sure they are not the same 8-pin connectors; you need the PCI-E 8-pin, which PC Case Gear doesn't stock. I was looking for them myself.


----------



## rush2049

Quote:



Hi, does anyone know if this cable http://www.pccasegear.com/index.php?...ducts_id=14918 is the same as the power cable for the 8-pin connections on the Asus GTX 590 card?


That is for ATX motherboards, for when you want to route the power cable behind the motherboard tray and up over the top, and/or when it just plain doesn't reach to begin with.

Quote:



Soooo, I put this to you knowledgeable gentlemen. Witcher 2 ran flawlessly on the 280.19 drivers, even OC'ed to 740. What is happening? PDL? Thermal? (Though it tops out at 53?) I am so confused.


The 280.26 drivers have a hard voltage limit, like all the drivers before 280.19.
If you notice, your voltage adjustment is greyed out; even if you enabled it (in the options) you would have a max of whatever factory default the card came with. This is set in the BIOS.

RagingCain has offered modified BIOSes that raise the default and fallback voltages to allow adjustment. Go to this thread for more info:
http://www.overclock.net/nvidia/1063...ng-thread.html

As far as GPU usage, that looks about right. What I am seeing is the GPUs hitting whatever frame-rate cap you have enabled. Whether it be vertical sync or a hard frames/sec limit enforced by the code, I do not know (not familiar with Witcher 2). When it hits this cap it has less usage, but then falls behind and goes back up to 100%.... rinse and repeat.

For that one long dip to 10% in the middle there, I am assuming that is where a loading screen was, or you alt+tabbed out of the game to the desktop, or used the steam overlay and it hung for a bit (seen it happen).....

So in conclusion, from as far as I can tell.... your card is performing exactly as it should. Or am I missing something?
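For what it's worth, the cap-then-catch-up behaviour above can be sketched with a toy utilization model. All the numbers here are made up for illustration, not measurements from a 590:

```python
# Toy model: a GPU that renders a frame in 8 ms, capped at 60 fps (~16.67 ms/frame).
# Utilization = busy time / wall time; the cap leaves the GPU idle for part of
# each frame interval, so reported usage sits well below 100%.
RENDER_MS = 8.0        # hypothetical time the GPU needs per frame
CAP_MS = 1000.0 / 60   # frame budget enforced by vsync / a frame limiter

def capped_utilization(render_ms: float, cap_ms: float) -> float:
    """Fraction of wall time the GPU is busy when a frame cap is active."""
    frame_time = max(render_ms, cap_ms)  # frames can't present faster than the cap
    return render_ms / frame_time

print(f"capped:   {capped_utilization(RENDER_MS, CAP_MS):.0%}")    # well under 100%
print(f"uncapped: {capped_utilization(RENDER_MS, RENDER_MS):.0%}")  # pegged at 100%
```

In a real game the per-frame cost varies scene to scene, so utilization bounces between the capped value and 100% whenever the GPU momentarily falls behind, which is exactly the sawtooth pattern in the usage graph.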


----------



## MKHunt

Quote:



Originally Posted by *rush2049*


As far as GPU usage, that looks about right. What I am seeing is the GPUs hitting whatever frame-rate cap you have enabled. Whether it be vertical sync or a hard frames/sec limit enforced by the code, I do not know (not familiar with Witcher 2). When it hits this cap it has less usage, but then falls behind and goes back up to 100%.... rinse and repeat.

For that one long dip to 10% in the middle there, I am assuming that is where a loading screen was, or you alt+tabbed out of the game to the desktop, or used the steam overlay and it hung for a bit (seen it happen).....

So in conclusion, from as far as I can tell.... your card is performing exactly as it should. Or am I missing something?


I know all about the voltage and whatnot, but what's confusing is that it has never behaved like this in the game before. I was previously running it hotter and harder with beta drivers and beta Afterburner. Vsync/frame caps are disabled, since the avg. fps according to geforce.com with this game maxed (sans ubersampling) is either 62 or 72. Seems to be running fine now, so it must have been a fluke of some sort. Usually 99% GPU usage on both dies.


----------



## rush2049

Well.... the 280.26 drivers do have some performance/compatibility improvements.... and more sli profiles...... That is what you are seeing....


----------



## L D4WG

I should receive my 590 in the mail either today or tomorrow. Can anyone tell me, if I'm currently using the 280.26 drivers with my GTX 275, do I need to do anything when I put my 590 in? Like a clean driver install of 280.26? Or will it work fine with what's currently installed?

I'll also post pics so I can be added to the club.


----------



## Shinobi Jedi

There's another thread on the EVGA boards linking to a German site that claims they saw a "GTX 595" running BF3 at Gamescom, and that it is an upcoming card with proper VRAM chips and higher clocks, closer to a true 580..

Hmm... Another rumor? Or a placeholder while Kepler is delayed? Sorry I don't have a link, I'm posting from my phone...

Oh, and I am super psyched I got my cards before they disappeared too


----------



## Shinobi Jedi

Quote:



Originally Posted by *L D4WG*


I should receive my 590 in the mail either today or tomorrow. Can anyone tell me, if I'm currently using the 280.26 drivers with my GTX 275, do I need to do anything when I put my 590 in? Like a clean driver install of 280.26? Or will it work fine with what's currently installed?

I'll also post pics so I can be added to the club.


Yes, you will need to reinstall your drivers. Congratulations on your purchase!


----------



## Wogga

Quote:



Originally Posted by *L D4WG*


Really, I'm so disappointed.

The EVGA cards always look the best. I don't understand why other companies put ugly stickers all over them...

I found a Palit GTX 590; I don't know anything about the Palit brand, are they OK?

If you were selling a non-hydro I would consider it, but my system isn't water cooled.


I've got two Palit 590s. They're just the same as the others; basically all 590s come from one factory and all are reference. Asus, Palit and the rest just put their stickers on them, and that's all the difference; they run just the same as 590s from other brands. The Palits are cheaper because the box contains only the card itself, two 6-pin-to-8-pin cables, a driver CD and a little booklet.
And I would recommend 280.19: you can use voltage tweaking for a small OC, and even on air you can get some MHz.

As for the 595: there has been no news about it since May. I think it's just the previous name of the MARS II.


----------



## L D4WG

Quote:


> Originally Posted by *Shinobi Jedi;14693347*
> Yes, you will need to reinstall your drivers. Congratulations on your purchase!


Thank you, and I cannot wait to smash some full-spec gaming!!!


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi;14693339*
> There's another thread on the EVGA boards linking to a German site that claims they saw a "GTX 595" running BF3 at Gamescom, and that it is an upcoming card with proper VRAM chips and higher clocks, closer to a true 580..
> 
> Hmm... Another rumor? Or a placeholder while Kepler is delayed? Sorry I don't have a link, I'm posting from my phone...
> 
> Oh, and I am super psyched I got my cards before they disappeared too


...Kepler isn't exactly delayed...They're having issues applying...Well, I can't say really.

I don't think they'll do a 595 but, considering how much free time they have now, who knows.

The MARS2 is the only "595" currently out that I would think the rumor is about.

As to the future, well, that depends entirely on Nvidia and if they'd incorporate the new arch into the "590"s...


----------



## Wogga

Tried to play Deus Ex: Human Revolution. GPU load is <= 50% (on each GPU) -_-
Do I need any SLI profile tweaking?


----------



## Shinobi Jedi

I'm getting like 200+ fps. With frame rates like that, do the GPUs running at full load 100% of the time really matter?


----------



## Wogga

It's just like if your car were supposed to eat 15 litres per 100km and cover those 100km at 150km/h, but instead it eats the same amount of gasoline at 75km/h and won't go any faster.
I want full usage =)


----------



## L D4WG

Add me add me!!

Palit GTX 590, Stock


----------



## emett

Hey i've got one of them.


----------



## Wogga

and update me!


----------



## L D4WG

Go the Palit cards


----------



## Smo

Welcome to the club, officially


----------



## Tept

Deus Ex runs oddly for me. My GPU usage is incredibly erratic, causing chop in-game. Literally watching the graph in MSI Afterburner, the GPUs are going 70>30>70>20>80>30 at certain points, not always. But when it does that chopping, it's horribly debilitating.


----------



## Smo

Quote:


> Originally Posted by *Tept;14708772*
> Deus Ex runs oddly for me. My GPU usage is incredibly erratic, causing chop in-game. Literally watching the graph in MSI Afterburner, the GPUs are going 70>30>70>20>80>30 at certain points, not always. But when it does that chopping, it's horribly debilitating.


What drivers and in-game settings are you using mate?


----------



## Tept

I seem to have fixed it. I guess I forgot to turn Power settings back to Prefer Maximum Performance after trying the 280.26 drivers. Turning it from Adaptive to Prefer Max seems to have solved it. And @ Smo I'm running the highest settings the game allows on DX11.


----------



## Smo

Quote:


> Originally Posted by *Tept;14709184*
> I seem to have fixed it. I guess I forgot to turn Power settings back to Prefer Maximum Performance after trying the 280.26 drivers. Turning it from Adaptive to Prefer Max seems to have solved it. And @ Smo I'm running the highest settings the game allows on DX11.


All these settings are a bit of a joke - we should get the most out of these cards straight out the box with no fuss. I can't imagine how frustrating a card like this would be for less experienced users who wouldn't know what to do or where to look with regards to overcoming some of these problems.


----------



## Tept

OK, apparently that didn't completely fix it.

Here's what my GPU usage looks like.










Tried turning voltage on my card down a hair to see if maybe PDL was responsible or some other stupid idea by Nvidia.


----------



## Wogga

After an hour of playing it just freezes. I've already had to reboot twice -_-


----------



## Jeci

I wish I had the moneys for one, let alone two!


----------



## Smo

Quote:


> Originally Posted by *Tept;14709350*
> Ok apparently that didn't completely fix it.
> 
> Heres what my GPU usage looks like.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tried turning voltage on my card down a hair to see if maybe PDL was responsible or some other stupid idea by Nvidia.


I just got my copy of Deus Ex and I've encountered the same problem. My usage sits at ~50% but fluctuates constantly. Despite getting 60-130fps it's choppy at times, which to me is unacceptable.

Have you made any progress?

I'm on my way to work at the moment so there's nothing I can do but I'm considering using NVIDIA Inspector to delete the Deus Ex profile and add the .exe to either 3DMark11's profile or maybe Crysis: Warhead and see what happens.


----------



## Shinobi Jedi

Some thoughts on Deus Ex:

I've only got a chance to play a bit of the first level. But here's what I've noticed so far:

1. Major differences in Drivers.

After the opening movies end and it goes straight to the game engine, where you're in the office and the girl is waiting for you by the door (VSync off):

On 280.19 - 74fps

On 280.26 - 225fps (Clearly there is an optimized SLI profile in this driver)

On the 280.26 driver all my GPUs fluctuated between 75-85%, hovering around the higher end. Only once did I see them drop to the 50s for a brief second...

This almost constant 85% GPU usage did drive my GPU temps into the low 80s, and with my fan profiles set to kick up to 100% at 80°C, I started having a fairly loud machine and had to consider throwing on the headphones to catch the dialogue, as it's too late here to turn up the Home Theater speakers that I have my rig and display hooked up to.

This is all at stock clocks, stock voltages.

2. Now, I want to discuss something that I know that most will probably disagree with me on, but I want to throw it out there for the few who want to consider what I'm saying.

And the following probably only applies to those with 120hz monitors...

When Vsync is set to OFF, I have noticed in many games with extremely high FPS, but Deus Ex especially, what I can only describe as "Micro-Tearing".

Meaning, when I do a whip-pan with my FOV while moving, there is very tiny screen tearing. Not across the whole frame, but seemingly in spots with hard straight lines (walls, etc) - and here's the key thing - it corrects itself almost as fast as it happens. So it's almost imperceptible. But it's there. And when you're playing something feels off, and you get the slightest "stuttering" effect. Now I don't know if this is the same micro-stuttering everyone else is talking about, but...

I have noticed, when I set VSync to "ON" with the game running at 120hz the micro-tearing is gone and I get that incredibly bloody amazing buttery-smooth feeling that we all bought one of these panels for in the first place.

And, when Vsync is on, my GPUs' usage is all in the low 30s, which means -

_A much quieter computer..._

So, a more enhanced feeling of smoothness and a quieter computer? I'm gonna be having the Vsync turned on for all my SP games now. MP games like Battlefield I can see why you'd want to leave it off..

But in SP games where there's meant to be eye candy? I'm telling you, it looks way better when it's 120fps locked at 120hz than it is when it's 180-200fps+ with Vsync off.
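The refresh-rate arithmetic behind that is simple enough to sketch. This is only a back-of-the-envelope illustration (real frame delivery is never perfectly even), using the 120 Hz and 225 fps figures from above:

```python
# Back-of-the-envelope refresh arithmetic for a 120 Hz panel.
# Illustrative only; real frame pacing jitters.

def frames_per_refresh(fps: float, refresh_hz: float) -> float:
    """Average number of rendered frames per refresh interval."""
    return fps / refresh_hz

REFRESH_HZ = 120.0
for fps in (120.0, 225.0):
    budget_ms = 1000.0 / REFRESH_HZ
    print(f"{fps:.0f} fps: {frames_per_refresh(fps, REFRESH_HZ):.2f} "
          f"frames per {budget_ms:.2f} ms refresh interval")
# 120 fps locked: exactly 1.00 frame per refresh, so each scanout shows
# one whole frame. 225 fps uncapped: ~1.88 frames per refresh, so most
# scanouts can mix two different frames -> the "micro-tearing" above.
```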

Yet I know everyone's eyes are different, so what looks good to me, may look like arse to another.

I'm curious to see if anyone else notices it though. Look for "Micro-Tearing" with Vsync Off.
Then try enabling Vsync on Deus Ex and see if you get a "smoother", Micro-Tear free experience.

I definitely am. Like I said above: An enhanced smoother experience and low temps to make for a quiet computer?

Man, I FRACKING LOVE THESE CARDS....


----------



## rush2049

Micro-tearing or micro-stuttering is something that happens because of SLI.... it's a problem you get when two cards have to communicate to render the same thing....

The fact that you're seeing it is a sign that the pre-defined SLI profile isn't exactly ideal for the game.

With V-sync off, if your temps are really that high and you don't see much of an increase in fps, that means some part of the rendering pipeline (shaders, post-processing, geometry, tessellation, etc.) is being computed far more often than needed when left unbound. Probably one stage is run hundreds of times while the other stages can't keep up, so the extra work is wasted.

I have seen this before; it's a sign of a poorly optimized game engine (or, dare I say it, of the developer writing the engine in such a way as to favor a specific video card manufacturer).

The fix is to either wait for game patches, new drivers, or to create your own SLI profile and play with rendering settings....


----------



## Shinobi Jedi

Quote:


> Originally Posted by *rush2049;14717978*
> Micro-tearing or micro-stuttering is something that happens because of SLI.... it's a problem you get when two cards have to communicate to render the same thing....
> 
> The fact that you're seeing it is a sign that the pre-defined SLI profile isn't exactly ideal for the game.
> 
> With V-sync off, if your temps are really that high and you don't see much of an increase in fps, that means some part of the rendering pipeline (shaders, post-processing, geometry, tessellation, etc.) is being computed far more often than needed when left unbound. Probably one stage is run hundreds of times while the other stages can't keep up, so the extra work is wasted.
> 
> I have seen this before; it's a sign of a poorly optimized game engine (or, dare I say it, of the developer writing the engine in such a way as to favor a specific video card manufacturer).
> 
> The fix is to either wait for game patches, new drivers, or to create your own SLI profile and play with rendering settings....


But I do have a major increase in FPS when I turn Vsync off. 225fps+ on average so far. That's a 100+ FPS difference over having it run with Vsync on at 120fps.

But the Micro-Tearing makes the 225fps+ average not worth it.

Not when the game looks better than silky smooth at 120fps with every setting in the in-game video menu turned up. There have been no graphical glitches whatsoever.

I'm no expert by any means, but it seems to me that this is more a byproduct of how screen tearing shows up on new 120hz monitor technology than a faulty SLI profile...

Have you had a chance to try the game with Vsync on and then off?


----------



## Wogga

Same tearing on a 60Hz monitor w/o vsync, and smoooooooth with vsync.
But still getting freezes that force a reboot after approx. 1 hour.


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi;14718213*
> I'm no expert by any means, but it seems to me that this is more a byproduct of how screen tearing shows up in new 120hz monitor technology rather than a faulty SLI profile...
> 
> Have you had a chance to try the game with Vsync on and the off?


Rush is actually right... It basically just means the drivers/game aren't speaking to each other, which causes a latency issue. (In layman's terms.)

It's the profile... Monitors don't really cause those kinds of issues unless you're running a CRT from '95.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Wogga;14718341*
> same tearing on 60Hz monitor w/o vsync and smoooooooth with vsync.
> but still freezes that lead to reboot after aprox. 1 hour


Interesting.

Does the freeze happen in the same part of the game? If so, it might be a game bug and not a hardware issue...


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked;14718354*
> Rush is actually right...It basically just means the drivers/game aren't speaking with each-other and causes an issue in latency. (In laymen terms)
> 
> It's the profile...Monitors don't really cause those kinds of issues unless you're running CRT from 95'


I wasn't disagreeing with that per se, but what I inferred from what Rush was saying is that, because the SLI profile is faulty, one is not getting full graphical performance with Vsync on. And that hasn't been my case, is all I'm saying.

But I probably misunderstood him in the first place.

For me, 120fps "constant", nice smooth buttery gameplay with in game graphics settings at full blast with low GPU usage thereby keeping my temps low and my rig quiet -

I am more than happy with those results. In fact, I am fan freaking ecstatic.


----------



## Smo

Quote:


> Originally Posted by *Shinobi Jedi;14718514*
> I wasn't disagreeing with that per se, but what I inferred from what Rush was saying is that, because the SLI profile is faulty, one is not getting full graphical performance with Vsync on. And that hasn't been my case, is all I'm saying.
> 
> But I probably misunderstood him in the first place.
> 
> For me, 120fps "constant", nice smooth buttery gameplay with in game graphics settings at full blast with low GPU usage thereby keeping my temps low and my rig quiet -
> 
> I am more than happy with those results. In fact, I am fan freaking ecstatic.


I'll have to try enabling vsync and see if there's any change in my GPU usage. That said, I'm using driver set 280.19 still rather than 280.26 because I wanted to maintain the overvolting control. Thus far every game that I've enabled vsync in has made the GPU usage so erratic that the game is unplayable.

For instance in Crysis (I know, I know) with vsync on the game is juddery, slow and awful to play. With vsync off my GPU usage is much better, but the frame tearing is so bad it's just as awful to play as before. Crysis is locked to 60Hz I believe, and the only way to get it refreshing at 120 is to create a custom resolution in NVIDIA Control Panel, effectively forcing the game to conform to the refresh rate. However when I do this my monitors say 'Out of Range' if I try to run 1920x1079 or 1919x1080. Bugger.

Anyway, I've gone off track - I'll try vsync on.

In the meantime, would you please be so kind as to extract your driver profiles from 280.26 with this;

http://www.geeks3d.com/20100528/manage-your-sli-profiles-with-nvidia-geforce-sli-profile-tool/

So that I may import the Deus Ex settings into 280.19 and see if there are any changes?

It's a great little tool that - I've used it with some success already (namely modifying Furmark to work effectively on the 590).


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi;14718514*
> I wasn't disagreeing with that per se, but what I inferred from what Rush was saying is that, because the SLI profile is faulty, one is not getting full graphical performance with Vsync on. And that hasn't been my case, is all I'm saying.
> 
> But I probably misunderstood him in the first place.
> 
> For me, 120fps "constant", nice smooth buttery gameplay with in game graphics settings at full blast with low GPU usage thereby keeping my temps low and my rig quiet -
> 
> I am more than happy with those results. In fact, I am fan freaking ecstatic.


All I mean is that it's not your monitor.

It can be, in some cases, but the monitor you're actually using would not have an issue unless something was wrong w/it...

I wasn't insinuating anything by it beyond the fact that the last monitor I actually remember to have "drag" like that was in fact a '95 CRT from Tandy.

The 120fps is nice, though...

I wish you could enable SLI profiles in 1x1x1 without causing so much drama on the 590's ~ Bleh.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked;14719287*
> All I mean is that, it's not your monitor.
> 
> It can be, in some cases, but the monitor you're actually using, would not have an issue unless something was wrong w/it...
> 
> I wasn't insinuating anything by it beyond the fact that, the last monitor I actually remember to have "drag" like that was in fact a 95' CRT from Tandy.
> 
> The 120fps is nice, though...
> 
> I wish you could enable SLI profiles in 1x1x1 without causing so much drama on the 590's ~ Bleh.


Word.


----------



## Wogga

Disabling the Steam in-game overlay (the Shift+Tab feature) helped =) No freezes for almost 3 hours ^_^


----------



## Smo

Is there anyone else on 280.26 that wouldn't mind posting their profiles with this;

http://www.geeks3d.com/20100528/mana...-profile-tool/

I'd be really grateful


----------



## rush2049

All I was trying to say is that when your cards ONLY have to produce 120 frames, they can spend more time syncing together to make sure there is no tearing at all.

When you let them run unbound it seems some part of the rendering pipeline is using too much of your hardware and causing high temperatures and sync issues.

So to explain, let's use this example:

You're trying to draw a circle and fill it with a color as many times as you can in a second, but someone programmed you wrong.

Unbound:
You draw the circle 900 times, then start filling each one only getting to 225 frames, but you ran out of time to nicely display all the images and had to rush really quickly. This causes you to show some of frame 1 then some of frame 2 at the same time.... which looks bad.

Bound:
You draw the circle 120 times, then fill it 120 times, getting exactly 120 frames. Then you show each frame for 1/120th of a second while working on the next 120 frames.... taking it easy and sipping a rum and coke in between....

Get it now?

Sorry for going all technical on ya'll....
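That bound/unbound example can even be turned into a tiny simulation. A minimal sketch, assuming perfectly evenly spaced frame completions (real renderers jitter, so treat the numbers as illustrative):

```python
# Toy model of the bound vs unbound example above (illustrative only).
# Assumes frames complete at a perfectly even cadence.

def refreshes_showing_torn_frames(fps: float, refresh_hz: float,
                                  seconds: int = 1) -> int:
    """Count refresh intervals during which a new frame arrives mid-scanout."""
    frame_times = [i / fps for i in range(int(fps * seconds))]
    torn = 0
    for r in range(int(refresh_hz * seconds)):
        start, end = r / refresh_hz, (r + 1) / refresh_hz
        # A frame finishing strictly inside the scanout window means the
        # panel can show parts of two different frames -> a visible tear.
        if any(start < t < end for t in frame_times):
            torn += 1
    return torn

print(refreshes_showing_torn_frames(225.0, 120.0))  # 120 (every refresh can tear)
print(refreshes_showing_torn_frames(120.0, 120.0))  # 0 (vsync-locked: no tears)
```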


----------



## MKHunt

Quote:



Originally Posted by *Smo*


Is there anyone else on 280.26 that wouldn't mind posting their profiles with this;

http://www.geeks3d.com/20100528/mana...-profile-tool/

I'd be really grateful










Shazam!

That was completely painless, though IDK how many downloads that will allow.


----------



## Smo

Quote:



Originally Posted by *MKHunt*


Shazam!

That was completely painless, though IDK how many downloads that will allow.


Cheers dude, much appreciated!


----------



## Tept

I tried 280.26 and I actually saw a framerate drop due to the inability to up the voltage. V-sync on 280.19 in Deus Ex has smoothed out a lot of the problems, though. I get erratic, irritating chop for maybe 30 seconds after a loading screen, then it's smooth. I can't bring myself to use 280.26; losing 80MHz core damages my soul.

Also with 280.26 I did notice 2 points where my GPU actually dropped to 0%, and I had never seen that on 280.19. My guess is that this game was just optimized for an AMD card, seeing as how they REALLY want you to see AMD when you start the game up lol.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *rush2049;14721384*
> All I was trying to say is that when your cards ONLY have to produce 120 frames, they can spend more time syncing together to make sure there is no tearing at all.
> 
> When you let them run unbound it seems some part of the rendering pipeline is using too much of your hardware and causing high temperatures and sync issues.
> 
> So to explain lets use this example:
> 
> You're trying to draw a circle and fill it with a color as many times as you can in a second, but someone programmed you wrong.
> 
> Unbound:
> You draw the circle 900 times, then start filling each one, only getting to 225 frames, but you ran out of time to nicely display all the images and had to rush really quickly. This causes you to show some of frame 1 then some of frame 2 at the same time.... which looks bad.
> 
> Bound:
> You draw the circle 120 times, then fill it 120 times, getting exactly 120 frames. Then you show each frame for 1/120th of a second while working on the next 120 frames.... taking it easy and sipping a rum and coke in between....
> 
> Get it now?
> 
> Sorry for going all technical on ya'll....


Yes! Now I read you. Thanks for the explanation


----------



## Rowland2004

Just picked one up...


----------



## L D4WG

Just ran 3DMark11; my score seems a bit low...










My clocks haven't been touched, but I think something may be wrong?

EVGA Precision and GPU-Z read

Core 607
Shader 1215
Memory 1707


----------



## MKHunt

Quote:


> Originally Posted by *L D4WG;14726643*
> Just ran 3DMark11, My score seems a bit low...
> 
> My clocks havnt been touched but I think something may be wrong?
> 
> EVGA Precision and GPU-Z read
> 
> Core 607
> Shader 1215
> Memory 1707


By no stretch of the imagination is that right. I get ~9500-9700 on my BIOS-crippled system at stock.

ETA: Actually can you show a score breakdown?


----------



## Tept

That score is very wrong. I get 11.1k @ 715core. You should be well over 9k.


----------



## L D4WG

I just ran it again

http://3dmark.com/3dm11/1729393;jsessionid=1ew6sn1t5hgnz?show_ads=true&page=%2F3dm11%2F1729393%3Fkey%3D7B9XuxtA8LExf2tCk7K52Vr7vj9vbz

What's going on lol, 4675?

I bumped my clocks to

Core 630
Shader 1260
Memory 1728


----------



## Arizonian

Quote:


> Originally Posted by *L D4WG;14727328*
> I just ran it again
> 
> http://3dmark.com/3dm11/1729393;jsessionid=1ew6sn1t5hgnz?show_ads=true&page=%2F3dm11%2F1729393%3Fkey%3D7B9XuxtA8LExf2tCk7K52Vr7vj9vbz
> 
> Whats going on lol 4675?
> 
> I bumped my clocks to
> 
> Core 630
> Shader 1260
> Memory 1728


Definitely wrong; my overclocked EVGA 580 @ 923 core got a 3DMark11 score of 7167 on an i7 950 @ 3.80GHz.


----------



## L D4WG

My CPU couldn't be throttling it that much, could it?


----------



## MKHunt

Quote:


> Originally Posted by *L D4WG;14727328*
> I just ran it again
> 
> http://3dmark.com/3dm11/1729393;jsessionid=1ew6sn1t5hgnz?show_ads=true&page=%2F3dm11%2F1729393%3Fkey%3D7B9XuxtA8LExf2tCk7K52Vr7vj9vbz
> 
> Whats going on lol 4675?
> 
> I bumped my clocks to
> 
> Core 630
> Shader 1260
> Memory 1728


That is... wow.


----------



## L D4WG

Ok, run number 3

P7371

http://3dmark.com/3dm11/1729453?pcmVantageResults=0&dm06Results=0&dm11Results=54&dmVantageResults=0&pcm05Results=0&pcm7Results=0&page=%2F3dm11%2F1729453%3Fkey%3Dwut2qrQyUuMN3DyL8jQYAbAh3JQtvj&show_ads=false&isAdmin=false&dm03Results=0&dm05Results=0

Any advice? At least now it's not saying I have a problem, but even 7371 seems low for a 590...


----------



## malik22

Hey guys, I got my ASUS 590 in the mail this morning. Going to reinstall Windows; what software should I use for fan control and OCing?


----------



## Smo

Quote:


> Originally Posted by *malik22;14728231*
> Hey I guys I got my Asus 590 in the mail this morning going to reinstall windows what software should i use for fan control and ocing?


MSI Afterburner 2.2.0 Beta 5 works very well!


----------



## emett

Dude that score is terrible! I get 9555 @ stock.


----------



## malik22

Thanks Smo. I have a problem though: I connected the card to my 24-inch TV monitor through my HDMI cable but there's no sound. What is the problem? It worked perfectly with my qti 5970.


----------



## L D4WG

Quote:


> Originally Posted by *emett;14728596*
> Dude that score is terrible! I get 9555 @ stock.


But you have a 2600k

It's my CPU score that brings it down; what's your Graphics Only score in 3DMark11?


----------



## Smo

Quote:


> Originally Posted by *malik22;14728851*
> thanks Smo i have a problem though i connected the card to my 24inch tv monitor threw my hdmi cable but theres no sound what is the problem it worked perfectley with my qti 5970.


It's just a setting issue - have you looked at 'Sound' in the Control Panel?


----------



## malik22

I figured it out. What is digital vibrance?


----------



## malik22

Here are two pics of my desktop. Do the colors look normal to you?


----------



## emett

Here is my result.
http://3dmark.com/3dm11/1645066;jsessionid=1n804hxu49ya0?show_ads=true&page=%2F3dm11%2F1645066%3Fkey%3Dx7ZfqPWeTc5e8pQPMdWp8Qmzsj4Qpk


----------



## Tept

L D4WG go into Nvidia control panel and make sure Multi-GPU is activated under Set multi-GPU and PhysX configuration. Choose Maximum 3D performance. Also under Manage 3D settings set Power management mode to Prefer Maximum Performance.

Nvm, it's your processor+memory. My Physics score is 11800 while yours, even at your 7300 score, is 4300. That's what's killing you.
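For anyone wondering why the Physics score drags the total down so hard: 3DMark11 combines its subscores with (roughly) a weighted harmonic mean, which is dominated by the weakest component. A minimal sketch; the 0.85/0.15 weights below are assumptions for illustration, not Futuremark's published values:

```python
# Illustrative weighted harmonic mean of two 3DMark11 subscores.
# The 0.85/0.15 weights are assumed for illustration only.

def overall_score(graphics: float, physics: float,
                  w_g: float = 0.85, w_p: float = 0.15) -> float:
    """Weighted harmonic mean: dragged toward the weaker subscore,
    which is why a slow CPU (low Physics score) caps the total
    even with a strong GPU."""
    return (w_g + w_p) / (w_g / graphics + w_p / physics)

# Same hypothetical Graphics score, the two Physics scores quoted above:
print(round(overall_score(9000, 11800)))  # 9332
print(round(overall_score(9000, 4300)))   # 7732
```

The real formula also folds in the Combined test; the point is just that a harmonic mean punishes a low subscore far more than a plain average would.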


----------



## RagingCain

The CPU could lower performance in 3DMark11, but how do games run? Is SLI enabled in NVCP?

Post powered by DROID X2
Rooted running Eclipse.


----------



## dboythagr8

Quote:


> Originally Posted by *Tept;14730322*
> L D4WG go into Nvidia control panel and make sure Multi-GPU is activated under Set multi-GPU and PhysX configuration. Choose Maximum 3D performance. Also under Manage 3D settings set Power management mode to Prefer Maximum Performance.
> 
> Nvm its your processor+memory. My Physics Score is 11800 while yours, even at your 7300 score, is 4300. Thats whats killing you.


How does 3dmark11 work? Below are my results:










How are you getting 2000+ more points than me when we have the same processor?


----------



## emett

My understanding is that physics involves not just the CPU but also the GPU.


----------



## MKHunt

Quote:


> Originally Posted by *emett;14736079*
> My understanding is that physics is not just cpu but also gpu involved.


And it's affected quite a bit by memory speed. The difference between 1600 CL9 and 1866 CL8 can be ~400-600 in Physics score alone.


----------



## Tept

Quote:


> Originally Posted by *dboythagr8;14735699*
> How does 3dmark11 work? Below are my results:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How are you getting 2000+ more points than me and we have the same processor?


http://3dmark.com/3dm11/1545491

I don't know why my 11,103 score didn't save, but that's good enough to show you a comparison, and that's back on the 275.33 drivers.


----------



## L D4WG

When 3DMark11 gets to the physics testing it says on screen that it's testing the CPU, not the GPU.

It's definitely my CPU that's holding back the score, but I played the original Crysis on absolute max settings and it seems OK with VSync, at 26-30fps.


----------



## L D4WG

Quote:


> Originally Posted by *Tept;14730322*
> L D4WG go into Nvidia control panel and make sure Multi-GPU is activated under Set multi-GPU and PhysX configuration. Choose Maximum 3D performance. Also under Manage 3D settings set Power management mode to Prefer Maximum Performance.
> 
> Nvm its your processor+memory. My Physics Score is 11800 while yours, even at your 7300 score, is 4300. Thats whats killing you.


Multi-GPU and Maximum 3D performance were already on in global settings; I did switch power management to "Prefer Maximum Performance", however.

Any other global settings I can change to get more power out of it?

The number of pre-rendered frames, for example?


----------



## emett

If the only problem is your CPU, why do your results vary so much? Every time you run 3DMark 11 it's like the combined score has been pulled from a hat.


----------



## L D4WG

Quote:


> Originally Posted by *emett;14738793*
> If the only problem is your cpu why do you results vary so much? Every time you run 3d mark 2011 its like the combined score has been pulled from a hat.


That is true...

I'm not sure to be honest. I'm going to run through it again; I've reinstalled the latest drivers for the 590, and DX11 from The Witcher 2.

See how it goes

EDIT: P6990. My GPU-only score is always ~9000, but my CPU is just crummy at this benchmark.

Oh well, I can max The Witcher 2 (no ubersampling) and Crysis. It's powerful enough till I upgrade the core components of my system next year or the year after.


----------



## L D4WG

Quote:


> Originally Posted by *RagingCain;14731987*
> CPU Could lower performance in 3D11, but how do any games run? Is SLI enabled in NVCP?
> 
> Post powered by DROID X2
> Rooted running Eclipse.


Games are running really well. Some of the more recent games I own are Metro 2033, Crysis 1 and 2, The Witcher 2, and Assassin's Creed: Brotherhood.

And I can max them all at good, fluid frame rates.

SLI is enabled, yes. My CPU is no good at handling the dedicated CPU physics part of 3DMark11.

http://3dmark.com/3dm11/1735537;jsessionid=16jng9qbxfsex?show_ads=true&page=%2F3dm11%2F1735537%3Fkey%3D5VNaFhzJ5PuE9Tp5XzFNdVLR60FmN4


----------



## L D4WG

Hey Alatar, can you add me?

http://www.techpowerup.com/gpuz/6vh9s/

Brand : Palit

Core : 630
Shader : 1260
Memory : 1728


----------



## RagingCain

Anybody having performance issues that are unexplained, go back to the basics.

Fresh install of drivers.

Re-install benchmarks.

If you just switched brands of cards, make sure to remove all traces of AMD/NVIDIA drivers before installing fresh ones. That doesn't mean the ForceWare Custom Install "clean" mode; you have to remove them all in Safe Mode.

Make sure Win7 is up to date.

Make sure nothing else is running.

Make sure SLI is enabled in NVCP.

Set 3D Performance Settings to Single Monitor / Prefer maximum performance, or Multi-monitor if you are doing triple monitor.

If you are running triple monitor, test single monitor.

If your issue is only with 3DMark11, chances are it's just 3DMark11. The consensus, though, is that there are no 590-specific 3DMark11 issues unless something new is going on.

Make sure to try drivers 280.19/280.26. 280.19 is overclocking central, 280.26 is stability central.

Make sure your CPU overclock has been stability tested. Don't make others work when you haven't tested your own system out. It's borderline rude.

Scale back your CPU overclock and test whether there is any performance difference. If 4400 MHz is better than 4800 MHz, chances are you have instability. If 3.8 GHz is worse than stock, you have stability issues. Again, test it with Prime95/IBT; don't waste people's time if you haven't done this already. Nothing irks me more than finding out the OC has never been stress tested and listening to people insist that because Minecraft has played fine for months this shouldn't be an issue.

Make sure that there is adequate voltage on the GPUs and that there isn't a weird BIOS or a new 590 out there acting weird. Consensus is that for stock you need 0.913v for 607~620 MHz. On the 280.19/280.26 drivers, though, there is a fake voltage increase, so if you need 0.913v you have to set 0.925v in Afterburner. It's annoying, I know; call NVIDIA. That goes double for EVGA users: our stock is 0.925v, but we need to set 0.938v in Afterburner for an ACTUAL 0.925v.

If benchmarking, make sure you have no custom 3D Global Settings on, i.e. restore drivers to default.

Make sure your PCI-E 8-pin cables are plugged all the way in; mine are 6+2 and the +2 part always gets nudged out, which is annoying to troubleshoot. It's now the first thing I check. The cards WILL run on gimped power to a certain extent; I discovered this after my first month of troubleshooting, only to find a +2 pin just hanging out willy nilly.

Make sure your PSU can deliver at least 50 amps on a single rail, and if it's a multi-rail PSU, make sure the card is running from 2 independent 12v rails. The reason is that two connections from the same rail usually have a combined max output of 20~30 amps, but if you use connections off two different rails your max output is 40~60 amps. If you are quad-SLI and it's always crashing even at stock... you should have one of the PSUs recommended by NVIDIA. If not, everything else is a roll of the dice. I can confirm that an Ultra X4/X3 at 1200W will do the job, but it's nowhere near as stable as my Antec HPC 1200.

If all else fails, try a Win7 reformat if its not a fresh install.
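The 50-amp rule of thumb above is easy to sanity-check with P = V × I. The 365 W figure is the GTX 590's published board power; the rest-of-system number is an assumed ballpark for illustration:

```python
# Sanity-check the 12 V amperage rule of thumb: I = P / V.
# 365 W is the GTX 590's published board power; the 235 W
# rest-of-system figure is an assumed ballpark for illustration.

def amps_at_12v(watts: float) -> float:
    """Current drawn from the 12 V rail(s) for a given wattage."""
    return watts / 12.0

GTX590_W = 365.0
REST_OF_SYSTEM_W = 235.0

print(f"GTX 590 alone: {amps_at_12v(GTX590_W):.1f} A")                     # 30.4 A
print(f"Whole system:  {amps_at_12v(GTX590_W + REST_OF_SYSTEM_W):.1f} A")  # 50.0 A
```

A single card's worst case is already ~30 A on the 12 V side, which is why splitting it across one weak 20~30 A rail is asking for trouble.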


----------



## Smo

^ I love this guy.


----------



## L D4WG

Quote:


> Originally Posted by *RagingCain;14740554*
> Anybody having performance issues that are unexplained, go back to the basics.
> 
> Fresh install of drivers.
> 
> Re-install benchmarks.
> 
> If you just switched brands of cards, make sure to remove all traces of AMD/nVidia drivers before installing fresh drivers. That doesn't include ForceWare Custom Install, Clean mode, you have to remove them all in safe mode.
> 
> Make sure Win7 is up to date.
> 
> Make sure nothing else is running.
> 
> Make sure SLI is enabled in NVCP.
> 
> Set 3D Performance Settings to Single Monitor / Prefer maximum performance, or Multi-monitor if you are doing triple monitor.
> 
> If you are running triple monitor, test single monitor.
> 
> If your issue is only 3Dmark11, chances are its just 3DMark11. Consensus is though, there are no 590 + 3DMark11 specific issues unless there is something new going on.
> 
> Make sure to try drivers 280.19/280.26. 280.19 is overclocking central, 280.26 is stability central.
> 
> Make sure your CPU overclock is stable testing. Don't make others work when you haven't tested your own system out. Its borderline rude.
> 
> Scale back on CPU overclock and test if there is any performance difference. If 4400 MHz is better than 4800 MHz, chances are you have instability. If 3.8 GHz is worse than stock, you have stability issues. Again test it out with Prime95/IBT, don't waste peoples time IF you haven't done this already. Nothing urks me more than finding this out the OC has never been stress tested, and listening to people whine about how Minecraft plays fine for months, this shouldn't be an issue.
> 
> Make sure that there is adequate voltage on the GPUs and that there isn't a weird BIOS or new 590 out there acting weird. Consensus is for stock, you need to be 0.913v for 607~620 MHz. On 280.19/280.26 drivers though there is a fake voltage increase, so if you need 0.913v, you have to set 0.925v in Afterburner. Its annoying I know, call nVidia. That also goes double for EVGA users, our stock is 0.925v, but we need 0.938v in afterburner for ACTUAL 0.925v.
> 
> If benchmarking, make sure you have no custom 3D Global Settings on, i.e. restore drivers to default.
> 
> Make sure you have your PCI-E 8xPIN cables plugged all the way in, mine are 6+2 and the +2 part always gets nudged out which is annoying to troubleshoot. Its now the first thing I check. The cards WILL run on gimped power to a certain extent. I discovered this after my first month of troubleshooting just to find a +2 pin just hanging out willy nilly.
> 
> Make sure your PSU can deliver at least 50 AMPs on the Single rail, and if its multi-rail PSUs, make sure that they are running from 2 indepdent 12v rails. Reason for this is, two connections from the same rail have max output of usually 20~30 AMPs. But if you use a connection off of two different rails. Your max output is 40~60 AMPs. If you are quad-SLI and its always crashing even at stock... you should have one of the recommend PSUs by nVidia. If not, everything else is a roll of the dice. I can confirm that an Ultra X4/X3 at 1200W will do the job, but its nowhere near as stable as my Antec HPC 1200.
> 
> If all else fails, try a Win7 reformat if its not a fresh install.


Thank you, and dayum!!

Lots for me to check, lots to think about now.


----------



## RagingCain

No problem ^.^

Post powered by DROID X2
Rooted running Eclipse.


----------



## Semedar

Anyone else getting character artifacting from their 590s?










Edit: And are my scores normal? (Settings are at *Stock Performance (P) 1280x720 w/ moderate load suitable for most gaming PCs.*)


----------



## Tept

Quote:


> Originally Posted by *Semedar;14752786*
> Anyone else getting character artifacting from their 590s?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: And are my scores normal? (Settings are at *Stock Performance (P) 1280x720 w/ moderate load suitable for most gaming PCs.*)
> 
> 
> Spoiler: Warning: Spoiler!


The artifacting is because Deus Ex has garbage coding for DX11; I get the exact same thing. If you switch DX11 off and just max out settings on DX10/9, whichever it uses, you won't notice any difference, but it'll run butter smooth and clean.

That score is OK if you are either stock or barely overclocked on the 590; otherwise it's low. You should be in the 10k's on a moderate overclock.


----------



## Smo

I had a similar issue briefly - changing shadows from 'Soft' to 'Normal' sorted it for me. Your score is 'ok', but as Tept said - a mild overclock is always worth trying.


----------



## Trixzion

Gigabyte GTX 590 Exclusive Bundle - Running on stock clocks.









I already mounted the GFX card in my current build, but I'm going to post more pictures very soon with a totally new 2500K build.


----------



## Semedar

Quote:


> Originally Posted by *Trixzion;14754048*
> Gigabyte GTX 590 Exclusive Bundle - Running on stock clocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I already mounted the GFX card in my current build, but I'm going to post more pictures very soon with a totally new 2500K build.


That's a pretty awesome case for your 590.







Give us some benchys and what-not after you're done.


----------



## RagingCain

I don't see the artifacting can someone point it out?

Post powered by DROID X2
Rooted running Eclipse.


----------



## Tept

Quote:


> Originally Posted by *RagingCain;14754903*
> I don't see the artifacting can someone point it out?
> 
> Post powered by DROID X2
> Rooted running Eclipse.


It's more like missing pixels. Look at Darrow's face and body; you will see missing spots.


----------



## Arizonian

I'm not sure if anyone saw October's issue of Maximum PC with the XFX 6990 Quad Crossfire vs the ASUS 590 Quad SLI.

Looks like the Quad SLI won in Standard Performance / Extreme Performance / Noise and Power / Ease of Use - in both 1920x1200 4X AA and 2560x1600 8X AA.

Out of seven benches in 1920x1200 it won five.
Out of seven benches in 2560x1600 it won four.

They did it on a single 30" monitor. I'd think the 6990 would have gotten better scores on multiple monitors.

Seems the 590, having matured a bit with drivers, is doing better than at its original release, where the 6990 had a jump on it.

Kudos









Edited to add model brands of each GPU.


----------



## Arizonian

Double post. Whoops.


----------



## RagingCain

Quote:


> Originally Posted by *Arizonian;14755453*
> I'm not sure if anyone saw October's issue of Maximum PC with the 6990 Quad Crossfire vs the 590 Quad SLI.
> 
> Looks like the Quad SLI won in Standard Performance / Extreme Performance / Noise and Power / Ease of Use - in both 1920x1200 4X AA and 2560x1600 8X AA.
> 
> Out of seven benches in 1920x1200 it won five.
> Out of seven benches in 2560x1600 it won four.
> 
> They did it on single monitor I'm assuming as they didn't state if it was on multiple monitors. I'd think the 6990 would have gotten better scores on multiple monitors.
> 
> Seems the 590 having matured a bit with drivers is doing better than original release where the 6990 had a jump on it.
> 
> Kudos


I suggested to W1zzard that he do a re-review of the 590 with the newer drivers, and I took a jab at him to do it with less voltage too. It got ignored, of course.
ARES MARS II gets a 9.4 score.
GTX 590 gets a 7.0 score. Performance diff ~10%


----------



## Tept

LOL


----------



## emett

That's awesome. You'd just need some blackout curtains for the warehouse now.


----------



## MKHunt

I don't know the first thing about PhysX, so I'm asking y'all: CPU, or let the NVIDIA Control Panel choose? It seems to favor GPU 2 on the 590, but in benches I get ~20 points higher with it set to CPU. I've never figured out how to turn it off completely.

I guess the most effective test would be to play an inner-city level in BFBC2 with both settings and then compare FRAPS results, but ehhhhh... I lack the discipline.

Grazie.

ETA: I concur with everyone else about Deus Ex. Driver support is very mediocre. I'm encountering microstutter with vsync off, and when I turn it on I get a slight pause/skip/hesitation/lag every two seconds exactly. At first I thought, 'Omg, is the 590 not enough?' But then I realized that was an extremely silly thought.
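For what it's worth, the CPU-vs-GPU PhysX comparison can be reduced to one number per run with a few lines of script, so there's less discipline required. A rough sketch, assuming per-second FPS logs saved as plain text with one sample per line; the filenames and demo numbers here are made up:

```python
# Demo: write two fake per-second FPS logs (stand-ins for real logs from
# a FRAPS-style tool), then summarize each so the two PhysX settings can
# be compared at a glance.
demo = {"physx_cpu_fps.txt": [61, 58, 64, 55, 60],
        "physx_gpu_fps.txt": [66, 41, 70, 39, 68]}
for name, samples in demo.items():
    with open(name, "w") as f:
        f.write("\n".join(str(s) for s in samples))

def summarize(path):
    """Average and minimum FPS from a one-sample-per-line FPS log."""
    with open(path) as f:
        vals = [float(line) for line in f if line.strip()]
    return sum(vals) / len(vals), min(vals)

for name in demo:
    avg, low = summarize(name)
    print(f"{name}: avg {avg:.1f} fps, min {low:.1f} fps")
```

The minimum matters as much as the average here: a setting that wins on average FPS but tanks the minimum will feel worse in play.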


----------



## Shinobi Jedi

Quote:


> Originally Posted by *RagingCain;14755594*
> I mentioned W1zzard to do a re-review of the 590 with the newer drivers, and I took a jab at him to do it with less voltage too. It got ignored of course.
> ARES MARS II gets a 9.4 score.
> GTX 590 gets a 7.0 score. Performance diff ~10%


So what you're saying is, we have someone reviewing hardware not from the perspective it's designed for - to play games - but rather from the perspective of someone who is not a real gamer and only cares about overclocking.

What sites/publications does he review for? So I can make sure to avoid them.

Honestly, I don't want to be an overclocking hater, but I am getting a little fed up with how much debate and distractions are caused by overclocks that only show a few frames difference in real world performance.

Taking Bloomfield, Nehalem or Sandy Bridge CPUs and going from 2.6 to 4.0-5.0 - that I get and am down with. Using ThrottleStop on my M11x-R2 with its i7 and OC'ing the GPU so that I'm getting the same performance as a stock "R3" - I get that.

But I don't get the hands-down "winner/loser" mentality that happens with videocard GPU OC'ing, where people decide what's better based on which card has the longest colored bar graph - when the difference is almost always less than 10% - _because they can't think for themselves_.

Seriously, when someone can show me a GPU clock that gets you a good 10-20fps bump on games like Metro 2033 from stock clocks and/or allows you to run performance-hitting effects like D.O.F. and stay at 60/120FPS+ - then I'll be impressed with the clock/card.

Or, I am impressed with people like RagingCain and any others that are able to get their 590's up to 580 core clocks and beyond.

But I don't get the mentality of someone who is more into rigs to overclock than to game giving a 7.0 score vs. 9.4 over a few FPS. Especially when the reviewer fried the first card through his own ineptitude?

That's like school on Sunday - No class...


----------



## MKHunt

Quote:


> Originally Posted by *Shinobi Jedi;14768310*
> Honestly, I don't want to be an overclocking hater, but I am getting a little fed up with how much debate and distractions are caused by overclocks that only show a few frames difference in real world performance.
> 
> ...
> 
> Seriously, when someone can show me a GPU clock that gets you a good 10-20fps bump on games like Metro 2033 from stock clocks and/or allows you to run performance hitting effects like D.O.F and stay 60/120FPS+ - Then I'll be impressed with the clock/card.
> 
> Or, I am impressed with people like RagingCain and any others that are able to get their 590's up to 580 core clocks.


I feel these statements are a tad contradictory.

Going from 630 core to 795 gained me ~11fps in Witcher 2, BFBC2, Crysis 2, Deus Ex, Bioshock 2, and Crysis WARHEAD. Real gains went from 8-13, but the average is probably about 11. Gains with MW2 were much more impressive, but since it runs at more than 250fps they were far, far, far beyond useless.

In benches I only saw a 7-8fps gain to be honest.

I don't see this being a real counterpoint to your argument since the only game on that list to run below 60fps was Witcher 2 with ubersampling. I don't know about everyone else but I get so immersed in games that ubersampling doesn't change the experience at all for me besides dropping me to the high 40's and low 50's for framerates so it doesn't really matter that I saw gains.

Also, gains seem to come more easily with higher CPU overclocks. At 4.4GHz my difference OC to stock was <10fps but at 4.6 and up (currently running 4.9) the differences between stock and OC'ed become larger.

tl;dr 10fps gains ARE there on the 590. You just have to work hard for them, have a highly-strung CPU, and they won't make any difference unless you have a true 120Hz display.









If you want screenshot proof, fugeddaboutit. I am FAR too lazy and busy with school to detune everything, play a level I've beaten, take screens, retune everything, then repeat. I did that in my argument with Gigabyte and all it did was drain me of many hours of life and got me nowhere in regards to fixing problems.


----------



## Tept

I noticed a substantial difference in peak FPS in benches. However, I also noticed that going from 612MHz core to 720MHz core, I had higher highs but the EXACT SAME lows. There is one particular part in 3DMark Vantage where no matter WHAT my OC is, it ALWAYS falls to 65-67fps; it doesn't matter if I'm at stock clocks with stock memory or 1900MHz memory and 750 core. That to me is irritating.

My idea of OC is getting the lows tweaked out. I want to take a card, play a game that's unplayable, OC it, and have it 100% playable after the OC. Don't get me wrong, I haven't played a game that's even close to unplayable yet on the 590. But thanks to nvidia and this idiot PDL crap, right when you NEED the power to push through a tough scene is right when the card wants to say OMG, BACK OUT, BACK OUT.
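The highs-vs-lows effect is easy to quantify if you log frame times: compare the minimum FPS between runs instead of the average. A toy sketch with made-up numbers, just to illustrate how an OC can raise the average while a capped section leaves the floor untouched:

```python
def stats(frame_times_ms):
    """Average and minimum FPS from a list of per-frame render times (ms)."""
    fps = [1000.0 / t for t in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

# Two hypothetical runs: the OC speeds up most frames, but the worst
# frames (a throttle-bound section, say) render in the same 20 ms
# either way, so the minimum FPS does not move.
stock = [16.7] * 90 + [20.0] * 10    # ms per frame
oc    = [13.9] * 90 + [20.0] * 10
for name, run in (("stock", stock), ("720MHz OC", oc)):
    avg, mn = stats(run)
    print(f"{name}: avg {avg:.1f} fps, min {mn:.1f} fps")
```

If the minimum is identical across clocks, the bottleneck in that section isn't the core clock, which is consistent with the power limiter kicking in exactly when the load peaks.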


----------



## Shinobi Jedi

Quote:


> Originally Posted by *MKHunt;14768653*
> I feel these statements are a tad contradictory.
> 
> Going from 630 core to 795 gained me ~11fps in Witcher 2, BFBC2, Crysis 2, Deus Ex, Bioshock 2, and Crysis WARHEAD. Real gains went from 8-13, but the average is probably about 11. Gains with MW2 were much more impressive, but since it runs at more than 250fps they were far, far, far beyond useless.
> 
> In benches I only saw a 7-8fps gain to be honest.
> 
> I don't see this being a real counterpoint to your argument since the only game on that list to run below 60fps was Witcher 2 with ubersampling. I don't know about everyone else but I get so immersed in games that ubersampling doesn't change the experience at all for me besides dropping me to the high 40's and low 50's for framerates so it doesn't really matter that I saw gains.
> 
> Also, gains seem to come more easily with higher CPU overclocks. At 4.4GHz my difference OC to stock was <10fps but at 4.6 and up (currently running 4.9) the differences between stock and OC'ed become larger.
> 
> tl;dr 10fps gains ARE there on the 590. You just have to work hard for them, have a highly-strung CPU, and they won't make any difference unless you have a true 120Hz display
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you want screenshot proof, fugeddaboutit. I am FAR too lazy and busy with school to detune everything, play a level I've beaten, take screens, retune everything, then repeat. I did that in my argument with Gigabyte and all it did was drain me of many hours of life and got me nowhere in regards to fixing problems.


I'm not communicating what I'm trying to say very well. Because I see what you mean about those statements being contradictory, and yet I'm trying to say essentially the same thing you are in your response.

See, those results are impressive, that's what I'm talking about being down with. Where I see that I am not clarifying really well, is I was comparing the 590 and the 6990/Mars in my head and not the 590 against itself.

Meaning, in the benchmark reviews, even with the overclock, there was almost always less than a 10fps difference between the cards in the cases where the 6990 does come out ahead, even when it's overclocked.

Whereas the 590 brought me a 10fps+ gain in the Metro 2033 bench just going from the 270 to the 275 series of drivers. But because a 6990 edges out the 590 by such a small difference in FPS, even though your eye most likely won't see the difference (unless it's at the border of being choppy FPS) - it's the "big winner".

But the reviewers who are overclockers first and gamers second don't really factor in that it's leaf-blower loud, or that there's been a history of bad driver support at AMD, though that seems to be changing. Of course this would still apply if the results were flip-flopped. If someone wanted to game with 6 monitors and headphones, then they shouldn't care if the 590 is now outperforming the 6990 in many benches.

And now we have the Mars, and the hypocrisy in scores between 7.0 and 9.4 because one overclocks better by 10% is exactly what I mean. There's too much emphasis on overclocking performance and not on the other attributes.

If I were reviewing the Mars card as a gamer, then the Mars score would also have to be graded on real-world, gamer-applicable qualities, the two biggest ones being size and price.

First, this being a 3-slot card, is there even a mobo available that has the room to run two of these in SLI if some Richie Rich wanted to? If not, then the fact that you're going to be stuck with this one card in your machine should be a factor in the final review score.

The second is price. The fact that the suggested MSRP is equal to getting two 590's for Quad SLI should completely be a factor in the card's grading.

But no, because the card comes with true 580-level stock clocks, it gets a 9.4 - because too much importance is placed on overclockability, rather than on all the criteria that someone who is into PC gaming, only for the games, could use to make an informed decision.

That's what I'm trying to say. The fact that people like you are getting such awesome performance increases and this isn't making news on hardware-enthusiast sites is hypocritical.

I have a feeling that reviewers like W1zzard are too worried that if they go back and take a second look at the 590 and admit they got a little too eager the first time, that they'll lose some kind of credibility, when truthfully, they would actually gain credibility by displaying true objectivity.

So, what I meant to say was, there was mostly less than a 10fps difference between the 6990 and 590 with overclocks applied (I'm going off memory, so I may very well be wrong). And what I was trying to say was, show me an overclock that gives a 20-30fps bump of one card over the other 2 of the top 3 on the market, and then you can color me impressed. (You made me see that 10fps is too generous and easy to obtain.)

So, if I haven't confused things any further: while I think you're misunderstanding what I was trying to say because I didn't communicate it well, I do concede that when looking at it from your perspective it does seem contradictory. But I forgot to clarify that I meant the 590 vs. the 6990/Mars and not against itself - in regards to which one makes the best purchase for the high-end gamer.

While I'm pulling internet faux pas and actually admitting online when I'm wrong, let me really dig into some crow by saying, in regards to Deus Ex: HR -

Rush and Masked were totally right and I was wrong about the "Micro-Tearing".
My buddy is getting the same thing on his new HP laptop, which uses AMD's CrossFire (in the same way Nvidia applies Optimus in their notebooks) to switch between his notebook's 6750M and the integrated GPU.

Only enabling Vsync gets rid of this.

Also, now that I'm deep into the game, my GPU usage and FPS are all over the place too. Anywhere from 50fps for a millisecond, then all the way back up to 120fps before you know it.

So, I'm pretty confident this is the game's optimization/coding rather than any of us having issues with our 590's and rigs.

If I was getting these FPS in a competitive MP game, I'd be pissed. But since I'm playing a SP game for the story and everything is fluid for the most part, I don't mind.

I still fracking love these cards. I love them so much, my wife is starting to get jealous.

Although I'm sure it's probably not the case, I'm justifying to myself buying a couple of SSDs to see if that helps with any of the microstutter: one for my OS and one to run games off of. Of course, I know it probably won't help much, but I just really want one.

Anyone know if there's a certain brand or model that's really good for games? Similar to how the WD VelociRaptor HD is designed with games in mind?

Sorry for the long post. I wasn't planning on writing a book.

Good Times...


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Tept;14769043*
> I noticed a substantial difference in peak FPS in benches. However I also noticed going from 612core to 720mhz core, I had higher highs, but the EXACT SAME lows. There is 1 particular part in 3dmark vantage where no matter WHAT my OC is, it ALWAYS falls to 65-67fps, it doesnt matter if im stock clocks with stock memory or 1900mhz memory and 750 core. That to me is irritating.
> 
> My idea of OC is getting the lows tweaked out. I want to take a card, play a game thats unplayable, OC it, and be 100% playable after the OC. Don't get me wrong I havent played a game thats even close to unplayable yet on the 590. But thanks to nvidia and this idiot PDL crap, right when you NEED the power to push through a tuff scene, is right when the card wants to say OMG, BACK OUT, BACK OUT.


Really good point.


----------



## MKHunt

Ah, I see what you meant, and I completely agree. Even though I have obtained those results, I still game at stock clocks and voltages; the extra heat and power bill (see ~600W of draw overclocked) just aren't worth it for a gain I _will not_ see.

About the 590/Mars2/6990, I agree 100%. Sites are too focused on the micro details that only matter when it comes to benchmarking. When I was heavily into benching a few weeks back, I would get frustrated and dissatisfied over the tiniest things. I soon realized why I wasn't loving my computer like I should and promptly stopped. The less benching and more gaming I do, the more pleased with my machine I am.

That just reinforces that if you're really into benchmarking then the 590 and 6990 really aren't for you anyway. They're both way too quirky and have too many driver issues to even come close to two 580's @ stock, though they're priced appropriately.

When it comes down to it the 590 and 6990 are gamer's cards built for gamers. That's it.

I don't view the MARS as a gamer's card. I see it as a toy for a rich playboy who needs to show off his e-peen. If you pay the same price as 2x HydroCopper 590s for a single card that has maybe ~12% gain over a single 590, a seriously crippled warranty (compared to EVGA), and no aftermarket support/H2O cooling, then you need to re-evaluate your life decisions.

To equate it to cars, that's like paying $120,000 USD for a BMW 335i with a software chip.

ETA: the 335is is ~$60K IIRC and stands in for the 590HC in this example.

I 100% agree with Tept as well. When I push higher clocks I even see my lows drop. It's especially apparent when overclocking memory for me. That's another reason I don't game often while OCed. The effect became really noticeable in Heaven 2.5 where my lows dropped from high 20's to low 20's. It's like every frame that got put on top of my max was taken from my min.


----------



## L D4WG

Hey guys, if I have my HDTV attached via an HDMI cable to an HDMI/DVI-D adapter, and the adapter plugged into my 590.

Should I hear sound on my TV?


----------



## Smo

Quote:


> Originally Posted by *L D4WG;14769792*
> Hey guys, If I have my HDTV attached via HDMI cable to a HDMI/DVI-D adapter, and the adapter into my 590.
> 
> Should I hear sound on my TV?


It will work, yeah - I did the same before my monitors arrived with a spare Samsung TV I had lying around. Although I'm aware that there are different types of DVI adapters, so they could play a role in it.

Try it out!


----------



## L D4WG

Quote:


> Originally Posted by *Smo;14769823*
> It will work, yeah - I did the same before my monitors arrived with a spare Samsung TV I had laying around. Although I'm aware that there are different types of DVI adapters so they could play a role in it.
> 
> Try it out!


Well, I've been using it for a few days and it hasn't worked so far, but I only really thought about it just now. I'll have to fiddle with the sound options when I get home. Thanks Smo!!

P.S. Nicki Minaj is a babe


----------



## MKHunt

Quote:


> Originally Posted by *L D4WG;14769838*
> Well Ive been using it for a few days and its not so far, But I only really thought about it just now, ill have to fiddle with sound options when I get home, thanks SMO!!
> 
> P.S Niki Minaj is a babe


I can confirm that it will work. Just switched my sound to my monitor. Dear lord, it was bad. So so so bad. I almost cried.


----------



## Smo

Quote:


> Originally Posted by *L D4WG;14769838*
> Well Ive been using it for a few days and its not so far, But I only really thought about it just now, ill have to fiddle with sound options when I get home, thanks SMO!!
> 
> P.S Niki Minaj is a babe


If it's all hooked up alright, then the TV should show up under 'Sound' in the Control Panel - setting it to default should do the trick!

I love Nicki with the pink hair. Hawt


----------



## L D4WG

Cool, thanks guys - this will just allow me to keep my speakers attached to my TV for both my PC and 360 instead of just my PC!!


----------



## L D4WG

Hmm, it's not in sound devices. I've just got the usual: Back Panel Audio output, Headphones and S/PDIF.

Ideas?


----------



## Shinobi Jedi

You can usually access the HDMI sound settings in the Nvidia Control Panel. It often takes you to Sounds in your device manager, where you said it's not showing up, but maybe it will through the Nvidia Control Panel. If you don't see a sound/HDMI setting in the index on the left side of the control panel, then try reinstalling the driver.


----------



## radagust

Hey gents, I've had an ASUS 590 for a little over a week now, and I've noticed that every time my computer wakes from sleep, the fan on the card suddenly revs up to 100% and then slowly falls back to about 40% over half a minute.
I'm on the 280.26 NVIDIA drivers. Everything's stock. Is this normal?

thanks!


----------



## L D4WG

You should be fine man, I wouldn't worry about it.
It can happen when your GPU can't read its temp or fan speed for a few seconds; it cranks the fan up just in case because it doesn't know if it's too hot.

Just a precautionary thing - as long as it settles back down, that's the main thing.


----------



## RagingCain

Quote:



Originally Posted by *Shinobi Jedi*


I'm not communicating what I'm trying to say very well. Because I see what you mean about those statements being contradictory, and yet I'm trying to say essentially the same thing you are in your response.

...

Sorry for the long post. I wasn't planning on writing a book.

Good Times...


What it comes down to is that all the "590 hate" stems from TWO reviewers: Sweclockers and W1zzard, who does the TPU reviews, benching everything and analyzing the numbers.

Both of these guys killed their 590 samples; W1zzard, I believe, killed two. While I applaud him pointing out the issue to nVidia, he utterly and unequivocally destroyed the reputation of this card.

The max voltage to which I will modify any of our custom BIOSes is 1.063v. That's really high in my opinion - so high, in fact, that I only recommend that voltage with water cooling. If anyone has seen my OC thread, it's very cautious.

The voltage W1z killed the GPUs at was 1.213v - the maximum voltage for a *580 using a modified BIOS*. It wasn't even incremental, like testing every 10mV; he went straight from 1.1v to 1.2v, I believe. I will find the initial article later.

He then did childish things like list the Pros/Cons in such a way that the first letters spelled Epic Fail:
E
P
I
C
F
A
I
L

It has since been "edited". Despite every benchmark he ran at stock being better than the 6990, equal to the 6990, or only slightly behind the 6990, the card was scored a 7.4, whereas the 6990 was scored a 9.0 and Editor's Choice.

Every dead GTX 590 image stems from 3 or 4 incidents of cards dying, yet somehow it turned into EVERY 590 being destined to explode. Rotated images/shots were claimed as new "users" killing their cards. Raziel (an OCN member) claimed his died at stock clocks. I have said it before and I'll say it again: even if all of them were true, you could count the number of "exploding" 590s on two hands. At least the public ones.

What is annoying isn't really the reviewers doing their thing; it's them encouraging a million little parrots on the internet to repeat the same junk, either out of context or outside a realistic operating parameter - "all 590s are destined to explode" instead of "all 590s will probably blow up at 1.213v."

Furthermore, if it wasn't for this total douchebaggery in how they handled these extremely public, popular reviews, our PDL would barely even exist, or would be much less restrictive. Instead we have two hands and a foot handcuffed when it comes to OC and, in some cases, just general performance.

Oh well, it's very true what has been said before: the overclocking emphasis has been taken to the extreme on this card. I can get a 30% OC on my GPU, which is fantastic, yet it's still considered a "Fail" card. Even on OCN, all anyone cares about are those initial reviews, not what people are actually achieving with it. It's this misinformation that fired me up to start the benchmarking thread.

But as with all things, you can only go so far to wipe out ignorance; you have to have a willing mind.


----------



## Arizonian

I just wanted to chime in... RagingCain was dead-on in his post above. Funny thing is, he started a 590 vs. 6990 bench thread and no 6990's showed up, which leads me to believe that they couldn't beat his score and were too embarrassed to show.

Then he changed the thread to an everybody free-for-all, and even though there were 580's willing to show, I've yet to see a 6970 show up. I was looking forward to settling this in a friendly OCN bench-off, yet a lot of owners didn't come to the table to put their cards where their mouths are.

It was supposed to be a thread where all OCN members could have fun trying to beat each other in benches, and yet it has been one-sided thus far. Regardless of who won, as GPU owners we could look forward to real live benches as opposed to biased reviews from tech web sites that clearly side with their advertising dollars.


----------



## Masked

Cain hit a lot of the points I agree with.

The thing that bothers me the most is that I know for a fact, an absolute 100% FACT, that we received MANY press releases and personal emails from Nvidia saying: don't take the card over 1.05... The first email said 1.1, the 2nd 1.08, the 3rd 1.05, then it was 1.00 ~

I KNOW they received them, and yet they didn't heed the warnings, and now not only is there a stigma but there's an entire veil of ignorance over the card itself... especially when it comes to them "blowing up" and the OCP...

It's sad that people who choose to argue/compete can't be respectful/honorable enough to actually educate themselves.

I may not like Levesque but, at least he takes the time to "skim" this thread and moderately take in what's said here...To that, I give credit.

At least he tries.


----------



## Shinobi Jedi

Quote:



Originally Posted by *Masked*


Cain hit a lot of the points I agree with.

The thing that bothers me the most is that, I know for a fact, an absolute 100% FACT, that we received MANY press releases and personal emails from Nvidia saying: don't take the card over 1.05v... The first email said 1.1v, the 2nd 1.08v, the 3rd 1.05v, then 1.00v ~

I KNOW they received them, and yet they didn't heed the warnings. Now not only is there a stigma, but there's an entire veil of ignorance over the card itself... especially when it comes to them "blowing up" and the OCP...

It's sad that people who choose to argue/compete can't be respectful/honorable enough to actually educate themselves.

I may not like Levesque but, at least he takes the time to "skim" this thread and moderately take in what's said here...To that, I give credit.

At least he tries.


Wow. What utter incompetence. To load up a modified 580 BIOS meant to overvolt a card that's already running at higher clocks. It's like a noob overclocker turning everything up to max the first time instead of going in increments of 5MHz.

Well considering almost all, if not all, of the manufactured stock has sold out - I'm assuming Nvidia and Vendors don't really care.

And us owners, knowing what kind of a gem it is, shouldn't care either. The haters are never going to admit the truth that they had it wrong, because now that it's Gandhi, there's no way for them to upgrade to one. So instead they'll perpetuate this continued falsehood to justify their purchases and ignore that they blew a great opportunity on one of the most pimpin' cards to come out in a while.

That's cool. I'll just be over here playing my games, full blast, worry free.
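The "increments of 5MHz" approach is worth making concrete. A minimal sketch follows - the 607MHz start is the reference 590's stock core clock, the 660MHz target is purely hypothetical, and the step generator is illustrative, not any overclocking tool's actual API:

```python
# Cautious overclock stepping: walk the core clock up in small hops and
# stop at the first unstable step, instead of jumping straight to the max.
def clock_steps(start_mhz, target_mhz, step=5):
    """Yield candidate core clocks from start_mhz up to target_mhz."""
    clock = start_mhz
    while clock < target_mhz:
        clock = min(clock + step, target_mhz)
        yield clock

# Example: 607MHz stock core, hypothetical 660MHz target.
steps = list(clock_steps(607, 660))
print(steps)  # [612, 617, 622, 627, 632, 637, 642, 647, 652, 657, 660]
```

At each step you would run a stress test (Heaven, a game, whatever you trust) and only move on if it passes - the opposite of loading a max-voltage BIOS on day one.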


----------



## L D4WG

@RagingCain - Mate, you have educated me. After reading all the news and other OCN members' posts, I was in the mindset of thinking that a stock 590 had a chance of blowing up.

As I'm sure many people here still are...

Your post should be stickied somewhere so we can refer people to it.


----------



## RagingCain

Quote:



Originally Posted by *Shinobi Jedi*


Wow. What utter incompetence. To load up a *modified 580 BIOS meant to overvolt a card* that's already running at higher clocks. It's like a noob overclocker turning everything up to max the first time instead of going in increments of 5MHz.

Well considering almost all, if not all, of the manufactured stock has sold out - I'm assuming Nvidia and Vendors don't really care.

And us owners knowing what kind of a gem it is, shouldn't care either. Because the haters are never going to admit the truth and they had it wrong, because now that it's Ghandi, there's no way for them to upgrade to one. So instead they'll perpetuate this continued falsehood to justify their purchases and ignore that they blew a great opportunity on one of the most Pimpin' cards to come out in a while.

That's cool. I'll just be over here playing my games, full blast, worry free.


I must have implied that; what I meant to say is that he took a 590 to a very high voltage range, higher than a stock 580 BIOS would allow, even though the 580 has the more robust power delivery system. The max voltage for a regular 580 is 1.151~1.153v unless you edit the BIOS, in which case it was 1.213v. In their infinite wisdom, ASUS made the mistake of letting people go that high in the first place (all the dead 590s I have seen are non-EVGA too ^.^), but W1zzard was not even slightly cautious about the whole deal, jumping from 1.100v to 1.213v (max voltage) just to see what would happen.

W1zzard's 6990 Conclusion
http://www.techpowerup.com/reviews/A...D_6990/26.html

W1zzard's 590 Full Review
http://www.techpowerup.com/reviews/A...GTX_590/1.html

W1zzard's 590 Conclusion
http://www.techpowerup.com/reviews/A...TX_590/27.html

One of the cons is that it's slower than the 6990?








TPU's Own Chart:









Quote:



Originally Posted by *L D4WG*


@RagingCain - Mate, you have educated me. After reading all the news and other OCN members' posts, I was in the mindset of thinking that a stock 590 had a chance of blowing up.

As I'm sure many people here still are...

Your post should be stickied somewhere so we can refer people to it.


The point is, EVERY card has a possibility of dying, at stock and at overclocks. It's just a really low possibility, and usually RMA-able. The 590 is no more susceptible than any other video card... but it can be susceptible to user fault, just like any other card as well.


----------



## MKHunt

Quote:



Originally Posted by *Shinobi Jedi*


Wow. What utter incompetence. To load up a modified 580 BIOS meant to overvolt a card that's already running at higher clocks. It's like a noob overclocker turning everything up to max the first time instead of going in increments of 5MHz.

Well considering almost all, if not all, of the manufactured stock has sold out - I'm assuming Nvidia and Vendors don't really care.

And us owners knowing what kind of a gem it is, shouldn't care either. Because the haters are never going to admit the truth and they had it wrong, because now that it's Ghandi, there's no way for them to upgrade to one. So instead they'll perpetuate this continued falsehood to justify their purchases and ignore that they blew a great opportunity on one of the most Pimpin' cards to come out in a while.

That's cool. I'll just be over here playing my games, full blast, worry free.


Amen. At first I was worried I wouldn't be able to secure a 2nd 590 in case one wasn't enough. Then I realized I was coming from an FX 770M, and by the day my single 590 runs games the way my 770M was running them a month and a half ago, my motherboard and CPU will need to be replaced as well.

Not saying it's a perfect card or that it doesn't have faults. My card squeals a tad when folding, and I wouldn't complain if it had the voltage regulation of the 580s; but for $710 (the price when I got in) it's an extremely nice complement to any system.

When people see my system and start asking questions the 590 is the first thing I talk about. Most people immediately latch onto the fans and water cooling but I'm saying, "Yeah yeah yeah it's some tubes and fancy spinners. _Look at this graphics card. Do you see it?_ Look how it sits silently, unmoved by anything that challenges it."


----------



## RagingCain

Quote:


> Originally Posted by *MKHunt;14780688*
> Amen. At first I was worried I wouldn't be able to secure a 2nd 590 in case one wasn't enough. Then I realized I was coming from an FX770m and the day that my single 590 runs games like my 770m was running a month and a half ago my motherboard and CPU will need to be replaced as well.
> 
> Not saying it's a perfect card or that it doesn't have faults. My card squeals a tad when folding, and I wouldn't complain if it had the voltage regulation of the 580s; but for $710 (the price when I got in) it's an extremely nice complement to any system.
> 
> When people see my system and start asking questions the 590 is the first thing I talk about. Most people immediately latch onto the fans and water cooling but I'm saying, "Yeah yeah yeah it's some tubes and fancy spinners. _Look at this graphics card. Do you see it?_ Look how it sits silently, unmoved by anything that challenges it."


Most people try to correct me and say "you mean 580, there is no 590", then they go "oh, it must be new". My other favorite is telling people I have a P-score of 17,000 in 3DMark11, and then they *keep telling me I mean Vantage*. I always have to link the score...

:/ Le sigh


----------



## Tept

I had a 6990 for about 2 weeks. I actually RMA'd it and got another because I wasn't happy with the performance. I scored 33k in Vantage with the 6990 with the BIOS unlocked to the 880MHz setting. I RMA'd into an ASUS 590 instead and scored 36.3k @ 612MHz core without touching a single thing. It also scored almost 800 points higher in 3DMark11, and I noticed 7-12 higher FPS in SC2. IMO the 590 destroys the 6990: the 6990 is huge, it's loud as hell, and it has higher MHz right out of the box yet doesn't perform to spec. If we could run a 590 at 880MHz out of the box, what do you think's gonna happen to the 6990's credibility then?

It's gonna look like a chump.


----------



## Shinobi Jedi

Anyone try the Blackfire Mod for Crysis 2? What do you think, if so?

Personally, I'm pretty impressed.


----------



## rush2049

Quote:


> Originally Posted by *Shinobi Jedi;14785702*
> Anyone try the Blackfire Mod for Crysis 2? What do you think, if so?
> 
> Personally, I'm pretty impressed.


Besides the obvious changes (color suit modes, different textures, better textures?)...

I think they just upped the saturation and contrast on all the lighting... at least that's what it looks like to me. It now looks less like a gritty, realistic environment and more like a flashy movie experience...


----------



## Manac0r

Long time, Raging and co. I've been living the dream of a stable OC and some high-quality gaming, but decided to quiz the oracle. Currently with MSI Beta 6 and 280.26, 0.975v is the maximum voltage I can set. Is that the highest I can go, or would it be unwise to take it any higher?

Running stable at 725MHz / 1850MHz / 0.975v, or so it says in MSI.


----------



## RagingCain

Use drivers 280.19; with those the full range up to maximum voltage is available.


----------



## Recipe7

Anyone care to give an update on the performance of 280.36?


----------



## Smo

Quote:


> Originally Posted by *Recipe7;14791869*
> Anyone care to give an update to the performance of 280.36?


I'm assuming that was a typo and you meant 280.26? If so, they're extremely hit and miss. A large chunk of the community is having a variety of problems with this driver set, while others have reached a respectable level of stability. They also clamp down on overvolting. Keeping the issues in mind - I've steered clear of the GeForce drivers.

I've just got my hands on a modified 280.26 set which I started testing briefly last night (so far all is well) - I will update more on that once I've had a chance to play around after work tonight.

I would suggest sticking with 280.19 for the moment though.


----------



## L D4WG

I haven't seen any problems with them, but I'm not heavily OC'ing my 590...

Witcher 2 is stable as anything for me; it's all I'm playing at the moment.


----------



## MaxHayman

2*Zotac 590 default settings


----------



## Smo

Quote:


> Originally Posted by *MaxHayman;14795563*
> 2*Zotac 590 default settings


Welcome to the club mate - must say though the idle temps are a little warm (nothing to worry about though) - are you using just that one monitor? Also what case have you got?


----------



## MaxHayman

HAF 932 - the cards are right next to each other as the Rampage II Extreme slots don't line up with the case :'( I'm using one monitor for now; I'm gonna get 3 of those Acer 3D monitors soon!


----------



## Recipe7

Quote:



Originally Posted by *Smo*


I'm assuming that was a typo and you meant 280.26?


Nope, no typo. http://www.guru3d.com/news/nvidia-ge...28036-drivers/

I tried it myself, but can't overvolt past 0.975v.

I ran 3DMark11 and compared 280.19 and 280.36.

My sig CPU speed with the 590 at 675MHz netted me *9481 on 280.36, while 280.19 netted me 9277*


----------



## Tept

Quote:



Originally Posted by *Recipe7*


Nope, no typo. http://www.guru3d.com/news/nvidia-ge...28036-drivers/

I tried it myself, but can't overvolt past 0.975v.

I ran 3DMark11 and compared 280.19 and 280.36.

My sig CPU speed with the 590 at 675MHz netted me *9481 on 280.36, while 280.19 netted me 9277*


Fairly certain that's a typo on the webpage; 280.26 is the latest driver.


----------



## Recipe7

Quote:



Originally Posted by *Tept*


Fairly certain thats a typo on the webpage, 280.26 is the latest driver.


http://forums.nvidia.com/index.php?showtopic=208698


----------



## kevink82

280.36 is a dev driver - the same driver with added OpenGL 4.2 support. Probably not gonna matter for the average Joe?


----------



## Mackumba

Hello, I've been looking for a GTX 590 to buy, and I found a pretty good deal on an Inno3D GTX 590. I've never heard of this brand - can anyone tell me if it's good and reliable, or should I stay away?

Thanks in advance!


----------



## MaxHayman

I've heard of Inno3D - never used them though. I got mine for 550 each, with a free copy of Assassin's Creed in each!


----------



## Recipe7

Quote:


> Originally Posted by *kevink82;14803695*
> 280.36 is a dev driver - the same driver with added OpenGL 4.2 support. Probably not gonna matter for the average Joe?


It may be a developer driver, but I had better scores on .36 than .26


----------



## RagingCain

From personal experience, I would tend to stay away from dev drivers; they are very isolated in purpose. Not to say they are bad or anything.

On a second note, the eBay 590 sale is going well and ends in 3 hours, and then I will lose two of the most powerful cards on the planet







It should not affect the 590 OC thread; I will still be maintaining it, but I obviously won't be able to test new drivers out and so on, so I will need feedback from time to time.

Not sure what I will be working with next after gutting half my rig - I do have a MaxIV Extreme and a 2600K from trades - but I will be visiting from time to time


----------



## Shinobi Jedi

Quote:



Originally Posted by *RagingCain*


From personal experience, I would tend to stay away from Dev Driver's they are very isolated in purpose. Not to say they are bad or anything.

Second note, the eBay 590 sale is going well, and ends in 3 hours and I will lose two of the most powerful cards on the planet







It should not affect the 590 OC thread, I will still be maintaining it, but I obviously won't be able to test new drivers out and so on, so I will need feedback from time to time.

Not sure what I will be working with next, after gutting half my rig, I do have a MaxIV Extreme and a 2600K from trades, but I will be visiting from time to time










Bummer..
Just out of curiosity, You selling them because you want to? Or because you have to?

Deep into Deus Ex - not that this will come as a surprise, but forcing Vsync on in the Nvidia Control Panel does keep my framerate at 120FPS instead of jumping all over the place...

Enjoying the game a lot, I must say.


----------



## Mackumba

Hi again, sorry to ask twice, but I'm in a hurry to decide whether I should get the Inno3D GTX 590.

Can anyone share experiences with this brand? I'd appreciate it very much!

Thanks


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Mackumba;14813602*
> Hi again, sorry to ask twice, but I'm in a hurry to decide whether I should get the Inno3D GTX 590.
> 
> Can anyone share experiences with this brand? I'd appreciate it very much!
> 
> Thanks


I've never heard of them personally - sorry I can't be of more help. Maybe you should see if RagingCain's are still up on eBay? If anyone was going to have cards so well kept that they're like new, I'm sure it'd be him. Plus he's already established that they clock incredibly well


----------



## RagingCain

Quote:


> Originally Posted by *Shinobi Jedi;14813887*
> I've never heard of them personally. Sorry, I can't be of more help. Maybe you should see if RagingCain's are still up on EBay? If anyone was going to have cards that are so well kept that they're like new, I'm sure it'd be him. Plus he's already established they clock incredibly well


Sorry they are off the market now for $2k. I was a bit surprised.

And to answer your question it was 91% I had to and 9% wanted to









Inno3D is a reliable brand, but all 590s should be identical with different stickers on top. Just make sure you get a warranty.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *RagingCain;14814238*
> Sorry they are off the market now for $2k. I was a bit surprised.
> 
> And to answer your question it was 91% I had to and 9% wanted to
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Inno3D is a reliable brand, but all 590s should be identical with different stickers on top. Just make sure you get a warranty.


That's awesome! That's a pretty strong indication right there that the truth about these cards is getting out there.

What is it that you want to replace them with that made you 9% want to sell them?


----------



## Mackumba

Quote:


> Originally Posted by *RagingCain;14814238*
> Sorry they are off the market now for $2k. I was a bit surprised.
> 
> And to answer your question it was 91% I had to and 9% wanted to
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Inno3D is a reliable brand, but all 590s should be identical with different stickers on top. Just make sure you get a warranty.


Thanks


----------



## RagingCain

Quote:


> Originally Posted by *Shinobi Jedi;14815247*
> That's awesome! That's a pretty strong indication right there that the truth about these cards is getting out there.
> 
> What is it that you want to replace them with that made you 9% want to sell them?


I would love 4x 3GB 580s, but it's a waste of money. For the price, and for comparable performance, maybe 3x 580s or 3x 6970s. Trouble is I am not sure if the Maximus IV will do tri-anything; will have to see. I still have to consider waterblocks and the fact that I don't like ATI's uber-dodgy CrossFireX drivers. The fanzombies all claim the drivers are solid, but I see thread after thread of CFX issues, and the typical response is "user error". If someone said that to me in real life, I would deviate their septum. Especially after my 5870 horror story









Post powered by DROID X2
Rooted running Eclipse.


----------



## Mackumba

So guys, I've decided to go for a Classified EVGA GTX 590, and I'm wondering if my PSU can handle it. Everything is in my specs right below; my PSU is 750w with 60A on the +12v rail. If I get the 590 I won't be able to buy a new PSU, so please be reasonable with it : )
Thanks in advance, and keep in mind that I don't OC at all!


----------



## Smo

Quote:


> Originally Posted by *Mackumba;14820065*
> So guys, I've decided to go for a Classified EVGA GTX 590, and I'm wondering if my PSU can handle it. Everything is in my specs right below; my PSU is 750w with 60A on the +12v rail. If I get the 590 I won't be able to buy a new PSU, so please be reasonable with it : )
> Thanks in advance, and keep in mind that I don't OC at all!


You should manage it - NVIDIA themselves recommend a 700w. I would say that overclocking could lead to a forced shutdown.

Let us know when it arrives
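For what it's worth, the back-of-envelope +12v rail math supports this. The ~365w card figure is NVIDIA's published TDP for the 590; the ~150w for the rest of the system is just a rough assumption for a typical single-GPU build at load:

```python
# Rough +12v headroom check for a stock GTX 590 on a 750w / 60A PSU.
RAIL_VOLTS = 12
RAIL_AMPS = 60                            # combined +12v rating of the unit
rail_capacity_w = RAIL_VOLTS * RAIL_AMPS  # 720w deliverable on +12v

CARD_TDP_W = 365      # NVIDIA's spec for the GTX 590
SYSTEM_EST_W = 150    # assumed: CPU, board, drives, fans under load

headroom_w = rail_capacity_w - (CARD_TDP_W + SYSTEM_EST_W)
print(f"+12v capacity {rail_capacity_w}w, est. draw "
      f"{CARD_TDP_W + SYSTEM_EST_W}w, headroom {headroom_w}w")
# +12v capacity 720w, est. draw 515w, headroom 205w
```

Roughly 200w of spare at stock - which also shows why heavy overvolting on a unit this size could push things close enough to trip a shutdown.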


----------



## Mackumba

Quote:



Originally Posted by *Smo*


You should manage it - NVIDIA themselves recommend a 700w. I would say that overclocking could lead to a forced shutdown.

Let us know when it arrives










Thanks man! It's a serious amount of money to spend, so I'm a little nervous about it, but I guess we all are, aren't we?


----------



## Smo

Quote:



Originally Posted by *Mackumba*


Thanks man! It's a serious amount of money to spend, so I'm a little nervous about it, but I guess we all are, aren't we?










Absolutely dude - and I totally understand your reservation. I was nervous at first too and swore to just enjoy it - but here I am overclocking and overvolting already!

I also wasn't interested in water cooling... that hasn't lasted long either as I have... plans


----------



## Mackumba

I've been reading about some - a lot of - driver issues: performance drops in some games, sometimes even below a GTX 580. Did you notice those too? Have you had any kind of problems? Did you get the Limited version from EVGA? If so, are the package and extras worth it?

really appreciate your attention man, rep +!


----------



## Smo

Quote:



Originally Posted by *Mackumba*


I've been reading about some - a lot of - driver issues: performance drops in some games, sometimes even below a GTX 580. Did you notice those too? Have you had any kind of problems? Did you get the Limited version from EVGA? If so, are the package and extras worth it?

really appreciate your attention man, rep +!


The drivers, even now, aren't fully supporting the GTX 590 - it's a massive shame but as you will notice when you start using it, in the few games that are properly supported, your FPS will be pretty amazing.

Personally, I had fairly good performance from the official 280.19 driver set but so far the best I've used are a modified 280.26 set.

However if you take a look at some of the scores that RagingCain managed to get with his 590s it just goes to show that the potential of these cards is much greater than the majority would have you believe.

I also have the Limited Classified Edition from EVGA as you do - previously clocked to 730/1460 @ 0.988v (back to stock for the moment as I'm testing out a modded driver set). In all honesty though I wasn't interested in the t-shirt, mousepad or the god awful poster! It's all still sitting in the box, where they will stay until I sell the card.


----------



## emett

Why don't you just get another 580 and run sli dude?


----------



## Tept

Quote:



Originally Posted by *emett*


Why don't you just get another 580 and run sli dude?


What I was thinkin. You'll be happier with a pair of 580's over 1 590.


----------



## Mackumba

Quote:



Originally Posted by *Smo*


The drivers, even now, aren't fully supporting the GTX 590 - it's a massive shame but as you will notice when you start using it, in the few games that are properly supported, your FPS will be pretty amazing.

Personally, I had fairly good performance from the official 280.19 driver set but so far the best I've used are a modified 280.26 set.

However if you take a look at some of the scores that RagingCain managed to get with his 590s it just goes to show that the potential of these cards is much greater than the majority would have you believe.

I also have the Limited Classified Edition from EVGA as you do - previously clocked to 730/1460 @ 0.988v (back to stock for the moment as I'm testing out a modded driver set). In all honesty though I wasn't interested in the t-shirt, mousepad or the god awful poster! It's all still sitting in the box, where they will stay until I sell the card.


I see. I'm really looking forward to getting a shiny new GTX 590, but I was reading around and this question came to bug me: is it really worth it if I'm gaming at just 1080p? I had a single 580 for three days but had to RMA it, so I didn't really get the chance to game with it, and I don't know if I need that much power. Could you help me?

Quote:



Originally Posted by *emett*


Why don't you just get another 580 and run sli dude?


As I said above, I had to RMA the card to EVGA after only 3 days - it came with a broken fan - so I began considering the 590...

For 1080p gaming, would I be good with the single 580, or is the 590 the way to go? And by 'good' I mean 50+ FPS with everything maxed, DX11 and all - who doesn't love some eye candy? : )


----------



## Tept

A single 580 is enough. But if you want max eye candy, you're better off just going for a 2nd 580 over losing money on your current 580 and buying a 590.


----------



## Tept

Nevermind, I didn't notice your mobo - it only has one x16 slot. 1x 590 is as good as you're gonna get on that board.


----------



## Mackumba

Quote:


> Originally Posted by *Tept;14824137*
> Nevermind, I didn't notice your mobo, it only has 1 x16 slot. 1x 590 is as good as your gunna get on that board.


I'll change the mobo in the near future, when the newer sockets arrive, and I'll make sure it's SLI-ready. With that in mind, keeping the 580 is better, isn't it?


----------



## Tept

2x 580 on a x16/x16 SLI capable mobo will beat a single 590.


----------



## traveler

It's time to upgrade the video card. I finally decided on the *EVGA GeForce GTX 590 Classified Hydro*.

I just ordered one.









Why the EVGA GeForce GTX 590 Classified Hydro? It fits my case at 11" long. Unfortunately for me (and ATI), the 6990 cards are 12+" long, so scratch the 6990. I'd also like to see what PhysX is all about.

Per the reviews, the GTX 590 runs Metro 2033 very comfortably as a single-card solution with all eye candy set at max, and thus should run Metro 2034 well when it is released to the gaming community. Shades of Crysis - all other video cards, except the 6990 and SLI/Crossfire configurations, have trouble running Metro 2033 at max. (Google "metro 2033 590 utube" and you will find a series of videos where someone recorded Metro 2033 with all eye candy on.) Hopefully, as a GTX 500 series video card owner, I will get long-term bang for my buck.

This card is watercooled, so it will finally get me into the water cooling thing. Nothing else but this card will be watercooled initially. And finally... this video card comes with a t-shirt, and I need a new t-shirt.









__________________
*Asus X48 Rampage Formula / Intel Q9450 2.66 @ 3.2 / 2 x 2GB OCZ Reaper HPC PC2 8500 /
Asus Radeon HD4870X2 Tri-Fan 2GB graphics card / 24" DeLL Ultrasharp 2408WFP 1920 x 1200 resolution monitor /
Western Digital Raptor X Hard Drive / Two Western Digital Caviar RE2 WD5001ABYS 500GB 7200 RPM SATA 3.0Gb/s Hard Drives /
Creative SoundBlaster X-Fi Elite Pro 7.1 / Creative Gigaworks S750 7.1 Speaker System /
Lian Li 343B cube case / PC Power & Cooling Silencer 750 Crossfire Edition /
IOGEAR GCS1782 GCS1782 Dual-Link DVI KVMP switch with 7.1 audio
HDHomeRun ethernet dual TV tuner
*


----------



## Shenta

Hey guys.

I joined this forum just for this thread, and I'm wondering how in the hell you guys keep your GTX 590s stable - mine crashes at 660MHz after only 20 minutes :c

Please help me get a good clock, I really want to push this thing a lot further.. And keep it stable.

Here's my card for proof.









I've been using MSI Afterburner, is that a bad one?


----------



## Mackumba

Quote:



Originally Posted by *Shenta*


Hey guys.

I joined this forum just for this thread, and I'm wondering how in the hell you guys keep your GTX 590s stable - mine crashes at 660MHz after only 20 minutes :c

Please help me get a good clock, I really want to push this thing a lot further.. And keep it stable.

Here's my card for proof.

I've been using MSI Afterburner, is that a bad one?


Hello Shenta, welcome to OCN! Nice card you got there!
First, which driver version are you on? This is crucial with 590s.
I use MSI Afterburner as well, and it overclocks my 580 just fine. You really should be getting a stable system with that tiny OC...


----------



## Shenta

Quote:



Originally Posted by *Mackumba*


Hello Shenta, welcome to OCN! Nice card you got there!
First, which driver version are you on? This is crucial with 590s.
I use MSI Afterburner as well, and it overclocks my 580 just fine. You really should be getting a stable system with that tiny OC...


I'm on the latest driver.
Right now I run in stock but this is what I tried running it at..









Crashed in like 6 minutes in BC2. 
I'm eventually going to buy a water block but not until the beginning of next year.


----------



## Mackumba

Well, my best guess is that you're pushing the memory and shader clocks too hard. A 30MHz bump on the core clock is OK and won't cause you any trouble, but the stock clocks for your Classified 590 are: core 630MHz / memory 3456MHz / shader 1260MHz. So you're doing +30MHz on core, +71MHz on memory and +60MHz on shader. I'd try to even that out and see how it goes.
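The offset arithmetic above, spelled out as a quick sketch (clock figures as quoted in the thread; the 2:1 shader-to-core lock is standard Fermi-generation behavior, so the shader offset follows the core offset automatically):

```python
# Stock clocks for the card as quoted above, and the attempted OC (MHz).
stock = {"core": 630, "memory": 3456, "shader": 1260}
tried = {"core": 660, "memory": 3527, "shader": 1320}

offsets = {domain: tried[domain] - stock[domain] for domain in stock}
print(offsets)  # {'core': 30, 'memory': 71, 'shader': 60}

# On Fermi the shader (hot) clock is locked to 2x the core clock,
# so the +60 shader offset is just the +30 core bump doubled.
assert tried["shader"] == 2 * tried["core"]
```

In other words, only the core and memory sliders are really independent here - the memory's +71MHz is the one to back off first.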


----------



## rush2049

Revert to 280.19 (or whatever number the 280-series beta was) and up the voltage a bit, and 660 should be easily stable...


----------



## Shenta

Quote:



Originally Posted by *rush2049*


Revert to 280.19 (or whatever number the 280-series beta was) and up the voltage a bit, and 660 should be easily stable...


What voltage? Is it okay if I put the core at 700MHz? That's where I want it to be.


----------



## Smo

Quote:



Originally Posted by *Shenta*


What voltage? Is it okay if I put the core at 700MHz? That's where I want it to be.


Welcome to the club dude - you'll need ~0.988v to be stable at 700MHz. I would recommend driver set 280.19 for this, as mentioned above. My card was stable at 730MHz with 0.988v on that set.


----------



## Shenta

Quote:



Originally Posted by *Smo*


Welcome to the club dude - you'll need ~0.988v to be stable at 700MHz. I would recommend driver set 280.19 for this, as mentioned above. My card was stable at 730MHz with 0.988v on that set.


Guess I'll test it and let you guys know how it goes.
What should my shader and memory clocks be?
And Afterburner isn't letting me mess with the voltage, so I'll be downloading another program I guess.


----------



## Shenta

Quote:



Originally Posted by *Shenta*


Guess I'll test it and let you guys know how it goes.
What should my shader and memory clock be?
And afterburner isn't letting me mess with the voltage, I'll be downloading another program I guess.


Sorry for being such a noob D:
But I downloaded EVGA Precision and can't see where I can mess with the voltage there. Also, how do I uninstall my driver update?


----------



## Arizonian

Quote:



Originally Posted by *Shenta*


Sorry for being such a noob D:
But I downloaded EVGA precision, can't see where I can mess with the voltage there. And how will I uninstall my update?


EVGA Precision doesn't allow for voltage tweaking. For that I'd suggest MSI Afterburner - it allows for voltage increases and supports all GPUs.

I use it to overclock. When I'm not overclocking I switch back to EVGA Precision and run at default voltage, unless I'm benching.

Most just keep MSI Afterburner on 24/7, which is fine if you're keeping the overclock/voltage on all the time.

They are both RivaTuner software shells, so they are very, very similar - except one has greater flexibility.

Just make sure you don't have both set to start up with Windows, as they will conflict with each other. I would change my overclock and, when Windows rebooted, it would switch back to the prior OC - until I realized I had both running. That was my noob mistake. LOL.

BTW - Welcome to OCN!


----------



## Shenta

Quote:



Originally Posted by *Arizonian*


EVGA Precision doesn't allow for voltage tweaking. For that I'd suggest MSI Afterburner - it allows for voltage increases and supports all GPUs.

I use it to overclock. When I'm not overclocking I switch back to EVGA Precision and run at default voltage, unless I'm benching.

Most just keep MSI Afterburner on 24/7, which is fine if you're keeping the overclock/voltage on all the time.

They are both RivaTuner software shells, so they are very, very similar - except one has greater flexibility.

BTW - Welcome to OCN!










I have MSI too and it's not letting me mess with the voltages ><


----------



## Arizonian

Quote:



Originally Posted by *Shenta*


I have MSI too and it's not letting me mess with the voltages ><


Did you go to Settings? You have to check the option to allow for it; by default it doesn't have that turned on.

It's in Settings / General - check 'Unlock voltage control'.


----------



## dvanderslice

@*Arizonian* why do you only use Afterburner for overclocking? Is there an issue when running at defaults with Beta 6?

The reason I ask is the reason I was going to post: I've noticed recently that the 590's fan randomly revs up to 100% for about 2 seconds, then goes back to its original speed. Is Afterburner causing this? I don't really need to overclock the 590 as it handles every game perfectly at stock speeds... should I just stick with Precision? And is it the new Nvidia drivers causing this fan revving? I never had this fan speed-up/slow-down until the last WHQLs came out. With such a pricey card I want to make sure all is kosher.

280.26 Driver


----------



## emett

That is the drivers causing the fan rev - mine and many others do the same thing. Nothing to worry about, but it is a bit off-putting.


----------



## dvanderslice

Yah, no doubt - *** all of a sudden things starting to rev up makes you start thinking something is off in the BIOS, did you forget to change a voltage setting ***? lol

I switched to Precision just in case... I'm checking everything over like mad. Thanks for the heads up man. +1 Rep


----------



## Smo

Quote:


> Originally Posted by *dvanderslice;14863885*
> Yah no doubt *** all of a sudden things starting to rev up makes you start thinking something is off in the bios, did you forget to change a voltage setting ***? lol
> 
> I switched to precision just in case.... i'm checking everything over like mad. Thanks for the heads up man. +1 Rep


Mine does the same mate - but only since upgrading to driver set 280.26; it never happened on 280.19. The only reason I upgraded was to test out the new 3D Vision profiles. Admittedly it seems I've been lucky with the latest set - my system is perfectly stable and game performance is better than before.

Currently playing Crysis 2 in 3D with the DX11 patch and high-res textures, the lowest FPS I've hit so far is 48 - it typically hovers at a solid 60.

The 590 is an impressive card - this is at stock clocks too!


----------



## kayoh

Man you guys dropped $750+ on this card? You guys must be balllinnn


----------



## Masked

Quote:


> Originally Posted by *kayoh;14865127*
> Man you guys dropped $750+ on this card? You guys must be balllinnn


Even crippled by poor drivers and undervolting, the 590s still beat the 6990s nearly across the board...

So, balllinnn in a negative fashion? Far from.

Baller? Hell yes.


----------



## Smo

What the hell does 'ballin' mean?


----------



## Masked

Quote:


> Originally Posted by *Smo;14865379*
> What the hell does 'ballin' mean?


When I played pro paintball a few years back, ballin meant awesome, amazing, hot.

Now, the youth have changed the term to have a negative connotation...I.E. balllinnnn = crying.

Baller used to mean you were hot stuff...It's interesting how words change.

That's why I put both "meanings".

Btw, I had to go on urbandictionary to find his "meaning".

Anywhoo, with a 1k resale value (Cain's cards)...I don't really see anything worth being remotely worried about.


----------



## Arizonian

Quote:


> Originally Posted by *dvanderslice;14863468*
> @*Arizonian* why do you only use Afterburner for just overclocking? Is there an issue when running just at defaults with Beta 6?
> 
> The reason I ask was the reason I was going to post. I've noticed recently that randomly the 590 fan revs up to 100% for about 2 seconds then goes back to its original speed again. Is this Afterburner causing this? I don't have a need to really overclock the 590 as it handles every game out perfectly at stock speeds...should i just stick with precision? And is this the new Nvidia drivers causing this fan revving? I never had this fan speed up slow down till the last WHQLs came out. With such a pricey card I want to make sure all is kosher.
> 
> 280.26 Driver


First, to answer why I use EVGA Precision - it's only because I love EVGA and like to use a vendor's software for their own cards. Though I'm bummed they don't have it working as well as MSI Afterburner. So if I'm not over volting my card I do all my over clocking, which I run 24/7, with Precision. Just a preference. See my sig, which shows my 24/7 over clock for my 580.

If you do run an over volt of any kind 24/7 then you need to stick with MSI Afterburner to keep it running correctly. If you over clock without over volting then either will do just fine. They are both Rivatuner shells. Also, for over volting, make sure you do your homework diligently before attempting it, as the 590 has its limits. I can't comment on that as I don't want to give bad advice, since I've not dealt with it personally.

As for your revving up and back down - I think this is happening because the fan is running automatically and the values aren't set manually. Go to Settings / Fan, check "Enable Software Automatic Control" and set the graph yourself.

There are points on your graph. You determine what fan percentage you'd like for your temps. You can create a new point with a left click, and you can delete a graph point by selecting it and hitting 'Del' on your keyboard. I use the curve graph over the step-up graph. You can switch between 'curve' and 'step up' by double-clicking anywhere on the graph.

I have the following settings for my GTX 580:

40C- 45%
50C - 56%
60C - 67%
70C - 78%
80C - 89%
90C - 100%
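
The curve mode described above is essentially piecewise-linear interpolation between user-defined (temperature, fan %) points. A minimal sketch, using the GTX 580 settings quoted above - the function and structure are illustrative, not Afterburner's actual code:

```python
# Rough sketch of a 'curve' style fan profile: linearly interpolate
# fan duty cycle (%) between user-defined (temp C, fan %) points,
# clamping below the first point and above the last one.

CURVE = [(40, 45), (50, 56), (60, 67), (70, 78), (80, 89), (90, 100)]

def fan_percent(temp_c, points=CURVE):
    """Return fan duty cycle (%) for a GPU temperature in Celsius."""
    if temp_c <= points[0][0]:
        return points[0][1]          # below the curve: minimum fan speed
    if temp_c >= points[-1][0]:
        return points[-1][1]         # above the curve: maximum fan speed
    for (t0, f0), (t1, f1) in zip(points, points[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the neighbouring points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(35))   # below the curve -> 45
print(fan_percent(65))   # halfway between 60C/67% and 70C/78% -> 72.5
print(fan_percent(95))   # capped at 100
```

A 'step up' profile would instead return the fan % of the last point at or below the current temperature, which is why the curve mode gives smoother transitions.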

I'm assuming you have an EVGA GTX 590? If you open your graph and you can't go to 100%, it's because you need to go to EVGA and download the EVGA Fan Speed Unlocker. Find the GTX 590 link and download it. It's a BIOS flash that will allow your fan speed to max at 100%.

View attachment 228002


Now please forgive me if I've explained all this and you've already done it. I don't own the GTX 590, but what you described was happening to me as well with the GTX 580, and it was because I had the automatic settings doing it on their own instead of specifying what fan speed I wanted at each temp the GPU hits.

Some may find my fan speeds a bit aggressive, others not aggressive enough. I was originally leery about ramping it up, but then someone on OCN commented that he runs his at 100% all the time. When I asked wasn't he afraid of burning up his fan, he responded with "It's an EVGA lifetime warranty, why would I worry?" I thought to myself, so true. They are the best company for standing behind their cards. They also don't deny warranty due to over clocking, as they're a company that encourages it, where others look down upon it even though they provide software to let you do it. Go figure.

Lastly, as a side note - I did want the GTX 590, which came out 58 days after I bought my 580. EVGA didn't allow me to step up from a single GPU to a dual GPU, as that's their only rule on the 'step up' program. I was pretty bummed. The only consolation was that I could always SLI a second 580 instead. It's a bit more over clockable that way, but it comes with some heat issues for the top card which I would rather not deal with. Since the 590's release I've followed this thread from the start.

Good luck and let me know if this was your problem. Welcome to OCN


----------



## kainwalker

Does anyone know if it is really impossible to find a GTX 590 on the market at a reasonable price? (new ~$700, used ~$400-600)
I have been looking for one (or two) for quite a while now...


----------



## rush2049

Most likely you are going to see prices above $700 for a used card now. (Why did you wait so long anyway?)

For a new card you are looking at $800-900 and up (and potentially shady reasoning for why it's still new).....

Or you could get lucky.

See here for completed listings for an idea on price: http://www.ebay.com/csc/i.html?rt=nc...=p3286.c0.m283


----------



## traveler

The new *EVGA GeForce GTX 590 Classified Hydro* card (which I ordered Saturday, see my previous post) just arrived today.

rush2049: Looks like I got lucky. This card (as everyone knows) sells out as soon as it's in stock.

Saturday evening, when I placed the order (with TigerDirect.com), it said *In Stock* in green letters, just like I posted them here. When I saw those two green words I was shocked - I stared at them in disbelief for a while. They were still there even after I placed my order, so it seems there was more than one card *In Stock*. The next day, Sunday, when I woke up, those two words were gone. So I called the 800 number, wondering if I would actually get the product or be put on a waiting list. The rep at the other end (who incidentally and coincidentally owned an EVGA GTX 590 Hydro himself) said someone had called him on Thursday about the availability of this card and was not happy he couldn't get one. The cards probably appeared Friday, since warehouse and shipping people don't usually work on Saturday. I knew they would be gone in a day... and they were. The rep said: if the website said it was in stock when I placed the order, then it was in stock. I guess he was right.

So ... it has arrived. When I opened the box, everything seemed to be in order. The card was sealed and all items were there.

Man !!!!! Is this card HEAVY !!!!! Will the case screws be enough to hold this card in the motherboard slot? Or is additional support needed?

The card won't be up and running for a while yet because I have to wait for the new water pump and radiator to arrive.









__________________
*Asus X48 Rampage Formula / Intel Q9450 2.66 @ 3.2 / 2 x 2GB OCZ Reaper HPC PC2 8500 /
Asus Radeon HD4870X2 Tri-Fan 2GB graphics card / 24" DeLL Ultrasharp 2408WFP 1920 x 1200 resolution monitor /
Western Digital Raptor X Hard Drive / Two Western Digital Caviar RE2 WD5001ABYS 500GB 7200 RPM SATA 3.0Gb/s Hard Drives /
Creative SoundBlaster X-Fi Elite Pro 7.1 / Creative Gigaworks S750 7.1 Speaker System /
Lian Li 343B cube case / PC Power & Cooling Silencer 750 Crossfire Edition /
IOGEAR GCS1782 GCS1782 Dual-Link DVI KVMP switch with 7.1 audio
HDHomeRun ethernet dual TV tuner

Under construction: EVGA GeForce GTX 590 Classified Hydro Copper Video Card
*


----------



## kainwalker

Quote:



Originally Posted by *rush2049*


Most likely you are going to see prices above $700 for a used card now. (Why did you wait so long anyway?)

For a new card you are looking at $800-900 and up (and potentially shady reasoning for why it's still new).....

Or you could get lucky.

See here for completed listings for an idea on price: http://www.ebay.com/csc/i.html?rt=nc...=p3286.c0.m283


That's because I did not have the money.








I have wanted a 590 for quite a while - and now I have the money they are all sold out!
I saw one on Newegg today - EVGA 590, $749.99 + $7.87 shipping... I hesitated and it was GONE...
I missed a used $500 one yesterday too...
And I missed 3-4 open-box ones at $42x.


----------



## RagingCain

I hope I don't start a wildfire, but yeah kain, I sold mine recently on eBay for $2k, and this time the deal went through and the guy paid.

It can be hit or miss - people often sell them before unlocking their potential, so you can find a deal for $500 used from people who don't want the headache anymore.

If you are unsure when checking out, you will NOT get one, but there are plenty of good alternatives such as 2x 570s or 2x 6970s.


----------



## Shenta

As you can see I've had it unlocked, but it still doesn't let me move it - it's just empty.


----------



## traveler

Quote:



Originally Posted by *RagingCain*


I hope I don't start a wildfire, but yeah kain, I sold mine recently on eBay for $2k, and this time the deal went through and the guy paid.

It can be hit or miss - people often sell them before unlocking their potential, so you can find a deal for $500 used from people who don't want the headache anymore.

If you are unsure when checking out, you will NOT get one, but there are plenty of good alternatives such as 2x 570s or 2x 6970s.


Hey Bud!







You're the reason I came here and joined! I saw your cards on eBay, but I was looking for only one. And fortunately I got lucky.

And you're right... about three weeks ago Newegg had the EVGA GTX 590 Hydro in stock -----> for about 6 hours! I hesitated and lost that opportunity.











----------



## Arizonian

Quote:


> Originally Posted by *Shenta;14872518*
> As you can see I've had it unlocked. But it still doesn't let me move it. As you can see it's just empty and doesn't let me move it.


I don't have 'unlock voltage monitoring' checked - uncheck that.

Is any other program like EVGA ELEET running, or has one ever been installed?

Do you click on 'Apply' once you're done?

Also, do not run the beta version - only the latest final release.

I'm just going through the easy checklist first.


----------



## MKHunt

Quote:


> Originally Posted by *Shenta;14872518*
> As you can see I've had it unlocked. But it still doesn't let me move it. As you can see it's just empty and doesn't let me move it.


Use Afterburner 2.2.0 Beta 7. 2.1.0 locks voltage for NVIDIA cards no matter what - my friend's GTX 260 Core 216s are voltage locked with 2.1.0 as well.

ETA: I can adjust voltage with the 280.26 drivers as long as I use the 2.2.0 Beta. I can't go higher than 0.913v, but adjustment is possible. 280.19 should let you really get up there.


----------



## Tept

Make sure you completely Exit out of MSI and restart it fresh after checking Unlock Voltage control.


----------



## dvanderslice

Quote:


> Originally Posted by *Arizonian;14865595*
> 
> As for your revving up and back down. This is happening because I think you have it automatically running your fan and values not set manually. If you go to - Settings / Fan - check " Enable Software Automatic Control " and set the graph yourself.
> 
> There are points on your graph. You determine where you'd like the percentage of fan for your temps. You can create a new point by clicking left mouse click and you can delete a graph point by selecting it and hit 'Del' on your keyboard. I use the curve graph over the step up graph. You can switch from 'curve' to 'step up' by double clicking anywhere on the graph.........
> 
> .........Good luck and let me know if this was your problem. Welcome to OCN


My current rig is in my sig. Yep, I've got a custom graph set... but regardless of whether either of the monitoring UIs (Precision or Afterburner) is running, it still does its little revving. It's got to be the driver, as was mentioned. It isn't hurting anything at all - I was just curious, as any of us hardware guys would be when we hear fans acting on their own accord like that.

I've had the 590 for a while now and am pretty familiar with it. Thanks a lot to this thread from when I first got it - having threads like this is HUGE when you get high-end new hardware like this. I unlocked the fan with that firmware upgrade and all that via EVGA when I got it. I posted in this thread WAAY back about the firmware being different on each GPU when I first applied the firmware/fan unlocker. I used to have a GTX 290 and it wasn't the same deal with that, so I wasn't sure. Anyway, I was just scanning before I was going to post about the "rev" issue and I saw your post about using Afterburner only when you overclock, and I was wondering if maybe you knew something I didn't about the MSI app causing issues at stock clocks/voltages. Personal preference is a perfect answer for me!

I've been using Afterburner since I had three GTX 275s before I got the 590 and it's always been a great app, and I've used the same rather aggressive fan graph myself - I have it hitting 100% much earlier, but to each his own. My 590 never goes over 59 degrees even after playing a game like Shogun 2 maxed out for several hours, and with Crysis 2 and all its new DX11 stuff it never goes over 62-63, thanks to the custom graph I have set and the amount of cooling my case has since I installed custom fans instead of the Cooler Master stock ones. With the same graph, two GTX 275s used to hit 80 easily, and the GTX 290 about the same.

The forum on Guru3D is great, as you can get any help you need from the MSI Afterburner guys there. Both apps are of course based on the same engine, as you know, but Afterburner is much more feature rich. Again, as you already posted.

I appreciate you replying. Just wanted to make sure that Afterburner wasn't causing an issue. I used to overclock the 590, but as I said, at stock clocks/voltages it runs everything absolutely perfectly. I have Space Marine running on two monitors and it doesn't drop below 50 fps at 1050. So I'm a happy camper.

Thanks again.

*
Sorry to interrupt your troubleshooting guys.*


----------



## Smo

You're not alone there dude - since I upgraded to the 280.26 set my card has revved up a few times. Not really a problem, but a slight annoyance nonetheless!

Glad you're enjoying your 590 though


----------



## kainwalker

I don't know if I should really post it here in the 590 owners club thread, but the MARS II is available on Newegg as well as ExcaliberPC:

http://www.newegg.com/Product/Produc...82E16814121470

http://www.excaliberpc.com/609248/as...80x2-mars.html

If I wasn't using the FT03 Fortress case from Silverstone I would probably have jumped on one of these.


----------



## Tept

The card isn't worth the money at all. Even if the 22% performance increase were true, which it isn't, how does that justify a 100% price increase?


----------



## kainwalker

Quote:



Originally Posted by *Tept*


The card isn't worth the money at all. Even if the 22% performance increase were true, which it isn't, how does that justify a 100% price increase?


Maybe it provides even more headroom for OC since it has better power management? But if I bought that card I wouldn't dare OC it.


----------



## Tept

It will still never perform as well as two 590s on a nice stable OC, which is the same price as one MARS II.


----------



## Smo

Quote:


> Originally Posted by *Tept;14892405*
> It will still never perform as well as two 590s on a nice stable OC, which is the same price as one MARS II.


I agree - as a single-card solution the MARS II is clearly a brilliant product, but it simply isn't economically viable. It's perfect for those that enjoy the exclusivity of an extremely limited edition item though - and I'm sure those that took the plunge will be very happy with it!

There's no doubt it delivers excellent performance, but for the money there are plenty of more powerful alternatives. I know for a fact that I would much prefer 2x GTX 590s to a MARS II currently (although I'm waiting to see what Kepler has in store for us).


----------



## Tept

I got monies set aside specifically for pulling the trigger on Kepler when it hits shelves.


----------



## Smo

Quote:


> Originally Posted by *Tept;14898902*
> I got monies set aside specifically for pulling the trigger on Kepler when it hits shelves.


Me too bud


----------



## Wogga

Weird question =)
Parallel or serial waterblocks for 2x 590s? I don't want to lose my idle and load temps (25C/37C - or 20/30 when the Russian winter comes? who knows).


----------



## Kentan900

I have a question. Can you add a GTX 590 to a system that uses a GTX 580 and use it as a three card setup?


----------



## Smo

Quote:



Originally Posted by *Kentan900*


I have a question. Can you add a GTX 590 to a system that uses a GTX 580 and use it as a three card setup?


Unfortunately not mate. The 590 can only be paired with another 590.


----------



## Kentan900

Quote:


> Originally Posted by *Smo;14900787*
> Unfortunately not mate. The 590 can only be paired with another 590.


Oh well now that made me sad but thanks for letting me know!


----------



## Tept

You can use the 580 as a dedicated PhysX card. The 580 OC'd will make a far better PhysX card than one of the 590's GPUs.


----------



## traveler

Quote:



Originally Posted by *Smo*


Unfortunately not mate. The 590 can only be paired with another 590.


That is correct.


----------



## traveler

I just took a look ....

In my earlier post I said where I bought my EVGA GTX 590 Hydro card. My card was shipped and has arrived, but now the webstore says "This Item Is Currently Unavailable".

Whew... I'm glad I got one. When I find a digital camera, I'll take a picture with my OCN name. Is that all it takes to be put on post #1?


----------



## Smo

Quote:



Originally Posted by *traveler*


I just took a look ....

In my earlier post I said from where I bought my EVGA GTX 590 Hydro card. My card was shipped and it has arrived but now the webstore says "This Item Is Currently Unavailable".

Whew ... I'm glad I got one. When I find a digital camera, I'll take a picture with my OC name. Is that all it takes to be put on post #1?



Glad it's arrived dude - I'm sure you'll enjoy it! All you need is a photo with your username written on a piece of paper next to the card.


----------



## traveler

Quote:



Originally Posted by *Smo*


Glad it's arrived dude - I'm sure you'll enjoy it! All you need is a photo with your username written on a piece of paper next to the card.


Thanks Smo.

About one day and it was sold out.

When I ordered mine I could not believe it said *In Stock*. The next day it said "Will ship in 10 to 21 days", and now the site says "Unavailable". WoW!!! I should have bought two!!! The website still said *In Stock* for hours after I finished buying mine.


----------



## MKHunt

Quote:


> Originally Posted by *traveler;14903644*
> Thanks SMO.
> 
> About one day and it was sold out.
> 
> When I ordered mine I could not believe it said *In Stock*. The next day it said "Will ship in 10 to 21 days". And now the site says "Unavailable". WoW!!! I should have bought two!!! The website still said *In Stock* for hours after I finished buying mine.


I felt the same way. I brought myself back to reality when I realized that I didn't have enough room for the required rads, had already ordered the PSU, and that by the time one 590 doesn't cut it my SB setup will need an overhaul.

Also, not sure if you'll ever make it to post #1 - it hasn't been updated for a while. I still haven't made that list









ETA: Also, I bought the PSU thinking the future 'glory days' would be 2 560ti's in SLI... Things may have gotten out of hand very, very quickly.


----------



## traveler

I hope one GTX 590 will get me through Metro 2033 ... and ... Metro 2034.


----------



## Smo

Quote:


> Originally Posted by *traveler;14903893*
> I hope one GTX 590 will get me through Metro 2033 ... and ... Metro 2034.


No sweat (if you're running a single monitor - I'm on TapaTalk so can't see your sig).


----------



## Shenta

First off, if you guys have a 590, why blow your money on a stupid Kepler? Its performance increase is only 6x...

Next year Maxwell comes out, and that's 16x... unless you guys really don't care about money..

Anyway Smo, I put my GPU on your settings with the right voltage and everything, fan speed at 100%, and it crashed in 5 seconds in 3DMark 11. I even put the voltages and clocks down and it still crashed -...-

edit: On the driver you suggested too..


----------



## Wogga

That means your 590 isn't so lucky (if there is any luck in those). Mine for example is unstable above [email protected] and unstable at [email protected], but OK with [email protected] There are some cards that need less voltage for the same clocks and are stable at [email protected] I think there can be some sort of luck with GPUs. I haven't tried a second 590 - I bet it would be the worst 590 in the world.


----------



## Shenta

Quote:


> Originally Posted by *Wogga;14908266*
> that means that your 590 isnt so lucky (if there is any luck in those). mine for example unstable above [email protected] and unstable at [email protected] but ok with [email protected] there are some cards that need less voltage for same clocks and stable at [email protected] i think there can be some sort of luck with GPUs. havent tried second 590, i bet it will be the worst 590 in the world


Should I call EVGA and ask for a replacement?


----------



## L D4WG

Quote:


> Originally Posted by *Tept;14901121*
> You can use the 580 as a dedicated PhysX card. The 580 OC'd will make a far better PhysX card than 1 of the 590 GPU's.


That would be a huge waste of the 580...


----------



## Smo

Quote:


> Originally Posted by *Shenta;14908027*
> First off, if you guys have a 590, why blow your money on a stupid Keplar? It's performance increase is only 6x...
> 
> Next year Maxwell comes out, that's by 16x.. Unless you guys really don't care about money..
> 
> Anyway SMO, I put my gpu on ur settings witht the right voltage and everything with fan speeds @100%, and it crashed in 5 seconds in 3DMARK 11. I even put the voltages and clocks down and it still crashed -...-
> 
> edit: On the driver u suggested too..


Ouch - sorry to hear that bud. You should have left the voltage the same but just lowered the clock.

However, GPUs are exactly like CPUs in terms of overclockability. One person's settings won't work for everyone else, as each chip is different and requires varying voltages. My 590 is Kombustor and 3DMark11 stable at 730/1460 @ 0.988v, but that doesn't mean everyone else's will be. Bump it up to 1.000v and see what happens! Alternatively, drop the core clock slightly and try again.

For instance, whiz882 is stable at 750/1500 @ 0.988v; mine isn't - it requires 1.000v, and I'm not happy using that much voltage due to the temperatures.

On the flip side, my i5 2500K isn't a particularly good overclocker. There are people hitting 5GHz at ~1.45-1.48v, but my chip only manages 4.8GHz, and so far (I'm still testing) it looks like it will need ~1.47-1.48v. Just luck of the draw my friend.

Unfortunately I highly doubt that EVGA will replace your card - their only duty to you is that the card is stable at stock clocks; it's nothing to do with them if it doesn't overclock well.
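
The trial-and-error dialled in above can be sketched as a simple search: step the clock down (or the voltage up) until a stress test passes. A minimal illustration - `run_stress_test` is a hypothetical stand-in, since in practice you run Kombustor/3DMark11 by hand and record the result:

```python
# Sketch of finding a stable (voltage, core clock) pair: prefer the
# lowest voltage (less heat), and at each voltage the highest clock
# that survives the stress test.

def find_stable_clock(voltages, clocks, run_stress_test):
    """Return the first (volts, MHz) pair that passes, or None."""
    for v in sorted(voltages):               # lowest voltage first
        for mhz in sorted(clocks, reverse=True):  # highest clock first
            if run_stress_test(core_mhz=mhz, volts=v):
                return v, mhz
    return None

# Fake tester modelling a chip that is stable at <=730MHz on >=0.988v
fake = lambda core_mhz, volts: core_mhz <= 730 and volts >= 0.988
print(find_stable_clock([0.963, 0.988, 1.000], [750, 730, 700], fake))
# -> (0.988, 730)
```

The real process is the same loop run by hand, which is why two cards with identical coolers can land on different final settings.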


----------



## Chobbit

Can I just check something, as VRAM is important to me with a surround setup: do these cards actually give you 3GB of VRAM, or 1.5GB of VRAM (1.5GB per chip)?

As anyone with a multi-screen setup would know, this is important.


----------



## RagingCain

Quote:


> Originally Posted by *Chobbit;14910218*
> Can I just check something as VRAM is important to me with a surround setup, are these cards actually allowing you 3GB of VRAM or 1.5GB of VRAM (1.5gb per chip)?
> 
> As anyone with multi screen setups would know, this is important.


It's only 1.5GB per GPU, but multi-screen is definitely possible up to 5760x1080.


----------



## Masked

Quote:



Originally Posted by *Shenta*


Should I call EVGA and ask for a replacement?


...LOL, if you actually got a replacement for a "bad overclocker", I would want full documentation, because I //WILL// be returning 3/4 of Alienware's total stock and getting replacements... for quite nearly //EVERYTHING//.

Under contract, I "have heard" of a couple of vendors doing this, but only because the clocks advertised in the contract weren't what the vendor actually received... thus, it was a breach of contract.

As an individual, i.e. a consumer, you're only promised the clocks on the box... That's it, nothing more, nothing less... You accept the contract as a consumer the moment you open that box.

Aside from a vendor, I have NEVER heard, seen, or even had an inkling of a company EVER replacing a consumer card because "it didn't OC enough"... And like I said, if that EVER happened, vendors would be jumping on it faster than a pig in... well, you know.

Quote:



Originally Posted by *L D4WG*


That would be a huge waste of the 580...


I actually disagree with this.

I don't really support PhysX - I see it as a limited resource - however, by adding a 580, even just for PhysX, some great things can be done... especially if you run multiple monitors.

I've actually been toying with the idea of adding a 580, due to the fact that you could run 2 monitors off the 580 alone while running the main (gaming) monitor off the 590...

While you don't quite need the 580, having a Fermi with that much "beef" makes a big difference.

I have one of the interns running this "setup" and it's actually very successful... his frame rates kicked up, issues with the 590 were "avoided", and the 580 runs like a champ.

The only real question is... is it worth the $500 to kick your frame rates up 15%?

For most of us, no... obviously, we have better things to do with $500.

For the rest of us... it is a good solution and, like I mentioned, you DO bypass some of the issues with the 590 for the sheer fact that you're "working around" some of the driver-based crapola.


----------



## Chobbit

Quote:


> Originally Posted by *RagingCain;14910526*
> Its only 1.5GB per GPU, but multi-screen is definitely possible up to 5760x1080.


Damn, that's not quite enough, because as some of us have tested and found out in the Surround Gaming club, games such as Crysis 1, GTA IV, Metro and Just Cause 2 can use up to 2.7GB of VRAM when you add upwards of 4xAA at 5760x1080.

And to be honest, you wouldn't get a GTX 590 and not whack up the AA.

Is it not misleading selling a card as 3GB when you can never actually access more than half of that? I nearly spent quite a lot of money on something that should have been enough for my needs but just wouldn't have been.


----------



## Masked

Quote:



Originally Posted by *Chobbit*


Damn, that's not quite enough, because as some of us have tested and found out in the Surround Gaming club, games such as Crysis 1, GTA IV, Metro and Just Cause 2 can use up to 2.7GB of VRAM when you add upwards of 4xAA at 5760x1080.

And to be honest, you wouldn't get a GTX 590 and not whack up the AA.

Is it not misleading selling a card as 3GB when you can never actually access more than half of that? I nearly spent quite a lot of money on something that should have been enough for my needs but just wouldn't have been.










...Since when did this concept change?

It's been true since the X2... that was what, 5 years ago?

All VRAM is split between the existing cores...

I don't find that misleading at all, considering the same concept has held for literally 5+ years.


----------



## Chobbit

I had a 4870X2, which was advertised as having 2GB of VRAM, and if I remember correctly that actually had 2GB of VRAM?

It might just be me, but if I needed 3GB of VRAM to run multi-screen gaming the way I wanted and this card advertised having it, then I'd expect it.


----------



## Masked

Quote:



Originally Posted by *Chobbit*


I had a 4870x2 which advertised as being 2gb VRAM and if I remember correctly that actually had 2gb of VRAM?

It might just be me, but if I needed 3GB VRAM to run multi screen gaming the way I wanted and this card advertised having it, then I'd expect it.


No.

The 4870X2 was 2GB total, i.e. 2GB split between 2 cores, or 1GB/core.

In this instance the 590 has 3GB total... i.e. 3GB split between 2 cores, or 1.5GB/core.

Again: 5+ years, and this concept has never, ever, not once changed, so... why does anyone expect anything different now?

Perhaps with Kepler you'll see a 4GB or 5GB card? If so, it would be 2GB or 2.5GB per core, respectively.

If you need 3GB PER CORE for surround gaming then you should have sought the 3GB 580s, since they were actually advertised for surround gaming.

Yes, the 590 can handle surround gaming... it handles it quite well at some resolutions... Just because it doesn't handle gaming at the resolutions YOU want doesn't mean YOU are not at fault for not properly researching.

In 5 of 6 reviews that I JUST read via Google, they actually state 1.5GB/core with a total of 3GB.

Before one buys a product, one should ALWAYS research, and even a simple search would've told you about the VRAM being 1.5GB and that the 3GB 580s were the obvious solution.

Sorry, but like I said: 5+ years, and the concept is not-a-changin.


----------



## Tept

Quote:


> Originally Posted by *Shenta;14908027*
> First off, if you guys have a 590, why blow your money on a stupid Kepler? Its performance increase is only 6x...
> 
> Next year Maxwell comes out, that's 16x.. Unless you guys really don't care about money..
> 
> Anyway SMO, I put my GPU on your settings with the right voltage and everything with fan speeds @ 100%, and it crashed in 5 seconds in 3DMark 11. I even put the voltages and clocks down and it still crashed -...-
> 
> edit: On the driver you suggested too..


Logically Kepler would be the superior upgrade. Kepler is 6x Fermi, while Maxwell will only be ~3x Kepler. So logically I'd be paying $400 for 6x current tech, instead of $400 for 3x then-current tech. Besides that, I believe GPU and CPU power is going to drastically outrun video game demands. I'm willing to bet money you will be able to run every single game that comes out during the Maxwell generation on a 660-670-680 (Kepler) at max graphics with 0 hiccups.

I mean damn, Battlefield 3 already looks 99% like a damn movie and it's going to be maxed on a 580. I truly believe the only thing Maxwell is really going to shine for during its time is monstrous surround setups at unrealistically high resolutions on monitors that don't even exist yet. Otherwise Kepler will do just fine.

Plus Maxwell is likely to be late 2014, possibly even 2015, since Fermi dropped in '09 and Kepler isn't expected to land til early '12, so if you put it to logic that means at least 2.5 years for Maxwell, putting us at '14.5-'15. I think I could manage to stash away $400 over the course of 2.5 years lol.

I don't believe for a second that ATI and Nvidia are going to rush out Maxwell when they have the entire 7xxx and 6xx series to milk for money for a while.


----------



## Smo

Quote:


> Originally Posted by *Tept;14914236*
> Logically Kepler would be the superior upgrade. Kepler is 6x Fermi, while Maxwell will only be ~3x Kepler. So logically I'd be paying $400 for 6x current tech, instead of $400 for 3x then-current tech. Besides that, I believe GPU and CPU power is going to drastically outrun video game demands. I'm willing to bet money you will be able to run every single game that comes out during the Maxwell generation on a 660-670-680 (Kepler) at max graphics with 0 hiccups.
> 
> I mean damn, Battlefield 3 already looks 99% like a damn movie and it's going to be maxed on a 580. I truly believe the only thing Maxwell is really going to shine for during its time is monstrous surround setups at unrealistically high resolutions on monitors that don't even exist yet. Otherwise Kepler will do just fine.
> 
> Plus Maxwell is likely to be late 2014, possibly even 2015, since Fermi dropped in '09 and Kepler isn't expected to land til early '12, so if you put it to logic that means at least 2.5 years for Maxwell, putting us at '14.5-'15. I think I could manage to stash away $400 over the course of 2.5 years lol.
> 
> I don't believe for a second that ATI and Nvidia are going to rush out Maxwell when they have the entire 7xxx and 6xx series to milk for money for a while.


Well said dude!


----------



## Masked

Well, so 2 things happened this weekend.

#1, I blew up my 590 (NOT by overclocking, thank you)

and #2, I successfully OC'd my card (again) to 850mhz.

Okay, now #1 did not occur by pumping the vcore or, anything like that...It actually occurred because of a surge.

I was replacing my water loop so I had the AX1200 switched off...I had an intern helping me replace my water loop so, there was distilled water literally almost everywhere.

Well, Erica (the intern) bumped the power button WHILE I was adjusting the molex that feeds power to the PCIE on a Classified 3...

The motherboard surged on and the card actually fried out while I was staring at it...

The entire process happened in less than a second because I barely got the Fbomb out before the card went CRUNCH CRUNCH CRUNCH PEW.

Distilled water is non-conductive and it's all I use so, it definitely wasn't the water, it was just 100% a surge...Thanks again CL-P!

EVGA got me a card out yesterday, will be here today and...again, thank god.

#2 ~ Now, before all this happened and I decided to swap out my water loop, I finally had time to do Cain's entire how-to at home.

I think the results spoke for themselves...I had the card at .95v @853mhz...It was running like a champ, just like the first week I had it, so I was VERY happy...

Although I/We frequently see benches of the 590's beating 6990's at stock, it was nice to know that we could achieve some overclocking with some minor tweaking...

This latest driver run though, is terrible so, I suggest nobody update to this WHQL...30+ clients already called in complaining about 590 issues...and why they're forwarded to my office, god only knows.

New card comes in T-minus 90 minutes...Can't wait to get it home later!!!


----------



## Smo

Quote:


> Originally Posted by *Masked;14920305*
> Well, so 2 things happened this weekend.
> 
> #1, I blew up my 590 (NOT by overclocking, thank you)
> 
> and #2, I successfully OC'd my card (again) to 850mhz.
> 
> Okay, now #1 did not occur by pumping the vcore or, anything like that...It actually occurred because of a surge.
> 
> I was replacing my water loop so I had the AX1200 switched off...I had an intern helping me replace my water loop so, there was distilled water literally almost everywhere.
> 
> Well, Erica (the intern) bumped the power button WHILE I was adjusting the molex that feeds power to the PCIE on a Classified 3...
> 
> The motherboard surged on and the card actually fried out while I was staring at it...
> 
> The entire process happened in less than a second because I barely got the Fbomb out before the card went CRUNCH CRUNCH CRUNCH PEW.
> 
> Distilled water is non-conductive and it's all I use so, it definitely wasn't the water, it was just 100% a surge...Thanks again CL-P!
> 
> EVGA got me a card out yesterday, will be here today and...again, thank god.
> 
> #2 ~ Now, before all this happened and I decided to swap out my water loop, I finally had time to do Cain's entire how-to at home.
> 
> I think the results spoke for themselves...I had the card at .95v @853mhz...It was running like a champ, just like the first week I had it, so I was VERY happy...
> 
> Although I/We frequently see benches of the 590's beating 6990's at stock, it was nice to know that we could achieve some overclocking with some minor tweaking...
> 
> This latest driver run though, is terrible so, I suggest nobody update to this WHQL...30+ clients already called in complaining about 590 issues...and why they're forwarded to my office, god only knows.
> 
> New card comes in T-minus 90 minutes...Can't wait to get it home later!!!


Sucks to hear about the surge dude but at least it's not out of your pocket!

I'm amazed at hitting 850 core at just 0.95v - is that solely due to keeping the temperatures down? My understanding was that voltage is needed regardless?

Also - how does the AX1200 handle these cards in SLi? I sent back my NZXT Hale90 850w to RMA today because it was clicking (see below);





http://www.youtube.com/watch?v=rPF9eu4s8Hw

I'm trying to get the company I bought it from to replace it, or at least let me pay the difference to upgrade to the AX1200.


----------



## Masked

Quote:



Originally Posted by *Smo*


Sucks to hear about the surge dude but at least it's not out of your pocket!

I'm amazed at hitting 850 core at just 0.95v - is that solely due to keeping the temperatures down? My understanding was that voltage is needed regardless?

Also - how does the AX1200 handle these cards in SLi? I sent back my NZXT Hale90 850w to RMA today because it was clicking (see below);






I'm trying to get the company I bought it from to replace it (or at least let me upgrade to, if I pay the difference) the AX1200.


I use the AX1200 exclusively in the office after ditching the Ultra X4's...Never been happier with a PSU to be 100% honest.

It could/can handle the 590's in SLI and even handles the 980x OC I've got going, with some headroom.

I think after all is said and done, on the 1200 with 2x 590s you have about 100W of headroom, possibly 100-200W+.
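For what it's worth, a rough budget lands in the same ballpark. All figures below are assumptions: 365W is NVIDIA's rated board power per 590, and the CPU and system numbers are estimates, not measurements:

```python
# Rough power budget for 2x GTX 590 + an overclocked i7-980X on an AX1200.
# All figures are assumptions/estimates, not measurements.
PSU_WATTS = 1200
loads = {
    "GTX 590 #1 (rated board power)": 365,
    "GTX 590 #2 (rated board power)": 365,
    "i7-980X overclocked (estimate)": 200,
    "board/drives/fans (estimate)": 100,
}
total = sum(loads.values())    # 1030 W estimated draw
headroom = PSU_WATTS - total   # 170 W left over
print(f"estimated draw: {total} W, headroom: {headroom} W")
```

Peak transients can exceed rated board power, so treat the headroom figure as optimistic.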

On this card, hitting 850mhz was never a problem...It was very easy and very rewarding...If you look back in this thread, before Cain and I had our "battle of the OCP" I actually had the card at 850mhz from day 1...(Before the driver limitations) so, I don't quite have an answer for you...

My personal card was #12...I haven't seen the RMA# to my new card yet but, I don't imagine I'll be quite as lucky


----------



## Wogga

any benches @853 or games? wonder about stability


----------



## Sir_Gawain

FYI guys, 285.27 Beta out today

http://www.nvidia.com/object/win7-wi...ta-driver.html


----------



## Smo

Quote:



Originally Posted by *Masked*


I use the AX1200 exclusively in the office after ditching the Ultra X4's...Never been happier with a PSU to be 100% honest.

It could/can handle the 590's in SLI and even handles the 980x OC I've got going, with some headroom.

I think after all is said and done, on the 1200 with 2x 590s you have about 100W of headroom, possibly 100-200W+.

On this card, hitting 850mhz was never a problem...It was very easy and very rewarding...If you look back in this thread, before Cain and I had our "battle of the OCP" I actually had the card at 850mhz from day 1...(Before the driver limitations) so, I don't quite have an answer for you...

My personal card was #12...I haven't seen the RMA# to my new card yet but, I don't imagine I'll be quite as lucky










Cheers dude, that's excellent news. Interesting about the OC on the 590 - I assumed that to hit 850MHz core you'd need plenty more than 0.950v! I'd have expected closer to 1.1-1.15v (which we all know pops these things).

Quote:



Originally Posted by *Sir_Gawain*


FYI guys, 285.27 Beta out today

http://www.nvidia.com/object/win7-wi...ta-driver.html


I hope they let you control voltage. Can someone post back to confirm please?


----------



## Shenta

Quote:



Originally Posted by *Masked*


...LOL, if you actually got a replacement for a "bad overclocker", I would want full documentation because I //WILL// be returning 3/4 of Alienware's total stock and getting replacements...For quite nearly //EVERYTHING//.

Under contract, I "have heard" of a couple vendors doing this, only because the clocks advertised in the contract weren't what the vendor actually received...Thus, it was a breach of contract.

As an individual, IE, a consumer; you're only promised the clocks on the box...That's it, nothing more, nothing less...You accept the contract as a consumer the moment you open that box.

Aside from a vendor, I have NEVER heard/seen even an inkling of a company EVER replacing a consumer-end card because "it didn't OC enough"...And like I said, if that EVER happened, vendors would be jumping on it faster than a pig in...well, you know.

I actually disagree with this.

I don't really support Physx, I see it as a limited resource however, by adding a 580, even just for Physx, some great things can be done...Especially if you multi-monitor.

I've actually been toying with the idea of adding a 580 due to the fact that, you could run 2 monitors off of the 580 alone while running the main monitor (gaming) off of the 590...

While you don't quite need the 580, having a fermi with that much "beef" makes a big difference.

I have 1 of the interns running this "setup" and it's actually very successful...His frame rates kicked up, issues with the 590 were "avoided" and the 580 runs like a champ.

The only real question is...Is it worth the 500$ to kick your frame rates up 15%?

For most of us, no...Obviously, we have better things to do with 500$.

For the rest of us...It is a good solution and like I mentioned, you DO bypass some of the issues with the 590 just for the sheer fact that you're "working around" some of the driver-based crapola.


Well in this sense they've breached the contract by advertising my VRAM as 3GB. But it's not. It's 1.5GB mirrored, because that's how SLI works. So I really only have 1.5GB of VRAM available, and on the box it advertises 3GB.

Up yours EVGA xD


----------



## Shenta

Quote:



Originally Posted by *Tept*


Logically Kepler would be the superior upgrade. Kepler is 6x Fermi, while Maxwell will only be ~3x Kepler. So logically I'd be paying $400 for 6x current tech, instead of $400 for 3x then-current tech. Besides that, I believe GPU and CPU power is going to drastically outrun video game demands. I'm willing to bet money you will be able to run every single game that comes out during the Maxwell generation on a 660-670-680 (Kepler) at max graphics with 0 hiccups.

I mean damn, Battlefield 3 already looks 99% like a damn movie and it's going to be maxed on a 580. I truly believe the only thing Maxwell is really going to shine for during its time is monstrous surround setups at unrealistically high resolutions on monitors that don't even exist yet. Otherwise Kepler will do just fine.

Plus Maxwell is likely to be late 2014, possibly even 2015, since Fermi dropped in '09 and Kepler isn't expected to land til early '12, so if you put it to logic that means at least 2.5 years for Maxwell, putting us at '14.5-'15. I think I could manage to stash away $400 over the course of 2.5 years lol.

I don't believe for a second that ATI and Nvidia are going to rush out Maxwell when they have the entire 7xxx and 6xx series to milk for money for a while.


I have a few arguments against this.

First off, do you really see any games coming out next year even trying to push the 590? Because I still see us running everything maxed.

The next gen of consoles comes out in 2014, meaning game budgets will go up as well. There will be a much bigger bump in graphics and quality then, as opposed to next year.

Maxwell will probably be DirectX 12. That sounds nice to me..

So here's how I look at it: even if Maxwell doesn't come out till 2014, it's worth it. I'd rather save these 2 years so I can SLI two Maxwell cards.

So yes, you have somewhat of a point. But I don't see any game ******* with my 590 for a very long time. About...2014 I'd say ;o

And the 590 will get a lot cheaper by the time Kepler comes out. So even if some mystical huge-budget game comes out, I'll just SLI my next 590.


----------



## RagingCain

Quote:



Originally Posted by *Shenta*


Well in this sense they've breached the contract by advertising my VRAM as 3GB. But it's not. It's 1.5GB mirrored, because that's how SLI works. So I really only have 1.5GB of VRAM available, and on the box it advertises 3GB.

Up yours EVGA xD


Uh, what is the total memory on the entire PCB? I believe it's 3GB. You may be unhappy with it, or their choice of marketing, but it does have 3GB of VRAM total.

You have no one to be mad at except yourself if you bought one thinking it was 3GB per core. If you weren't sure, you could have asked any one of us. It's not even EVGA alone advertising this card as 3GB... I don't know why you're angry at one of the 6 companies....

Why would you need 3GB anyway? The power of these 590 GPUs runs out before you could actually use 3GB of VRAM.

Quote:



Originally Posted by *Shenta*


I have a few arguments against this.

First off, do you really see any games coming out next year even trying to push the 590? Because I still see us running everything maxed.

The next gen of consoles comes out in 2014, meaning game budgets will go up as well. There will be a much bigger bump in graphics and quality then, as opposed to next year.

Maxwell will probably be DirectX 12. That sounds nice to me..

So here's how I look at it: even if Maxwell doesn't come out till 2014, it's worth it. I'd rather save these 2 years so I can SLI two Maxwell cards.

So yes, you have somewhat of a point. But I don't see any game ******* with my 590 for a very long time. About...2014 I'd say ;o

And the 590 will get a lot cheaper by the time Kepler comes out. So even if some mystical huge-budget game comes out, I'll just SLI my next 590.










I had 2x 590s, and there were times when I needed more power. Metro 2033 comes to mind as one of them. Games are already out that surpass our hardware, so future titles doing the same is highly probable.


----------



## Jobotoo

Quote:


> Originally Posted by *RagingCain;14922614*
> 
> I had 2x 590s, there were times when I needed more power.


Where are they now?


----------



## Smo

Quote:


> Originally Posted by *Jobotoo;14925261*
> Where are they now?


Sold!


----------



## Jobotoo

Quote:


> Originally Posted by *Smo;14926110*
> Sold!


/sniff


----------



## Recipe7

Quote:


> Originally Posted by *Sir_Gawain;14920992*
> FYI guys, 285.27 Beta out today
> 
> http://www.nvidia.com/object/win7-winvista-64bit-285.27-beta-driver.html


Stock core with my setup netted me +146 points in 3dMark11 when compared to 280.19.

Voltage is locked to a max of 0.925V


----------



## MKHunt

Quote:


> Originally Posted by *Recipe7;14927413*
> Stock core with my setup netted me +146 points in 3dMark11 when compared to 280.19.
> 
> Voltage is locked to a max of 0.925V


Iiinteresting. Is that a single run or an average?

So coming from 280.26 one can only benefit.


----------



## Recipe7

That was my first run.

I had 3 more runs after that, and my scores were between 9002 and 9008.


----------



## MKHunt

Quote:


> Originally Posted by *Recipe7;14929771*
> That was my first run.
> 
> I had 3 more runs after that, and my scores were between 9002 to 9008.


Confirmed.

280.26:
P9902
GFX10039
PHY10365

285.27:
P10050
GFX10235
PHY10324

My scores will be low compared to similar SB clocks due to terribad BIOS as always.

Sadly V control is locked-down as well with Afterburner 2.2.0 B7. I rationalized my use of 280.26 by convincing myself the next beta drivers would have voltage control again. Sigh.


----------



## Semedar

Quick question:

If I add a PNY GeForce GTS 250 to my Sig Rig as a dedicated PhysX card, how much of an overall improvement/decline in performance would there be?

I will be getting my brother's old GeForce 250 as a trade-in for my old ATI 5670, since he needs somewhat of an upgrade.









Edit: It will be this one:

http://www3.pny.com/font-color999999GTS-250-1024MB-PCIefont-P2793C396.aspx


----------



## Goodfella

Add me please


----------



## FlameDB

How does the GTX 590 perform in terms of temperatures and fan noise during gaming/full load? Since I'm a silence freak, I was going to buy a GTX 580 from Calibre with an Accelero Xtreme Plus II, which makes the card completely silent. I now have a GTX 560 Ti Twin Frozr II which is also very quiet for me (unless the fan starts spinning @ 60%, but it doesn't have to)...

Also: can a Corsair HX650W handle it?


----------



## Masked

Quote:


> Originally Posted by *FlameDB;14933777*
> How does the GTX 590 perform in terms of temperatures and fan noise during gaming/full load? Since I'm a silence freak, I was going to buy a GTX 580 from Calibre with an Accelero Xtreme Plus II, which makes the card completely silent. I now have a GTX 560 Ti Twin Frozr II which is also very quiet for me (unless the fan starts spinning @ 60%, but it doesn't have to)...
> 
> Also: can a Corsair HX650W handle it?


If you get the air version, about 30% of the overall heat is dumped into the case...This is b/c of the 2nd core.

This isn't a bad thing if you actually have a good case.

Is it loud? Just as loud as a 580...

~

Moderately off-topic...

I've put some serious thought into this 580//590 setup and I think I'm going for it.

My reasoning is, with the 590 dedicated to 1 monitor, you lose half the driver issues...I.E. SLI with Multi, VRM...Etc...

Plus, it's actually been proven that in SLI, the cores actually sync (something I don't currently see) so, I'm really putting some thought into this.

Definitely having a hard time talking myself out of it...

Now, to find an EVGA GTX 580 HC for sale...


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked;14934170*
> If you get the air version, about 30% of the overall heat is dumped into the case...This is b/c of the 2nd core.
> 
> This isn't a bad thing if you actually have a good case.
> 
> Is it loud? Just as loud as a 580...
> 
> ~
> 
> Moderately off-topic...
> 
> I've put some serious thought into this 580//590 setup and I think I'm going for it.
> 
> My reasoning is, with the 590 dedicated to 1 monitor, you lose half the driver issues...I.E. SLI with Multi, VRM...Etc...
> 
> Plus, it's actually been proven that in SLI, the cores actually sync (something I don't currently see) so, I'm really putting some thought into this.
> 
> Definitely having a hard time talking myself out of it...
> 
> Now, to find an EVGA GTX 580 HC for sale...


So, would this setup help with actual surround gaming? Or is it more for so you can run 3 monitors at once without issues?


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi;14934497*
> So, would this setup help with actual surround gaming? Or is it more for so you can run 3 monitors at once without issues?


You can't really surround successfully with a 590.

At some resolutions, absolutely...It works great but, at higher resolutions, no.

My idea would be to buy 1 really upscale, high-end monitor...while having the other 2 "slave".

So you'd still be in surround...But all gaming would go to the central monitor.

This allows you to keep surround (which, unless you work from home, you really don't NEED) while working around about 10-12 driver issues that I can name off the top of my head.

So essentially, you can keep your surround setup; the only difference is that when gaming, you'd actually be using the 590 to its fullest via the 1 monitor.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked;14934538*
> You can't really surround successfully with a 590.
> 
> At some resolutions, absolutely...It works great but, at higher resolutions, no.
> 
> My idea would be to buy 1 really upscale, high-end monitor...while having the other 2 "slave".
> 
> So you'd still be in surround...But all gaming would go to the central monitor.
> 
> This allows you to keep surround (which, unless you work from home, you really don't NEED) while working around about 10-12 driver issues that I can name off the top of my head.
> 
> So essentially, you can keep your surround setup; the only difference is that when gaming, you'd actually be using the 590 to its fullest via the 1 monitor.


Got you.

How well does a Quad SLI setup do surround gaming with three 1080p 3D/120hz monitors for 120fps gameplay? Have you guys had a chance to test that kind of setup with the latest drivers?

I'm looking hard at that new Asus 27" 1080 120hz/3D monitor coming out in Oct. to replace my BenQ if the picture quality is good enough. If 2 590's in Quad can push three of those, I'd be really psyched.

My panel now is stellar for normal 2D gaming, but I'm not sure if it's big enough for 3D gaming and movies. So I may upgrade and keep this panel for a secondary rig. Or I may get a 2560x1600 IPS panel. Still not sure...


----------



## bah73

Can I join this club?
http://www.overclock.net/intel-build-logs/1117260-corsair-600t.html
Asus GTX 590 Stock


----------



## Shinobi Jedi

The new beta drivers have rocked the hell out of my 3DMark11 score.

My last score at stock clocks on the 280.19 drivers: X5825

My last score at 1340/1728 on the 280.19 drivers: X6060

Latest score at stock clocks on the new latest beta drivers? X6230

Impressive. Most impressive.

However, I did not pick up any noticeable gains on the Metro 2033 Bench on Very High with DOF turned OFF. I haven't run the bench with DOF on.

I'm bummed I can't get Crytek's Crysis 2 bench to work..

Crysis 2 does seem to average closer to 90fps now instead of 60fps.

I haven't really noticed any other significant gains in games, but I also really haven't been gaming much ever since I finished Deus Ex. I still liked Mass Effect 2 better (I'm a sucker for Space Opera) but among the current crop of games, everything else just feels "meh" after finishing Deus Ex... Even WarHammer:Space Marine..

I hadn't played BFBC2 in a while because I was playing it so much when I first got my cards that my hands started to cramp up carpal-tunnel style, and it kind of freaked me out. So I started playing with my wireless 360 controller as much as I can.

But I played this morning and dayum, if it didn't cure that post Deus Ex blues.

Good ol' Battlefield....


----------



## MKHunt

Clearly the only solution is moar ME2. Shadow Broker DLC is pretty sweet, but I might be a sucker for Liara. I miss 790MHz core...

Is Deus Ex 360 pad compatible? Might have to try that. Also, if you liked ME2 you should look into Bastion. Just a shot in the dark.

Space Marine isn't bad so far IMO. I am truly sick of the orcs shouting "space marines!" though. That got stale after the 6th time they shouted it.


----------



## Shinobi Jedi

Quote:



Originally Posted by *MKHunt*


Clearly the only solution is moar ME2. Shadow Broker DLC is pretty sweet, but I might be a sucker for Liara. I miss 790MHz core...

Is Deus Ex 360 pad compatible? Might have to try that. Also, if you liked ME2 you should look into Bastion. Just a shot in the dark.

Space Marine isn't bad so far IMO. I am truly sick of the orcs shouting "space marines!" though. That got stale after the 6th time they shouted it.


Yes! The 360 Pad does work great on Deus EX! No conflict with the Logitech G13 either like on some games, if you have one of those.

Space Marine plays with the 360 pad as well. It's not a bad game, at all. It's just that Deus Ex among the current crop of games is arguably the toughest act to follow.

Unfortunately, I've already torn through all the ME2 downloadable content. I loved most of them, especially the Shadow Broker one.

Fortunately, Deus Ex is supposed to be releasing similar style downloadable content.

I'm actually a big fan of downloadable content of this nature. Much more than map packs for the FPS games I play.

Little episodes that have good to great gameplay, but also story wise either add to the main overall plot, or reveal more about the characters in the story and how those revelations flesh out the overall story.

Some people complain, such as in Deus Ex, assuming it's cut content from the primary game to be sold as DLC later. But for me, if the content is improved in some pertinent fashion, then I have no problem paying extra for it.

If I was a Game Producer of such a series, and it was my call, I'd try to have DLC episodes made and set to release on a monthly or bi-monthly basis, bridging between two games, such as Mass Effect 2 and 3. And by doing so, ideally keep the player hooked and invested in the game/story/characters so that they're led right into the next game emotionally invested and hyped for the next big "chapter".

Definitely looking forward to the DLC of Deus Ex if it's of the same quality as the ME2 ones.

I picked up Bastion, I'm gonna have to check that out later tonight when I finish with my work.

I impulse bought Call of Juarez: The Cartel, liking previous entries in the series and intrigued by the idea of the new one. I had no idea it was as horrible as they say it is. I haven't even tried it. But that's what I get for not reading the console version review of a game that comes out on that platform first and then PC later.

Lesson Learned.


----------



## Smo

What software do you guys use with your 360 controllers?


----------



## rush2049

I use the default Windows 7 driver that installs when you plug it in. Most games support it, even annoyingly so when I leave it plugged in and want to use keyboard and mouse. Especially when one of the sticks is slightly off center and I can't figure out why I'm turning left automatically.......


----------



## Shinobi Jedi

Quote:


> Originally Posted by *rush2049;14946239*
> I use the default Windows 7 driver that installs when you plug it in. Most games support it, even annoyingly so when I leave it plugged in and want to use keyboard and mouse. Especially when one of the sticks is slightly off center and I can't figure out why I'm turning left automatically.......


Yeah, same here. Windows 7 driver. The main thing you need is the wireless adapter unless you use a wired controller.

The pad works great. I actually use it as often as I can. The only time I don't is with online FPS multiplayer. Then I use a Logitech G13 gameboard, with its little joystick which I use as a substitute for the W,A,S,D keys, and it's a godsend. And a Razer Naga Epic, which actually works way better for me for FPS than the Razer Mamba, which was designed more for FPS vs. the Naga Epic's MMO focus. But for some reason, the feel and performance of the Naga Epic work much better for me for online FPS.

It cracks me up when console gamers try to say that you can't PC game from a couch. Um, I can game from the couch just fine when I hook my rig up to my 60" Plasma. And with exponentially better graphics and frame rates too









I actually just spent the last few hours sucked into The Witcher 2. I forgot about that one when I was complaining about nothing to play after Deus Ex.

With everything turned up except Ubersampling as well as Vsync set to "On", I got a solid 97fps pretty much no matter what I did. I'm enjoying the game. Looking forward to another round later...


----------



## Tept

So are you guys saying that the new 285 drivers at stock are superior to 280.19 at a considerable overclock, like 750MHz core?


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Tept;14951167*
> So are you guys saying that the new 285 drivers at stock are superior to 280.19 at a considerable overclock, like 750MHz core?


No. My comparison was an OC of 670 core at stock volts versus stock clocks.

I'm having a really good experience with them so far. I say try 'em out


----------



## MKHunt

Quote:


> Originally Posted by *Shinobi Jedi;14953695*
> No. My comparison OC to the stock was 670 core at stock volts compared to stock clocks.
> 
> I'm having a really good experience with them so far. I say try 'em out


I noticed some slight hanging in Witcher 2 and some random slow-mo moments in Space Marine (though the audio plays at normal speeds and seems to be synched still?) but other than that they're a winning set. Also, draw distance has decreased in Witcher 2 which is bugging me, but not enough to sacrifice the extra performance. Plus the stutters and slow-downs could be due to lots of things such as the fact that all my games are on my HDD.

I got a 120GB SSD so I would have room for games and now all I use it for is OS, drivers, and benchmarks. I confuse myself on a regular basis.

OT, I'm really liking Space Marine. Huge fan of the franchise and I'm sure that I'm biased since I have a rather sizable Black Templar army. It's surprisingly challenging; feels like they nerfed space marines (as compared to tabletop stats) to make the game more balanced. 3+ saving throws are missing


----------



## Masked

So, I bit the bullet.

I bought a 580...Someone here had one at just the right price, and now I can swap it with one of our samples...Thus I can take the sample home.

I realize the 580 for JUST PhysX is way, way too much...I also realize I could have gone with a much cheaper alternative but, my thought was that since they're the same core...it would garner fewer issues with drivers etc...in the long run.

That being said, also ordered a water block for it and I'm now awaiting the arrival of my new toys







Yay.


----------



## Tept

Quote:


> Originally Posted by *Shinobi Jedi;14953695*
> No. My comparison was a 670MHz core overclock at stock volts versus stock clocks.
> 
> I'm having a really good experience with them so far. I say try 'em out


One of my cores is at .913V and the other is at .925V, so I can only pull 640 core with the 285 driver. Sucks.


----------



## rush2049

With MSI Afterburner you have to set the voltage on one core and click Apply. Then go into settings, switch to the other core, set the same voltage on that core, and apply.

That should force the same voltage on both cores.....


----------



## RagingCain

Just keeping in touch, the 590s shipped yesterday..... it felt like I attended a funeral... of someone I cared about.


----------



## MKHunt

Quote:


> Originally Posted by *Tept;14957167*
> One of my cores is at .913V and the other is at .925V, so I can only pull 640 core with the 285 driver. Sucks.


Can anyone tell me how people are observing this? In every monitoring application I have used (incl. GPU-Z), BOTH cores at load show .913V regardless of firmware. Just want to make sure I'm not missing something important.

Also, 685MHz core folding stable @ .913V. With these new, more efficient drivers, it feels niiiice. I do miss voltage control though.

RC, you will be missed in the overclocking and general research of these cards. I would see your results on 280.19, and it would cause me to enter some sort of state where I was determined to get similar results. Recently, though, I've been adopting the mentality of "Dang, stock voltage is convenient." Something needs to break me of this.


----------



## RagingCain

Quote:


> Originally Posted by *MKHunt;14958052*
> Can anyone tell me how people are observing this? In every monitoring application I have used (incl. GPU-Z), BOTH cores at load show .913V regardless of firmware. Just want to make sure I'm not missing something important.


Just enable GPU voltage monitoring in Afterburner and either detach or scroll down the graphs displayed to show it.
Quote:


> Also, 685MHz core folding stable @ .913V. With these new, more efficient drivers, it feels niiiice. I do miss voltage control though.
> 
> RC, you will be missed in the overclocking and general research of these cards. I would see your results on 280.19, and it would cause me to enter some sort of state where I was determined to get similar results. Recently, though, I've been adopting the mentality of, "*Dang, stock voltage is convenient.*" Something needs to break me of this.


BIOS FLASH ^.^

I need an anime dancing cat...

ah here:


----------



## Tept

Quote:



Originally Posted by *rush2049*


With Msi Afterburner you have to set the voltage on the core, click apply. Then go into settings and switch to the other core, then set the same voltage on that core and apply.

That should force the same voltage on both cores.....


That's what I mean rush, .925 is the max allowable voltage for core 1, and .913 is the max allowable voltage for core 2.


----------



## rush2049

Well, the EVGA cards have a BIOS issue as well; it's talked about way back in the pages of this thread.

But as RagingCain said, he did most of the legwork, and a BIOS flash is in your future if you want to fix it and not be at the mercy of the drivers.


----------



## RagingCain

Quote:


> Originally Posted by *Tept;14961892*
> That's what I mean rush, .925 is the max allowable voltage for core 1, and .913 is the max allowable voltage for core 2.


Yeah, Rush is right. I started messing around with BIOS editing simply to fix an issue overlooked by EVGA and brought on by nVidia when they first started locking voltages.

My suggestion would be to flash a higher-than-stock BIOS, lower the voltage to 0.925v per core, enable Apply Overclock at Startup, and you should be good to go.

I was actually unstable myself at stock 0.925v GPU1 / 0.913v GPU2.


----------



## Krazeswift

What sort of memory overclocks are you guys hitting?

I'm finding mine to be very hit and miss on certain games.


----------



## RagingCain

Quote:


> Originally Posted by *Krazeswift;14965698*
> What sort of memory overclocks are you guys hitting?
> 
> I'm finding mine to be very hit and miss on certain games.


For the most part, better results anywhere from stock up to 900MHz (1800 effective), depending on the title. Best just to leave the memory alone, in my opinion.

Post powered by DROID X2
Rooted running Eclipse.


----------



## Krazeswift

Think you're right. I'm finding 1800 stable on some things but not others. Plus, every time there's a new driver it all changes again.


----------



## dvanderslice

Quick question, guys. I'm adding a new display and the only port I have left on my 590 is the DisplayPort. My display only supports 120Hz through its DVI-D input, not HDMI, for some reason. I see these Mini DisplayPort male to dual-link DVI female adapters around, and there's a huge price difference between the dual-link and single-link DVI ones. It also seems that the dual-link ones all need a power supply via USB. Obviously I want the full performance of the monitor I'll be using on that port. Which adapter is recommended? It's for the same model I have in my system section in my signature.

Thanks as always for your help.


----------



## RagingCain

Quote:



Originally Posted by *dvanderslice*


Quick question, guys. I'm adding a new display and the only port I have left on my 590 is the DisplayPort. My display only supports 120Hz through its DVI-D input, not HDMI, for some reason. I see these Mini DisplayPort male to dual-link DVI female adapters around, and there's a huge price difference between the dual-link and single-link DVI ones. It also seems that the dual-link ones all need a power supply via USB. Obviously I want the full performance of the monitor I'll be using on that port. Which adapter is recommended? It's for the same model I have in my system section in my signature.

Thanks as always for your help.


How many displays do you have?

Secondly, you want a dual-link DVI cable going straight to the 120Hz monitor's DVI port, or you are not getting 120Hz. As far as I know, 120Hz cannot be done with an adapter at either end. Consider moving one of your other monitors off a DVI port.

Thirdly, I don't know what you'd need a power supply with USB for; they are just video cables, no need to power anything. Just plug a normal dual-link DVI cable into the 120Hz monitor. You might want to link whatever it is you think you need before buying it. The monitor, if bought new, should come with one. RadioShack sells them in store for about $29.99, which isn't terrible when you take shipping and waiting by mail into consideration.
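For anyone curious about the math behind that, here is a rough sketch (my own back-of-envelope figures, not from nVidia's docs: single-link DVI tops out at a 165MHz pixel clock, dual-link at 330MHz) of why 1080p @ 120Hz needs dual-link:

```python
# Why 1920x1080 @ 120 Hz needs dual-link DVI: single-link tops out at a
# 165 MHz pixel clock, and dual-link doubles that to 330 MHz.
SINGLE_LINK_MHZ = 165.0
DUAL_LINK_MHZ = 330.0

def min_pixel_clock_mhz(width, height, refresh_hz):
    """Lower bound on the required pixel clock, counting only active
    pixels (real timings add roughly 10-20% for blanking intervals)."""
    return width * height * refresh_hz / 1e6

clk = min_pixel_clock_mhz(1920, 1080, 120)
print(f"needs at least {clk:.0f} MHz")              # ~249 MHz before blanking
print("fits single link:", clk <= SINGLE_LINK_MHZ)  # False
print("fits dual link:", clk <= DUAL_LINK_MHZ)      # True
```

Even ignoring blanking, 1080p120 already needs about 249MHz, so a single-link adapter physically can't carry it; 60Hz (about 125MHz) is the most you'd get out of one.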


----------



## Shenta

Quote:



Originally Posted by *RagingCain*


Uh, what is the total memory on the entire PCB? I believe it's 3GB. You may be unhappy with it, or with their choice of marketing, but it does have 3GB of VRAM total.

You have no one to be mad at except yourself if you bought one thinking it was 3GB. If you weren't sure, you could have asked any one of us. It's not even EVGA alone advertising this card as 3GB... I don't know why you are angry at one of the 6 companies....

Why would you need 3GB anyway? The power of these 590 GPUs runs out before you could actually use 3GB of VRAM.

I had 2x 590s, and there were times when I needed more power; Metro 2033 comes to mind. Games are already out that surpass our hardware, so future titles highly probably will too.


That's because you have 4 GPUs. Metro 2033 utilizes only one of them and is horrible at utilizing GPUs. No game is surpassing the hardware; they're just doing a horrible job of utilizing it.


----------



## emett

Are you sure about that? I get similar performance in Crysis 2 with DX11 and the ultra textures pack. Is this game also using only one of the 590's GPUs?


----------



## L D4WG

Installing 270.27 drivers, anyone had any issues with them?


----------



## RagingCain

Quote:


> Originally Posted by *Shenta;14973482*
> That's because you have 4 GPUs. Metro 2033 utilizes only one of them and is horrible at utilizing GPUs. No game is surpassing the hardware; they're just doing a horrible job of utilizing it.


Well, you are wrong there. It uses all 4. It's just not optimized.










Metro 2033 is one example. Witcher 2 with all its eye candy, AA, and Ubersampling is another that can kill framerates. This is all at 1920x1080; I can't imagine our cards running these same titles with max settings at 2560x1600 very well. F1 2010 is another such title that benefits greatly from 4x 580s, something 2x 590s don't match up to, unfortunately.

Of course, that is if you need more than 60 fps, like I do.


----------



## dvanderslice

Quote:


> Originally Posted by *RagingCain;14972255*
> How many displays do you have?
> 
> Secondly, you want a Dual Link DVI cable going to DVI to the 120 Hz monitor, or you are not getting 120 Hz. I believe 120 Hz, can not be done with an adapter at either end as far as I know. Consider moving one of your other monitors off a DVI port.
> 
> Thirdly, I don't know what you need a powersupply with USB for, they are just Video Cables.... no need to power anything. Just plug a normal DVI Dual Link cable, into the 120 Hz monitor. You might want to link whatever it is you think you need before buying it. The monitor if bought new should come with one. Radioshack sells them in store about 29.99$ which isn't terrible when you take into consideration shipping and waiting by mail for a cable.


What you said about adding an adapter in the mix not allowing 120Hz may have made this whole post moot, but I'll just clarify so that we are on the same page. If your reply is the same, that it won't give me full functionality, then you'll have helped me all the same.

First, I'm sorry; I guess I didn't describe my issue clearly enough, so I think you misunderstood what I meant in some respects. I have 3 displays now, and I'd like the option to use the DisplayPort on the GTX 590. I have my television on one DVI port, one Acer GDZ235HZ on another DVI port, and another GDZ235HZ on the last DVI port. But I occasionally cap videos from my set top box using one of the 590's DVI ports (I could use the 1394, but let's not get into that). The issue is trying to make use of the Mini DisplayPort on the GTX 590 with one of the Acer monitors and still get the 120Hz this monitor was bought for. Yes, you're thinking why not use the adapter with the set top box, but let's keep that for last. Anyway, this wasn't an issue when I had two GPUs, since I had four DVI ports, but I have just three now. So back to my problem:

The monitor I want to use comes with DVI-D, HDMI, and D-Sub cables and has the corresponding ports. So yes, I know I have to use DVI to get the 120Hz; that's in the manual. That's my problem: since only the Mini DisplayPort on the GTX 590 is available, and there's no DisplayPort on the monitor, I need an adapter to run the monitor's DVI cable to the Mini DisplayPort. The EVGA box for the GTX 590 comes with a standard DisplayPort to Mini DisplayPort adapter, which would be great if I had a monitor with a DisplayPort on it like I do at work, but this one only has the three ports I listed above (HDMI, VGA, DVI). So I searched for what sort of adapters are out there. I have a ton of DVI cables lying around from the various computers I've built for client jobs (that I've grabbed since they weren't being used) and for personal use, so the cables aren't an issue. The issue is back to making use of the Mini DisplayPort on the video card. I know you mentioned that you can't get 120Hz with an adapter, but let's just continue, and we'll pick up from the end of the post:

So my question is this. Here is what I've found that is available.

We have the single-link DVI to Mini DisplayPort adapter HERE, which is a great price.

And all the dual-link adapters have an additional USB plug for power, like this one: HERE

Which I agree is weird, as video cables shouldn't require power on their own... what the hell, right? So, to my original post: I can't find any dual-link DVI adapters for less than $90, which is a little more than I'd like to spend on an adapter. Some include audio, some don't; I don't care about audio capabilities. I'm just wondering: can I use the single-link DVI to Mini DisplayPort adapter and still get the full functionality that a dual-link one gives? I assume not, as DisplayPort, DVI, and HDMI cables are essentially the same thing with added capabilities like audio. So back to the adapters: what can I do to use the Mini DisplayPort with the monitor in question? Is there some other option out there? I see there are converters like these: HERE, where I'd need a Mini DisplayPort cable. That's no problem, as I have tons of those in the server room where I work (we deal with Apple laptops for our multimedia department employees, so they always use those DisplayPorts on everything). But the price for one of those converters is no different, and again they need power, so that's where the USB power would come from.

I appreciate your reply before, and I apologize I didn't clarify a little more; I had been out all night Friday and was on a serious lack of sleep when I typed my question out on my phone. Thanks as always ahead of time, as you guys are always great with your help.

I do apologize too for this novel of a post and for interrupting any ongoing conversations with this behemoth.


----------



## RagingCain

Quote:


> Originally Posted by *dvanderslice;14979808*
> What you said about adding an adapter in the mix not allowing 120Hz may have made this whole post moot, but I'll just clarify so that we are on the same page. If your reply is the same, that it won't give me full functionality, then you'll have helped me all the same.
> 
> *[snipped for length; full post above]*


That sure was a mouthful.

Here is the thing. nVidia says two things:
A single 590 supports 2D Surround.
A single 590 supports 3D Surround.
590 Product Page

But if you look up 3D Surround, nVidia quietly (ninja-style) notes that it requires DVI ports (3, no less) for 3D LCD monitors.
3D Tech Source

A 590 has 3x DVI ports and 1x Mini DP port.

So I looked up the specs of this DP port, just to see what it's capable of:









Okay, so you could use a Mini DP out to HDMI 1.4a, which supposedly supports 3D tech, however only for film and some console games (e.g. on the PS3). The reason is that the frame rate is capped around 30 (24~27 for movies), and thus its maximum output is 60Hz. Besides, the Acer doesn't have HDMI in, I believe.

However, there is other information to take note of, especially in Quad-SLI and/or when running 3D Surround. The full PDF is located here: PDF

With all of that said, the thing to take away is this: the DP port only broadcasts at 60Hz. Whatever this strange adapter does to create the extra frequency, I can't imagine it would be good for gaming (input lag). The other thing to consider is that nVidia drivers might simply not let you activate 3D tech unless all 3 DVI ports are connected to monitors; the Surround and 3D Surround options don't even appear for you to select if there is no monitor. I understand you want to have YOUR setup YOUR way, and I hate bursting bubbles, I just don't see how it would work. Unless...

Have you considered running your set top box off a second GPU, something cheap in the 2xx range?


----------



## dvanderslice

Here are the ports on the monitor in my sig:









http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=5661028&CatId=4420

Tiger Direct has a better summary than the Acer page tbh.

See this is why I post here, thanks so much man. Your post is thorough enough to have answered all the questions I had and then some. I really am impressed.

That was one of the things I just automatically assumed: this Acer monitor comes with HDMI, and I figured it was 120Hz capable via HDMI as well as DVI... hooked up the HDMI to the 590 first, and the damn thing wouldn't run at 120Hz. Found in bold letters in the online manual that 120Hz can only be obtained using a dual-link DVI cable to the monitor's DVI port. So even if I went DP to HDMI, I'd be stuck with 60Hz, and if I was just going to do that, I wouldn't have bought a 120Hz monitor. So that's out.

Anyway, back to your reply. You are indeed a ninja and have found what I've been looking for: the capabilities of that DP on the 590. That answers my question on how to proceed. I have an HD-PVR that I'll start using primarily instead of messing with adapters and the like. I mainly game with the 590; I've been using it for capping less and less and using the PVR more and more, which saves time on encoding and everything else involved.

Have you ever tried the DisplayPort on the 590? Has anyone? Is there any benefit to it? Honestly, outside of the Apple laptops I see at work, I've not seen it working on any discrete video cards. I was looking at an Acer monitor that had a DisplayPort and was half tempted to get it, but the 120Hz sold me for the 3D and all that. My TV is the same, and it makes a huge difference if set up properly. There are some great AviSynth scripts that can really utilize the 120Hz to its fullest, to where HD sports and action movies look absolutely incredible.

I agree these adapters are something new that I have not seen in action; that's the main reason I'm apprehensive about buying one. They're the first I've seen that require power like this in a very long time. I think I'll stick with what I have. I have a pair of GTX 275s that are almost brand new that I've been trying to sell together for $250 on eBay with no takers; maybe I'll pop one of those in if need be down the line, as it seems I'm stuck with them.

Thanks again buddy. REP


----------



## dvanderslice

Quote:



Originally Posted by *L D4WG*


Installing 270.27 drivers, anyone had any issues with them?


I assume you mean 285.27? If so, these drivers have been great. Since 280 came out there had been some issues for me with HD playback over multiple displays @ 120Hz, Dawn of War 2 GPU crashes due to the driver, and some other performance problems. On top of that, my benchmarks with these drivers far surpass any other drivers I've benched with this card. Though that's not necessarily applicable to real gaming and app performance, I was unexpectedly surprised. They also increased performance in Space Marine, which I've been playing. I have a ton of games on my PC and haven't tried them all by any means, but with the games I have tried, I've been really happy.


----------



## ReignsOfPower

Quote:



Originally Posted by *dvanderslice*


Have you ever tried the display port on the 590? Has anyone? Is there any benefit to it?


I use the DP to hook up my Dell U3011; however, it's only a 60Hz monitor, so it isn't of much consequence whether I use that or a dual-link DVI cable on the thing. Though I have noticed that when I completely power off the monitor and turn it back on, all my app windows have been shrunken into the top left corner of the screen, as if the resolution drops to 800x600 while it's off; DVI never did this. Anyway, I just use the DP now for 2 reasons: 1. the cable is thinner and more flexible, 2. plugging and unplugging is a breeze.

Some video cards, like the latest in the Radeon series, also allow more than one display to be driven off a single DisplayPort, as per the latest DP version. I doubt you could do more than one 1600p monitor on the thing, though.
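On the multiple-monitors-per-port question, a rough bandwidth sketch (my own assumed figures: DP 1.1 carries roughly 8.64 Gbit/s of video data after 8b/10b coding overhead, and DP 1.2 doubles that; blanking and protocol overhead ignored) suggests that doubt is justified:

```python
# Could one DisplayPort drive two 2560x1600 @ 60 Hz panels via MST?
# Effective video bandwidth: DP 1.1 ~8.64 Gbit/s (4 lanes x 2.7 Gbit/s,
# minus 8b/10b coding overhead); DP 1.2 ~17.28 Gbit/s.
DP11_GBPS = 8.64
DP12_GBPS = 17.28

def stream_gbps(width, height, refresh_hz, bpp=24):
    """Lower-bound data rate for one stream, ignoring blanking."""
    return width * height * refresh_hz * bpp / 1e9

one = stream_gbps(2560, 1600, 60)
print(f"one 1600p60 stream: {one:.1f} Gbit/s")          # ~5.9
print("two streams fit DP 1.1:", 2 * one <= DP11_GBPS)  # False
print("two streams fit DP 1.2:", 2 * one <= DP12_GBPS)  # True
```

So one 1600p60 panel fits comfortably on a DP 1.1 port, but two of them would need a DP 1.2 card, which matches the doubt above.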


----------



## RagingCain

Quote:


> Originally Posted by *dvanderslice;14981363*
> Here are the ports on the monitor in my sig:
> 
> http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=5661028&CatId=4420
> 
> Have you ever tried the DisplayPort on the 590? Has anyone? Is there any benefit to it?
> 
> *[snipped for length; full post above]*


Hey, no problem, that's what we are here for









Never used DP; allergic to Mac products as well. I will say this: DP version 1.2 (which may be out now), the next-gen DP port, has extremely impressive specs. 4K@60Hz@30BPP, which I can't help but drool over... then I remember that's probably a $2,500 monitor, if not more, when it comes out. I saw that on the wiki page about DP while reading it earlier. I am like you; I will hold off on adapters until I either know how it magically works (the gist of it) or let someone else test it









I do have a 120Hz *GD*235HZ, which was the first gen of these 120Hz monitors. Probably one of the best investments for gaming I have ever made. If you have a high-DPI mouse, a good mouse pad, and a GPU capable of locking in a fluid 120 fps (optionally with Vsync too)... it's a total gaming orgasm. Call of Duty 4 was the first game I tried, and with an automatic rifle (M4A1) I could distinctly tell each individual shot/shell eject even on full squeeze. I was cutting lines with that bad boy... I have had the monitor 9 months now and haven't even tried the 3D, because I was so in love with the refresh / FPS display hahaha.

If you were worried about using another card, here are a couple of other perks to using that GTX 275. If you use Folding@Home or BOINC, it's major bonus CUDA points for rendering cancer cures or searching for aliens. Very handy... but probably the more convincing argument is that it can also be a dedicated PhysX card and let the beefy 590 focus on rendering.


----------



## kevink82

Add that if you SLI the GTX 590, the DP does not work.... I thought I had a broken card or my U2711 was faulty, but now that I'm only using 1 card the DP works. Go figure.


----------



## dvanderslice

Quote:


> Originally Posted by *RagingCain;14984628*
> 
> I do have a 120Hz *GD*235HZ which was the first gen of these 120 Hz monitors.










Yah yah, I had an extra letter in the sig. I see your little note on needing a correction lol

But yes, I love having two of these monitors and can't do without the 120Hz now; it just isn't the same anymore.


----------



## Smo

Quote:


> Originally Posted by *dvanderslice;14987164*
> 
> 
> 
> 
> 
> 
> 
> Yah yah, I had an extra letter in the sig. I see your little note on needing a correction lol
> 
> But yes, I love having two of these monitors and can't do without the 120Hz now; it just isn't the same anymore.


I hear you mate - I was sceptical about the difference but since moving to 120Hz I couldn't go back.


----------



## rush2049

Just wanted to pop in and mention... I really hope our cards have no major issues with the BF3 beta/release. I had major issues with Crysis 2 and its PhysX interaction... let's hope there are no repeat problems...


----------



## RagingCain

Quote:


> Originally Posted by *rush2049;14992493*
> Just wanted to pop in and mention... I really hope our cards have no major issues with the BF3 beta/release. I had major issues with Crysis 2 and its PhysX interaction... let's hope there are no repeat problems...


Looks like I will let you know how it is from the Red camp; got a 6990... eyeballing a 6970 right now for trifire.

I am already dreading catalyst...


----------



## MKHunt

Quote:


> Originally Posted by *RagingCain;14994889*
> Looks like I will let you know how it is from the Red camp; got a 6990... eyeballing a 6970 right now for trifire.
> 
> I am already dreading catalyst...


Might be a tad bottlenecked by that 2630QM. Har. Har. Har. I'm sorry. It was too easy. I assume you kept all your WC gear?

Have you read any reviews of trifire on SB? With 16 native lanes your GPU _actual_ allocation would be x4/x4/x8, right? I have no idea how much difference that would make though.

Oh and you're going to post benches in your thread aren't you?


----------



## mojobear

Hey all!

Been a long-time stalker of this thread. I got myself an EVGA 590 with an XSPC block (another GTX 590 on the way!) and am getting ready to overclock it. I've been looking at nvflash and RC's modded BIOSes. Looking at them through NiBiTor 6.03, the modded BIOSes have voltage setting 2 changed... was any other tinkering done? I ask only because I want to create some BIOSes with voltages up to 0.988.

On a related note, for those who go up to 0.975 or 0.988, do you find PDL kicks in at these voltages, and is it temperature dependent at all?

Thanks! Here is my GPU-Z proof.

http://imageshack.us/photo/my-images/839/gpu.gif/


----------



## bootyhead

Sorry if this is asked often, but I am having a HELL of a time finding one at MSRP here in the U.S. Anyone have a suggestion? I am at my wits' end, and I will not overpay for one. I also heard some speculation that a new revision may come out soon. Any thoughts on that?

PS: as you can see, I've lurked this website for a while, but this is my first post!!


----------



## RagingCain

Quote:


> Originally Posted by *MKHunt;14995322*
> Might be a tad bottlenecked by that 2630QM. Har. Har. Har. I'm sorry. It was too easy. I assume you kept all your WC gear?
> 
> Have you read any reviews of trifire on SB? With 16 native lanes your GPU _actual_ allocation would be x4/x4/x8, right? I have no idea how much difference that would make though.
> 
> Oh and you're going to post benches in your thread aren't you?


It's just two cards, so x8/x8, but it has an NF200 chip, so if I use a 3rd slot with something it's x16/x8/x8. Of course there will be benchies.

All mah water gear is still here.

Post powered by DROID X2
Rooted running Eclipse.


----------



## Smo

Quote:


> Originally Posted by *mojobear;14995407*
> Hey all!
> 
> Been a long-time stalker of this thread. I got myself an EVGA 590 with an XSPC block (another GTX 590 on the way!) and am getting ready to overclock it. I've been looking at nvflash and RC's modded BIOSes. Looking at them through NiBiTor 6.03, the modded BIOSes have voltage setting 2 changed... was any other tinkering done? I ask only because I want to create some BIOSes with voltages up to 0.988.
> 
> On a related note, for those who go up to 0.975 or 0.988, do you find PDL kicks in at these voltages, and is it temperature dependent at all?
> 
> Thanks! Here is my GPU-Z proof.
> 
> http://imageshack.us/photo/my-images/839/gpu.gif/


So far the only PDL throttle I've encountered is in 3DMark11. Crysis 2 pulled my card back to stock clocks as well (630MHz @ 0.913v).


----------



## Fallendreams

Quote:


> Originally Posted by *RagingCain;14994889*
> Looks like I will let you know how it is from the Red camp, got a 6990... eyeballing a 6970 right for trifire.
> 
> I am already dreading catalyst...


Let us know if you see a huge performance difference, and how smooth the gameplay is.

Sent from my HTC Aria using Tapatalk


----------



## mojobear

Thx for the reply Smo. Your signature shows 0.988 volts - is that with changing voltage setting 2 in NiBiTor 6.03 and no other settings?


----------



## dvanderslice

Quote:



Originally Posted by *RagingCain*


Looks like I will let you know how it is from the Red camp, got a 6990... eyeballing a 6970 right for trifire.

I am already dreading catalyst...


I just installed and configured a 6990 CrossFire setup a couple of weeks ago for a friend, and Catalyst is indeed something to dread. I hadn't used it since I had a Radeon 9800 many moons ago. It really was kind of weird the way CrossFire is controlled in it. Once I got past that and did some overclocking, the thing was running like a beast. Of course, I just set it up and didn't get much of a chance to mess around with it for more than a few hours.

Would like to hear how your setup goes and your overall opinion in comparison to the Green camp... as most comparisons are from biased users or from reviews where you don't feel like you're getting a cut-and-dried comparison.


----------



## Smo

Quote:



Originally Posted by *mojobear*


Thx for the reply Smo. Your signature shows 0.988 volts - is that with changing voltage setting 2 in NiBiTor 6.03 and no other settings?


I'm running the standard BIOS for the moment mate - just overvolting with driver set 280.19 (although I've since dropped the voltage to 0.975v and am testing that at the moment).


----------



## MKHunt

Quote:



Originally Posted by *Smo*


I'm running the standard BIOS for the moment mate - just overvolting with driver set 280.19 (although I've since dropped the voltage to 0.975v and am testing that at the moment).


Have you tried the 285.27 drivers yet? I wonder if the performance increase is mostly due to the SLI profiles? If I make you another dump would you use them and run some comparo benches?









Love the perf of 285.27, but miss the OC ability.


----------



## rush2049

I am running 285.27

So far the only major issue I'm having is in Magicka - it randomly crashes to desktop... and I had a random bluescreen while watching YouTube... other than that, it's a fairly nice performance boost...


----------



## Smo

Quote:



Originally Posted by *MKHunt*


Have you tried the 285.27 drivers yet? I wonder if the performance increase is mostly due to the SLI profiles? If I make you another dump would you use them and run some comparo benches?









Love the perf of 285.27, but miss the OC ability.


I haven't tried them for that exact reason mate - useless for tweaking and stability testing! I'm definitely willing to try your profiles if you export them bud. What are you thinking of comparing? 3D11?


----------



## MKHunt

Quote:


> Originally Posted by *Smo;15004350*
> I haven't tried them for that exact reason mate - useless for tweaking and stability testing! I'm definitely willing to try your profiles if you export them bud. What are you thinking of comparing? 3D11?


Righto! Profiles

Yeah 3D11. I've been feeling the benching bug. If these profiles provide the boost but let me voltage tweak... Oh man.


----------



## kiki31

Hi there, I have a question to ask.
I'm getting a 3D TV in a few days (Samsung UE46D6530) and will be using it for 3D games and movies. What kind of connection should I use between these two beauties?
I've also got a Mini DisplayPort adapter and a DVI to HDMI adapter (do I need to buy an HDMI 1.4 cable?). I guess DVI to HDMI will be good, but what about audio?
Or is it better to buy a Mini DisplayPort to HDMI adapter?

Also, I'll connect my Z5500 to the TV via optical, so I guess I'll get audio that way?

Is this a good way to connect these things?

Thanks in advance


----------



## Smo

Quote:


> Originally Posted by *MKHunt;15007249*
> Righto! Profiles
> 
> Yeah 3D11. I've been feeling the benching bug. If these profiles provide the boost but let me voltage tweak... Oh man.


Cheers dude - I'll take a look at the profiles when I get home from work, see if there are any obvious differences.
Quote:


> Originally Posted by *kiki31;15009642*
> Hi there, I have a question to ask.
> I'm getting a 3D TV in a few days (Samsung UE46D6530) and will be using it for 3D games and movies. What kind of connection should I use between these two beauties?
> I've also got a Mini DisplayPort adapter and a DVI to HDMI adapter (do I need to buy an HDMI 1.4 cable?). I guess DVI to HDMI will be good, but what about audio?
> Or is it better to buy a Mini DisplayPort to HDMI adapter?
> 
> Also, I'll connect my Z5500 to the TV via optical, so I guess I'll get audio that way?
> 
> Is this a good way to connect these things?
> 
> Thanks in advance


I temporarily used my 3D TV as a monitor (Samsung UE40D6000) while I was waiting for my XL2410T's to arrive. I used a DVI to HDMI adapter. The sound played through the TV as normal once I selected it as default through the Control Panel.

However, I must say that playing games on the TV was absolutely horrible. The input lag was very noticeable and, due to the low pixel density, the picture was blurry as hell. The difference on my monitors is like night and day - I will never use a TV as a monitor again.


----------



## MKHunt

Quote:



Originally Posted by *Smo*


Cheers dude - I'll take a look at the profiles when I get home from work, see if there are any obvious differences.


Ugh I just compared the data in the profile pull and the data for 3DMark11 is the same as the 280.26 profiles I posted back when.

Is there any way to driver mod?


----------



## kiki31

@Smo thx for the fast reply.
I did a little bit of searching on adapters, and I guess the best option for me is to buy a Mini DisplayPort to HDMI adapter + a 1.4 HDMI cable,
and hook my Z5500 up to the TV via optical cable.


----------



## Smo

Quote:



Originally Posted by *MKHunt*


Ugh I just compared the data in the profile pull and the data for 3DMark11 is the same as the 280.26 profiles I posted back when.

Is there any way to driver mod?


The only other thing I can suggest mate is to export through NVIDIA Inspector rather than using the profile dumping tool - I believe it's a more comprehensive profile rather than just an SLI fix. Or is that what you did?

Quote:



Originally Posted by *kiki31*


@Smo thx for the fast reply.
I did a little bit of searching on adapters, and I guess the best option for me is to buy a Mini DisplayPort to HDMI adapter + a 1.4 HDMI cable,
and hook my Z5500 up to the TV via optical cable.


You should be fine with the right DVI to HDMI adapter. You only need the one adapter from the 590 to the HDMI cable, then the HDMI cable directly to the TV. Just be sure that the adapter is dual-link (no gaps in the pins). The sound should work as normal without the need for digital out.

That's how I ran my setup for a while, at least, and it was fine.


----------



## Recipe7

ASUS and EVGA GTX 590s seem to be in stock at TigerDirect, for those who want to pick one up. Wish I could get a second =p.

http://www.tigerdirect.com/applications/SearchTools/search.asp?keywords=gtx+590


----------



## MKHunt

Quote:


> Originally Posted by *Smo;15010695*
> The only other thing I can suggest mate is to export through NVIDIA Inspector rather than using the profile dumping tool - I believe it's a more comprehensive profile rather than just an SLI fix. Or is that what you did?


Not what I did, but NVIDIA Inspector revealed that the 3DMark11 profile for 285.27 is all default (global) settings, with these exceptions:

Heading: SLI
NVIDIA predefined number of GPUs to use on SLI rendering mode on DX10
SLI_PREDEFINED_GPU_COUNT_DX10_FOUR
0x00000004

NVIDIA predefined SLI mode on DirectX 10
SLI_PREDEFINED_MODE_DX10_FORCE_AFR
0x00000002

Heading: Unknown
MULTICHIP_DX10_RENDERING_MODE (0x00A06964)
0x080000F5 (Max Payne 3, Stone Giant demo, Ocean Demo, Unigine: Tropics demo, Unigine Engine, Unigine: Sanctuary demo, 3DMark11, Oil Rush, Unigine: Heaven demo, Ladybug Demo, Mecha Demo, Metro 2033)
0x080000F5

Heading: Undefined
0x00161903 (1 Profiles)
0x00000001 (3DMark11)
0x00000001

The only hope might be to unlock voltage control from within the drivers. Might have to try begging people in the know on tweakforums or whoever does the Xtreme-G drivers.

Heaven has some interesting settings under 'Unknown'. Could this be why Heaven doesn't seem to suffer from PDL or other throttling at the same time 3DM11 does?
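For anyone comparing these profile dumps by hand, the raw DWORD values decode mechanically. A quick illustrative Python sketch - the names and meanings below just restate the dump in this post; nothing here talks to the driver:

```python
# Illustrative only: these name/value pairs restate the NVIDIA Inspector
# dump above; the "decoding" is plain integer formatting, not a driver call.

SLI_SETTINGS = {
    "SLI_PREDEFINED_GPU_COUNT_DX10": (0x00000004, "render on 4 GPUs (quad SLI)"),
    "SLI_PREDEFINED_MODE_DX10":      (0x00000002, "force AFR"),
    "MULTICHIP_DX10_RENDERING_MODE": (0x080000F5, "multi-GPU mode shared with Heaven, Metro 2033, etc."),
}

def describe(settings):
    """One line per setting: name, 8-digit hex value, decoded meaning."""
    return [f"{name} = 0x{value:08X}  ->  {meaning}"
            for name, (value, meaning) in settings.items()]

for line in describe(SLI_SETTINGS):
    print(line)
```

Lining dumps up this way makes it obvious when two driver versions ship the same profile bits under different headings.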


----------



## Fallendreams

So how well do you think our 590s are going to play BF3 - Ultra @ 1080p?

Sent from my HTC Aria using Tapatalk


----------



## emett

Yeah, that's what I'm thinking/hoping.


----------



## Smo

Quote:



Originally Posted by *MKHunt*


Not what I did, but NVIDIA Inspector revealed that the 3DMark11 profile for 285.27 is all default (global) settings, with these exceptions:

Heading: SLI
NVIDIA predefined number of GPUs to use on SLI rendering mode on DX10


> SLI_PREDEFINED_GPU_COUNT_DX10_FOUR
> 0x00000004


NVIDIA predefined SLI mode on DirectX 10


> SLI_PREDEFINED_MODE_DX10_FORCE_AFR
> 0x00000002


Heading: Unknown
MULTICHIP_DX10_RENDERING_MODE (0x00A06964)


> 0x080000F5 (Max Payne 3, Stone Giant demo, Ocean Demo, Unigine: Tropics demo, Unigine Engine, Unigine: Sanctuary demo, 3DMark11, Oil Rush, Unigine: Heaven demo, Ladybug Demo, Mecha Demo, Metro 2033)
> 0x080000F5


Heading: Undefined
0x00161903 (1 Profiles)
0x00000001 (3DMark11)
0x00000001

The only hope might be to unlock voltage control from within the drivers. Might have to try begging people in the know on tweakforums or whoever does the Xtreme-G drivers.

Heaven has some interesting settings under 'Unknown'. Could this be why Heaven doesn't seem to suffer from PDL or other throttling at the same time 3DM11 does?

Interesting - can you post up what's contained within the Heaven profile too, please?

Quote:



Originally Posted by *Fallendreams*


So how well do think our 590 is going to play bf3. Ultra @ 1080p ?

Sent from my HTC Aria using Tapatalk


Impossible to say really, but I think it will laugh at it.


----------



## Fallendreams

Quote:



Originally Posted by *Smo*


Interesting - can you post up what's contained within the Heaven profile too, please?

Impossible to say really, but I think it will laugh at it.


What, BF3 laughing at the GTX 590, or the GTX 590 laughing at BF3?

I can't tell...
I feel like I should never have gotten rid of my GTX 580 SLI sometimes... but both cards were plagued with microstuttering. No matter what machine or config, both cards would stutter badly. I decided to sell the brand new one and sell the other after an RMA... the funny thing is I bought the GTX 580 when they first came out. The Black Ops edition ran fine... until I threw the other one in there. IDK.

Sorry for the /rant. Here's the problem I was having, if you care: GTX 580

I love my 590 - gameplay has always been smooth. I knew I would take a 10-20% decrease switching, but I'd take 120fps that's smooth over 200fps that stutters any day.

Just worried this card will get sucker punched by BF3. The alpha didn't do it, but... I'll just have to find out next week.


----------



## MKHunt

Quote:



Originally Posted by *Smo*


Interesting - can you post up what's contained within the Heaven profile too, please?


I was going to do more typing and formatting, but this was easier for both of us.


----------



## L D4WG

Hey guys, I asked a while back about audio, but my setup has changed, so can you take a look for me again?

I have a Mini DP to HDMI cable connecting my GTX 590 to my HDTV.
One cable, no adapters...

And for the life of me my PC won't output audio to the TV...

I can't see anything in Control Panel\\ Sounds.

NVIDIA Control Panel\\ Set up digital audio just points to Control Panel\\ Sounds.

I reinstalled the 285.27 drivers, doing a clean install.

I reinstalled my mobo's onboard sound drivers (just to see).

Any ideas?


----------



## Kenji

So I really, really want a GTX 590, but I hear of microstuttering issues. I need it to be smooth because I am using a 120Hz monitor.


----------



## Masked

Quote:



Originally Posted by *Kenji*


So I really, really want a GTX 590, but I hear of microstuttering issues. I need it to be smooth because I am using a 120Hz monitor.


580s in SLI have a bigger gap, and the 6990s also have issues with MS.

In English: no matter what you buy, it's going to stutter... The benefit of having two cores on one card is that communication between the cores and the system is streamlined - thus, less stuttering...

If you can actually find the 590, I'd buy a 590, but it really doesn't matter what Hz your monitor is at, because whatever you buy is going to stutter to some extent.


----------



## emett

The stuttering is pretty much a nonissue for me. I don't even notice it, tbh. Smoothest stutter I've never seen.


----------



## Kenji

Quote:



Originally Posted by *Masked*


580s in SLI have a bigger gap, and the 6990s also have issues with MS.

In English: no matter what you buy, it's going to stutter... The benefit of having two cores on one card is that communication between the cores and the system is streamlined - thus, less stuttering...

If you can actually find the 590, I'd buy a 590, but it really doesn't matter what Hz your monitor is at, because whatever you buy is going to stutter to some extent.


I'm using a 5770 playing Call of Duty 4: MW with an FPS config at 120Hz at 125 FPS - no stutter. Now, if I were to use a GTX 590, would I have the stutter problem? I am a "pro" gamer - I mostly run things at low settings in MP and I can't afford to have stutter.


----------



## Masked

Quote:



Originally Posted by *Kenji*


I'm using a 5770 playing Call of Duty 4: MW with an FPS config at 120Hz at 125 FPS - no stutter. Now, if I were to use a GTX 590, would I have the stutter problem? I am a "pro" gamer - I mostly run things at low settings in MP and I can't afford to have stutter.


You have stutter, you just can't see it.

When I toss the AMD 5-series onto our big screen, there //is// stutter... Every card stutters, period. It's just a question of whether you notice it or not.

Cards in SLI like the 580s or 6970s tend to stutter because, like I mentioned, there's "communication lag".

With the dual-core cards there is less stutter but, again, on the big screen: stutter.

Will you see less stutter on the 590s? Yes... In every alpha/beta we've been running there is VERY LITTLE... But it's still there.

Hell, I had a 6990, for the first time yesterday, basically lag out in the middle of BF2... First time ever, but it happened.
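Since "you have stutter, you just can't see it" is the recurring theme here: microstutter is measurable, because it shows up as uneven frame pacing in a frametime log (e.g. a FRAPS frametimes dump). A rough sketch - the 1.5x jump threshold is my own arbitrary choice, not any standard metric:

```python
# Rough sketch of quantifying microstutter from a list of frame times (ms).
# The jump_factor threshold is an assumption for illustration, not a
# standard definition - it just flags big jumps between adjacent frames.

def stutter_ratio(frametimes_ms, jump_factor=1.5):
    """Fraction of frame-to-frame transitions where the frame time jumps
    by more than jump_factor x the previous frame's time."""
    jumps = sum(
        1 for prev, cur in zip(frametimes_ms, frametimes_ms[1:])
        if cur > prev * jump_factor
    )
    return jumps / max(len(frametimes_ms) - 1, 1)

# Perfectly even 60 fps pacing: nothing flagged.
smooth = [16.7] * 10
# Alternating 10 ms / 25 ms frames: the classic AFR microstutter pattern.
stuttery = [10, 25] * 5

print(stutter_ratio(smooth))    # 0.0
print(stutter_ratio(stuttery))  # 5 jumps over 9 intervals
```

Note both logs average well over 60 fps, which is exactly Masked's point: average FPS hides pacing problems that a per-frame metric exposes.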


----------



## Manac0r

I'm just curious. I have a chance to exchange my quad 590s for tri-SLI 580s (1.5GB VRAM). They would run on water cooling. I assumed having three cards running at higher clock speeds would be better? Less stuttering, as only three GPUs need to cycle/sync (which I can really notice on the 590s in certain games). Also, if a game does have stuttering, I can deactivate SLI and run it well on one overclocked 580.

I do love my quad 590s, but I'm wondering if the tri-SLI setup would be better - check out my sig for the rest of the components.

I'm scratching my head on this one, and I'm under a time limit as I need to give an answer in 12 hours.

Also, what kind of overclocks can you expect on tri 580s? I had my 590s at 720 core and 1900 mem.

Should I keep the quad or go for the brute force of the tri?

Time is ticking


----------



## Masked

Quote:



Originally Posted by *Manac0r*


I'm just curious. I have a chance to exchange my quad 590s for tri-SLI 580s (1.5GB VRAM). They would run on water cooling. I assumed having three cards running at higher clock speeds would be better? Less stuttering, as only three GPUs need to cycle/sync (which I can really notice on the 590s in certain games). Also, if a game does have stuttering, I can deactivate SLI and run it well on one overclocked 580.

I do love my quad 590s, but I'm wondering if the tri-SLI setup would be better - check out my sig for the rest of the components.

I'm scratching my head on this one, and I'm under a time limit as I need to give an answer in 12 hours.

Also, what kind of overclocks can you expect on tri 580s? I had my 590s at 720 core and 1900 mem.

Should I keep the quad or go for the brute force of the tri?

Time is ticking










No.

2x 590s = roughly 3x 580s...

Once the roof on OCing opens up (which, we're constantly assured, will be "soon")... then there really won't be any competition.

The only issue is one of patience, and the fact that you'll be murdering your delta.

With 3x 580s it will nearly double, so better get ready to add another rad.

I'd stick with the 2x 590s. They're in very limited supply, and my understanding of the current stock on TG & Newegg is that it's unused RMA stock... so you really don't have anything to worry about in that regard.

However, the 590s have been going consistently for $1k on eBay... I've now seen 4 auctions hit $900+ per card... so they're definitely in demand.


----------



## Wogga

What do you mean about the OC roof? -_-
Very interesting!
BTW, thoughts on MARS II BIOS flashing: that guy who flashed a 580 BIOS onto a 590 - the 580 device ID (1080, I assume) differs from the 590 device ID (1088). The MARS II device ID is also different (lol, no wonder: 108D). So... flashing a MARS II BIOS would do the same thing - phase noise, instability even at idle, and random BSODs. Just a theory; I'm really afraid to kill my 590 =)


----------



## Masked

Quote:


> Originally Posted by *Wogga;15026124*
> What do you mean about the OC roof? -_-
> Very interesting!
> BTW, thoughts on MARS II BIOS flashing: that guy who flashed a 580 BIOS onto a 590 - the 580 device ID (1080, I assume) differs from the 590 device ID (1088). The MARS II device ID is also different (lol, no wonder: 108D). So... flashing a MARS II BIOS would do the same thing - phase noise, instability even at idle, and random BSODs. Just a theory; I'm really afraid to kill my 590 =)


Honestly, I wonder if any of you //actually// read this thread.

The 590 has an OC limit due to driver limitations - thus the "OC roof", because it is a roof until they decide to change it.

There are ways to bypass the voltage issues and overclock the card, but then you're still regulated and controlled by the drivers... so overclocking the card, unless you're just benching, is a giant waste of time.

We have two samples of the MARS II in the office and, IMHO, they're a tremendous waste of money... If they had lowered the price a bit, okay... but they didn't, and thus it's truly just an overpriced paperweight with severe driver issues.

I wouldn't flash anything onto that card or a 590 unless it's the voltage tweak that Cain worked out.


----------



## Wogga

I've read =)
Just wondering if there'll be any new way to bypass PDL in the near future.


----------



## Razi3l

Hey, so I put my 590 back in today after not using it for a while, and it now only detects one GPU. I've tried a few different driver versions and nothing seems to fix it. Anyone got an idea of what the problem may be?


----------



## MKHunt

Quote:


> Originally Posted by *Masked;15026173*
> _-snip-_So, overclocking the card, unless you're just benching, is a giant waste of time.
> .


QFT. I fully support OCing the 590 to see what clocks you can bench at, but for everyday use? 630 core / 1728 memory.









I've been considering using the DP instead of DVI just for snits and giggles. Any advantages whatsoever? The monitor only has HDMI and VGA in so I'd use EVGA's little doodad.

The only difference I can think of is sound to the monitor speakers... which is actually a disadvantage since I have halfway decent speakers... Hmmm.


----------



## Tept

I run 735 core / 1800 memory now for daily use, and I do notice a difference in Deus Ex and SC2 over 630 core - it's like 6-8fps in some dramatic powerhouse sections, but it's there. 6-8fps can take you from almost smooth to butter smooth. Just sayin'.
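To put a "6-8fps matters" claim in frame-time terms, the arithmetic is just 1000 / fps; the example fps values below are mine, picked to illustrate a jump near the 60Hz mark:

```python
# Plain arithmetic: frame time in milliseconds is 1000 / fps. A bump from
# 54 to 60 fps saves almost 2 ms per frame, which is why a small fps gain
# can feel like the step from "almost smooth" to "butter smooth".

def frametime_ms(fps):
    return 1000.0 / fps

for fps in (54, 60):
    print(f"{fps} fps -> {frametime_ms(fps):.1f} ms per frame")
```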


----------



## Kenji

I want to get an EVGA Classified; however, they are not released in Aus. Does EVGA have an international warranty?


----------



## Fallendreams

Quote:



Originally Posted by *Razi3l*


Hey, so I put my 590 back in today after not using it for a while, and it now only detects one GPU. I've tried a few different driver versions and nothing seems to fix it. Anyone got an idea of what the problem may be?










Does Afterburner see two GPUs? Or can NVIDIA Inspector or GPU-Z?

Sent from my HTC Aria using Tapatalk


----------



## RagingCain

Quote:



Originally Posted by *Razi3l*


Hey, so I put my 590 back in today after not using it for a while, and it now only detects one GPU. I've tried a few different driver versions and nothing seems to fix it. Anyone got an idea of what the problem may be?










This happened to me one time on one card. Can you run me through what you did when installing it and which DVI cable is plugged in where, and triple-check the PCI-E 6/8-pin connectors?

My issue was that the +2 pin on the 6+2 wasn't all the way in, but it still required me to completely wipe all the old drivers before it would detect.

Is that the PSU you are using?


----------



## bah73

Hi, was hoping you guys could help. I just downloaded GPU-Z and noticed my card is running two different BIOSes. I downloaded a BIOS from the ASUS website and installed it - did I do something wrong, or is it supposed to be like this? Please see pics, thanks.


----------



## Smo

Quote:



Originally Posted by *bah73*


Hi, was hoping you guys could help. I just downloaded GPU-Z and noticed my card is running two different BIOSes. I downloaded a BIOS from the ASUS website and installed it - did I do something wrong, or is it supposed to be like this? Please see pics, thanks.


It's perfectly normal mate - nothing to worry about!


----------



## Manac0r

Keeping my 590's...


----------



## Manac0r

Quote:



Originally Posted by *Tept*


I run 735 core / 1800 memory now for daily use, and I do notice a difference in Deus Ex and SC2 over 630 core - it's like 6-8fps in some dramatic powerhouse sections, but it's there. 6-8fps can take you from almost smooth to butter smooth. Just sayin'.


Just wondering how you got 0.988v on the 590s. I'm not the best with NiBiTor, hence me always hassling RagingCain. I did use the previous driver that unlocked full voltage, but found it more unstable when gaming.


----------



## Fallendreams

Quote:


> Originally Posted by *Manac0r;15035646*
> Just wondering how you got 0.988v on the 590s. I'm not the best with NiBiTor, hence me always hassling RagingCain. I did use the previous driver that unlocked full voltage, but found it more unstable when gaming.


Did you follow his guide ?

Sent from my HTC Aria using Tapatalk


----------



## Smo

Quote:


> Originally Posted by *Manac0r;15035646*
> Just wondering how you got 0.988v on the 590s. I'm not the best with NiBiTor, hence me always hassling RagingCain. I did use the previous driver that unlocked full voltage, but found it more unstable when gaming.


IIRC you won't be able to reach voltages as high as 0.988v without driver set 280.19 or one of the earlier 26X.XX sets. Even with a custom BIOS flash, the drivers will throttle the voltage.


----------



## anhweezy

http://valid.canardpc.com/show_oc.php?id=2012998

Let me know what you guys think about my build! Just put it together yesterday! PLEASE ADD ME TO THE CLUB!!!


----------



## Smo

Nice looking build dude - welcome to the club! My only criticisms would be that the rad fans could do with sleeving and that the PSU is massively overkill - are you planning SLI or watercooling?


----------



## Arizonian

Quote:


> Originally Posted by *Smo;15039269*
> Nice looking build dude - welcome to the club! My only criticisms would be that the rad fans could do with sleeving and the PSU is massively overkill, are you planning SLi or watercooling?


I got a chuckle from you saying "massively overkill" coming from this thread. I love you guys.....lol.









Awesome rig btw Ahnweezy.


----------



## anhweezy

Thanks guys. I will SLI the card, but not anytime soon. I just got the PSU because I got it for $200 brand new from a friend who works at Fry's. So why not, right? And what do you mean by the rad fans?


----------



## MKHunt

Quote:


> Originally Posted by *Smo;15039269*
> Nice looking build dude - welcome to the club! My only criticisms would be that the rad fans could do with sleeving and the PSU is massively overkill, are you planning SLi or watercooling?


Oh man I was about to make the same criticism in regards to the PSU then I read Arizonian's post. There was a little popping sound, then the static had cleared and I was (mostly) grounded in reality again.
Quote:


> Originally Posted by *anhweezy;15040511*
> Thanks guys. I will SLI the card, but not anytime soon. I just got the PSU because I got it for $200 brand new from a friend who works at Fry's. So why not, right? And what do you mean by the rad fans?


That is a helluva deal on that PSU. I paid ~$190 for the AX850 (free 2-day shipping though). I like your build. I wish water blocks could somehow preserve the GeForce glowness without being HydroCopper.

What Smo means is sleeving the cables to your H100 fans. They're awfully exposed


----------



## krazyatom

Oh man.. I am the only GTX 590 owner with a 750-watt PSU, lol


----------



## Recipe7

I'm the only one with a 650-watt PSU.


----------



## emett

Haha, living on the edge - how many HDDs and disc drives do you have?


----------



## rush2049

I have the AX1200 as well, and a gtx 590.....

but I also have a gtx 275, 5 HD's and a DVD drive....

and everything is massively overclocked....


----------



## Recipe7

Quote:



Originally Posted by *emett*


haha living on the edge, how many hdd's and disc drives do you have?


one SSD
one 750GB HD
one optical drive
9 case fans

Running 600W, more or less. A few feet from the edge.


----------



## Fallendreams

Quote:



Originally Posted by *anhweezy*


http://valid.canardpc.com/show_oc.php?id=2012998

Let me know what you guys think about my build! Just put it together yesterday! PLEASE ADD ME TO THE CLUB!!!


Clean! Welcome to the club !

Sent from my HTC Aria using Tapatalk


----------



## anhweezy

I have an OCZ Vertex 3 120GB SSD and a WD 1TB. Planning to RAID my SSD because I put all my CS5 and photography stuff on it.


----------



## anhweezy

Quote:



Originally Posted by *MKHunt*


Oh man I was about to make the same criticism in regards to the PSU then I read Arizonian's post. There was a little popping sound, then the static had cleared and I was (mostly) grounded in reality again.

That is a helluva deal on that PSU. I paid ~$190 for the AX850 (free 2-day shipping though). I like your build. I wish water blocks could somehow preserve the GeForce glowness without being HydroCopper.

What Smo means is sleeving the cables to your H100 fans. They're awfully exposed










Yea man, I'm waiting for my fan controller to come in, then I'll switch them over to that. I just have so many fans in my computer that I don't have any more places to put them. I'll re-upload a picture once everything is in.


----------



## anhweezy

Quote:



Originally Posted by *Fallendreams*


Clean! Welcome to the club !

Sent from my HTC Aria using Tapatalk


Thanks man! Sorry to ask this - it's extremely noobie - but how do I add that I'm in the GTX 590 club at the bottom like everyone else does? I've only been a member on Overclock.net for like 2 days. If anyone can help me out that would be awesome! THANKS!


----------



## Semedar

How big of a PSU would someone need if he/she has 2x GTX 590s on air with an i7 960 OC'd to 4GHz? Would a Corsair AX850 be enough? And yes, I'm talking about myself.


----------



## RobotDevil666

Hey there, here's my ASUS GTX 590







at stock for now - add me plz.

http://valid.canardpc.com/show_oc.php?id=2014429


----------



## anhweezy

Quote:


> Originally Posted by *Semedar;15047037*
> How big of a PSU would someone need if he/she has 2x GTX 590s on air with an i7 960 OC'd to 4GHz? Would a Corsair AX850 be enough? And yes, I'm talking about myself.


I would go for a 1200w psu.


----------



## RobotDevil666

Quote:


> Originally Posted by *Semedar;15047037*
> How big of a PSU would someone need if he/she has 2x GTX 590s on air with an i7 960 OC'd to 4GHz? Would a Corsair AX850 be enough? And yes, I'm talking about myself.


1000W will do the job nicely; even 900W should suffice (depending on the rest of the rig). No need for 1200W unless you have cash to burn / like overkill.









Take a look here:
http://www.overclock.net/power-supplies/1045231-phaedrus-quickndirty-psu-calculator.html
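To put the 850W/1000W/1200W answers in context, here's a back-of-envelope estimate. The 365W board power (GTX 590) and 130W TDP (i7 960) are the published specs; the overclock uplift, "rest of rig" draw, and headroom factor are my own rough assumptions, so treat the result as a ballpark:

```python
# Ballpark PSU sizing for 2x GTX 590 + an OC'd i7 960. GPU board power and
# CPU TDP are published specs; the uplift, rest-of-rig, and headroom numbers
# are rough assumptions for illustration, not measurements.

GTX_590_TDP_W = 365          # per card, NVIDIA board power spec
CPU_TDP_W = 130              # i7 960 stock TDP
CPU_OC_UPLIFT = 1.3          # assumed ~30% extra draw at 4GHz
REST_OF_RIG_W = 100          # assumed: board, RAM, drives, fans
HEADROOM = 1.2               # ~20% margin so the PSU isn't at its limit

def recommended_psu_watts(num_gpus=2):
    load = num_gpus * GTX_590_TDP_W + CPU_TDP_W * CPU_OC_UPLIFT + REST_OF_RIG_W
    return load, load * HEADROOM

load, rec = recommended_psu_watts()
print(f"Estimated load: {load:.0f}W, suggested PSU: {rec:.0f}W")
```

With these assumptions the estimate lands right around the 1000-1200W range suggested above, and comfortably above an AX850.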


----------



## rush2049

Just a quick update for everyone: the latest 285.27 beta driver has some major problems.

I get a BSOD or a hard freeze in L4D2 after/around the 1-hour mark...
In Magicka I get the same thing - either a hard freeze or a BSOD.

It seems OpenGL-based games BSOD/freeze more often...


----------



## Smo

Quote:



Originally Posted by *Arizonian*


I got a chuckle from you saying "massively overkill" coming from this thread. I love you guys.....lol.









Awesome rig btw Ahnweezy.










Haha, I guess you're right! After all, I can't really talk - I have an AX1200 too!

Quote:



Originally Posted by *anhweezy*


Thanks guys. I will SLI the card, but not anytime soon. I just got the PSU because I got it for $200 brand new from a friend who works at Fry's. So why not, right? And what do you mean by the rad fans?


Awesome deal mate, mine was a tad more expensive than that! As MKHunt has said, if your H100 fans were sleeved in black it would look awesome









Quote:



Originally Posted by *anhweezy*


Thanks man! Sorry to ask this - it's extremely noobie - but how do I add that I'm in the GTX 590 club at the bottom like everyone else does? I've only been a member on Overclock.net for like 2 days. If anyone can help me out that would be awesome! THANKS!


No worries dude, we all start somewhere! Click on 'Quick Links' under the OCN logo, then select 'Edit Signature'.

Quote:



Originally Posted by *Semedar*


How big of a PSU would someone need if he/she has 2x GTX 590s on air with an i7 960 OC'd to 4GHz? Would a Corsair AX850 be enough? And yes, I'm talking about myself.










As has been stated, I think a decent 1000W PSU would be the safest bet. NVIDIA recommends a 700W PSU (although I realise they have to be conservative when it comes to giving out advice). Possibly more if you want case lighting, lots of HDDs/SSDs, extensive watercooling, etc.

Quote:



Originally Posted by *RobotDevil666*


Hey there here's my ASUS GTX590







at stock for now add me plz.

http://valid.canardpc.com/show_oc.php?id=2014429




















Nice build dude - welcome to the club!

Quote:



Originally Posted by *rush2049*


Just a quick update for everyone: the latest 285.27 beta driver has some major problems.

I get a BSOD or a hard freeze in L4D2 after/around the 1-hour mark...
In Magicka I get the same thing - either a hard freeze or a BSOD.

It seems OpenGL-based games BSOD/freeze more often...


Ouch - cheers for the heads up though dude.


----------



## rush2049

Quote:



Originally Posted by *RobotDevil666*


Hey there here's my ASUS GTX590







at stock for now add me plz.

http://valid.canardpc.com/show_oc.php?id=2014429




















Hope that front fan doesn't counteract the GTX 590...


----------



## RobotDevil666

Quote:



Originally Posted by *rush2049*


hope that front fan doesn't counteract the gtx 590.....










See, that's what I was wondering about too, mate.
I didn't alter the fans in any way, so the front fan is acting as an intake. Given the GTX 590's construction and the fact that it exhausts hot air at the end, that might not be a good idea.
And my temps are not so great, to be honest: I see 86/85 in BFBC2 and 91/93 in The Witcher 2 or Heaven 2.5.
I expected a bit better. For example, my friend with his GTX 590 maxes out at 85C after a couple of loops of Heaven 2.5. Granted, he has a HAF-X case, but still, I expected a bit better.

EDIT:
So I did the ol' switcheroo and it did improve the temps a little: after 3/4 loops of Heaven the max was 89C, though it sat at 88C most of the time once it stabilised, so I think it was worth it.
My concern now is the HDD rack and how this will affect the HDDs' temps. I was thinking about turning the top fan around, but that would probably just dump all the heat from my Noctua onto the 590, so I passed.
Any input on how to improve the temps a little, without getting rid of the side window (swapping it for the mesh + fan) or turning my PC into a leaf blower, would be greatly welcome.


----------



## MKHunt

Quote:



Originally Posted by *RobotDevil666*


See, that's what I was wondering about too, mate.
I didn't alter the fans in any way, so the front fan is acting as an intake. Given the GTX 590's construction and the fact that it exhausts hot air at the end, that might not be a good idea.
And my temps are not so great, to be honest: I see 86/85 in BFBC2 and 91/93 in The Witcher 2 or Heaven 2.5.
I expected a bit better. For example, my friend with his GTX 590 maxes out at 85C after a couple of loops of Heaven 2.5. Granted, he has a HAF-X case, but still, I expected a bit better.

EDIT:
So I did the ol' switcheroo and it did improve the temps a little: after 3/4 loops of Heaven the max was 89C, though it sat at 88C most of the time once it stabilised, so I think it was worth it.
My concern now is the HDD rack and how this will affect the HDDs' temps. I was thinking about turning the top fan around, but that would probably just dump all the heat from my Noctua onto the 590, so I passed.
Any input on how to improve the temps a little, without getting rid of the side window (swapping it for the mesh + fan) or turning my PC into a leaf blower, would be greatly welcome.


My 590 was dumping heat less than one inch from my HDD, and it was getting pulled across and out. Using HDTune I only saw a 1C increase. I wouldn't really worry about it that much.


----------



## anhweezy

aa


----------



## RobotDevil666

The results I got before were wrong.
After testing both ways to mount the fan, I got these results:

Front fan as intake:
GPU: Heaven 2.5 (3/4 runs) 89/89; BFBC2 86/86 max, 85 avg
CPU: BFBC2 83/81/81/79 max

Front fan as exhaust:
GPU: Heaven 2.5 (3/4 runs) 88/88; BFBC2 85/86 max, 83 avg
CPU: BFBC2 76/75/73/73 max

So you can see that with the fan as an exhaust my GPU temps are marginally better (1-2C), and the CPU temps are much better than with the fan as an intake. So it turns out that, even if it's counter-intuitive, the front fan is better as an exhaust than as an intake.
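For what it's worth, the deltas are easy to eyeball with a tiny script; this just diffs the max figures transcribed from the numbers above:

```python
# Diff of the two fan orientations, using the max temperatures (°C)
# reported above.
results = {
    "intake":  {"gpu_heaven": 89, "gpu_bfbc2": 86, "cpu_max": 83},
    "exhaust": {"gpu_heaven": 88, "gpu_bfbc2": 85, "cpu_max": 76},
}

for metric in results["intake"]:
    delta = results["intake"][metric] - results["exhaust"][metric]
    print(f"{metric}: exhaust runs {delta}C cooler")
```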


----------



## Shenta

Quote:


> Originally Posted by *RagingCain;14977438*
> Well you are wrong there. It uses all 4. It's just not optimized.
>
> Metro 2033 is one example. The Witcher 2 with all its eye candy, AA, and Ubersampling is another that can kill framerates. This is all at 1920x1080. I can't imagine our cards running these same titles with max settings at 2560x1600 very well. F1 2010 is another such title, one that benefits greatly from 4x 580s. Something 2x 590s don't match up to, unfortunately.
>
> Of course, that is if you need more than 60 fps, like I do.


We should seriously be pissed off at these devs. They're HORRIBLE.
Look at it: our card completely DESTROYS consoles, but look how well devs make games work on them. Our cards have SOOOO much more potential, seriously. We should have the 590 running BF3 / Crysis 2 completely maxed out at 240fps. Devs are just ******* horrible at working on the PC side because the sales are not there.


----------



## Smo

Quote:


> Originally Posted by *Shenta;15057743*
> We should seriously be pissed off at these devs. They're HORRIBLE.
> Look at it: our card completely DESTROYS consoles, but look how well devs make games work on them. Our cards have SOOOO much more potential, seriously. We should have the 590 running BF3 / Crysis 2 completely maxed out at 240fps. Devs are just ******* horrible at working on the PC side because the sales are not there.


Absolutely right - the money is in consoles, and that's why developers skimp on optimisation for the PC. It's not necessarily their fault - they do what they're told to do. 90% of consumers would NEVER consider spending ~£700/$1000 on just a graphics card when that could buy them the latest console and ~30 games. On top of that, you have to consider the massive amount of money it costs to support a GPU of that calibre in terms of CPU, cooling, RAM, motherboard etc. Realistically, gaming PCs (at least at the high end) cost massive amounts of money for better graphics (which the majority of the population either wouldn't notice, or don't particularly care about).

There's no use slamming consoles though - they are absolutely fantastic. I totally love jumping on Xbox live with my mates and playing COD or GOW3 for hours. Totally love it. A slight graphics hit doesn't bother me in the slightest in that respect.

That said though - I'm crying out for the next boundary pusher for the PC. I can't wait for it! After all, I didn't spunk a load of money on my PC for nothing


----------



## emett

Two words... Battlefield 3


----------



## Smo

Quote:


> Originally Posted by *emett;15058332*
> Two words... Battlefield 3


Hell yes - that was the decider for me!


----------



## Fallendreams

http://store.origin.com/store/ea/html/pbPage.battlefield3_US_LE/?intcmp=eaint569

Origin says 570 now instead of 560. I wonder how well we're going to play now

Sent from my HTC Aria using Tapatalk


----------



## Smo

I still think the 590 will do very well in BF3 - we'll have to keep an eye on these supposed new drivers that NVIDIA are planning on releasing for the Beta.


----------



## MKHunt

Quote:


> Originally Posted by *Smo;15058032*
> -snip- After all, I didn't spunk a load of money on my PC for nothing


Coming from someone who lives across the pond, this phrasing is _awesome_. If I said that here I'd get extremely shady glances! XD
Quote:


> Originally Posted by *emett;15058332*
> Two words... Battlefield 3


Quote:


> Originally Posted by *Fallendreams;15058687*
> http://store.origin.com/store/ea/html/pbPage.battlefield3_US_LE/?intcmp=eaint569
> 
> Origin says 570 now instead of 560. I wonder how well we're going to play now
> 
> Sent from my HTC Aria using Tapatalk


If 3DMark11 is to be trusted, we have two of those going by scores. Conflicting things have been coming out of DICE: the PR reps say 2x 580s for ultra and 1x 560 for medium, but the engine devs say 1x 6970 will run everything on high with a few settings on ultra. The high specs are either marketing hype, or the engine is truly horrific.

My view is that if a 590 can't play this as well as The Witcher 2, then DICE should be more than ashamed of their engine. But it would be foolish of DICE to want that - if the game turns into a benchmark, sales will be much lower. I mean, how many people bought Crysis? Far fewer than those who played/benched it.

I do _know_ that 590 driver support will be... lacking. It always is


----------



## Sir_Gawain

Quote:



Originally Posted by *Smo*


I still think the 590 will do very well in BF3 - we'll have to keep an eye on these supposed new drivers that NVIDIA are planning on releasing for the Beta.


I think we will be OK too

http://www.pcgamer.com/2011/09/23/ba...ame=0&ns_fee=0


----------



## TerrabyteX

Add me to the club

Stock clocks atm Gigabyte subvendor.


----------



## Fallendreams

Quote:



Originally Posted by *TerrabyteX*


Add me to the club

Stock clocks atm Gigabyte subvendor.




Welcome man. Now go destroy BF3 with that card!


----------



## TerrabyteX

Thank you very much

I sure will! Can't wait for the beta to start on Tuesday


----------



## heyskip

Hi all. Long time reader, first time poster.

Can someone confirm whether I would be able to use the EVGA full-length backplate with an EK waterblock?


----------



## Sir_Gawain

Quote:


> Originally Posted by *Smo;15058721*
> I still think the 590 will do very well in BF3 - we'll have to keep an eye on these supposed new drivers that NVIDIA are planning on releasing for the Beta.


Here is the BF3 Beta driver

http://www.nvidia.com/object/win7-winvista-64bit-285.38-beta-driver.html


----------



## Masked

Today, I'm officially setting up the 580, making my rad-box and finishing this project 100%!

Very excited, today's a big day.

I think the addition of a 2nd card for surround will make a tremendous difference, especially when it comes to games like BF3.

Having the 590 dedicated to 1 monitor relieves so many driver issues that I can't wait to do this just on principle.

May have some stuff coming tomorrow, so I might have to wait till then

but, this week, that rad box is getting finished!!!


----------



## biltong

Hey, I don't have a 590 but I've got a friend that does and I've got a question for all of you running on air:

What sort of temps are you getting at stock clocks? Has anyone noticed any temperature differences between driver versions?


----------



## MKHunt

Quote:



Originally Posted by *heyskip*


Hi all. Long time reader, first time poster.

Can someone confirm whether I would be able to use the EVGA full-length backplate with an EK waterblock?


EK and Koolance work 100% with _any_ backplate for the 590. All other manufacturers' blocks should as well, but I can only verify those two from experience.

Quote:



Originally Posted by *biltong*


Hey, I don't have a 590 but I've got a friend that does and I've got a question for all of you running on air:

What sort of temps are you getting at stock clocks? Has anyone noticed any temperature differences between drivers versions?


I was seeing 84-85C after 5-10 loops of Heaven. Driver changes (275.33, 280.19, 280.26) only resulted in a 1-2C difference, if that. I did optimize for airflow, though (2x top exhaust, 1x front exhaust, 2x bottom intake).


----------



## RobotDevil666

Mine is at 88C after 3/4 loops of Heaven, but my friend has a 590 in a HAF-X and he maxes out at 85C, so it all depends on your case cooling.


----------



## rush2049

I max out at 85C, but idle at 40C
(I need to get better exhaust fans....)


----------



## TerrabyteX

I have a HAF X too, and at an ambient temp of 26-27 degrees Celsius I get 88-89C on load with auto fans. If I use MSI Afterburner and set the fan speed to 90% it drops to 72-75C! I need better fans too, and we all need watercooling for this beast.


----------



## RobotDevil666

Quote:



Originally Posted by *TerrabyteX*


I have a HAF X too, and at an ambient temp of 26-27 degrees Celsius I get 88-89C on load with auto fans. If I use MSI Afterburner and set the fan speed to 90% it drops to 72-75C! I need better fans too, and we all need watercooling for this beast.


Yeah, but the fan at 90% gets really loud, so it's a trade-off.


----------



## krazyatom

MSI Afterburner will not let me control the fan on my ASUS GTX 590.
I can only control the fans through the crappy ASUS Smart Doctor. Is anyone else having the same problem with an ASUS GTX 590?


----------



## Recipe7

New 285.38 drivers. http://www.techspot.com/drivers/driv...dia_geforce/0/

Anyone care to test? I would do them right this second, but my wife is enjoying her Grey's Anatomy episodes -_-


----------



## RobotDevil666

Quote:



Originally Posted by *krazyatom*


MSI Afterburner will not let me control the fan on my ASUS GTX 590.
I can only control the fans through the crappy ASUS Smart Doctor. Is anyone else having the same problem with an ASUS GTX 590?


It works fine for me - you just have to make your own fan profile, like this:

Mine is a 1% fan increase per 1C
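That kind of linear profile is easy to reason about. Here's a minimal sketch of the same idea; the base temperature and clamp limits are illustrative guesses, not anyone's actual Afterburner settings:

```python
# Linear fan curve sketch: +1% fan speed per +1 °C above a base
# temperature, clamped to sensible limits. Thresholds are made up
# for illustration.
MIN_FAN, MAX_FAN = 40, 100   # percent
BASE_TEMP = 40               # °C at which the curve starts ramping

def fan_speed(temp_c):
    """1% fan per 1C above BASE_TEMP, clamped to [MIN_FAN, MAX_FAN]."""
    return max(MIN_FAN, min(MAX_FAN, MIN_FAN + (temp_c - BASE_TEMP)))

for t in (35, 50, 70, 90, 110):
    print(t, fan_speed(t))
```

The clamp at the top keeps the card from going full leaf-blower, and the floor keeps idle airflow from dropping too low.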


----------



## krazyatom

Thanks, they're working!

Btw, before setting up an MSI Afterburner fan profile, my fan sometimes went to 95% even at idle.
I had a similar experience with different GPUs before (not all, but some).
I wonder why they have this issue.


----------



## RobotDevil666

Quote:


> Originally Posted by *krazyatom;15075494*
> Thanks, they're working!
>
> Btw, before setting up an MSI Afterburner fan profile, my fan sometimes went to 95% even at idle. I had a similar experience with different GPUs before (not all, but some). I wonder why they have this issue.


It's not really an issue - mine does it too. It suddenly ramps the fan up to 90% and then slowly reduces the speed back to normal, which kinda gives the GPU a burst of cool air.
I don't think it will do that on a custom fan profile, though.


----------



## krazyatom

Quote:


> Originally Posted by *RobotDevil666;15076424*
> It's not really an issue - mine does it too. It suddenly ramps the fan up to 90% and then slowly reduces the speed back to normal, which kinda gives the GPU a burst of cool air.
> I don't think it will do that on a custom fan profile, though.


Yes, I've never had a problem, I just got curious lol.
I'll set up a custom fan profile using MSI Afterburner.


----------



## anhweezy

Quote:


> Originally Posted by *biltong;15069815*
> Hey, I don't have a 590 but I've got a friend that does and I've got a question for all of you running on air:
> 
> What sort of temps are you getting at stock clocks? Has anyone noticed any temperature differences between driver versions?


I get 75-78C max running Prime95. It really depends on your case and airflow.


----------



## Shinobi Jedi

Okay, a little OT here, but does relate and I need to rant about it to intelligent people that can appreciate where I'm coming from.

I've been away for a while; I just spent a marathon 12-hour session finishing Gears of War 3 so I could give the copy to my once-a-month cleaning guy's 17-year-old son, as it was just his birthday and they can't afford the game at all.

I gave him an Xbox 360 and a 24" Acer LCD with HDMI so he and his sister could have an "escape" at home in their tiny one-room apartment while their parents go through a painful divorce. The kid has had his eye on it for a while, so I didn't want to make him wait, but I really wanted to play it and finish out the series. And since I can only afford one copy, I burned through it and will give it to him slightly used.

The game itself is cool. It's more enjoyable for the story aspects than the gameplay. Though if you're a fan of the series, they ably up the stakes all around, which makes the game still very exciting to play through even if the gameplay is familiar.

And while I'm not here to rag on consoles, in fact I actually quite enjoy them and all platforms of gaming-

But it wasn't until going back and playing a game on a console, after gaming only on PC since I rebuilt my rig last May, that I realized how stark the difference in quality is.

And I'm playing on what is arguably one of the finest home theater displays on the recent market - the discontinued 60" Pioneer Kuro Elite with a decent 7.1 surround sound setup. So I was playing in about as prime conditions a console player can play, save going huge on a projector.

Gears 3, while having a pretty good story, fun gameplay and solid graphics all due to art direction -

But DAYUM! That game was a choppy mess at 30fps! And not only that, but it was so jaggy the graphics could cut the glass on my plasma screen.

This was supposed to be the big Hardware maker's franchise game and cutting edge, for console right?

And you all know where this is going.

For the first time, I really, really saw the stark difference between PC and console gaming having just played through most of Bulletstorm on my PC. A game with the same engine that's older. A game I hooked up to the same display on my weakest machine, my Alienware M11x-R2, to compare to Gears 3 on the 360, and it felt like the difference between a PS1 and a PS3.

And while that's an old argument, what I couldn't stop thinking as I kept playing the game was:

The videogame industry is beginning to fleece its market. This current console hardware is clearly so outdated that you could put together a rig for $300, the same retail price as any of the consoles at launch, and have a significantly improved experience over what the current consoles can give. Yet console makers and game publishers want to extend the life of these consoles for another 2-3 years?? So they can keep pumping out titles that chop and stutter at 30fps with major jaggies, and make more money off the current hardware at the expense of the player's enjoyment and experience?

Now, this wouldn't bother me as much, or make me feel like gamers are getting fleeced, if they didn't also jack over us PC players by not properly optimizing console ports for the PC! Or, even better, having separate builds for PCs. While that may be expensive, console makers and game publishers should remember that if us high-end PC enthusiasts didn't buy something like a GTX 590 every generation, there wouldn't be the trickle-down effect that makes the hardware affordable enough to put into those new consoles, and that helps reduce the loss most console makers take on a console launch.

The best analogy I can come up with to illustrate the Gears of War 3 console experience or my feelings on purchasing this $60 game is -

I felt like I was paying full adult movie theater price for a ticket to see Star Wars. And instead of seeing it projected in a theater, on a large screen with digital sound and projection as intended by the filmmakers - I was forced to watch a VHS copy on a big screen TV. Not Blu-Ray, not DVD. VHS...

That's how far behind these consoles have gotten, and they want to extend them a few more years??

Fleecing. Plain and simple.

But that's business and them game publishers gotta get that dolla, dolla right? Frack the artists and producers who poured all their creative energy and precious time into that game only to have it chop so much that you sometimes can't even tell what you're playing.

But all the more, the experience really really made me appreciate my rig and PC gaming and especially my 590's. And the whole time I was playing I couldn't help but think how glorious the game could've been had it been released on PC like the first one and had the proper horse power the code needed.

And it also taught me, that FPS is much more important to me than screen size. I used to treat them more equally, but I realize I've been in denial. I will happily take 120fps/120hz gaming on a 24" monitor over 60fps gaming on a 60" big screen display any day of the week.

I don't fault the less than optimal experience of Gears of War 3 on the game or game design or anything that has to do with the actual game. I blame it on industry politics and greed.

Now, who's psyched for BF3 tomorrow? I know I'm gonna make the most of the next 48 hours so I can get something of value out of the $60 I dropped on the disappointment that MOH turned out to be.

Good Times...


----------



## MKHunt

Quote:


> Originally Posted by *Shinobi Jedi;15077883*
> _Edit: There were too many words to quote succinctly._


I completely understand. Everybody always blames the consoles when the problem is actually far deeper than that.

I haven't touched my Xbox ever since I got my sig rig running. I'm too _afraid_ of the disappointment.

Lately I've been combining features of both that I like. Playing more and more games with the X360 pad over kb and mouse. It's definitely more relaxed, and in some ways, more enjoyable. Part of that could be that my desk is way too high and thick (my legs limit how much I can raise the chair to adjust)

I'm afraid of 120Hz as well. Afraid I'll like it too much and spend even more money. Plus I've been enjoying the vibrancy of the e-IPS panels compared to TN. Wish I could have afforded true IPS, but then I'd be in 120Hz territory and that would be a tough choice.

What I really notice now is pixel density. I was spoiled early on with 1920x1200 pixels crammed into 15.4". Even going from that to my 23" monitor feels like I've downgraded in terms of sharpness and precision. I miss those extra pixels.

tl;dr if I go back and play my 360 will my heart break?


----------



## traveler

Quote:



Originally Posted by *Shinobi Jedi*


_-snip- (quoted in full in the post above)_



Good read.


----------



## Masked

I disagree...Which, seems to be the norm here but, I'll explain why I disagree.

I ride a very fine line between gamer and enthusiast which, after being in this office for 5 years, I've come to feel are different categories.

The enthusiast inside me is a stubborn sod... I refuse to play BFBC2 at less than 100FPS, I refuse to play beta games without SLI (hence the 590 on a single monitor). I may also switch to AMD for the core performance and the fact that I'm getting an unlocked sample but, that aside... the enthusiast SCREAMS for perfection.

The gamer inside of me is very understanding... I have a massive TV in the office, and our betas frequently take us into more testing than I'd like, but for the purposes of this office it makes sense... That being said, I was seeing ~50fps on a MASSIVE Aquos (GOW3) and, to be quite honest, it was clunky and choppy in some areas, BUT it's an Xbox title and, as a gamer, that's the system I chose to play this game on... So I suck it up and push forward.

Now, personally, I prefer the PS3 10/10... Originally I bought it JUST for PS1/PS2 titles but, I have to admit... it's grown on me... God of War 1/2/3 and Demon's Souls truly changed my mind that this console was a worthy buy, and Killzone 3 was so beautifully done that I've stopped caring if the interns take a break and rip it on the big screen (instead of hockey), because it's truly something I enjoy watching.

What I'm getting at is, we as a generation have taken 1 genre of "gamer" and truly made it 2 separate categories...Enthusiast vs. Gamer.

The gamer will do his best with the medium provided because that's the best he can afford and he'll do his best with what's provided to him...//THIS// is now the majority.

Consoles are your primary example, and even though a lot of users are now buying PCs, there's also been a 3-ish year gap between systems (never happened before), so there needs to be an inkling of understanding with that demographic.

The enthusiast will be picky; he'll get the best hardware, or close to it, and he'll rock that game at perfection whether he sucks at it or not... He's there for the eye candy... and damn, it's pretty. //THIS// is now the minority.

I don't think you can judge 3+ year old console tech against the games of today, because the //VAST// majority of PC users are on year-old tech... if that.


----------



## ReignsOfPower

Hey guys, I just got an 'offside' monitor that I've placed in portrait mode to complement my Dell U3011. I've got the U3011 connected via DP, and the offside monitor connected via DVI.
I've noticed a funny thing with the GPU: it doesn't quite downclock the second core properly. Is there a fix for this?


----------



## Masked

Quote:



Originally Posted by *ReignsOfPower*


Hey guys, I just got an 'offside' monitor that I've placed in portrait mode to complement my Dell U3011. I've got the U3011 connected via DP, and the offside monitor connected via DVI.
I've noticed a funny thing with the GPU: it doesn't quite downclock the second core properly. Is there a fix for this?


That's pretty normal...

If you're running something in the background, even with the 6990s, it eats up more on the 2nd core than the primary, at least in my experience.

Wouldn't worry about it at all, tbh...


----------



## Shinobi Jedi

Quote:



Originally Posted by *Masked*


_-snip- (quoted in full in the post above)_



Masked, you provide an interesting and different perspective with many points I agree with but, at the same time, you're ultimately proving my point, because you're misconstruing the key element of the rant I was making.

Actually, the normal life cycle for consoles has generally been 3 years before a new console; that is not the case with the PS3 or the 360.

The 360 launched in November _2005_ and the PS3 in November _2006_.

That's coming up on a full five years. Add the fact that guys like Ballmer and Hirai and their immediate underlings have been repeatedly quoted on Kotaku saying they want to stretch the current console generation's life cycle another 2-3 years, and you have an 8 year console life cycle, not 3. And if Sony had their way, they'd stretch it to 10 years.

That's a lot longer than the established norm of the 3 year life cycle from previous console generations.

And that is what I mean by the game industry's greed getting the better of them so now they've resorted to "Fleecing" their market.

If you factor in your statistic that most "enthusiasts" (PC gamers) have hardware that's less than 1 year old, that notion gets compounded.

Because, like I said, they're not even taking console ports and optimizing the code to the fullest extent of PC hardware. So even if you want to split the market into two categories, gamer or enthusiast, both are still getting fleeced: the console players or "gamer" crowd having to play games on hardware going on 5 years (that may get stretched out to 8-10) with a choppy/stuttering experience, or the "enthusiast" crowd who gets crappy, unoptimized console ports after dumping a ton of money into hardware to be able to play a game to its fullest and most vivid potential.

So, while your point has a lot of merit and might be valid if we really were discussing previous generations' hardware with a 3 year gap, we're not (and you even feel that's a lot of time between generations). We're talking about 5 year old hardware that the suits want to push another 2-3 years, knowing you could go out, buy PC hardware for $300, and build a rig that would crush a 360 or PS3. But the console makers won't invest in a new console until they can bleed the market to dust with the current ones.

Which means choppy, stuttering games, presented in a way that is most assuredly less than intended by the actual "creative" creators of the property. Or barely-touched console ports that don't maximize enthusiast hardware.

Hardware that, if not bought by the "enthusiast" market, would never trickle down to an affordable price for the less prosperous gamer or for a company's PC/notebook builds.

I work in the film business, where any director worth his salt absolutely wants his film/story/art seen by as many people, in the best conditions, as possible. The best print, the best projector, a brand new bulb, a calibrated sound system, the whole nine. Essentially the "enthusiast" setting.

And they would flip their **** if half the audience had to watch their film on some tiny screen instead of a beautiful/majestic movie theater.

I don't know, maybe I'm making the wrong comparison but, as a fledgling filmmaker, I believe there is an "art" to video games much in the same way there is in film. And I can't imagine that any of the "creative" creators, whether they be true artists (illustrators, writers, model makers) or just have an artistic "side" to them, love the idea of their hard work reaching only half of the game experience's true potential because of the game industry's bottom line or industry politics.

Where I do think consoles have relevance, and arguably even eclipse the PC, is specifically in their implementation of motion controls and controller schemes.

While the Kinect has been disappointing, I still find a lot of good stuff with the PS3 Move. Especially with the PS Sharpshooter. Killzone 3 was incredible to play all the way through with the Sharpshooter. I had to stand up and hold the peripheral by my side to aim right. Interestingly, I had to almost literally re-learn how to play with the thing. But once I did, man is that fun. You can't get that on PC. And as an early adopter of the Razer Hydra, trust me when I say, you can't get that on PC. So yes, there is still a lot of fun and merit to be had in a 360, but that's not the point.

The point is how the industry is pushing console generations from 3 years to 8-10 years and, in effect, fleecing their own market, whether it be the gamer who has to deal with significant chop on a AAA title with his 5 year old console, or the enthusiast who dumps a lot of money into his rig to play a game he really desires, only to find the game is a less-than-its-full-potential port of a console build designed for 5 year old hardware.

And to be fair, if you compare the graphics of Gears 3 to the 8-bit animation I got on my Atari 2600 when I was way tiny, then the graphics are legendary. But compare them to current PC hardware from as far back as '09 and they get crushed pretty badly.

It's not the fault of the developers and creators of any game; it lies squarely on the publisher.

I do agree that the PS3 is really hitting its stride right now and probably pumping out titles that perform better than the 360's. The Uncharted and God of War series being console-exclusive on PS3, and Halo/Gears exclusive on 360, is why I still own both my PS3 and 360. And while the software is limited on the Kinect, both consoles offer an experience you can't get on PC via their motion control implementations. If you keep that perspective in mind, then they won't be too disappointing, in my experience.

I am looking forward to playing Resistance 3 with the Sharpshooter. It'll be interesting to see if that improves on my recent impressions.

And lastly, while I'm not sure if I'm feeling Nintendo's new console design, I do agree that now is the perfect time to release a new console. Hopefully Sony and MS will take a hint.

You do, as usual, make some valid points that are worthy of deeper consideration.


----------



## Masked

I was actually including "fingerbangers" or handhelds in my diatribe; I should have expanded on that.

I do agree with you on a lot of things, though


----------



## ReignsOfPower

Quote:


> Originally Posted by *Masked*
> That's pretty normal...
> If you're running something in the background, even with the 6990's it eats up more on the 2nd core than the primary, at least in my experience.
> Wouldn't worry about it at all, tbh...


I noticed this too with my 5970 back in the day. I don't understand why it can't simply downclock the GPU it's hooked up to. I'm not running a single application in the background at all. I just find it interesting that one core is 40C and the other is 60C because of this. Unplugging the second monitor makes the core idle back down to normal.


----------



## Masked

Quote:


> Originally Posted by *ReignsOfPower*
> I noticed this too with my 5970 back in the day. I don't understand why it can't simply downclock the GPU it's hooked up to. I'm not running a single application in the background at all. I just find it interesting that one core is 40C and the other is 60C because of this.


Yeah...It's a driver issue in my experience.

I see it a lot...Especially on the inactive screens and I've never been able to figure out why.

I think the draw on most 2 core cards comes through the 2nd core so it's always higher usage but, don't quote me.

I'm adding my 580 in a couple days and I'll let you know if I see a difference but, I've always seen that scenario so, to me, it's fairly normal.


----------



## ReignsOfPower

Hmm no worries bro. I'll give the other DVI port a whirl tomorrow, see if that changes which core is being loaded and post back.


----------



## anhweezy

NVIDIA > Radeon


----------



## Smo

Quote:


> Originally Posted by *ReignsOfPower;15084626*
> Hmm no worries bro. I'll give the other DVI port a whirl tomorrow, see if that changes which core is being loaded and post back.


It will do mate - I have the same issue. One of my cores never idles with two screens connected. Which GPU depends on which DVI ports I'm using.


----------



## MKHunt

Any 590s running the BF3 beta? Settings? FPS? The graphics are definitely dumbed-down from what release should look like, which makes me concerned about the engine's efficiency. Performance seems fairly mediocre to poor for the quality shown.


----------



## rush2049

I am running it. I don't know how to get the console up to show FPS... but to my eyes I am in the 80+ range. I have every setting maxed.
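For anyone else looking, a hedged note: Frostbite games toggle the FPS counter from the in-game console, usually opened with the tilde (~) key. The exact command has varied between builds, so treat these as the two known variants rather than confirmed beta syntax:

```
render.drawfps 1                (syntax used by Bad Company 2)
render.perfoverlay.drawfps 1    (syntax BF3 used at retail)
```

If neither works, the beta build may simply have the console disabled.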


----------



## Masked

Quote:


> Originally Posted by *MKHunt;15087172*
> Any 590s running the BF3 beta? Settings? Fps? The graphics are definitely dumbed-down from release which makes me concerned about the engine efficiency. It seems fairly mediocre to poor for the quality shown.


I'm not really allowed to give specifics but, I've been gaming on 1 screen w/1 590 and I've had a great amount of success.


----------



## timd78

I'm running a 590 at 5760x1080 and I think it's quite good. Running 50-60-ish fps on high settings, but I need to fiddle more. I'm sure that with some patches and drivers adding another few FPS it will be fine.

I think it's a good looking game.


----------



## ReignsOfPower

Quote:


> Originally Posted by *Smo;15086335*
> It will do mate - I have the same issue. One of my cores never idles with two screens connected. Which GPU depends on which DVI ports I'm using.


Quote:


> Originally Posted by *Masked;15087238*
> I'm not really allowed to give specifics but, I've been gaming on 1 screen w/1 590 and I've had a great amount of success.


Resolution?


----------



## skanlesskrew

I'm new here, and this is the system I just got:

2 GTX 590's watercooled w/ Koolance waterblocks

i7 990X Extreme watercooled w/ Koolance 370 block

Exos2 1055 Radiator/pump system

OCZ Vertex 3 SSD 240GB

Rampage III Black Edition

Thermaltake Level 10 GT Snow Edition

I took it to a local shop to install everything, and they are telling me it's overheating - that my cooling components aren't enough to keep everything running efficiently. Can anyone here give me some advice and/or expertise on whether this configuration is sufficient to manage cooling? I've seen setups online with far less, running quad SLI with no problems, so I'm a little confused. I'm trying to figure out if the techs working on my rig are idiots, or if I really need a better cooling solution. Thanks in advance!


----------



## remer

I'm getting some really high temperatures with my 590 at stock clocks playing BF3 (everything maxed at 1920x1200). The highest I saw was 94C. Both GPUs are throttling. Could it be the new 285.38 drivers, or is it something else? My case is clean with good ventilation.


----------



## Masked

Quote:


> Originally Posted by *skanlesskrew*
> I'm new here, and this is the system I just got:
> 2 GTX 590's watercooled w/ Koolance waterblocks
> i7 990X Extreme watercooled w/ Koolance 370 block
> Exos2 1055 Radiator/pump system
> OCZ Vertex 3 SSD 240GB
> Rampage III Black Edition
> Thermaltake Level 10 GT Snow Edition
> I took it to a local shop to install everything, and they are telling me it's overheating - that my cooling components aren't enough to keep everything running efficiently. Can anyone here give me some advice and/or expertise on whether this configuration is sufficient to manage cooling? I've seen setups online with far less, running quad SLI with no problems, so I'm a little confused. I'm trying to figure out if the techs working on my rig are idiots, or if I really need a better cooling solution. Thanks in advance!


You need at least a quad rad to handle that delta...Especially on your core.


----------



## Recipe7

Quote:


> Originally Posted by *remer*
> I'm getting some really high temperatures with my 590 at stock clocks playing BF3 (everything maxed at 1920x1200). The highest I saw was 94C. Both GPUs are throttling. Could it be the new 285.38 drivers, or is it something else? My case is clean with good ventilation.


That's a tad hot there. I'm getting 83 max at 640core.

On a side note, I'm getting about 90+fps inside the subway area, and 60 give or take in an intense outdoor battle.


----------



## RobotDevil666

Quote:


> Originally Posted by *ReignsOfPower;15083714*
> Hey guys, I just got an 'offside' monitor that I've placed in portrait mode to complement my Dell U3011. I've got the U3011 connected by DP, and the offside monitor connected via DVI.
> I've noticed a funny thing with the GPU, doesn't quite downclock the second core properly. Is there a fix to this?


It's not a hardware issue, it always happens; I had it with GTX 480 SLI and I have it with the 590 as well.
As far as I know, when you connect a 2nd monitor the 2nd core automatically goes into 3D mode, hence the higher clocks. Granted, those are not max clocks, but it will still raise your GPU temp.
My GTX 590 runs the 1st core at 45C and the 2nd at 62C, and GTX 480 SLI was around the same, only slightly higher.
I've read about it on the Nvidia forums and, while I don't remember the reasons behind it, it's been done on purpose and Nvidia is fully aware of it.
Nothing to worry about there, mate.


----------



## Smo

Quote:


> Originally Posted by *RobotDevil666;15097111*
> It's not a hardware issue, it always happens; I had it with GTX 480 SLI and I have it with the 590 as well.
> As far as I know, when you connect a 2nd monitor the 2nd core automatically goes into 3D mode, hence the higher clocks. Granted, those are not max clocks, but it will still raise your GPU temp.
> My GTX 590 runs the 1st core at 45C and the 2nd at 62C, and GTX 480 SLI was around the same, only slightly higher.
> I've read about it on the Nvidia forums and, while I don't remember the reasons behind it, it's been done on purpose and Nvidia is fully aware of it.
> Nothing to worry about there, mate.


The only issue I have with it is that the core voltage stays raised along with the core clock. For the longevity of the chip, I would prefer that it idled properly. Other than that I'm fine with it, and I drop my overclock when I'm not using the second screen.


----------



## Sir_Gawain

Quote:


> Originally Posted by *MKHunt;15087172*
> Any 590s running the BF3 beta? Settings? Fps? The graphics are definitely dumbed-down from release which makes me concerned about the engine efficiency. It seems fairly mediocre to poor for the quality shown.


I'm running mine on a single screen, 1920x1200 Ultra, with FPS at 115-125 vsync off and 65-110 vsync on! I really do not think the graphics settings are actually working correctly, as they do seem very mediocre like you said, and I expect to see the numbers decrease with an update.


----------



## Fallendreams

Quote:


> Originally Posted by *Sir_Gawain;15099227*
> I'm running mine on a single screen, 1920x1200 Ultra, with FPS at 115-125 vsync off and 65-110 vsync on! I really do not think the graphics settings are actually working correctly, as they do seem very mediocre like you said, and I expect to see the numbers decrease with an update.


Vsync does not work in game; you've got to go into the Nvidia control panel to set it. My settings do work in game. I can tell the difference between Low, Medium, and Ultra.


----------



## Shinobi Jedi

Ultra Settings are disabled for the Beta according to the BF3 Beta forums. The max setting really is High.

So I just toiled around a very little bit in the BF3 Beta, just one or two rounds to be exact....

I averaged 90 FPS, but would often hit 150+. All four of my GPUs seemed to be running at 60-70%. Everything was at Ultra settings (which really means it was knocked down to "High").

I bought a second 590 in the hopes of getting a stable 120fps like I do in BFBC2 with everything turned up, and I'm still only at 90 on High settings?

To be fair, it is still in Beta, with Beta GeForce drivers.

But honestly? I am not feeling the changes they've made.

I'm really not feeling the game being forced to launch from a web browser; in fact I kind of hate it. Specifically that you can't access settings or options from it unless you can get onto a server and into a game.

I'm trusting this will be changed by the final release, but as of now that is fracking ridiculous.

I was really hoping this game would be enough to not bother with MW3, but I'm not so sure anymore.

But I'm keeping an open mind, as it's still in Beta and there's time left to fix things.

Since Masked said he's been getting great results with one 590 on one monitor, I'm assuming that my performance not improving at all over a single 590 is down to it being a beta? Whereas in BFBC2, adding a second 590 provided a major gain in performance.

Gotta admit, I'm a little worried now. Though even if the 2nd 590 I got to run this game at 120fps turns out not to be able to do so, I'm still happy I got it for the performance gain I've gotten from all my other games.

I was talking to my literary manager yesterday, whose management company has many talents who write for video games. One of them is Activision's go-to guy, and he knows a lot about the inside process and inside baseball of everything.

And essentially, while discussing my rant I made on here yesterday, he made two interesting points:

1. Console generation cycles are purposely being drawn out because the industry feels there are only one or two cycles left before technology reaches a place where consoles are no longer needed and everything is integrated into an all-in-one device. Assuming we're soon headed for a console-less world, they want to maximize their profits on this generation and however many remain. So I can accept that, I guess.

2. As for PC gamers getting the shaft, not only with console ports but with console ports that aren't even properly optimized: according to this talent, the main reason is the obvious one. All the money/profits are in consoles.

Combine that with the fact that there's no standardized development platform for PCs, so tons of extra money and resources have to be allocated to create code that works with the nearly infinite combinations of CPUs and video cards on the market.

And since it takes so much extra money and resources to properly develop for PC, and PC sales of a game are so low compared to consoles, the PC's contribution is often negligible.

That's why we've been getting the shaft and aren't able to really rock out our expensive hardware.

A lot of developers are also frustrated with MS, because they were supposed to implement a good "PC standard of development" the same way consoles have (originally intended via DirectX), creating a system that would make it easy to design for PC as the base build and then port to consoles, instead of the exact opposite of how it is now. Supposedly MS dropped the ball on that one.

And because of that, we've had PC ports of console games.

The point I'm getting at is this: according to this video game writing talent at Activision, Battlefield 3 is the first game since the original Crysis that was developed for PC first and then consoles. (Of course he's wrong, see The Witcher 2, but I think he meant big franchise titles.)

And basically, the whole industry is watching to see how well it sells, especially in the PC gaming market, to decide whether they should go back to (or adopt) developing for PC first and then porting to consoles, if the game sells well enough on PC to warrant it.

I have the game pre-ordered, and arguably will keep my pre-order on that principle alone if it turns out to be true.

But after what I played today and in the Alpha, I would seriously consider cancelling my pre-order and waiting on the release and reactions, if it weren't for the industry supposedly watching the opening sales so closely to decide whether to allocate more money and resources to PC gaming. Dice, to their credit, took the initiative to do it on their own when nobody else had since the first Crysis.

Now, this has all been told to me third-hand (my manager heard it second-hand), so take it all with a grain of salt.

I'm looking forward to getting some more time on the Beta and trusting my initial worries will be alleviated.

Cheers!


----------



## Masked

The problem I have with 90% of 590 owners (you guys are the 10% I'm excluding) is that they feel their system should handle everything and //anything//.

While I don't disagree, there are //OBVIOUS// limitations.

I had a user with a 590 call yesterday, using 3x27" monitors on ONE 590, who complained up and down about how much his FPS was suffering in a beta game (obviously BF3) and how that was my fault.

It was my fault for not informing him that the cores would split to maintain Surround, and my fault for not explaining the realities of the 590...i.e. poor driver support, poor SLI support and voltage restrictions.

Now, what many users don't understand is that you can run Surround with a 200-series card; heck, I have an 8800GT on our test rig and still run Surround successfully...Your other monitors literally don't even require a card, in MOST situations; they just need the port to exist.

Now, when you use 3 monitors on the 590 at a crazy resolution...you have just HALF-LIFED your total potential on the primary monitor...

Why, you ask? Because the Nvidia control panel is crap.

Don't fret because Eyefinity suffers the same damn problem.

If you're going to game on the 590, my absolute suggestion as a "professional" in this field is NOT to do it in Surround...It's just not feasible with current driver support...It's truly just not there.

If you have a 590, even on 2 monitors, my suggestion is to go buy a dinky $50 graphics card, use it for PhysX and slave the second monitor to it.

Battlefield 3 on one 25" monitor with SLI working, looks awesome...Because the 590 is on one monitor, in SLI and you've now removed the bugginess of the setup.

There's no dual monitor, split-GPU or other aspects to worry about...There are 2 cores, 1 monitor and a PhysX card already in place...

We're crippled by **** driver releases and the general incompetence of the programmers that write our drivers...period...and it's not JUST us; the 580's suffer, everyone does.

So why the diatribe? "WE" as a community, need to come to the realization that this card cannot handle gaming in surround at max resolutions...It's just not there.

Even with 2x 590's, you're only pulling 3 cores in SLI, because what most people don't realize is that the 4th core is being used to host your monitors...That's just how it works right now.

~~

Onto the console argument, absolutely...

I can't really go into specifics but, Gorilla Glass has really changed the industry...For example, I know we have a field office without a single PC; it's all virtual via servers...

I also know of an entire office in "glass" that has absolutely no PC's...

It's the way of the future, pretty soon, everything will be in a cloud or over a server.

~ I forgot to spell-check so please don't mind my typos.


----------



## Shinobi Jedi

But Masked, that's exactly what worries me. I am gaming on only one monitor, a 24" BenQ XL2410.

In fact, it was you and Raging and others in this club that convinced me to go with a single monitor over surround, when I was deciding whether I wanted a bigger 120hz monitor (either the 27" Acer or the upcoming Asus) over adding two more BenQs to the one I have now.

And while I know you don't mean to imply that I or anyone else in the club have unrealistic expectations about the 590, I should've been clearer about my 590 Quad SLI expectations with this game and specified that when I said I was hoping to run it full blast, I don't include high AA in that. Or any AA at all.

But I did have it set at 4xAA, so maybe that was doing it? Though I can get a pretty constant 120fps with 4xAA in BFBC2 (I'm only saying that based on Dice saying if you can run BFBC2, you can run BF3).

Or, maybe the game is somehow a little more CPU dependent than realized? I am running an older Bloomfield 920.

I do know that on the Alienware M11x forums, BFBC2 was one of the few games where having a stronger CPU really made a difference in performance. People with the original M11x, or R2 owners who went with the i5, were having issues playing it stably. But those of us with the M11x-R2 and the i7 could play at decent settings and get a constant 30-40fps+.

I want to get on there more and fiddle with settings, but the fact that you can't do it until you find an open server, get into a game, and try to change them while not being fragged all the time is really frustrating and off-putting.

Masked without getting into specifics that might violate any NDA's, are you running the same build as the open Beta? Or maybe a more advanced build that allows you to really test Ultra settings with uncompressed textures, etc.?

I haven't given the Beta enough time, so maybe it'll get better.

But, Masked, I'm even a little more worried now after your last post, knowing that I'm running the game on a single monitor and am averaging the same 90fps as single 590 users...

(Congrats to you single 590 owners on that btw! Pimp!)


----------



## Masked

I know we're running a different flavor...And I do personally believe the game is a bit more CPU dependent.

Personally, I'm upgrading the entire office to Dozer or 2700k's once they're released...I have yet to decide but, it's cheap, it works, and both are highly overclockable so, I'm just going for it.

They're doing a lot to change the graphics and make them better...For example, we have a daily patch (I'm downloading it right now) but, overall I think you'll be fine.

Personally, I'd upgrade from the 920 when you have the cash, because the difference between the 920's/950's and even a 2600k is tremendous...

90fps is still a lot though, so I personally, don't think you're missing too much


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> I know we're running a different flavor...And I do personally believe the game is a bit more CPU dependent.
> Personally, I'm upgrading the entire office to Dozer or 2700k's once they're released...I have yet to decide but, it's cheap, it works, and both are highly overclockable so, I'm just going for it.
> They're doing a lot to change the graphics and make them better...For example, we have a daily patch (I'm downloading it right now) but, overall I think you'll be fine.
> Personally, I'd upgrade from the 920 when you have the cash, because the difference between the 920's/950's and even a 2600k is tremendous...
> 90fps is still a lot though, so I personally don't think you're missing too much.

I was going to do that, but RagingCain's posts about how Sandy Bridge mobos don't have the x16 lanes that X58 boards do made me reconsider.

But maybe now I need to reconsider that. When do the 2700K's come out? Are they the same as Ivy Bridge? Or an interim chip?

But if I jump from my X58/Bloomfield at x16 to an SB CPU and mobo at x8, would I take a significant hit? If so, then I'll stick to the original plan and wait for the next CPU with a mobo that can do x16 lanes.

I've never heard of Dozer, so I'm assuming that's an AMD chip, yes? Though to be honest, I don't keep up with the CPU market like I should, so I expect to be wrong on anything I'm saying.

You're right, 90fps is nothing to sneeze at. I guess I'm stuck on 120fps for 120hz, but 90fps vsync'd at 120hz is pretty smooth.

Masked, if you have any direct input with the developers (I know they're not going to get rid of the browser), then for the love of God, can you or anyone reading this thread with the contacts tell them to at least make it so the player can change options and settings from the browser, instead of only in game on a server?? That is really killing me, and many others on the forums it seems.


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> I was going to do that, but RagingCain's posts about how Sandy Bridge mobos don't have the x16 lanes that X58 boards do made me reconsider.
> But maybe now I need to reconsider that. When do the 2700K's come out? Are they the same as Ivy Bridge? Or an interim chip?
> But if I jump from my X58/Bloomfield at x16 to an SB CPU and mobo at x8, would I take a significant hit? If so, then I'll stick to the original plan and wait for the next CPU with a mobo that can do x16 lanes.
> I've never heard of Dozer, so I'm assuming that's an AMD chip, yes? Though to be honest, I don't keep up with the CPU market like I should, so I expect to be wrong on anything I'm saying.
> You're right, 90fps is nothing to sneeze at. I guess I'm stuck on 120fps for 120hz, but 90fps vsync'd at 120hz is pretty smooth.
> Masked, if you have any direct input with the developers (I know they're not going to get rid of the browser), then for the love of God, can you or anyone reading this thread with the contacts tell them to at least make it so the player can change options and settings from the browser, instead of only in game on a server?? That is really killing me, and many others on the forums it seems.

I once had to ask OCN how to overclock the X58 platform after our Intel tech went on vacation for 2 weeks so, I'm not exactly the authority on this subject.

What I can say is that both have their ups and downs...But, the biggest difference is thread dedication.

The 2700k will be a much better overclocker than the 2600k and will be about $30 more.

From the perspective that it's a work-horse, it's well worth the swap from the office 950's to the 2700k on release.

I do realize there are fewer lanes but, the 30% increase you'll see in CPU performance will, I think, outweigh the 2-3% difference from the missing lanes.

You have to realize that between x8/x16 there's only a 2-4% difference in performance, if that.
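To put rough numbers on why the x8/x16 gap stays small, here's a back-of-the-envelope sketch of PCIe 2.0 link bandwidth (the generation on both X58 and Sandy Bridge boards of this era). The function name and structure are just illustrative, and real throughput is a bit lower due to protocol overhead; the point is that even a halved link still offers ~4 GB/s, which games rarely saturate, consistent with the small measured differences:

```python
# Rough PCIe 2.0 bandwidth estimate. PCIe 2.0 signals at 5 GT/s per lane
# and uses 8b/10b line encoding, so only 8 of every 10 bits are payload.
GT_PER_SEC = 5.0            # raw signalling rate per lane (GT/s)
ENCODING_EFFICIENCY = 8 / 10  # 8b/10b encoding overhead

def link_bandwidth_gbytes(lanes: int) -> float:
    """Nominal one-direction payload bandwidth in GB/s for a lane count."""
    gbits = GT_PER_SEC * ENCODING_EFFICIENCY * lanes  # payload Gbit/s
    return gbits / 8                                  # convert bits to bytes

# x8 comes out around 4 GB/s nominal, x16 around 8 GB/s.
print(f"x8:  {link_bandwidth_gbytes(8):.1f} GB/s")
print(f"x16: {link_bandwidth_gbytes(16):.1f} GB/s")
```

So halving the slot halves the ceiling, but a game pushing a few hundred MB/s of draw calls and texture updates per frame never comes close to either limit, which is why benchmarks of the era showed only a few percent between the two.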

I was originally planning to wait for the 3930k's but, we haven't even seen projections on those yet, much less anything solid.

We commented on the BF3 issue; in fact, I mentioned it 6 times in a "debriefing" because it pushed me over the edge...Finding a server that's not packed is nearly impossible, too.

I think that the 590 can handle BF3 well given the chance...On high/ultra...However, a faster processor will always help push it a bit further, I think.

Ask Ragin...Get a second opinion but, I think an upgrade would drastically kick up performance.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> I once had to ask OCN how to overclock the X58 platform after our Intel tech went on vacation for 2 weeks so, I'm not exactly the authority on this subject.
> What I can say is that both have their ups and downs...But, the biggest difference is thread dedication.
> The 2700k will be a much better overclocker than the 2600k and will be about $30 more.
> From the perspective that it's a work-horse, it's well worth the swap from the office 950's to the 2700k on release.
> I do realize there are fewer lanes but, the 30% increase you'll see in CPU performance will, I think, outweigh the 2-3% difference from the missing lanes.
> You have to realize that between x8/x16 there's only a 2-4% difference in performance, if that.
> I was originally planning to wait for the 3930k's but, we haven't even seen projections on those yet, much less anything solid.
> We commented on the BF3 issue; in fact, I mentioned it 6 times in a "debriefing" because it pushed me over the edge...Finding a server that's not packed is nearly impossible, too.
> I think that the 590 can handle BF3 well given the chance...On high/ultra...However, a faster processor will always help push it a bit further, I think.
> Ask Ragin...Get a second opinion but, I think an upgrade would drastically kick up performance.


Well, Raging told me to wait for Ivy Bridge since it will have mobos with x16,

but you've pretty much convinced me to go for it. And since the 2700K looks as if it's not coming out until the end of the year or early next, it looks like I have some time to think about it.

I'm sure as hell not going to go get a 2600K now, now that you've enlightened me that the 2700K is right around the corner. Much appreciated.

Battlelog is not getting good feedback anywhere. Kotaku's article about it has the comment section blowing up with disapproval.

I can deal with it as long as they let the player change options from the browser instead of having to be in-game for it.

But I really am considering cancelling my pre-order to make a statement that this change they're trying to push on us is FUBAR, and then picking the game up at launch if it's been confirmed to be remedied to a widely accepted degree.

All I really wanted for the game was BFBC2 with a new coat of paint and the ability to fly Jets. Not some browser based innovation BS that actually makes getting into a game more cumbersome instead of more efficient.

I'm psyched to know that professionals like you are giving the same feedback. Muchas Gracias...

I'm gonna trust it's going to get better.

Looking forward to thoughts and experiences from other 590 users who are in the beta.


----------



## RobotDevil666

I'm kinda in the same pickle here; my 930 annoys me so much cause it's so **** hot, but I think buying a 2600K at this point in time is not a good idea. Plus, the fact that I already have a decent CPU makes the wait even easier.
I think I'll wait and see how Ivy will perform.


----------



## MKHunt

Quote:



Originally Posted by *RobotDevil666*


I'm kinda in the same pickle here; my 930 annoys me so much cause it's so **** hot, but I think buying a 2600K at this point in time is not a good idea. Plus, the fact that I already have a decent CPU makes the wait even easier.
I think I'll wait and see how Ivy will perform.


All this Ivy talk makes me glad I got an 1155 over 1366 for the IB compatibility.

Everything I've read says IB will still have only 16 physical lanes.

Most of what the 7 series seems to bring to the table is chipset-based USB 3 and PCIe 3. I highly doubt nvidia will bring PCIe 3 into its 600 series lineup, which gives you two years or more to not care unless you run the latest and greatest PCIe-based SSDs, which seems silly for people in this club since we all enjoy graphics performance. If nvidia _does_ release PCIe 3.0-based cards for the 600 series then I will smash a cinder block, add it to my food, and ____ bricks. That is my vow.

Also, Intel has been hyping the new IGP and power-saving features and avoiding performance. An increased L3 cache will probably help a little and the 22nm process might aid in higher overclocks, but the architecture is the same as SB.

It'll be interesting to see how the performance pans out. But I guess my point is that if you get a Z68 chipset and a 2600K/2700K, IB is simply a drag-and-drop upgrade if it ends up being worth it. Plus, price points are said to be about the same as SB's are now, so it won't be a huge expense to upgrade if you choose to.

I'm a bit hesitant whether the 2700k will be that much of an improvement. My money says that it will be a 2600k with a 100MHz OC from the factory. If it OCs better than the 2600k, it will probably be by about 100MHz. Is that 100MHz worth $30? Your call.

My feelings with computers are if you want it, get it. You can always rationalize waiting to the point that you never upgrade.


----------



## emett

Even though Ivy fits Sandy's socket, I hear it requires a different chipset. No?


----------



## MKHunt

Quote:



Originally Posted by *emett*


Even though Ivy fits Sandy's socket, I hear it requires a different chipset. No?




I can neither confirm nor deny, as I don't have an IB ES.


----------



## dvanderslice

Quick question, off topic: are you guys seeing a lot of issues with the drivers from the last 6 months when using the NVIDIA Control Panel to set TEXTURE FILTERING QUALITY to HIGH QUALITY?

I have religiously set that to HIGH QUALITY for all my cards, and since they added the Ambient Occlusion Performance/Quality option I've set that to Quality. For years now I've always set it to High Quality. But now I'm seeing strange stuttering in games like Dirt 3 and Dawn of War 2 when setting it to High Quality. Keeping everything at its defaults except the Ambient Occlusion seems to fix it. Do you guys benchmark with everything at its control panel defaults too?


----------



## Fallendreams

How are your guys' GTX 590s doing in the BF3 beta?

Sent from my HTC Aria using Tapatalk


----------



## Pawcu

http://www.facebook.com/photo.php?fb...type=1&theater

Is this enough?
Thx

BTW, it's an EVGA Classified Limited Edition GTX 590 with stock clocks.


----------



## MKHunt

Quote:


> Originally Posted by *Shinobi Jedi;15099488*
> I'm really not feeling that the game is forced to be web-browser launched, in fact I kind of hate it. Specifically that you can't access settings or options from it, unless you can get onto a server and get in game.
> 
> I'm trusting this will be changed by the final release, but as of now that is fracking ridiculous.
> 
> I was really hoping this game would be enough to not bother with MW3, but I'm not so sure anymore.


Cancelled my preorder. Not taking a chance with Battlelog. If it disappears at release, I will buy the game. Until then? No way in hell.


----------



## kainwalker

Heads up: Newegg has the ASUS 590 in stock, no FS, $749.99.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121436
Would have grabbed it if it had popped up a week ago; got myself 2 x 580s instead.

EDIT:
So the EVGA 590 is also in stock (seems like there are two of those):
http://www.newegg.com/Product/Product.aspx?Item=N82E16814130693

maybe good for those who want to max out BF3 on an ITX setup

(yeah, 590 in ITX!)


----------



## rush2049

The BF3 beta is performing very well with the latest beta drivers (the ones specifically for BF3).

With everything set to maximum I'm getting over 80 fps constantly, usually in the 150+ range... though my card is a bit overclocked.

I'm seeing some graphical errors, but I'm attributing that to the beta BF3 build and not the card.

The utilization of both GPU cores is usually near 80%, sometimes higher.
The utilization of the cores on my 1055T @ 4GHz is usually at 60-70% across all 6 cores.
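
The utilization numbers above are easy to log yourself. A minimal, editor-added sketch follows (assuming a driver whose nvidia-smi supports the --query-gpu interface; the function names here are illustrative, not from the thread):

```python
# Illustrative sketch: read per-GPU utilization the way monitoring
# tools report it. A dual-GPU card like the GTX 590 shows up as two
# separate entries.
import subprocess

def parse_utilization(csv_text):
    """Parse nvidia-smi's one-number-per-line CSV output into ints."""
    return [int(line) for line in csv_text.strip().splitlines() if line]

def query_gpu_utilization():
    """Ask nvidia-smi for per-GPU load; requires nvidia-smi on PATH."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True)
    return parse_utilization(out)
```

On a 590, `parse_utilization("81\n76\n")` would yield `[81, 76]`, one reading per GPU.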


----------



## Shinobi Jedi

Quote:


> Originally Posted by *MKHunt;15118026*
> Cancelled my preorder. Not taking a chance with Battlelog. If it disappears at release, I will buy the game. Until then? No way in hell.


I read you. I'm heavily leaning towards the same.


----------



## skanlesskrew

SkanlessKrew


----------



## MKHunt

Quote:



Originally Posted by *Shinobi Jedi*


I read you. I'm heavily leaning towards the same.


As far as the pre-order bonuses go, they probably won't be that great... I mean EA has literally been selling cheats on the XBL marketplace for years. Why stop now? BFBC1 had guns and perks that EA would sell to you. It was like a Free to Play MMO without _any_ of the free.


----------



## fliq

Anybody having problems with quad SLI in BF3 or any other games? I'm getting hardcore flickering now...


----------



## Shinobi Jedi

Quote:



Originally Posted by *MKHunt*


As far as the pre-order bonuses go, they probably won't be that great... I mean EA has literally been selling cheats on the XBL marketplace for years. Why stop now? BFBC1 had guns and perks that EA would sell to you. It was like a Free to Play MMO without _any_ of the free.


I really wasn't that psyched up for Rage, even though I'm gonna buy it (I'm just sick of "post apocalyptic" themed shooters) - but now that BF3 is turning into such a disappointment, hopefully that title will be the sleeper that illustrates why we build enthusiast rigs to game.


----------



## MKHunt

Quote:



Originally Posted by *Shinobi Jedi*


I really wasn't that psyched up for Rage, even though I'm gonna buy it (I'm just sick of "post apocalyptic" themed shooters) - but now that BF3 is turning into such a disappointment, hopefully that title will be the sleeper that illustrates why we build enthusiast rigs to game.


We need to find _something_. This season full of spectacular games has started off with a huge let-down. I hear ya on the post-apocalyptic stuff. It's getting a little old. I was counting on Skyrim based upon how Morrowind and Oblivion pushed machines, but they nerfed that one, too.

I budgeted for BF3, and now that I might have to make a different decision, it's difficult. Already pre-ordered Skyrim PC (mods) and the ME3 CE.

I had a few friends over to try out BF3 and they were all underwhelmed. We all agreed that the gameplay felt like MW2 or BO with team-based objectives. A.k.a. deathmatch, but occasionally someone plants a bomb.

Team communication is also fairly lacking. And the keybindings are awful. 'J' to talk to everyone? What? Was there something wrong with 'T' for global chat and 'Y' for squad chat?

I guess you could rationalize it. Because "Talk" definitely starts with 'J.'

I also tried to bind a key as press = toggle crouch, hold = toggle prone but it never mapped the bind. And once I bound the first key the mouse cursor went away.

They're presenting us with the basics... which are what should work when they release an open beta. During beta I don't care about all the advanced stuff, but controls and hit counters should both work.


----------



## RobotDevil666

I have no idea what you guys are complaining about; I just LOVE this game already.
The dynamics are right between COD and BFBC2; I'd say they hit the sweet spot.
There are some annoying bugs and the graphics aren't entirely there, but once they fix those we've got a winner. Pre-ordered after a few rounds of beta.

Now if they'd just let us try Caspian Border...


----------



## Shinobi Jedi

Quote:



Originally Posted by *RobotDevil666*


I have no idea what you guys are complaining about; I just LOVE this game already.
The dynamics are right between COD and BFBC2; I'd say they hit the sweet spot.
There are some annoying bugs and the graphics aren't entirely there, but once they fix those we've got a winner. Pre-ordered after a few rounds of beta.


Now if they'd just let us try Caspian Border...


What mainly drives me nuts is being gimped so that you can only set game/video settings and options when logged onto a server and in a match.

That is ridiculous.


----------



## Masked

...Wow, so you guys had your first REAL Beta...

How'd it feel?

See, most people don't understand what we Beta on a daily basis...They think it's a working product but, it's really not.

The same goes for BF3...Over the past year, about 30 games I can name off the top of my head were worse...Hell, in the CoDBO beta, you couldn't move forward...People literally strafed for 3 days until the patch went live.

So, yet again, I disagree with you guys.

This is a typical beta and is actually presented better than most that we see...My only true complaint is the setup.

The Frostbite engine is fantastic...Definitely a contender...There are actually trajectory velocities now that align with real life...i.e., I can't shoot you with the M16 from the other side of the map...

I personally think all of your expectations were TOO HIGH going into a beta...Like I said, I've seen severely worse from games that not only made GOTY but actually won awards...1 or 2 of them are still under NDAs because of how "bad" their betas were (games released about a year ago)...So, I see absolutely no reason to worry.

They're going to fix the setup, have said so...The only reason it is this way is because it's a server side fix...You're actually changing the setup on their server which relays it back to you...This //WILL// change.

I'd be happy you guys are actually //IN// a beta and go from there


----------



## Fallendreams

Quote:



Originally Posted by *Masked*


...Wow, so you guys had your first REAL Beta...

How'd it feel?

See, most people don't understand what we Beta on a daily basis...They think it's a working product but, it's really not.

The same goes for BF3...Over the past year, about 30 games I can name off the top of my head were worse...Hell, in the CoDBO beta, you couldn't move forward...People literally strafed for 3 days until the patch went live.

So, yet again, I disagree with you guys.

This is a typical beta and is actually presented better than most that we see...My only true complaint is the setup.

The Frostbite engine is fantastic...Definitely a contender...There are actually trajectory velocities now that align with real life...i.e., I can't shoot you with the M16 from the other side of the map...

I personally think all of your expectations were TOO HIGH going into a beta...Like I said, I've seen severely worse from games that not only made GOTY but actually won awards...1 or 2 of them are still under NDAs because of how "bad" their betas were (games released about a year ago)...So, I see absolutely no reason to worry.

They're going to fix the setup, have said so...The only reason it is this way is because it's a server side fix...You're actually changing the setup on their server which relays it back to you...This //WILL// change.

I'd be happy you guys are actually //IN// a beta and go from there


^ +REP

Seriously, do you guys remember the server browser in BFBC2 when it was first released? It took DICE well over 6-8 months to finally fix it! Battlelog has been working so damn well, with only some bugs. I'd rather take this over a main menu like BFBC2's. You seem to forget that the client you're playing on is about 2-3 months old. Most of this stuff has been fixed. "Then why give us an old client?" Well, this gives DICE a chance to have millions of people report bugs, or find bugs that they couldn't in in-house testing. I'd take this build of BF3 over BFBC2 any day!

EDIT: Also, I know most of you just started playing yesterday and not when it was closed... of course yesterday was going to have huge problems... THERE WERE 1.5 million people trying to play at the same time. I suggest you give it a try today or during the weekend. It's been much better.

I can also see your right to knock the game if they weren't updating it or trying to fix the major problems, but they are; we've had 6 updates and fixes since Tuesday and 2 since yesterday!


----------



## Shinobi Jedi

Quote:



Originally Posted by *Masked*


...Wow, so you guys had your first REAL Beta...

How'd it feel?

See, most people don't understand what we Beta on a daily basis...They think it's a working product but, it's really not.

The same goes for BF3...Over the past year, about 30 games I can name off the top of my head were worse...Hell, in the CoDBO beta, you couldn't move forward...People literally strafed for 3 days until the patch went live.

So, yet again, I disagree with you guys.

This is a typical beta and is actually presented better than most that we see...My only true complaint is the setup.

The Frostbite engine is fantastic...Definitely a contender...There are actually trajectory velocities now that align with real life...i.e., I can't shoot you with the M16 from the other side of the map...

I personally think all of your expectations were TOO HIGH going into a beta...Like I said, I've seen severely worse from games that not only made GOTY but actually won awards...1 or 2 of them are still under NDAs because of how "bad" their betas were (games released about a year ago)...So, I see absolutely no reason to worry.

They're going to fix the setup, have said so...The only reason it is this way is because it's a server side fix...You're actually changing the setup on their server which relays it back to you...This //WILL// change.

I'd be happy you guys are actually //IN// a beta and go from there


It's not my first beta. I used to work at Activision and have 3 QA credits on my resume. Before the Dark Times. Before the Empire...


I'm quite familiar with reporting bugs repeatedly, only to see them not get fixed and then have the game pushed out the door to make a release date. Ever since then, I have never judged the QA team when I play a bug filled game.

Masked, I'm actually taking your word over what the BF3 forums are saying, for sure. But over there the insinuation is that setting things in-game was going to be in the final build, which made no sense to me. So if it's not a design issue, great! But then it's fair to say that they haven't done a very good job of communicating that to testers.

As for being happy to be in the beta, I agree it's a privilege, but it's also one I paid for as a promotion for buying MOH. Considering that the only value I got from dropping that $60 on MOH is two days' early access in a beta where most of the time you can't even get into a game, I think it's fair to be feeling some disappointment.

If this were SW:TOR, which is invite only, then yeah, don't look a gift horse in the mouth. But that's not the case with this beta, since we bought into it by purchasing MOH; many players bought MOH just to be in the beta.

But like I said, I actually trust your word more than anything I read on the beta forums. So I'm relieved to know it's just part of the beta.

What's interesting is, some are claiming the whole browser launch is for the beta only, but I got the impression that, the way it was designed, it's going to be in the final game. That doesn't bother me as long as you can adjust settings before getting into a game.

I'm actually more excited to try this new The Witcher 2 update which does an overhaul on combat and offers some much needed tutorials. At least for me, as I'm playing with a 360 pad and was just getting my ass handed to me when having to go up against a group of monsters. It was so frustrating that I put the game away. Now that this update is out, it'll be interesting to see the differences.


----------



## Masked

Quote:



Originally Posted by *Shinobi Jedi*


It's not my first beta. I used to work at Activision and have 3 QA credits on my resume. Before the Dark Times. Before the Empire...


I'm quite familiar with reporting bugs repeatedly, only to see them not get fixed and then have the game pushed out the door to make a release date. Ever since then, I have never judged the QA team when I play a bug filled game.

Masked, I'm actually taking your word over what the BF3 forums are saying, for sure. But over there the insinuation is that setting things in-game was going to be in the final build, which made no sense to me. So if it's not a design issue, great! But then it's fair to say that they haven't done a very good job of communicating that to testers.

As for being happy to be in the beta, I agree it's a privilege, but it's also one I paid for as a promotion for buying MOH. Considering that the only value I got from dropping that $60 on MOH is two days' early access in a beta where most of the time you can't even get into a game, I think it's fair to be feeling some disappointment.

If this were SW:TOR, which is invite only, then yeah, don't look a gift horse in the mouth. But that's not the case with this beta, since we bought into it by purchasing MOH; many players bought MOH just to be in the beta.

But like I said, I actually trust your word more than anything I read on the beta forums. So I'm relieved to know it's just part of the beta.

What's interesting is, some are claiming the whole browser launch is for the beta only, but I got the impression that, the way it was designed, it's going to be in the final game. That doesn't bother me as long as you can adjust settings before getting into a game.

I'm actually more excited to try this new The Witcher 2 update which does an overhaul on combat and offers some much needed tutorials. At least for me, as I'm playing with a 360 pad and was just getting my ass handed to me when having to go up against a group of monsters. It was so frustrating that I put the game away. Now that this update is out, it'll be interesting to see the differences.


I actually agree with all of that.

We're in SWTOR and I actually thoroughly enjoy it and agree 100% with the limited release...As anyone that's actually played the game would...But, that aside, beta is and has always been a privileged practice.

Only in the last year or two have you actually seen Betas become public...BFBC2's beta was a nightmare...The game you play now, does NOT even compare to the game we beta'd...Same goes for CODBO...

The issue you have with BF3 is server-based...They're not on an Oracle backbone with a Unix header anymore...It's now 100% Nginx...And that's going to have some release issues.

I've been told there will be a different setup profile come release because, quite frankly...That was the only complaint I had.

They're still tweaking the Frostbite engine, as you can see, and there are a lot of other kinks to work out.

What you're doing when you change your profile is...You're actually logging into their servers...Changing your information...and then it's going live on your PC...Which is why it currently is setup that way.

If it still didn't change, I'd flag it as an annoyance but, in no way/shape/form is that going to deter me from enjoying the BF3 experience...It's a setback not a game-changer.

I understand that most of you "don't like it" but, considering within 10 years you're not going to have a desktop anymore, this is truly the start of server-side dominance and if there are some errors, there are some errors.

Like I said, a bump in the road is far from a roadblock and I don't think it should be treated as a roadblock even if it doesn't get dealt with immediately (again, I was assured it would be).


----------



## Shinobi Jedi

Quote:



Originally Posted by *Fallendreams*


^ +REP

Seriously, do you guys remember the server browser in BFBC2 when it was first released? It took DICE well over 6-8 months to finally fix it! Battlelog has been working so damn well, with only some bugs. I'd rather take this over a main menu like BFBC2's. You seem to forget that the client you're playing on is about 2-3 months old. Most of this stuff has been fixed. "Then why give us an old client?" Well, this gives DICE a chance to have millions of people report bugs, or find bugs that they couldn't in in-house testing. I'd take this build of BF3 over BFBC2 any day!

EDIT: Also, I know most of you just started playing yesterday and not when it was closed... of course yesterday was going to have huge problems... THERE WERE 1.5 million people trying to play at the same time. I suggest you give it a try today or during the weekend. It's been much better.

I can also see your right to knock the game if they weren't updating it or trying to fix the major problems, but they are; we've had 6 updates and fixes since Tuesday and 2 since yesterday!


8 updates, huh? This I did not know. I haven't been able to log on for the last 24 hrs because of work.

To be clear, I think it's a super polished beta. My main issue was that testers were saying that DICE guys were saying on the Battlelog forums that only being able to set your options in-game was a design choice and going into the final build.

This has been my only issue. If it's not true, then it's not a design issue but a miscommunication. And that's groovy.

That was really my only issue. That, and that I could barely get into a game while it was still closed for us MOH purchasers. Which, after the disappointment that was MOH, was the only thing of value I was looking forward to from that purchase. And for the life of me, I could only get into a couple matches. So, therein lies my frustration: not being able to get into a match while it was still closed, and the settings issue.

But if the settings issue is a temp beta thing, then everything is cool for me.

I hadn't played Battlefield since Battlefield 2 and didn't get into BFBC2 until about a year ago when most of the kinks were worked out. So I had no idea the beta back then was that bad, and this one is much better.

I'm gonna try to get on there and see if my initial impressions change.


----------



## Shinobi Jedi

Quote:



Originally Posted by *Masked*


I actually agree with all of that.

We're in SWTOR and I actually thoroughly enjoy it and agree 100% with the limited release...As anyone that's actually played the game would...But, that aside, beta is and has always been a privileged practice.

Only in the last year or two have you actually seen Betas become public...BFBC2's beta was a nightmare...The game you play now, does NOT even compare to the game we beta'd...Same goes for CODBO...

The issue you have with BF3 is server-based...They're not on an Oracle backbone with a Unix header anymore...It's now 100% Nginx...And that's going to have some release issues.

I've been told there will be a different setup profile come release because, quite frankly...That was the only complaint I had.

They're still tweaking the Frostbite engine, as you can see, and there are a lot of other kinks to work out.

What you're doing when you change your profile is...You're actually logging into their servers...Changing your information...and then it's going live on your PC...Which is why it currently is setup that way.

If it still didn't change, I'd flag it as an annoyance but, in no way/shape/form is that going to deter me from enjoying the BF3 experience...It's a setback not a game-changer.

I understand that most of you "don't like it" but, considering within 10 years you're not going to have a desktop anymore, this is truly the start of server-side dominance and if there are some errors, there are some errors.

Like I said, a bump in the road is far from a roadblock and I don't think it should be treated as a roadblock even if it doesn't get dealt with immediately (again, I was assured it would be).


Really well said.

And good to know. Consider me psyched again!


----------



## Fallendreams

Quote:



Originally Posted by *Shinobi Jedi*


8 updates, huh? This I did not know. I haven't been able to log on for the last 24 hrs because of work.

To be clear, I think it's a super polished beta. My main issue was that testers were saying that DICE guys were saying on the Battlelog forums that only being able to set your options in-game was a design choice and going into the final build.

This has been my only issue. If it's not true, then it's not a design issue but a miscommunication. And that's groovy.

That was really my only issue. That, and that I could barely get into a game while it was still closed for us MOH purchasers. Which, after the disappointment that was MOH, was the only thing of value I was looking forward to from that purchase. And for the life of me, I could only get into a couple matches. So, therein lies my frustration: not being able to get into a match while it was still closed, and the settings issue.

But if the settings issue is a temp beta thing, then everything is cool for me.

I hadn't played Battlefield since Battlefield 2 and didn't get into BFBC2 until about a year ago when most of the kinks were worked out. So I had no idea the beta back then was that bad, and this one is much better.

I'm gonna try to get on there and see if my initial impressions change.


I totally understand where you're coming from; the updates have been server-side and to Battlelog, so they didn't require you to DL anything.

I bought 2 copies of MOH, all registered on Oct 12th, 2010, and I didn't receive any keys. Waited in EA support chats for hours with no luck; luckily someone on OCN had a spare and gave it to me.

I'm sorry if I sound like an a-hole; I just have little patience lately for all the stupid comments and opinions about a beta version having bugs. Too many people's expectations were way too high!

The funny thing is, most of the people saying they'd cancel their pre-order didn't... and most people who say they're not buying are going to buy it. They're just crying because they didn't get their way. I mean, the world is not Burger King, and you can't have it your way.

Technology and the way things are being done are changing fast... if you cannot deal with it, then this hobby isn't for you.

Please, I hope no one is taking this as a personal attack. I'm not singling anybody out.


----------



## Shinobi Jedi

Quote:



Originally Posted by *Fallendreams*


I totally understand where you're coming from; the updates have been server-side and to Battlelog, so they didn't require you to DL anything.

I bought 2 copies of MOH, all registered on Oct 12th, 2010, and I didn't receive any keys. Waited in EA support chats for hours with no luck; luckily someone on OCN had a spare and gave it to me.

I'm sorry if I sound like an a-hole; I just have little patience lately for all the stupid comments and opinions about a beta version having bugs. Too many people's expectations were way too high!

The funny thing is, most of the people saying they'd cancel their pre-order didn't... and most people who say they're not buying are going to buy it. They're just crying because they didn't get their way. I mean, the world is not Burger King, and you can't have it your way.

Technology and the way things are being done are changing fast... if you cannot deal with it, then this hobby isn't for you.

Please, I hope no one is taking this as a personal attack. I'm not singling anybody out.


No. I read you.

Actually, I just played some. MUCH, much better. I'm feeling it.

Funny thing: with Vsync off, my FPS averaged around 90-100.

When I turned Vsync On, it bumped it up and locked it in around 120fps or a little better.

So for now, I'm gonna play Vsync On and see how it goes.
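
Average FPS alone can hide the pacing differences behind Vsync observations like this. A small illustrative sketch, editor-added (fps_stats is a hypothetical helper, not a tool from the thread), for summarizing a per-frame time log:

```python
# Illustrative: compute average FPS and the "1% low" from per-frame
# render times in milliseconds, the kind of log a frame-time capture
# tool can dump.
def fps_stats(frametimes_ms):
    """Return (average FPS, 1% low FPS) for a list of frame times in ms."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    slowest = sorted(frametimes_ms, reverse=True)  # worst frames first
    k = max(1, n // 100)                           # the slowest 1% of frames
    one_pct_low = 1000.0 * k / sum(slowest[:k])
    return avg_fps, one_pct_low
```

A steady, Vsync-locked run shows both numbers close together; a big gap between the average and the 1% low is what reads as stutter even when the average looks high.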

I was actually rocking it pretty well in the Operation Metro map, and then my stupid Time Warner internet reset itself and I crashed out of the game. Now if I can just deal with that, I'm good to go.

I wish I'd caught the password to get into that locked 64-player map when it was available...


----------



## Fallendreams

Quote:


> Originally Posted by *Shinobi Jedi;15126086*
> No. I read you.
> 
> Actually, I just played some. MUCH, much better. I'm feeling it.
> 
> Funny thing - With Vsync off, my FPS averaged around 90-100
> 
> When I turned Vsync On, it bumped it up and locked it in around 120fps or a little better.
> 
> So for now, I'm gonna play Vsync On and see how it goes.
> 
> I was actually rocking it pretty well in the Operation Metro map, and then my stupid Time Warner internet reset itself and I crashed out of the game. Now if I can just deal with that, I'm good to go.
> 
> I wish I caught the password to get into that locked 64 player map, when it was available...


Yeah, the game's been running well for me too. The CB map and experience were so EPIC. So much win. That map does slow me down a little in the forest area and at the gas station. I'm also playing with Vsync forced through the NVCP; I haven't felt any input lag, and I couldn't stand the screen flickering.


----------



## rush2049

I very much enjoy reading your guys back and forths.....

As far as betas, I am a programmer so I am very very familiar with the methodologies of testing stuff. Reporting bugs is like 50% of my job so I do it repeatedly and as accurately as possible for anything I test. There is a reason I am in the testing program for Microsoft Game Studios and other companies that send me betas to stuff all the time.... I don't even apply for some of it and definitely do not have time to thoroughly play most of it.

If I got paid to do that, I would love it.... but I am happy just to get to see the mechanics / evolution of games over time. I appreciate the technologies behind them and can usually pinpoint where bugs are coming from when I see them.

So far BF3 beta/alpha has impressed me. These high profile games are more and more polished towards launch than ever before. And if this were 5/10 years ago, the game would have been released at the alpha stage and been patched from there. That I can assure you..... but they are fixing a lot of stuff and taking the time to balance the game, while letting people who might not have played the game get a chance. This might turn a few people away who don't understand the many many iterations software goes through, but they probably won't leave even if they say they will.....

Something I said to my cousin was: there are these two pieces of software here, and both are getting shoved down my throat. Origin is OK; it's better than Steam was when it launched, but it's nowhere near as good as Steam is now. And Battlelog is very good at what it was built to do. It occasionally has some glitches to do with parties and whatnot, but overall it works well for its purpose.

The thing that irks me is that these are two wholly different systems. Why are my Origin friends not also Battlelog friends? Why can't someone click on my name in Origin and join my game in BF3? Why is the Battlelog chat different from the Origin chat? They either need to ditch one or the other, or make the two systems interface 100%. Any feature used in one needs to talk to the other.

That is the biggest issue I have with the whole BF3 launch! The game is great; they can patch/fix everything over time. It is the applications being forced on me alongside it that I have an issue with. Those are the things that need massive work.

What happened to being able to launch Origin games without having Origin open?


----------



## MKHunt

Quote:


> Originally Posted by *Masked;15125265*
> I agree actually, with all of that.
> 
> We're in SWTOR and I actually thuroughly enjoy it and agree 100% with the limited release...Actually...As anyone that's actually played the game, would as well but, that aside...Beta is and has always been a privileged practice.


I agree with this. I have been lucky enough to be part of the SW:TOR beta and it's been quite enjoyable. Depending on the monthly price I may even get the game at launch. My expectations were rather high due to the marketing, which was excellent. I've also done a lot of Blizzard betas, which are so polished it's like being given an early release copy.

The PoE beta is also extremely polished (for what they've implemented), IMO. There are some features yet to be implemented (I can still run through a mob without being forcibly engaged), but that will come with time.

I'm going to trust you on the settings bit. You definitely have a more detailed view than any of us.
Quote:


> Originally Posted by *Fallendreams;15125615*
> The funny thing, most of the people saying they cancel there pre order didn't.. and most people who say there not buying are going to buy it. There just crying because they didn't get there way. I mean the world is not burger king, and you cant have it your way.


I also agree with this. If people are going to try to make a statement by voting with their wallet, then they need to follow through. Empty words and butthurt mean nothing and show only that someone has an opinion they aren't willing to stand behind. It's really a problem in every facet of our society.
Quote:


> Originally Posted by *rush2049;15127170*
> Something I have said to my cousin was: There are these two pieces of software here, both are getting shoved down my throat. Origin is ok, its better than steam was when it launched, but it is nowhere near as good as steam is now. And battlelog is very good at what it was built to do. It occasionally has some glitches having to do with parties and what not, but overall works well for its purpose.
> 
> The thing that irks me about it is... these are two wholly different systems. Why are my origin friends not also battlelog friends???? Why can't someone click on my name in origin and join my game in bf3? Why is the battlelog chat different from the origin chat? They either need to ditch one or the other. That or make the two systems interface 100%. Any feature that is used in once needs to talk to the other.
> 
> That is the biggest issue I have with the whole bf3 launch! The game is great, they can patch/fix everything over time. It is the applications being forced upon me along side that I have issue with. Those are the things that need massive work.
> 
> What happened to being able to launch origin games without having origin open?


This, this, this, and more this. If they are dead-set on keeping the two separate, then why not code a super-lightweight browser (based on an open-source engine) into one of the game screens that _only_ connects to Battlelog? Why can't EA's servers talk to other servers owned by EA or those it finances?

Most of my frustration with the beta is because of this. It feels insanely disjointed. When people come to see the beta, or try playing it, they mostly end up seeing a lot of Firefox/IE/Chrome. At one point last night one of my friends commented, "Well, I came to see BF3 and all I've gotten a feel for is your Firefox theme. I like it, by the way; the dark tones mesh nicely with your black glass." If they delayed the release by a month to integrate the three packages into one functioning unit, I would happily preorder and pay in full again.

My other major disappointment is that it doesn't feel like the successor to BF2. On the off chance that I manage to join a server with some form of teamwork, it's great. There's that sense of cohesion that you never get in other shooters. But in every other circumstance it just feels like I'm grinding kills to get the next unlock. If I want that, I'll play CoD, where that is the sole purpose of the game. I'm hoping this is just a function of the Rush gametype and not indicative of the majority of the gameplay.

I really, really miss the early days of Bad Company 1, playing Gold Rush on XBL and having everybody work together to coordinate attack and decoy strategies. That cooperative gameplay, where you stop worrying about your score and just hope you fill your role, is magical. Any game that can reproduce that will win my heart.

ETA: It could also use something to prevent spawn camping. Too many times have I joined a game and ended up on an assault team that is getting massacred by insane spawn camping. The camping is probably the reason the slots were open in the first place.

I have no doubt that this will be fixed with time. Halo: Reach is _still_ getting balance tweaks to this day, lol.

...

I also yearn for the day that someone does a graphics and engine overhaul to Morrowind but keeps the interaction and character building exactly the same. If someone does that, I will praise them as though they were gods.


----------



## rush2049

Quote:


> Originally Posted by *MKHunt;15128167*
> I also yearn for the day that someone does a graphics and engine overhaul to Morrowind but keeps the interaction and character building exactly the same. If someone does that, I will praise them as though they were gods.


http://morrowindoverhaul.net/blog/

...though I think it does change the game a bit, the graphical updates are amazing!


----------



## Shinobi Jedi

Dungeon Siege III is on sale at Steam for half off regular price.

I picked it up; it's pretty fun. Interestingly, when I run it full blast it taxes all four of my GPUs to the max, even with Vsync turned on.

I'm down with BF3 again. I'm going to hold onto my pre-order and buy the game at launch, if only to help encourage the industry to start developing for PC as its base platform and then port to consoles, like they're doing with BF3, instead of the opposite.

ETA: Can anyone in the SW:TOR beta recommend a class to play, without breaking any NDAs?

I can't decide. The fan in me that just wants to RPG a character and doesn't care about MMO stuff totally wants to be a Jedi. The gamer in me knows 'everyone' is going to be a Jedi.

That's combined with the fact that my only MMO experience is with DCU Online (I actually liked it quite a bit; a nice action RPG that was fun to play with my 360 controller) and a bit of Star Trek Online.

There's still a bunch I don't get about the MMO style of gameplay. I also hope they make it a game where, once you hit the level cap, you don't need a group of people to keep progressing or having fun. I often played DCUO at odd hours due to work, and the game would be empty (this was before it got empty for real), and there was nothing I could do because I'd hit the level cap and needed a group for raids and such.

On another note: the BF3 beta does support the Xbox 360 Controller for Windows in MP. While I agree it's not the preferred control method, if you get carpal tunnel or cramped hands from playing with mouse and keyboard, and you don't mind getting killed a lot while supporting your squad, this is a godsend.


----------



## Swolern

PLEASE HELP. BAD STUTTERING DURING GAMING GTX 590.

Hey guys. I'm new to the forum and to PC gaming in general. I finally smartened up and switched over from the consoles' horrible graphics. I built my first PC about a month ago and I'm loving it. I have the EVGA GTX 590 Classified, so most of my games run well above 60 FPS, which is my 1920x1080 monitor's limit.

A couple of games that really push the GPU (Crysis 2, Metro 2033, and The Witcher 2, all at max settings) give me a problem.
(Running FRAPS in-game to watch the framerate:) when the rate on screen is capped at 60 FPS, everything is smooth as butter during movement. But when the framerate drops below 60 I get bad, constant stuttering during movement, very noticeable when panning slowly left or right. And it happens right below 60; at 59 FPS I already get this stutter effect.

It's not screen tearing; there are no cut lines across the screen or polygon malformations. I do get tearing when Vsync is off, so I know what that looks like. Could this be microstutter? I've read that microstutter isn't very noticeable, and this effect is very jerky. It does look like frames are being skipped, which causes the judder effect. If so, why would I get stutter at 59 FPS and none at 60, where it's perfectly smooth? All this with Vsync enabled. Any help would be greatly appreciated. Thanks.
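One commonly cited mechanism for exactly this 59-vs-60 behaviour (offered here as an illustration, not a diagnosis of this particular card): with double-buffered Vsync on a 60 Hz panel, a frame can only be swapped to the screen on a refresh boundary, so any frame that misses the ~16.7 ms deadline is held for a whole extra refresh. A minimal Python sketch of that rounding, with made-up render times:

```python
import math

REFRESH_MS = 1000.0 / 60.0  # one refresh at 60 Hz, ~16.67 ms

def displayed_ms(render_ms):
    """With double-buffered Vsync, a frame waits for the next refresh
    boundary, so its on-screen time rounds up to a multiple of REFRESH_MS."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

# Frames rendered just under the deadline stay at ~16.7 ms; frames that
# miss it by even 1 ms are held a whole extra refresh (~33.3 ms).
for render in (15.0, 16.0, 17.0, 18.0):
    print(f"{render:4.1f} ms rendered -> {displayed_ms(render):4.1f} ms on screen")
```

So a game averaging 59 FPS is really alternating between ~16.7 ms and ~33.3 ms frames, which reads as judder, while a locked 60 FPS never misses the deadline at all.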

I also tried the 285 beta drivers and the 280s, both with the same results.

System
-CPU- i5 2500k OC @ 4.4GHz
-Cooler- CM Hyper 212
-Motherboard- Asus-Z68-Pro
-RAM- G.Skill DDR3 1600 CL8 1.5v
-GPU-EVGA GTX 590 Classified (factory Overclocked) newest 280 driver
-Samsung F4 2tb HDD (will buy SSD soon)
-PSU-Corsair 1050 watt Professional Series
-Case-Cooler Master HAF X


----------



## Simirath

I also bought this system a few days ago and I'm having the same problem. Is there a setting or piece of information we're missing about the stuttering? With my old system (6950 Dirt 3 edition / i7 860, etc.) games were smooth even at lower FPS than I'm getting here.

Could it be the monitor? I don't think so; otherwise it would happen in movies and such as well. Can someone help? I'm getting annoyed by this.

Thanks.


----------



## Tept

Sounds to me like both of you have some settings wrong in the NVIDIA Control Panel. It can also be a driver issue with certain games. I suggest forcing Vsync on and setting Power management mode to Prefer Maximum Performance. Make sure Maximum pre-rendered frames is at 3; you can even try 4 or 5 to see if that fixes the issue, but I wouldn't go any higher than that.


----------



## Simirath

I'll try when I get home to see if it helps.

Thank you. I hope it's just a setting so we can fix it with your help.


----------



## Simirath

Hello everyone.

I had a chance to run a 3DMark Vantage test today and this is the result. Is everything OK?










Today I was testing in Rift and I had a feeling it was stuttering a little, which got annoying. I checked the temperature; it was around 75. I restarted the PC a few times but it didn't help; the stuttering was still there. I was also having an SSD problem (one-minute freezes with a black screen and random BSODs), so I updated the firmware, did a clean install of the drivers, and re-seated every cable in the PC and monitor. No problems for 6-7 hours so far. It looks like the stuttering is gone, but I just can't be sure. This is a new build, and having a few problems has made me a little upset, to be honest.

How can I be sure that everything is OK? The 3DMark Vantage score is there, but I'm not sure whether it's good enough.

I tested my system in World of Warcraft in a crowded area and FPS was around 30-40; in a raid with lots of AoE and spell effects it drops to 25. Some people say they always get 60 FPS in this game, but I don't, and we have pretty much the same system, so what's wrong here?

Should I exchange this card for another? Here are the options: they have the Zotac 590, 580, and 580 AMP, and the MSI N580GTX Twin Frozr II/OC.

What games do you recommend for testing my card? I want to be sure everything is good before the 7-day return window runs out; I have 2 more days, so I need your help here.

Thank you.


----------



## L1eutenant

Hmm. I'm sure when I ran this test with my sig rig I scored around the 60k mark. Not sure if it was the same software, though.

I will download and run it tonight when I get home, as I think I'm having the same issues as you.


----------



## Swolern

Quote:


> Originally Posted by *Tept;15146996*
> Sounds to me both of you have some settings wrong in nvidia control panel. This can also be issues with the driver on certain games. I suggest setting Force Vsync on and set Power management mode to Prefer Maximum Performance. Make sure Maximum pre-rendered frames is at 3, you can even try 4 and 5 to see if it fixes the issue, but I wouldn't go any higher than that.


Thanks for the advice, Tept. Unfortunately, the settings you suggested had no effect on my stuttering.

@Simirath: For me the stuttering is most visible in The Witcher 2 at ultra settings with ubersampling enabled. I like to test in the new arena mode, at the very beginning, in the room right before you go out to fight. I look at the light canister on the wall and then pan left and right at different speeds. The stuttering is very noticeable because the light canister jumps around everywhere at 55 FPS (pretty much the whole environment jumps and is very jittery when moving slowly, but the canister makes it more noticeable). When I go back and lower the graphics settings so the FPS stays locked at 60 and pan around looking at the canister, the light just floats seamlessly across my screen.

List of the interventions I've tried to fix the stuttering; nothing worked:
- installed each previous driver for the 590, even the old archived ones
- stress-tested each component via 3DMark; everything scored above the recommended score
- tried Tept's settings
- disabled SLI to run only one GPU, to rule out SLI microstutter
- tested multiple games at different settings; all show the same effect: no stutter at 60 FPS, but at 59 and below the entire screen stutters

Simirath, do you see the same effect as me, where the stuttering is gone at 60 FPS but returns in the high 50s?

I'm pulling my hair out here. Anyone have any other suggestions?
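One more diagnostic worth adding to that list: FRAPS's benchmark mode can log per-frame timestamps (the "frametimes" CSV), and differencing those timestamps exposes hitches that the on-screen FPS counter averages away. A small Python sketch of that analysis; the two-column layout (frame number, cumulative time in ms, one header row) is an assumption, so check it against your actual log:

```python
import csv

def frame_intervals(path):
    """Read cumulative frame timestamps (ms) from a FRAPS-style
    frametimes CSV and return the per-frame deltas in ms."""
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    times = [float(row[1]) for row in rows[1:]]  # skip the header row
    return [b - a for a, b in zip(times, times[1:])]

def summarize(deltas):
    """Average and worst frame time: a big gap between the two means
    stutter even when the average looks like a smooth 60 fps."""
    avg = sum(deltas) / len(deltas)
    return f"avg {avg:.1f} ms, worst {max(deltas):.1f} ms"
```

A run whose deltas sit steadily near 16.7 ms is smooth; a run with the same average but occasional 33+ ms spikes is exactly the "jerky below 60" symptom described above.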


----------



## MKHunt

Quote:


> Originally Posted by *Swolern;15154671*
> @ Simirath. For me I see the stuttering most in Witcher 2 with ultra settings and *uber sampling enabled*.


Honestly, that's your problem. Stuttering is FPS change, and ubersampling causes drastic FPS changes. Disable ubersampling, leave everything else at ultra, and be happy.

If you can see a difference, it's either mental or you're not _playing_ the game.

ETA: It also sounds like you have Vsync on. Disable the crap out of that; it's known to cut GPU usage drastically.


----------



## Swolern

Quote:


> Originally Posted by *MKHunt;15154738*
> Honestly that's your problem. Stuttering is FPS changes. Ubersampling causes drastic fps changes. Disable ubersampling, leave everything else at ultra, and be happy.
> 
> If you can see a difference, it's either mental or you're not _playing_ the game.
> 
> ETA; It also sounds like you have vsync on. Disable the crap out of that. It's known to cut GPU usage drastically.


Unfortunately, the stuttering doesn't only happen in The Witcher 2. It happens in every game where the framerate falls below 60 FPS (Crysis 2 and Metro 2033 too). Playing The Witcher 2 with ubersampling off, my framerate goes to 120 FPS in SLI, so it's locked at 60 and everything is perfect. But I should be able to run it fine with ubersampling on; that's why I bought the 590. With ubersampling off and SLI disabled on my 590, my framerate is around 56 FPS and again I have the stuttering.

And it's not my imagination. Even my 9-year-old son tells me to put the game back the other way because it's too shaky. It's extremely noticeable: the framerate is smooth at 60, then it hits the 50s and everything starts shaking during movement.

And my monitor is 60Hz, so if I turn off Vsync most games run way above 60 FPS and the screen tearing is horrible.

Thanks for the reply.


----------



## MKHunt

Quote:


> Originally Posted by *Swolern;15156090*
> Unfortunately the stuttering does not only happen in Witcher 2. It happens in every game where the framerate falls below 60fps (Crysis 2 & Metro 2033). When playing Witcher 2 with uber sampling off my framerate goes to 120fps using SLI and my framerate is locked at 60fps and everything is perfect. But I should be able to run it fine with uber on that's why I bought the 590. Now with ubersampling off and *using only one GPU from my 590* then my framerate is at @56fps and again I have the stuttering.
> 
> And it's not my imagination. Even my 9 y/o son tells me to put the game back the other way because it's to shaky. It's extremely noticeable when the framerate is smooth at 60, then it hits the 50s mark and everything starts shaking during movement.
> 
> And my monitor is 60Hz so if I turn off Vsync most games are way above 60fps and the screen tearing is horrible.
> 
> Thanks for reply.


I am confused by the bolded part. Why would you ever do this? The closest I've come to that is folding on one GPU and browsing the net on the other. Is there a reason you need to run the card at half capacity?

Sorry if I wasn't very clear. What I meant is that the image quality isn't affected much by ubersampling. Not the shaking; the shaking is most definitely noticeable.









For any game with the shaking, disable Vsync. It's 100% useless at that point (below 60) and will only handicap your card.

NVIDIA drivers have a problem where enabling Vsync seems to chop off about 30% of your performance. This is why most of us only run it in games where the tearing is _really_ bad. I have a 60Hz monitor as well, and not a single newer (post-2009) game of mine has Vsync enabled, for this reason alone.

The only other things I can think of are a fresh install of Windows and overclocking your CPU. Do you check the "Perform Clean Install" box when installing the NVIDIA drivers?


----------



## Swolern

Quote:


> Originally Posted by *MKHunt;15156242*
> I am confused by the bolded part. Why would you ever do this? The closest to that I've come is folding on one GPU and browsing the net on the other. Is there a reason you need to run the card at half capacity?
> 
> Sorry if i wasn't very clear. What I mean is that the image quality isn't affected overmuch by ubersampling. Not the shaking. Shaking is most definitely noticeable
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any game with the shaking, disable vsync. It's 100% useless at that point (below 60) and will only handicap your card.
> 
> nVIDIA drivers have a problem where vsync enabled seems to chop off about 30% of your power. This is why most of us only run it on games where the tearing is _really_ bad. I have a 60Hz monitor as well and not a single new (post 2009) game I have has vsync enabled for this reason alone.
> 
> The only other thing I can think of is a fresh install of windows and overclocking your CPU. Do you select the "Perform Clean Install" box when installing the nvidia drivers?


I disabled SLI on the 590 to rule out microstutter between the two GPUs as the cause of my stuttering issue.

The reason I use Vsync: with it on, in these GPU-intense games (maxed settings) I sit around 55-60 FPS, with stutter below 60. With Vsync off I get framerates above 60, and then comes the screen tearing, and it's bad. I'm screwed both ways.

I do select clean install when installing drivers. I haven't tried a fresh install of Windows yet; I'll try that tomorrow after I get off work (I work the night shift). My i5 is already overclocked. Could my HDD be a possible cause?

System

-CPU- i5 2500k OC @ 4.4GHz
-Cooler- CM Hyper 212
-Motherboard- Asus-Z68-Pro
-RAM- G.Skill DDR3 1600 CL8 1.5v
-GPU-EVGA GTX 590 Classified (factory Overclocked)
-Samsung F4 2tb HDD (will buy SSD soon)
-PSU-Corsair 1050 watt Professional Series
-Case-Cooler Master HAF X


----------



## MKHunt

Quote:


> Originally Posted by *Swolern;15156655*
> The reason I use Vsync is with it on these GPU intense games (maxed settings) i sit about 55-60fps with stutter below 60. With Vsync off I get framerates above 60 and then comes the screen tearing, and it's bad. I'm screwed both ways.


Yeah, Vsync is definitely your problem. Like I said, it's a known issue. There's not much you can do but wait and hope NVIDIA fixes it. Sorry to be the bearer of bad news.


----------



## Swolern

Quote:


> Originally Posted by *MKHunt;15156738*
> Yeah vsync is definitely your problem. Like I said, it's a known issue. Not much you can do but wait and hope nvidia fixes it. Sorry to be the bearer of bad news.


Well, that makes sense; it looks like the frames aren't coordinated with my display when the framerate is below 60. So I don't have a bad GPU? If it's the drivers, why aren't more people complaining about this issue? From my viewpoint it looks pretty bad and ruins the immersive gameplay. Thanks for your help.


----------



## MKHunt

Quote:


> Originally Posted by *Swolern;15156912*
> Well that makes sense. It looks like the frames are not in coordination with my display when framerate is below 60. So I don't have a bad GPU? If it's the drivers why arent more people complaining about this issue? Because from my viewpoint it looks pretty bad and ruins the immersive gameplay. Thanks for your help.


It's just something we live with. I get immersed easily, so I don't notice minor tearing. I only really notice it when there's apparent ghosting or horizontal lines of distortion.


----------



## Swolern

Quote:


> Originally Posted by *MKHunt;15156979*
> It's just something we live with. I get immersed easily so I don't notive minor tearing. I only really notice it when there is apparent ghosting or horizontal lines of distortion


But it's not minor in my case. It's very jerky and distorts the entire environment. Do ATI cards have this same problem?

Reps for your help.


----------



## MKHunt

Quote:


> Originally Posted by *Swolern;15157094*
> But it's not minor in my case. It's very jerky and causes distortion to the entire environment. Do ATI cards have this same problem?
> 
> Reps for ur help.


That's very interesting. Is there any way you can record and upload footage with Vsync on and off? Be sure to include the FRAPS or Afterburner OSD.

ETA: As for ATI cards... I have no idea whatsoever :/ Sorry.


----------



## Swolern

Quote:


> Originally Posted by *MKHunt;15157164*
> That's very interesting. is there any way you can record and upload footage with vsync on and off? Be sure to include fraps or afterburner's OSD
> 
> ETA: As for ATI cards... I have no idea whatsoever :/ Sorry.


I haven't before, but I'll give it a try when I'm off work tomorrow. Thanks again for your help. At least I have a possible source of the problem to chase: Vsync.

Awesome forum btw.


----------



## Swolern

Quote:


> Originally Posted by *MKHunt;15157164*
> That's very interesting. is there any way you can record and upload footage with vsync on and off? Be sure to include fraps or afterburner's OSD
> 
> ETA: As for ATI cards... I have no idea whatsoever :/ Sorry.


Still at work, but I found a video, and this is exactly what it looks like:




http://www.youtube.com/watch?v=TgM2O8Ci4SU&feature=related


----------



## Smo

Quote:


> Originally Posted by *Swolern;15157408*
> Still at work but found a video and this is exactly what it looks like.


That's very interesting. I have the exact same problem, but only in one game: Metro 2033. All others are fine.


----------



## Swolern

Quote:


> Originally Posted by *Smo;15157980*
> That's very interesting - I have the exact same problem, but only in one game; Metro 2033. All others are fine.


Interesting. I think this may be a driver issue with Vsync, as MKHunt said. Have you tried The Witcher 2 at max settings with ubersampling enabled? Metro and The Witcher are the only games that average below 60 FPS for me, and that's when this stuttering starts. Crysis 2 stays mostly above 60; every once in a while it dips into the 50s, with the same stutter.

There aren't many games the 590 can't run above 60 FPS maxed out, but in the few there are, this stutter issue appears, more noticeable when panning left or right than when walking straight. I'm trying to get this fixed before the BF3 release, because I've read the ultra settings are going to test any system.


----------



## RobotDevil666

Hmmmmmmm, interesting. I've been reading this for a while, and out of sheer curiosity I tested my card as well.
I find tearing really annoying, so I always have Vsync on. I tested The Witcher 2, Crysis, Crysis 2, and Metro 2033, and all are fine, but then I found I have this exact issue in the BF3 beta.
When my FPS is at 60 everything is super fluid, but as soon as my FPS drops into the 50s I get this stutter.
Honestly, this is getting confusing.


----------



## Masked

It's a beta.

Everyone is having tearing issues...Even ATI.

It's because the engine isn't finalized...

Tell THEM about the stutter, tell THEM about the Vsync issues...

A beta is about feedback; even if you're just emailing them, you're giving them feedback. Talking about it here is never going to solve the problem; telling them about it in droves will.

Never expect a graphically polished game in a beta. It's a bad practice, one I learned a very long time ago; it only gets your hopes up to be torn down.

Amazing games that had the WORST betas ever: Morrowind, CODBO, BFBC2, Borderlands; Doom 3's beta was a graphical mess... etc.


----------



## Swolern

@ Robotdevil
I saw it in the Battlefield 3 beta also, when FPS < 60. It's harder to notice the faster you pan left or right, due to the blur effect. Slow panning close to objects or walls is where the stuttering is most noticeable.


----------



## Swolern

Quote:


> Originally Posted by *Masked;15159018*
> It's a beta.
> 
> Everyone is having tearing issues...Even ATI.
> 
> It's because the engine isn't finalized...
> 
> Tell THEM about the stutter, tell THEM about the Vsync issues...
> 
> A beta is about feedback, even if you're just emailing them, you're giving them feedback...Talking about it here, is never going to solve the problem...Telling them about it in droves, will.
> 
> Never ever expect a graphically appeasing game in a beta...It's a bad practice...One I learned a very long time ago, only gets your hopes up to be torn down.
> 
> Amazing games that had the WORST betas ever: Morrowind, CODBO, BFBC2, Borderlands, Doom 3's beta was graphically ******ed...etc.


But I'm having the same stuttering in The Witcher 2, Crysis 2, and Metro 2033 as well.


----------



## Masked

Quote:


> Originally Posted by *Swolern;15159150*
> But I'm having the same stuttering with Witcher 2, Crusis 2, and Metro 2033 also.


I think you all need to define what you think "stuttering" is...

For example, I just hired an intern who graduated from Yale: arts/graphic arts with a dual major in networking. The smartest intern I've ever hired.

She thinks stuttering is "when a game kind of pauses and loads for a few moments and then speeds up suddenly back to normal".

Folks, that's not stuttering. There are several reasons for that: #1, if you're on a server, that server is now loading; #2, if you're on a 7200 RPM or 10k drive, the game is loading into the cache.

I tend to find that what 90% of you refer to as "stuttering" is not stuttering.

If your game actually stutters and skips, that's stuttering, and that's a graphical issue, one which I've noticed in 2 of 3 recent betas and which in no way/shape/form is worth the panic.

I.e., frames skip across and actually aren't present.

If you're turning in BF3 and it suddenly lags out on you, that's purely, 100% buffer/load capacity, which is far from being your card's fault.


----------



## Fallendreams

Quote:


> Originally Posted by *Swolern;15154671*
> Thanks for the advice Tept. Unfortunately when I changed the settings you suggested they had no effect on my stuttering.
> 
> @ Simirath. For me I see the stuttering most in Witcher 2 with ultra settings and uber sampling enabled. I like to test in the new arena mode, in the very beginning, in the room right before you go out to fight. I look at the light canister on the wall and then pan around left and right at different speeds. The stuttering is very noticeable cause the light canister on the wall jumps around everywhere *with FPS @ 55.* (pretty much the whole environment jumps and very jittery when moving slowly, but the light canister is more noticable) Now when I go back and decrease the graphics settings so the F*PS stay locked at 60fps* and pan around looking at the light canister, the light just floats seamlessly across my screen.
> 
> List of most of interventions Ive tried to fix stuttering- nothing worked
> - installed each previous driver for the 590, even the old one archived ones
> - stress and tested each component via 3dMark. Everything scored above recommended score
> - attempted Tept's setting
> - disabled SLI to run only one GPU to rule out SLI microstutter
> - tested multiple games at different settings. All same effect. No stutter at 60fps. 59 & less= entire screen stutter.
> 
> Samirath do you have same effect as me where the *stuttering is gone* with framerate *@ 60fps* but returns when in high 50s?
> 
> I'm pulling my hair out here. Anyone have any other suggestions?


As you can see, I highlighted some things. The stuttering you're experiencing is not microstuttering or the NVIDIA drivers' fault. I have experienced this too. MY STUTTERING EXPERIENCE:

Having Vsync on will cause this effect, but it's not a driver issue. When you're at 60 FPS your frame latency is about 16ms. It's been said the human eye can't see differences up to around the 20ms area. I say bull! When I had GTX 580s in SLI I had this problem all the time.

If the game was at 60 FPS (16ms) it felt so smooth. The issue I was having was the frame times jumping quickly from 16ms to 21ms, back to 17ms, up to 22ms (like going from 60 FPS to 52 to 56 to 50); this would throw my eye off. I feel that with Vsync on you need a solid 60 FPS, or no lower than 58, for smoothness.
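The numbers above are just the reciprocal relationship between framerate and frame time (frame time in ms = 1000 / fps), so a few milliseconds of jitter is a large swing in fps. A quick illustrative conversion:

```python
def frame_ms(fps):
    """Frame time in milliseconds for a given framerate."""
    return 1000.0 / fps

def fps_from_ms(ms):
    """Framerate implied by a given frame time in milliseconds."""
    return 1000.0 / ms

print(f"60 fps -> {frame_ms(60):.1f} ms per frame")   # ~16.7 ms
print(f"22 ms  -> {fps_from_ms(22):.1f} fps")         # ~45.5 fps
print(f"16 ms  -> {fps_from_ms(16):.1f} fps")         # 62.5 fps
```

Because the relationship is nonlinear, the same few-ms wobble that is invisible at high framerates becomes a visible fps swing right around the 60 fps / 16.7 ms point.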

That's where you're seeing this stuttering effect. I played with it for three months and couldn't solve it on my GTX 580 SLI.

I would try RMAing the cards. My cousin is running one of those GTX 580s: EVGA sent me a new GTX 580, I sold it to him, and he doesn't have this problem anymore.

I suggest RMAing those cards. It's the only fix I can think of; it worked for me.

I know RMAing doesn't seem like much help, but why would EVGA send me another GTX 580 if the first one had nothing wrong with it? It doesn't make sense.

(Why didn't I keep the GTX 580 SLI? I sold one of them off before RMAing the other. I'm also quite happy with my GTX 590.)

Quote:


> Originally Posted by *MKHunt;15156242*
> I am confused by the bolded part. Why would you ever do this? The closest to that I've come is folding on one GPU and browsing the net on the other. Is there a reason you need to run the card at half capacity?
> 
> Sorry if i wasn't very clear. What I mean is that the image quality isn't affected overmuch by ubersampling. Not the shaking. Shaking is most definitely noticeable
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any game with the shaking, disable vsync. It's 100% useless at that point (below 60) and will only handicap your card.
> 
> *nVIDIA drivers have a problem where vsync enabled seems to chop off about 30% of your power.* This is why most of us only run it on games where the tearing is _really_ bad. I have a 60Hz monitor as well and not a single new (post 2009) game I have has vsync enabled for this reason alone.
> 
> The only other thing I can think of is a fresh install of windows and overclocking your CPU. Do you select the "Perform Clean Install" box when installing the nvidia drivers?


Not true; AMD and NVIDIA cards will both do this. It also depends on the game. CODBO will use around 20-30% on both cards in multiplayer, and single player will use around 30-60%. The BF3 beta is using 60-80%. Vsync does this; why should the card render more frames than it needs to? Wouldn't you do less work if you could?
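If you want to sanity-check those usage numbers yourself instead of trusting one overlay, newer NVIDIA drivers ship `nvidia-smi`, which can report per-GPU utilization. A rough sketch of reading it from Python (assuming `nvidia-smi` is on your PATH and your driver supports the `--query-gpu` flag; the function names here are my own):

```python
# Rough sketch: read per-GPU utilization by parsing nvidia-smi output.
# Assumes nvidia-smi is on the PATH and supports --query-gpu
# (check your driver version); function names are illustrative.
import subprocess

def parse_utilization(text):
    """Turn nvidia-smi lines like '35 %' into a list of ints."""
    return [int(line.replace("%", "").strip())
            for line in text.strip().splitlines()]

def query_gpu_utilization():
    """Ask nvidia-smi for the current utilization of every GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader"])
    return parse_utilization(out.decode())
```

On a 590 you'd expect two values back, one per GPU, so you can log both over a play session and see exactly how a game is splitting the load.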








Quote:


> Originally Posted by *Swolern;15157408*
> Still at work but found a video and this is exactly what it looks like.


Read the first post dude ^

Good luck man, I'm sorry i don't have a magic fix for you









Anyone, please feel free to jump in if I explained something wrong or you feel I'm off base somewhere.









Just trying to share my experience


----------



## L1eutenant

My GTX 590 whilst running black ops.

GPU1. 70-80%
GPU2. 20-30%

Whilst playing the BF3 Beta on ultra.

GPU1. 90-99%
GPU2. 90-99%

It would seem that Black Ops simply does not like my card since the last patch update.

And once I set it to let the 3D application decide, my FPS went from 30 to 70-80 in the BF3 Beta.

So thank god there isn't anything wrong. (I had been worrying that something was for a few months now.)


----------



## Swolern

Quote:



Originally Posted by *Fallendreams*


As you can see, I highlighted some things. The stuttering you're experiencing is not microstuttering or the Nvidia drivers' fault. I have experienced this too. My stuttering experience:

Having vsync on will cause this effect, but it's not a driver issue. When you're at 60fps your frame latency is about 16ms. It's been said the human eye can't tell the difference up to around the 20ms area. I say Bull****! When I had GTX 580 SLI I had this problem all the time.

If the game was at 60fps (16ms) it would feel so smooth. The issue I was having was that the frame times were jumping so fast, from 16ms to 21ms, back to 17ms, up to 22ms (it's like going from 60fps to 52 to 56 to 50fps), and this would throw my eye off. I feel that with vsync on you need a solid 60fps for smoothness, or no lower than 58fps.

This is where you're seeing this stuttering effect. I played with it for 3 damn months and could not solve it on my GTX 580 SLI.

Just trying to share my experience










Finally I'm starting to see the light. That explains so much. Thank you.

Questions-
-1. My problem is with the EVGA 590. Is the stuttering completely gone with your new 590 card? (even with framerates in the 50s with Vsync enabled) Which brand 590 did you buy?

-2. The latency differences you described, from 16-22ms with FPS jumping from 60 to 52 and back and forth: this is what I understand as microstutter. Am I wrong? And I've read the 590 has less microstutter because the 2 GPUs are on one card and have less of a bridge distance than 580 SLI.

-3. Are you saying my 590 is faulty? I'm out of the return period, but EVGA provides a lifetime warranty, so I can haggle with them.

Thanks a ton Fallendreams. Repped.
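One way to put a number on the frame-time jumping mentioned in question 2: log the frame times and look at how much each frame differs from the last. A small illustrative Python sketch (my own metric for this thread, not any standard tool's):

```python
# Illustrative microstutter metric: mean absolute difference between
# consecutive frame times. A perfectly paced log scores 0.

def avg_jitter_ms(frame_times):
    """Average frame-to-frame change in ms across a log of frame times."""
    if len(frame_times) < 2:
        return 0.0
    diffs = [abs(b - a) for a, b in zip(frame_times, frame_times[1:])]
    return sum(diffs) / len(diffs)

print(avg_jitter_ms([16.7] * 10))        # steady 60fps log: 0.0
print(avg_jitter_ms([16, 21, 17, 22]))   # the 16-22ms swings described above
```

The point: two logs can have the same average FPS, but the one with high frame-to-frame jitter is the one that looks like stutter.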


----------



## Fallendreams

Quote:



Originally Posted by *Swolern*


Finally I'm starting to see the light. That explains so much. Thank you.

Questions-
-1. My problem is with the EVGA 590. Is the stuttering completely gone with your new 590 card? (even with framerates in the 50s with Vsync enabled) Which brand 590 did you buy?

-2. The latency differences you described, from 16-22ms with FPS jumping from 60 to 52 and back and forth: this is what I understand as microstutter. Am I wrong? And I've read the 590 has less microstutter because the 2 GPUs are on one card and have less of a bridge distance than 580 SLI.

-3. Are you saying my 590 is faulty? I'm out of the return period, but EVGA provides a lifetime warranty, so I can haggle with them.

Thanks a ton Fallendreams. Repped.


Doing midterms today. I'll respond to your questions when I get home.

Sent from my HTC Aria using Tapatalk


----------



## Masked

Quote:



Originally Posted by *L1eutenant*


My GTX 590 whilst running black ops.

GPU1. 70-80%
GPU2. 20-30%

Whilst playing the BF3 Beta on ultra.

GPU1. 90-99%
GPU2. 90-99%

It would seem that Black Ops simply does not like my card since the last patch update.

And once i set to let the 3D application decide my FPS went from 30 to 70-80 in the BF3 Beta.

So thank god there isn't anything wrong. (i have been worrying that something is for a few months now.)


It's a beta...NEVER ever use a beta even as a performance check.

Apparently there's some article saying the engine is finalized and that's what Ultra is going to be...Wrong, sorry.

In a meeting we had w/them, the engine is NOT finalized even on the maps in the beta, so they haven't even given us a final "this is the final product".

In 3 weeks a lot can happen and I personally think it's going to be @80-90% dedicated on a 590 for ultra...

I've also heard that DICE may not offer the next Battlefield to the public in beta form, which, after the overall response, IMO may just be a good idea...Their forums are littered with whining instead of constructive criticism and are rampant with pre-teens... disappointing.

I also wouldn't use the 280 drivers when playing BF3; use the optimized flavor...Those are actually better.


----------



## rush2049

I am curious: why do we have an influx of new GTX 590 owners? It just seems that people with new accounts are coming in with a 590... interesting. Not bad, just interesting.

In response to the performance of video cards in different games: assuming all limiting of FPS is disabled (vsync and engine limitations) and you are seeing less than 100% utilization of a video card, there is only one possible thing happening.
Bottlenecking...
This can be anywhere: between your CPU and video card, hard drive and RAM, VRAM and RAM. Or even in an internal pipeline inside the video card, e.g. all the pixel shader pipelines are busy but the other pipelines are left vacant of work units.

Now, about the topic of using betas to judge a final product: DON'T DO IT.

Beta and development builds of a product are very, very different from the finalized versions, often double to triple the size of the final compiled software because of all the error- and bug-gathering code included in a test compile.

The performance will not be the same as the final product. Period.

Now onto the supposed stuttering. Swolern, you need to be very careful with the terminology used to describe something you are seeing. For instance, above I used the word bottlenecking... it is a dangerous word to use without understanding its implications.


----------



## Masked

Quote:



Originally Posted by *rush2049*


I am curious: why do we have an influx of new GTX 590 owners? It just seems that people with new accounts are coming in with a 590... interesting. Not bad, just interesting.

In response to the performance of video cards in different games: assuming all limiting of FPS is disabled (vsync and engine limitations) and you are seeing less than 100% utilization of a video card, there is only one possible thing happening.
Bottlenecking...
This can be anywhere: between your CPU and video card, hard drive and RAM, VRAM and RAM. Or even in an internal pipeline inside the video card, e.g. all the pixel shader pipelines are busy but the other pipelines are left vacant of work units.

Now, about the topic of using betas to judge a final product: DON'T DO IT.

Beta and development builds of a product are very, very different from the finalized versions, often double to triple the size of the final compiled software because of all the error- and bug-gathering code included in a test compile.

The performance will not be the same as the final product. Period.

Now onto the supposed stuttering. Swolern, you need to be very careful with the terminology used to describe something you are seeing. For instance, above I used the word bottlenecking... it is a dangerous word to use without understanding its implications.


I believe some RMA stock was released, considering how much they actually had...So, to generate a bit more revenue, some of them were thrown up on Newegg etc. I think about 100? ~ I know what our policy is...Ours is 100...So, take that # with a grain of salt.

There have been a tremendous number of driver issues with the 280-series and the 590s...I think that has a lot to do with them tweaking the 580 cores and not taking into account what would happen with the 590...In breaking down the latest driver release, that's what I find, anyway.

A lot of people lately, especially my newer intern, have been confusing stuttering with "loading" or "staggering", whatever you want to call it, and unfortunately that's what a lot of people on the BF3 forums are doing.

I do see SOME stuttering on the 590 in BF3, there are some areas where it happens frequently...Like I said, use your voices on the forums and constructively tell them or, even feedback it.

You'd be very surprised what changes will happen when the game goes live...And the more you voice your opinion constructively, the better the game will actually be.


----------



## RobotDevil666

Referring to my last post: I've disconnected the second monitor and voila!
Not only did my FPS in Witcher 2 go up slightly, but the main difference is it feels so much smoother when frames drop below 60.
I wanted to play Witcher 2 with ubersampling on but couldn't because it felt so shaky; now that's gone.
Same goes for the BF3 beta: FPS stayed the same, but when it drops into the 50s range it feels normal now.
I bought an old 8600GT on eBay just to power the 2nd monitor. I'm gonna keep PhysX on the GTX 590 though; I don't think the 8600GT could handle it.

Big thanks to Masked for the suggestion.


----------



## Fallendreams

Quote:



Originally Posted by *Swolern*


Finally I'm starting to see the light. That explains so much. Thank you.

Questions-
-1. My problem is with the EVGA 590. Is the stuttering completely gone with your new 590 card? (even with framerates in the 50s with Vsync enabled) Which brand 590 did you buy?

-2. The latency differences you described, from 16-22ms with FPS jumping from 60 to 52 and back and forth: this is what I understand as microstutter. Am I wrong? And I've read the 590 has less microstutter because the 2 GPUs are on one card and have less of a bridge distance than 580 SLI.

-3. Are you saying my 590 is faulty? I'm out of the return period, but EVGA provides a lifetime warranty, so I can haggle with them.

Thanks a ton Fallendreams. Repped.


1. Yes, my stuttering did go away when I bought my 590. I have the EVGA GTX 590.

2. Yeah, you're understanding it. The GTX 590 has an NF200 chip on the PCB, which also gives faster communication between the GPUs, so yeah, it has less chance of this problem. Some people disagree, but this is a phenomenon with no simple answer; it's like a BSOD, no simple explanation.

3. I wouldn't return them, just RMA them. They're great cards; I feel they get a bad rep. I wish they were faster than GTX 580s, but if that was the case this card would be like $1000+. I'm fine with GTX 570 SLI performance; I'm only running 1080p anyway.

I do get slowdowns with Vsync forced on in BF3. I feel Vsync in that game is not fully polished yet.

NVIDIA said we can run this game on max settings, AA and all the eye candy. So we shouldn't be worried? Right? :/

http://www.geforce.com/Optimize/Guides/battlefield-3-beta-performance-guide

Sent from my HTC Aria using Tapatalk


----------



## Shinobi Jedi

Quote:



Originally Posted by *Fallendreams*


1. Yes, my stuttering did go away when I bought my 590. I have the EVGA GTX 590.

2. Yeah, you're understanding it. The GTX 590 has an NF200 chip on the PCB, which also gives faster communication between the GPUs, so yeah, it has less chance of this problem. Some people disagree, but this is a phenomenon with no simple answer; it's like a BSOD, no simple explanation.

3. I wouldn't return them, just RMA them. They're great cards; I feel they get a bad rep. I wish they were faster than GTX 580s, but if that was the case this card would be like $1000+. I'm fine with GTX 570 SLI performance; I'm only running 1080p anyway.

I do get slowdowns with Vsync forced on in BF3. I feel Vsync in that game is not fully polished yet.

NVIDIA said we can run this game on max settings, AA and all the eye candy. So we shouldn't be worried? Right? :/

http://www.geforce.com/Optimize/Guid...formance-guide

Sent from my HTC Aria using Tapatalk


I may be inferring, but it feels to me like the reputation of the 590 has been turning around, if not almost flipped, judging by how there still seems to be significant demand for the card yet hardly any supply.

Shiyat, Kingpin couldn't beat RagingCain's 3DMark11 Performance score with 'four' of the new EVGA 580 Classified LEs against Cain's old dual 590 on Hydro clocks (basically RagingCain broke 18K on the Performance preset, where the four 580s came very, very close but didn't).

The 590's are turning out to be my favorite card, maybe ever, so far.

Though Dungeon Siege III really stresses them out and makes the temps get way too high if I run it at anything higher than 60fps.

I'm going to have to bite the bullet and get a new mobo that will space the cards out instead of having them bumping up against each other. And if I do that, then I'm pretty much going to get that 2700K Masked suggested. I just wish I didn't have to wait until the end of the year or early '12 to get it.

I went ahead and impulse bought Rage hoping that when it unlocks tomorrow it's going to unleash some eye candy that reminds me why I invested in these cards. Hopefully the gameplay will be even better.

It better be; the pre-load comes out to 25GB!


----------



## Masked

I just pre-ordered Rage from our local gamestop as well...I think they're the devil but, it was literally the only way of getting it tomorrow.

That and Dark Souls.


----------



## rush2049

sigh, dark souls..... if only I could afford that.....


----------



## Smo

I simply cannot wait for Dark Souls and Rage! I'm downloading the preload now and holy hell, it's taking a while (I'm back on a 250kb/s download temporarily).


----------



## jcde7ago

My 590 absolutely chugged through the BF3 Beta at 1920x1200 at pretty much a constant 60FPS (VSYNC seemed broken, I kept jumping above 60), all Ultra/highest settings with 4xAA...can't wait to see how it will run the final release.

Also, looking forward to tearing apart RAGE tonight as well completely maxed out...3 more hours!


----------



## Swolern

Quote:


> Originally Posted by *Masked;15159188*
> I think you all need to define what you think "stuttering" is...
> 
> For example, I just hired a girl for an internship that graduated from Yale...Arts/Graphic arts with a dual in Networking...Smartest intern I've actually ever hired.
> 
> She thinks stuttering is "when a game kind of pauses and loads for a few moments and then speeds up suddenly back to normal".
> 
> Folks, that's not stuttering. There are several reasons for that, #1, If you're on a server, that server is now loading...#2, If you're on a 7200 or a 10k, that game is loading into the cache...
> 
> I tend to find what 90% of you refer to as "stuttering" is not, stuttering.
> 
> If your game actually stutters and skips, that's stuttering...And that's a graphical issue...One, which I've actually noticed in 2/3 recent betas and in no way/shape/form is worth the panic.
> 
> I.E. frames skip across and actually aren't present...
> 
> If you're turning in BF3 and all the sudden it lags out on you...That's purely, 100% buffer/load capacity...Which is far from being the fault of your card.






http://www.youtube.com/watch?v=TgM2O8Ci4SU&feature=related


----------



## Swolern

Quote:


> Originally Posted by *Fallendreams;15164620*
> 1. Yes, my stuttering did go away when I bought my 590. I have the EVGA GTX 590.
> 
> 2. Yeah, you're understanding it. The GTX 590 has an NF200 chip on the PCB, which also gives faster communication between the GPUs, so yeah, it has less chance of this problem. Some people disagree, but this is a phenomenon with no simple answer; it's like a BSOD, no simple explanation.
> 
> 3. I wouldn't return them, just RMA them. They're great cards; I feel they get a bad rep. I wish they were faster than GTX 580s, but if that was the case this card would be like $1000+. I'm fine with GTX 570 SLI performance; I'm only running 1080p anyway.
> 
> I do get slowdowns with Vsync forced on in BF3. I feel Vsync in that game is not fully polished yet.
> 
> NVIDIA said we can run this game on max settings, AA and all the eye candy. So we shouldn't be worried? Right? :/
> 
> http://www.geforce.com/Optimize/Guides/battlefield-3-beta-performance-guide
> 
> Sent from my HTC Aria using Tapatalk


Thanks for all the great info. EVGA went ahead and RMA'd the card. I'm keeping my fingers crossed that it fixes the problem.

This forum is awesome!


----------



## Fallendreams

Quote:


> Originally Posted by *Swolern;15167887*
> Thanks for all the great info. EVGA went ahead and RMA'd the card. I'm keeping my fingers crossed that it fixes the problem.
> 
> This forum is awesome!


Np dude. Let us know how it all turns out.

Sent from my HTC Aria using Tapatalk


----------



## Swolern

Quote:



Originally Posted by *Masked*


I just pre-ordered Rage from our local gamestop as well...I think they're the devil but, it was literally the only way of getting it tomorrow.

That and Dark Souls.










Let us know how Rage is. I've read that the game was designed for consoles and will be the same across all platforms, even PC, so I canceled my pre-order. Sucks, cause I was really looking forward to this game.


----------



## Masked

Quote:



Originally Posted by *Swolern*








THAT is stuttering...+1!

Rage is made by the old IDsoft guys...My understanding was that it was made for PC but, shifted to a console port...

I don't talk to any of them anymore but, back in my heyday I used to do 24/7 Quake tournaments with those [email protected] years ago...

I really don't think they'd do us any injustice on the PC...But, I'll let you know.

From what I've heard, a couple techs I know got it yesterday and it's awesome.

I won't really have time to play it tonight because, there's some "high profile meeting" I have to attend which means about 2 hours of BS I really don't need to actually know/be involved with...But, they'll bore me to death anyway.

~

I have yet to hook up my Rad-Box and the 580 officially at home because of meetings//My getting the wrong parts...Etc...So, that will happen this weekend.

I need a new monitor as well...Was actually debating ponying up for the Alienware but, from what I understand, the Dell dead pixel policy is BS so I'm actually looking at an Asus 24" 2ms HD ~ I don't want to go over 24in...But I want that pretty resolution for BF3 instead of being stuck at 1680, plus I'll have a spare that way.


----------



## dvanderslice

Quote:



Originally Posted by *Masked*


I just pre-ordered Rage from our local gamestop as well...I think they're the devil but, it was literally the only way of getting it tomorrow.

That and Dark Souls.










Yah me 2. But Rage looks great so I just had to get it through a store. I hate ordering from them.

I've been reading about the stuttering issues those guys were having on here, but did anybody get a chance to look over my last post? I'm curious if it has anything to do with the stuttering issues these guys are suffering. http://www.overclock.net/nvidia/9749...l#post15112045

I've worked out my stuttering issues that were similar to that youtube video posted earlier. It was mainly vsync issues, as was posted, plus having two 120Hz monitors seemed to be causing my problems. I'm just wondering whether you guys set Texture Filtering Quality to High Quality in the Nvidia Control Panel 3D Settings, or leave all the settings at their defaults. What's the general consensus: do most of you change the Texture Filtering Quality when you configure the Nvidia Control Panel after a clean driver install?


----------



## Smo

I just recorded this brief video of my sig rig running Rage. No stuttering, no tearing, but AWFUL texture popping (which is the game's fault). I'm also using the latest BF3 Beta Drivers, and my usage on GPU 2 maxed at ~18% but most of the time is at 0%. Not good.





http://www.youtube.com/watch?v=wYhxqTzq-p4


----------



## Swolern

Quote:


> Originally Posted by *dvanderslice;15174529*
> Yah me 2. But Rage looks great so I just had to get it through a store. I hate ordering from them.
> 
> I been reading about these stuttering issues those guys were having on here, but did anybody get a chance to look over my last post? I'm curious if this has anything to do with the stuttering issues these guys are suffering. http://www.overclock.net/nvidia/974902-nvidia-gtx-590-owners-club-post15112045.html#post15112045
> 
> I've worked out my stuttering issues that were similar to that youtube video posted earlier. It was mainly vsync issues, as was posted, plus having two 120Hz monitors seemed to be causing my problems. I'm just wondering whether you guys set Texture Filtering Quality to High Quality in the Nvidia Control Panel 3D Settings, or leave all the settings at their defaults. What's the general consensus: do most of you change the Texture Filtering Quality when you configure the Nvidia Control Panel after a clean driver install?


I'm using only one monitor, 1920x1080 @ 60Hz. The stuttering was there on default settings. I also tried adjusting max pre-rendered frames to 4 and 5, power management mode to maximum, vertical sync forced on and off, SLI on and off. Nothing helps.

I do feel like it's a synchronization issue, because when the framerate is locked at the monitor's max of 60fps the stuttering resolves. When FPS is in the 50s I get the stuttering, so bad it looks like FPS in the teens; it looks like the frames are playing at an uneven rate, which causes distortion across the entire environment like you saw in the video.


----------



## Swolern

Quote:


> Originally Posted by *Smo;15177182*
> I just recorded this brief video of my sig rig running Rage. No stuttering, no tearing, but AWFUL texture popping (which is the game's fault). I'm also using the latest BF3 Beta Drivers, and my usage on GPU 2 maxed at ~18% but most of the time is at 0%. Not good.


That sucks. Did you try the released patch yet?


----------



## MKHunt

What games are you all playing? Anything with great eye candy or fantastic immersion that puts the card through its paces?

I think the last game that I played and enjoyed fully was Bastion.

Midterms, boredom, need help procrastinating.


----------



## Smo

Quote:



Originally Posted by *Swolern*


That sucks. Did you try the released patch yet?


Yeah dude (as far as I'm aware anyway) - although I haven't updated for several hours I think I'm up to date. Looks like we'll have to wait for Id to fix it.


----------



## Tept

I've only seen 23% GPU usage on GPU2, 0% on GPU1 in Rage, yet it runs butter smooth...lol.


----------



## dvanderslice

Quote:



Originally Posted by *Smo*


I just recorded this brief video of my sig rig running Rage. No stuttering, no tearing, but AWFUL texture popping (which is the game's fault). I'm also using the latest BF3 Beta Drivers, and my usage on GPU 2 maxed at ~18% but most of the time is at 0%. Not good.


Damn, sometimes I get that in Shogun 2 and sometimes in Dirt 3, but not to those extremes. It always seems to be in games with Ambient Occlusion options in the software, but if I look at the Nvidia Control Panel profile it says Ambient Occlusion isn't offered for that game, so I assume one is software based and the other is hardware based? Hopefully you'll have Rage running smooth with a new driver. Just got home from work and I'm installing Rage right now. Interested to see how it runs on my end.


----------



## Swolern

Quote:



Originally Posted by *MKHunt*


What games are you all playing? Anything with great eye candy or fantastic immersion that puts the card through its paces?

I think the last game that I played and enjoyed fully was Bastion.

Midterms, boredom, need help procrastinating.


Witcher 2 is gorgeous with max settings and uber sampling enabled. I'm not much into RPGs but this game really grips you.

........when not stuttering for me......


----------



## Recipe7

Quote:



Originally Posted by *MKHunt*


What games are you all playing? .


CS:S. Getting a solid 299fps.


----------



## Fallendreams

Quote:


> Originally Posted by *Smo;15177182*
> I just recorded this brief video of my sig rig running Rage. No stuttering, no tearing, but AWFUL texture popping (which is the game's fault). I'm also using the latest BF3 Beta Drivers, and my usage on GPU 2 maxed at ~18% but most of the time is at 0%. Not good.


So I'm not the only one with this problem. I also see you're only using one GPU too.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Fallendreams;15184476*
> So I'm not the only one with this problem. I also see you're only using one GPU too.


If you look at the profile in the Nvidia Control Panel, you'll actually see the profile set to "Single GPU". I changed it to run all four. But I haven't tried the game yet.

I have to admit, my enthusiasm for the game has wavered since reviews have come out. But I'm gonna give it a spin.

Post Apocalyptic Shooters. Bleh!

In terms of games with Sci-Fi settings, give me a Mass Effect Space Opera over Post Apocalyptic Shooters any and every day.

Plus, you can't adjust your video settings in game? Instead you're relegated to the settings the game deems best for your hardware? Sorry, but no game yet has gotten that right with my system. But I haven't tried it yet, so I can't really judge.

Also, I forgot that Id thinks story in games is, according to them, 'about as important as it is in porn'.

I couldn't disagree more. Nothing against Id, as this is nothing new from them. But I keep forgetting; next time I won't, and I'll wait and see first with the next game they put out.

But the game is only set to run on one GPU. Try changing it in your control panel profile.


----------



## Smo

Quote:


> Originally Posted by *Shinobi Jedi;15184785*
> If you look at the profile in the Nvidia Control Panel, you'll actually see the profile set to "Single GPU". I changed it to run all four. But I haven't tried the game yet.
> 
> I have to admit, my enthusiasm for the game has wavered since reviews have come out. But I'm gonna give it a spin.
> 
> Post Apocalyptic Shooters. Bleh!
> 
> In terms of games with Sci-Fi settings, give me a Mass Effect Space Opera over Post Apocalyptic Shooters any and every day.
> 
> Plus, you can't adjust your video settings in game? Instead you're relegated to the settings the game deems best for your hardware? Sorry, but no game yet has gotten that right with my system. But I haven't tried it yet, so I can't really judge.
> 
> Also, I forgot that Id thinks story in games is, according to them, 'about as important as it is in porn'.
> 
> I couldn't disagree more. Nothing against Id, as this is nothing new from them. But I keep forgetting; next time I won't, and I'll wait and see first with the next game they put out.
> 
> But the game is only set to run on one GPU. Try changing it in your control panel profile.


I noticed that too mate - but regardless of which setting you apply (force 1 or 2) it halves the fps to ~30 and the game is stuttering like mad! It's far more playable right now to let it ignore GPU 2.


----------



## dvanderslice

Quote:


> Originally Posted by *Smo;15185187*
> I noticed that too mate - but regardless of which setting you apply (force 1 or 2) it halves the fps to ~30 and the game is stuttering like mad! It's far more playable right now to let it ignore GPU 2.


Yah, same deal: when forcing multi-GPU you get around 30-40% out of each GPU and it runs just like your video.

After a few hours playing when I should be working, my game is running as smooth as can be. I have everything maxed out in the video settings (as much as they allow) and it's running great; it really is amazing looking and runs phenomenally. I must say, since I updated to the 285.27 drivers things have been good. Dirt 3 has been my problem game, and in all stages but Norway it runs and looks perfect, but that's another issue. Rage:

My buddy has a Crossfire setup and I had to have him upgrade to the new "Rage supported" drivers to get multi-gpu support working properly as well.

I'm real happy so far with the gameplay of Rage, so I can't complain, but it's a bummer when they release games before adding some sort of support or ironing out bugs. It seems over the last 10-12 years this has become the norm. And now they are selling us the patches in the guise of DLCs and NEW this and NEW that... instead of just providing them as they should.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Smo;15185187*
> I noticed that too mate - but regardless of which setting you apply (force 1 or 2) it halves the fps to ~30 and the game is stuttering like mad! It's far more playable right now to let it ignore GPU 2.


Ah! Good to know. I still haven't tried it.

Maybe Nvidia or even EVGA can compose an SLI profile for the game?

But Carmack seems to really pride himself on making code that looks amazing and beautiful but can also run smoothly on older, crappy systems and/or consoles. So maybe the code/engine is designed in a way that it can't run SLI, because of that goal of having it play on such a wide range of hardware.

Keep in mind, I know nothing about code like Rush and the other programmers here, so I am definitely theorizing and talking out the side of my arse...


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi;15188568*
> Ah! Good to know. I still haven't tried it.
> 
> Maybe Nvidia or even EVGA can compose an SLI profile for the game?
> 
> But Carmack seems to really pride himself on making code that looks amazing and beautiful but can also run smoothly on older, crappy systems and/or consoles. So maybe the code/engine is designed in a way that it can't run SLI, because of that goal of having it play on such a wide range of hardware.
> 
> Keep in mind, I know nothing about code like Rush and the other programmers here, so I am definitely theorizing and talking out the side of my arse...


Eh, I'm very happy with the game.

Are there some texture issues? Yes but, they're working on them.

If you remember Doom 3/Quake, both had their issues on release...Same exact scenario and look at the life of both of those titles.

I think we don't have a thing to worry about because they're going to do right by us like they've always done.

That being said...

Today I'm building my rad box at home and putting the 580 in...Doing mine first so that when I build the office's I'll have most of the kinks worked out.

I will have 2 resi's, 1 Danger Den stock 250ml, 1 Danger Den Monsoon w/655 Var, 1 655 var, 1 XSPC 480, 1 XSPC 360, 11 Coolermaster R4's 2000RPM on the 360/480...Also has a dedicated 700w Alienware PSU that I randomly found in the office closet and may soon get an external fan monitor!

Will take pictures!


----------



## emett

Why are you guys playing Rage when Caspian Border is now open?


----------



## MKHunt

Quote:


> Originally Posted by *emett;15219308*
> Why are you guys playing Rage when Caspian Border is now open?


I was, but then after about two rounds I realized that the EA servers weren't anywhere near prepared for the load 64 players + destruction brings and I got bored.

When you see everything happen five times before it actually happens it's difficult to stay entertained.

In a rare show of mercy I've stopped challenging my 590 (a.k.a. closed [email protected]) and I'm playing Bastion through again. Witcher 2 has gotten a bit monotonous and when I played GoW3 with my friends I called it Gears of Mediocrity. Then I accused them all of being bros and proposed that we buy matching STI Imprezas and then powder coat our rims to match our Izod polos and golf visors. I followed that up with a detailed analysis of why the gameplay was bland and the graphics insanely awful.

Then I remembered that I felt roughly the same way that Shinobi Jedi felt after playing GoW3 and concluded that the game has awful vibes.

Caspian did rekindle a small amount of desire to play BF3, but I still have doubts. We'll see how it's doing in December.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *MKHunt;15219560*
> I was, but then after about two rounds I realized that the EA servers weren't anywhere near prepared for the load 64 players + destruction brings and I got bored.
> 
> When you see everything happen five times before it actually happens it's difficult to stay entertained.
> 
> In a rare show of mercy I've stopped challenging my 590 (a.k.a. closed [email protected]) and I'm playing Bastion through again. Witcher 2 has gotten a bit monotonous and when I played GoW3 with my friends I called it Gears of Mediocrity. Then I accused them all of being bros and proposed that we buy matching STI Imprezas and then powder coat our rims to match our Izod polos and golf visors. I followed that up with a detailed analysis of why the gameplay was bland and the graphics insanely awful.
> 
> Then I remembered that I felt roughly the same way that Shinobi Jedi felt after playing GoW3 and concluded that the game has awful vibes.
> 
> Caspian did rekindle a small amount of desire to play BF3, but I still have doubts. We'll see how it's doing in December.


MK, that is so hilariously Pimp what you said to your friends. I have to rep it.

I'm gonna try Resistance 3 with the Sharpshooter tonight. I played Killzone 3 with it and, after a ***** of a learning curve, it turned out to be really fun. I had to stand to play it right, but I didn't mind.

I also have that Rise of the Dead for Kinect coming via Gamefly that is supposed to be decent.

But I'm still rocking the 590's the most and having more fun with Warhammer. Witcher 2, even with its update, combat rebalancing and tutorial, is still kicking my ass. Only now the groups of monsters aren't kicking my ass as much.

Of course I'm gonna play some more Rage too. I'm not having as many issues with the engine as much as I'm burnt out on apocalyptic "desert" shooters after Borderlands and Fallout.

@Masked:

Psyched to see your home rig and the results you'll be getting. Good to know you're liking Rage. That's got me encouraged to stick with it.

Question for anyone and everyone:

Is id the only company that uses OpenGL? If so, why?


----------



## fliq

Can I join?










2x ASUS GTX590's

Stock clocks for now ~

Thanks<3


----------



## yfz350rider

I'm a happy owner of an EVGA GTX 590 and plan on adding another one soon. I used the guide here and flashed the BIOS to increase the voltage, then used MSI Afterburner to overclock it and make a custom fan curve. Here is some proof of the overclock: http://valid.canardpc.com/show_oc.php?id=2035541. I also have a picture of my setup below. My clocks are 705/1410/1975. The highest temp I have seen on this card was 76C during BF3 beta gameplay. I have 3 monitors and usually game at 5760 x 1080.
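For anyone wondering what a custom fan curve actually computes, here's a minimal sketch. The temperature/fan-% points below are made-up examples, not the poster's actual curve; Afterburner-style curves interpolate between user points in roughly this fashion:

```python
# Hypothetical fan curve: (temp C, fan %) points, linearly interpolated,
# similar in spirit to MSI Afterburner's custom curve editor.
CURVE = [(40, 40), (60, 60), (75, 85), (85, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Fan duty for a given core temperature, clamped at the endpoints."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # Linear interpolation between the two surrounding points.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

print(fan_percent(76))  # just past the 75 C point, so a bit above 85%
```

The steeper segment near the top is the usual trick: quiet at idle, aggressive once the card approaches its comfort limit.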


----------



## CJL

Hi everyone,
I have been enjoying my 590 since May and, like most of you, am thinking of getting a second one. There are always a few going in and out of stock. Lately I've been experiencing the fan ramping up, which so far affects Dead Island, which becomes choppy for a few seconds. Haven't been playing too much of anything else really.

Mostly I would like to get a second one so I can send the second monitor, PhysX and CUDA (folding, encoding) to GPUs 3 and 4. Which reminds me, I'm not going to remove my card now so I'll have to study some pictures, but in order to use PCI-E slot #3 I would have to Dremel some of the plastic near the bottom of the PCB to clear the cabling at the bottom of my motherboard. I tried the third slot when I had dual 5870s and I couldn't get the card in all the way because of the front panel, USB and SATA cables.

And of course I might have to replace my PSU? Corsair are conservative with their ratings, so I don't know if my 850 will cut it. Only doing 4GHz on the CPU.


----------



## L1eutenant

How much of a performance boost would you get running 2 x GTX590 SLI?


----------



## Scorpion49

I have a question for the 590 guys: is it true the EVGA Classified 590 is shorter in length than the other ones? I'm seeing 11.5" for the Asus and 11" for the EVGA. Just wondering if anyone can confirm that; my buddy is looking at one for an SFF case and it only has 285mm of space, so the 11" would fit but possibly not the longer one, if that is in fact the case.


----------



## jcde7ago

Quote:


> Originally Posted by *L1eutenant;15239739*
> How much of a performance boost would you get running 2 x GTX590 SLI?


It would be a decent amount but, honestly, 2x 3GB 580s in SLI would be a much better choice, since anything above 1920x1200 (2560x1440/1600, or triple-screen setups, with or without 3D Vision) will be severely VRAM-limited.

So, if you're someone that likes to up the AA to the max and all the settings at the absolute highest, you're going to want a card with more VRAM at resolutions higher than 1920x1200 - otherwise, all that quad-SLI power is going to be a waste (and driver support for quad-SLI is not even that great to begin with).
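To put rough numbers on the VRAM point: render targets alone scale with resolution and AA. This back-of-the-envelope sketch counts only the color/depth buffers (textures, geometry and driver overhead come on top), and the buffer count and bytes-per-pixel are assumptions, not measured figures:

```python
def render_target_mb(width, height, bytes_per_pixel=4, msaa=1, buffers=3):
    """Rough MiB for color/depth render targets at a given MSAA level."""
    return width * height * bytes_per_pixel * msaa * buffers / 2**20

# Compare the resolutions being discussed, all at 4xMSAA:
for (w, h) in [(1920, 1200), (2560, 1600), (5760, 1080)]:
    print(f"{w}x{h} @ 4xMSAA: ~{render_target_mb(w, h, msaa=4):.0f} MiB of render targets")
```

Even these crude numbers show why 1.5GB per GPU gets tight at 2560x1600 and above once textures are piled on top.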


----------



## L1eutenant

Quote:


> Originally Posted by *jcde7ago;15239795*
> It would be a decent amount, but honestly, 2x 3GB 580s in SLi would be a much better choice, since anything above 1920x1200 (2560x1440/1600, or triple-screen setups with or without 3D Vision) will be severely VRAM limited.
> 
> So, if you're someone that likes to up the AA to the max and all the settings at the absolute highest, you're going to want a card with more VRAM at resolutions higher than 1920x1200 - otherwise, all that quad-SLI power is going to be a waste (and driver support for quad-SLI is not even that great to begin with).


So should I go with 2 x MSI GeForce GTX 580 Lightning Edition

Or with 2 x Gigabyte GeForce GTX 580 Ultra Durable 3GB?


----------



## MKHunt

Quote:


> Originally Posted by *Scorpion49;15239778*
> I have a question for the 590 guys, is it true the EVGA classified 590 is shorter in length than the other ones? I'm seeing 11.5" for the Asus and 11" for the EVGA. Just wondering if anyone can confirm that, my buddy is looking at one for a SFF case and it only has 285mm of space, so the 11" would fit but possibly not the longer one if that is in fact the case.


The EVGA model is indeed 11" long. But all 590s are reference design so the ASUS should therefore be the exact same length. See if you can find a diagram for the ASUS model to see where they were measuring from. If they start at the rearmost edges of the DVI ports then it could well be 11.5".

My case can only fit 12" of video card but tubing with a 5/8" O.D. fits with ease behind my 590 without touching it.
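For the SFF question, the inch/mm conversion is quick to sanity-check. The 285 mm clearance figure is from Scorpion49's post; everything else is just 25.4 mm per inch:

```python
MM_PER_INCH = 25.4

def fits(card_inches, clearance_mm=285):
    """Convert a quoted card length to mm and compare against the case."""
    card_mm = card_inches * MM_PER_INCH
    return card_mm, card_mm <= clearance_mm

for length in (11.0, 11.5):
    mm, ok = fits(length)
    print(f'{length}" card = {mm:.1f} mm -> {"fits" if ok else "too long"}')
```

So an honest 11" card clears 285 mm with a few mm to spare, while 11.5" does not, which is why pinning down where the manufacturers measure from matters.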


----------



## Scorpion49

Quote:


> Originally Posted by *MKHunt;15241654*
> The EVGA model is indeed 11" long. But all 590s are reference design so the ASUS should therefore be the exact same length. See if you can find a diagram for the ASUS model to see where they were measuring from. If they start at the rearmost edges of the DVI ports then it could well be 11.5".
> 
> My case can only fit 12" of video card but tubing with a 5/8" O.D. fits with ease behind my 590 without touching it.


Good to know, thanks. I didn't think there were any custom ones but they measured differently so I had to be sure.


----------



## Wogga

About 2x590: in benchmarks I've got from x1.5 to x2 performance boost. In games...dunno, maybe less, maybe more. Now I'm playing RAGE and it won't use any other GPU than GPU2 (60-90%), but it still runs smooth.

To all: even though the Sabertooth has no NF200, the system uses the two NF200s on the cards... every one of the (4) GPUs reports PCIe x16 2.0... something weird? Is GPU-Z lying, or do I really have 64 virtual lanes?


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi;15220474*
> Question for anyone and everyone:
> 
> Is id the only company that uses OpenGL? If so, why?


No - my understanding is that they're just the only company that openly advertises that they do.

If you know anything about the language, it's an API whose primary purpose is to "translate" cross-platform...

Essentially you're making Doom 3D from a 2D base...

I don't think it's used as widely but, considering it's constantly updated (4.2 now???), someone's gotta be using it.

~~

So, over the weekend, my pump blew on the rad box...I have 2 pumps for this build and unfortunately, it couldn't run with just 1...

I just ordered another from FrozenCPU and had it expressed so, hopefully, it will be here tomorrow.

Ordered 10 more for the office as backups as well...







Should have done that weeks ago.


----------



## andyvee

Hi all, just joined the forum as I have just ordered the above card to replace my ageing GTX 280 SLI setup, and I wanted to get some advice.

My basic setup is an i7-930 on a Gigabyte X58A-UD7 rev 1 in an 800D case (filled with Xigmatek fans) with an HX1000 for power. My old card setup would get high 70s on the hottest card while gaming, so am I going to have any temp issues with the 590? I won't be overclocking it, just using it (novel concept, I know). And if I want to modify the fan profile, can you suggest some options? I currently use RivaTuner, which is now abandonware; I'm guessing Afterburner, but I'm not sure what else is out there, as I don't think POV bundles anything.

Thanks all


----------



## RagingCain

Quote:


> Originally Posted by *Wogga;15245219*
> About 2x590: in benchmarks I've got from x1.5 to x2 performance boost. In games...dunno, maybe less, maybe more. Now I'm playing RAGE and it won't use any other GPU than GPU2 (60-90%), but it still runs smooth.
> 
> To all: even though the Sabertooth has no NF200, the system uses the two NF200s on the cards... every one of the (4) GPUs reports PCIe x16 2.0... something weird? Is GPU-Z lying, or do I really have 64 virtual lanes?


Not a cat's chance in Hell









GPU-Z has trouble reading the mode it's running in.

There are 16 lanes on the card for GPU-to-GPU comms, plus the actual motherboard lanes for CPU-to-card comms. GPU-Z sees both. The bottleneck is on the CPU comms.

Sent from my DROID X2 using Tapatalk


----------



## aptna

Hi guys, I recently bought an Asus GTX 590. I foolishly downloaded and ran the BIOS from the Asus main website. It ruined my GPU and now I can barely make it to the Windows screen.

I have tried to flash my GTX 590 with all the available BIOSes out there. To my luck, nothing works.

Does anyone know how I could fix this, or does anyone have BIOS version 70.10.42.00.02? I would hate for my hard-earned money to be wasted. Your help is much appreciated.


----------



## rush2049

Quote:


> Originally Posted by *aptna;15248370*
> Hi guys, I recently bought an Asus GTX 590. I foolishly downloaded and ran the BIOS from the Asus main website. It ruined my GPU and now I can barely make it to the Windows screen.
> 
> I have tried to flash my GTX 590 with all the available BIOSes out there. To my luck, nothing works.
> 
> Does anyone know how I could fix this, or does anyone have BIOS version 70.10.42.00.02? I would hate for my hard-earned money to be wasted. Your help is much appreciated.


Here: http://www.overclock.net/nvidia/1063263-gtx-590-flashing-overclocking-thread.html#post14179433


----------



## aptna

Quote:


> Originally Posted by *rush2049;15248399*
> Here: http://www.overclock.net/nvidia/1063263-gtx-590-flashing-overclocking-thread.html#post14179433


Thanks for the post. But I have tried all the BIOSes there. None works for my GTX 590. On the back of my card it says the BIOS version is 70.10.42.00.02, which is newer than the ones listed there.

Does anyone have this version of the BIOS for ASUS? Or has anyone experienced this and fixed it?

Thank you.


----------



## H3KL3R

Quote:



Originally Posted by *aptna*


Thanks for the post. But I have tried all the BIOSes there. None works for my GTX 590. On the back of my card it says the BIOS version is 70.10.42.00.02, which is newer than the ones listed there.

Does anyone have this version of the BIOS for ASUS? Or has anyone experienced this and fixed it?

Thank you.



Aptna, I have run into the same situation as you. Asus really should provide the BIOS version on the website so people can make an educated decision about which version they are downloading and flashing onto a very expensive graphics card.

I purchased an Asus GTX 590 and stupidly proceeded to download what I thought to be the latest BIOS from the downloads section of the Asus website. Once the BIOS was installed, I restarted the machine and now graphics corruption occurs pre-Windows and while booted into Windows.









I can just make out the desktop but something has clearly gone wrong! Things I've tried: re-installing the BIOS, which confirms it was successful; I also tried using nvflash from a USB key to re-flash, but no luck.

What I did notice was that the BIOS version present on the card in GPU-Z was a higher version, which I'm unable to trace on the internet. I stupidly did not back up this file....

If any new Asus GTX 590 owners could export their BIOS via GPU-Z it would be much appreciated - the version we need is 70.10.42.00.02, which is the newest BIOS for this card.


----------



## rush2049

Well, I would help, but the BIOS on the page I linked is mine....

Technically there shouldn't be any differences in the PCBs, as it was a once-and-done run of cards.... the BIOS should work.

Are both of you remembering to flash both cores? That could be causing the issue...


----------



## H3KL3R

Hey pal. I'm from the UK and I'm guessing I've got the European version.

I know this is the newest revision of the card and the newest BIOS. I think we just need to write over the non-stock BIOS with the original version that Aptna and I cannot obtain.


----------



## Masked

Quote:



Originally Posted by *H3KL3R*


Hey pal. I'm from the UK and I'm guessing I've got the European version.

I know this is the newest revision of the card and the newest BIOS. I think we just need to write over the non-stock BIOS with the original version that Aptna and I cannot obtain.










There were *no* revisions.

1 release, 2 different production lines.

You do, however, have whatever BIOS ASUS decided to infect their cards with...I'd call them and see if they'll throw it up on the site...

Or someone w/an Asus could be so kind...


----------



## Recipe7

Is this the bios version you are in need of?

http://support.asus.com/download.asp...gPYBzck1dgp3ti


----------



## H3KL3R

Quote:



Originally Posted by *Recipe7*


Is this the bios version you are in need of?

http://support.asus.com/download.asp...gPYBzck1dgp3ti



That's the version we downloaded thinking it was the newest BIOS; in fact it has overwritten a newer version that was shipped on the card!


----------



## Recipe7

Ouch. Well, I'll keep an eye out for it and post it if I see it or get a hold of it.

Please do the same, as I'm sure some people would require the latest bios as well.


----------



## max883

Using NVIDIA driver 285.38 on Windows 8. BIOS: EVGA (Special), stock clock - 0.963V min to 1.063V max - 100% fan. And MSI Afterburner 2.2.8.

Core 730, mem 1728 at 0.950V, no PDL!! Haven't tried to go higher! Wanna try 3DMark 11, but it won't start! Maybe it's Windows 8? Heaven 2.5 and everything else works! Gaming is super smooth! Air cooling.


----------



## H3KL3R

Quote:



Originally Posted by *Recipe7*


Ouch. Well, I'll keep an eye out for it and post it if I see it or get a old of it.

Please do the same, as I'm sure some people would require the latest bios as well.


Thanks dude, I appreciate it


----------



## Wogga

Quote:


> Originally Posted by *RagingCain;15247786*
> Not a cats chance in Hell
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU-Z has trouble reading the mode it's running in.
> 
> There are 16 lanes on the card for GPU-to-GPU comms, plus the actual motherboard lanes for CPU-to-card comms. GPU-Z sees both. The bottleneck is on the CPU comms.
> 
> Sent from my DROID X2 using Tapatalk


Well, OK. Still, x8 is enough for each 590.
Now waiting for a new "unlocked" driver revision, because the second (newer) card seems to be more OCable. Adding the 2nd 590 raised idle temps from ~27 to ~30 and load temps from ~38 to ~45, so [email protected] is a realistic freq/voltage. Only one thing bothers me - in the new case the backplates get almost no cooling, so they've become hotter (approx. 60; I can keep my hand on them for a long period of time).


----------



## chaosneo

*EX*tremely happy owner of a GTX 590 - sailed through BC2 at 5040x1050 with everything MAXed, smooth sailing...









I've been following this thread for quite some time; last week I decided to get one in preparation for the launch of Battlefield 3.







.

Will go download 3DMark tonight and see how the scores fare...


----------



## H3KL3R

bump


----------



## 01001oc

Quote:



Originally Posted by *aptna*


Hi guys, I recently bought an Asus GTX 590. I foolishly downloaded and ran the BIOS from the Asus main website. It ruined my GPU and now I can barely make it to the Windows screen.

I have tried to flash my GTX 590 with all the available BIOSes out there. To my luck, nothing works.

Does anyone know how I could fix this, or does anyone have BIOS version 70.10.42.00.02? I would hate for my hard-earned money to be wasted. Your help is much appreciated.


Hi there,

I have the 70.10.42.00.02 BIOS on my freshly bought GTX 590s. Gigabyte version.










Here is the BIOS.


----------



## krazyatom

That's very strange. I also downloaded the Asus BIOS directly from their website, but I don't have any problems so far.


----------



## krazyatom

Does anyone have problems playing BF3 @ ultra settings with a single GTX 590?
I don't have BF3 beta keys, so I have no way to test it out. I keep reading that people are saying 1.5GB is not enough for BF3. Btw, I have a 2560 x 1600 monitor.


----------



## H3KL3R

Quote:



Originally Posted by *01001oc*


Hi there,

I have the 70.10.42.00.02 BIOS on my fresh bought GTX590's. Gigabyte version.










Here is the BIOS.


Hey man, thank you for uploading that version of the BIOS
















I wonder if I can tweak the clock settings, as I know the Asus is clocked just slightly higher - I'm guessing it will lower the clock rates to the same as the Gigabyte version.


----------



## Recipe7

Quote:



Originally Posted by *01001oc*


Hi there,

I have the 70.10.42.00.02 BIOS on my fresh bought GTX590's. Gigabyte version.










Here is the BIOS.


Nice, +rep!


----------



## toX0rz

I have BIOS 70.10.37.00.01.

What's the difference compared to 70.10.42.00.02?


----------



## yfz350rider

Quote:


> Originally Posted by *krazyatom;15265535*
> Does anyone have problem playing BF3 @ ultra setting with single gtx 590?
> I don't have bf3 beta keys, so I have no way to test them out. I keep reading and ppl are saying 1.5gb is not enough for bf3. Btw, I have 2560 x 1600 resolution monitor.


One 590 plays great for me, but I'm on a 1920 x 1080 monitor. When I tried 5760 x 1080 I had to turn AA down and ambient occlusion off, and it was still dipping into the 30s.


----------



## 01001oc

Quote:



Originally Posted by *Recipe7*


Nice, +rep!


Thank you. No probs.









Now I was wondering if there is anybody here that could voltage-tweak my BIOS, because in the GTX 590 voltage tweak thread all the BIOSes are 70.10.37... and they don't work with my card. So I could really use some help on that.

Thank you and best regards,

Miha


----------



## Semedar

Quote:



Originally Posted by *01001oc*


Hi there,

I have the 70.10.42.00.02 BIOS on my fresh bought GTX590's. Gigabyte version.










Here is the BIOS.


Would it be wise to update my BIOS? I have the ASUS make.


----------



## Masked

Quote:



Originally Posted by *yfz350rider*


One 590 plays great for me, but I'm on a 1920 x 1080 monitor. When I tried 5760 x 1080 I had to turn AA down and ambient occlusion off, and it was still dipping into the 30s.


It was a beta...and this was pretty normal.

I would expect better driver support etc on release.


----------



## Jeppzer

Think my top 590 is about to die on me; I'm in the process of RMA'ing it. Just spent two days with tech support trying to find the issue, and they just pointed me back to the retailer cuz they think it's a hardware failure on the card and it needs further testing.

I am disappointed. Somewhat.
Still awesome cards.









Edit:


----------



## H3KL3R

Quote:


> Originally Posted by *Semedar;15274649*
> Would it be wise to update my BIOS? I have the ASUS make.


Hi, could you please upload your current Asus BIOS to the forum please?


----------



## aptna

Hey man, thanks a billion. You're a lifesaver. The BIOS works fine even on my Asus GTX 590.

For those who may have trouble flashing this bios, flash it from DOS and use this command.

nvflash -4 -5 -6 name.rom

Good luck!
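Worth remembering that a 590 carries two GPUs, and each core has its own BIOS, so you end up running nvflash once per adapter index (as rush2049 points out elsewhere in the thread). Here's a hypothetical helper that only *builds* those command lines; the `-i` index flag and the `-4 -5 -6` override flags are as commonly reported for the DOS version of nvflash, so confirm against `nvflash --help` for your version before flashing anything:

```python
def nvflash_cmds(rom, gpu_indices=(0, 1), force_flags=("-4", "-5", "-6")):
    """One nvflash invocation per GPU core; a GTX 590 has two cores."""
    flags = " ".join(force_flags)
    return [f"nvflash -i{i} {flags} {rom}" for i in gpu_indices]

# Print the commands you'd type at the DOS prompt, one per core.
for cmd in nvflash_cmds("name.rom"):
    print(cmd)
```

This is only a sketch of the command sequence, not a flashing tool; always back up the original ROM first.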


----------



## H3KL3R

Quote:


> Originally Posted by *aptna;15277353*
> Hey man, thanks a billion. You're a lifesaver. The BIOS works fine even on my Asus GTX 590.
> 
> For those who may have trouble flashing this bios, flash it from DOS and use this command.
> 
> nvflash -4 -5 -6 name.rom
> 
> Good luck!


Hey Aptna,

What clocks is your card now reporting since you flashed with that BIOS?

I'm just curious, as the Asus card is overclocked slightly!

*Original Asus GTX 590 Clocks*


----------



## Scorpion49

Well, I'm about to be part of the club. This was the most GPU power I could stuff into my new build, so I just ordered an EVGA Classified 590. Should be here on Friday according to Amazon (no sales tax in Cali yet?), so I'm pretty excited.


----------



## aptna

Quote:


> Originally Posted by *H3KL3R;15277787*
> Hey Aptna,
> 
> What clocks is your card now reporting since you flashed with the MSI Bios?
> 
> I'm just curious to wonder as the Asus card is overclocked slightly!
> 
> *Original Asus GTX 590 Clocks*


Hey bro, it's 608, 1707, 1215. I'm not too worried about it; I think the difference is minimal. And I am glad I can enjoy the card now after having it sit idle for 2 weeks. lol


----------



## H3KL3R

Quote:


> Originally Posted by *aptna;15278971*
> Hey bro, it's 608, 1707, 1215. I'm not too worried about it; I think the difference is minimal. And I am glad I can enjoy the card now after having it sit idle for 2 weeks. lol


I agree dude, better it working than not


----------



## aptna

By the way, does anyone know how to set up stereoscopic 3D?

I went to the NVIDIA Control Panel to enable it, and the screen just turned black. Also, when I clicked Set Up it also turned black. What could be the reason for this?

Ta!


----------



## krazyatom

Quote:


> Originally Posted by *yfz350rider;15268846*
> One 590 plays great for me, but I'm on a 1920 x 1080 monitor. When I tried 5760 x 1080 I had to turn AA down and ambient occlusion off, and it was still dipping into the 30s.


Thanks, I guess I have to wait until the 25th to find out


----------



## chaosneo

I remember reading many posts ago that proof pics of ownership are required?









The cable management part is still incomplete - waiting for quite a few clips, UV CCFLs and sleeves to be posted to me. Trying to get a blue/black theme going in this interior.

I downloaded the basic/free version and ran 3DMark 11, and my scores were SURPRISINGLY LOW







Ran it a few times, on one screen only, under the Performance preset. My scores hover around P5000-P5200. Shockingly low, right?

So I thought it could be a few factors, but I am not sure...

*- Driver?*
Using 280.26; according to NVIDIA Update it's the latest, and it was a clean install.

*- Heat?*
Open case at the moment; one GPU core is idling around 45 Celsius, the other even lower by around 5-10 Celsius.

*- PSU underpowered?*

*- CPU needs to be OC'd?*

*- Windows 7 64-bit Ultimate?*
Haven't updated it yet; it was a clean install after building my new setup.


----------



## yfz350rider

Quote:



Originally Posted by *Scorpion49*


Well I'm about to be part of the club, this was the most GPU power I can stuff into my new build so I just ordered an EVGA Classified 590. Should be here on friday according to amazon (no sales tax in cali yet?) so I'm pretty excited.


That's awesome. I just ordered one from Amazon on the 8th and got it in today. It came in a much smaller box than the first 590 I ordered. This one has a different BIOS on it too, and the voltage is increased just a tad.


----------



## Scorpion49

Quote:



Originally Posted by *yfz350rider*


Thats awesome I just ordered one from amazon on the 8th and got it in today. It came in a much smaller box than the first 590 I ordered. This one has a different bios on it too and the voltage is increased just a tad bit.


Does the new classy still come with the full backplate? I hate the half-plates the Asus one has.


----------



## yfz350rider

Quote:



Originally Posted by *Scorpion49*


Does the new classy still come with the full backplate? I hate the half-plates the Asus one has.


Yup. The only differences with the smaller box are: no mouse pad, no T-shirt, a cooler sticker than the old one, and the card has a newer BIOS that ups the voltage to 0.938


----------



## Scorpion49

Quote:



Originally Posted by *yfz350rider*


Yup only difference is in the smaller box is no mouse pad, no Tshirt, a cooler sticker than the old one, and the card has a newer bios that ups the voltage to .938


Nice, good to know. Couldn't care less about the t-shirt and mouse pad, but the voltage is nice


----------



## MKHunt

Quote:



Originally Posted by *yfz350rider*


Yup only difference is in the smaller box is no mouse pad, no Tshirt, a cooler sticker than the old one, and the card has a *newer bios that ups the voltage to .938*


If warranty will not be voided, do want.


----------



## yfz350rider

Quote:


> Originally Posted by *MKHunt;15285992*
> If warranty will not be voided, do want.


I think as long as you don't use another manufacturer's BIOS, the warranty isn't an issue. I used an EVGA modded BIOS to bump mine to 0.975 with 100% fan speed enabled (the default only allows 95%). I was able to reach clocks of 705/1410/1975 with a highest temp of 76C (fans not even at 100% at that time). I posted my BIOS in the overclocking thread for the 590.


----------



## krazyatom

Is it possible to add a single GTX 580 1.5GB to my sig rig for tri-SLI?


----------



## MKHunt

Quote:


> Originally Posted by *krazyatom;15295745*
> Is it possible to add a single gtx 580 1.5gb to my sig for tri-fire?


No sir. Would be awfully neat if we could though.


----------



## Tuthsok

Hi! Can I Join? ^_^

Please excuse the current cable mess, PSU and 2nd 590 are new and just roughed in knowing that all QDCs need to be RMAed. It will look prettier after it gets rebuilt with a second external radiator hung off the back.

Brand: EVGA Hydro Copper
Clocks: Currently Stock


----------



## yfz350rider

Quote:



Originally Posted by *Tuthsok*


Hi! Can I Join? ^_^

Please excuse the current cable mess, PSU and 2nd 590 are new and just roughed in knowing that all QDCs need to be RMAed. It will look prettier after it gets rebuilt with a second external radiator hung off the back.


Hey, I'm thinking about water cooling my 2 590s. Is the only rad you have for your system a 3x120? How are your GPU temps? I'm trying to get an idea of how much stuff I need.


----------



## Tuthsok

Quote:



Originally Posted by *yfz350rider*


Hey, I'm thinking about water cooling my 2 590s. Is the only rad you have for your system a 3x120? How are your GPU temps? I'm trying to get an idea of how much stuff I need.


Yes, I am currently running only a single 3x120 thin 20 FPI rad. Surprisingly, it is holding its own (CPU and GPUs currently running @stock).

When running benchmarks (Heaven, Metro 2033) with settings pushed to the max, my GPUs hold at about 50C with my fans running at 80% and my flow rate at 0.6 gpm.

I am happy with that, but I mostly want to add a second radiator, as my coolant temp is higher than I would like to see, at about 42C during these tests. In case you're wondering, ambient is 23C for me this time of year.

Hope that helps!
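Those numbers pass a quick sanity check: the heat the coolant moves is Q = flow x specific heat x temperature split across the loop. At 0.6 gpm, even a small in/out water delta carries a lot of watts. The 2 C split below is an illustrative assumption, not a figure from the post:

```python
GPM_TO_G_PER_S = 3785.41 / 60.0  # grams of water per second per US gpm
C_P_WATER = 4.186                # specific heat of water, J/(g*K)

def loop_watts(flow_gpm, delta_t_c):
    """Heat carried by the coolant for a given loop temperature split."""
    return flow_gpm * GPM_TO_G_PER_S * C_P_WATER * delta_t_c

# At 0.6 gpm, a mere 2 C water delta across the loop is ~317 W of heat moved.
print(round(loop_watts(0.6, 2.0)), "W")
```

Which is why the water in/out delta stays tiny even under load, and why the coolant-to-ambient delta (42C vs 23C here) is the number that tells you whether more radiator is needed.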


----------



## Scorpion49

Quote:



Originally Posted by *Tuthsok*


Yes, I am currently running only a single 3x120 thin 20 FPI rad. Surprisingly, it is holding its own (CPU and GPUs currently running @stock).

When running benchmarks (Heaven, Metro 2033) with settings pushed to the max, my GPUs hold at about 50C with my fans running at 80% and my flow rate at 0.6 gpm.

I am happy with that, but I mostly want to add a second radiator, as my coolant temp is higher than I would like to see, at about 42C during these tests. In case you're wondering, ambient is 23C for me this time of year.

Hope that helps!


That's actually really helpful to me, as I am doing a build and it looks like a 120.4 is all I can fit, possibly a 120.5. But I'm only going to run a single 590 plus the CPU, so I should be golden. Thanks a ton for that info.


----------



## TerrabyteX

Quote:



Originally Posted by *TerrabyteX*


Add me to the club







Stock clocks atm Gigabyte subvendor.




You didn't add me :\\ . Btw, here is another shot from the OS of my card.


----------



## Masked

I need to finish a couple final touches on the Rad box + on the main setup...Make it look all perdy for you guys.

Will have pics up soon.

That being said, with the complete "load" off of the 590 for all 3 screens...Gaming has gone from "okay" to "***BBQ why didn't I do this sooner?!?!"...So I am extremely happy.

I was suckered into Rift so, literally, the only PC game I really play @home is Rift but, on ultra, it's definitely looking pretty.

When BF3 comes out, I'll get some benchmarks going and get a stable OC going...


----------



## Scorpion49

Count me in, classy just got here (super-crappy cell phone pix warning):


















Quote:



Originally Posted by *yfz350rider*


Yup only difference is in the smaller box is no mouse pad, no Tshirt, a cooler sticker than the old one, and the card has a newer bios that ups the voltage to .938



Quote:



Originally Posted by *MKHunt*


If warranty will not be voided, do want.


It sure does up the volts to 0.938


----------



## Shinobi Jedi

Quote:



Originally Posted by *Masked*


I need to finish a couple final touches on the Rad box + on the main setup...Make it look all perdy for you guys.

Will have pics up soon.

That being said, with the complete "load" off of the 590 for all 3 screens...Gaming has gone from "okay" to "***BBQ why didn't I do this sooner?!?!"...So I am extremely happy.

I was suckered into Rift so, literally the only PC game I really play @home is rift but, on ultra, it's definitely looking pretty.

When BF3 comes out, I'll get some benchmarks going and get a stable OC going...


You got the kind of results that made you feel that way just by adding a 580 as a PhysX card? Or is there more to it?

Very interesting....


----------



## yfz350rider

Quote:



Originally Posted by *Scorpion49*


Count me in, classy just got here (super-crappy cell phone pix warning):


















It sure does up the volts to 0.938










Sweet man, thanks for confirming the voltage on the newer ones. When I ran one 590 I could up the voltage to 0.975 and overclock to 705/1410/1975. If you have a good case, maybe you want to look into that. Now that I have two, it gets way too hot in there to even think about overclocking, lol.


----------



## Scorpion49

Quote:



Originally Posted by *yfz350rider*


Sweet man, thanks for confirming the voltage on the newer ones. When I ran one 590 I could up the voltage to 0.975 and overclock to 705/1410/1975. If you have a good case, maybe you want to look into that. Now that I have two, it gets way too hot in there to even think about overclocking, lol.


It's going under water with a quickness. I'm trying to track down an XSPC RZR590 block for it but I can't seem to find one. NCIX carries them, but the US site hasn't worked for the past week or so.


----------



## MKHunt

Quote:



Originally Posted by *Scorpion49*


It's going under water with a quickness. I'm trying to track down an XSPC RZR590 block for it but I can't seem to find one. NCIX carries them, but the US site hasn't worked for the past week or so.


Any reason for XSPC? I'm rather fond of my Koolance block and would recommend them in a heartbeat.


----------



## Scorpion49

Quote:



Originally Posted by *MKHunt*


Any reason for XSPC? I'm rather fond of my Koolance block and would recommend them in a heartbeat.











1) I like the look of it

2) Space is at a super-premium and it's the thinnest of all the 590 blocks

3) It's cheaper than the Koolance NX590 but performs within 2°C of it and flows almost the same

Here's one of the reviews I was looking at: 590 Waterblock Roundup


----------



## Arizonian

Congrats Scorpion on the GTX 590, I'm jelly.


----------



## AlexRS

Cheers,

I bought a GTX 590 right at the start, and with the early drivers I could overclock this graphics card pretty well. The 280.19 driver especially was fantastic, because it gave me the opportunity to set the vcore to 1.050V stable.

With the new drivers, in my opinion, it's not possible to do this anymore. I have also seen that clock speeds above 800 MHz throttle the GPU.
My highest stable settings were 795/1790 MHz. The maximum temperature went up to 53 degrees.

I just want to know: is it possible to raise the vcore to 1.050V (e.g. with a new BIOS) without any problems? Or will it not work?

Here's a photo of my PC:









Sorry for my bad English


----------



## H3KL3R

Sorry, a bit late here. Add me to the 590 Club







Asus GTX 590 Stock


----------



## Miraul

upload ur bios plz


----------



## H3KL3R

Quote:


> Originally Posted by *Miraul;15318110*
> upload ur bios plz


Who are you aiming the question to?


----------



## Scorpion49

Quote:


> Originally Posted by *H3KL3R;15318638*
> Who are you aiming the question to?


Was wondering the same thing.


----------



## yfz350rider

Quote:



Originally Posted by *Miraul*


upload ur bios plz


The BIOS is uploaded; check the GTX 590 overclocking thread, where I linked both my old BIOS and the new one. I'm not sure if you can use the new BIOS on the old cards, so I'd ask the OP in that thread before you attempt it.


----------



## mightyphoenix




----------



## H3KL3R

Does anyone have the new Asus GTX 590 stock BIOS *70.10.42.00.01* & *70.10.42.00.02*?

I'm using an MSI BIOS, which has the card under-clocked at present


----------



## Scorpion49

Quote:



Originally Posted by *H3KL3R*


Does anyone have the new Asus GTX 590 stock BIOS *70.10.42.00.01* & *70.10.42.00.02*?

I'm using an MSI BIOS which has under-clocked the card at present










What is it underclocking to?


----------



## Wogga

Finally built my system and cleaned it up a bit.
Now this:
EK makes damn stylish water gear. Next rebuild: socket 2011 8-core Extreme (I believe there will be one, like the 990X after the 980X).


----------



## emett

Wogga, is there a reason you don't have an SSD in your build? I stayed away from them myself, as I had read of so many people having issues with them.


----------



## Tept

Quote:



Originally Posted by *emett*


Wogga, is there a reason you don't have an SSD in your build? I stayed away from them myself, as I had read of so many people having issues with them.


If you get an Intel SSD, particularly a 510 series, you will likely have zero problems. I've been through two myself and have two friends with them; it's been a plug-and-play deal every time.


----------



## Scorpion49

Quote:



Originally Posted by *Tept*


If you get an Intel SSD, particularly a 510 series, you will likely have zero problems. I've been through two myself and have two friends with them; it's been a plug-and-play deal every time.


I've used just about every brand besides OCZ. I know a few guys who had those (six of them bought RevoDrives when they were on sale) and they all failed within weeks. I haven't had a problem with Samsung, Intel, Corsair, Crucial, or Patriot.


----------



## MKHunt

Quote:



Originally Posted by *emett*


Wogga, is there a reason you don't have an SSD in your build? I stayed away from them myself, as I had read of so many people having issues with them.


Just avoid anything with a SandForce controller and you should be fine. Intel and Crucial are pretty much the two best manufacturers for SSDs in the business right now.


----------



## Scorpion49

Quote:



Originally Posted by *MKHunt*


Just avoid anything with a SandForce controller and you should be fine. Intel and Crucial are pretty much the two best manufacturers for SSDs in the business right now.


I haven't had any problems with SF drives yet... hopefully it stays that way. I just ordered a second ForceGT.


----------



## MKHunt

Quote:


> Originally Posted by *Scorpion49;15336002*
> I haven't had any problems with SF drives yet... hopefully it stays that way. I just ordered a second ForceGT.


My Agility 3 bluescreens every 12 hours.


----------



## L1eutenant

I have the OCZ Vertex 3 120GB SSD Drive

And I haven't had any trouble yet.


----------



## H3KL3R

Quote:


> Originally Posted by *Scorpion49;15331910*
> What is it underclocking to?


Hi dude,

Here is the MSI BIOS version clocks:









As you can see, the ASUS BIOS stock clocks are a lot higher:


----------



## emett

Quote:


> Originally Posted by *H3KL3R;15338607*
> Hi dude,
> 
> Here is the MSI BIOS version clocks:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As you can see, the ASUS BIOS stock clocks are a lot higher:


I wouldn't call that a lot higher.


----------



## Wogga

SSDs? They're too expensive in MB-per-dollar terms. Maybe I'm still under the impression that their lifespan is too short.


----------



## emett

I also believe you need a power board that keeps supplying power after the mains cuts out, so they don't get wiped?


----------



## H3KL3R

Quote:


> Originally Posted by *emett;15338814*
> I wouldn't call that a lot higher.


You're right, but I want it back at the defaults it originally had


----------



## emett

Why not just bump them up with MSI Afterburner? It's not like you'll need to increase the voltage.


----------



## Wogga

And you could raise it higher than just 612 =)


----------



## Masked

Quote:



Originally Posted by *Shinobi Jedi*


You got the kind of results that made you feel that way just by adding a 580 as a PhysX card? Or is there more to it?

Very interesting....


Sorry, I'm sick and spent all week in a dead closet room trying to fix a dying UPS, which actually blew up 5 minutes after I fixed it...Amazing weekend.

For me, my 580 drives 2 monitors, which takes about 15% GPU power...Which means, while I game, it bumps up to about 35% for PhysX.

Is it insane overkill for physx? Yes...God yes...But, what it's doing is relieving the 2+ monitor stress from the 590.

My 590 is now quite nearly "bug free" (knock on wood) because I've managed to dodge 1/2 the situations in which bugs occur in a tri-monitor setup AND I can now SLI on the 590 for the first time since I really got the card.

The 590 concentrating on 1 monitor, makes a big difference, yes.

I got my personal 580 for JUST OVER my card budget when I was looking for a PhysX card: $325 shipped. Had I not gotten that deal I would've settled on a 570 or similar.

It's not the 580 that makes the setup great, it's the fact that the 580 becomes the "work horse" while your prized stallion is left to chill at the gate.


----------



## max883

Sold my GTX 590! I now have 2x ASUS GTX 580 DirectCU II in SLI









GTX 590 (GPU 650MHz, mem 1720): 3DMark11 = 10243 points.

GTX 580 SLI (GPU 1.0GHz, mem 2000): 3DMark11 = 14972 points. No throttle!!
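For scale, those two scores work out to roughly a 46% uplift; a trivial check (the numbers are from the post above, the function name is made up):

```python
# Percentage gain implied by the two 3DMark11 scores quoted above.
def pct_gain(old_score: float, new_score: float) -> float:
    return (new_score - old_score) / old_score * 100

print(round(pct_gain(10243, 14972), 1))  # 46.2
```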


----------



## aptna

Hi guys,

I'm now playing Total War: Shogun 2. In the graphics settings it says GTX 590 with 1500MB. Is this normal? Does Shogun 2 only detect one GPU of the GTX 590? Or is something wrong with my GPU?

Thanks a lot!


----------



## rush2049

Quote:



Originally Posted by *aptna*


Hi guys,

I'm now playing Total War: Shogun 2. In the graphics settings it says GTX 590 with 1500MB. Is this normal? Does Shogun 2 only detect one GPU of the GTX 590? Or is something wrong with my GPU?

Thanks a lot!


Yes, this is normal: each GPU only has 1.5 GB of RAM, and each GPU is only aware of its own RAM.

So while the card does have 3 GB of RAM in total, only 1.5 GB is available to each GPU.

Games generally duplicate textures/data across the GPUs....
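To put numbers on it, here's a tiny sketch of the arithmetic (the function name is made up; 1536 MB per GPU is the 590's spec):

```python
# On a dual-GPU card running SLI/AFR, each GPU mirrors the same data,
# so the memory games can use equals ONE GPU's framebuffer, not the sum.
def effective_vram_mb(per_gpu_mb: int, gpu_count: int) -> dict:
    return {
        "advertised_total_mb": per_gpu_mb * gpu_count,  # what the box says
        "usable_mb": per_gpu_mb,                        # what games report
    }

print(effective_vram_mb(1536, 2))
# {'advertised_total_mb': 3072, 'usable_mb': 1536}
```

Which is why Shogun 2 reporting ~1.5 GB on a 3 GB card is expected behavior.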


----------



## aptna

Thanks a lot, bro. How have you been? Enjoying the GTX 590 now that it's fixed?


----------



## aptna

Oh and thanks to you too, Rush, for the answer


----------



## rush2049

delete me


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked;15340152*
> Sorry, I'm sick and spent all week in a dead closet room trying to fix a dying UPS, which actually blew up 5 minutes after I fixed it...Amazing weekend.
> 
> For me, my 580 holds 2 monitors which takes @15% gpu power...Which means, while I game, it bumps up to @35% for physx.
> 
> Is it insane overkill for physx? Yes...God yes...But, what it's doing is relieving the 2+ monitor stress from the 590.
> 
> My 590 is now quite nearly "bug free" (knock on wood) because I've managed to dodge 1/2 the situations in which bugs occur in a tri-monitor setup AND I can now SLI on the 590 for the first time since I really got the card.
> 
> The 590 concentrating on 1 monitor, makes a big difference, yes.
> 
> I got my personal 580 for JUST OVER my card budget when I was looking for a PhysX card: $325 shipped. Had I not gotten that deal I would've settled on a 570 or similar.
> 
> It's not the 580 that makes the setup great, it's the fact that the 580 becomes the "work horse" while your prized stallion is left to chill at the gate.


Sorry to ask like a noob, but does having the 580 in with quad 590s allow for Nvidia Surround gaming at full blast in general? Ideally with all the graphics settings maxed out?

Or does it just take the stress off of games with PhysX support?

Because I was thinking about selling my 24" BenQ XL2410 120Hz display and 3-year extended warranty for one of those new Asus 27" ones, since the 590s seem to favor one monitor.

But if this configuration can make Surround gaming at full blast possible, then I may consider keeping it and adding two more instead of trading it for a 27". With proper calibration it gives some of the most fantastic, pixel-perfect picture quality I've ever seen on an LCD; the blacks are super deep without crushing.

I really love it, but the only things it's missing for me are size and HDMI 1.4a input for PS3 3D gaming.

But I'm concerned about getting this new Asus and its PQ after buying two of their 24" 120Hz monitors and both having major dead pixels out of the box.

Throwing in a 580 to open up Surround gaming to how it should be sounds like a lot of fun. Congrats!


----------



## MKHunt

I believe what he means is that the 590 is connected to one monitor while he games and the 580 holds the other two (not playing the game) and handles PhysX. So the leftmost monitor could be a media player handled by the 580, the game via 590 in the center, and a web browser on the far right handled by the 580 as well.

So two screens that are _not_ the game handled by the 580 and the game on one screen handled by the 590.

That's what I got anyway.


----------



## OutlawNeedsHelp

Do I just have to post a pic of my 590 to join?


----------



## solsamurai

Quote:


> Originally Posted by *OutlawNeedsHelp;15347604*
> Do I just have to post a pic of my 590 to join?


Please read the OP.


----------



## Scorpion49

Got my water block on the way... I just love the way this looks compared to the others available:


----------



## OutlawNeedsHelp

My stock 590 with proof.

http://imageshack.us/photo/my-images/97/590proof.png/

http://imageshack.us/photo/my-images/571/photoer.jpg/


----------



## MKHunt

Quote:


> Originally Posted by *Scorpion49;15347732*
> Got my water block on the way... I just love the way this looks compared to the others available:


Interesting, never seen someone with one of those before. What colors are in your build?


----------



## Scorpion49

Quote:


> Originally Posted by *MKHunt;15347793*
> Interesting, never seen someone with one of those before. What colors are in your build?


Mostly red and black. I don't mind the GPU block standing out, because it's the only one thin enough to make it all work. I have exactly 2mm of space between the back of the 590 and the front fans.


----------



## Masked

Quote:



Originally Posted by *MKHunt*


I believe what he means is that the 590 is connected to one monitor while he games and the 580 holds the other two (not playing the game) and handles PhysX. So the leftmost monitor could be a media player handled by the 580, the game via 590 in the center, and a web browser on the far right handled by the 580 as well.

So two screens that are _not_ the game handled by the 580 and the game on one screen handled by the 590.

That's what I got anyway.


Bingo.

Works fabulously.

I'm currently looking for a new monitor myself, for just that reason, actually.


----------



## m3th0d1c4l

Count me in.











Running all at stock. Nvidia OEM model.


----------



## rush2049

Wow, for a card with a set number ever made, I'm loving the new influx of members.

Welcome everyone!


----------



## MKHunt

Quote:



Originally Posted by *Scorpion49*


Mostly red and black. I don't mind the GPU block standing out, because it's the only one thin enough to make it all work. I have exactly 2mm of space between the back of the 590 and the front fans.










Not sure I understand what you mean. Isn't the HAF-X a large case?


----------



## Scorpion49

Quote:



Originally Posted by *MKHunt*


Not sure I understand what you mean. Isn't the HAF-X a large case?


New build is going to look sort of like this, except the case is black and my plans are far more clever:









----------



## kzinti1

EVGA has a new firmware upgrade that unlocks the fan control and adds a few enhancements here: http://www.evga.com/forums/tm.aspx?m=1034790
And an in-depth report on working with the new Nvidia drivers, also at the EVGA forum, here: http://www.evga.com/forums/tm.aspx?m=1174372 (a must-read for any Nvidia video card owner).


----------



## Sir_Gawain

Quote:


> Originally Posted by *m3th0d1c4l;15361838*
> Count me in.
> 
> 
> 
> 
> 
> 
> 
> Running all at stock. Nvidia OEM model.


What's up buddy, nice to see you over here!


----------



## andyvee

And me, got a POV UltraCharged here


----------



## Jeppzer

Card was sent yesterday; wondering when they'll send back a new, working one.


----------



## Masked

Well, I had another setback last night, which pushed me into making a major change.

My Classified 3 fried another 590...

At first I believed it was the fault of my PSU but, after further review, and considering it sparked out in literally the same place...there's no conclusion left other than that my board is frying my cards.

EVGA is RMA'ing it for me but I'm tired of the issues so, I lit up my Newegg Preferred account last night.

Incoming is:

Z68 FTW, 16GB RAM, sample i7 core, 120GB Intel SSD...couple of other goodies, including a new monitor.

Will have some results once I get my 590 back from RMA...


----------



## Smo

Quote:



Originally Posted by *Masked*


Well, I had another setback last night, which pushed me into making a major change.

My Classified 3 fried another 590...

At first I believed it was the fault of my PSU but, after further review, and considering it sparked out in literally the same place...there's no conclusion left other than that my board is frying my cards.

EVGA is RMA'ing it for me but I'm tired of the issues so, I lit up my Newegg Preferred account last night.

Incoming is:

Z68 FTW, 16gb Ram, Sample I7 core, 120gb Intel SSD...Couple of other goodies including a new monitor.

Will have some results once I get my 590 back from RMA...


I wish my pockets were as deep as yours dude!


----------



## chaosneo

sorry to hear about that Masked...

and here is my proof pic; after updating my Windows 7, the scores got a lot better, from 5k to 8k on the Performance preset of 3DMark 11.








I guess I'm also the only owner using the smallest PSU? Quite surprisingly, it can handle the stress.

my 3Dmark Link >>> click here


----------



## Masked

Quote:



Originally Posted by *Smo*


I wish my pockets were as deep as yours dude!


I've now fried 3x 480s, 2x 580s, and 2x 590s on this motherboard.

Each time they said "let's try again, it can't be the motherboard"...and each time it was $35 for an advanced RMA on the cards...So, 7 x 35 = a whole lotta $$$$.

I'm on water, 100%, and each time they blamed a "leak" but I use distilled, so there's basically zero conductivity.

Last time, they and I blamed a surge but it's very obvious to me at this point that it's the board itself so, I'll gladly pay off the Newegg credit card for the next 9 months...Seriously.

The saddest part of it all really is that I'm now dubbed "the fryer of 590's" around the office









I really hope this new board doesn't come with the same "issues" that the X58's had...For real.


----------



## Recipe7

Quote:



Originally Posted by *chaosneo*


sorry to hear about that Masked...

and here is my proof pic, after updating my Windows 7, the scores got a lot better from 5k to 8k under Performance Set, 3Dmark 11.








I guess I'm also the only owner using the smallest PSU? Quite surprisingly, it can handle the stress.

my 3Dmark Link >>> click here


You beat me with a lower wattage psu!

+rep for living on the edge, =p


----------



## Arizonian

Quote:



Originally Posted by *Masked*


The saddest part of it all really is that I'm now dubbed "the fryer of 590's" around the office









I really hope this new board doesn't come with the same "issues" that the X58's had...For real.


I'm not questioning your situation, but I'd like to clear up whether you think it's X58 boards in general or yours specifically. If it were the platform, we'd have heard this from many more X58 owners before you, and this is the first I'm hearing of it.

Are they sending you another X58 as the replacement motherboard? If they do, and you're going Sandy Bridge, will you be selling it?


----------



## derickwm

Can I haz join?

Proof


----------



## Wogga

No =) There's nothing your card shares with the 590 except the GPUs (even the device ID is different)


----------



## derickwm

Oh... Fine I'll just make my own club


----------



## MKHunt

Quote:


> Originally Posted by *Wogga;15384858*
> No =) There's nothing your card shares with the 590 except the GPUs (even the device ID is different)












It's not even close. Though you did get that at a steal compared to MSRP. It's almost as cheap as 2x 580s.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked;15380058*
> Well, I had another setback last night, which pushed me into making a major change.
> 
> My Classified 3 fried another 590...
> 
> At first I believed it was the fault of my PSU but, after further review, and considering it sparked out in literally the same place...there's no conclusion left other than that my board is frying my cards.
> 
> EVGA is RMA'ing it for me but I'm tired of the issues so, I lit up my Newegg Preferred account last night.
> 
> Incoming is:
> 
> Z68 FTW, 16gb Ram, Sample I7 core, 120gb Intel SSD...Couple of other goodies including a new monitor.
> 
> Will have some results once I get my 590 back from RMA...


Which monitor did you get?

I'm definitely getting that new Asus 27" 120hz with lightboost and HDMI 1.4a

Especially after seeing a 3D demo of Uncharted 3 on a RealD movie theater projection system. It was really impressive.


----------



## Masked

See recent post.


----------



## derickwm

I'm not seeing how the mars is much different than a 590...

It has GF110s or whatever on the same board. I essentially paid the price of 2 retail BNIB 580s.

I basically have a volt modded 590 with a beastly ass aftermarket cooler on it.


----------



## Smo

Quote:


> Originally Posted by *derickwm;15389749*
> I'm not seeing how the mars is much different than a 590...
> 
> It has GF110s or whatever on the same board. I essentially paid the price of 2 retail BNIB 580s.
> 
> I basically have a volt modded 590 with a beastly ass aftermarket cooler on it.


It just isn't a 590 dude. It's like engine swapping an LS1 into a Mini and calling it a Corvette!

Make your own club bud - you'll be glad you did


----------



## derickwm

Everyone just be ragin my 590 actually does what the original 590 should have done. *Goes off and pouts*

/troll


----------



## krazyatom

Quote:



Originally Posted by *derickwm*


Everyone just be ragin my 590 actually does what the original 590 should have done. *Goes off and pouts*

/troll


my gtx 590 is better than your mars!


----------



## derickwm

<3

I still haven't even turned the damn thing on yet...


----------



## kzinti1

I'm certainly a little jealous of that MARS II. If it has a pair of 580s on the same card, I see no reason, other than other people's jealousy, why you can't be in this club!
BTW:

I hope this is enough proof for me to be accepted into this illustrious club!


----------



## Masked

Quote:



Originally Posted by *Arizonian*


I'm not questioning your situation, but I'd like to clear up whether you think it's X58 boards in general or yours specifically. If it were the platform, we'd have heard this from many more X58 owners before you, and this is the first I'm hearing of it.

Are they sending you another X58 as the replacement motherboard? If they do, and you're going Sandy Bridge, will you be selling it?


Going to clarify what happened b/c I've gotten a few inquires about this.

Basically, I've had my X58 since...release day...Serial #30...It's the E770, so the MOSFET PCI power adapter is at the bottom.

On THIS motherboard it was always a requirement to have the molex in there, because there were some power issues with the top slot getting enough juice...So, IMHO, there's a bad solder joint in there somewhere...It happens, no big deal.

I had 3x 480 samples some time ago...Every year I allow our interns to take home their work PC, it's their bonus...So, my work PC at the time (About 2 weeks before the 580's were released) had 3x 480's in TRI-SLI on a 980x...

Within a week of it being home, all 3 cards fried out at the same time...When I called EVGA they covered all 3 but I was...miffed...so I ended up swapping to a 580...Then I eventually added another...

I had talked to EVGA about the board and they had said it wasn't possible, and that they had my back...Jacob said to keep going, and if it was an issue and actually the board, we'd deal with it.

Well, both of those 580's mysteriously burnt out before I bought my original 590...Which, my personal 590 was #12...And I received it the morning of release.

When that Tropical Storm hit, the 590 surged out...There's absolutely no argument that was a pure 100% surge, nothing I can/could do to prevent it...So, that was an instant RMA...No questions asked...//EVERYTHING// else survived EXCEPT for that card.

I've had this new card for about 3 weeks...Turned on my PC on Wednesday and it sparked out at the coupling...

At this point, it's my belief that the MOSFET is simply applying too much power to the rails, and when there's dual demand from the rails and the slot itself, it just fries out...

It would explain every issue I've had.

To be 100% honest, I don't need a 980x at home...I also have a sample 950...So, what I think I'm going to do is RMA the board, cross-shipping it as an advanced RMA, sell the new board/CPU/RAM...take the 980x back to the office and call it a day.

I'm "in line" for a 2700k and truth be told, I was waiting for it to do this because I had my suspicions but, this whole situation just forced my hand a little sooner.

TLDR: Motherboard is frying my crap so it's getting replaced.

1 thing I would like to note is EVGA's customer service...I really need to give them props...

Quote:



Originally Posted by *Shinobi Jedi*


Which monitor did you get?

I'm definitely getting that new Asus 27" 120hz with lightboost and HDMI 1.4a

Especially after seeing a 3D demo of Uncharted 3 on a RealD movie theater projection system. It was really impressive.


I chose this monitor...http://www.newegg.com/Product/Produc...82E16824236104

I chose it for a couple of reasons...one being that I have a surround stand with a 24" max, but also because most reviewers on the Egg have basically said how close to true color the monitor actually is...2ms response time...120Hz...and it was offered at a special price (basically half off), so I snagged one.

I'm likely going to be paying this card off till I turn to dust but, at least, like I said, I won't be frying my cards anymore.


----------



## Jeppzer

Quote:



Originally Posted by *derickwm*


Everyone just be ragin my 590 actually does what the original 590 should have done. *Goes off and pouts*

/troll


You're in _my_ 590 bro club.

*brofist*


----------



## Wogga

It's not jealousy =)
Really: the 590 is 10DE-1088, the MARS is 10DE-108B, and the 580 is 10DE-1080.
The MARS isn't a 590, or I could flash its BIOS without changing the device ID, with no danger that the MARS BIOS would fry my card(s). Likewise, the MARS is not 2x 580, for the same device-ID reason.
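Those vendor:device pairs are easy to check in code; here's a minimal sketch using only the IDs quoted above (the lookup table and function names are illustrative, not from any real driver database):

```python
# PCI IDs quoted above. NVIDIA's vendor ID is 0x10DE; the device ID is
# what distinguishes a 580 (0x1080), a 590 (0x1088), and a MARS II (0x108B).
KNOWN_IDS = {
    (0x10DE, 0x1080): "GeForce GTX 580",
    (0x10DE, 0x1088): "GeForce GTX 590",
    (0x10DE, 0x108B): "ASUS MARS II",
}

def identify(vendor_id: int, device_id: int) -> str:
    return KNOWN_IDS.get((vendor_id, device_id), "unknown device")

print(identify(0x10DE, 0x108B))  # ASUS MARS II
```

So a BIOS built for one device ID simply isn't addressed to the other card, which is the whole point being made here.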


----------



## Masked

Quote:


> Originally Posted by *Wogga;15395546*
> It's not jealousy =)
> Really: the 590 is 10DE-1088, the MARS is 10DE-108B, and the 580 is 10DE-1080.
> The MARS isn't a 590, or I could flash its BIOS without changing the device ID, with no danger that the MARS BIOS would fry my card(s). Likewise, the MARS is not 2x 580, for the same device-ID reason.


Since this has never actually been described...

Asus decided that the 590s could be more, and had already laid out the PCBs for something "else"...So essentially, toward the very end of the 590 production run (2x cherry-picked 580 cores), they asked if they could get some extra cores.

Their request was granted and they dropped the cores on a beefed up PCB.

The only difference between the mars II and the 590 is the PCB...Same cores, same production run...Etc.

The bios is not the same because the power flow/etc is different due to the PCB being different.

While ours is basically 2x crippled 580s, or basically overclocked 570s in SLI...the MARS II actually runs at about 90%...

Now, I don't think the 20% performance difference justifies the $700 price difference...Nor would I EVER, and I mean EVER, want to be dependent on Asus for custom drivers...But in terms of the cards being alike, they're the same cores.

I have no problem with him being in here, as he basically faces the same exact issues we do...Nvidia coins the cards as the same, so...


----------



## Recipe7

I'm planning to re-apply some TIM on my GTX 590.

Does anyone have a link to a tutorial which I can follow so I can have this done successfully and safely?

For reference, I will only be applying TIM, not adding a cooler or waterblock.


----------



## Scorpion49

It's getting closer:


----------



## emett

Lol, why would you want in on a 590 club when you have a MARS II? They're different cards.


----------



## MKHunt

Oh guys I have really terrible awesome news.

I have always had some coil whine with my 590, but that has changed. When I accidentally baked my card to the point that it shut down (the same event that completely destroyed my old 2600k), something seems to have fundamentally changed. I used to have coil whine at 630MHz, and when I OCed it to 685MHz for folding (same voltage), the pitch of the whine would change.

But now? Completely gone. I can run anywhere from 607 to 680 and TPFs are the same as before but now the card is dead silent.

tl;dr I baked my GPU by accident and it _got better_. Don't bake your GPU. There was a limited run of 590s, and the number of them still kicking will only dwindle with time.


----------



## Shinobi Jedi

Quote:



Originally Posted by *Masked*


Going to clarify what happened b/c I've gotten a few inquires about this.

Basically, I've had my X58 since...release day...Serial #30...It's the E770, so the MOSFET PCI power adapter is at the bottom.

On THIS motherboard it was always a requirement to have the molex in there because there were some power issues w/the top slot getting enough juice...So, IMHO there's a bad solder in there somewhere...It happens, no big deal.

I had 3x 480 samples some time ago...Every year I allow our interns to take home their work PC, it's their bonus...So, my work PC at the time (About 2 weeks before the 580's were released) had 3x 480's in TRI-SLI on a 980x...

Within a week from it being home, all 3 cards fried out at the same time...When I called EVGA they covered all 3 but, I was...miffed...so, I ended up swapping to a 580...Then I eventually added another...

I had talked to EVGA @ the board and they had said it wasn't possible, they had my back...Jacob said to keep going and if it was an issue and actually the board we'd deal w/it.

Well, both of those 580's mysteriously burnt out before I bought my original 590...Which, my personal 590 was #12...And I received it the morning of release.

When that Tropical Storm hit, the 590 surged out...There's absolutely no argument that was a pure 100% surge, nothing I can/could do to prevent it...So, that was an instant RMA...No questions asked...//EVERYTHING// else survived EXCEPT for that card.

I've had this new card for @3 weeks...Turned on my PC on Wednesday and it sparked out at the coupling...

At this point, it's my belief that the MOSFET is simply applying too much power to the rails, and when there's dual demand from the rails and the slot itself, it just fries out...

It would explain every issue I've had.

To be 100% honest, I don't need a 980x at home...I also have a sample 950...So, what I think I'm going to do is RMA the board, cross-shipping it as an advanced RMA, sell the new board/CPU/RAM...take the 980x back to the office and call it a day.

I'm "in line" for a 2700k and truth be told, I was waiting for it to do this because I had my suspicions but, this whole situation just forced my hand a little sooner.

TLDR: Motherboard is frying my crap so it's getting replaced.

1 thing I would like to note is EVGA's customer service...I really need to give them props...

I chose this monitor...http://www.newegg.com/Product/Produc...82E16824236104

I chose it for a couple of reasons...1 being that I have a surround stand w/max 24" but, also because MOST users off of the egg have basically stated how close to truecolor the monitor actually is...2ms response time...120hz...and it was offered at a special price (basically half off) so, I snagged one.

I'm likely going to be paying this card off till I turn to dust but, at least, like I said, I won't be frying my cards anymore.


That's the one I tried to buy at Fry's Electronics, but both the initial purchase and the exchange had dead pixels out of the box. Though I'm sure with your contacts and position you can easily get that rectified, should the same happen to you.

Then I went with the BenQ and it's been pure bliss. If they made a 27" model I'd buy it in a heartbeat, but it doesn't seem as if they have one in the pipeline, so I'll be rolling the dice again with Asus and their new 27".

To be clear, I really like Asus. I love my G73 notebook, so I was disappointed that I had two bad experiences with their monitor.

The guys in the PC department at Fry's who helped me said that while their notebook division is kick ass, their monitor division has been hit or miss. Hopefully you'll get a pristine one, but if you don't, then I'm sure you'll be able to verify whether their claims are on point or just sales B.S.

Congrats on the purchases!


----------



## Shinobi Jedi

Quote:



Originally Posted by *emett*


lol why would you want in on a 590 club when you have a mars II they are different cards.


I'd imagine because there are probably not enough owners hitting the boards to get a proper one going, the cards being as limited as they are.

I don't have a problem with him joining either, if only because it shows that we're PC gamers and rig enthusiasts who don't let hardware prejudices get in the way of the fun.


----------



## emett

I couldn't care less tbh. No skin off my nose. There was a thread a while back where someone asked the moderators to make a mars 2 club.


----------



## max883

Sold my GTX 590, and now I have 2x GTX 580 DirectCU II in SLI







Very happy cos I got them for the price of the GTX 590!

In the first test at the beginning of 3DMark11 I got 58 fps max with the GTX 590.
In the first test at the beginning of 3DMark11 I got 89 fps max with GTX 580 SLI, at 950MHz core. Wow.


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi;15402653*
> That's the one I tried to buy at Fry's electronics but both the initial purchase and the exchange had dead pixels out of the box. Though I'm sure with your contacts and position, you can easily get that rectified should the same happen to you.
> 
> Then I went with the BenQ and it's been pure bliss. If they made a 27" model I'd buy it in a heartbeat but it doesn't seem as if they have one in the pipeline, so I'll be rolling the dice again Asus and their new 27"
> 
> To be clear, I really like Asus. I love my G73 notebook, so I was disappointed that I had two bad experiences with their monitor.
> 
> The guys in the PC department at Fry's who helped me, said that while they're notebook division is kick ass, their monitor division has been hit or miss. Hopefully you'll get a pristine one, but if you don't then I'm sure you'll be able to verify if their accusations are on point, or just sales B.S.
> 
> Congrats on the purchases!


No dead pixels (knock on wood) and it's almost the same screen size, so I'm not missing too much.

It is really freaking shiny though...Like SHINY SHINY.

New card comes in on Friday...New motherboard comes in on Wednesday and all the other new stuff comes in on Monday...So, busy week.

Will take some pics, definitely of the new setup, and will post those up, likely on Friday









Can't wait ~ Miss my baby


----------



## rush2049

Quote:


> Originally Posted by *max883;15406614*
> Sold my gtx 590, and now i have 2X GTX 580 Direct Cu 2 in SLi
> 
> 
> 
> 
> 
> 
> 
> very hapy cos i got it for the prise of the gtx590!
> 
> In the first test at the begining of 3dmark11 i got 58.fps max With gtx590.
> In the first test at the begining of 3dmark11 i got 89.fps max wit GTX 580 SLi. With 950mhz core. wow.


Where did you find that good of a price on those cards?


----------



## max883

I live in Norway, so the price is good


----------



## Recipe7

Quote:


> Originally Posted by *Recipe7;15398740*
> I'm planning to re-apply some TIM on my GTX 590.
> 
> Does anyone have a link to a tutorial which I can follow so I can have this done successfully and safely?
> 
> For reference, I will only be applying TIM, not adding a cooler or waterblock.


Bump. Anyone able to help me out?


----------



## Scorpion49

Quote:


> Originally Posted by *Recipe7;15409373*
> Bump. Anyone able to help me out?


Here is a good disassembly video to help you out:





http://www.youtube.com/watch?v=LNNg3rbmiM0

When I re-apply TIM to my cards I do it just like a CPU and spread an even, thin layer over the whole heat spreader, like this:





http://www.youtube.com/watch?v=aU2_uP9S9Gg&feature=related

Here's a good tidbit from the EVGA forums before you go scraping it off:
Quote:


> The stock TIM on the 5xx series is Shin Etsu X23. It's one of the best out there. If you're going to replace it, do not replace it with a dumpy TIM.
> 
> I would recommend replacing with Shin Etsu X23 or G751 or IC Diamond. I would no longer recommend using AS5.
> 
> You mentioned temps in the 90s but nothing about your fan speed. What are you temps when you set the fan to 100%? That's a better baseline than auto fan.
> 
> I've noticed virtually no difference between Shin-Etsu X23, G751, or IC Diamond in terms of reducing GPU temperatures. They all seem to run within 1-2*C of each other. Why? Because they're the best.


----------



## Smo

I would have to advise against spreading the TIM like in the video above. The application (X) was fine, but by slapping your finger all over it you're going to introduce air pockets which will hinder the heat transfer.

If you absolutely must spread it, use an old credit card. Otherwise, apply the TIM in the shape of an X as above, then bolt the fan assembly back on. Job done.


----------



## Scorpion49

Quote:


> Originally Posted by *Smo;15409743*
> I would have to advise against spreading the TIM like in the video above. The application (X) was fine, but by slapping your finger all over it you're going to introduce air pockets which will hinder the heat transfer.
> 
> If you absolutely must spread it, use an old credit card. Otherwise, apply the TIM in the shape of an X as above, then bolt the fan assembly back on. Job done.


Air pockets? Seriously? With two flat pieces bolted together? Many of the higher-end TIM compounds do not spread well at all under pressure because of their thickness. Using a credit card may work, or some of the kits come with little plastic spreaders, but there is nothing wrong with using your finger.


----------



## Smo

Quote:


> Originally Posted by *Scorpion49;15409836*
> Air pockets? Seriously? With two flat pieces bolted together? Many of the higher end TIM compounds do not spread well at all with pressure because of the thickness. Using a credit card may work, or some of the kits come with little plastic spreaders, but there is nothing wrong with using your finger.












That is a picture of a sedimentary rock formed under tremendous pressure. Yet it's aerated.

I don't want to be pedantic mate but yes, it can happen!


----------



## Scorpion49

Quote:



Originally Posted by *Smo*


That is a picture of a sedimentary rock formed under tremendous pressure. Yet it's aerated.

I don't want to be pedantic mate but yes, it can happen!


I don't think that has anything to do with this, given there are four sides of the heat spreader for air to escape from (not that there will be any to begin with). A simple Google search will tell you your spreading method has to change depending on the consistency of the TIM: thick = manual spread, thin = weight spread.

If you try to weight-spread the thick stuff, it's just going to stay a blob in the middle and act as an insulator, since it will likely be too thick.


----------



## Recipe7

Thank you both for your advice. I will take everything into consideration. rep for you both.


----------



## Wogga

You can look at this manual for EK waterblocks. I did what they said and I'm still at well under 40° under load on both my cards (counting the OCed 2600K too; they're all in one loop).


----------



## supergoopinator

Proof
So as you can see in my proof photo, I seem to have a pretty good system, but I can't find any 3dmark scores of identical systems that are anywhere near as low as mine. Any ideas?


----------



## MKHunt

Here are the application notes for IC Diamond, my TIM of choice. I've used Gelid, Shin-Etsu, AS5, MX-4, and ICD, and it's IMO the thickest of the above TIMs.

I don't believe the application should be varied by thickness, but rather by the TIM's solid:solvent ratio. Those with less solvent must be spread by pressure, whereas those with more can be manually spread.

Also, I wouldn't repaste an EVGA 5xx series, due to the quality of the paste used. When I blocked my card I was actually impressed by the quality of the paste and the nearly perfect amount used. Maybe my card was an exception, but why change something for the same or maybe a 1-2° difference? The quality of the application alone can make it worse, even if you go up a level in TIM.

ETA: Just saw the card in question is an ASUS. I'd probably repaste it.








I vote for IC Diamond


----------



## Recipe7

More great info, thanks MKHunt and Wogga.


----------



## emett

Guys, is it possible to run a cracked version of 3DMark 11? Only reason I ask is I'd like to test on the Extreme setting. I was doubting a cracked version was around because the program loads a webpage to give the score.

Never mind, found it


----------



## Wogga

*supergoopinator*, here is the X score of an OCed single 590, so your stock-clock score is fine


----------



## chaosneo

I think my GTX 590 just crashed, after playing Bad Company 2 for around 30 min.

Ambient temp: 29 Celsius
Last 15 seconds recorded from the GPU-Z log:
GPU load: 95%-99%
One GPU temp (always the hotter core): 98 Celsius

I guess it got too hot. It restarted without problem; going to remedy it with a loud 80mm fan.
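For anyone digging through GPU-Z sensor logs like the one above: the log is just delimited text, so you can pull the peak of any sensor column with a few lines of Python. This is only a sketch under assumptions; the column names and sample rows below are made up, and real GPU-Z logs can name and order columns differently depending on version and card.

```python
import csv
import io

# Hypothetical excerpt of a GPU-Z sensor log (comma-separated, one
# column per sensor). Real logs look similar but are not guaranteed
# to use these exact headers.
log = """Date , GPU Temperature [C] , GPU Load [%]
2011-10-22 20:01:00 , 95.0 , 97
2011-10-22 20:01:01 , 98.0 , 99
2011-10-22 20:01:02 , 96.0 , 95
"""

def peak(log_text, column):
    """Return the maximum value recorded in the named sensor column."""
    rows = csv.reader(io.StringIO(log_text))
    header = [h.strip() for h in next(rows)]   # first line names the sensors
    i = header.index(column)
    return max(float(r[i]) for r in rows)
```

On the sample above, `peak(log, "GPU Temperature [C]")` returns 98.0, i.e. the spike chaosneo saw.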


----------



## supergoopinator

Quote:


> Originally Posted by *Wogga;15418174*
> *supergoopinator*, here is X score of OCed single 590 so yours stock clock score is fine


Thanks Wogga. I think I was comparing to people using two GTX 590s. They were getting scores of around X6000. In the compare screen they had SLI enabled, but my results also say SLI enabled, since technically I do have it.


----------



## Smo

Quote:



Originally Posted by *chaosneo*


i think my GTX590 just crash, after playing BadCompany2 for around 30min.

Ambient Temp: 29 celcius
Last 15 second record from GPUZ Log:
GPU Load: 95%-99%
one GPU Temp (always the higher temp core): 98 celcius

i guess it got too hot, restarted without problem, going to remedy it with a loud 80mm fan.


Absolutely - that was your card throttling to avoid damage. Are you using a custom fan profile?


----------



## Scorpion49

Quote:



Originally Posted by *Smo*


Absolutely - that was your card throttling to avoid damage. Are you using a custom fan profile?


Probably not. It seems like there's been a rash of people who don't know about fan profiles the last few days.


----------



## MKHunt

Friends, I am distraught. Very, very distraught. Here's why.

Not overclocked at all. Will EVGA give me trouble about "physical damage"?


----------



## Smo

Quote:



Originally Posted by *MKHunt*


Friends, I am distraught. Very, very distraught. Here's why.

Not overclocked at all. Will EVGA give me trouble about "physical damage"?


I'm sorry to hear it dude - I would assume that the card was at fault here, pulling too much from the PSU for some reason and melting the pin. Of course I could be way off, but that's just my 2 cents!

I'd RMA the card - your PSU may well have survived, with just the PCIe connector killed.


----------



## OutlawNeedsHelp

Since my last post wasn't noticed (page 272, halfway down) I'll just post it again. The date is the 17th because that's when I originally took the pic and I'm lazy.

Stock EVGA 590 
http://imageshack.us/photo/my-images/97/590proof.png/
http://imageshack.us/photo/my-images/571/photoer.jpg/


----------



## Smo

Quote:



Originally Posted by *OutlawNeedsHelp*


Since my last post wasn't recognized (Page 272 half way down) I'll just post it again. The date is the 17th because that's when I originally took the pic and I'm lazy.

Stock EVGA 590 
http://imageshack.us/photo/my-images/97/590proof.png/
http://imageshack.us/photo/my-images/571/photoer.jpg/


What do you mean by not recognised dude? If you meant that you weren't added to the club then try PMing Alatar.


----------



## OutlawNeedsHelp

Quote:



Originally Posted by *Smo*


What do you mean by not recognized dude? If you meant that you weren't added to the club then try PMing Alatar.


I think it would be fairly easy for a post to go unnoticed.


----------



## chaosneo

Smo, Scorpion49,
I'll look into the custom fan profile. Do I configure it in the Nvidia Control Panel?

If yes, how should I set it instead of the default?
Thank you for your thoughts...


----------



## Smo

Quote:


> Originally Posted by *chaosneo;15430048*
> Smo, Scorpion49,
> will look into the fan custom profile, do i configure it on Nvidia control panel?
> 
> if yes, how should i set it instead of default?
> thank you for your thoughts...


If you aren't already, use a program like the latest MSI Afterburner Beta. You can use that to increase the fan speed relative to temperature. You can use 100% fan speed whenever you like, depending on how acceptable you find the volume levels.

Personally, I use maximum speed from 70c.
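For what it's worth, a custom fan profile like the one described above is just a piecewise-linear curve mapping temperature to fan duty cycle. Afterburner does this internally; the sketch below is purely illustrative, with hypothetical breakpoints chosen to mirror the "100% from 70c" profile mentioned.

```python
# (temperature in C, fan speed in %) breakpoints - hypothetical values
CURVE = [(40, 40), (55, 60), (65, 80), (70, 100)]

def fan_speed(temp_c):
    """Linearly interpolate fan % between curve points; clamp at both ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # interpolate between the two surrounding breakpoints
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # past the last point: pin at max
```

On this curve, `fan_speed(60)` lands at 70%, and anything at or above 70c pins the fan at 100%, which is the behaviour you want if the card is brushing up against its thermal ceiling.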


----------



## Masked

Quote:



Originally Posted by *MKHunt*


Friends, I am distraught. Very, very distraught. Here's why.

Not overclocked at all. Will EVGA give me trouble about "physical damage"?


This makes me very curious, because I've had the same thing happen twice now, only my rails have absolutely no damage.

I do believe in my situation that it was the fault of the motherboard; however, if one of the cards was pulling more, this is possible as well.

Throwing my new build together at work, then going to bring her home...pics later, as promised!


----------



## Smo

Quote:



Originally Posted by *Masked*


This makes me very curious because I've had the thing happen twice now only my rails have absolutely no damage.

I do believe in my situation that it was the fault of the motherboard however, if 1 of the cards was pulling more...this is possible as well.

Throwing my new build together at work then going to bring her home...pics later, as promised!


Looking forward to seeing it dude.


----------



## Semedar

Quick question:

Is it safe for my 590 to be at 92-93c while running Folding@home? Don't want to damage it.









Edit: I'm on air, not WC.


----------



## Wogga

Definitely a bad idea =) maybe you should try additional fans


----------



## Masked

Quote:



Originally Posted by *Wogga*


definately bad idea =) maybe you should try additional fans












Max temp for Nvidia reference PCBs is 105c...

While I agree he should get more fans to better cool it, the card itself is in absolutely no danger.


----------



## chaosneo

My GPU core reaches 98c in GPU-Z ... not very far from 105c

Quote:



Originally Posted by *Masked*











Max temp for Nvidia Ref PCB's is 105c...

While I agree he should get more fans to better cool it...The card itself, is in absolutely no danger.


----------



## Masked

Quote:



Originally Posted by *chaosneo*


my GPU core reach 98c on GPU-Z ... not very far from 105c


I would think you need a better case for more air circulation as well, then.









Still, that's a HUGE difference in terms of degrees C...it may not seem like it, but it is.

100-105c is the "max range" and 105c is the max...

As long as you're not beating down the doors of 100c, I wouldn't worry about it...

If you're at 98c then I would get a new case asap.


----------



## Semedar

Alright, thanks guys!









Feel a lot better now. I shall try to leave my Sig Rig on overnight to crunch some Folding@home.


----------



## Masked

Quote:


> Originally Posted by *Semedar;15432824*
> Alright, thanks guys!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Feel a lot better now. I shall try to leave my Sig Rig on overnight to crunch some Folding@home


...I don't know if I'd go that far actually.

What you have to realize is that as your card circulates air, if there's a build-up of hot air, that just recycles into the card.

The card SHOULD turn off automatically at 100c-105c but that doesn't mean it //WILL//...I've actually had several in the past that haven't.

If you have good air management, then I wouldn't worry about it...If you don't have good airflow, I wouldn't exactly leave it on overnight.

I seem to recall 3x 480s heating my office to over 80F one 15F morning...Just the cards...because I left them on overnight...Seriously.

So just be cautious.


----------



## MKHunt

These cards _will_ change the temperature of a room. I fold a lot and my room is the hottest in the house, even with the window open and 50° temps outside.

If that's OK with you and your airflow is OK, then go for it. On the 285.27 drivers at 630/864 (stock EVGA), each die was turning out about 15.4K PPD with an SMP client running.


----------



## dimisyro

Hi all,

I've recently bought a POV GTX 590 Ultra Charged (at 670MHz), and while playing Metro 2033 for about 20 minutes the card reached 97C, which is supposed to be the maximum operating temperature! At that point I exited the game. Later the same day I tried again and the temps kept rising until they reached 97 once again, at which point I stopped the game.

I have not overclocked the card myself. Also, I was actually playing with the case open, I have the latest Nvidia drivers, and I live in Switzerland, not the Sahara.

This is the only game that has made the card reach those temps, but I don't get how they can release a card that can reach its limits just by normal operation (even though I acknowledge Metro is very heavy, it is still a game that one should be able to play without fear of explosion...)

I'm looking for suggestions. Shall I RMA it (i.e. is it possible that this is an isolated case and that other cards of the same brand would not go as high)? Or shall I try to undervolt it to the reference settings (would that void my warranty?)?

Thank you in advance.

@Masked: where did you get the 100-105 figures? Everywhere I've looked they list 97 as the max temp


----------



## Semedar

Quote:


> Originally Posted by *dimisyro;15437043*
> Hi all,
> 
> I've recently bought a POV GTX 590 Ultra Charged (at 670Mhz) and while playing Metro 2033 for about 20 minutes, the card reached 97C, which is supposed to be the maximum operating temperature! At that point I exited the game. Later the same day I tried again and the temps kept rising until they reached 97 once again, at which point I stopped the game.
> 
> I have not overclocked the card myself. Also, I was actually playing with the case open, I have the latest Nvidia drivers and I live in Switzerland not the Sahara.
> 
> This is the only game that has made the card reach those temps, but I don't get it how they can release a card that can reach its limits like just by normal operation (even though I acknowledge Metro is very heavy, it is still a game that one should be able to play without fear of explosion...)
> 
> I'm looking for suggestions. Shall I RMA it (i.e. is it possible that this is an isolated case and that other cards of the same brand would not go as high)? Shall I try and undervolt it to the reference settings (would that void my guarantee??).
> 
> Thank you in advance.
> 
> @Masked: where did you get the 100-105 figures? Everywhere I've looked they list 97 as the max temp


If all else fails, water cool it!







What kind of case have you got? How many fans, what kind, and where are they placed inside your case? Where are the intake/exhaust fans? What are your ambient temps?


----------



## Arizonian

Hey Dim,

In the Nvidia Control Panel, is the power management set to 'adaptive'?


----------



## kceenav

Quote:


> Originally Posted by *dimisyro;15437043*
> I'm looking for suggestions. Shall I RMA it (i.e. is it possible that this is an isolated case and that other cards of the same brand would not go as high)? Shall I try and undervolt it to the reference settings (would that void my guarantee??).


Probably these temperatures are normal for a "Charged" GTX 590, because manufacturers of factory-overclocked cards tend to up the voltage to a point where even the least overclockable GPU will operate stably. It's up to you alone whether you're willing to put up with that.

Regarding undervolting: you can try to lower the voltage with MSI Afterburner. Afterburner also allows you to run the GPUs at lower clocks. Both methods will _lower_ the stress on the card, so no fear of losing your warranty.


----------



## Wogga

670MHz is a "small" OC. It needs somewhere between 0.950v and 0.963v; this voltage isn't so high compared to stock on some cards (0.925v), so temps should be nearly the same. Of course that assumes good case cooling (like a HAF 932 with Noctua fans instead of stock).
Maybe the thermal interface isn't so good, or the stock cooler wasn't mounted properly (my second, newer 590 had such problems)...


----------



## krazyatom

Woot!
9 more hours to play BF3 with my GTX 590! I hope I can max all settings at 2560x1600.


----------



## ilukeberry

Quote:


> Originally Posted by *dimisyro;15437043*
> Hi all,
> 
> I've recently bought a POV GTX 590 Ultra Charged (at 670Mhz) and while playing Metro 2033 for about 20 minutes, the card reached 97C, which is supposed to be the maximum operating temperature! At that point I exited the game. Later the same day I tried again and the temps kept rising until they reached 97 once again, at which point I stopped the game.
> 
> I have not overclocked the card myself. Also, I was actually playing with the case open, I have the latest Nvidia drivers and I live in Switzerland not the Sahara.
> 
> This is the only game that has made the card reach those temps, but I don't get it how they can release a card that can reach its limits like just by normal operation (even though I acknowledge Metro is very heavy, it is still a game that one should be able to play without fear of explosion...)
> 
> I'm looking for suggestions. Shall I RMA it (i.e. is it possible that this is an isolated case and that other cards of the same brand would not go as high)? Shall I try and undervolt it to the reference settings (would that void my guarantee??).
> 
> Thank you in advance.
> 
> @Masked: where did you get the 100-105 figures? Everywhere I've looked they list 97 as the max temp


Dude, I own the PoV GTX 590 "Charged Edition" -A1 revision, which is clocked at 668MHz, and I get max 70~75C when I play BC2. Yours is probably the -A2 revision, since they downclocked the cards in the -A2 revision.

-A1 (old revision):

Charged = 668MHz
Ultra Charged = 692MHz

-A2 (new revision)
Charged = 640MHz
Ultra Charged = 670MHz

Make sure you have good airflow in the case.


----------



## Smo

Quote:


> Originally Posted by *dimisyro;15437043*
> Hi all,
> 
> I've recently bought a POV GTX 590 Ultra Charged (at 670Mhz) and while playing Metro 2033 for about 20 minutes, the card reached 97C, which is supposed to be the maximum operating temperature! At that point I exited the game. Later the same day I tried again and the temps kept rising until they reached 97 once again, at which point I stopped the game.
> 
> I have not overclocked the card myself. Also, I was actually playing with the case open, I have the latest Nvidia drivers and I live in Switzerland not the Sahara.
> 
> This is the only game that has made the card reach those temps, but I don't get it how they can release a card that can reach its limits like just by normal operation (even though I acknowledge Metro is very heavy, it is still a game that one should be able to play without fear of explosion...)
> 
> I'm looking for suggestions. Shall I RMA it (i.e. is it possible that this is an isolated case and that other cards of the same brand would not go as high)? Shall I try and undervolt it to the reference settings (would that void my guarantee??).
> 
> Thank you in advance.
> 
> @Masked: where did you get the 100-105 figures? Everywhere I've looked they list 97 as the max temp


For the moment, mate, you should set up a custom fan profile to control your temps - you could potentially drop ~15-20c that way.
Quote:


> Originally Posted by *krazyatom;15438646*
> woot
> 9 more hours to play bf3 with my gtx 590! I hope I can max all settings with 2560 x 1600 resolutions.


Easily - the lowest FPS I've seen so far with my 590 @ stock clocks is 48 in an intense scene, and max memory usage so far has been 1.3GB. This is about an hour's worth of gameplay @ 1920x1080.


----------



## emett

I am getting an error when trying to install these new drivers: "this graphics driver could not find compatible hardware".

How the hell do I get around this? I was about to do some benchmarking....
Any ideas, guys?


----------



## Recipe7

Quote:


> Originally Posted by *emett;15439810*
> I am getting an error when trying to install these new drivers. "this grfx drive could not find compatible hardware".
> 
> How the hell do I get around this? I was about to do some benchmarking....
> Any ideas guys??


I bet you downloaded the wrong windows version.

I bet you have a 64 bit version... you probably downloaded the 32 bit.


----------



## Smo

Quote:


> Originally Posted by *emett;15439810*
> I am getting an error when trying to install these new drivers. "this grfx drive could not find compatible hardware".
> 
> How the hell do I get around this? I was about to do some benchmarking....
> Any ideas guys??


Did you uninstall the old set by using Device Manager? If so, NVIDIA claims that the recent driver sets don't support this.

My suggestion would be to look in the Control Panel and uninstall anything NVIDIA-related from there, then restart the machine into safe mode before installing the new driver set.

Good luck!


----------



## emett

Quote:


> Originally Posted by *Recipe7;15439948*
> I bet you downloaded the wrong windows version.
> 
> I bet you have a 64 bit version... you probably downloaded the 32 bit.


Yeah, I stuffed up. I was sure I got the 64-bit ones. Anyway, thanks, it's working now.


----------



## Arizonian

Quote:



Originally Posted by *emett*


I am getting an error when trying to install these new drivers. "this grfx drive could not find compatible hardware".

How the hell do I get around this? I was about to do some benchmarking....
Any ideas guys??


Did you download 32-bit or 64-bit? I did that once, and it didn't like the fact I was trying to install a 32-bit driver on Win7 64-bit.









Edited : Saw I was ninja'ed before I posted reply. No need to respond.


----------



## EM2J

add me

Friggin love this card. So glad I ditched the 6970s.


----------



## m3th0d1c4l

Anyone running the new drivers still getting the random fan spike on their GTX 590? I thought they would have fixed this, but mine is still doing it after updating to 285.62.


----------



## RagingCain

Well hello gents, just wanted to drop by and let you know I am up and running once more


----------



## MKHunt

Quote:



Originally Posted by *RagingCain*


Well hello gents, just wanted to drop by and let you know I am up and running once more










What day did you get things running? Did it happen to be yesterday (Sunday?). Based solely on casual observation I'm formulating a theory that between you and me there is a fixed amount of "up and running." My PSU and 590 are both on RMA as I type this. My 2600k was RMAed week before last and I just got it back last Thursday. As soon as my rig was under water and starting to feel its legs, your old rig went down.

Coincidence? Maybe.


----------



## RagingCain

Quote:



Originally Posted by *MKHunt*


What day did you get things running? Did it happen to be yesterday (Sunday?). Based solely on casual observation I'm formulating a theory that between you and me there is a fixed amount of "up and running." My PSU and 590 are both on RMA as I type this. My 2600k was RMAed week before last and I just got it back last Thursday. As soon as my rig was under water and starting to feel its legs, your old rig went down.

Coincidence? Maybe.


Holy crap ouch









Yes, it was yesterday actually. I tested a theory that a wobbly stand-off was causing a short. I removed the stand-off from the motherboard tray and voila: power on, no random shutdowns etc.

Maybe you can give me some advice: what temps did you see on the 2600k? I'm thinking I have to re-seat; it's awfully high. At 5GHz and 1.39v it was peaking at 78c with an average of about 75c. Idle is anywhere from 24 to 35c. There's almost no rhyme or reason to it. Driving me nuts.


----------



## MKHunt

Quote:



Originally Posted by *RagingCain*


Holy crap ouch









Yes it was yesterday actually, I tested a theory that a wobbly stand-off was causing a short. I removed the stand-off off of the motherboard tray and voila. Power On, no random shut downs etc.

Maybe you can give me some advice, what temps did you see on the 2600k? I am thinking I have to re-seat its awfully high. At 5GHz and 1.39v it was peaking at 78c average of about 75c. Idle is anywhere from 24 to 35c. Its almost no rhyme or reason. Driving me nuts.


I would reseat. At 5.3 and 1.5V I was peaking at 62ish with the 590 folding at stock volts (sadly the new chip isn't as nice). At my everyday OC of 4.6 at 1.31V, things would hit 51 with everything folding and the case side on (only front as intake). This new chip (Malaysia vs Costa Rica) eats more volts and runs about 6C hotter.

This is with everything in one loop driven by a 35X at full blast and 1800rpm fans also running at full, all as exhaust, with a single identical fan as an intake (push on RX240, pull on RS240, push on RX120).

IIRC you have a more robust rad setup as well. But would a reseat really give a 10C+ difference? I expect when I set up my loop again (it required a complete teardown) I'll see lower temps. The Koolance block had TONS of grease in it, so everything was coated with slime and gunk after only 2.5-3 months. Might want to check for that, maybe.

Does the CPU block still have its factory bow shape? The 2600k has the cores in a straight line from center top to center bottom, so if you align the bow with that area it might help a leetle. The bow being absent could cause problems as well. My Rasa no longer has the bow shape, so I just apply a tad more IC Diamond than I used to and it evens out.

Dunno, just kind of did a mind dump there.

ETA: I idle at ambient with EIST, C-states, and C1E all off. When my room drops down to the 50s the temp probes freak out and claim I'm idling at 11C on two cores and 17C on the other two.

Also what program are you using to monitor temps? CoreTemp read exactly 10.5C higher for me than the actual temperature.


----------



## RagingCain

Quote:



Originally Posted by *MKHunt*


I would reseat. At 5.3 and 1.5V I was peaking at 62ish with the 590 folding at stock volts. (sadly the new chip isn't as nice) At my everyday OC of 4.6 1.31V things would hit 51 with everything folding and the case side on (only front as intake). This new chip (Malaysia vs Costa Rica) eats more volts and runs about 6C hotter.

This is with everything in one loop driven by a 35X at full blast and 1800rpm fans also running full and all as exhaust with a single identical fan as an intake. (Push on RX240, pull on RS240, push on RX120)

IIRC you have a more robust rad setup as well. But would a reseat really give a 10C+ difference? I expect when I set up my loop again (required complete teardown) I'll see lower temps. The Koolance block had TONS of grease in it so everything was coated with slime and grudge after only 2.5-3 months. Might want to check for that maybe.

Does the CPU block still have its factory bow shape? The 2600k has the cores in a straight line from enter top to center bottom, so if you align the bow with that area it might help a leetle. The bow being absent could also cause problems as well. My Rasa no longer has the bow shape so I just apply a tad more IC Diamond than I used to and it evens out.

Dunno, just kind of did a mind dump there.


I appreciate the tips. I am a noob again with a new CPU, and it's nice, but BF3 is out tomorrow and I need a working computer







This is my 4th completely new architecture in 12 months. It will be my last until post-Ivy Bridge and Maxwell GPUs.

I have a brand new spare Maximus IV Extreme P67 I am going to have to sell as well.


----------



## Smo

Welcome back Cain! How's Tri-Fire working out?


----------



## ilukeberry

Quote:



Originally Posted by *m3th0d1c4l*


Anyone running the new drivers still getting the random fan spike on their GTX 590? I thought they would have fixed this, but mine is still doing it after updating to 285.62.


So far so good. I installed these drivers yesterday and I haven't had a single fan spike yet. I had random fan spikes with the 280.26 drivers, so I downgraded to 275.33, but yesterday I upgraded to 285.62 with the clean install option. I'll see how it goes.


----------



## krazyatom

I also had random fan spikes before, but let's see how it goes for 285.62.


----------



## Masked

Quote:


> Originally Posted by *dimisyro;15437043*
> @Masked: where did you get the 100-105 figures? Everywhere I've looked they list 97 as the max temp


Was an Alienware tech before I made admin and we actually used "custom" PCB's that weren't as good as NVIDIA Ref for a little while (for the lower end models), those PCB's had a melt-down temp of 110c.

So my knowledge of it being @100c is one that comes from RL experience and the fact that every other card in the industry is 100-105c.

97c is what "techs" call "created for stupid"...I.E. the general public...It's there to protect the average person from melting/blowing up their ****.

I don't think any of you are in any REAL danger of that happening so, I'll shoot you straight.

I would say that if you're hitting 97c to begin with, you have one of 2 issues...Either 1, it's a hot card, which DOES happen...I mentioned my 480's earlier...One of them, even on water, LOVED to hit 60c at idle...Or 2, you have an air flow issue...

Fixing either situation will mostly solve your problem, considering the 590 actually dumps heat inside your case as well...

~~~

That being said, I ran into a host of issues last night along with a pre-order getting charged to my card







...I had forgotten I pre-ordered a 2700k from Newegg 3 weeks ago and it shows up tomorrow







.

That being said, after griping for 15 minutes at the $400 surcharge on my card and whining to the girlfriend about it...I decided this was best because I'm going to overclock the living **** out of the 2700k!

So, TLDR, I have some pictures for you.

Parts list once I finish:

1x EVGA Z68 FTW
1x I7 2700k
1x EVGA GTX 590
1x EVGA GTX 580 SC
16gb Gskill Ripjaws
Intel 510 120gb SSD
2x 300gb Raptors in Raid
1x 1tb Caviar Black
Creative Fatality X-Fi Pro
Corsair AX1200










The swag...










The motherboard...With Raystorm...

I decided to try EVGA's thermal paste since it was free but, I do have AS5 on hand + some samples so, I'm going to see what works best.

At the moment, until the 590 arrives, I'm only on 1x580 and I want to kick BF3 to the curb so, I'm waiting for my new baby to really do anything.


----------



## m3th0d1c4l

Quote:


> Originally Posted by *ilukeberry;15444532*
> So far so good. I installed these drivers yesterday and I haven't had a single fan spike yet. I had random fan spikes with the 280.26 drivers, so I downgraded to 275.33, but yesterday I upgraded to 285.62 with the clean install option. I'll see how it goes.


Quote:


> Originally Posted by *krazyatom*
> I also had random fan spikes before, but let's see how it goes for 285.62.


I uninstalled all the Nvidia drivers, PhysX, etc. from Control Panel, cleaned everything out with Driver Sweeper, and then installed the 285.62 drivers, and since yesterday I have had 2 random fan spikes. Also, when I am browsing the net and scrolling through webpages, the idle temps on the video card are higher than before. Usually I would never reach over 48-50C while doing things (complete idle is around 38-40C) and now I am reaching close to 58C.

I don't know if it's from the new driver or what...but I am waiting to hear from more GTX 590 owners with the new driver to see what results they are getting. I guess I will try reinstalling the new drivers again, but I don't know what good it will do.
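Rather than waiting to catch a spike by ear, logging fan speed and scanning the log makes the comparison between driver versions objective. A rough sketch of a spike detector over recorded RPM samples; the 1500 RPM jump threshold is an arbitrary assumption, and the samples would come from whatever sensor logger you already use (GPU-Z's log file, for instance):

```python
def detect_fan_spikes(rpm_samples, jump_threshold=1500):
    """Return the indices where fan RPM jumps by more than jump_threshold
    compared to the previous sample -- a crude 'random fan spike' finder."""
    spikes = []
    for i in range(1, len(rpm_samples)):
        if rpm_samples[i] - rpm_samples[i - 1] > jump_threshold:
            spikes.append(i)
    return spikes

# Example: steady ~2000 RPM with one sudden jump to full speed.
log = [2000, 2050, 1980, 4200, 2100, 2000]
print(detect_fan_spikes(log))  # -> [3]
```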


----------



## ilukeberry

As soon as I see a fan spike with 285.62 I'll report here







currently it's running fine.


----------



## dimisyro

Quote:


> Originally Posted by *Semedar;15437179*
> If all else fails, water cool it!
> 
> 
> 
> 
> 
> 
> 
> What kind of case you got? How many/what kind/where at are your fans placed inside your case? Where's the intake/exhaust fans at? What's your ambient temps?


I have a Coolermaster Cosmos 1000. It has 2 fans on top, one at the bottom, and the exhaust fan. I've added one on top of the HDDs but that should have no impact on the rest of the case.
Problem is that the case does not allow for a side fan as far as I can see.
Ambient temps are around 22C.
I'm thinking of buying the Lian Li BS-08B PCI cooler. Do you think it would be a good idea?


----------



## dimisyro

Quote:


> Originally Posted by *ilukeberry;15438796*
> Dude, i own PoV GTX 590 "Charged Edition" -A1 revision which is clocked at 668MHz and i get max 70~75C when i play BC2. Yours is probably -A2 revision since they downclocked the cards in -A2 revision.
> 
> -A1 (old revision):
> 
> Charged = 668MHz
> Ultra Charged = 692MHz
> 
> -A2 (new revision)
> Charged = 640MHz
> Ultra Charged = 670MHz
> 
> Make sure you have good airflow in the case.


That's the thing, other games, including BC2 run at the temps you mentioned. It's only Metro 2033 which makes the temps so high...
Well, Furmark too after 15 mins or so.


----------



## andyvee

Quote:


> Originally Posted by *dimisyro;15449652*
> That's the thing, other games, including BC2 run at the temps you mentioned. It's only Metro 2033 which makes the temps so high...
> Well, Furmark too after 15 mins or so.


I also own a PoV Ultra Charged A2 rev (670 clock). It's in a Corsair 800D case (all fan positions used and filled with Xigmateks, as it's not known as a good air case) and normally averages high 60s after a few hours of BC2. Idle is low-mid 30s, ambient is around 22.


----------



## dimisyro

Quote:


> Originally Posted by *Arizonian;15437259*
> Hey Dim,
> 
> In the Nvidia control Panel is the power management set to 'adaptive'?


Yes it is. I'm guessing it should be, right?


----------



## Masked

Quote:


> Originally Posted by *dimisyro;15449652*
> That's the thing, other games, including BC2 run at the temps you mentioned. It's only Metro 2033 which makes the temps so high...
> Well, Furmark too after 15 mins or so.


Quote:


> Originally Posted by *andyvee;15449704*
> I also own a PoV Ultra Charged A2 rev (670 clock). It's in a Corsair 800D case (all fan positions used and filled with Xigmateks, as it's not known as a good air case) and normally averages high 60s after a few hours of BC2. Idle is low-mid 30s, ambient is around 22.


There is/was no second revision.

The cards were produced in 1 "run" on 2 different production lines.

I have 2 samples in the office that run 670mhz 24/7 and my "Rev 2" that runs 720mhz 24/7; in fact, both of mine do 700mhz+ 24/7.

At home, my card is a Rev 2 and runs 692mhz stock...

There were not 2 different versions...There were not 2 different productions...There was no real revision.

Your card will operate at the max specs it can run at...This is true of every card.

It is possible they were flashed differently but, the runs were identical...And I highly doubt they took the time to make the swap.

That being said Furmark pushes the fan to 100% which, doesn't exactly circulate cold air considering you need a very solid intake...This //can// cause heat issues...


----------



## andyvee

Quote:


> Originally Posted by *Masked;15449759*
> There is/was no second revision.
> 
> The cards were produced in 1 "run" on 2 different production lines.
> 
> I have 2 samples in the office that run 670mhz 24/7 and my "Rev 2" that runs 720mhz 24/7; in fact, both of mine do 700mhz+ 24/7.
> 
> At home, my card is a Rev 2 and runs 692mhz stock...
> 
> There were not 2 different versions...There were not 2 different productions...There was no real revision.
> 
> Your card will operate at the max specs it can run at...This is true of every card.
> 
> It is possible they were flashed differently but, the runs were identical...And I highly doubt they took the time to make the swap.
> 
> That being said Furmark pushes the fan to 100% which, doesn't exactly circulate cold air considering you need a very solid intake...This //can// cause heat issues...


I have no clue whether there were 1 or 2 batches made, but I can say for sure that PoV had the "rev 1" cards clocked at 670/691 and the "rev 2" cards clocked at 648/670 for the Charged and Ultra Charged versions. Probably just a BIOS mod as you suggest; perhaps 691 was a bit too much for their warranty department?!
Regardless, at 670 I should have mentioned that I have set up a custom AB fan profile, but apart from that it's completely stock and has good temps / is very quiet.
If I run with vsync on it will barely hit 50, but I can't deal with the mouse lag. Kinda disappointed that the fps limiter didn't make the new drivers as they said it would, but hey, I digress







.
Lastly, I should say I have never run Furmark on it; I can't see the point (ran it on earlier cards), as all it does is stress the cards way beyond anything that will happen in the real world.
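For what it's worth, the FPS limiter that didn't make it into these drivers is conceptually simple: measure how long the frame took, then sleep out the rest of the frame budget, which caps the frame rate without vsync's mouse-lag penalty. A minimal sketch; `render_frame` is a hypothetical placeholder for the engine's draw call:

```python
import time

def frame_budget(fps_cap: float) -> float:
    """Seconds each frame may take under the cap (1/60 s at 60 fps)."""
    return 1.0 / fps_cap

def limit_frame(frame_start: float, fps_cap: float = 60.0) -> None:
    """Sleep out whatever is left of the frame budget, if anything.
    frame_start is a time.perf_counter() timestamp taken at frame start."""
    remaining = frame_budget(fps_cap) - (time.perf_counter() - frame_start)
    if remaining > 0:
        time.sleep(remaining)

# Usage inside a render loop:
#   start = time.perf_counter()
#   render_frame()          # hypothetical draw call
#   limit_frame(start, 60)  # caps the loop at ~60 fps
```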


----------



## ilukeberry

Quote:


> Originally Posted by *m3th0d1c4l;15446339*
> I uninstalled all the Nvidia drivers, PhysX, etc. from Control Panel, cleaned everything out with Driver Sweeper, and then installed the 285.62 drivers, and since yesterday I have had 2 random fan spikes. Also, when I am browsing the net and scrolling through webpages, the idle temps on the video card are higher than before. Usually I would never reach over 48-50C while doing things (complete idle is around 38-40C) and now I am reaching close to 58C.
> 
> I don't know if it's from the new driver or what...but I am waiting to hear from more GTX 590 owners with the new driver to see what results they are getting. I guess I will try reinstalling the new drivers again, but I don't know what good it will do.


LOL, just got a fan spike... back to 275.33







oh and btw, **** you nvidia.


----------



## ilukeberry

Quote:


> Originally Posted by *Masked;15449759*
> There is/was no second revision.
> 
> The cards were produced in 1 "run" on 2 different production lines.
> 
> I have 2 samples in the office that run 670mhz 24/7 and my "Rev 2" that runs 720mhz 24/7; in fact, both of mine do 700mhz+ 24/7.
> 
> At home, my card is a Rev 2 and runs 692mhz stock...
> 
> There were not 2 different versions...There were not 2 different productions...There was no real revision.
> 
> Your card will operate at the max specs it can run at...This is true of every card.
> 
> It is possible they were flashed differently but, the runs were identical...And I highly doubt they took the time to make the swap.
> 
> That being said Furmark pushes the fan to 100% which, doesn't exactly circulate cold air considering you need a very solid intake...This //can// cause heat issues...


Of course there is _NO_ second revision of the Nvidia GTX 590 itself, but there is a revision of the PoV TGT overclocked cards, which means that cards with the -A2 revision sticker don't run at clocks as high as the -A1 cards did.


----------



## Masked

Quote:


> Originally Posted by *andyvee;15449928*
> I have no clue whether there were 1 or 2 batches made, but I can say for sure that PoV had the "rev 1" cards clocked at 670/691 and the "rev 2" cards clocked at 648/670 for the Charged and Ultra Charged versions. Probably just a BIOS mod as you suggest; perhaps 691 was a bit too much for their warranty department?!
> Regardless, at 670 I should have mentioned that I have set up a custom AB fan profile, but apart from that it's completely stock and has good temps / is very quiet.
> If I run with vsync on it will barely hit 50, but I can't deal with the mouse lag. Kinda disappointed that the fps limiter didn't make the new drivers as they said it would, but hey, I digress
> 
> 
> 
> 
> 
> 
> 
> .
> Lastly, I should say I have never run Furmark on it; I can't see the point (ran it on earlier cards), as all it does is stress the cards way beyond anything that will happen in the real world.


Quote:


> Originally Posted by *ilukeberry;15449967*
> Of course there is _NO_ second revision of the Nvidia GTX 590 itself, but there is a revision of the PoV TGT overclocked cards, which means that cards with the -A2 revision sticker don't run at clocks as high as the -A1 cards did.


The revision 1 cards are "unlocked" for the most part...The revision 2 cards are hit or miss...This goes for every single manufacturer because, like I said, the PCB's were made in 1 shot...They literally set up Kepler the next day.

I think certain companies started to flash the rev 2 cards differently just because of the time that they were released...Hardware wise, they're the exact same cards...Software, there are some different bios's out there so, if PoV were to have flashed their software differently, that I can buy.

However, like I said, we have 6 in the office and each has a different roof, some of the Rev 2's go higher than the Rev 1's and I know my original card (Evga SRL#30) could hit [email protected] without batting an eye.


----------



## dimisyro

Great, I just realised that undervolting it to reference card settings in order to keep temps a bit more in check is not an option, since voltage control is locked (probably by the BIOS?)...


----------



## ilukeberry

Get in contact with PoV and tell them your problem.


----------



## m3th0d1c4l

Quote:


> Originally Posted by *ilukeberry;15449936*
> LOL, just got a fan spike... back to 275.33
> 
> 
> 
> 
> 
> 
> 
> oh and btw, **** you nvidia.


Yea I just got one too. Third one in 2 days. lol.

I really did not get my hopes up for a fix in this release though, since I did not see it fixed in the Beta 285's. But I figured why not give it a shot? Maybe Nvidia would have addressed it......apparently not.

I think I'm going to go back to 275.33 as well. Is Nvidia even aware of the problem?


----------



## CJL

Can I join the Fan Spike club! Add me please...!!!! I will upload an mp3 audio recording as proof later.

Oh well, I'm not reverting back just yet. I'll continue to see how these drivers work out. I usually uninstall PhysX and the display driver from the CP, reboot, and install the new drivers. For the first time I chose the clean install option. Also, I flashed my BIOS, going from 90 and 91 to 92 and 93.

So far 285.62 got me a DNR in 3DMark11 and Rage freezing my system right after launching from Steam (2nd launch after reboot worked). Now waiting for BF3 to install...


----------



## Scorpion49

Just got my XSPC RZR590 block in, all I can say is this thing is a work of art compared to the other designs that are available. Very happy I went with this one.


----------



## CJL

You're right, looks very nice. Like something off the same assembly line as the T800s. Last GPU i WCed was an X800, would love to do it again. Maybe someday.


----------



## MKHunt

You can see the caps through it? Interesting! If only they made it in black w/ black nickel...

Precut thermal pads though... SO jelly.


----------



## Scorpion49

Quote:



Originally Posted by *MKHunt*


You can see the caps through it? Interesting! If only they made it in black w/ black nickel...

Precut thermal pads though... SO jelly.


Yeah, seems like two of the blocks out have depressions for the caps, and two have cutouts like this. I don't care much either way but it looks very clean like it is.


----------



## MKHunt

Quote:



Originally Posted by *Scorpion49*


Yeah, seems like two of the blocks out have depressions for the caps, and two have cutouts like this. I don't care much either way but it looks very clean like it is.


It's so _thin_. Very sleek. The slight blue from the caps with the red in your build will turn out nicely I think.

You should clean the inside of the block and put it on... immediately









Then post pics. Many pics.


----------



## Scorpion49

Quote:



Originally Posted by *MKHunt*


It's so _thin_. Very sleek. The slight blue from the caps with the red in your build will turn out nicely I think.

You should clean the inside of the block and put it on... immediately









Then post pics. Many pics.


Hahaha maybe by the end of this weekend. Got a lot of cutting to do before it gets to that point.


----------



## Pipeman597

I have been using that block for months and all I can say is you won't be disappointed.


----------



## Scorpion49

Quote:



Originally Posted by *Pipeman597*


I have been using that block for months and all I can say is you won't be disappointed.


Nice to know! The reviews pretty much sold me on it when there was barely a 1C difference between it and the ~$130 blocks that look like a Lego project from the remedial class.


----------



## meowis

Can i join da club plox ?
Gigabyte GTX 590 @ stock speeds

proof :


----------



## Shinobi Jedi

So...

For the last few weeks I've been going back and forth playing Resistance 3 with the PS Sharpshooter (so much fun when you get past the learning curve) and Dungeon Siege III on PC.

Interestingly, Dungeon Siege III, if run with V-Sync off, would push all four of my GPUs to max, and before long all the temps were climbing into the high 90s.

So, it being a SP campaign, I put Vsync back on, capped the game's FPS at 60, and played it off my 60" Plasma from the couch with an Xbox 360 pad for Windows. And noticing that all four GPUs were only being utilized at 20% vs. 98% with V-Sync off, I pretty much just gamed and didn't pay much attention.

Then the other day, after letting it run all night downloading the BF3 pre-load and some other games, I noticed the temps on GPU 2 of the top card, the one without very much airflow (because the second card has to be slotted right below it for quad GPU), were running 10c higher than the bottom card and 5c higher than GPU 1 -

When for the longest time both GPU 1 and 2 on the top card only ran about 5-8c above the bottom card with better airflow.

So last night after unlocking BF3, I opened up my rig and tried to move the 8pin PSU cable for the mobo that runs down over the cards before being looped out the back-side and then back in to the PSU. And I guess I knocked something the ever so slightest off, because then I started having jacked up graphical corruptions. Which I first thought might be due to splitting my top card's signal with one going to my 120hz monitor and the other going to my Asus soundcard for Blu-Ray HD sound/Home theatre via a DVI to HDMI cable, which then runs out via HDMI to my 7.1 Home Theater receiver.

After ruling this out, I went ahead and took the cards out and switched them around.

The new driver also kept crashing on me in game on BF3 and my monitor would flash "Device out of Range" before the driver would recover and BF3 would crash. But I'm assuming it's related to one of the cards not being secure and trust it's resolved.

Everything is working fine now, but my GPU 2 temp on the newly switched top card is now idling at 60c with its fan running at 70%, while GPU 1 is at 41c and GPU 3 & 4 on the bottom card are running at 32c and 33c!

I'm not really tripping on the cards, as I never overclock them and knew from EVGA tech support and the boards that, like Masked said, you don't need to be tripping on your temps unless you hit 100c+. I might be tripping on them dying out sooner if I didn't have EVGA's lifetime warranty, but being that I do, I'm not.

Interestingly, in game, the temps stay the same as they always have, usually in the 70s to low 80s Celsius. And that's still the case! Or to be more specific, GPU 2's temps are the same as the other GPUs' during heavy usage.

What I'm wondering is, if it means that something might be going out on my Mobo? I already have one dead ram slot. But I haven't bothered RMA'ing it since I still have 8gb in the other two slots and don't do anything on this machine but game and write.

I was really impressed with the results I was getting on BF3 SP, Ultra settings, before I started having issues. The game was often running well over 120fps during the "prologue" board's firefights, and seemed to go back and forth between 90fps-140fps with the lowest fps I caught in the corner of my eye, being 58fps on the first big opening board after the prologue.

That should be more than enough to keep up with the hard core rigs and players in the game I'd like to think. But I was "noobishly" wondering, would a mobo and CPU upgrade get me to 120fps constant on Ultra during heavy action? Or is the current available hardware just not able to do it? I'm assuming no, and am fine with it.

Though I'm still trying to decide if I am going to get the same mobo and 2700K that Masked got, or if I'm going to wait for Ivy Bridge.

I'll have a better idea after some more play testing, I guess!









My other, slightly off topic, gaming dilemma that maybe those of you in the know, or beta, can help me decide:

After missing out on the KOTOR collector's edition pre-order, I bought the Digital Deluxe version. Really only wanting the collector's edition for the exclusive digital content rather than any physical goodies. And the only main difference between the two, seemed to be an exclusive store only available to those with the collectors edition.

So a kind soul on Kotaku let me know in a talkback thread, that here in the USA, Walmart.com still had the collector's edition available for retail price. So I snagged one up, before they were gone and it's arriving next week!

Now after seeing online vendors on Amazon.com selling them now for $300+, I'm starting to wonder if a mint, sealed in box collectors edition might turn into a truly valuable collector to SW or game collectors?

And if I should just shelve it with some of my other mint in box collectibles and see what happens and use my Digital Deluxe version rather than cancel it or gift it? Or open it and use the codes/account for the exclusive digital content and cancel or gift the Digital version?

General opinions or anyone in the beta that can answer without violating any NDA's would be huge!

Now, I'm going to go test BF3 and see if my issues are resolved. I'm trusting so.

Thanks to all that made it all the way through this novel of a post!

Cheers!


----------



## Swolern

Wow Battlefield 3 is such an amazing game. Absolutely gorgeous with the 590. Everything maxed on ultra and never drops below 60fps with Vsync, 1920x1080. Gorgeous! How is it working for everyone else?

@ Shinobi Jedi
Yes, EVGA customer service is top notch. Their lifetime overclock warranty keeps my mind at ease with my EVGA 590. I would wait for the new Ivy Bridge processors; you will only see slight gaming performance gains vs the 920. No games use hyperthreading that I'm aware of, and I read many people were having problems with HT and BF3 and needed to disable it.

What resolution are you playing BF3 on? How's the 3d?


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Swolern;15457681*
> Wow Battlefield 3 is such an amazing game. Absolutely gorgeous with the 590. Everything maxed on ultra and never drops below 60fps with Vsync, 1920x1080. Gorgeous! How is it working for everyone else?
> 
> @ Shinobi Jedi
> Yes, EVGA customer service is top notch. Their lifetime overclock warranty keeps my mind at ease with my EVGA 590. I would wait for the new Ivy Bridge processors; you will only see slight gaming performance gains vs the 920. No games use hyperthreading that I'm aware of, and I read many people were having problems with HT and BF3 and needed to disable it.
> 
> What resolution are you playing BF3 on? How's the 3d?


Ha! Funny you asked! After seeing that the new drivers had a 3D profile for BF3, and then finding 3D not working, I spent the last two hours trying to see what was wrong with my system before I went online and found out on 3DVisionBlog that it's been temporarily disabled until DICE can get it truly 3D Vision Ready, and the ability will be released in a later patch.

So, I don't know yet. But I read you on waiting for Ivy Bridge. It's pretty much where I'm leaning, as I really want that new Asus 3D Vision 2 LightBoost-ready 27" 120hz monitor even more and don't think I can afford both right now.

Finally, I'm gonna try this game properly. But so far, I couldn't be happier that I ignored the "anti-hype" and went with the 590


----------



## derickwm

Quote:


> Originally Posted by *meowis;15455901*
> Can i join da club plox ?
> Gigabyte GTX 590 @ stock speeds
> 
> proof :
> 
> *snip*


590 hooked up via 4 molex adapters...sketch.


----------



## meowis

Quote:


> Originally Posted by *derickwm;15458146*
> 590 hooked up via 4 molex adapters...sketch.


lol sorry, I'm still new to the computer building scene. I thought this was normal since that's what is provided in the box; other than that, I don't have any other alternative to plug into the GTX 590 for power =s


----------



## kzinti1

Is this difference in clocks, at idle, supposed to be so far off or is this normal?
EVGA GTX590.


----------



## Scorpion49

Quote:



Originally Posted by *meowis*


lol sorry, I'm still new to the computer building scene. I thought this was normal since that's what is provided in the box; other than that, I don't have any other alternative to plug into the GTX 590 for power =s


You're saying your 1000W PSU has no 6 or 8-pin PCI-E connectors? I don't think any company has sold one like that in at least 5 or 6 years.


----------



## Masked

Quote:



Originally Posted by *Shinobi Jedi*


*snip*


The jump from my 980x to the 2600k was enough for me to notice.
The jump from the 2600k to the OC'd 2700k was enough for me to







.

I can't comment on Ivy without some serious legal trouble but, I think the Z68 is a sturdy platform, especially from EVGA.

While eventually you might not be able to utilize 8 lanes of ram...Who in their right mind actually needs that much?

Ivy is supposed to have 1155 capable chips...I.E. the 3630k and with a bios update they'll be a simple PNP...

So, from my personal perspective...I think I made an educated decision...I think I made it at a good time and I'll be the first to tell you, I have a preorder for the 3630k ready to go.

Aside from that, I'm not buying BF3 until I get my 590 back which sucks because I actually have it on pre-order...That being said...I was 1 of the first @ 6am EST to order my SWTOR collector's edition from Amazon...Registered and I think overall, I was #50? ~ Glad I jumped on that ASAP.

Collector's editions go for more a couple years after...For example I have the statue from Killzone 3 and I've actually been offered a lot of $$$$ for it considering there are some knockoffs on ebay etc...

With Star Wars, you have a sure bet that it's going to appreciate regardless of how the game does because it's Star Wars, period.

So, TLDR: I'd buy a Z68 if I were you, the FTW is awesome...2700k maybe and just pop in a 3630k down the road.


----------



## derickwm

Gonna leave this here









*Dual GPU Card Club*


----------



## chaosneo

I am using the latest Nvidia driver, 285.62.

Played Battlefield 3 at resolution 5040x1050.

This is what FRAPS & GPU-Z tell me:
Campaign at ULTRA SETTINGS was great; the FPS hovers around 60-100.

Caspian Map with 64 players at ULTRA SETTINGS averaged 14-20 FPS.
GPU memory constantly hit 1.5GB. Also, after ALT-TABbing a few times in and out, DirectX reported a failure to load, crashing BF3 and dropping to Windows.

This time, with all settings at ULTRA except AA, which I switched off, it ran an improved average of 40-70 FPS.

As for temperature, after increasing some case fan speeds:
Ambient: 29c
At Load: 87c-95c

Fan speed was running at 70% when it hit 85c. I will download the MSI Afterburner program in the next few days to customize the fan profile, probably making it run at 100% fan speed when it hits 80c.
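A custom curve of the kind Afterburner lets you drag out is just piecewise-linear interpolation between (temperature, fan %) points. A sketch of that interpolation, using the 100%-at-80c target mentioned above as the top anchor; every other curve point here is a made-up value to tune:

```python
# Assumed curve points: (GPU temp in C, fan duty in %). Only the 80C -> 100%
# point comes from the post above; the rest are placeholders to tune.
CURVE = [(40, 40), (70, 70), (80, 100)]

def fan_percent(temp_c: float, curve=CURVE) -> float:
    """Linearly interpolate the fan duty cycle for a given temperature."""
    if temp_c <= curve[0][0]:
        return float(curve[0][1])
    if temp_c >= curve[-1][0]:
        return float(curve[-1][1])
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(85))  # -> 100.0 (at or above the top point)
print(fan_percent(75))  # -> 85.0 (halfway between 70% and 100%)
```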

on another note, BATTLEFIELD 3 GRAPHICS IS GORGEOUS!


----------



## Recipe7

Quote:


> Originally Posted by *MKHunt;15412114*
> ETA: Just saw the card in question is an ASUS. I'd probably repaste it
> 
> 
> 
> 
> 
> 
> 
> 
> I vote for IC DIamond


So I'm coming around to finally re-pasting the GPUs... but I got to wondering...

My max temp at 100% fan and full load nets me 81C max. Are those respectable temps, or should I definitely go for the IC Diamond application?


----------



## MKHunt

Quote:


> Originally Posted by *Recipe7;15468105*
> So I'm coming around to finally re-pasting the GPUs... but I got to wondering...
> 
> My max temp at 100% fan and full load nets me 81C max. Are those respectable temps, or should I definitely go for the IC Diamond application?


It's maybe 1-2C higher than I was seeing at 100% fan. 1-2C is NBD, esp. for these cards on air. If pasting is something you take pleasure in, go for it







Otherwise, play some games.


----------



## Recipe7

I don't take too much pleasure in it. However, sometimes I can hear voices coming from my CPU... reason why I decided to purchase a stick of IC Diamond.. it told me to.

I'm going to be re-seating my CPU and cleaning out the case with my new Datavac, might as well give a new layer to the GPUs as well.


----------



## MKHunt

Quote:


> Originally Posted by *Recipe7;15468627*
> I don't take too much pleasure in it. However, sometimes I can hear voices coming from my CPU... reason why I decided to purchase a stick of IC Diamond.. it told me to.
> 
> I'm going to be re-seating my CPU and cleaning out the case with my new Datavac, might as well give a new layer to the GPUs as well.


I hear those same voices. "Buy more rads... you need a Raystorm... this water is too warm, get a chiller... 50C is too hot!..."

Freaking CPU and GPU urging me to buy new RAM and all sorts of expensive gadgetry. Sigh.

Also, on the Koolance waterblock, is it safe to use the pink pads? EVGA has my grey pads...


----------



## Scorpion49

Quote:


> Originally Posted by *MKHunt;15469584*
> I hear those same voices. "Buy more rads... you need a Raystorm... this water is too warm, get a chiller... 50C is too hot!..."
> 
> Freaking CPU and GPU urging me to buy new RAM and all sorts of expensive gadgetry. Sigh.


Those stupid voices get to me as well. Good thing this weekend I have a ton of stuff to do and hopefully get the new rig up and running. That might satisfy the voices for a little while.


----------



## Wogga

Ha! I've sent them to hell and bought a MoRA3








Folding temps are 36/37/36/38 and 65 on the CPU, all in one loop, and it was cheaper than numerous 2x120 or 3x120 rads.
Hope it will handle an OCed 3630K.

My 590s started to make coil whine at 650MHz and higher under full load.
If I put the ferrite rings back on the 8-pin wires, could that help?


----------



## Recipe7

Quote:


> Originally Posted by *MKHunt;15469584*
> I hear those same voices. "Buy more rads... you need a Raystorm... this water is too warm, get a chiller... 50C is too hot!..."
> 
> Freaking CPU and GPU urging me to buy new RAM and all sorts of expensive gadgetry. Sigh.
> 
> Also, on the Koolance waterblock, is it safe to use the pink pads? EVGA has my grey pads...


Seems like my PC isn't as spoiled as yours, luckily. Although, this thread is giving it ideas...


----------



## Opiuz

Quote:


> Originally Posted by *chaosneo;15467313*
> i am using the latest Nvidia Driver 285.62
> 
> Played Battlefield 3 at resolution 5040x1050.
> 
> This is what FRAPS & GPUZ tells me:
> Campaign at ULTRA SETTING was great, the FPS hovers around 60-100.
> 
> Caspian Map with 64 players at ULTRA SETTINGS, averaging at 14-20 FPS.
> GPU MEMORY hit 1.5MB at constant rate. also, after ALT-TAB a few times in and out, Direct-X reported failure to load and crash BF3 and drop into windows.
> 
> this time, all settings at ULTRA except AA which is switch it off. run an improvement of averaging at 40-70 FPS.
> 
> as for temperature, after increasing some casing fan speed.
> Ambient: 29c
> At Load: 87c-95c
> 
> Fan speed was running at 70% when it hit 85c. will download the MSI program these few days to custom the fan profile. probably make it run 100% fan speed when it hit 80c.
> 
> on another note, BATTLEFIELD 3 GRAPHICS IS GORGEOUS!


i find it interesting that you're getting 60-100 fps.

i'm running a tad better specs than your system, and i've got 2 GTX 590s in SLI overclocked to 700/1400/1800, also running 3 monitors. i have to run BF3 at medium settings and only get 40 fps max, so i'm really wondering what you're doing differently than me.

after doing a lot of research this evening, i have found that the GTX 590's 1.5GB of VRAM per GPU is not used individually, but rather is shared between both cards. so in essence we only have 1.5GB of VRAM for both GTX 590s, and running the 3 displays maxes said VRAM out, which is why i'm getting such shoddy FPS. so please, may i ask what you are doing differently to me?

also, i've clocked my 2600k to 4.5GHz and my cpu idles along at 30% utilization


----------



## Arizonian

VRAM is mirrored not combined. Whether it's a dual GPU or two singles in SLI/Crossfire.
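To illustrate the point, here's a minimal sketch (the helper function is hypothetical, not from any driver API): with AFR-style SLI each GPU keeps its own copy of the frame data, so capacity does not add up across GPUs.

```python
# SLI/CrossFire mirrors the working set across GPUs: each GPU holds the
# same textures and buffers, so usable VRAM equals one GPU's pool.
def effective_vram_mb(vram_per_gpu_mb, num_gpus):
    """Hypothetical helper: usable VRAM is NOT vram_per_gpu_mb * num_gpus."""
    return vram_per_gpu_mb  # num_gpus copies exist, but they hold the same data

# A GTX 590 has 1.5GB (1536MB) per GPU; two cards in quad SLI (4 GPUs)
# still leave games with only 1.5GB of usable VRAM.
print(effective_vram_mb(1536, 4))  # 1536, not 6144
```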


----------



## chaosneo

Quote:


> Originally Posted by *Opiuz;15489465*
> i find it interesting that you're getting 60-100 fps.
> 
> i'm running a tad better specs than your system, and i've got 2 GTX 590s in SLI overclocked to 700/1400/1800, also running 3 monitors. i have to run BF3 at medium settings and only get 40 fps max, so i'm really wondering what you're doing differently than me.
> 
> after doing a lot of research this evening, i have found that the GTX 590's 1.5GB of VRAM per GPU is not used individually, but rather is shared between both cards. so in essence we only have 1.5GB of VRAM for both GTX 590s, and running the 3 displays maxes said VRAM out, which is why i'm getting such shoddy FPS. so please, may i ask what you are doing differently to me?
> 
> also, i've clocked my 2600k to 4.5GHz and my cpu idles along at 30% utilization


that is interesting... QUAD GPU & SSD is not a tad bit better, it's a lot better, not to mention you have a good PSU.

i have a few theories:

1. resolution-wise...
mine: 1680 x 1050 x 3
yours: 1920 x 1080 x 3
your quad setup is more than sufficient for that resolution, but the VRAM limit might be the issue; mine, at around 20% lower resolution than yours, might need less VRAM.

2. my score was from the tutorial campaign of BF3; that small map might also be why i got such good FPS. i will check FPS on different maps when i get more time to play.

3. multiplayer really stresses the GPU more than the campaign. maybe because there are more firefights?

maybe you can try lowering from 4xAA to 2xAA, keeping everything else at ULTRA, and see if there is any improvement? i am no expert but i think AA really consumes a lot of VRAM. also, in games like BF3, AA or no AA, i find that i don't have time to enjoy looking at scenery or tiles on the road as i am busy tagging or putting my cursor over others...









on another note,
after some internal organizing and cable management, temperatures for my GPU now hover at 86c to 89c, with ambient at 25c - 28c.

_note:
correction on my earlier post, GPU memory hit 1.5GB at ULTRA SETTINGS, not 1.5mb, sorry._


----------



## armartins

Quote:


> Originally Posted by *Opiuz;15489465*
> i find it interesting that you're getting 60-100 fps.
> 
> i'm running a tad better specs than your system, and i've got 2 GTX 590s in SLI overclocked to 700/1400/1800, also running 3 monitors. i have to run BF3 at medium settings and only get 40 fps max, so i'm really wondering what you're doing differently than me.
> 
> after doing a lot of research this evening, i have found that the GTX 590's 1.5GB of VRAM per GPU is not used individually, but rather is shared between both cards. so in essence we only have 1.5GB of VRAM for both GTX 590s, and running the 3 displays maxes said VRAM out, which is why i'm getting such shoddy FPS. so please, may i ask what you are doing differently to me?
> 
> also, i've clocked my 2600k to 4.5GHz and my cpu idles along at 30% utilization


besides the resolution difference there is the quad vs dual scaling, i.e. a driver issue. Just use the MSI AB OSD to show VRAM usage and report back. I think you're not hitting the VRAM wall just yet (add one more step of AA and you'll definitely hit it, and hurt your GPUs even further); if you were hitting it you'd see FPS stalling to 1-5.


----------



## Scorpion49

It's pretty....


----------



## Opiuz

Quote:


> Originally Posted by *Arizonian;15490250*
> VRAM is mirrored not combined. Whether it's a dual GPU or two singles in SLI/Crossfire.


i get that, my previous statements were incorrect, thanks.

additionally, i had TeamViewer running; exited that and sorted everything out.

running 120FPS multi-monitor at ultra settings. oh, and i disabled hyperthreading. i hope they fix this in patches, along with the friendly indicator on multi-monitor as well.


----------



## zerounleashednl

Hey guys, my GPU's are only at 50% most of the time in Battlefield 3...?

















By zerounleashednl at 2011-10-29


----------



## Wogga

*Scorpion49*, what is on the GPUs? thermal gum? what are the temps?


----------



## Scorpion49

Quote:


> Originally Posted by *Wogga;15496364*
> *Scorpion49*, what is on the GPUs? thermal gum? what are the temps?


Some kind of paste that came with the XSPC block; I decided to try it. Don't know temps yet, it was leak testing overnight.


----------



## Narong30

Hello guys,

Firstly, I would like to thank the founders of this forum for giving us the opportunity to share our problems and solve them. This is my first time asking a question here.

Let's start with the topic.

I own a GTX 590 card (GIGABYTE). When I play games with good graphics, I face a problem with flickering objects inside the game. It's something like this:




http://www.youtube.com/watch?v=Y_RHQkbDgsI&feature=related (I don't know what the problem is, flickering or what?)

I've tried several different drivers and I still have the flickering problem. Below is the list of drivers I've used:

Current : 285.62
Previous 1 : 285.38 Beta
Previous 2: 285.27 Beta
Previous 3: 280.26

The games I play that show the flickering problem:

1) Call Of Duty: Modern Warfare 2
2) Call Of Duty: Black Ops
3) Battlefield Bad Company 2
4) Battlefield 3

I put all settings at the highest, e.g. Battlefield at ultra with 1920 x 1080 resolution. I've also tried normal settings, but after 20-30 minutes of play I still get the error, so I believe the settings are not a big deal. A GTX 590 should be able to run any game at the highest settings, even at 2560 x 1600 resolution.

Lastly, when I was playing Battlefield 3, I got annoyed with the flickering and decided to stop playing and continue surfing the web. I then noticed an unusual symbol on my desktop. The symbol disappeared after a few minutes, so I decided to capture a picture of it and upload it here.

Take look guys :

Picture 1:









Picture 2:









For your information, it's the same symbol but in a different position. I tried to capture the screen using the Print Screen function, but the symbol doesn't show in the screenshot, even though it still appears on screen.

So guys, I hope someone is willing to help me out with this problem. Thank you for reading and replying; I love to troubleshoot problems and overcome them.

P.S. I'm starting to think that this is not a driver problem. Maybe it's a graphics card problem? Should I send it in for RMA? Is there any final test program I should run before sending it off for RMA?

*//EDIT//*

Here I've added GPU-Z screenshots so you can see the BIOS versions:
GPU 1









GPU 2:


----------



## rush2049

Narong30,

Looks to me like you have some bad RAM on your graphics card. That is what causes a block of pixels like that to go; it would be a block of RAM causing it, probably wherever the 590 chose to store its frame buffer.

It could also be a sign of overheating causing the bad RAM... check the temperatures in your case as well as on the video card. Use MSI Afterburner for the video card temperatures.

The BIOS versions are different because one GPU is the 'master' and one is the slave; that is normal.

If temperatures aren't the problem, and you didn't overclock to cause this, then an RMA is in your future.


----------



## Narong30

Quote:


> Originally Posted by *rush2049;15499177*
> Narong30,
> 
> Looks to me like you have some bad ram on your graphics card. That is what causes a block of pixels like that to go. It would be a block of ram that is causing it, probably wherever the 590 chose to store its frame buffer.
> 
> It also could be a sign of overheating causing bad ram.... check your temperatures in your case as well as on the video card. Use MSI afterburner for the video card temperatures.
> 
> The bios versions are different because one is the 'master' card and one is the slave, that is normal.
> 
> If temperatures aren't the problem, and you didn't overclock to cause this problem, then a RMA is in your future.


Hello,

Thanks for the reply. I have MSI Afterburner. The temps for GPU 1: minimum 42c, max 87c. For GPU 2: min 45c, max 93c. I asked Nvidia about this, but they told me the card can handle those temperatures without any problem; it's normal for a GTX 590. What's more, I use a HAF X... I guess I've already tried my best to give fresh air to my GPUs.

So maybe I need to send the GPU in for RMA... but the bad thing is, Gigabyte sucks







they'll take about 4 months to replace it or issue a credit note


----------



## Scorpion49

Quote:


> Originally Posted by *Wogga;15496364*
> *Scorpion49*, what is on GPUs? thermal gum? whats the temps?


Okay, update: temps at idle are 29-31°C; loaded on a Heaven 2.5 loop for 2 hours it got up to 43°C. I think that beats the 75°C+ I was getting before with a custom fan profile for sure.


----------



## Shinobi Jedi

Quote:



Originally Posted by *Masked*


The jump from my 980x to the 2600k was enough for me to notice.
The jump from the 2600k to the OC'd 2700k was enough for me to...

I can't comment on Ivy without some serious legal trouble but, I think the Z68 is a sturdy platform, especially from EVGA.

While eventually you might not be able to utilize 8 lanes of ram...Who in their right mind actually needs that much?

Ivy is supposed to have 1155 capable chips...I.E. the 3630k and with a bios update they'll be a simple PNP...

So, from my personal perspective...I think I made an educated decision...I think I made it at a good time and I'll be the first to tell you, I have a preorder for the 3630k ready to go.

Aside from that, I'm not buying BF3 until I get my 590 back which sucks because I actually have it on pre-order...That being said...I was 1 of the first @ 6am EST to order my SWTOR collector's edition from Amazon...Registered and I think overall, I was #50? ~ Glad I jumped on that ASAP.

Collector's editions go for more a couple years after...For example I have the statue from Killzone 3 and I've actually been offered a lot of $$$$ for it considering there are some knockoffs on ebay etc...

With Star Wars, you have a sure bet that it's going to appreciate regardless of how the game does because it's Star Wars, period.

So, TLDR: I'd buy a Z68 if I were you, the FTW is awesome...2700k maybe and just pop in a 3630k down the road.



'*Snip*' Ha! That was freakin' Hilarious!

I totally read you on the Z68 FTW and the 2700K. That was my original plan.

I wanted to get those and that Asus 27" VG278H 3D monitor with Lightboost.

But tips have sucked at work lately, so I haven't had much room after bills for either. At this point, I'm leaning towards the Asus if I have to choose, and then pushing the Z68 and 2700K down the line a ways. Or, get those and push the monitor.

But I really dig 3D gaming (whereas I don't like 3D movies for the most part), and when I went to that Sony marketing event for Uncharted 3, the 3D in it was really, really impressive. And my current 3D monitor doesn't have the HDMI 1.4 port that is needed for PS3 3D, whereas the Asus does. Plus it has the Lightboost technology, which seems really interesting as it will work for PS3 3D games or 3D Blu-rays (if there were one worth watching, save maybe Tron, though I haven't watched it yet to be honest).

I'm averaging around 90-105 fps in BF3 so far, though I just had a gnarly freeze of the game that required a hard reset. I'm assuming that's the code though.

What's weird is, and I haven't had a chance to test it more, that the other night in an MP match it didn't matter whether I set the video options to Ultra or Low; the game still hovered between 90-100fps.

And I was a bit frustrated that turning the settings down didn't provide any difference for me to gauge whether having those missing FPS instead of the eye candy would really help my play and progression.

So I was wondering, could it possibly be a RAM issue, with a web browser and Origin having to run concurrently with the game? Is 8GB not enough?

One of the DIMM slots on my X58 died and I just haven't felt like tearing my rig apart to RMA it, especially if it's not much longer before I tear it apart for a CPU/mobo upgrade. So I've been getting by on 8GB since I only use my rig to game and write.

But if BF3 needs more RAM, and tips stay low, then I'll probably RMA the mobo. I've still got my two gaming notebooks to get me by for the few days I'm down.

I'm curious to know what the other quad SLI 590 owners are getting in BF3, especially if you're running a new Sandy Bridge CPU. I'd love to know how much my CPU is really holding me back; it would definitely help make my decision easier on what to upgrade first.

Happy Halloween 590-Pimps!


----------



## Wogga

dunno what exact fps i have (should have turned the counter on) but on ultra settings i haven't had any troubles, freezes, glitches or anything else. but i play on a 60Hz monitor with vsync on.
only one problem i have: after 5-8 matches, sometimes shadows become acid green for a ms and then it's ok.
maybe it's worth mentioning that before gaming i fold for about 5-6 hours on all GPUs and on the CPU


----------



## CJL

Quote:


> Originally Posted by *Scorpion49;15503586*
> Okay update, temps at idle are 29-31*C, loaded on heaven 2.5 loop for 2 hours it got up to 43*C. *I think that beats* 75*C+ that I was getting before with a custom fan profile for sure.


You think? Nice job!


----------



## CJL

Quote:


> Originally Posted by *Wogga;15507011*
> ... before gaming i fold for about 5-6 hours on all GPUs and on CPU


Is it worth folding on the 590s? I thought they weren't officially supported; they had to be whitelisted or something.


----------



## Wogga

it's 57k+ PPD from the GPUs alone. i've heard that a 580 can do up to 17k... mine have done only 15k max, but they're almost not OCed. i should try at [email protected]


----------



## Narong30

Quote:



Originally Posted by *Wogga*


its 57k+ PPD from GPUs only. i've heard that 580 do up to 17k...mine have done only 15k max. but they're almost not OCed. i should try at [email protected]










What is folding?

What is the benefit?

And how do I do it?


----------



## Arizonian

Quote:



Originally Posted by *Narong30*


What is folding?

What is the benefit?

And how to do that?


*Project details, big picture*

*What is Folding@home? What is protein folding?*

Folding@home is a distributed computing project that, very simply stated, studies protein folding and misfolding. Protein folding is explained in more detail in the scientific background section.

*What is distributed computing?*

Distributed computing is a method of computer processing in which different parts of a program, or different portions of data, are processed simultaneously on two or more computers that are communicating with each other over a network or through the Internet.

*Who "owns" the results? What will happen to them? *

Unlike some other distributed computing projects, Folding@home is run by an academic institution (specifically the Pande Group at Stanford University's Chemistry Department), which is a nonprofit institution dedicated to science research and education. We will not sell the data or make any money off of it.

Moreover, we will make the data available for others to use. In particular, the results from Folding@home will be made available on several levels. Most importantly, analysis of the simulations will be submitted to scientific journals for publication, and these journal articles will be posted on the web page after publication. Next, after publication of these scientific articles that analyze the data, the raw data of the folding runs will be available for everyone, including other researchers, here on this web site.

*What has been "folded" so far, and how much have I folded?*

We keep many types of statistics of users and work accomplished in our Stats section. You can check your Individual stats, Team stats, and overall Project stats. Please also review the Results and Awards sections.

*What has the project completed so far? *

We have been able to fold several proteins in the 5-10 microsecond time range with experimental validation of our folding kinetics. This is a fundamental advance over previous work. Scientific papers detailing our results can be found in the Results section. We are now moving to other important proteins used in structural biology studies of folding as well as proteins involved in disease. There are many peer-reviewed papers published in top journals (Science, Nature, Nature Structural Biology, PNAS, JMB, etc.) that have resulted from FAH. Currently, the FAH project has published more papers than all of the other major distributed computing projects combined!

*Why not just use a supercomputer? *

Modern supercomputers are essentially clusters of hundreds of processors linked by fast networking. The speed of these processors is comparable to (and often slower than) those found in PCs! Thus, if an algorithm (like ours) does not need the fast networking, it will run just as fast on a supercluster as a supercomputer. However, our application needs not the hundreds of processors found in modern supercomputers, but hundreds of thousands of processors. Hence, the calculations performed on Folding@home would not be possible by any other means! Moreover, even if we were given exclusive access to all of the supercomputers in the world, we would still have fewer computing cycles than we do with the Folding@home cluster! This is possible since PC processors are now very fast and there are hundreds of millions of PCs sitting idle in the world.

*What are the minimum system requirements? *

All computers can contribute to Folding@home. However, if the computer is too slow (e.g. wasn't bought in the last 3-4 years or so), it might not be fast enough to make the deadlines of typical work units. A Pentium 3 450 MHz or newer equivalent computer (with SSE) is able to complete work units before they expire.

*Why should I update my Folding@home software to the current version? *

We are continuously improving the Folding@home software and adding new features. We release new versions to fix bugs reported by users and to help make the project run as smoothly as possible.

*How do the results get back to you? *

Your computer will automatically upload the results to our server each time it finishes a work unit, and download a new work unit at that time.

*How long does it take to finish a work unit? How do you measure a work unit?*

This varies, of course, on the speed of the computer and the size of the protein under study. Depending on the protein and the properties studied, different size work units may be used. The Project Summary page has information on particular proteins' sizes and the deadlines allotted for completion.

*How can I make sure my results are being sent back and used? How can I tell how much work I've processed? *

To find out data that has been reported back you can check the Stats page for Individual stats, Team stats, and overall Project stats. If your computer is returning data, you should see your username there along with the number of work units completed. If your name isn't there, and your screen saver or console version seems to be working fine, then either it hasn't finished a unit yet (this can take a few days, or even longer with an older computer), the list hasn't been updated yet. Check back in a day or two, and it should be there . . . as long as you remember correctly what you typed in for your donor user name.









*Can I run SETI@home when Folding@home is running? *

Yes, SETI@home and other distributed applications can be run alongside Folding@home, provided you have enough system memory. Some programs, including SETI@home, run at a higher priority than Folding@home, which prevents FAH from progressing if they run at the same time. If you notice the FAH client is not progressing, you can fix this by enabling the "Slightly Higher Priority" option in the FAH client. This can be done through the advanced options page for the Windows GUI client, or by running the console client with the -config switch.

Note: FAH work units are time sensitive, while work units from other projects typically are not time sensitive. Therefore we strongly recommend giving the FAH client a higher priority, or to dedicate a processor core to FAH while allowing other projects to use the remaining computer resources.

*Are there any limits to how long my machine can take to finish a work unit (WU)? *

Yes. Work Units are serial in nature. When a completed WU is sent back, a new work unit is generated from those results. This must happen many times over within each project (group of work units). A generation 1 work unit must be turned in before a generation 2 work unit is created and sent out.

To keep these generations moving along, we have to set expiration deadlines in the event a work unit is not uploaded in a timely manner (lost, deleted, whatever). These unfinished work units "expire" and are reassigned to new machines. You will still receive credit for all WUs completed and uploaded prior to the preferred deadline. However, after the preferred deadline, your contribution is not as useful scientifically because another copy of that work unit had to be sent out to another contributor. Even if you eventually complete the work unit, that other contributor still had to process duplicate work to assure the science moves forward. And it would be unfair not to also credit that second contributor.

Even so, full credit is given up until the final deadline. After the final deadline has expired, the client will automatically discard the work unit and download new work. If you have trouble completing work units before the preferred deadline, it is recommended to either run the FAH client more hours each day, or to run the client on a faster computer.

As we move to larger and longer WUs, we will extend the expiration time as needed. Deadlines vary on the order of a few days to several weeks, depending on the nature of the WU. Turning in a work unit just before the deadline is not the goal; it is most helpful to the project to return work units as quickly as possible. How these deadlines are determined is explained a few answers below.

*How much power/money does keeping a FAH running 24/7 on a computer use? *

Roughly, a CPU uses about as much power (watts) as a typical light bulb. Here's a report on computer power management from Lawrence Berkeley government labs, and there are other references on the web you can find. Although power supplies on most computers are rated at 400 watts, average usage is lower. On average, a Pentium-type computer uses about 100 watts (if the monitor is off). So, the daily difference between off and running FAH is about 24 x 100 = 2.4 kWh. At $0.15 per kWh (from PG&E here in California), this works out to about $0.36 per day. In general, lighting and climate control use a much larger share of household power than computers do. So the best bet for cutting costs and conserving energy would be to turn off lights, turn off your computer monitors (which use more power than a CPU), and turn down the heat.
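The arithmetic above can be sanity-checked with a quick sketch (the 100 W and $0.15/kWh figures are the FAQ's own examples, not measurements):

```python
# Rough daily electricity cost of folding 24/7, using the FAQ's numbers.
def daily_folding_cost(watts=100.0, price_per_kwh=0.15, hours=24):
    kwh = watts * hours / 1000.0   # 100 W for 24 h = 2.4 kWh
    return kwh * price_per_kwh     # 2.4 kWh * $0.15/kWh

print(round(daily_folding_cost(), 2))  # ~$0.36 per day, as the FAQ says
```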

*My monitor is set to turn off after a while. Can I still run the screen saver? *

Energy-saving features, which turn the monitor off after a specified period of time, do not affect the screen saver. As long as the computer is running, the screen saver will continue to run and accumulate useful data, even if the monitor is off.

*Does the screen saver use a lot of CPU time? *

The screen saver is designed to use very little CPU time. Even without any OpenGL hardware, the screen saver only uses about 5% of the CPU time for graphics. If you have some sort of OpenGL support on your graphics card, the CPU time becomes virtually zero. Since the space filling ("orb") drawing of the atoms can be seen in the process of drawing, it may look like it is taking a lot of processor space, but that delay is set by a timer, and does not occur because it takes a long time to draw.

*Will extra 3D hardware help make the screen saver go faster?*

3D acceleration cards will make a small difference for complex proteins. However, the accelerator does not aid in processing the proteins; it simply offloads the graphical workload from the CPU so it can fold faster.

*How do I shut down the screen saver? *

The screen saver is designed so that it shuts down on mouse click or key press. It will NOT shut down by simply moving the mouse. This is to prevent users from inadvertently closing down [email protected], since it takes a short while to start processing every time it starts up.


----------



## emett

Maybe pm him that one next time, or provide a link.


----------



## Arizonian

Quote:



Originally Posted by *emett*


Maybe pm him that one next time, or provide a link.


Sorry guys - guess that was a bit over the top/long.









Aren't you guys glad I didn't get the GTX 590 after all?







Should have provided the link.

Seriously though, I bought my EVGA GTX 580, and 80-85 days later the GTX 590 came out. I called EVGA and asked about the 'step-up' option not being available online, and they denied me, saying that going from a single GPU to a dual GPU is not eligible.

My only gripe about EVGA is that their 'step-up' should be exactly that within 90 days of purchase. If it's a performance upgrade within a single product line, it's a step up. I would have been glad to put down $300 more for that second 580 on one PCB.









Been reading/following along with this club thread since day one. Wishing my rig had one as well.

All this time has passed, and with the next bonus check in March, I'll be getting that second GTX 580 after all.

*_Arizonian goes back to quietly reading the club members' posts_*


----------



## Shinobi Jedi

Well, while getting incredible FPS on indoor maps like Metro in BF3, I seem to be averaging only 60fps in the short time I've gotten to play on the large outdoor maps.

What's my bottleneck? CPU? Ram?

I hope it's not somehow the 1.5gb VRam hitting some wall on even a 'single' monitor setup like mine.

Kinda bummin' about it, but I really haven't played much of them yet. I keep getting that freakin' PunkBuster bug. First they fracked up with Optimus on the M11x-R2, so I kept getting kicked out on that machine in BF:BC2, and now this on my pride and joy.

Good Times.

Why am I only getting 60fps on Ultra settings on outdoor maps? Is it my rig? Or the game? And what can I do to make it better?

Thanks again to all of you 590-Jedi....


----------



## chaosneo

i was going to get a new case & new PSU next year when the budget approves.

i was already eyeing the Silverstone Raven RV-01BW, and then did some research showing the RV02-E has better airflow and would help remove some of the incredible heat dispelled by the GTX 590. it then occurred to me that the Raven series rotates the motherboard 90 degrees, and that puts the exhaust side of the GPU facing downwards, expelling hot air against the intake fans. this might not be so good for GPU cooling?

does anyone here use a Raven case? i would like to know your experience.


----------



## Scorpion49

Quote:


> Originally Posted by *chaosneo;15516397*
> i was going to get a new case & new PSU next year when the budget approves.
> 
> i was already eyeing the Silverstone Raven RV-01BW, and then did some research showing the RV02-E has better airflow and would help remove some of the incredible heat dispelled by the GTX 590. it then occurred to me that the Raven series rotates the motherboard 90 degrees, and that puts the exhaust side of the GPU facing downwards, expelling hot air against the intake fans. this might not be so good for GPU cooling?
> 
> does anyone here use a Raven case? i would like to know your experience.


I've used an RV02; I would not get it for the 590 at all. The back vent of the card will be VERY nearly touching the fan grille, and it will force the hot air right back up into the GPU. I had 5870s when I had my RV02 and they were about 2mm away from the fan; good thing the PSU connectors were on top.


----------



## Swolern

Quote:


> Originally Posted by *Shinobi Jedi;15515921*
> Well, while getting incredible FPS on indoor maps like Metro in BF3, I seem to be only averaging 60fps on the short time I've gotten to play on the large outdoor maps.
> 
> What's my bottleneck? CPU? Ram?
> 
> I hope it's not somehow the 1.5gb VRam hitting some wall on even a 'single' monitor setup like mine.
> 
> Kinda bummin' about it, but really haven't played much of them yet. Keep getting that freakin' PunkBuster bug. First they frack up with Optimus on M11x-R2 so I kept getting kicked out on that machine and BF:BC2, and now this and my pride and joy.
> 
> Good Times.
> 
> Why am I only getting 60fps on Ultra settings on outdoor maps? Is it my rig? Or the game? And what can I do to make it better?
> 
> Thanks again to all of you 590-Jedi....


Well, BF3 is an amazing game, but it's got its share of bugs. It looks like we are going to have to fix them ourselves until DICE comes out with a patch. As for online connectivity, I found a good list of fixes to try (about halfway down the page): http://forum.ea.com/eaforum/posts/list/45/7657954.page?ClickID=arn5rlvprv9klllnkv0ys9ywozpyw5vwysrn

You have plenty of VRAM on the 590 for just one monitor. All your components look good, so there should be no bottleneck. I am getting about the same FPS as you with just one EVGA 590 / i5 2500k @ 3.3GHz. Most likely the issue is that BF3 is just not optimized for quad SLI yet.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Swolern;15516976*
> Well, BF3 is an amazing game, but it's got its share of bugs. It looks like we are going to have to fix them ourselves until DICE comes out with a patch. As for online connectivity, I found a good list of fixes to try (about halfway down the page): http://forum.ea.com/eaforum/posts/list/45/7657954.page?ClickID=arn5rlvprv9klllnkv0ys9ywozpyw5vwysrn
> 
> You have plenty of VRAM on the 590 for just one monitor. All your components look good, so there should be no bottleneck. I am getting about the same FPS as you with just one EVGA 590 / i5 2500k @ 3.3GHz. Most likely the issue is that BF3 is just not optimized for quad SLI yet.


Awesome. This is encouraging. I'm gonna check out that thread right now.


----------



## Smo

Quote:


> Originally Posted by *Scorpion49;15516424*
> I've used an RV02, I would not get it for the 590 at all. The back vent of the card will be VERY nearly touching the fan grille and it will force the hot air right back up the GPU. I had 5870's when I had my RV02 and they were about 2mm away from the fan, good thing the PSU connectors were on top.


I wouldn't say that the RV02 was too bad considering the 590 vents hot air back into it. It's not an ideal situation, sure - but with my card overclocked to 740MHz @ 0.988v with a custom fan profile the temperatures were still lower than stock.

For reference, this is how close the 590 sits to the fans;










You've got a good inch or so of clearance.


----------



## Narong30

Hello all,

I would like to ask here: is anybody using their GTX 590 to play Battlefield 3 (ultra) at 2560 x 1600 resolution?

What is the minimum and maximum fps? Do you feel any lag? Any bugs?

I really want to know; some people are saying that Nvidia cards perform better in Battlefield 3.







Any Idea?


----------



## Shinobi Jedi

Well, after finally playing for awhile, I'd say I'm somewhere between "pretty happy" and "psyched" with the performance I'm getting on BF3.

I still plan on upgrading though, as soon as the funds become available


----------



## Wogga

damn...I used to play BF3 all this time with SLI turned off -_-
Still capped at 60fps (vsync on).
Turning SLI on didn't improve anything except lowering GPU usage.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Wogga;15518247*
> damn...I used to play BF3 all this time with SLI turned off -_-
> Still capped at 60fps (vsync on).
> Turning SLI on didn't improve anything except lowering GPU usage.


Well if you're running a 60hz monitor and have Vsync on, it is going to cap at 60fps. Try turning off Vsync and see what you get then.
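For anyone curious why the cap lands exactly on those numbers: with plain double-buffered vsync, a finished frame is held until the next refresh, so the effective rate snaps to refresh/1, refresh/2, refresh/3 and so on. A quick sketch of that behaviour (hypothetical helper; adaptive vsync and triple buffering behave differently):

```python
def vsync_fps(refresh_hz: float, raw_fps: float) -> float:
    """Effective FPS under classic double-buffered vsync: the rate
    snaps down to the largest divisor of the refresh rate that the
    raw (uncapped) frame rate can sustain."""
    n = 1
    while refresh_hz / n > raw_fps:
        n += 1
    return refresh_hz / n

# A 60 Hz panel caps any higher raw rate to 60 fps:
print(vsync_fps(60, 140))  # 60.0
# ...and dipping just below 60 snaps down to the next divisor:
print(vsync_fps(60, 55))   # 30.0
```

That's also why a rig that can't quite hold 60 suddenly reads 30 with vsync on.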


----------



## Kariz-Matik

Quote:


> Originally Posted by *Narong30;15518062*
> Hello all,
> 
> I would like to ask at here. Anybody use their GTX 590 playing Battlefield 3 (ultra) with 2560 x 1600 resolution?
> 
> What is the minimum and maximum fps? Do you feel any lag? Any bugs?
> 
> I really want to know, some of people saying that Nvidia card perform better in Battlefield 3.
> 
> 
> 
> 
> 
> 
> 
> Any Idea?


I'm playing on a 30" Dell -- and at that resolution I get roughly 70-80 FPS, as of the last time I checked (from memory, anyway; I'll double-check tomorrow). I haven't played the Campaign yet, though, and I hear that for some reason you get better FPS in Campaign than you do in MP.

I hope that helps! What are you getting?

Oh and my keyboard doesn't have a print screen button. Does a case photo with the GTX 590 powering away suffice for proof? xD


----------



## ilukeberry

Quote:


> Originally Posted by *CJL;15453193*
> Can i join the Fan Spike club! Add me please...!!!! I will upload an mp3 audio recording as proof later.
> 
> Oh well, i'm not reverting back just yet. I'll continue to see how these drivers work out. I usually uninstall PhysX and the Display Driver from the CP, reboot and install the new drivers. For the first time i chose the clean install option. Also, i flashed my BIOS, went from 90 and 91 to 92 and 93.
> 
> So far 285.62 got me a DNR in 3DMark11 and Rage freezing my system right after launching from Steam (2nd launch after reboot worked). Now waiting for BF3 to install...


People with spike fan issue please make a post here with with your version of BIOS and GTX 590 vendor:

http://forums.nvidia.com/index.php?showtopic=210262&st=0&p=1313988entry1313988

I've just talked with ManuelG the NVIDIA Forums Technical Advisor and he said he will file a bug if some other people with this problem will post in that topic on NVIDIA forum.

ManuelG:
Quote:


> Thank you. I'll file a bug however if you do talk to other people complaining about this, could you please ask them to post here and include their board maker, VBIOS version and verify this only happens with R280.xx drivers or later:


----------



## Forty-two

I went to Vegas this weekend and had some luck playing poker. I won 2 small tournaments and had some cash burning a hole in my pocket, so I stopped by Fry's on my way out of town. I thought I would buy a cheap GPU so I could turn on the power on my new build while I continued my search for a 590. Guess what I found just waiting for me to spend some of my cash on: this Asus GTX 590! Can I join the club now?


----------



## Narong30

Quote:


> Originally Posted by *Kariz-Matik;15520625*
> I'm playing on a 30" Dell -- and at that resolution I get roughly 70-80 FPS, as of the last time I checked (from memory, anyway; I'll double-check tomorrow). I haven't played the Campaign yet, though, and I hear that for some reason you get better FPS in Campaign than you do in MP.
> 
> I hope that helps! What are you getting?
> 
> Oh and my keyboard doesn't have a print screen button. Does a case photo with the GTX 590 powering away suffice for proof? xD


How about the temperature?


----------



## Scorpion49

Quote:


> Originally Posted by *Forty-two;15522384*
> I went to Vegas this weekend and had some luck playing poker. I won 2 small tournaments and had some cash burning a hole in my pocket, so I stopped by Fry's on my way out of town. I thought I would buy a cheap GPU so I could turn on the power on my new build while I continued my search for a 590. Guess what I found just waiting for me to spend some of my cash on, this Asus GTX 590! Can I join the club now?


Looking nice! Good to have some cash after Vegas, I bet


----------



## Narong30

Quote:


> Originally Posted by *Kariz-Matik;15520625*
> I'm playing on a 30" Dell -- and at that resolution I get roughly 70-80 FPS, as of the last time I checked (from memory, anyway; I'll double-check tomorrow). I haven't played the Campaign yet, though, and I hear that for some reason you get better FPS in Campaign than you do in MP.
> 
> I hope that helps! What are you getting?
> 
> Oh and my keyboard doesn't have a print screen button. Does a case photo with the GTX 590 powering away suffice for proof? xD


I'm on 1920 x 1080

Yeah, I get FPS around yours


----------



## EM2J

My stock 590 in BF3 with everything maxed + vsync uses only about 50% of each GPU at 1920x1080. It's bored.....

Stays at about 70C max, even after being used for hours.


----------



## Kariz-Matik

Quote:


> Originally Posted by *Narong30;15525272*
> How about the temperature?


I haven't seen either card reach over 88C - Card 'A' sits at about 83C and the second gets up to 87C max, and on really big maps I sometimes see it hit 88C, and that's with both cards at 80-99% load. And I live in Australia -- so my ambient temp is sitting around 26-30C easily. The summer will see 40C, so that should be interesting.

I'm considering some more cooling - I don't know what yet, though. But the stock definitely sucks. Any news on when some custom air cooling comes out for it? Or am I dreaming?

[UPDATE]

Was just playing BF3 Campaign - Played for a solid 45 minutes and the highest the first GPU got was 76C and the second was 82C. I was getting about 60 FPS outside with settings on Ultra everything on 2560 x 1600.

I think I may have to get a smaller screen haha! Will having a smaller resolution / screen yield better FPS? I'm considering a 27" or maybe two 24"? Let me know, please!


----------



## Kariz-Matik

Does this suffice for proof?


----------



## EM2J

Quote:


> Originally Posted by *Kariz-Matik;15526687*
> I haven't seen either card reach over 88C - Card 'A' sits at about 83C and the second gets up to 87C max and on really big maps I sometimes see it hit 88C and that's with both cards under 80-99% load. And I live in Australia -- So my ambient temp is sitting around 26-30 easily. The summer will see 40, so that should be interesting.
> 
> I'm considering some more cooling - I don't know what yet, though. But the stock definitely sucks. Any news on when some custom air cooling comes out for it? Or am I dreaming?
> 
> [UPDATE]
> 
> Was just playing BF3 Campaign - Played for a solid 45 minutes and the highest the first GPU got was 76C and the second was 82C. I was getting about 60 FPS outside with settings on Ultra everything on 2560 x 1600.
> 
> I think I may have to get a smaller screen haha! Will having a smaller resolution / screen yield better FPS? I'm considering a 27" or maybe two 24"? Let me know, please!


Yeah, you'll definitely get better FPS on a monitor with a 1920x1080 resolution.

Pick one of these up: http://www.newegg.com/Product/Product.aspx?Item=N82E16824016155&nm_mc=EMC-IGNEFL110111&cm_mmc=EMC-IGNEFL110111-_-EMC-110111-Index-_-LCDMonitors-_-24016155-L011A

I've got one and it's fantastic, especially for the price. An extra $40 off too with this promo code: *EMCJJKJ46*


----------



## RagingCain

Hey guys, just dropping a comment: nVidia drivers > ATi Catalyst, still.


----------



## Arizonian

Quote:



Originally Posted by *RagingCain*


Hey guys just dropping a comment nVidia Drivers > ATi Catalyst still.










I never had troubles with 3DFX.









After 3870, 4870, 5870, & 6870, up through the 10.2 drivers, I thought driver problems were just part of owning discrete graphics cards.

My 580 changed all that....you're preaching to the choir with me.


----------



## chaosneo

>> Kariz-Matik
Hi, I love your setup!








was thinking of getting the Raven 02 early next year & it probably would look quite similar to your setup.

was wondering, what is your ambient, idle & load temperature for your CPU and GPU?


----------



## Kariz-Matik

Quote:


> Originally Posted by *EM2J;15530226*
> yeah you'll definitely get better fps on a monitor with a 1920x1080 resolu.
> 
> pick one of these up http://www.newegg.com/Product/Product.aspx?Item=N82E16824016155&nm_mc=EMC-IGNEFL110111&cm_mmc=EMC-IGNEFL110111-_-EMC-110111-Index-_-LCDMonitors-_-24016155-L011A
> 
> i've got one and it's fantastic, especially for the price. extra 40$ off too with this promo code *EMCJJKJ46*


Thanks man! I've actually got a Samsung 27" sitting in my cupboard. I may give that a go - it'll be sad to see this Dell 30" go, though. It's great for work!
Quote:


> Originally Posted by *chaosneo;15536438*
> >> Kariz-Matik
> hi i love your setup
> 
> 
> 
> 
> 
> 
> 
> 
> was thinking of getting the Raven 02 early next year & it probably would look quite similar to your setup.
> 
> was wondering, what is your ambient, idle & load temperature for your CPU and GPU?


Thanks man! For my CPU I have a H80 on it and when "Idle" it's anywhere from 25-30C (I've never actually waited 15 minutes to try it, but just general music playing / surfing the web, those are my temps) -- And under duress I've seen it hit a max of 55C on one core, the other 3 sit in the high 40's -- And I'm running programs that actually use them all (Video / Photo editing).

As for GPU -- at the moment I'm seeing my GTX 590 in BF3 get up to 88C on one GPU and around 83C on the other -- those are max temps. I've just tried putting on a custom fan profile that another member was kind enough to let me in on, so we'll see how that fares. But my room temp is around 26-30C usually (I live in Australia ... sad face!) -- hence the slightly higher temps.

Hope that helps!
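For anyone who wants to roll their own profile, a custom fan curve is really just linear interpolation between (temperature, fan %) points -- same idea as the curve editor in Afterburner/Precision. The numbers below are made up for illustration, not anyone's actual profile:

```python
def fan_speed(temp_c, curve):
    """Linearly interpolate a fan curve given as (temp C, fan %) points,
    clamping to the first/last point outside the curve's range."""
    curve = sorted(curve)
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

# Hypothetical aggressive 590 profile: quiet at idle, ramp hard past 70C.
profile = [(40, 35), (60, 50), (70, 65), (85, 100)]
print(fan_speed(75, profile))  # partway between 65% and 100%
```

The trade-off is the usual one: steeper points past the comfort zone buy lower peak temps at the cost of noise.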


----------



## Shinobi Jedi

Quote:



Originally Posted by *RagingCain*


Hey guys just dropping a comment nVidia Drivers > ATi Catalyst still.










Ha, Raging! Good to see you still dropping by! How are you liking the new cards, drivers aside?


----------



## RagingCain

Quote:


> Originally Posted by *Shinobi Jedi;15541924*
> Ha, Raging! Good to see you still dropping by! How are you liking the new cards, drivers aside?


Meh, it's one of Levesque's cards. It's an alright card, but meh to me. There are quite a few issues in BF3.

WoW plays about the same. That's about all I have tested. Image quality looks lower to me, and AA doesn't always turn on but definitely lowers performance, so....

Sent from my DROID X2 using Tapatalk


----------



## Shinobi Jedi

So, I had a pretty gnarly crash on BF3 last night. Second time the game has frozen so bad with the sound going screeching and fubar so that the only remedy was a hard reset on the case.

The first time, everything rebooted back up fine.

Then last night, it booted and the cards drivers weren't recognized and I was staring at a 800x600 resolution screen.

I was ready to tear everything apart, but went ahead and rebooted right away one more time and now everything is back to normal.

I'm looking for an excuse to upgrade my CPU & mobo even though I really don't "need" it. If its something with my cards, I've got my warranty and haven't overclocked them, so I'm not worried about getting them fixed/replaced but I know the down time will suck, though I've got my notebooks to get me through.


----------



## Recipe7

Quote:



Originally Posted by *Shinobi Jedi*


So, I had a pretty gnarly crash on BF3 last night. Second time the game has frozen so bad with the sound going screeching and fubar so that the only remedy was a hard reset on the case.

The first time, everything rebooted back up fine.

Then last night, it booted and the cards drivers weren't recognized and I was staring at a 800x600 resolution screen.

I was ready to tear everything apart, but went ahead and rebooted right away one more time and now everything is back to normal.

I'm looking for an excuse to upgrade my CPU & mobo even though I really don't "need" it. If its something with my cards, I've got my warranty and haven't overclocked them, so I'm not worried about getting them fixed/replaced but I know the down time will suck, though I've got my notebooks to get me through.


Same thing happened to me. I try not to worry about it


----------



## Wogga

The same thing happened to one of the members of our boards, but he still can't solve the problem - the card shows the BIOS for only one GPU, and flashing doesn't help.


----------



## Shinobi Jedi

Okay, cool. While it sucks for all of us, it helps knowing I'm not the only one.


----------



## MKHunt

My friends I need your help. I have all my parts back from RMA for my sig rig. I have had them for two days. It's still in pieces. What will motivate me to reassemble it?

Can you?

The Koolance block was so full of grease from the factory that I need to reclean _everything_ in my WC loop... even the tubes.

But if I don't put the grease back, do the chances of a leak go up?

If my parts sit disassembled, then there's no freak chance of frying my stuff again.

Also, I only have the pink Koolance block pads left. Can I use those? Do I need to order and cut more grey pads?


----------



## Scorpion49

Quote:


> Originally Posted by *MKHunt;15555499*
> My friends I need your help. I have all my parts back from RMA for my sig rig. I have had them for two days. It's still in pieces. What will motivate me to reassemble it?
> 
> Can you?
> 
> The Koolance block was so full of grease from the factory that I need to reclean _everything_ in my WC loop... even the tubes.
> 
> But if I don't put grease do the chances of a leak go up?
> 
> If my parts sit disassembled then there's never a freak chance frying my stuff again.
> 
> Also, I only have the pink Koolance block pads left. Can I use those? Do I need to order and cut more grey pads?


You better get on it! Expensive parts sitting around don't do you any good. Tell you what, send it all to me and I'll, uh, take care of it for you


----------



## mojobear

hey guys,

i have a quick question for any gtx 590 owners with an XSPC waterblock.

Okay, so I recently bought a second gtx 590 with a second XSPC block for quad sli...but unfortunately for some odd reason the GPU closest to the display port temp is always much higher (60-70 C versus 40 C). I have tried RMAing the waterblock...same issue...Ive tried RMAing the GTX 590...same issue....I am at a loss. There must be something stupid I am doing here....but I cannot figure out what!

Thanks in advance...and I am getting so confused.


----------



## mojobear

just to give u some more info...the blocks are run in parallel with koolance VID connectors between the two.

For some odd reason its just this one GPU....hopefully someone can offer some insight cause I think I am going crazy :S


----------



## Wogga

Improper mounting, i.e. air pockets between the GPU and waterblock,
or not enough pressure/flow from the pump.


----------



## Masked

Quote:


> Originally Posted by *mojobear;15562625*
> hey guys,
> 
> i have a quick question for any gtx 590 owners with an XSPC waterblock.
> 
> Okay, so I recently bought a second gtx 590 with a second XSPC block for quad sli...but unfortunately for some odd reason the GPU closest to the display port temp is always much higher (60-70 C versus 40 C). I have tried RMAing the waterblock...same issue...Ive tried RMAing the GTX 590...same issue....I am at a loss. There must be something stupid I am doing here....but I cannot figure out what!
> 
> Thanks in advance...and I am getting so confused.


Quote:


> Originally Posted by *Wogga;15566432*
> Improper mounting, i.e. air pockets between the GPU and waterblock,
> or not enough pressure/flow from the pump.


If you've RMA'd the hardware and the same issue persists, it's *you* boyo.

I'd guestimate that your flow can't push the bubble through the block and you have an open res so there's no constant pressure so, basically, TLDR, shake the living crap out of your tower...It'll knock the bubble out.

Also realize that you've already gone through 3 cores of constant heat by the time it hits the 4th core so, your 3rd core may indeed be running significantly hotter than the others...


----------



## mojobear

Thanks guys, I wish it was that simple. I've remounted I don't even know how many times... at least more than 10. Changed it from parallel to serial, changed positions so the hot GPU would be second in the flow through the GPUs... but still the same core has high temps. I noted that the choke coils were different from the first GTX 590 I got, and that the XSPC block in general sits higher than on the other GTX 590 (once again... I have two GTX 590s, both with XSPC blocks). Thus, I wonder if other recent XSPC owners, such as Scorpion, have had any of my issues.

Could it be that GPU-Z is picking up one of my VRM temps or something, mistaking it for my GPU core?


----------



## mojobear

Also, Masked, every time I change up the GPUs I have to drain my loop, so I don't think it's a bubble issue.


----------



## Masked

Quote:


> Originally Posted by *mojobear;15569261*
> Also, Masked, every time I change up the GPUs I have to drain my loop, so I don't think it's a bubble issue.


...I've run my rad box through 15 systems this morning, 15/15 had bubbles in the cards until they were shaken out.

Been watercooling for a very long time, mate, and Koolance blocks are infamous for bubbles in those 2 corners.

Give your desktop a shake and watch in wonder as your pump starts to rev, then a couple bubbles pulse through the loop.

My understanding of the Koolance 590 blocks is that there was a 2nd revision...If you re-seat and the block still sits much higher, then you have a rev1//rev2...However, that still won't make the difference you claim.

You've swapped the hardware and the block so, the //ONLY// factor left that's undecided is your setup.

My best guess, as I said before, is that the bubbles in your CPU are being pushed out and forced into the 2nd koolance block...Not having a closed res means that there isn't enough pressure to force-feed the bubbles out...You could wash out your system 100 times and still have bubbles in there.

You also have to realize that in your loop, if it's non-serial, you're going 1core, 2core, 3core, 4core...The 4th core is ALWAYS the hottest because you've changed the delta through the 3 cores...

I should have power by Tuesday... I really hate this state with a burning passion.
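To put rough numbers on the delta argument: coolant picks up dT = P / (m_dot * c_p) across each block, so in a serial loop the last core always sees the warmest water. A back-of-the-envelope sketch with hypothetical wattages and flow rate, plain water properties:

```python
def coolant_temp_at_each_block(t_in_c, watts_per_block, flow_lpm):
    """Coolant temperature entering each block in a SERIAL loop.
    Water: c_p ~ 4186 J/(kg*K), ~1 kg per litre. All numbers here are
    illustrative -- actual GPU power and pump flow vary a lot."""
    cp = 4186.0                      # J/(kg*K)
    mass_flow = flow_lpm / 60.0      # kg/s (1 L of water ~ 1 kg)
    temps, t = [], t_in_c
    for w in watts_per_block:
        temps.append(round(t, 2))
        t += w / (mass_flow * cp)    # dT = P / (m_dot * c_p)
    return temps

# Four ~180 W GPU cores on a 1.0 L/min loop:
print(coolant_temp_at_each_block(30.0, [180] * 4, 1.0))
```

At those numbers each block only warms the water a couple of degrees, so a 20-30C difference between cores can't be loop order alone -- which points back at mounting, contact or trapped air.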


----------



## Xyphyr

I'll post a picture once I get it via USPS. :3


----------



## AlExAlExAlEx

_*Some PROOF that you own the card (photo, screenshot etc. preferably with your OCN name)_








_*Please include the BRAND of your card_
Gigabyte
_*Please include the CLOCKS you are running your card at. With proof please._
Stock


----------



## Smo

Quote:



Originally Posted by *AlExAlExAlEx*


_*Some PROOF that you own the card (photo, screenshot etc. preferably with your OCN name)_








_*Please include the BRAND of your card_
Gigabyte
_*Please include the CLOCKS you are running your card at. With proof please._
Stock


Welcome to the club, dude - I'd seriously consider redoing your cables though! GPU #2 will cook with them like that.


----------



## Arizonian

Quote:



Originally Posted by *Smo*


Welcome to the club dude - I'd seriously consider remanaging your cables though! GPU #2 will cook with them like that.


AlExAlExAlEx - 100% this. You've got a great rig; cable management can't be stressed enough. Better airflow is the second reason, OCD for clean rig pics a third.









Edited to add: Just noticed this was your first post on OCN - Welcome to both the GTX 590 Club and to OCN!


----------



## Scorpion49

Quote:



Originally Posted by *Arizonian*


AlExAlExAlEx - 100% this. You've got a great rig, cable management can't be stressed enough. Better air flow second reason. OCD for clean rig pics a third.










Yep, also why is it such a low slot?


----------



## Tept

I honestly don't think I ever showed proof I have a 590, LOL.


----------



## Shinobi Jedi

So I seem to be getting incredible performance in SP BF3, almost always above 120fps.

MP seems to vary, with FPS all over the place. I also wonder how much server stuff can affect FPS? For example, on one server I played Metro on, I got sick FPS, almost always above 120fps. Then on a different server, it hovered around 90fps. Of course, different games have different results/things going on, so maybe it was that too?

I never dipped below mid 60's, but some outdoor maps playing last night it didn't go above the 60's. To be fair, it didn't stop me from having fun or playing any better (or worse) than I normally do.

I really wish we could get some Quad SLI profiles going or just 590 focused drivers.

Also, I did notice that my cards were bumping up against the 1.5gb vram limit on some of the larger maps.

Is it possible this game needs even more VRam at 1920x1080?

ETA: Who's buying Modern Warfare 3 on day one? With there not being any pre-order bonuses that seem worth having a la BF3, and Batman: AC coming out a week later, I think I'm actually probably going to hold off on this one for awhile. I'll get it, and I know it'll never go on sale until a week before MW4 comes out. But if the reports on how they treated IW are true, then I don't see any reason to contribute to MW3's opening sales and add to the misbegotten reputation that, because it's been more financially successful than BF3, it is somehow better.

I am actually interested in the SP campaign and the adventures of Soap, but my inside sources at Activision say not to expect much from the story. It's kind of a mess, because Activision had 3 development houses working on different parts of the game at the same time without communicating with each other, so the writing/story team really struggled to come up with a cohesive narrative to connect it all together. I have diminished expectations for the SP story now, but I wouldn't take that to mean it's going to suck; the writing/story team there are very talented, good people who are real gamers and skilled narrative writers.


----------



## emett

Quote:


> Originally Posted by *Tept;15583689*
> I honestly dont think I ever showed proof I have a 590 LOL.


Me neither.


----------



## Wogga

Shinobi Jedi, how did you notice a lack of VRAM? I don't have any problems, even on 64-player maps at a slightly higher res (1920*1200).


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Wogga;15589963*
> Shinobi Jedi, how did you notice a lack of VRAM? I don't have any problems, even on 64-player maps at a slightly higher res (1920*1200).


I'm not saying there's a lack of VRam exactly, I'm "asking".

But I use EVGA Precision Pro, which is the same as MSI Afterburner, and I run it on my Logitech G13 LCD screen and can watch the numbers in real time.

The most I've seen it hit is 1450+ but I've never seen it hit 1500.

It might just be how it's being buffered, the same way RagingCain explained about Crysis 2.

Of course, there are some 580 3GB owners who are insisting the game needs over 2GB of VRAM, the same way they did with Crysis 2.









So, I thought I'd get a second, unbiased opinion here from someone like Masked or Raging or Rush.

I just got done playing again and my FPS are all over the place between 60-120fps. But everything is still smooth turned all the way up. So I'm pretty happy if I don't pay attention to the FPS readouts.
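If you'd rather log VRAM numbers from a script than eyeball a readout, polling the driver works. This assumes a driver that ships the `nvidia-smi` command-line tool (an assumption -- it wasn't something these 2011-era desktop drivers exposed, which is exactly why Precision/Afterburner were the go-to); the sketch also accepts canned text so the parsing can be shown without a GPU:

```python
import subprocess

def gpu_memory_used_mib(sample_output=None):
    """Return per-GPU VRAM usage in MiB. With no argument, shells out to
    `nvidia-smi`; pass sample_output to parse pre-captured text instead."""
    if sample_output is None:
        sample_output = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"], text=True)
    return [int(line) for line in sample_output.strip().splitlines()]

# Canned output for a dual-GPU card like the 590 (hypothetical numbers):
print(gpu_memory_used_mib("1450\n1448\n"))  # [1450, 1448]
```

Logging this once a second during a round would show whether the card really is riding the 1.5GB ceiling on the big maps.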


----------



## Masked

Quote:



Originally Posted by *Shinobi Jedi*


I'm not saying there's a lack of VRam exactly, I'm "asking".

But I use EVGA Precision Pro, which is the same as MSI Afterburner, and I run it on my Logitech G13 LCD screen and can watch the numbers in real time.

The most I've seen it hit is 1450+ but I've never seen it hit 1500.

It might just be how it's being buffered, the same way RagingCain explained about Crysis 2.

> Of course, there are some 580 3GB owners who are insisting the game needs over 2GB of VRAM, the same way they did with Crysis 2.









So, I thought I'd get a second, unbiased opinion here from someone like Masked or Raging or Rush.

I just got done playing again and my FPS are all over the place between 60-120fps. But everything is still smooth turned all the way up. So I'm pretty happy if I don't pay attention to the FPS readouts.










I don't think so...Just like I didn't think so with Crysis 2.

I still don't have power so I can't give you a definitive answer but, over my 4 glorious days of playing the retail version... VRAM usage never went above 80% on ultra...

We haven't really hit the buffer wall where you need 2gb+ on a single monitor but, I imagine it's coming around the corner.


----------



## grifers

Any links to user reviews of this card? Game benchmarks at 1080p etc. (Crysis, Crysis Warhead, Just Cause, Metro, Far Cry 2, etc. -- game benchmarks, not synthetic tests like 3DMark or Unigine), from users of this forum (or another forum) only, but no web reviews.

Thanks, and sorry for my English!


----------



## Wogga

*here is* something like what you need.


----------



## grifers

Quote:


> Originally Posted by *Wogga;15606955*
> *here is* something like what you need.


Thanks... but those are quad SLI vs. quad CrossFire results, hehe. I need single GTX 590 results.







.

Anyway thanks!!. Bye!


----------



## [email protected]

My new computer and Asus GTX 590.


----------



## Evil_elf

Hi,

I'm new to the forum and got a question about a gtx590 card.

Does any of you guys know if there are serious shipment delays with Gainward?
The thing is... I ordered a GTX 590 on 16 September, together with a new mobo, RAM and CPU, but as I'm writing this my GTX 590 still hasn't arrived.

My CPU, RAM and mobo are all ready to be picked up, but a delivery delay of nearly 2 months for a gfx card??
According to the computer shop, Nvidia has problems delivering the necessary chips to Gainward, so they can't produce.
Contacting Gainward didn't shed light on the case, and they told me to order the card somewhere else. Problem is... that supplier is the only one for my country, and all the neighbouring countries are out of stock too, so I'd need to pre-order there as well.

I really want the Gainward card... but 6 weeks and counting is a bit too much, I think.

Did any of you guys who recently got his card get these kinds of delays?
Can any of you confirm that Nvidia has trouble delivering the chips to Gainward?

I had hoped to get the card before the end of last month; the way things are looking now, it will be an early Christmas present :/

Future rig when I have the card will be:
1*Gainward nvidia GTX590 3GB CUDA
1*Asus 1366 Rampage III Extreme X
2 Corsair DDR3 12GB 1600 Vengeance 3x4GB
1 INTEL Core I7 Extreme 990x 32nm 6 Cores 12 Threads 3.4ghz ~ 3.6ghz 12mb Lga 1366 Boxed

And yes, I will post proof of that


----------



## rush2049

Quote:


> Originally Posted by *Evil_elf;15610406*
> Hi,
> 
> I'm new to the forum and got a question about a gtx590 card.
> 
> Does any off you guys know if there are serious shipment delays with Gainward?
> The thing is...I orderd a GTX590 on 16 september together with a new mobo, ram and cpu but as I'm writing this, my gtx-590 still hasn't arrived.
> 
> My cpu, ram and mobo are all ready to be picked up, but a delivery delay from nearly 2 months for a gfx-card??
> According to the computershop nvidia should have problems delivering the necessary chips to gainward so they can't produce.
> Contacting gainward didn't shed light to the case and they told me to order the card somewhere else. Problem is... That supplier is the only for my country and all neighbouring country's are out off stock also so I need to pre-order there also.
> 
> I really want the gainward card...but 6 weeks and counting is a bit to much I think.
> 
> Did any off you guys who recently got his card get this kinda delays?
> Can any of you confirm that Nvidia has trouble delivering the chips to Gainward?
> 
> I had hoped to get the card before the end off last month, like things are looking now it will be a early christmas present :/
> 
> Future rig when I have the card will be:
> 1*Gainward nvidia GTX590 3GB CUDA
> 1*Asus 1366 Rampage III Extreme X
> 2 Corsair DDR3 12GB 1600 Vengeance 3x4GB
> 1 INTEL Core I7 Extreme 990x 32nm 6 Cores 12 Threads 3.4ghz ~ 3.6ghz 12mb Lga 1366 Boxed
> 
> And yes, I will post proof off that


***begin echo of info***
The gtx 590 is/was a limited run of cards.

There are no new cards being produced.

What is available right now, is all that there will be.

Second hand / hoarders will probably be your best bet. Check Ebay and other such sites.
***end echo of info***


----------



## TerrabyteX

Rush, what are your claims based on? That no more 590s are being produced? I want to raise cash to buy a second one in March....


----------



## MKHunt

No 590s are being produced any more. Limited run. Any new 590s you can buy now are just released RMA stock.


----------



## Arizonian

Quote:



Originally Posted by *grifers*


Any links to user reviews of this card? Game benchmarks at 1080p etc. (Crysis, Crysis Warhead, Just Cause, Metro, Far Cry 2, etc. -- game benchmarks, not synthetic tests like 3DMark or Unigine), from users of this forum (or another forum) only, but no web reviews.

Thanks, and sorry for my English!


 Team Green vs. Team Red - The Friendly OCN Battle Royale  Dirt 3 & Metro 2033


----------



## 2slick4u

Is it worth it to go for the GTX 590 right now? I have a GTX 580 Lightning Xtreme Edition 3GB.


----------



## Arizonian

Quote:


> Originally Posted by *2slick4u;15613516*
> is it worth it to go for the gtx 590 right now? i have a gtx 580 lightning xtreme edition 3gb?


I'm not a 590 owner, but the most practical option for you would be to SLI that bad boy. More VRAM, and highly overclockable with the great air cooling on that 580 3GB Lightning.


----------



## 2slick4u

Oh I know, but I don't plan on SLI. So would it be better if I went with a dual-GPU card over a single one if I don't plan on SLI? Or will my 580 be powerful enough for a long time without SLI? The only reason I'm looking into the 590 is so I don't need to buy a second card...


----------



## L1eutenant

Quote:


> Originally Posted by *2slick4u;15613955*
> only reason i look into 590 so i dont need to buy a second card..


What? lol, buying a second 580 should be cheaper than buying a 590 and throwing out your current 580


----------



## 2slick4u

Quote:


> Originally Posted by *L1eutenant;15614095*
> What? lol, buying a second 580 should be cheaper than buying a 590 and throwing out your current 580


Well, I mean I can sell my Lightning Xtreme for a decent amount of money since it's pretty new, and I don't really want to have 2 cards in my system. I don't have much room in my case, and I have the majority of my slots used up already...


----------



## L1eutenant

Quote:


> Originally Posted by *2slick4u;15614111*
> well i mean i can sell my lightning xtreme for decent amount of money since its pretty new and i dont really want to have 2 cards in my system i dont have much room in my case i have majority of my slots used up already also..


Right, that information is helpful.

2 x 580 will get higher results than a single 590

If i could go back and pick 2 x 580's over a 590 i would


----------



## 2slick4u

if i were to get another 580..would i have to get another lightning xtreme or can i get a reference or just regular lightning?


----------



## Arizonian

Quote:


> Originally Posted by *2slick4u;15614253*
> if i were to get another 580..would i have to get another lightning xtreme or can i get a reference or just regular lightning?


If you get a reference, you'll be limited to the reference clocks and VRAM.

Example: if you mix a 3GB VRAM card with a 1.5GB VRAM card, you'll essentially only have 1.5GB of usable VRAM in games.

If you get a reference card that doesn't overclock as well as your Lightning, you'll have to run *both* at the lower clocks of that reference card to stay stable.

DO NOT GET A REFERENCE. Get another 3GB Lightning, or stay with the one you have until it's met its match down the road.

That's my advice. Good luck in whatever you decide.


----------



## 2slick4u

Quote:


> Originally Posted by *Arizonian;15614288*
> You get a reference and you'll be limited to the reference clocks and VRAM.
> 
> Example, if you mix a 3GB VRAM card with a 1.5GB VRAM card essentially you'll only have 1.5GB VRAM usage in games.
> 
> IF you get a reference card that dosen't over clock well as your Lightning, you'll have to run *both* at the lower clocks of that reference card to stay stable.
> 
> DO NOT GET A REFERENCE. Get another 3GB Lightning or stay with the one you have until it's met it's match down the road.
> 
> That's my advice. Good luck in what ever you decide.


Thanks! I'll stick to one card for now... If I get another card I won't be able to use my sound card. The Lightnings are too big; I had my sound card in the slot below my Lightning, and the fan hits the top of the PCB on the sound card.


----------



## NecroPS3

http://valid.canardpc.com/show_oc.php?id=2088276

EVGA Classified 590 sli stock


----------



## MKHunt

Haven't tested this whitebox 590 I got from the RMA. Should I just waterblock it and be ready for Skyrim, or should I keep looking for someone to let me test it in their PC first? None of my friends have cases that can hold it, or they have time conflicts.

Doesn't EVGA test them before shipping? So I should be fine and can start cutting these effing thermal pads, right? These thermal pads better be made of gold and silver for being $19.


----------



## Shinobi Jedi

So, I impulse bought MW3. I haven't tried it yet. I'm not expecting much from it. I pretty much got it for SP like I said before, but we'll see.

I asked Jacob over at EVGA if they'd do an SLI profile for BF3 to help Quad SLI and SLI. He replied there's a new patch around the corner to address the issues. We'll see.

While I'm sure I'm stating the obvious to those of you in the know, I can say that ping is definitely affecting my FPS. When I'm on a server with super low ping, the same outdoor maps where I was only getting 60fps hover around 100-120fps.

Honestly, I'm at the point where I just turn my Vsync on and forget about it and just get absorbed into the game.

I haven't had a chance to try it properly yet, but since BF3 supports the XBox 360 controller for Windows in MP, guess what I'm switching to every time I get a chance to fly a jet or a helicopter?









While I'm sure it'd be less optimal for well trained KB/M veteran BF pilot players, it totally feels like an advantage when you're in game and happen to jump into either one with little to no experience.

Word of advice though: Do not press the "A" button unless you really need to eject out of your aircraft. I learned that one the hard way..









Good Times.


----------






## kzinti1

Is this proof enough of ownership?
http://www.techpowerup.com/gpuz/6mn5w/
My name and OCN is on it.


----------



## Masked

Quote:



Originally Posted by *kzinti1*


Is this proof enough of ownership?
http://www.techpowerup.com/gpuz/6mn5w/
My name and OCN is on it.


No.

JK...

Or am I?

~ So, I came by a weird issue yesterday and the only reason I noticed it is because I was sitting directly across the room staring at Jason's screen while he played BF3 (I'm a great "boss", aren't I?)...

Anyway, he has a single 590...And I ritually stare at his screen because he's directly across from me, facing the wall...

He was playing BF3 and occasionally there would be a green flicker shoot through his screen particularly when he looked into the light or side-armed his weapon from a scoped position...

I can't comment if I have this issue at home or not but, I know it USED to occur because tessellation hadn't computed and there was a gap of 1-2ms where you'd see a geometrical flicker.

And to clarify... I see it out of the corner of my eye because, truth be told, it's bright as can be; but that actually hasn't been an issue with any Nvidia cards for a while...

When I talked to the interns about it as a group, they said it's been an issue for a while but, it wasn't really enough to complain about...

Anyone know what's up with that?


----------



## Wogga

I've had it a few times. The last 2-3 days I haven't seen this green stuff.


----------



## MKHunt

So yesterday.... I found a surprise.



RMA^2! EVGA was cool about it though.


----------



## CJL

That's the new GTX 545


----------



## 2slick4u

EVGA Classified GTX 590


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> No.
> JK...
> Or am I?
> ~ So, I came by a weird issue yesterday and the only reason I noticed it is because I was sitting directly across the room staring at Jason's screen while he played BF3 (I'm a great "boss", aren't I?)...
> Anyway, he has a single 590...And I ritually stare at his screen because he's directly across from me, facing the wall...
> He was playing BF3 and occasionally there would be a green flicker shoot through his screen particularly when he looked into the light or side-armed his weapon from a scoped position...
> I can't comment if I have this issue at home or not but, I know it USED to occur because tessellation hadn't computed and there was a gap of 1-2ms where you'd see a geometrical flicker.
> And to clarify...I see it out of the corner of my eye because, truth be told it's bright as can be but, that actually hasn't been an issue with any Nvidia cards for a while...
> When I talked to the interns about it as a group, they said it's been an issue for a while but, it wasn't really enough to complain about...
> Anyone know what's up with that?


Yes! I've totally randomly had this. I think I read that it's widespread on all cards.


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> Yes! I've totally randomly had this. I think I read that it's widespread on all cards.


It's funny because it ONLY happens in BF3...

The way our office is set up, I have my back to a wall and then there's a big glass window, Jason is @20 feet after that, facing the wall.

Jason has been here for @3 years, I'd say; I give him all the games I don't want to beta so, I spend a good hour of my day watching him test but, I've been really busy lately.

Then all of a sudden there's this green spark, and I look up, like "*** was that"... It's BF3...

Like I said before that USED to happen because tessellation snapped out for 1-2ms but, that's like mega old school...Like EQ1//Ultima old school.

Doesn't happen in ANY OTHER GAME...Just BF3.

It's almost annoying tbh and I'm not even playing the game!

Anywhoo -- I got power back Tuesday morning, basically...11 days without power







...Downloading Skyrim right now...Gamestop once again botched my Collector's edition order but, we'll forgive them because I get an upgrade for Assassin's Creed Collector's for free.

Will let you guys know how the 590 does on Skyrim in...15 minutes.


----------



## rush2049

The green flicker in BF3 is a documented engine bug currently.... waiting on that patch.... It isn't nearly as bad as the persistent black screen that Bad Company 2 still has.

I don't have very high hopes for the graphics of skyrim, but I am sure the gameplay will be as good if not better than oblivion. Bethesda rarely screws up in that department.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> So, I came by a weird issue yesterday and the only reason I noticed it is because I was sitting directly across the room staring at Jason's screen while he played BF3 (I'm a great "boss", aren't I?)...


Seriously, Masked. If you guys were located anywhere near Los Angeles, you'd have my resume already. I know I'd have way more fun testing hardware than I did bug testing or at any other job.


----------



## Masked

Quote:


> Originally Posted by *rush2049*
> 
> The green flicker in BF3 is a documented engine bug currently.... waiting on that patch.... It isn't nearly as bad as the persistent black screen that Bad Company 2 still has.
> I don't have very high hopes for the graphics of skyrim, but I am sure the gameplay will be as good if not better than oblivion. Bethesda rarely screws up in that department.


Agreed...I've experienced that one myself and it's not too fun.
Quote:


> Originally Posted by *Shinobi Jedi*
> 
> Seriously, Masked. If you guys were located any where near Los Angeles, you'd have my resume already. I know I'd have way more fun testing Hardware than I would or did when I was bug testing or any other job.


We do relocate people you know...Shoot me a resume.

The hiring process for this office is a bit absurd, and you'll constantly have to sign NDAs... Remember back in the day when I posted a SS of 850mhz? I still get crap about that twice a week because of 2 apps that were "visible"... Keep that in mind









So I played Skyrim a bit at 2:15 this morning; I think the graphics look great... It's got that real-world, I-live-in-Britain feel to it... Which is great... You can DEFINITELY tell the 590s aren't optimized for AA yet, which is a driver issue but, hopefully soon.

I also heard through the grapevine they may give us a bit more headroom in the next few drivers... Too bad they've sucked so bad; I'd rather have them function than have headroom...


----------



## aismartin

Hello, I am new to this, and looking to build my first rig. I went all out and purchased this GTX 590.

However I am going back and forth on selecting

Motherboard
Power supply
Cooling

Please post some rig setups so I won't waste money buying something I shouldn't with this card.

Thanks


----------



## MKHunt

Quote:


> Originally Posted by *aismartin*
> 
> Hello, I am new to this, and looking to build my first rig. I went all out and purchased this GTX 590.
> However I am going back and forth on selecting
> 
> Motherboard
> Power supply
> Cooling
> Please post some rig setups so I wont waste money buying something I should not with this card.
> Thanks


I'm assuming you have a Sandy Bridge processor?

The ASUS Sabertooth P67 has pulled some of the highest single-590 bench results, so I'd look into that. I feed my card with an 850W PSU (Corsair AX850) so I can have some OC headroom on the CPU and GPU while still running all my peripherals. I cool everything with water. If you go water, the two best blocks are the Koolance nickel and the XSPC Razor; the best CPU block is the XSPC Raystorm. To cool a Sandy Bridge processor and a single 590 with little to no OC, you will need AT LEAST a thick 3 x 120mm radiator with high-speed fans. I run 5 x 120mm, all with high-speed fans, and nothing goes over 50C... ever.

Also, the system requirements for Skyrim are insanely low. Since my sig rig can't run on the GTX 545 (nice name for it) I'm playing on my laptop equipped with GeForce Mobile 9600! Yaaaaaaay.


----------



## NecroPS3




----------



## Xyphyr

Gah, I get my card tomorrow.


----------



## SpunkyXL

I get my GTX 590 SLI next week


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> Agreed...I've experienced that one myself and it's not too fun.
> We do relocate people you know...Shoot me a resume.
> The hiring process for this office is a bit obsurd and you'll constantly have to sign NDA's...Remember back in the day when I posted a SS of 850mhz? I still get crap about that twice a week because of 2 apps that were "visible"...Keep that in mind
> 
> 
> 
> 
> 
> 
> 
> 
> So I played Skyrim a bit at 2:15 this morning, I think the graphics look great...It's got that real world, I live in Britain feel to it...Which, is great...You can DEFINITELY tell the 590's aren't optomized for AA yet, which is a driver issue but, hopefully soon.
> I also heard through the grapevine they may give us a bit more headroom in the next few drivers...Too bad they've sucked so bad, I'd rather have them function then have headroom...


Masked, that's awesome! Made my day. I'll shoot you a PM about it.

No BF3 performance improvements at all with the new Beta drivers.

One thing I noticed that I'm feeling a little unsure about is that on this new Beta driver and the WHQL driver, my GPU 2 memory stays fully clocked at 1728mhz, when the other four downclock to 324mhz to stay cool.

Is this something I should be worried about?


----------



## Xyphyr

Quote:


> Originally Posted by *SpunkyXL*
> 
> I get my GTX 590 SLI next week


----------



## jpongin

Hi everyone,

I just built my GTX 590 Quad SLI rig for BF3 (here's proof - http://hardforum.com/showthread.php?t=1646299), and I have a burning question regarding everyone's experience overclocking on these specs:

- GTX 590 Quad SLI (I have the ASUS cards)
- 285.62 drivers
- 925mV Core Voltage (locked)
- watercooled @ 28C idle / 43C load

How far are you guys pushing your card on 925mV?

The reason I'm asking is that I cannot get these cards past 620 core. I might as well just leave them stock! I really want to push these cards without having to downgrade my drivers or use a custom BIOS. Any feedback is appreciated. Thanks.


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> Masked, that's awesome! Made my day. I'll shoot you a PM about it.
> No BF3 performance improvements at all on BF3 with the new Beta drivers.
> One thing I noticed that I'm feeling a little unsure about is that on this new Beta driver and the WHQL driver, my GPU 2 memory stays fully clocked at 1728mhz, when the other four downclock to 324mhz to stay cool.
> Is this something I should be worried about?


I don't think so...

Like I said earlier, the solder on your chip is why the 100C temp limit exists... It's not that the RAM on the card or the core will fry... The solder can melt away and "disconnect," and then your card is "bricked"... So, as long as you're not going over 100c with the 2nd set of RAM maxed, I genuinely wouldn't give it a second thought...

What I find amusing / a little alarming is the fact that they're trying to control the card through driver manipulation. Which, it's about time this happened, it's been a LONG time coming but, if you're going to do that, you actually have to do it right, which... they aren't... Not only are they not doing it right, they're actually stringing the entire control under the driver layer... Which, IMO is









What it will allow the end user to do is customize their usage themselves rather than tailor usage to a moron from Nvidia that can't write a driver worth his left nut.

I BEGGED for this on our Mobile systems, I literally BEGGED Nvidia at E3 because we (Mainly me) have to write every single mobile driver due to the fact that while they're Nvidia ref, they're not driver supported...I was basically told it wasn't worth the time and then Rick mentioned driver manipulation...That was last year...So, I can't say I didn't see this coming...but, the plan "we" discussed was one of easing into it and informing the "public" this is what was going on...Which, not only haven't they done that but, every driver they play with something else.

In the 270's it was manipulation over the 560's, especially the TI's...Early 280 was the 580 and with this driver, you see Vram manipulation for the primary display and while tessellation is almost gone...You also see LESS throttling which, has apparently been their goal since day 1.

What sucks though is that the latest drivers are absolutely terrible in terms of stability for ANY platform...NVLDDKM errors up the wall...So, they're not only not doing it right but, I have a feeling they really don't know *** they're doing in the first place...Again, far from a shocker.

I think we'll get there but, it may NOT be before Kepler...


----------



## SpunkyXL

Having 2X GTX 590's = Quad SLI?


----------



## Xyphyr




----------



## Scorpion49

Hey guys, do any of you happen to play EVE Online? I'm getting terrible artifacts and texture tearing in that game only, and I know for a fact that I didn't have it with my 580s. Driver issue maybe? It works fine in every other game I play, so I'm assuming it's the game itself.


----------



## MRHONGKONGDAVE

Hi guys, been reading and I want in!

I am a complete noob so please be patient.









Check my pics/profile for my stuff.

When someone replies, I would like a lot of help with stats etc as I am getting some funny readings i.e.

My CPU Core Speed is reading 1605.3 MHz but jumps to 3803 sometimes. Is this just because there is not as much load on the CPU when idling? Would overclocking to 3.8 fix that?

My DRAM Frequency is reading 668.9 MHz. Is that right? I thought it should read 1600 MHz. They are in slots 2 and 4; should I move them to 1 and 3?

On my 590, when playing BF3, sometimes the second GPU doesn't do anything. One goes up to 99%, the other sometimes 0-65%; is this normal? Should I activate that AFR thing in the Nvidia control panel to spread the load equally?

My temps on the 590 are 33C idle, 75C average, and 86C is the highest it's ever been. Do you think I should switch my 590 to the 8x slot and put my soundcard in the 16x slot? Would that aid cooling? Take a look at the pics, as it is about 10mm from my CPU cooler at the mo; I thought it may drag some heat off of it being so close, but looking at the readings on this forum I'm not so sure.

I am taking pics of CPU Z and GPU Z to upload.

Thanks in advance.

Dave


----------



## OutlawNeedsHelp

Quote:


> Originally Posted by *SpunkyXL*
> 
> Having 2X GTX 590's = Quad SLI?


Yes, you cannot have more than 2 590's.


----------



## MRHONGKONGDAVE

Here is my screen


----------



## Scorpion49

Quote:


> Originally Posted by *MRHONGKONGDAVE*
> 
> Hi guys, been reading and I want in!
> 
> I am a complete noob so please be patient.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Check my pics/profile for my stuff.
> 
> When someone replies, I would like a lot of help with stats etc as I am getting some funny readings i.e.
> 
> My CPU Core Speed is reading 1605.3 Mhz but jumps to 3803 sometimes. Is this just because there is not as much load on the CPU when idling? would overclocking to 3.8 fix that?
> 
> My DRAM Frequency is reading 668.9 Mhz is that right as thought it should read 1600 Mhz. They are in slots 2 and 4 should I move them to 1 and 3.
> 
> On my 590 when playing BF3 sometimes the second GPU doesn't do anything? One goes up to 99% the other sometimes 0-65% is this normal? Should I activate that AFR thing in the nvidia control panel to spread the load equally?
> 
> My temps on the 590 are 33C idle, 75C average and up to 86C is the highest it's ever been. Do you think I should switch my 590 to the 8x slot and put my soundcard in the 16x slot, would that aid cooling? Take a look at the pics as it is about 10mm from my cpu cooler at the mo, I thought it may drag some heat off of it being so close but looking at the readings on this forum not so sure.
> 
> I am taking pics of CPU Z and GPU Z to upload.
> 
> Thanks in advance.
> 
> Dave


I'm guessing you run a SB CPU? FIRST STEP: go to your profile and fill out your system specs so we can better assist you.

As far as the CPU: unless you turn SpeedStep off, it will always downclock to 1600MHz when idling to save power and create less heat. That's something you want, unless you just like running hot and sucking down 90+ watts all the time. DDR3 shows in CPU-Z at half its transfer rate (double data rate... yeah?), so if it's showing ~667MHz and you bought 1600MHz modules, you need to enable the XMP profile in your BIOS. It will show in CPU-Z as 800MHz after that. I'm going to take a guess this is your first time with any sort of overclocking or performance PC?
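The double-data-rate arithmetic above can be sketched as a quick check; the figures are nominal DDR3 numbers and the function name is just for illustration:

```python
# Sketch of the double-data-rate arithmetic described above.
# Figures are nominal DDR3 numbers, not measured values.

def effective_ddr_rate(cpuz_reading_mhz: float) -> float:
    """CPU-Z reports the memory's actual clock; DDR transfers
    twice per clock, so the effective rate is double."""
    return cpuz_reading_mhz * 2

# A reading of ~668.9 MHz means the modules are running around
# DDR3-1333 (the common JEDEC default), not their rated speed.
print(effective_ddr_rate(668.9))   # ~1337.8 MT/s -> DDR3-1333
# After enabling the XMP profile, CPU-Z should show ~800 MHz:
print(effective_ddr_rate(800.0))   # 1600.0 MT/s -> DDR3-1600
```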

Your temps on the 590 are not too high, but could be better. Download MSI Afterburner or EVGA Precision and set yourself up a custom fan profile; run it roughly 10 points above the temperature, so 40C = 50%, and ensure that it's at 100% as soon as it hits 80C or so. Remember, the 590 exhausts at both ends, so you don't want a case fan blowing directly into the back end of it, as that will hamper the cooler's ability. Good airflow in the case is also important.
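The fan-curve rule of thumb above (roughly 10 points above the temperature, pinned to 100% from 80C up) could be sketched like this; the thresholds and the 30% floor are illustrative assumptions, not anything Afterburner or Precision ships with:

```python
# Hypothetical fan curve matching the rule of thumb above.
# In practice you would enter a few of these points on the
# custom curve editor in Afterburner/Precision.

def fan_percent(temp_c: float) -> int:
    if temp_c >= 80:
        return 100          # flat out once things get hot
    # e.g. 40C -> 50%, 60C -> 70%; 30% floor keeps idle airflow
    return max(30, min(100, int(temp_c) + 10))

for t in (40, 60, 80):
    print(t, fan_percent(t))
```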

BF3 has known bugs with CF/SLI usage, I would investigate the BF3 thread and see about some fixes.


----------



## MRHONGKONGDAVE

Hi Scorpion,

Thanks for the reply; I will go into BIOS and edit that. My rig is in my profile under components; is it not displaying? My profile might not be visible yet. If it is, all the pics of the rig are in there, along with a screenshot of all running performance monitors etc. Will the 8x PCI-E slot downgrade performance compared to the 16x?

Is anyone else running a 90 degree case with vertical fans? Have you had cooling issues?

Raven RV02-EVO
i7 2600k
Win7 64 Ultimate
Asus Maximus IV Gene Z68
Asus GTX 590
Asus Xonar HDAV 1.3 Deluxe
Corsair Vengeance LP DDR3 1600 1.35v
Antec 850w High Current Pro
Be Quiet! Dark Rock Pro
Cyborg V7 RAT7 XFLY 5

Check my pics if poss.


----------



## Scorpion49

Quote:


> Originally Posted by *MRHONGKONGDAVE*
> 
> Hi Scorpion,
> 
> Thanks for the reply, will go into BIOS and edit that. My rig is in my profile in components, is it not displaying? My profile might not be visible yet, if it is all the pics of the rig are inthere along with a screenshot of all running performance monitors etc? Will the 8x pci slot downgrade performance compared to the 16x?
> 
> Is anyone else running a 90 degree case with vertical fans? Have you had cooling issues?
> 
> Raven RV02-EVO
> i7 2600k
> Win7 64 Ultimate
> Asus Maximums IV Gene Z68
> Asus GTX 590
> Asus Xonar HDAV 1.3 Deluxe
> Corsair Vengeance LP DDR3 1600 1.35v
> Antec 850w High Current Pro
> Be Quiet! Dark Rock Pro
> Cyborg V7 RAT7 XFLY 5
> 
> Check my pics if poss.


I would not run the 590 in an 8x slot unless you have to. The reason being, it's a dual-GPU card, so the extra bandwidth can make a difference. Other than that, I can see your problem already. The Raven RV02 is NOT a good case for a card like the 590 (I used to have one). Those fans on the bottom force the hot air right back into the GPU; that's why your temps are high. Nothing you can do will really fix that. Possibly find a small strip of plastic or something to secure onto the fan right below the end of the card, to prevent it from blowing back into the exhaust? Maybe cardboard? Your temps aren't at a critical level, but I suspect you might end up with overheating problems.


----------



## MRHONGKONGDAVE

I will stick my spare drive bay cover to the fan directly under the 590 and see if it makes a difference. I've got a drive bay triple 40mm fan thing coming and an expansion slot fan also; hopefully they'll lower the temps a bit, but it's probably gonna sound like a flipping helicopter!!!

Any other ways of adding cooling to my system rather than replacing anything, as it's all new? I don't mind losing the cheap slot fan or drive fans in place of something more tried and tested; nothing too hard to achieve, and I would like to avoid dismantling components!!!

Thanks for all your advice buddy.


----------



## MRHONGKONGDAVE

Just checked out your "Black Box". That is a crazy PC; it looks like a life support machine from the Starship Enterprise! How long does something like that take? It was nerve-wracking enough just building my PC, let alone taking components apart.


----------



## Masked

Quote:


> Originally Posted by *Scorpion49*
> 
> I would not run the 590 in an 8x slot unless you have to. Reason being is its a dual GPU card so the extra bandwidth can make a difference. Other than that I can see your problem already. The Raven RV02 is NOT a good case for a card like the 590 (I used to have one). Those fans on the bottom force the hot air right back into the GPU, thats why your temps are high. Nothing you can do will really fix that, possibly find a small strip of plastic or something to secure onto the fan right below the end of the card to prevent it from blowing back into the exhaust? Maybe cardboard? Your temps aren't at a critical level but I suspect you might end up with overheating problems.


No, it doesn't.

On top of that, the NF200 actually splits x16 total... So if he has 2 cards, they're 2 x8... So it genuinely doesn't matter.

The difference between x8 and x16 is about 3% on a single GPU; on a dual-GPU card it's about 4% total.

A 4% performance loss vs. better temps ~ The temp will and should always be the primary concern vs. a 3-5% performance loss.
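For a rough sense of the raw numbers behind the x8 vs x16 debate, here is a sketch using nominal PCIe 2.0 figures (about 500 MB/s per lane per direction after 8b/10b encoding). As the posts here note, the real-game difference is only a few percent, far smaller than the raw bandwidth gap:

```python
# Nominal PCIe 2.0 slot bandwidth, per direction.
# 5 GT/s per lane with 8b/10b encoding works out to
# roughly 500 MB/s of payload per lane.

PCIE2_MB_PER_LANE = 500  # MB/s per lane, per direction

def slot_bandwidth_gbs(lanes: int) -> float:
    """Aggregate one-direction bandwidth for a slot width."""
    return lanes * PCIE2_MB_PER_LANE / 1000  # GB/s

print(slot_bandwidth_gbs(16))  # 8.0 GB/s
print(slot_bandwidth_gbs(8))   # 4.0 GB/s
```

Halving the link halves the theoretical ceiling, but games rarely saturate even x8, which is why benchmarks show only single-digit percentage differences.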


----------



## Scorpion49

Quote:


> Originally Posted by *Masked*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> I would not run the 590 in an 8x slot unless you have to. Reason being is its a dual GPU card so the extra bandwidth can make a difference. Other than that I can see your problem already. The Raven RV02 is NOT a good case for a card like the 590 (I used to have one). Those fans on the bottom force the hot air right back into the GPU, thats why your temps are high. Nothing you can do will really fix that, possibly find a small strip of plastic or something to secure onto the fan right below the end of the card to prevent it from blowing back into the exhaust? Maybe cardboard? Your temps aren't at a critical level but I suspect you might end up with overheating problems.
> 
> 
> 
> No it doesn't.
> 
> Ontop of that the NF200 actually splits 16x total...So if he has 2 cards, they're 2 by 8x...So, it genuinely doesn't matter.
> 
> The difference between an 8x and a 16x is 3% on a dual GPU slot, it's 4% total difference.
> 
> A 4% performance loss vs. a better performance temp ~ The temp will and should always be a primary point vs a 3-5% performance loss.

Either way, moving it won't help when there's an AP-181 blowing directly into it from 1/2 inch away. His problem isn't insufficient cooling; it's that it's configured wrong for the style of card he has. The case is great, the card is fine by itself, but combining the two is the problem. That's why I suggested he block the air directly under the rear exhaust of the card, so it can vent properly. I'll bet he has a difference of at least 10C between the cores the way it is now.


----------



## Wogga

On the x8 or x16 question: if there is any difference at all, my quad in x8/x8 performs the same as on X58 with x16/x16, at least in 3DMark11 (even a few points higher), so there is barely any difference.

I've played Skyrim... a lot. On ultra settings with vsync, the fps is 60 about 90% of the time, but sometimes it drops to 50-54 for a second or two. Also, before adding some fixes in the .ini file, I had huge problems with water and vsync. Now it runs fine except for one thing: after 3-4 hours, fire and other light effects start to leave some kind of trails. As for GPU usage: all 4 GPUs at 50-60% load.
I've noticed that BF3 on ultra (not the cheat "ultra" without HBAO) runs great, without any artifacts and constantly at 60 fps, even on one GPU (sometimes I forget about turning SLI on after folding), but Skyrim seems to want more power if it fails to run at a constant 60 fps.


----------



## Masked

Quote:


> Originally Posted by *Scorpion49*
> 
> Either way, moving it won't help when theres an AP-181 blowing directly into it from 1/2 inch away. His problem isn't sufficient cooling, its that its configured wrong for the style of card he has. The case is great, the card is fine by itself, but combining the two is the problem. Thats why I suggested he block the air directly under the rear exhaust of the card so it cant vent properly. I'll bet he has a difference of at least 10*C between the cores the way it is now.


Yes but, I'm addressing the misinformation.

The real-world difference between the x16, x8, x4 slots is 4% on a dual GPU... On a single it's 3%... There is hardly ANY reason in the world to require that his card be in the x16 slot.

On the SB motherboards it's x16 TOTAL. So if he has 2 cards, the x16 becomes x8/x8.

So for example, I have a GTX 590, a GTX 580 and a Fatal1ty... So in reality my actual lane usage is x16/3, or about x5.3 each ~

I do agree that it's configured incorrectly... The card will just exhaust hot air back into the case, and it's being repeatedly recycled rather than exhausting/intaking fresh air.

There is no "venting" properly, and what you suggest will raise the heat exposure of the 2nd core.

The front/back blow-back on the 590 is because it runs fresh air over BOTH cores... By blocking one, you're essentially overheating the other, so no... That's not smart either.

My suggestion would be to get more ventilation without blocking the exhaust, because if you go to bench that card and it's blocked... your performance will half-life.


----------



## Asus11

Got an ASUS GTX 590 coming and just wondered: how easy is it to overclock? Also, I heard you can softmod these?
Got it for a great price


----------



## MRHONGKONGDAVE

Hi Scorpion / Masked,

Thanks for the the info guys it's been really helpful.

The reason for swapping the 590 from the x16 to the x8 PCIE slot was so that when I put the drive bay fans and slot fan in, I could move the soundcard to the x16 slot. Then the sound card won't be in the way, allowing fresh cool air to hit the 590's intake fan directly. Also, the centre of the middle AP181 would then be directly underneath the card, which would mean the heat would not be blasted straight back into the card, due to the 50mm centre hub of the fan. The surrounding airflow in the case should shift the bottom-vented heat, as it is amazing in this case; if however it doesn't change much, I could stick one of those spot fans on to sweep across the bottom of the card.


----------



## supra_rz

Got a question for you guys. Let's say I've got a GTX 590 installed and several games have problems with SLI setups; could I just disable 1 GPU and use it like a single GTX 580?

sorry if its a stupid question

ps. 300 pages yay


----------



## Wogga

By chance (folding lol) I've played BF3 with SLI disabled on ultra w/o any troubles.


----------



## Scorpion49

Quote:


> Originally Posted by *Masked*
> 
> Yes but, I'm addressing the misinformation.
> 
> The real world difference between the x16, x8, x4 slots is 4% on a dual GPU...On a single it's 3%...There is hardly ANY reason in the world to require that his card be in the x16 slot.
> 
> On the SB motherboards it's x16 TOTAL. So if he has 2 cards, the X16 becomes x8, x8.
> 
> So for example, I have a GTX 590, a GTX 580 and a Fatal1ty...So, in reality my actual lane usage is x16 / 3, or roughly 5.3 lanes per card.
> 
> I do agree that it's configured incorrectly...The card will just exhaust hot air back into the case and it's being repeatedly recycled rather than exhausting/intaking fresh air.
> 
> There is no "venting" properly and what you suggest will raise the heat exposure of the 2nd core.
> 
> The front/back blow-back on the 590 is because it runs fresh air over BOTH cores...By blocking 1, you're essentially overheating the other so, no...That's not smart either.
> 
> My suggestion would be to get more ventilation without blocking the exhaust because if you go to bench that card and it's blocked...Your performance will half-life.


You're saying I'm giving misinformation but then agreeing with me that there is, in fact, a difference. A small one to be sure, but it exists. I'm not sure where the argument is here: he wants to put the card in the x8 slot because he thinks it will help with cooling, but the sound card has nothing to do with the cooling problem, nor will adding fans to the drive bay help if the 590 can't exhaust properly. Also, I was talking about blocking a small strip of the fan under the card, not the card vent itself; sorry if I didn't make that clear. The intent was to keep the 181 from blowing the 590's exhaust right back into it, not to block the exhaust entirely.


----------



## Masked

Quote:


> Originally Posted by *Scorpion49*
> 
> You're saying I'm giving misinformation but then agreeing with me that there is, in fact, a difference. A small one to be sure, but it exists. I'm not sure where the argument is here: he wants to put the card in the x8 slot because he thinks it will help with cooling, but the sound card has nothing to do with the cooling problem, nor will adding fans to the drive bay help if the 590 can't exhaust properly. Also, I was talking about blocking a small strip of the fan under the card, not the card vent itself; sorry if I didn't make that clear. The intent was to keep the 181 from blowing the 590's exhaust right back into it, not to block the exhaust entirely.


3-4% in reality is not a difference worth having a discussion over...Genuinely.

Yes, an infinitely small percentile of a difference exists, again, to the community, and in real world apps, it's negligible.

He can put that card wherever in god's name he wants, it is a 4% real world difference which is not going to make a lick of difference in BF3 due to the fact he doesn't have AA maxed.

So, again, in real world usage, no, there is no difference in slot choice...In benching there is, only because 3DMark and other benches stress the tessellation engines; again, 3-4% is not a real world usage difference.

If he games, nothing will change...No performance difference.

Where I agree with you is that his case is not properly vented and that it's honestly a bad case for this hardware.

I personally believe there are some better options out there, to the extent I'd almost recommend he game with the side of the case off.

That doesn't change the fact that he can literally drop the card in any slot and get absolutely the same gaming performance that he had in the X16.


----------



## MRHONGKONGDAVE

HI Masked / Scorpion,

Right then, I have moved the card to the x8 slot so now the vapour chamber exhausts onto the centre of the fan (which is about 50mm), and I have been playing BF3 all night. After setting up the custom fan profile in MSI Afterburner the cards have NOT gone above 70C. On average one GPU is about 2C warmer than the other but both show the same max. Also, for the first time both GPUs are showing the same load, now maxing at 99%, whereas before one was always considerably less. Could it have been throttling down due to the heat?

I have plus repped you both as have used info from both of you.

For anyone reading this conversation, I checked reviews for the best reviewed components I could afford but didn't check compatibility!!! Working now though so thanks for the help!

I trust these temperatures are more acceptable!
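As an aside, the kind of custom fan profile Dave set up in MSI Afterburner is just a piecewise-linear temperature-to-duty curve. A minimal sketch (the curve points below are hypothetical, not his actual profile):

```python
# Sketch of an Afterburner-style fan curve: linear interpolation between
# (temperature C, fan duty %) points. The points here are made up.
CURVE = [(30, 30), (50, 50), (70, 85), (80, 100)]

def fan_percent(temp_c: float) -> float:
    """Interpolate fan duty for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Linear blend between the two surrounding curve points.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pinned to max above the last point

print(fan_percent(60))  # midway between the 50C/50% and 70C/85% points
```

The steeper segment above 50°C is what keeps a dual-GPU card like the 590 out of the high 80s under load.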


----------



## Scorpion49

Quote:


> Originally Posted by *MRHONGKONGDAVE*
> 
> HI Masked / Scorpion,
> 
> Right then, I have moved the card to the x8 slot so now the vapour chamber exhausts onto the centre of the fan (which is about 50mm), and I have been playing BF3 all night. After setting up the custom fan profile in MSI Afterburner the cards have NOT gone above 70C. On average one GPU is about 2C warmer than the other but both show the same max. Also, for the first time both GPUs are showing the same load, now maxing at 99%, whereas before one was always considerably less. Could it have been throttling down due to the heat?
> 
> I have plus repped you both as have used info from both of you.
> 
> For anyone reading this conversation, I checked reviews for the best reviewed components I could afford but didn't check compatibility!!! Working now though so thanks for the help!
> 
> I trust these temperatures are more acceptable!


70°C is much better than the 86°C before. So allowing the card to vent at the bottom worked? Nice of them to put a dead spot in the fan there. Glad it worked out for you.


----------



## chaosneo

i got myself a Raven RV02e as well.

Despite this case putting the GTX 590 in an awkward position where hot exhaust air fights the inflow, there still seems to be a good deal of cooling inside this particular case. Once I complete my modding and port over the entire system, I will post temps.

MrHongKongDave,
care to share what your PC case interior looks like? I would like to see how your GTX 590 is positioned inside there.


----------



## Scorpion49

I pulled this out of his profile.


----------



## MKHunt

Guys... my RMA 590 is different. My water block doesn't fit. I genuinely don't know what to do.


----------



## Recipe7

***? What is the difference between the one you got and a 'normal' one?


----------



## MKHunt

Quote:


> Originally Posted by *Recipe7*
> 
> ***? What is the difference between the one you got and a 'normal' one?


These whatevers. They're taller.





Ugh. Just when I thought I'd have my rig back together...

I know I can do a workaround that would _fit_ with the vram and all, but the cores would be out of direct contact. I could use the pink pad everywhere else (1mm vs .5mm) and then use only paste on the inductors(?). But I'm out of paste.


----------



## Masked

Quote:


> Originally Posted by *MKHunt*
> 
> These whatevers. They're taller.
> 
> 
> Ugh. Just when I thought I'd have my rig back together...


EVGA sent you that?

I'd call them...Complain about that...While it //IS// an RMA, you have the right to be told whether or not you're getting a reference part...


----------



## MKHunt

Quote:


> Originally Posted by *Masked*
> 
> EVGA sent you that?
> I'd call them...Complain about that...While it //IS// an RMA, you have the right to be told whether or not you're getting a reference part...


I was afraid of this answer. I RMA'ed the first RMA since one core was dead and the IHS came off with the vapor chambers; this one came in today and is the RMA of the original RMA... RMA^3?


----------



## Masked

Quote:


> Originally Posted by *MKHunt*
> 
> I was afraid of this answer. I RMA'ed the first RMA since one core was dead and the IHS came off with the vapor chambers, this came in today and was the RMA of the original RMA.... RMA^3?


Honestly, everyone I know has been through the dreaded 3x returns at some point...Just happened to me with an SSD 3 weeks ago, 6 returns...Damn OCZ, DAMN YOU!

Anyway, my suggestion is to call them and have them elevate your situation...

Tell them you had a waterblock and it needs to be on water, so non-reference doesn't cut it...You're also extremely unhappy you weren't informed of this...While you're grateful for the RMA itself, you'd very much like to swap the card for a ref PCB...

That's pretty much it.


----------



## MKHunt

Ah, base-level tech support said they can't do anything about it tonight and I'll have to call back tomorrow and go straight to a manager. Bit of a bummer, but I suppose it gives me time to order up another tube of ICDiamond since I'm out.

I wanted to delve into Skyrim @ ultra tomorrow, too. At least my laptop can play it at mediumish. So unfortunate. Maybe my sig rig was never meant to exist?


----------



## Recipe7

Ouch. That is really something else. At the end of it all, is it possible to contact them about this? Maybe you can point out the problem and they can provide you with a unit identical to your previous one.

It's weird that they would replace parts with parts that don't allow fitment of heatsinks...


----------



## MKHunt

Quote:


> Originally Posted by *Recipe7*
> 
> Ouch. That is really something else. At the end of it all, is it possible to contact them about this? Maybe you can point out the problem and they can provide you with a unit identical to your previous one.
> It's weird that they would replace parts with parts that don't allow fitment of heatsinks...


The heatsinks have been redesigned to fit the new parts. I feel like an idiot for not noticing that until it was too late. The first picture is the card I originally got back from RMA, but it has the same heatsink design as the new ones even though it has the inductors that fit with waterblocks.

New HS design:


Old HS design:


It's sad because the new parts are beefier and, for the purposes of the 590, better. If the WB fit I'd count it as a win in every way.


----------



## Recipe7

Hmm, that is interesting. This is very disappointing from a consumer standpoint, especially if you are keen on putting things under water. Does this mean that all existing WB on the market will not fit the new designs?

At least you have a working unit... albeit aircooled =\


----------



## MKHunt

Quote:


> Originally Posted by *Recipe7*
> 
> Hmm, that is interesting. This is very disappointing from a consumer standpoint, especially if you are keen on putting things under water. Does this mean that all existing WB on the market will not fit the new designs?
> At least you have a working unit... albeit aircooled =\


Normally this is the mindset I'd take. But I dropped big bucks on this rig and the GPU completes my loop. It goes from the top back of the case to the GPU to the bottom. I _will not_ resort to air cooling as that would basically be burning money. Plus I share a room with the computer and fold on the GPU at night. That much noise is unacceptable.


----------



## Wogga

I'd change the thermal pads to something liquid (MX-2 for example), but if you can RMA then it's OK. Such a shame that it's useless in our country -_-


----------



## MKHunt

One more thing.

Should I be concerned about this?



and a zoom-out:



Is that a VRM? A _mismatched_ VRM? It says 330 while the others say 470 and it has all sorts of different markings.


----------



## Recipe7

Quote:


> Originally Posted by *MKHunt*
> 
> Normally this is the mindset I'd take. But I dropped big bucks on this rig and the GPU completes my loop. It goes from the top back of the case to the GPU to the bottom. I _will not_ resort to air cooling as that would basically be burning money. Plus I share a room with the computer and fold on the GPU at night. That much noise is unacceptable.


I get you. Your PC is pretty much how I want mine, I wouldn't be happy with an aircooled gpu in light of what you have built for yourself.

What a crappy situation MKH... let us know how things go.

As for the VRM mismatch... have you run the GPU yet with the stock heatsink?


----------



## Terence W

Hello. I'm not sure if I qualify as a member here as my machine was built by someone else. I've just bought it for £1800 to run Microsoft FS2004 (with Golden Wings 3 added), but it does have a GTX 590 graphics card. Do you want a screenshot of the Device Manager readout or will my word do?

I'm after advice on tweaking the GTX 590, as the picture I'm getting when running my FS has a lot of flickering & shimmering of trees & window bars, etc. I'm not sure what parameters to change.

Thanks.....Terry .


----------



## Masked

Quote:


> Originally Posted by *MKHunt*
> 
> One more thing.
> Should I be concerned about this?
> 
> and a zoom-out:
> 
> Is that a VRM? A _mismatched_ VRM? It says 330 while the others say 470 and it has all sorts of different markings.


It's supposed to be like that.

Do me a favor, take a picture of the whole card as HQ as you can get it and PM it to me.

I once heard there were 3 revisions...1&2 being a production swap but, a 3rd? Hrm...Didn't believe it until now.


----------



## MKHunt

Quote:


> Originally Posted by *Masked*
> 
> It's supposed to be like that.
> Do me a favor, take a picture of the whole card as HQ as you can get it and PM it to me.
> I once heard there were 3 revisions...1&2 being a production swap but, a 3rd? Hrm...Didn't believe it until now.


The plot thickens! I'll try and find the 14MP camera we have kicking around and the room with the most even lighting. Just give me a couple hours.


----------



## Shinobi Jedi

Sucks that happened to you MKH. Sorry....

But I have to say, this is getting interesting... (Grabs Popcorn)


----------



## toX0rz

Quote:


> Originally Posted by *supra_rz*
> 
> Got a question for you guys, lets say i got a gtx 590 installed and several games have problems with SLI set ups, could i just disable 1 gpu and use it like a single gtx 580?
> sorry if its a stupid question
> ps. 300 pages yay


Yes, you can. However, since the chips on the 590 run at lowered clocks, you won't quite see the performance of a single GTX 580 when disabling SLI; it will be more like a single GTX 570.
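A rough way to see why: compare raw shader throughput (cores × clock) using the reference specs. This is a back-of-envelope sketch only, since memory bandwidth, ROPs and drivers also matter:

```python
# Back-of-envelope shader-throughput comparison (a sketch only: real game
# performance also depends on memory bandwidth, ROPs and drivers).
# Core counts and clocks below are the reference specs for these cards.
def shader_throughput(cores: int, clock_mhz: int) -> int:
    """Relative shader rate: CUDA cores times core clock."""
    return cores * clock_mhz

gtx570 = shader_throughput(480, 732)      # GTX 570: 480 cores @ 732 MHz
gtx590_gpu = shader_throughput(512, 607)  # one GTX 590 GPU: 512 cores @ 607 MHz

# One 590 GPU lands at roughly 88% of a stock 570 by this crude metric.
print(f"590 GPU vs 570: {gtx590_gpu / gtx570:.2f}")
```

So a single 590 GPU at stock clocks sits just below a stock 570, which matches the "more like a 570 than a 580" rule of thumb.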


----------



## Mongol

I just got my cards in for an upcoming project/build. ^__^

Would a pic of them along with a signed/dated slip of paper suffice?
(I won't be installing them just yet)


----------



## Icekilla

Hi guys! I'd like some advice...

I was thinking of getting two GTX 570's 2.5GB and doing SLI, but what if I get an EVGA GTX 590 instead? Would it be a better buy? It's for my new rig, Dark Knightess.

Discussion about rig
Discussion about GPU's


----------



## Mongol

^2 570's in sli trump a single 590.


----------



## Masked

Quote:


> Originally Posted by *Icekilla*
> 
> Hi guys! I'd like some advice...
> I was thinking of getting two GTX 570's 2.5GB and doing SLI, but what if I get an EVGA GTX 590 instead? Would it be a better buy? It's for my new rig, Dark Knightess.
> Discussion about rig
> Discussion about GPU's


Technically because of the Vram, you'd be better off doing the SLI 570s.

And no, a single core on the 590 is not equal to a stock 570...A well overclocked 570, yes, but...It has a big leg up.


----------



## Icekilla

Quote:


> Originally Posted by ***********
> 
> ^2 570's in sli trump a single 590.


Quote:


> Originally Posted by *Masked*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Icekilla*
> 
> Hi guys! I'd like some advice...
> I was thinking of getting two GTX 570's 2.5GB and doing SLI, but what if I get an EVGA GTX 590 instead? Would it be a better buy? It's for my new rig, Dark Knightess.
> Discussion about rig
> Discussion about GPU's
> 
> 
> 
> Technically because of the Vram, you'd be better off doing the SLI 570s.
> 
> And no, a single core on the 590 is not equal to a stock 570...A well overclocked 570, yes, but...It has a big leg up.
Click to expand...

How big is the performance difference between two GTX 570's in SLI and the GTX 590? I've seen some benchmarks and they're very close to each other. What if I grab a GTX 590 and overclock it to 750MHz? Would there still be a marginal performance difference?


----------



## Scorpion49

Quote:


> Originally Posted by *Masked*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Icekilla*
> 
> Hi guys! I'd like some advice...
> I was thinking of getting two GTX 570's 2.5GB and doing SLI, but what if I get an EVGA GTX 590 instead? Would it be a better buy? It's for my new rig, Dark Knightess.
> Discussion about rig
> Discussion about GPU's
> 
> 
> 
> Technically because of the Vram, you'd be better off doing the SLI 570s.
> 
> And no, a single core on the 590 is not equal to a stock 570...A well overclocked 570, yes, but...It has a big leg up.
Click to expand...

The 2.5GB may be attractive, but I have not seen any real reviews of them at all. I can say stock 570's at 732MHz run just a little bit faster than a stock-clocked 590 at 630MHz (I'm not sure what the vanilla 590's are clocked at, 607?).

3Dmark isn't the best bench in the world to use, but its the easiest one I have on hand without digging through my photobucket.

My 1.2GB 570's ran P9379 at stock.

My 590 runs P9645 stock.

A basic, easy to attain 875mhz on the 570's beats just about anything out of the 590's that I've seen at P10482 with a graphics score 2000 points higher.

However, the 570's were with the i5 2500K at stock speeds, whereas the 590 was at 4.5GHz. The 570's actually get a higher graphics score, and have much more OC headroom than the 590. I doubt I would be able to match my 950MHz 570's on air with my 590 maxed out on water, to be honest. I haven't extensively benched the 590, but I will say the 590 seems to have a smoother experience overall despite the lower clock speed. The 570's rode the VRAM limit a lot, but also had higher average frame rates. The 590 gets better minimums thanks to more powerful cores. I would say it depends entirely on how much room you have in your case and whether you want to deal with a multi-card solution, since 570's are usually cheaper than a 590 (excepting the 2.5GB variant).


----------



## Icekilla

If I get the GTX 570's, they'd be the 2.5GB variant. I'm just curious because both solutions would cost almost the same ($790 for the GTX 570's and $750 for the GTX 590). They're both outdated already and I think I'll upgrade to Maxwell anyway in 2013. I just want the best bang for the buck, and either way, how much can I overclock this card without causing damage to it? Is 750MHz-770MHz a feasible number without overvolting?

Either way, space is not a concern since I'll be using a Rosewill Blackhawk Ultra case (it's HUGE!! As big as the Xigmatek Elysium), so I'm guessing it'll have enough airflow to stay cool. My only concern is price-performance between the two options.

With the GTX 590 I'd take only 2 slots, while the GTX 570's will take 4









EDIT: Also, I've heard those 2.5GB models suck for overclocking, so I'm not really gonna put my hopes too high on that. Something that concerns me is: can you use all 3GB of memory on the GTX 590?


----------



## Scorpion49

Quote:


> Originally Posted by *Icekilla*
> 
> If I get the GTX 570's, they'd be the 2.5GB variant. I'm just curious because both solutions would cost almost the same ($790 for the GTX 570's and $750 for the GTX 590). They're both outdated already and I think I'll upgrade to Maxwell anyway in 2013. I just want the best bang for the buck, and either way, how much can I overclock this card without causing damage to it? Is 750MHz-770MHz a feasible number without overvolting?
> 
> Either way, space is not a concern since I'll be using a Rosewill Blackhawk Ultra case (it's HUGE!! As big as the Xigmatek Elysium), so I'm guessing it'll have enough airflow to stay cool. My only concern is price-performance between the two options.
> 
> With the GTX 590 I'd take only 2 slots, while the GTX 570's will take 4


My 590 Classy won't even hit 650MHz on stock voltage. It seems like it takes everything it's got just for stock speed, and that's under water with a max temp of 41°C loaded. If you want bang for the buck, grab a pair of Sapphire Toxic 6950's. As for VRAM, the 590 only has 1.5GB per core, same as a 580. It's not a 3GB card.


----------



## Icekilla

I gotta stick with Nvidia, though







I *NEED* CUDA. Otherwise I'd pick an HD 6990 instead.

OMG Is it really THAT hard to overclock the GTX 590? O_O How did those guys manage to push them that far then?


----------



## Masked

Quote:


> Originally Posted by *Icekilla*
> 
> If I get the GTX 570's, they'd be the 2.5GB variant. I'm just curious because both solutions would cost almost the same ($790 for the GTX 570's and $750 for the GTX 590). They're both outdated already and I think I'll upgrade to Maxwell anyway in 2013. I just want the best bang for the buck, and either way, how much can I overclock this card without causing damage to it? Is 750MHz-770MHz a feasible number without overvolting?
> Either way, space is not a concern since I'll be using a Rosewill Blackhawk Ultra case (it's HUGE!! As big as the Xigmatek Elysium), so I'm guessing it'll have enough airflow to stay cool. My only concern is price-performance between the two options.
> With the GTX 590 I'd take only 2 slots, while the GTX 570's will take 4
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Also, I've heard those 2.5Gb models suck for overclocking, so I'm not really gonna put my hopes too high on that. Something that concerns me is: Can you use all the 3GB of memory from the GTX 590?


Quote:


> Originally Posted by *Scorpion49*
> 
> My 590 Classy won't even hit 650MHz on stock voltage. It seems like it takes everything it's got just for stock speed, and that's under water with a max temp of 41°C loaded. If you want bang for the buck, grab a pair of Sapphire Toxic 6950's. As for VRAM, the 590 only has 1.5GB per core, same as a 580. It's not a 3GB card.


Sapphire is the only AMD partner to honor their warranties recently but, still, after my 6990 woes (15th RMA), I don't suggest them to anyone...Not even the people I secretly hate.

That being said, the 570's are the better option...The 2.5GB models have 1GB more VRAM per card, and considering where games are going (15+ active betas atm) I'd STRONGLY suggest more VRAM as opposed to the 590.

If your 590 can't go over 650mhz, you're doing it wrong...Cain's overclock thread will steer you in the right direction and it doesn't/won't void your warranty.

Quote:


> Originally Posted by *Icekilla*
> 
> I gotta stick with Nvidia, though
> 
> 
> 
> 
> 
> 
> 
> I *NEED* CUDA. Otherwise I'd pick an HD 6990 instead. OMG Is it really THAT hard to overclock the GTX 590? O_O How did those guys manage to push them that far then?


He's just not doing it right.


----------



## Icekilla

I still don't get it... doesn't the GTX 590 have 3GB of VRAM anyway? Wouldn't that be enough for a while until Maxwell comes out in 2013?


----------



## Masked

Quote:


> Originally Posted by *Icekilla*
> 
> 
> 
> 
> 
> 
> 
> 
> I still don't get it... doesn't the GTX 590 have 3GB of VRAM anyway? Wouldn't that be enough for a while until Maxwell comes out in 2013?


I run a beta office.

No.

Go with the 570's; the extra 1GB of VRAM will come into play far sooner than you realize.

The GTX 590 has 2 cores with 3GB of TOTAL VRAM...That = 1.5GB of VRAM per core, making it essentially 2 underclocked 580's with 1.5GB each.
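Since alternate-frame-rendering SLI keeps a full copy of the frame data on each GPU, the usable figure is the per-GPU memory rather than the box total. A one-line sketch:

```python
# In alternate-frame-rendering SLI each GPU holds its own full copy of the
# frame data, so usable VRAM is the per-GPU amount, not the box total.
def usable_vram_gb(total_gb: float, num_gpus: int) -> float:
    """Effective VRAM available to a game on an SLI/dual-GPU card."""
    return total_gb / num_gpus

print(usable_vram_gb(3.0, 2))  # GTX 590: 3 GB on the box -> 1.5 GB usable
print(usable_vram_gb(2.5, 1))  # single 2.5 GB GTX 570 -> all 2.5 GB usable
```

The same mirroring applies to 570 SLI too, of course; the point is that each 2.5GB card still gives 2.5GB usable versus the 590's 1.5GB.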


----------



## Icekilla

Talk about false advertising









So... considering Maxwell is around the corner, it's not worth it to spend on the GTX 580's 3GB; therefore, the best option would be the GTX 570's 2.5GB, am I right?


----------



## Masked

Quote:


> Originally Posted by *Icekilla*
> 
> Talk about false advertising
> 
> 
> 
> 
> 
> 
> 
> 
> So... considering Maxwell is around the corner, it's not worth it to spend in the GTX 580's 3GB, therefore, the best option would be the GTX 570's 2.5GB, am I right?


It's not false advertising at all...It's simple math and has //ALWAYS// been that way...In reality, ATI started that lovely little exception so, if you want to be pissed at someone, fire off at AMD.

That being said, KEPLER is "just" around the corner and it's going to be basically a hybrid refresh...At this point, personally, I'd buy 1x 580 3GB and "deal" with it until Kepler is released, then jump over to that...It's actually what I intend to do myself.

Samples are going strong so far


----------



## Icekilla

AFAIK Maxwell will come out in 2013, so I'll be holding on with Fermi until those show up. I guess I'll just skip Kepler entirely UNLESS they're worth it. It's my current dilemma: stick with my HD 4870 until Kepler comes out or buy the 570's and wait for Maxwell, which apparently will be 150% faster than Kepler.


----------



## Masked

Quote:


> Originally Posted by *Icekilla*
> 
> AFAIK Maxwell will come out in 2013, so I'll be holding on with Fermi until those show up. I guess I'll just skip Kepler entirely UNLESS they're worth it. It's my current dilemma: stick with my HD 4870 until Kepler comes out or buy the 570's and wait for Maxwell, which apparently will be 150% faster than Kepler.


Dies are shrinking substantially; there's just no real breakthrough, and as always, it will be a year between releases.

Right now, we're at a wall in coding virtualization that //will// fade once Kepler is released.

So, my suggestion is to go Fermi - Kepler - Maxwell...Especially with how things are going, Kepler will release in January and Maxwell won't release until December 2013...The same thing happened with Fermi.

I'd go with the 580 3GB until January, swap over, and then swap over again


----------



## Icekilla

I'd do it if I paid for them








But since I'm "somewhat limited" by my father's "budget"... I don't think he'd be pleased to see me switching GPU's that quickly, even if I sold them. Also, considering how fast they'd depreciate... See where this is going?

BTW Masked, can you add me on steam please?


----------



## Recipe7

Your new avatar scares me Masked,


----------



## Masked

Quote:


> Originally Posted by *Icekilla*
> 
> I'd do it if I paid for them
> 
> 
> 
> 
> 
> 
> 
> But since I'm "somewhat limited" by my father's "budget"... I don't think he'd be pleased to see me switching GPU's that quickly, even if I sold them. Also, considering how fast they'd depreciate... See where this is going?
> BTW Masked, can you add me on steam please?


The issue that you face is, when Kepler is released, things will change because, like with the new CPU cores, significantly more can be done...You're genuinely doing yourself a disservice by not picking up Kepler, and the wait will be a LONG one...

If the same problems with Kepler arrive when Maxwell is started, we'll just have the same die but new VRMs etc.

So, I'd either wait till Kepler or pick up 1x 580 3GB...

What you could also do is pick up a 6950...I mean ATI's release (I can't say anything else) is soon...So you could wait for that as well.

Steam name is: maskedsin
Quote:


> Originally Posted by *Recipe7*
> 
> Your new avatar scares me Masked,


I just made our new backgrounds a few weeks ago and I liked the green//yellow alien









Almost used the red one!


----------



## Icekilla

I can't find you on steam Masked :/ ...unless you're the guy from Austria or whatever Steam says you're from









D: HD 6950? I'd get the 2GB one... I won't have CUDA for a few months







Sigh... If I take an AMD card, bye bye CUDA, and sadly, AMD APP is nowhere near as good or as well supported as CUDA.

GOD PLEASE UNLEASH OPENCL ALREADY!!! T_T


----------



## MKHunt

Blegh this RMA problem seems to be taking forever. The callback turned into an email asking for a picture of a ruler next to the inductors. I used a digital caliper instead. That was yesterday at ~3:50pm CA time and still no response. It'd be awfully nice to have a working gfx card before Thanksgiving.

The first RMA started on October 14. It's been 33 days. Ugh.


----------



## Masked

Quote:


> Originally Posted by *MKHunt*
> 
> Blegh this RMA problem seems to be taking forever. The callback turned into an email asking for a picture of a ruler next to the inductors. I used a digital caliper instead. That was yesterday at ~3:50pm CA time and still no response. It'd be awfully nice to have a working gfx card before Thanksgiving.
> The first RMA started on October 14. It's been 33 days. Ugh.


The one and only thing that even remotely bothers me about EVGA is that while their CS is amazing...none of the higher-ups are ever on time...It's always early or late...Never, ever on time.

My suggestion would be to give them a call and "remind" them...you'll get a call back shortly.

I know it's a hassle but, it's well worth it.


----------



## MKHunt

I was just told it is officially Koolance's problem by EVGA. This is... bad to say the least. There are obvious differences shown _IN PICTURES_ and the answer I get is, "It's a reference PCB so if the waterblock doesn't fit then you need to contact the waterblock manufacturer." That is the response word-for-word.

Kinda blown away.


----------



## Masked

Quote:


> Originally Posted by *MKHunt*
> 
> I was just told it is officially Koolance's problem by EVGA. This is... bad to say the least. There are obvious differences shown _IN PICTURES_ and the answer I get is, "It's a reference PCB so if the waterblock doesn't fit then you need to contact the waterblock manufacturer." That is the response word-for-word.
> Kinda blown away.


Actually that makes sense...Check your PM's in a few minutes.


----------



## MKHunt

Quote:


> Originally Posted by *Masked*
> 
> Actually that makes sense...Check your PM's in a few minutes.


If the reference spec has changed, then yes it would make sense. I was told that they pulled my serial from the same batch they assemble into HydroCopper cards and that they haven't had any fitment problems with those.

Watching my inbox.


----------



## Scorpion49

Quote:


> Originally Posted by *Masked*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Icekilla*
> 
> Talk about false advertising
> 
> 
> 
> 
> 
> 
> 
> 
> So... considering Maxwell is around the corner, it's not worth it to spend in the GTX 580's 3GB, therefore, the best option would be the GTX 570's 2.5GB, am I right?
> 
> 
> 
> It's not false advertising at all...It's simple math and has //ALWAYS// been that way...In reality, ATI started that lovely little exception so, if you want to be pissed at someone, fire off at AMD.
> 
> That being said, KEPLER is "just" around the corner and it's going to be basically a hybrid refresh...At this point, personally, I'd buy 1x580 gb and "deal" with it until Kepler is released then, jump over to that...It's actually what I intend to do myself.
> 
> Samples are going strong so far
Click to expand...

Yeah, definitely not false advertising, more like a "look at me" tactic. While it is technically true that there is 3GB of VRAM on the card, it is not 3GB per GPU, so you only get 1.5GB usable.

Personally I wouldn't recommend a 6990 to anyone, I had two and both died within hours of use. 6970's were fine and I have a 6870 right now that seems to be a solid midrange card.


----------



## MKHunt

I did a little digging on these new inductors... The datasheet looks VERY promising.

25°C max current: 65 Amperes
125°C max current: 49 Amperes

I believe the max current for the older VRMs is 35 Amperes. I'm now looking into having my block milled down a bit to accommodate the new inductors while preserving the direct contact cooling. However, I don't have the delta between the old VRMs and the new ones. Would anyone with a caliper or excellent measuring skills be willing to get a measurement from the PCB to the top of the inductors?
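If the derating between those two datasheet points is roughly linear (an assumption; check the actual datasheet curve), the rating at intermediate temperatures can be estimated like this:

```python
# Linear interpolation between the two datasheet points quoted above
# (65 A at 25C, 49 A at 125C). A sketch only: real derating curves
# are not necessarily linear.
def max_current_a(temp_c: float) -> float:
    """Estimated inductor current rating at a given temperature."""
    t0, i0 = 25.0, 65.0
    t1, i1 = 125.0, 49.0
    return i0 + (i1 - i0) * (temp_c - t0) / (t1 - t0)

print(max_current_a(75))  # midway point: 57 A
```

Even with derating, that comfortably clears the ~35 A figure quoted for the older parts.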


----------



## Frosty88

Just replaced my Asus GTX 580 Matrix with an EVGA GTX 590 Classified. Currently running at 675/900 on stock volts. Here's a screenie:


----------



## jdangond

Just upgraded to a 590 Hydro Copper currently running at 660/880. Would like to be added.


----------



## solsamurai

Looks clean in there! I like how uncluttered the tubing looks.


----------



## kot0005

Quote:


> Originally Posted by *jdangond*


Are you running your entire loop on a single 120mm rad?


----------



## ryder

Is there going to be a major advancement with the Kepler series over the current 500s? Or do you have to wait for Maxwell to see a major advancement in video cards?


----------



## Masked

Quote:


> Originally Posted by *ryder*
> 
> Is there going to be a major advancement with the Kepler series over the current 500s? Or do you have to wait for Maxwell to see a major advancement in video cards?


This was literally answered a few pages ago...Suggest you go back a couple and read


----------



## jdangond

Quote:


> Originally Posted by *kot0005*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jdangond*
> 
> 
> 
> 
> 
> are you running your entire loop on a single 120mm rad?

There's a 360 rad up top there


----------



## ryder

Quote:


> Originally Posted by *Masked*
> 
> This was literally answered a few pages ago...Suggest you go back a couple and read


I see.

Perhaps what I'll do is get a price-reduced 570 or 580 when the new Keplers come out, and keep my money for the major advancement in 2013 with Maxwell.


----------



## kot0005

Quote:


> Originally Posted by *jdangond*
> 
> There's a 360 rad up top there


I couldn't see the tubing go up, so I thought you had one rad!


----------



## jpongin

Hi GTX 590 owners, I just finished building my GTX 590 Quad SLI BF3 rig with these specs:

CPU: Intel Core i7 2600K @ 4.6Ghz (40C idle / 70C load)
GPU: Asus GTX 590 Quad SLI w/ 285.62 Drivers STOCK (no over-clocking whatsoever) (28C idle / 50C load)
Mobo: Asus P8P67 Pro
Mem: Corsair Dominator 8GB
OS: Windows 7 Pro
Build Log - http://hardforum.com...d.php?t=1646299

BF3 is perfectly smooth in campaign mode @ 2560x1600 / Ultra Settings. It's also butter smooth at ANY lower resolution.

BUT...

BF3 is not playable in 64/64 multiplayer in Ultra Settings preset on ANY map at 2560x1600. None. There's no network lag or anything like that. The four GTX 590 GPUs just struggle to render those frames at 2560x1600 on any multiplayer server with more than 50 players. When the game is "struggling" to render frames, I too am getting the "green flash". When I log the GPU utilization in MSI Afterburner, all four GPUs are averaging at near 95% - 100% utilization.

The only way I'm able to play it at 2560x1600 on the "Ultra Settings" preset is if I switch to "Custom" and crank the deferred antialiasing down to 2x instead of 4x. 4x will kill it at 2560x1600 even with the second most powerful Quad SLI setup. Even at 2x, it's only playable 90% of the time, when you're not in a "dense" firefight.

The only way to play 2560x1600 at Ultra Settings is if you completely turn off "deferred antialiasing". Then it's constantly at 60FPS (0 frame drop) in any multiplayer situation. But this SUCKS - how can GTX 590 QUAD SLI not handle deferred antialiasing for BF3 at 2560x1600 in a 64/64 person server with Ultra settings?

This leads me to two questions:
1) Am I doing anything wrong? If GTX 590 Quad SLI should run BF3 @ 2560x1600 butter smooth in Ultra Setting in a full 64/64 multiplayer server, then I'm all ears. Tell me what I should do to fix this.
2) If GTX 590 Quad SLI's do in fact struggle to run BF3 in Ultra Settings @ 2560x1600 in 64/64 multiplayer mode, then what SLI cards should I have gotten instead?

Lastly, I'd like to know if anyone has been able to keep BF3 above 60FPS (constant) @ 2560x1600 Ultra Preset in 64/64 Multiplayer. If so, I'd LOVE to know what you're running.


----------



## Masked

Quote:


> Originally Posted by *jpongin*
> 
> Hi GTX 590 owners, I just finished building my GTX 590 Quad SLI BF3 rig with these specs:
> CPU: Intel Core i7 2600K @ 4.6Ghz (40C idle / 70C load)
> GPU: Asus GTX 590 Quad SLI w/ 285.62 Drivers STOCK (no over-clocking whatsoever) (28C idle / 50C load)
> Mobo: Asus P8P67 Pro
> Mem: Corsair Dominator 8GB
> OS: Windows 7 Pro
> Build Log - http://hardforum.com...d.php?t=1646299
> BF3 is perfectly smooth in campaign mode @ 2560x1600 / Ultra Settings. It's also butter smooth at ANY lower resolution.
> BUT...
> BF3 is not playable in 64/64 multiplayer in Ultra Settings preset on ANY map at 2560x1600. None. There's no network lag or anything like that. The four GTX 590 GPUs just struggle to render those frames at 2560x1600 on any multiplayer server with more than 50 players. When the game is "struggling" to render frames, I too am getting the "green flash". When I log the GPU utilization in MSI Afterburner, all four GPUs are averaging at near 95% - 100% utilization.
> The only way I'm able to play it at 2560x1600 on the "Ultra Settings" pre-set, is if I switch to "Custom", and then crank down the deferred antialiasing down to 2x instead of the 4x. 4x will kill at 2560x1600 even with the second most powerful Quad SLI setup. Even at 2x, it's only playable 90% of the time when you're not in a "dense" firefight.
> The only way to play 2560x1600 at Ultra Settings is if you completely turn off "deferred antialiasing". Then it's constantly at 60FPS (0 frame drop) in any multiplayer situation. But this SUCKS - how can GTX 590 QUAD SLI not handle deferred antialiasing for BF3 at 2560x1600 in a 64/64 person server with Ultra settings?
> This leads me to two questions:
> 1) Am I doing anything wrong? If GTX 590 Quad SLI should run BF3 @ 2560x1600 butter smooth in Ultra Setting in a full 64/64 multiplayer server, then I'm all ears. Tell me what I should do to fix this.
> 2) If GTX 590 Quad SLI's do in fact struggle to run BF3 in Ultra Settings @ 2560x1600 in 64/64 multiplayer mode, then what SLI cards should I have gotten instead?
> Lastly, I'd like to know if anyone has been able to keep BF3 above 60FPS (constant) @ 2560x1600 Ultra Preset in 64/64 Multiplayer. If so, I'd LOVE to know what you're running.


Your VRAM is choking...At that resolution, if you want better results, trade the two 590s in for 3x 580 3GBs.


----------



## MKHunt

Does anybody have a HydroCopper that they have taken off and would be willing to sell? EVGA won't sell me one and milling a block to fit the new VRMs ended up close to $250.

Koolance makes one that fits BUT appears to have ZERO cooling on the inductors which is both idiotic and unacceptable. I imagine it went something like this:
Quote:


> "Nvidia has redesigned the 590 with taller inductors and more direct cooling for the inductors since they were getting too hot. Our blocks don't fit any more."
> 
> "If we designed them around the new inductors what would we have to do? What would users of the old card have to do?"
> 
> "We'd simply need to mill out about 1/3mm more in the inductor area. Users of old cards could just place pink (1.0mm) thermal pads there and still have good contact. What should we do?"
> 
> "Remove all cooling for the inductors and flip the water channeling on the interior of the card. This is clearly the best course of action."
> 
> Result:


I has the angers. And I kind of want a more conservative design like the HydroCopper or EK blocks. The MCP-35X has enough power for those blocks, 3 rads, and a Rasa block. I shot off an e-mail to EK asking if they had any measurements from their 590 block available or if they could confirm fitment with the taller inductors and have yet to hear back.

Oh, and then my mom backed into my car. Maybe I'll try to be an opportunist and get this bumper. I have a white four-door 330xi. It's been a bad day; I'm going to start drinking now.


----------



## jpongin

Quote:


> Your VRAM is choking...At that resolution, if you want better results, trade the two 590s in for 3x 580 3GBs.


Aw, that stinks. I got 590 Quad SLI *because* I wanted to game at 2560x1600 maxed. You'd think load balancing between 4 GPUs and 4 x 1.5GB of VRAM would decrease the VRAM fill rate (and keep it steadily below the 1.5GB ceiling).

I mean I understand that each GPU only has 1.5GB to work with, but this is exactly why spreading the rendering load over 4 nodes ought to keep it from going over 1.5GB VRAM.

I hope this is the case because then it can be optimized by software if the load balancing algorithm is indeed that inefficient.


----------



## MKHunt

Quote:


> Originally Posted by *jpongin*
> 
> Quote:
> 
> 
> 
> Your VRAM is choking...At that resolution, if you want better results, trade the two 590s in for 3x 580 3GBs.
> 
> 
> 
> Awe that stinks. I got 590 Quad SLI *because* I wanted to game at 2560x1600 MAXED. You'd think load balancing between 4 GPUs and 4 x 1.5GB VRAMs would decrease the VRAM fill rate (and keep it steadily below the 1.5GB ceiling).
> I mean I understand that each GPU only has 1.5GB to work with, but this is exactly why spreading the rendering load over 4 nodes ought to keep it from going over 1.5GB VRAM.
> I hope this is the case because then it can be optimized by software if the load balancing algorithm is indeed that inefficient.

In SLI and CrossFire VRAM is mirrored, not shared. Each GPU stores the same information as all the other GPUs so you have 1.5GB VRAM effective total.
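A quick way to reason about this (a hypothetical helper for illustration, not anything from the NVIDIA driver): because every GPU holds a full copy of the working set, the usable pool is the smallest single GPU's VRAM, not the sum:

```python
def effective_vram_mb(per_gpu_vram_mb):
    """Usable VRAM in an SLI/CrossFire setup.

    Resources are replicated on every GPU, so the effective pool is
    the smallest single GPU's VRAM, not the total across GPUs.
    """
    return min(per_gpu_vram_mb)

# Quad SLI GTX 590: four GPUs with 1.5GB each -> still only 1.5GB usable.
print(effective_vram_mb([1536, 1536, 1536, 1536]))  # 1536
```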


----------



## Recipe7

MKH, why $250? That's crazy. What happened to the $30-40?


----------



## emett

Quote:


> Originally Posted by *jpongin*
> 
> I hope this is the case because then it can be optimized by software if the load balancing algorithm is indeed that inefficient.


Can I ask you who you think is going to make that software?


----------



## MKHunt

Quote:


> Originally Posted by *Recipe7*
> 
> MKH, why 250$? That's crazy. What happened to the 30-40?


The $30-40 guy was going to do the work on company machines at a severely reduced cost and mark it as something he had done for himself. He was gone when I showed up, so rather than having me leave my number they got the VP of the company to deal with it and, of course, he wanted to do everything through the company itself. That was too expensive, so the VP referred me to another shop (which was cheaper, but wanted $130 just to set up the machinery since he only had CNC). I don't want to show up again and ask for the other guy after that; since he basically offered to cheat the company to help me out, I don't want to risk him getting in trouble.


----------



## CJL

I hope you get everything sorted MK.

On my side, lately I've been having these weird dips while folding. Previously I had 99% usage on both cores, and lately (I think since my system locked up a few days ago), core0 is dipping like crazy. I hope the card is not damaged.





What would be causing this all of a sudden? I haven't really run anything else since the crash. Over the weekend I'll do some more testing.


----------



## Recipe7

Quote:


> Originally Posted by *MKHunt*
> 
> The $30-40 guy was going to do the work on company machines at a severely reduced cost and mark it as something he had done for himself. He was gone when I showed up, so rather than having me leave my number they got the VP of the company to deal with it and, of course, he wanted to do everything through the company itself. That was too expensive, so the VP referred me to another shop (which was cheaper, but wanted $130 just to set up the machinery since he only had CNC). I don't want to show up again and ask for the other guy after that; since he basically offered to cheat the company to help me out, I don't want to risk him getting in trouble.


That's considerate of you. It wouldn't hurt to somehow get your contact's direct line, maybe he can still help you out.


----------



## JB3

I am building a new computer and am trying to decide between two or three 580s or one or two 590s. I only use one monitor, a Dell U2711 at 2560x1440, but I want to play games at max settings. How would two 580s compare to one 590 in performance? How about three or four 580s compared to two 590s? My monitor has DisplayPort; is there any advantage to the 590's mini DisplayPort output, or should DVI work just fine? I ordered an Asus Rampage IV Extreme X79 motherboard a few days ago, in case the motherboard has any impact on which video cards I should use.

Thanks in advance for the help!


----------



## Shinobi Jedi

I would think you're going to want a 580 card that has 3GB of VRAM to play at that resolution. (Not the 590's "3GB", which is really 1.5GB mirrored.)

Maybe even two, if you have the coin. The 590 is great for 1920x1080 120Hz/3D displays if you're a heavy FPS player who wants a butter-smooth 120fps/120Hz experience.

But for higher resolutions, you want a card that has 3GB of VRAM.


----------



## jpongin

Quote:


> Originally Posted by *MKHunt*
> 
> In SLI and CrossFire VRAM is mirrored, not shared. Each GPU stores the same information as all the other GPUs so you have 1.5GB VRAM effective total.


Wow really? That makes no software architectural sense! If you take four frames, and each of the four GPUs compute one frame, why then would GPU 1 reserve memory not only on MEM1, but also MEM2, MEM3, and MEM4? Why would GPU 2 need any references from GPU 1's frame in the memory space? That's just weird.

UNLESS, all 4 GPUs just use one memory set from one of the GPUs. Then that would make sense (but still super inefficient). But if all four GPUs have the *ability* to access their own memory set (1.5GB), it just makes no sense to replicate the memory allocation on all four sets (replication is both expensive and completely unnecessary unless you're working with critical transactions).

Maybe I need to read an SLI white paper to fully understand how SLI's load distribution is implemented. If my logic is incorrect, then by all means someone please educate me.


----------



## jpongin

So I ran some single-player experiments, and it's conclusive that VRAM is maxed out at the Ultra Settings preset @ 2560x1600. The three biggest suspects are:
1) Resolution
2) MSAA
3) Ambient Occlusion

It's a zero-sum game with those three settings: if you want MSAA, you need to lower your resolution and/or ambient occlusion. If you want resolution, you need to crank down your MSAA or ambient occlusion. The biggest factor, just as in [H]'s BF3 performance review, is MSAA. VERY VRAM-expensive.

So now the issue is memory management. Can NVIDIA optimize memory management by clearing the addresses of all immediately unused references to make room for the next frame? Can 4 GPUs help make that process go faster? One would think so.

Here were my results (VRAM was taken as a rough average)

SINGLE PLAYER VRAM EXPERIMENT

Swordbreaker - APC and Sniper Scene - 4x MSAA + HIGH FXAA - LOW SETTINGS (max motion blur + VSync ON)
SMOOTH at no drops below 60FPS
MEM1: 1360
MEM2: 1360
MEM3: 1360
MEM4: 1360

Swordbreaker - APC and Sniper Scene - 4x MSAA + HIGH FXAA - MEDIUM SETTINGS (max motion blur + VSync ON)
SMOOTH at no drops below 60FPS
MEM1: 1405
MEM2: 1405
MEM3: 1405
MEM4: 1405

Swordbreaker - APC and Sniper Scene - 4x MSAA + HIGH FXAA - HIGH SETTINGS (max motion blur + VSync ON)
VERY LAGGY not playable. VRAM is maxed. ~23FPS
MEM1: 1525
MEM2: 1525
MEM3: 1525
MEM4: 1525

Swordbreaker - APC deployment and Sniper Scene - 4x MSAA + HIGH FXAA - ULTRA (max motion blur + VSync ON)
VERY LAGGY not playable. VRAM is maxed. ~23FPS
MEM1: 1530
MEM2: 1530
MEM3: 1530
MEM4: 1530

Swordbreaker - APC and Sniper Scene - 2x MSAA + HIGH FXAA - ULTRA SETTINGS (max motion blur + VSync ON)
SMOOTH at no drops below 60FPS
MEM1: 1400
MEM2: 1400
MEM3: 1400
MEM4: 1400

Swordbreaker - APC and Sniper Scene - NO MSAA + HIGH FXAA - ULTRA SETTINGS (max motion blur + VSync ON)
SMOOTH at no drops below 60FPS
MEM1: 1360
MEM2: 1360
MEM3: 1360
MEM4: 1360
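For a back-of-the-envelope sense of why MSAA dominates those numbers, here is a rough estimate of one multisampled render target's size (a sketch assuming 4 bytes of RGBA color plus 4 bytes of depth/stencil per sample; a deferred renderer like BF3's multiplies this across several G-buffer targets, so treat it as a lower bound):

```python
def msaa_target_mb(width, height, samples, bytes_per_sample=8):
    """Rough size of one multisampled render target in MiB.

    Assumes 4 bytes RGBA color + 4 bytes depth/stencil per sample.
    """
    return width * height * samples * bytes_per_sample / 2**20

# 2560x1600 with 4x MSAA: a single target is already ~125 MiB,
# while 2x MSAA halves that -- consistent with the VRAM drop above.
print(msaa_target_mb(2560, 1600, 4))  # 125.0
print(msaa_target_mb(2560, 1600, 2))  # 62.5
```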


----------



## jpongin

Okay, after some SLI technology research, I've concluded these points:

1) SLI works on a master/slave pattern. The master sends frames (or parts of a frame, depending on whether you're using the split or alternate method) to the slaves to render. When the slaves finish rendering, they send the results back to the master; the master sorts everything out, outputs it to the screen, and the process repeats. http://en.wikipedia.org/wiki/Scalable_Link_Interface

2) According to NVidia, each GPU manages its own frame buffer memory (VRAM). There really is no "mirroring" (as mirroring implies replication, which would be ridiculous). So GPU1 uses MEM1, GPU2 uses MEM2, GPU3 uses MEM3, GPU4 uses MEM4, and so on. MEM1 does not replicate or mirror across MEM2, MEM3, and MEM4. That wouldn't make sense. http://www.slizone.com/page/slizone_faq.html#c26

So if Quad SLI cannot smoothly run BF3 @ 2560x1600 at the Ultra Settings preset at the native 60Hz refresh rate (a constant 60FPS), it means each GPU, with its 1.5GB frame buffer, is struggling to render at an effective 15FPS.
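The alternate-frame scheme described in point 1 can be sketched as a simple round-robin dealer (purely illustrative; `afr_schedule` is a made-up name, and, as later replies in the thread argue, the real driver's frame assignment is not necessarily this orderly):

```python
from itertools import cycle

def afr_schedule(num_gpus, num_frames):
    """Deal frames to GPUs round-robin, as in textbook Alternate
    Frame Rendering: GPU i renders frames i, i+n, i+2n, ..."""
    gpus = cycle(range(num_gpus))
    return [next(gpus) for _ in range(num_frames)]

# Four GPUs, eight frames: each GPU renders every fourth frame,
# so a 60FPS target asks each GPU to sustain only 15FPS.
print(afr_schedule(4, 8))  # [0, 1, 2, 3, 0, 1, 2, 3]
```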


----------



## Masked

Quote:


> Originally Posted by *jpongin*
> 
> Okay, after some SLI technology research, I've concluded these points:
> 1) SLI works as a master / slave pattern. The master sends frames ( or part of a frame depending on if you're using split or alternate methods ) to the slaves to render. When the slaves are finished rendering the frames, it then sends the results back to the master, the master then sorts everything out and outputs it to the screen, and then the process repeats. http://en.wikipedia.org/wiki/Scalable_Link_Interface
> 2) According to NVidia, each GPU manages it's own frame buffer memory (VRAM). There really is no "mirroring" ( as mirroring implies replication - which would be ridiculous ). So GPU1 uses MEM1, GPU2 uses MEM2, GPU3 uses MEM3, GPU4 uses MEM4, and so on. MEM1 does not replicate or mirror across MEM2, MEM3, and MEM4. That wouldn't make sense. http://www.slizone.com/page/slizone_faq.html#c26
> So if QUAD SLI cannot smoothly run BF3 @ 2560 x 1600 at Ultra Settings Preset at the native 60Hz refresh rate ( constant 60FPS ), what it means is 1 GPU with a 1.5GB frame memory buffer is struggling to render @ an effective 15FPS.


1) No.

There is no master/slave...It's purely a gap of communication.

I've made my own custom drivers forever and there is no primary GPU...Sometimes GPU 2 is the first to respond and sometimes GPU 1 is...That's why there are Vsync issues and lag//tessellation...

It's merely the time it takes to communicate from 1 to the other, that's it, that simple.

In a dual core card, that lane is pipe-lined thus it's faster but, power etc is slaved so you come out with a compromise.

The compromise on the 590 is that you lose Vram.

2) Of course there's no mirroring...It's actually more of a sharing process...Each group is assigned a module and is streamlined to that module.

~
It's not a question that the cores work better on dual core cards...The issue is merely one of your limitations.

You CANNOT smoothly run 1.5gb at that resolution, REGARDLESS of the card...It's just not happening.

I'd suggest 2x or 3x 580 3gb's ~

The buffer is literally 1750 ~ The 590's JUST fall short...Not really anything you can do about it other than work around it.


----------



## jpongin

Quote:


> Originally Posted by *Masked*
> 
> 1) No.
> There is no master/slave...It's purely a gap of communication.
> I've made my own custom drivers forever and there is no primary GPU...Sometimes GPU 2 is the first to respond and sometimes GPU 1 is...That's why there are Vsync issues and lag//tessellation...
> It's merely the time it takes to communicate from 1 to the other, that's it, that simple.
> In a dual core card, that lane is pipe-lined thus it's faster but, power etc is slaved so you come out with a compromise.
> The compromise on the 590 is that you lose Vram.
> 2) Of course there's no mirroring...It's actually more of a sharing process...Each group is assigned a module and is streamlined to that module.
> ~
> It's not a question that the cores work better on dual core cards...The issue is merely one of your limitations.
> You CANNOT smoothly run 1.5gb at that resolution, REGARDLESS of the card...It's just not happening.
> I'd suggest 2x or 3x 580 3gb's ~
> The buffer is literally 1750 ~ The 590's JUST fall short...Not really anything you can do about it other than work around it.


1) I should have been clearer. What I meant is that there is some type of logic controlling the traffic and frame sorting before everything is ultimately output through the "master". That stuff doesn't "just happen". For example, if I have one monitor connected through one cable and I use Quad SLI, there MUST be some logic that outputs frames through that one connection. In that sense, the logic controlling that particular gateway is acting as the master, with the slaves routing their rendered frames through it. If I'm completely wrong then correct me again, but if there is absolutely no master gateway from which the frames are sorted and sent through that single cable to the monitor, then how does it "just happen"? Perhaps there is logic at a lower level than the APIs NVidia offers for writing drivers. But if not, I find it hard to believe it's just magic.

2) I agree. 1.5GB of VRAM is not enough to buffer BF3 at those settings for one GPU rendering 15FPS (the minimum each GPU needs to hit 60FPS in Quad SLI). Every experiment leads me to that conclusion. If I go 2x 580 3GB, each GPU would need to render at minimum 30FPS; with 3x (tri-SLI), each GPU would need to handle at minimum 20FPS.
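That per-GPU arithmetic is just the target frame rate divided across the GPUs (a trivial helper for illustration; it ignores SLI communication overhead, so real scaling is worse than this ideal floor):

```python
def min_per_gpu_fps(target_fps, num_gpus):
    """Minimum frame rate each GPU must sustain under ideal AFR.

    Ignores inter-GPU communication overhead, so this is a lower bound.
    """
    return target_fps / num_gpus

# 60FPS target: quad SLI -> 15, dual -> 30, triple -> 20.
for n in (4, 2, 3):
    print(n, min_per_gpu_fps(60, n))
```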


----------



## Wogga

Quote:


> Originally Posted by *CJL*
> 
> I hope you get everything sorted MK.
> On my side, lately i've been having these weird dips while folding. Previously i had 99% usage on both cores and lately (i think since my system locked-up a few days ago), core0 is dipping like crazy. I hope the card is not damaged.
> 
> 
> What would be causing this all of a sudden? I haven't really run anything else since the crash. Over this weekend i'll do some more testing.


I had the same problem as a sign of an unstable OC. Now running [email protected] (0.975v monitored) and getting 98-99% always. And one more thing: you can get a higher PPD by adding "extra_parms=-advmethods".


----------



## CJL

Thanks for the tip. Went from 13xxx to 14xxx PPD, but it's taking a lot longer to render. I guess it's similar to bigadv on CPU.

I'm not OCing, so it's not that. Today I launched [email protected] and now it's back to normal. Gotta love PCs, never boot the same way twice. Probably had something loaded that interfered with folding. All is good.



BTW, jealous of your (and others') dual water-cooled 590s. I was going to do that recently when one online retailer got 12+ in stock, but they sold out in no time, so I missed out. Maybe I'll just wait for the 6 series.


----------



## timd78

I feel I have to chip in and say that while some comments about the 590's VRAM are accurate and constructive, I do think on these forums there are a hell of a lot of people jumping on the bandwagon QQing about the 590's VRAM. It is particularly bad when there could be other issues with the poster's card but all they hear back is "that will never work, not enough VRAM". That is way too brash a statement and not constructive to people solving their problems. This post isn't in relation to any immediate posts, but it's something I have wanted to say for a fair bit of time while reading these forums. The last couple of investigations show fairly well the effect of VRAM running out (and that effect is not subtle).

I am a Quad SLI 590 user and they can handle large resolutions just fine. You just have to be aware of their limitations (all things have their limits). I run MSI Afterburner and so keep a keen eye on the RAM usage. Playing in Surround at 5760x1080 takes a hell of a lot of grunt, and while one 590 does the job it can get a bit pushed at times; in BF3, for example, you have to turn the settings down to keep good frames. Having two really makes the experience more pleasurable. It also makes 3D Vision at good frames and settings plausible across 3 screens. All you have to do is steer clear of stupidly high AA settings and you are laughing.

At 5760 I can assure you the cards are not so powerful that you would have AA pinned up super high anyway. I prefer to keep maximum settings and be able to run at 80-90 frames on my 120Hz screens, which is a noticeably better experience than 50-60fps on medium. I have never been a massive fan of AA anyway, so I am quite happy using FXAA or some minimal setting when it permits. You just have to keep MSI Afterburner open and monitor your usage, which shouldn't be a problem if you have such a big desktop. Hell, you don't even need to do that, because you will soon know when you run out of RAM: it stutters to ****. For example, this happens in Crysis 2 with the DX11 texture pack on maximum settings. All you need do is turn a couple of things down one click and the problem is gone. Not that I found that game any fun anyway.

BF3, for instance, I run on high textures and it uses 1.2-1.3 gigs of RAM. I have run it on ultra textures, and while it's capping out the RAM I have not noticed any negative effect. To avoid stutter in multiplayer I therefore keep it on high settings and all is good. BF3 is a bad topic really, because after the beta they quite badly broke triple-screen support. Can you honestly say the difference between high and ultra textures is that much, visually?

Other games commonly don't even come close to using up all the VRAM. Skyrim, for example, runs at around 900 megs, maybe 1100. It is also worth reminding people that going from a single screen to triple does not triple the amount of VRAM used; it adds 250-350 to the usage most of the time, if I had to eyeball it.

If you want a quad SLI solution with lots of grunt and not too much noise, the 590 Quad comes recommended from me. Particularly if you want to use that extra grunt to play 3D Surround. Skyrim in 3D on triple screens really is a sight to behold.

So there you have it. My rant is over. Yes, the 1.5 gig of RAM is something to know about when you're getting into it, and you might need to fiddle with your settings, but no, it is not the end of the world, and yes, you can still have a fantastic setup with them. I would rate the VRAM issue less of an issue than simply having to wait for decent quad SLI profiling to arrive, as quad is a bit more fussy than a single 590, it would seem.

One of the reasons I went away from ATI and their dreadfully slow-to-arrive CrossFire profiles.


----------



## jpongin

Quote:


> You're reading into that statement too much, replication does occur.
> 
> http://developer.download.nvidia.com...s_2011_Feb.pdf


I stand corrected. Frame buffer memory replication *does* occur in SLI.

I can only think of two reasons for this: either it's more efficient to replicate in order to output frames from one frame buffer as opposed to routing them all from different un-replicated sets, or the four threads are sharing the same references. If it's the latter, I'm curious as to why they would. For frame syncing?
Quote:


> A note on GPU Memory in SLI
> In all SLI-rendering modes all the graphics API resources (such as buffers or textures) that would normally be expected to be placed in GPU memory are automatically replicated in the memory of all the GPUs in the SLI configuration. This means that on an SLI system with two 512MB video cards, there is still only 512MB of onboard video memory available to the application. Any data update performed from the CPU on a resource placed in GPU memory (for example, dynamic texture updates) will usually require the update to be broadcast to the other GPUs. This can introduce a performance penalty depending on the size and characteristics of the data. Other performance considerations are covered in the section on SLI performance.


----------



## Masked

Quote:


> Originally Posted by *jpongin*
> 
> Quote:
> 
> 
> 
> You're reading into that statement too much, replication does occur.
> http://developer.download.nvidia.com...s_2011_Feb.pdf
> 
> 
> 
> I stand corrected. Frame buffer memory replication *does* occur in SLI.
> I can only think of two reasons for this: Either it's more efficient to replicate in order to output frames from one frame buffer memory set as opposed to routing it all from different un-replicated sets, or the four threads are sharing the same references. If it's the latter reason, I'm curious as to why they would? For frame sync'ing?
> Quote:
> 
> 
> 
> A note on GPU Memory in SLI
> In all SLI-rendering modes all the graphics API resources (such as buffers or textures) that would normally be expected to be placed in GPU memory are automatically replicated in the memory of all the GPUs in the SLI configuration. This means that on an SLI system with two 512MB video cards, there is still only 512MB of onboard video memory available to the application. Any data update performed from the CPU on a resource placed in GPU memory (for example, dynamic texture updates) will usually require the update to be broadcast other GPUs. This can introduce a performance penalty depending on the size and characteristics of the data. Other performance considerations are covered in the section on SLI performance.
> 

I genuinely don't know how to dumb this down so, follow me.

In a standard SLI situation, slot to slot, the cards are primary/secondary etc...in a Dual GPU card, there is no true master/slave.

SLI USED to be basically half your screen...Each card would literally take, top/bottom or top/middle/bottom (three) etc etc etc...This doesn't happen anymore because the world figured out they do AFR (frame by frame) instead of SFR (Half/Half).

So right now, basically 90% of the world's SLI is enabled to do AFR.

AFR in a dual gpu card is truly RANDOM and this is what I was discussing.

I have 2 cards at work where generally, core 2 does the brunt of the work and at home I have a card where core 1 does 20-30% more work. It is genuinely, random.

In every dual GPU card, there's an inherent bridge...Always has been, always will be...It's typically @ 400mhz...But as I said, the frame buffer process is truly random, it has to be for the cards to work in a true tandem...As we all know, not all cores are equal.

I'm EXTREMELY familiar with the 7900GX2 because it was Alienware's first custom card when I was a lowly tech here...That's another infamous card that NEVER worked in Vsync, always had to be AFR.

Ram is split, it's ALWAYS been split because it has to be for the ram to be mirrored, if it wasn't then the SLI configuration would operate on the lower memory as max ramdec...So if you had 1500 on 1 card and 1200 on the other, 1200 becomes Max/Primary...Regardless of which core it is.

You're genuinely over-thinking how SLI works...

At any resolution higher than 1900, you very simply need more Vram...There's no real discussion to be had because it's a fact...

With 1.5gb you'll choke the VRAM and won't receive the results you're looking for, regardless of how many cards you dump on the project.

If you're going to do a single monitor then you're fine until the standards change with Kepler JUST down the road because, I have a very strong feeling, 1.5gb isn't going to cut it anymore.

Now, again, the genuine max ram usage for most games/apps atm is 1750...JUST over the 1500 which is why it actually performs "well" but, not as well as it should...

In my opinion, you're better off getting 2x 580 3gbs, overclock them and frag to the max.


----------



## MKHunt

Good news everyone!
Quote:


> Originally Posted by *EK CS*
> Thank you for your interest.
> 
> We are familiar with the PCB redesign Nvidia did, and all EK-FC590 blocks released within the last 6 months already have the necessary changes made and fit.
> 
> There is a 0.64mm gap between the inductors and the block according to my 3D model, which is more than enough.


----------



## jpongin

Thanks for the explanation Masked. I completely agree with you about VRAM and how it's used in games at large resolutions. I dig *deep* and I'm all about learning new stuff so two more questions to the experts:

1) On VRAM - I guess I'm still confused as to why the VRAM still needs to be mirrored in the first place. Do the threads share the same variables? Or is it faster to output frames from a replicated single frame buffer memory set? Or is it some other reason?

2) On AFR randomness - I understand the split and alternate methods, and how AFR is the leading and more popular one. So I guess my next question is if these frames are being displayed randomly, how come in a Quad SLI configuration, it appears to me that all my frames are being rendered in order? Is it the result of First-In-First-Out timing between the four GPUs? And if the frames are rendered more out of order than in order, is this what causes the "micro stuttering" effect? If it's truly random with no controller logic to order frames, am I right to conclude that this wouldn't horizontally scale well? Because increasing threads should increase the likelihood of randomness right?


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> I genuinely don't know how to dumb this down so, follow me.
> In a standard SLI situation, slot to slot, the cards are primary/secondary etc...in a Dual GPU card, there is no true master/slave.
> SLI USED to be basically half your screen...Each card would literally take, top/bottom or top/middle/bottom (three) etc etc etc...This doesn't happen anymore because the world figured out they do AFR (frame by frame) instead of SFR (Half/Half).
> So right now, basically 90% of the world's SLI is enabled to do AFR.
> AFR in a dual gpu card is truly RANDOM and this is what I was discussing.
> I have 2 cards at work where generally, core 2 does the brunt of the work and at home I have a card where core 1 does 20-30% more work. It is genuinely, random.
> In every dual GPU card, there's an inherent bridge...Always has been, always will be...It's typically @ 400mhz...But as I said, the frame buffer process is truly random, it has to be for the cards to work in a true tandem...As we all know, not all cores are equal.
> I'm EXTREMELY familiar with the 7900GX2 because it was Alienware's first custom card when I was a lowly tech here...That's another infamous card that NEVER worked in Vsync, always had to be AFR.
> Ram is split, it's ALWAYS been split because it has to be for the ram to be mirrored, if it wasn't then the SLI configuration would operate on the lower memory as max ramdec...So if you had 1500 on 1 card and 1200 on the other, 1200 becomes Max/Primary...Regardless of which core it is.
> You're genuinely over-thinking how SLI works...
> At any resolution higher than 1900, you very simply need more Vram...There's no real discussion to be had because it's a fact...
> With 1.5gb you'll choke the VRAM and won't receive the results you're looking for, regardless of how many cards you dump on the project.
> If you're going to do a single monitor then you're fine until the standards change with Kepler JUST down the road because, I have a very strong feeling, 1.5gb isn't going to cut it anymore.
> Now, again, the genuine max ram usage for most games/apps atm is 1750...JUST over the 1500 which is why it actually performs "well" but, not as well as it should...
> In my opinion, you're better off getting 2x 580 3gbs, overclock them and frag to the max.


Hey Masked,

Will 2x 580 3GB be able to rock 3 2560x1600 monitors at Ultra Settings and stay above 60fps? (Or say even 120fps when the day comes that 120hz displays can do that resolution?)

If not two cards, how about three?

Not that I plan on switching over. My next upgrade will be Kepler or later.

I'm just curious...

Also -

I don't know about anyone else, but I've been in a bit of a gaming funk since I got a chance to be in the last weekend beta of SW:TOR, two weeks ago.

Played some BF3 last night, which was great and helped. But I can't get SW:TOR off my mind. I'm not really an MMO player, but as an OG Star Wars geek and as a burgeoning screenwriter/games writer, I realized this game is going to take away my whole life. Because I'm probably going to roll every class simply to see how each story plays out. I've already warned the missus that the marriage is in jeopardy









I'm just going to get her into the game with me.

Lastly, I upgraded my Internet to the top tier speed. It was an extra $20 a month on top of what I'm already paying, so I said, Furk it. (Time Warner Cable RR Extreme for those in the SoCal or TW hoods) -

and Dam! That increase in upload speed pretty much doubled and tripled my average Kill score in BF3! Highly recommend it, or similar if you have the Robert DeNiro's for it.


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> Hey Masked,
> Will 2x 580 3GB be able to rock 3 2560x1600 monitors at Ultra Settings and stay above 60fps? (Or say even 120fps when the day comes that 120hz displays can do that resolution?)
> If not two cards, how about three?
> Not that I plan on switching over. My next upgrade will be Kepler or later.
> I'm just curious...


Yes.

The VRAM is a massive help in that situation...

3 is where it's hit or miss but, 2, definitely.
Quote:


> Originally Posted by *jpongin*
> 
> Thanks for the explanation Masked. I completely agree with you about VRAM and how it's used in games at large resolutions. I dig *deep* and I'm all about learning new stuff so two more questions to the experts:
> 1) On VRAM - I guess I'm still confused as to why the VRAM still needs to be mirrored in the first place. Do the threads share the same variables? Or Is it faster to output frames from a replicated single frame buffer memory set? Or is it some other reason?
> 2) On AFR randomness - I understand the split and alternate methods, and how AFR is the leading and more popular one. So I guess my next question is if these frames are being displayed randomly, how come in a Quad SLI configuration, it appears to me that all my frames are being rendered in order? Is it the result of First-In-First-Out timing between the four GPUs? And if the frames are rendered more out of order than in order, is this what causes the "micro stuttering" effect? If it's truly random with no controller logic to order frames, am I right to conclude that this wouldn't horizontally scale well? Because increasing threads should increase the likelihood of randomness right?


I honestly don't have any answers beyond what I've already said.

My understanding of SLI is that it's random...I know it used to be, if that's changed then I'm incorrect...I genuinely don't even think about it to the extent we're discussing...Apologies.


----------



## Shinobi Jedi

But the VRAM concern is only for resolution though. Not screen size, correct?

So, if I get a 27" or larger (if and when they're made) of a 120hz 1080p monitor, I shouldn't have a VRAM issue, yes?

Batman: AC is unlocked on Steam! Ahh Yeah, Sucka!


----------



## emett

Yes


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> But the VRAM concern is only for resolution though. Not screen size, correct?
> So, if I get a 27" or larger (if and when they're made) of a 120hz 1080p monitor, I shouldn't have a VRAM issue, yes?
> Batman: AC is unlocked on Steam! Ahh Yeah, Sucka!


Quote:


> Originally Posted by *emett*
> 
> Yes


Ummm, no...

What you have to realize is that your resolution scales to said screen size, so it's actually producing a larger image and thus using more VRAM.

So, I would make the same suggestion I made earlier...

While you may be "fine" you're actually on the verge of not being fine...I'd strongly suggest going 2x580 3gb's...


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> Ummm, no...
> What you have to realize is that your resolution scales to said screen size so it's actually producing a larger image thus, using more Vram.
> So, I would make the same suggestion I made earlier...
> While you may be "fine" you're actually on the verge of not being fine...I'd strongly suggest going 2x580 3gb's...


Ahh, man. See that's what I thought.









And here I was thinking that my Quad 590's would handle that new Asus 27" 120hz with Lightboost and I was going to pick it up when I had the $$.

I may get it anyways, and see how well they do and then upgrade if I have to. But now this has me re-evaluating my next purchase. Maybe instead I'll get the Z68 FTW and 2700K and stick with my BenQ. I wanted the Asus more for the size and HDMI 1.4 for PS3 3D, but I don't know how often I'll use that.

The other thing is, if these two cards can't handle a 27" 1080p 120hz, or struggle at least - and I have to swap cards if I upgrade displays regardless, then I may consider a 30" IPS display that can do 2560x1600. As I was going to hold off on that until I got some cards with more VRam.

Now I'm looking at needing a new mobo, CPU and set of cards? Just to run the games on the kind of display I want at the settings I want?

If I gotta do that much, then I'm thinking I may as well sit tight until Kepler and see how it benches before I mess with any of it.

Hey Masked, if I hook my system up to my 60" Plasma like I sometimes do (which runs at 60Hz) and run games with VSync off (getting massive tearing, obviously), would the performance results give me any idea of how the cards would handle that Asus 27" 120Hz? Or does the difference in refresh rate skew any potential gauge?

Worst comes to worst, I guess if I get the display, I'll go buy it at Fry's where I can return it within 14 days for a full refund if needed. The 8 mile drive through Los Angeles traffic to Burbank sucks, but it beats Newegg's monitor return policy.

Anybody try the new BF3 patch that came out last night? Any improvements on the 590's? I've been stuck working. And pretty much used up my break on this post. Good Times..

Back to the grind...


----------



## MKHunt

Quote:


> Originally Posted by *Masked*
> 
> Ummm, no...
> What you have to realize is that your resolution scales to said screen size so it's actually producing a larger image thus, using more Vram.
> So, I would make the same suggestion I made earlier...
> While you may be "fine" you're actually on the verge of not being fine...I'd strongly suggest going 2x580 3gb's...


Are you sure?

Everything I've seen and read and learned in my optical engineering courses, and experienced in my internships with L3 Communications, indicates that two monitors with the same native resolution but different physical sizes will use the same amount of VRAM, because they're driving the same number of pixels. The only difference is the size and density of said pixels.

For example a 1080p monitor with a 23" usable screen will have dimensions of 11.27"H x 20.05"W
1920 pixels across the 20.05" width yields pixels with a width of .0104". Since we know that LCD pixels are square (as opposed to CRT pixels, which are actually subpixels) we can multiply that number by 1080 and get the vertical dimension in inches. This math works out assuming that pixels have ZERO space between one another.
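The arithmetic above can be checked in a few lines (same 23" 1080p example; square pixels and zero inter-pixel gap assumed):

```python
# Pixel pitch of a 23" 16:9 1080p panel, assuming square pixels
# with zero gap between them, as in the example above.
width_in = 20.05               # usable width in inches
px_wide, px_high = 1920, 1080

pitch = width_in / px_wide     # width of one pixel in inches
height_in = pitch * px_high    # derived panel height

print(round(pitch, 4))         # ~0.0104 in per pixel
print(round(height_in, 2))     # ~11.28 in, matching the ~11.27"H figure
```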

But really this math is unnecessary since display resolution is a measure of how many distinct pixels are along each axis. You could have a 2' x 2' pixel that would make the card work just as hard as a .0104" pixel since the signals and the resources required to color _one pixel_ are constant from the viewpoint of the card.

Since a pixel is a pixel no matter its size, it explains why my laptop gets better frame rates when connected to a 1080p 42" display as opposed to its native 1200p 15.4" display. Any "upsizing" of the signal strength is done by the display hardware, which is part of the reason larger displays require more power (the other being the backlight and different lightpipe technologies used). If physical size changes across identical resolutions had an impact, console makers would be in serious, serious trouble. It would mean that anybody playing an FPS (without an fps cap) on a console would have a large advantage using a laptop sized screen as opposed to a 60" TV. This simply isn't true.

Think of it this way. You have two grids of LEDs measuring 1920 x 1080. One is comprised of 3.5mm LEDs and one of 5mm LEDs. Both are powered by signal controllers that output the exact amount of power needed for the LEDs to light. However, the controllers need a signal telling them _which_ LEDs to light. You can plug in a signal generator with a set amount of memory (for storing LED lighting patterns) and no matter which grid you attach it to, both will work flawlessly because you're controlling the same amount of LEDs even though they differ in physical size.

However, if you have a grid of 1920 x 1200 3.5mm LEDs and try to use the same controller with the same memory, you will either need more memory to store signal patterns for _more LEDs_, or a storage unit connected to the memory via a processing chip to do "smart swaps" of specific data as it is needed to light the LEDs. *This is the vram limit. It is 100% pixel count dependent.*
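The bolded point translates directly into framebuffer arithmetic; a minimal sketch (assuming a 32-bit color buffer, purely illustrative):

```python
# Framebuffer footprint depends only on pixel count and bytes per pixel,
# never on the physical size of the panel displaying it.
def framebuffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 1024**2

# Same resolution, any screen size -> same footprint:
print(round(framebuffer_mb(1920, 1080), 1))  # ~7.9 MB per 32-bit buffer
# More pixels -> more memory, regardless of panel size:
print(round(framebuffer_mb(1920, 1200), 1))  # ~8.8 MB
```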

Of course that analogy was excruciatingly oversimplified, but it conveys what is happening. It's the same problem with using a home fiber optic signal converter to try and effectively run a connection to a datacenter. It _can_ process the data with a strong processor, but more memory is needed to efficiently store conversion algorithms for more connections.

I go with your word on pretty much everything hardware related. But on this matter, I will not back down.

Edited for bold.


----------



## Scorpion49

Quote:


> Originally Posted by *Masked*
> 
> I honestly don't have any answers beyond what I've already said.
> 
> My understanding of SLI is that it's random...I know it used to be, if that's changed then I'm incorrect...I genuinely don't even think about it to the extent we're discussing...Apologies.


I did some digging to see if I could find the answer, here is what I found, kind of helpful to understand the load balancing:


----------



## Masked

Quote:


> Originally Posted by *MKHunt*
> 
> I go with your word on pretty much everything hardware related. But on this matter, I will not back down.
> Edited for bold.


Well, not every day do I get to learn something.

My "experience" with scaling actually comes from us hooking up "the big screen"...At 1080p it chews through Vram like a monkey in a banana factory...Tried the 590 and you could literally tell it was just choking at 1.5...Genuinely, it was just RIGHT at the limit.

Swapped to a 3gb card, haven't had an issue scaling anything yet.

I think you may be right that it doesn't scale by the size of the screen but, by the overall resolution...However, when you reach resolutions like 1080p//720p it's dependent on the size of said interface.

~~

SFR vs. AFR ~ I actually sent an email out last night to one of the lead designers at Nvidia to see if I could get an answer because, ironically, as we were having this discussion I found that basically nobody in my entire "clique" actually has this answer...Which bugs me.

So, I petitioned for an official answer with the literature involved.

I do know for a 100% fact that AFR was originally random...It may not be now...


----------



## Scorpion49

Quote:


> Originally Posted by *Masked*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MKHunt*
> 
> I go with your word on pretty much everything hardware related. But on this matter, I will not back down.
> Edited for bold.
> 
> 
> 
> Well, not every day do I get to learn something.
> 
> My "experience" with scaling actually comes from us hooking up "the big screen"...At 1080p it chews through Vram like a monkey in a banana factory...Tried the 590 and you could literally tell it was just choking at 1.5...Genuinely, it was just RIGHT at the limit.
> 
> Swapped to a 3gb card, haven't had an issue scaling anything yet.
> 
> I think you may be right that it doesn't scale by the size of the screen but, by the overall resolution...However, when you reach resolutions like 1080p//720p it's dependent on the size of said interface.
> 
> ~~
> 
> SFR vs. AFR ~ I actually sent an email out last night to one of the lead designers at Nvidia to see if I could get an answer because, ironically, as we were having this discussion I found that basically nobody in my entire "clique" actually has this answer...Which bugs me.
> 
> So, I petitioned for an official answer with the literature involved.
> 
> I do know for a 100% fact that AFR was originally random...It may not be now...
Click to expand...

I think it was back around 2006; I remember hearing the same thing, and it was a problem with DX9 not allowing more than 3 pre-rendered frames, so it was basically a toss-up. As far as the VRAM, what were you playing on that screen? I hit the VRAM limit on my 570's at 1920x1200 maxing out Crysis 2 with DX11 and the high-res pack, but it did nothing to performance at all; it rode 1280MB like it was going out of style. It took 3 570's to have enough GPU power to show it, but it still had zero effect on playability. Were you trying to run on a 120/240Hz TV with interpolated frames that might have made it look like it was choking?


----------



## lollygag

.


----------



## kot0005

Follow my Build log over here If you guys are keeeen!!
http://www.overclock.net/t/1172984/gearing-up-for-swtor-arming-my-storm-trooperb-build-log#post_15763763

Also, my new GTX 590: it's an OEM, Nvidia-branded, as it came with my new Alienware (which was absolutely free)
Proof:


----------



## iARDAs

Are you guys extremely happy with your 590s?


----------



## Scorpion49

Quote:


> Originally Posted by *iARDAs*
> 
> Are you guys extremely happy with your 590s?


I am. It's the only GPU I've bought in the last ~2 years that has completely lived up to my expectations going in. The 6990 sure didn't.


----------



## TheBlindDeafMute

I miss my 590. It made me feel special,


----------



## rush2049

I am not extremely happy with it. I am comfortable with its performance, yeah that's what I feel like.


----------



## iARDAs

Quote:


> Originally Posted by *Scorpion49*
> 
> I am. Its the only GPU I've bought in the last ~2 years that has completely filled my expectations compared to what I thought going in. The 6990 sure didn't.


Glad to see that your i5 2500k @ 4.5 is holding 590 well.

I will either go for a 590 or wait for 680 or 690 or whatever it will be.


----------



## Wogga

as an upgrade from AGP x1950 pro, quad 590 feels great:thumb:


----------



## MKHunt

Quote:


> Originally Posted by *Wogga*
> 
> as an upgrade from AGP x1950 pro, quad 590 feels great:thumb:


I moved from a 9600m and core 2 duo 2.8ghz mobile. The difference made my hair stand on end.


----------



## kzinti1

Quote:


> Originally Posted by *iARDAs*
> 
> Are you guys extremely happy with your 590s?


I certainly am! I'll be even happier with the next greatest there ever was (will be). And the next, and the next, on and on and on, ad infinitum.


----------



## Masked

Quote:


> Originally Posted by *Scorpion49*
> 
> I think it was back in around 2006, I remember hearing the same thing and it was a problem with DX9 that didn't allow for more than 3 pre-rendered frames so it was basically a toss up. As far as the Vram, what were you playing on that screen? I got to the Vram limit on my 570's at 1920x1200 maxing out crysis 2 with DX11 and high-res pack but it did nothing to performance at all. it rode 1280MB like it was going out of style. It took 3 570's to gain enough GPU power to show it but it still had 0 effect on playability. Were you trying to run on a 120/240hz tv with interpolated frames that might have caused it to look like it was?


I was told to look for an answer on Monday so, Mark or Tim...Which means that it will be about 10 pages of technical crap but, it will be a good, solid answer.

As far as on the big screen...Our office is basically a giant L...My office is basically at the end of the L next to our server farm and the interns occupy the bottom of the L so, it's a very open, yet personal office...I.E. everyone has their own space etc etc etc.

We have a sample Aquos and the thing is F'ing massive...I can't give up the size because it's not out yet, but we throw everything on there.

Fraps actually can't run on the screen because of resolution issues so, BF3 games are out but, the interns freely stream practically whatever they want on it...2 of them are on "pro" CoDBO teams so, they've transferred over to MW3 and they throw their games up there...SC2 games are streamed...Netflix...3 of the interns are RPI graduates so, we're tapped into their database (...I'd give a nut for their cloud capacity)...We stream anything and everything...

We tried allowing family to come to like a movie night, and Aquos slapped me with an NDA requirement that's literally 50 pages long, with a signature required on like 20 of them...So we only got like their brothers/sisters to come...Ended up watching Despicable Me...Which kicks some serious cojones at 1080p...Especially when the little girl is like "IT'S SQUISHY" in the theme park...I laughed so hard I spit water everywhere. ~ I also got so tired of them getting rickrolled on the big screen, I literally blocked ALL INSTANCES of that crap from our servers...permanently.

What happens is, 1080 is always 1080, however when I was using the 590 as a mediapc it just couldn't hack it...I think a lot of it has to do with the SIZE of the screen considering the resolution is "fixed" but, I had to go 3gbs...So, now our mediabox is SLI 580 3gb's and it has 5tb's of storage which are almost full again...

Come Kepler you'll start to see a trend..."Bigger is better" which is why I keep saying, make sure you have a 3gb in reserve because 1.5 just isn't enough for anything that's coming our way.


----------



## Smo

Quote:


> Originally Posted by *Masked*
> 
> I was told to look for an answer on Monday so, Mark or Tim...Which means that it will be about 10 pages of technical crap but, it will be a good, solid answer.
> As far as on the big screen...Our office is basically a giant L...My office is basically at the end of the L next to our server farm and the interns occupy the bottom of the L so, it's a very open, yet personal office...I.E. everyone has their own space etc etc etc.
> We have a sample Aquios and the thing is F'ing massive...I can't give up the size because it's not out yet but, we throw everything on there.
> Fraps actually can't run on the screen because of resolution issues so, BF3 games are out but, the interns freely stream practically whatever they want on it...2 of them are on "pro" CoDBO teams so, they've transferred over to MW3 and they throw their games up there...SC2 games are streamed...Netflix...3 of the interns are RPI graduates so, we're tapped into their database (...I'd give a nut for their cloud capacity)...We stream anything and everything...
> We tried allowing family to come to like a movie night and Aquios slapped me with an NDA requirement that's literally 50 pages long with a signature required on like 20 of them...So, we only got like their brothers/sisters to come...Ended up watching Despicable Me...Which, kicks some serious cohones at 1080p...Especially when the little girl is like "IT'S SQUISHY" in the theme park...I laughed so hard I spit water everywhere. ~ I also got so tired of them getting rickrolled on the big screen, I literally blocked ALL INSTANCES of that crap from our servers...permanently.
> What happens is, 1080 is always 1080, however when I was using the 590 as a mediapc it just couldn't hack it...I think a lot of it has to do with the SIZE of the screen considering the resolution is "fixed" but, I had to go 3gbs...So, now our mediabox is SLI 580 3gb's and it has 5tb's of storage which are almost full again...
> Come Kepler you'll start to see a trend..."Bigger is better" which is why I keep saying, make sure you have a 3gb in reserve because 1.5 just isn't enough for anything that's coming our way.


I want to work for you.


----------



## Xyphyr

Quote:


> Originally Posted by *TheBlindDeafMute*
> 
> I miss my 590. It made me feel special,


Trade + cash?


----------



## jpongin

Quote:


> SFR vs. AFR ~ I actually sent an email out last night to one of the lead designers at Nvidia to see if I could get an answer because, ironicaly as we were having this discussion I found that basically nobody in my entire "click", actually has this answer...Which, bugs me.
> So, I petitioned for an official answer with the literature involved.
> I do know for a 100% fact that AFR was originally random...It may not be now...


Thanks for reaching out and looking into this. Maybe there's a difference in syntax, like if rendering means just producing the frames before they've been sent to the monitor. Maybe there's a later process that takes these randomly rendered frames and orders them before sending them to your monitor. Either way, it would be good learning.

Let's talk about something simpler: is anyone here using the latest Nvidia drivers but still able to modify their core voltage? I have the Asus brand; do I need some type of BIOS update in order to bypass Nvidia's voltage lock? I'm really bummed that I cannot clock my GTX 590 Quad SLI beyond stock because I'm stuck at 925mV.


----------



## RagingCain

Just stopped by to say hai, and happy Thanksgiving my ex-bros.

Just read last few posts:

VRAM usage is not 100% based on resolution. However, physical monitor size makes no difference. In actuality, though, image quality can look crappier on larger monitors due to lower PPI. Stretching a 640x480 video to full screen on a 1920x1080 display is very similar to stretching a 1920x1080 image onto a 40"+ monitor / TV.

Other things to consider: brute-force AA works by increasing the image size by 2x, 4x, 8x etc., then downsampling it to fit your original resolution, done entirely in video memory. *AA is the biggest VRAM-using component after textures*. After that it's shadow resolution...the only game I know that points out the megapixel quality of its in-game shadows is Metro 2033. Go figure, it's also one of the hardest games on GPUs.

AA Evolution:
SuperSampling / Full-Scene AA -> MultiSampling AA -> Newer ones like CQ / EQ / MsAAA / FXAA
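As a rough sketch of why the brute-force (supersampling) end of that evolution is so VRAM-hungry: buffer memory scales with the AA factor, since the scene is rendered at a multiple of the target pixel count before being downsampled (illustrative 32-bit buffer math, not measured numbers):

```python
# Supersampling renders the scene at N times the pixel count of the
# target resolution, then filters it back down - so buffer memory
# scales linearly with the AA factor.
def ssaa_buffer_mb(width, height, factor, bytes_per_pixel=4):
    return width * height * factor * bytes_per_pixel / 1024**2

for f in (1, 2, 4, 8):
    print(f, round(ssaa_buffer_mb(1920, 1080, f), 1))
# factor 1 -> ~7.9 MB, factor 4 -> ~31.6 MB, factor 8 -> ~63.3 MB
```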

BF3 multiplayer doesn't seem to be fully optimized. It's not entirely GPU, or even drivers, or even CPU bottlenecks. I anticipate that lowering the graphics a little will increase playability a ton. Multiplayer has always been harder to run than singleplayer, but we haven't ever gotten definitive answers why; I believe there is a bottleneck in software / thread optimization.

The green flicker is from the game. Both AMD and Nvidia guys have it.









Alternatively, consider lowering Post Processing or Deferred AA. They smooth lines from AA by maybe another 10%, but easily cost 20-30% of FPS; potentially even more with moving people and tanks. Since textures are a HUGE percentage of VRAM used, I would begin tweaking there. Then move on to dropping AA to 2x, and turn off any post/deferred processing with AA. Then go down to medium shadows and give it another try. This cuts image quality, of course, but it takes it away from things you aren't hugely focused on at any one time.

Quad-SLI GPU rendering is usually crippled. The reason is that performance optimization usually goes no further than Tri-SLI; there simply are not many customers beyond that. Unfortunately, the 590 is a pair of watered-down 580 GPUs, so with performance cut to roughly 75%, these slower GPUs have to do more work than if the workload were optimized to be distributed evenly among the 4 GPUs. Something that helps is changing the maximum number of pre-rendered frames to 4, 5, up to 8; test what works better. This can be done in the drivers, and always try to use the latest drivers with the latest games too.


----------



## kot0005

Join me up for the 590 owners club please!

Proof:
Brand: unbranded OEM card from Dell (took it out of an Alienware that I got for free)

It's underwater now







GPU-Z validation Link: http://www.techpowerup.com/gpuz/h2c8k/


----------



## SirWaWa

who makes the best 590?


----------



## MKHunt

Quote:


> Originally Posted by *SirWaWa*
> 
> who makes the best 590?


All 590s are reference and produced (PCB) by nvidia. There is no 'best.' Buy based on price/availability/warranty/stickers.


----------



## emett

Palit is the best 590 because it's the cheapest!


----------



## kot0005

I will vote for 590 hydro copper just because the Evga logo on the waterblock lights up


----------



## chaosneo

I am extremely happy with mine, as it suits my purpose: a single card that carries dual GPUs, triple-screen capable, and definitely powerful. While it definitely can't compare with two GTX 580s in performance, it consumes less power and generates less heat.


----------



## kzinti1

Quote:


> Originally Posted by *MKHunt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SirWaWa*
> 
> who makes the best 590?
> 
> 
> 
> All 590s are reference and produced (PCB) by nvidia. There is no 'best.' Buy based on price/availability/warranty/stickers.
Click to expand...

There is, without even the slightest doubt, "The Best GTX590." It's the one whose company has the balls to offer a Lifetime Warranty. There is only one.


----------



## SirWaWa

Quote:


> Originally Posted by *kzinti1*
> 
> There is, without even the slightest doubt, "The Best GTX590." It's the one whose company has the balls to offer a Lifetime Warranty. There is only one.


evga?


----------



## Scorpion49

Quote:


> Originally Posted by *SirWaWa*
> 
> evga?


Of course.

My 590 will be back running soon, and I'm really glad for it. I figured I could scrape by on an HD 6870 for a while since my main rig is down, but it's just too slow after using a 590. I don't WANT to deal with 50fps in a DX10 game that barely stressed one core on my beast









It's leak testing right now! Here is a shot of the tubing mock-up and what it will look like when it's done.


----------



## emett

very nice!


----------



## MKHunt

Oh man that Raystorm looks so nice.

My EK block is going to mess up the flow of my tubing in a big way


----------



## SirWaWa

after reading that some evga 590's had voltage issues (not enough) it turned me off


----------



## RagingCain

Quote:


> Originally Posted by *SirWaWa*
> 
> after reading that some evga 590's had voltage issues (not enough) it turned me off


Best bet is to get the cheapest / best warranty. They are all technically the same. Especially if you are not water cooling.


----------



## solar0987

Would I be better off buying a 590 or 2 560 Ti 448-cores? Money is not a factor.
It will be watercooled also. Not the EVGA Hydro Copper;
probably the Gigabyte one, as it comes with a handy free mouse!


----------



## Masked

Quote:


> Originally Posted by *solar0987*
> 
> Would i be better buying a 590 or 2 560 ti's 448 core? money is not a factor.
> It will be watercooled also. Not the evga hydrocopper.
> Probably the gigabyte one as it comes with a handy free mouse!


2x580 3gb's would be your best bet.

I'd go EVGA due to the warranty...


----------



## MKHunt

Quote:


> Originally Posted by *solar0987*
> 
> Would I be better off buying a 590 or two 560 Ti 448-core cards? Money is not a factor.
> It will be watercooled as well, though not with the EVGA Hydro Copper.
> Probably the Gigabyte one, as it comes with a handy free mouse!


Between those two options, probably a 590 since you'll have an extra 126 cuda cores. If you're open to anything, 2x 580 3gb like Masked said.


----------



## Krazeswift

Just downloaded the current 290.36 beta; it seems the max voltage has dropped back to 963mV.


----------



## Masked

Quote:


> Originally Posted by *Krazeswift*
> 
> Just downloaded the current 290.36 beta, seems that the max voltage has dropped back to 963mv


Voltage isn't going to be unlocked for a while, I'm afraid.

Unless EVGA decides to unlock the BIOS...It's just not happening.

Unfortunately, I think I'm going to re-write the drivers myself to allow for more juice...

That's going to take some serious time though, especially considering the code catastrophe the last set was.


----------



## solar0987

So the 590 isnt worth it?


----------



## Masked

Quote:


> Originally Posted by *solar0987*
> 
> So the 590 isnt worth it?


In relation to what?

As a single-card solution, even driver-locked...It beats a stock 6990 in MOST real-world situations, including gaming...Once the 6990 is OC'd, well, that's a different story.

Voltage regulations are in place because "we" as a community decided to turn the voltage up PAST what Nvidia recommended, and in a fit of bad parenting, Nvidia smacked us with a voltage lock + driver-controlled stability etc.

You can go beyond the voltage lock and OC the card, which at about 600-800MHz and 0.99V is actually a VERY good performer...But it takes some serious effort to do this.

If you have the cash and don't need a single-card solution, then I'd go with 2x 580 3GBs, as they're the "ultimate" performance solution right now.

It depends on your needs, honestly.


----------



## lollygag

Hello. I have a pair of EVGA 590 Classifieds. I don't want to overvolt; I don't even really want to overclock. I've followed RCain's topics at the EVGA forums and here, and I've read through much of this thread and all of the 590 flashing/overclocking thread, and I still have yet to find the answer.

I have 1 of 4 GPUs with a max voltage of 0.913V; the other 3 are at 0.925V. Why? I have no problem flashing the cards if it will fix it. I've edited my BIOS and set the min voltage to 0.925V, but I'm wary of flashing it because I don't know enough about it, and I don't know that it will even change what I want: the max voltage for GPU 1 to be 0.925V.


----------



## solar0987

Quote:


> Originally Posted by *Masked*
> 
> In relation to what?
> As a single-card solution, even driver-locked...It beats a stock 6990 in MOST real-world situations, including gaming...Once the 6990 is OC'd, well, that's a different story.
> Voltage regulations are in place because "we" as a community decided to turn the voltage up PAST what Nvidia recommended, and in a fit of bad parenting, Nvidia smacked us with a voltage lock + driver-controlled stability etc.
> You can go beyond the voltage lock and OC the card, which at about 600-800MHz and 0.99V is actually a VERY good performer...But it takes some serious effort to do this.
> If you have the cash and don't need a single-card solution, then I'd go with 2x 580 3GBs, as they're the "ultimate" performance solution right now.
> It depends on your needs, honestly.


I really only wanted one card, but I'll work with two if I have two. All I do is game/fold, and it's on a 23" 1920x1080 monitor. I just want 80-100fps in anything I play, with max graphics/AA/sampling etc. Will the 590 accomplish that?
Oh, and GREEN forever; AMD is not an option at all!!!!


----------



## MKHunt

Quote:


> Originally Posted by *solar0987*
> 
> I really only wanted one card, but I'll work with two if I have two. All I do is game/fold, and it's on a 23" 1920x1080 monitor. I just want 80-100fps in anything I play, with max graphics/AA/sampling etc. Will the 590 accomplish that?
> Oh, and GREEN forever; AMD is not an option at all!!!!


I think the 590 meets all those criteria, but you might need to drop AA down one notch from max. It has the power, just limited-ish VRAM, though at 1080p it's very workable with the programs out at the moment. I was getting ~15-15.5k PPD per core with the core at 680 and memory at 884, IIRC.


----------



## emett

The 590 is a great card and fills a spot in the market. I paid $800 AUD for mine; the MSI 580 Lightning is $700 AUD. So it's a bargain when you look at it like that.
Also, I've never had any issues with the card, and I've had it 4 months.


----------



## Scorpion49

Quote:


> Originally Posted by *lollygag*
> 
> Hello.. I have a pair of EVGA 590 Classifieds.. I dont want to overvolt.. I dont even really want to overclock.. Ive followed RCains topics at the evga forums and here and Ive read through much of this thread and all of 590 flashing overclocking thread and still have yet to find the answer.
> 
> I have 1 of 4 gpus with a max voltage of .913 the other 3 are .925. Why? I have no problem flashing the cards if it will fix it.. Ive edited my bios and set the min voltage to .925 but Im wary to flash it because I dnt know enough about it and I dont know that its even going to change what I want.. the max voltage for gpu 1 to be .925


Because that's what it takes for that core to run at that speed. Why worry about it? Forcing it to run at a different voltage will at best run it hotter, and at worst destabilize it, unless you bump up the clock speed to match.


----------



## Krazeswift

Quote:


> Originally Posted by *Masked*
> 
> Quote:
> 
> > Originally Posted by *Krazeswift*
> > 
> > Just downloaded the current 290.36 beta, seems that the max voltage has dropped back to 963mv
> 
> Voltage isn't going to be unlocked for a while, I'm afraid.
> 
> Unless EVGA decides to unlock the BIOS...It's just not happening.
> 
> Unfortunately, I think I'm going to re-write the drivers myself to allow for more juice...
> 
> That's going to take some serious time though, especially considering the code catastrophe the last set was.

Sorry, I meant with the BIOS mod. I was able to hit a max of 975mV for a while with previous drivers, but it seems NVIDIA has knocked it back down to 963mV.

Modded drivers sounds promising though


----------



## kzinti1

I bought mine so I didn't have to go through the hassle of running two GTX 580s in SLI. I'm through with anything other than a single-card solution.
When I first bought this card I had the ability to control the voltage in MSI's Afterburner. I don't know what I did, but it no longer works. My max voltage is now 0.9380V and the idle is 0.8750V.
This is the second-coolest card I've ever owned. The MSI GTX 580 Lightning Xtreme 3GB is the coolest-running card I've ever owned. It takes some hard work to get that card to top 70°C, which is very good since it can't be watercooled. It has one monster of an oversized non-reference PCB: 12"x5"x1.77". It wouldn't fit in my Cooler Master HAF-X until I removed the side fan and shroud. The mid-case VGA cooling fan won't fit, either.


----------



## Mongol

Just came in:










Won't be installed until the remainder of my X79 gear arrives...may take some time as I'm waiting on EVGA to release the Classified.


----------



## CapitanPelusa

Hey fellow GTX590 owners.

I was wondering what GPU usage % are you guys getting at 1080p on Ultra on BF3. My 590 is hovering around 50-70% always with some 98% spikes. Framerate not too good. =(


----------



## emett

Both my gpu cores sit at 98% running bf3 @1080p all settings maxed and no vsync. My gpu temps playing this are 70-72.
Sounds like you either have v-sync on or your CPU is a bottleneck.


----------



## Wogga

Yesterday I tried the 290.36 drivers. After a clean install, everything except the wallpaper was pixelated, as if I were running 800x600 or even lower while the resolution was set to 1920x1200. A reboot didn't help, lowering the resolution and then changing it back didn't help, and playing with the settings via the NV control panel didn't help. I also tried to play BF3, and... lol, it looked like Quake on my first PC (MMX 200MHz and a Rage 3D).
It was fun, but I've gone back to 285.79.
That's my bad experience with 290.36 on quad.


----------



## L D4WG

Quote:


> Originally Posted by *Wogga*
> 
> Yesterday I tried the 290.36 drivers. After a clean install, everything except the wallpaper was pixelated, as if I were running 800x600 or even lower while the resolution was set to 1920x1200. A reboot didn't help, lowering the resolution and then changing it back didn't help, and playing with the settings via the NV control panel didn't help. I also tried to play BF3, and... lol, it looked like Quake on my first PC (MMX 200MHz and a Rage 3D).
> It was fun, but I've gone back to 285.79.
> That's my bad experience with 290.36 on quad.


Hey Wogga, I've just installed the 290.36 beta drivers and I'm not having the issues you are.

I attempted a clean install the first time, and after a few seconds the screen just went black and stayed there until I manually restarted.
It came back up at a low resolution (it looked like it had removed the old drivers but not installed the new ones).

So I ran the installer again, and this time it gave me some more menu options etc. and asked me to click restart when it was done.

After it came back up this second time my resolution was fixed, and BF3 seems fine; no issues now.

Give them another go.


----------



## evilmustang66

I will miss owning my GTX 590 Quad SLI pack. I hadn't even owned them six months when pop goes card one, plus the DVI port on the second. No, they were not OC'd or overvolted; I was surfing the web doing college homework. So I RMA'd them the first time. When they came back I tested each card one at a time, then installed the EVGA fan unlocker firmware etc. from their site. Then I restarted, went to put them in SLI mode, and boom, I get nice lines from top to bottom. So I RMA'd that set. Now the third set was already doing the same thing from the get-go, so I got offered a refund. Wow, I'll miss being an owner of a GTX 590 (the model was the 1598). Thank you and have a wonderful holiday.


----------



## kot0005

Load, highest fan speeds:









Load, lowest fan speeds:









Idle, lowest fan speeds:


----------



## aguante84

Are all of you using the Hydro Copper version, or is anybody using the fan version? I ask because I bought a fan version (only one), but I've read everywhere that this video card has some trouble with temps and fan %. I have an NZXT Phantom. What do you suggest so I can use this card at MAX SETTINGS and try to overclock, if possible, without any problems?

Sorry for my English; I'm from Bogotá, Colombia.


----------



## Scorpion49

Quote:


> Originally Posted by *aguante84*
> 
> Are all of you using the Hydro Copper version, or is anybody using the fan version? I ask because I bought a fan version (only one), but I've read everywhere that this video card has some trouble with temps and fan %. I have an NZXT Phantom. What do you suggest so I can use this card at MAX SETTINGS and try to overclock, if possible, without any problems?
> 
> Sorry for my English; I'm from Bogotá, Colombia.


The fan version worked fine for me, going water was mostly for noise and appearance. Search for the GTX 590 overclocking thread for detailed info on how to overclock.


----------



## kzinti1

No heat problem with the EVGA in my sig, on air. I've used MSI's Afterburner OSD so long I don't even bother turning it off for any gaming. No temp problem there. I keep GPU-Z running in the background, logging to file, and it seems to agree with Afterburner's readings.
I may add a waterblock after Ivy Bridge arrives (this one: http://www.performance-pcs.com/catalog/index.php?main_page=product_info&cPath=59_971_240_580&products_id=32269 , unless I find out something bad about it). If I do, I'll really miss the lit-up Nvidia logo in green and "GeForce" in white. A simple but very classy touch there.
BTW, which is GPU #1? The one next to the power connectors, or the other one towards the back of the case? On mine, the one at the back of the case runs at most 2°C hotter than the front one. That's natural, since the exhaust that makes it through the card, and not just into the case, flows from the front and out the back of the card. I haven't seen a schematic differentiating one core from the other.


----------



## MKHunt

Never had problems with my 590 when it was on air. A random short killed it and my PSU; not sure where that came from.

So guys what's the general consensus between the Koolance 590 and the EK 590? I might have both.

I mean, worst possible situation is having a second block inspires me to get a new PSU and a second 590... for a single screen at 1080.


----------



## iARDAs

Ok guys

I currently have a MSI Twin Frozr III 570

It is factory OCed @ 770/2000

If I go for a 590, how big a performance increase would I have?

20%?
50%?
100%?


----------



## toX0rz

Quote:


> Originally Posted by *iARDAs*
> 
> Ok guys
> I currently have a MSI Twin Frozr III 570
> It is factory OCed @ 770/2000
> If i go for a 590 how big a performance increase would i have?
> 20%?
> 50%?
> 100%?


60-70%


----------



## Masked

Quote:


> Originally Posted by *toX0rz*
> 
> 60-70%


More like 100% but...Okay.


----------



## jonnylaris

Quote:


> Originally Posted by *toX0rz*
> 
> 60-70%


Why the hell would you want a 590? Just get another 570 and SLI and boom, same performance as 590 if not more.


----------



## iARDAs

Quote:


> Originally Posted by *jonnylaris*
> 
> Why the hell would you want a 590? Just get another 570 and SLI and boom, same performance as 590 if not more.


My board seems to be Crossfire-compatible but not SLI.

I don't want to change the mobo, as it's new.

Also, I guess I'll just wait for the 6xx series.


----------



## Masked

Quote:


> Originally Posted by *jonnylaris*
> 
> Why the hell would you want a 590? Just get another 570 and SLI and boom, same performance as 590 if not more.


This is ignorant...Please don't post misinformation here.

The 590 is actually an improvement over two stock 570s.

When they're OC'd, the 570s have a slight edge, but we're talking ~8%...Nothing major.

Not only that, but when the 590 is unlocked and OC'd, that 8% loss turns into a 12% gain over the 570s, so I'd venture to say they're about equal.

Also, don't forget they cost practically the same, and the 590 occupies a single PCIe slot and has a far better resale value than the 570s do.

...


----------



## Image132

Does anyone know if a "EK-KIT H3O - Supreme HF 240" kit would cool a 590 better than stock?


----------



## TerrabyteX

IF you plan to loop only the 590, then yes! It would definitely cool it better than the stock vapor chamber/fan combo: from 75 degrees at full load with 90% fan and a 670MHz OC down to around 45-49 degrees.


----------



## Wogga

The EK-KIT H3O - Supreme HF 240? I don't see any VGA waterblocks in it, so how could it cool a 590?
If you want EK cooling, get the EK-FC590 GTX full-cover block.


----------



## aguante84

Do you have a fan setup installed in your case? If I want to play at MAX SETTINGS, I need help and support, because I want this video card to have a long life (2 years at least). Also, what is the best driver to install so I don't have problems with the temps and fan speed, please?


----------



## lollygag

2 birds with 1 stone. I'll join; I have a pair of the EVGA Classified LEs (03G-P3-1598-AR)...





3D Mark '11

Plus they're for sale... used only 4 months in my VFX workstation, never OC'd or BIOS-modded. I just need three 3GB 580s for my huge surround-display real estate.


----------



## Shinobi Jedi

I can't decide which display to play Batman: AC on with these DX11 problems..

Should I play it on my 60" plasma at 60Hz/60fps now, or wait for the patch and play it at 120Hz (and hopefully 120fps) on my 24" 120Hz BenQ? There are a few reasons why I'm torn, but I'm leaning towards the big screen:

First, I can't seem to get the FPS unlocked in-game, even though I've changed every "bsmoothframerate" entry to False that I could find in the config files in the Documents folder and the steamapps/common folder, so unless I'm missing one (which I may well be) I'm assuming not getting past 60 most of the time is due to the poor coding? Or is anyone else able to unlock their FPS in-game under DX11?
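For anyone hunting for the same setting, the entries in question follow the usual Unreal Engine 3 ini convention; note the file name, path, and section below are general UE3 convention rather than confirmed for this specific game:

```ini
; e.g. BmEngine.ini (UE3 convention; exact file and path may differ per install)
[Engine.GameEngine]
bSmoothFrameRate=FALSE     ; FALSE disables the engine's frame-rate smoothing cap
MaxSmoothedFrameRate=62    ; only takes effect while bSmoothFrameRate=TRUE
```

If the same key appears in more than one ini (game folder vs. Documents folder), the user-level copy typically wins, which is one reason a single missed file can keep the cap in place.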

Secondly, being impatient to play this game and not wanting to play it on less than DX11 settings, I grabbed the (allegedly incomplete) leaked patch, which didn't do anything but add maybe 10fps average on DX11. Unless, again, I missed one of the commands to unlock the FPS in the config files. So I ditched it, not really seeing much of an improvement at all.

So, with or without the leaked patch I get a decent 45-60fps in DX11 with tessellation turned down or off. The GPUs are barely pushed at all, maybe 20-30% max.

That being the case, and it not being a competitive online MP game, I figured I'd play it on my 60" Kuro plasma like I do with my console games. It's beautiful on that screen, which is where I'm leaning. Plus it runs color-correct on that display, too.

Yet my eyes have gotten really spoiled by 120Hz, and I'm tempted to wait on the game until they get it properly patched so I can play it like that. Unless I did miss something in the config files, like I hope, and can somehow get it working right now.

Ironically, the game is 3D Vision ready, and when played in 3D it pushes all four GPUs to the max and arguably does a better job of maintaining a constant 60fps than playing in 2D.

So, I figure I'd get some opinions. What do you all think?

Play it now at around 45-60fps on a 60" plasma, big-screen/home-theater style? Or wait until it's patched properly to play on my 24" 120Hz 3D display? I'm also tempted to play it in 3D, but it's a pretty dark game for 3D without LightBoost (if LightBoost lives up to the hype, which reportedly it does).

Anyone else playing the game? What kind of results are you all getting?


----------



## Wogga

I've got 32-60 fps (48 avg) with fully maxed settings (incl. that x32 CSAA or whatever it's called) and ~40% usage on each GPU. I'm still on the 285.79 drivers.


----------



## Glockshna

I know I'm a bit late to the party, but I FINALLY got one. I'll post proof when I get home.


----------



## emett

I'm playing Batman in DX9. I only have one 590.







The game needs a patch big time!


----------



## Blando

Is the 590 being discontinued? I see that Newegg deactivated the EVGA 590s on their site. I have an EVGA 590 Classified (air) and it's still within the return period. I have some concerns about it, and if it's also being discontinued, I'm inclined to return it. My concerns are primarily operating temperature and VRAM. I've been testing the card with BF3 and I get temps averaging 81 and 85 on the GPUs. I also see the VRAM is maxed out. I'm running it on a new rig with 16GB of RAM, an i7 2600K, SSDs and a single 120Hz 1920x1080 monitor. The VRAM concerns me because when I pop in 3GB 580s, the VRAM usage runs even higher, at about 1650MB or so. That said, I like the 590's relatively compact profile and noise level for this rig. The 580 Classifieds are probably overkill for this setup, and when those fans spin up past 30% it gets pretty loud fast from there.

I'm curious if others have similar concerns. Appreciate any comments or insight.

Thanks.


----------



## Masked

Quote:


> Originally Posted by *Blando*
> 
> Is the 590 being discontinued? I see that Newegg deactivated the EVGA 590s on their site. I have an EVGA 590 classified (air) and it's still within the return period. I have some concerns about it and if it's also being discontinued, I'm inclined to return it. My concerns are primarily operating temperature and VRAM. I've been testing the card with bf3 and I get temps averaging 81 and 85 on the GPUs. I also see the VRAM is maxed out. I'm running it on a new rig with 16gb of ram, an i7 2600k, SSDs and a single 120hz 1900x1080 monitor. The VRAM concerns me because when I pop in 3GB 580s, the VRAM runs even higher, at about 1650 or so. That said, I like the 590's relatively compact profile and noise level for this rig. The 580 classified's are probably overkill for this setup and when those fans spin up past 30% it gets pretty loud fast from there.
> I'm curious if others have similar concerns. Appreciate any comments or insight.
> Thanks.


The 590 was ALWAYS a limited release.

Limited release means there are no more.

And 1.5GB of VRAM has ALWAYS been a limiting factor...


----------



## NecroPS3

Quote:


> Originally Posted by *Blando*
> 
> Is the 590 being discontinued? I see that Newegg deactivated the EVGA 590s on their site. I have an EVGA 590 classified (air) and it's still within the return period. I have some concerns about it and if it's also being discontinued, I'm inclined to return it. My concerns are primarily operating temperature and VRAM. I've been testing the card with bf3 and I get temps averaging 81 and 85 on the GPUs. I also see the VRAM is maxed out. I'm running it on a new rig with 16gb of ram, an i7 2600k, SSDs and a single 120hz 1900x1080 monitor. The VRAM concerns me because when I pop in 3GB 580s, the VRAM runs even higher, at about 1650 or so. That said, I like the 590's relatively compact profile and noise level for this rig. The 580 classified's are probably overkill for this setup and when those fans spin up past 30% it gets pretty loud fast from there.
> I'm curious if others have similar concerns. Appreciate any comments or insight.
> Thanks.


The 590 and 6990 were only limited-run cards, so yeah. But the temps are supposed to get up to ~80; that's a safe temp. I think I read somewhere the max temp was around 110. And the VRAM isn't actually maxed out; it's something else, like preloading stuff. My VRAM at 1920x1080 hasn't gone over 1.2GB in BF3.


----------



## Scorpion49

I think they go on and off like that all the time, depending on stock maybe. They were deactivated everywhere when I got mine off of Amazon.


----------



## jamalm23

Hi all, I have a 590 and was wondering what everyone's avg fps is after the DX11 patch release of AC?
Please indicate what the advanced graphics settings are set to, and the resolution.

Thanks!


----------



## Scorpion49

Quote:


> Originally Posted by *jamalm23*
> 
> Hi all, I have a 590 and was wondering what was everyones avg fps after the dx11 patch release?
> Please indicate what the advanced graphic settings are set to and resolution.
> 
> Thanks!


DX11 patch release of what.


----------



## emett

Arkham City.


----------



## Masked

Quote:


> Originally Posted by *Scorpion49*
> 
> I think they go on and off like that all the time, depending on stock maybe. They were deactivated everywhere when i got mine off of amazon.


Cain and I were discussing stock, and why there's continual stock, about 15 pages ago.

Basically, after a certain amount of time companies release RMA stock...That's basically what it is...RMA stock hitting the shelves.

Keep in mind, most RMA stock is NIB...So there are no issues with it, plus everything else remains the same.


----------



## Saizer

Quote:


> Originally Posted by *Blando*
> 
> . I've been testing the card with bf3 and I get temps averaging 81 and 85 on the GPUs


That depends on how much noise affects you. In my case, as I like playing with a headset on, it isolates me from outside noise, letting me put the fan speed to 90% (very noisy), which leaves both GPUs at 70°C. But as I have the PC quite far away from where I play, and as I use a closed-ear headset, that doesn't bother me a lot.


----------



## kzinti1

When will the 690's be out?


----------



## StrayderGame

Hello everyone. As you can see, this is my first post; I've joined because I'm in serious doubt about which video card to get...

However, I've bought:

i7-2600K
Maximus IV Extreme-Z
Noctua NH-D14
1200W Cooler Master Gold PSU
NZXT Phantom case...

My question: I'm missing the video card, as you can see. Does a single GTX 590 run into a bottleneck in BF3 at full settings (and I mean everything possible maxed out)?

I have read some reviews saying the GTX 590 can hit a VRAM bottleneck, especially on Caspian Border multiplayer.

I have also read that the card uses just 1.5GB of its total 3GB across its 2 GPUs (which seems to me a pointless waste of 1.5GB). I couldn't find out how the memory works on a multi-GPU card. Is it split, or what? Because if people hit a bottleneck with the GTX 590's 3GB (1.5GB effectively), what happens to the other 1.5GB? Is it wasted?

If it used all 3GB it shouldn't hit a VRAM bottleneck, since the GTX 580 3GB doesn't. My assumption was that the GTX 590 shares its memory and both GPUs use it in parallel, which is obviously wrong?

I'm not trolling or anything; I have used both AMD and Nvidia over a long career, but lately I haven't paid much attention to new high-end tech, and 650 euros isn't a small amount of money, so I need some fast answers. And I lean towards Nvidia because of CUDA, PhysX and other features...

- Once more, when I say full settings I really mean full, everything pushed to max, and I will play at 1920x1080 60Hz on a Dell U2311H IPS monitor.

- Also, I plan to use this graphics card for at least 3 years, adding one more for SLI (either another GTX 590 or a GTX 580 3GB).

- I'm not waiting for Kepler because the first series will have driver problems and PCI-e 3.0/2.0 compatibility issues, and the better cards won't be released and working nicely until Q3 2012... and by then I'd have to switch processor, mobo and GPU, which is a lot of money...

So what would you recommend, since you have the most experience with these cards (GTX 590 / GTX 580) and only you can give me a true answer...

Thanks in advance...


----------



## Not A Good Idea

count me in


----------



## Masked

Quote:


> Originally Posted by *StrayderGame*
> 
> Hello everyone. As you can see, this is my first post; I've joined because I'm in serious doubt about which video card to get...
> However, I've bought:
> i7-2600K
> Maximus IV Extreme-Z
> Noctua NH-D14
> 1200W Cooler Master Gold PSU
> NZXT Phantom case...
> My question: I'm missing the video card, as you can see. Does a single GTX 590 run into a bottleneck in BF3 at full settings (and I mean everything possible maxed out)?
> I have read some reviews saying the GTX 590 can hit a VRAM bottleneck, especially on Caspian Border multiplayer.
> I have also read that the card uses just 1.5GB of its total 3GB across its 2 GPUs (which seems to me a pointless waste of 1.5GB). I couldn't find out how the memory works on a multi-GPU card. Is it split, or what? Because if people hit a bottleneck with the GTX 590's 3GB (1.5GB effectively), what happens to the other 1.5GB? Is it wasted?
> If it used all 3GB it shouldn't hit a VRAM bottleneck, since the GTX 580 3GB doesn't. My assumption was that the GTX 590 shares its memory and both GPUs use it in parallel, which is obviously wrong?
> I'm not trolling or anything; I have used both AMD and Nvidia over a long career, but lately I haven't paid much attention to new high-end tech, and 650 euros isn't a small amount of money, so I need some fast answers. And I lean towards Nvidia because of CUDA, PhysX and other features...
> - Once more, when I say full settings I really mean full, everything pushed to max, and I will play at 1920x1080 60Hz on a Dell U2311H IPS monitor.
> - Also, I plan to use this graphics card for at least 3 years, adding one more for SLI (either another GTX 590 or a GTX 580 3GB).
> - I'm not waiting for Kepler because the first series will have driver problems and PCI-e 3.0/2.0 compatibility issues, and the better cards won't be released and working nicely until Q3 2012... and by then I'd have to switch processor, mobo and GPU, which is a lot of money...
> So what would you recommend, since you have the most experience with these cards (GTX 590 / GTX 580) and only you can give me a true answer...
> Thanks in advance...


I swear to god...We need a FAQ that PERMANENTLY answers this.

The TOTAL ram on the card is 3gb.

There are 2 CORES that share the TOTAL ram.

So 2 cores share 3gb...Which means each core gets 1.5gb of Vram total...There is no waste.

I do not have ANY issues in Caspian or any other map in BF3 on 1900+...The ONLY issue is if you tri-monitor and game.

I'm glad you're not waiting for Kepler because the 690 (If there will be one) isn't due until December (The 2012 one)

I would recommend future proofing and getting 2x 580 3gb's however, the 590 is a limited edition and will have a better resale value...
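The VRAM arithmetic above can be put as a throwaway sketch (the function name is mine, not from any tool); each GPU addresses its own 1.5GB pool, so the advertised total overstates what any one core can use:

```python
def effective_vram_gb(total_gb: float, num_gpus: int) -> float:
    """Dual-GPU cards advertise the sum of both memory pools, but each
    GPU works from its own pool of frame data, so the usable figure is
    the per-GPU share, not the advertised total."""
    return total_gb / num_gpus

# GTX 590: 3 GB advertised across 2 cores -> 1.5 GB usable per core
print(effective_vram_gb(3.0, 2))  # 1.5
```

The same arithmetic is why a 3GB GTX 580 behaves differently: one GPU, one full 3GB pool.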


----------



## Scorpion49

Quote:


> Originally Posted by *Not A Good Idea*
> 
> count me in


It made it huh? How is it running in quad? The 580 should be here tomorrow.


----------



## StrayderGame

Quote:


> Originally Posted by *Masked*
> 
> I swear to god...We need a FAQ that PERMANENTLY answers this.
> The TOTAL ram on the card is 3gb.
> There are 2 CORES that share the TOTAL ram.
> So 2 cores share 3gb...Which means each core gets 1.5gb of Vram total...There is no waste.
> I do not have ANY issues in Caspian or any other map in BF3 on 1900+...The ONLY issue is if you tri-monitor and game.
> I'm glad you're not waiting for Kepler because the 690 (If there will be one) isn't due until December (The 2012 one)
> I would recommend future proofing and getting 2x 580 3gb's however, the 590 is a limited edition and will have a better resale value...


Hehe, that FAQ option would be nice, since there are 317 pages so far. xD Well, yes, but I've noticed Newegg hasn't had the GTX 580 Lightning XE 3GB for a long time, and I'm afraid they've stopped selling it because of the new Kepler. I can order that GTX 580, but it's from another country and costs about 580 euros, while a Gigabyte GTX 590 is 640 euros in my country (I mention countries because of the warranty).

So I guess the GTX 590 is the safer card to get, given the similar price and the warranty situation. Am I right? And here I can buy the Gigabyte, Gainward and MSI GTX 590s; which do you recommend?

And a last question: no stuttering at all there?

Thanks for the fast answer.


----------



## Not A Good Idea

Quote:


> Originally Posted by *Scorpion49*
> 
> Quote:
> 
> > Originally Posted by *Not A Good Idea*
> > 
> > count me in
> 
> It made it huh? How is it running in quad? The 580 should be here tomorrow.

Yeah, I wasn't expecting it until Saturday. Both cards and the water block got here on the same day; it was like Christmas on Thursday. It runs great. They don't go over 35°C at all. So far I've run a couple of benchmarks and I'm happy with them at stock. Maybe when I get bored I'll up the MHz a bit.


----------



## Scorpion49

Quote:


> Originally Posted by *Not A Good Idea*
> 
> Yeah, I wasn't expecting it until Saturday. Both cards and the water block got here on the same day; it was like Christmas on Thursday. It runs great. They don't go over 35°C at all. So far I've run a couple of benchmarks and I'm happy with them at stock. Maybe when I get bored I'll up the MHz a bit.


Nice! I'm going to abuse that poor 580; it's going to need therapy afterwards.


----------



## Not A Good Idea

Quote:


> Originally Posted by *Scorpion49*
> 
> Nice! I'm going to abuse that poor 580, its going to need therapy afterwards.


Lmao, and at the same time:









The SLI cable was too long and didn't work at first; I was getting some crazy-ass screen movement. I replaced it with a short one and it's all gravy.


----------



## rush2049

Quote:


> Originally Posted by *Masked*
> 
> I swear to god...We need a FAQ that PERMANENTLY answers this.
> 
> The TOTAL ram on the card is 3gb.
> 
> There are 2 CORES that share the TOTAL ram.
> 
> So 2 cores share 3gb...Which means each core gets 1.5gb of Vram total...There is no waste.
> 
> I do not have ANY issues in Caspian or any other map in BF3 on 1900+...The ONLY issue is if you tri-monitor and game.
> 
> I'm glad you're not waiting for Kepler because the 690 (If there will be one) isn't due until December (The 2012 one)
> 
> I would recommend future proofing and getting 2x 580 3gb's however, the 590 is a limited edition and will have a better resale value...


I saw his post in my email because I am subscribed.... and I was thinking in my head how snarky I could be and not get yelled at.....

but you hit the issue on the head.....


----------



## Masked

Quote:


> Originally Posted by *rush2049*
> 
> I saw his post in my email because I am subscribed.... and I was thinking in my head how snarky I could be and not get yelled at.....
> but you hit the issue on the head.....


Lol.

I love how people ask the same question each page









Think I'm going to get "snarkier"...


----------



## StrayderGame

Quote:


> Originally Posted by *Masked*
> 
> Lol.
> I love how people ask the same question each page
> 
> 
> 
> 
> 
> 
> 
> 
> Think I'm going to get "snarkier"...


hahaha c'mon m8, there are 318 pages and in the first 10 there is no similar question xD


----------



## Masked

Quote:


> Originally Posted by *StrayderGame*
> 
> hahaha c'mon m8, there are 318 pages and in the first 10 there is no similar question xD


Quote:


> Originally Posted by *MKHunt*
> 
> In SLI and CrossFire VRAM is mirrored, not shared. Each GPU stores the same information as all the other GPUs so you have 1.5GB VRAM effective total.


Quote:


> Originally Posted by *Shinobi Jedi*
> 
> I would think you're going to want to go with a 580 card that has 3gb of VRam to play at that resolution. (Not the 590 3gb, which is really 1.5gb mirrored on the 590)
> Maybe even two, if you have the coin. 590 is great for 1920x1080 120hz/3D displays if you're a heavy FPS player who wants 120fps/120hz smooth butter like experience.
> But for higher resolutions, you want at card that has 3gb VRAM.


Quote:


> Originally Posted by *Masked*
> 
> 1) No.
> There is no master/slave...It's purely a gap of communication.
> I've made my own custom drivers forever and there is no primary GPU...Sometimes GPU 2 is the first to respond and sometimes GPU 1 is...That's why there are Vsync issues and lag//tessellation...
> It's merely the time it takes to communicate from 1 to the other, that's it, that simple.
> In a dual core card, that lane is pipe-lined thus it's faster but, power etc is slaved so you come out with a compromise.
> The compromise on the 590 is that you lose Vram.
> 2) Of course there's no mirroring...It's actually more of a sharing process...Each group is assigned a module and is streamlined to that module.
> ~
> It's not a question that the cores work better on dual core cards...The issue is merely one of your limitations.
> You CANNOT smoothly run 1.5gb at that resolution, REGARDLESS of the card...It's just not happening.
> I'd suggest 2x or 3x 580 3gb's ~
> The buffer is literally 1750 ~ The 590's JUST fall short...Not really anything you can do about it other than work around it.


Oh, it wasn't discussed in the last 10 pages?










That was 2 pages back for me, but I view 50 posts at a clip
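For anyone skimming past the quotes: MKHunt's mirroring point boils down to one line of arithmetic. The advertised total is per-GPU VRAM times GPU count, but the usable framebuffer is just the per-GPU amount, because each GPU holds its own copy of the same data. A tiny Python sketch (function names are mine, purely illustrative):

```python
# SLI/dual-GPU VRAM: each GPU mirrors the same working set, so the usable
# framebuffer is the per-GPU amount, not the total printed on the box.

def advertised_vram_gb(per_gpu_gb: float, num_gpus: int) -> float:
    """What the marketing adds up: every GPU's memory summed."""
    return per_gpu_gb * num_gpus

def effective_vram_gb(per_gpu_gb: float) -> float:
    """What a game can actually address: one mirrored copy."""
    return per_gpu_gb

# GTX 590: two GPUs with 1.5 GB each.
print(advertised_vram_gb(1.5, 2))  # 3.0 "on the box"
print(effective_vram_gb(1.5))      # 1.5 usable
```

So a "3GB" 590 behaves like a 1.5GB card as far as texture budgets go, which is exactly why the high-resolution crowd keeps getting pointed at 580 3gb's instead.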


----------



## Envyyyyyyy

I have an EVGA GTX 590... I actually had 2 in quad SLI but sold the 2nd one due to overheating, stuttering and more problems than it was worth to me.
I initially bought the 590 to run my three 27" 2ms Asus LED monitors so I could game at 6000x1080 at high frame rates... ALTHOUGH after this thread (the last 20-30 pages or so) I've learned a great deal. You guys are very knowledgeable and friendly, but I'm disappointed in one thing: I WISH I would have got two 2gb 580s and SLI'd them. That would have helped my situation. I just want more frames at high/ultra settings but can't get them due to the low VRAM per core (not low, just not high enough for my wants).
My setup is.
NZXT Phantom, all fans swapped for Aerocool Blue Shark fans
i7-2600k (corsair H80 cooled push pull)
Asus MB
16gb 1866 corsair Vengeance RAM
1200w AX PSU
120gb Intel SSD
2x 2TB HDD
a few other goodies..
I'm new to the scene, but I've learned a lot. This is my 1st post and I wanted it to be in here. I hope to become an active member and help and learn as much as I can.

P.S. I hope 600 series cards have more VRAM!!! lol
Also proof for my EVGA 590s is at
Youtube.com/Chaseme187


----------



## Glockshna

Okay, I have it and it's installed (still need to post my proof; I have the pictures but haven't put them on my computer yet), but I'm a bit worried about the temperatures I'm seeing out of it.
It idles at around 55°C. At load it's around 90ish with the fan on auto, around 80-87 if I force it to 100%, and this is with my case open. This is with the stock cooler and at stock settings. Also, for some reason my Futuremark scores are around 5k (I'm at work and don't have the exact score on hand), which seems low. During the Futuremark run (if I recall correctly) only GPU 1 lit up, not GPU 2. Is that normal? This is my first SLI or otherwise multi-GPU setup, so I don't know what to expect.

As for performance in games, it's excellent. There doesn't appear to be any throttling of any sort, and running Eve Online with every setting (including AA) maxed at 5760x1080 is smooth as silk. I can even run GTA IV (as horrible as that game is) in tri-monitor at that resolution and hold about 30 FPS with every setting maxed, save for view distance, which was set at 50/100.
The rendering engine for Skyrim freaks out when I play at 5760x1080 (it has something to do with the FOV; everything just screws up, but I don't think it's an issue with the card, more that the engine doesn't support it). At 1920x1080 it caps out at an unwavering 60 FPS in any area.

TL;DR
Card idles at 55°C ish, load 87-90°C. Normal?
If not, could it be a problem with the card/cooler itself, or should I look into aftermarket coolers? (Water cooling perhaps?)
Only one GPU lit up for Futuremark, which scored in the low 5XXXs.
Both GPUs light up when I play games, but GPU 2 always lights up first and GPU 1 only lights up when GPU 2 hits 100%.
FPS performance is silky smooth even at triple-monitor resolutions with every setting maxed.

Basically I just need to know if I should RMA this beast. The performance is a dream and I don't regret buying it in the slightest, but those temperatures worry me, especially with what I read towards the beginning of the thread about these cards frequently burning out and EVGA not taking them back. I want to return it for a replacement before it's too late.

Advice? (Sorry for the wall of text)


----------



## iARDAs

Would it be smart to invest in a 590 now or wait for a 680?


----------



## Scorpion49

Quote:


> Originally Posted by *Glockshna*
> 
> TL;DR
> Card idles at 55°C ish, load 87-90°C. Normal?
> If not, could it be a problem with the card/cooler itself, or should I look into aftermarket coolers? (Water cooling perhaps?)
> Only one GPU lit up for Futuremark which scored in the low 5XXXs
> Both GPUs light up when I play games but GPU2 always lights up first and GPU 1 only lights up when GPU2 hits 100%
> FPS performance is silky smooth even at triple monitor resolutions with every setting maxed.
> 
> Basically I just need to know if I should RMA this beast. The performance is a dream and I don't regret buying it in the slightest, but those temperatures worry me, especially with what I read towards the beginning of the thread about these cards frequently burning out and EVGA not taking them back. I want to return it for a replacement before it's too late.
> 
> Advice? (Sorry for the wall of text)


How is the airflow in your case? What is the rest of your setup? Those temps are a little on the high side but within limits for an air-cooled card. Do you run a custom fan profile with Afterburner or EVGA Precision? Nine times out of ten, people who ask if their card is too hot and think it's broken never bothered to turn the fan up.
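For what it's worth, a custom fan profile in Afterburner or Precision is just a temperature-to-duty-cycle curve that the software interpolates between the points you drag around. A rough Python sketch of the idea (the curve points here are invented for illustration, not a recommendation for the 590):

```python
def fan_speed(temp_c, curve=((40, 40), (60, 60), (80, 85), (90, 100))):
    """Linearly interpolate fan duty (%) from a (temp_C, duty_%) curve."""
    if temp_c <= curve[0][0]:
        return curve[0][1]  # below the curve: minimum duty
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]     # above the curve: pin at 100%

print(fan_speed(70))  # halfway between (60C, 60%) and (80C, 85%) -> 72.5
```

The stock auto profile is deliberately noise-biased, which is why forcing a more aggressive curve usually shaves several degrees before anyone needs to think about RMAs or water.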


----------



## Glockshna

Quote:


> Originally Posted by *Scorpion49*
> 
> How is the airflow in your case? What is the rest of your setup? Those temps are a little on the high side but within limits for an air-cooled card. Do you run a custom fan profile with Afterburner or EVGA Precision? Nine times out of ten, people who ask if their card is too hot and think it's broken never bothered to turn the fan up.


Quote:


> at load it's around 90ish with the fan on auto. Around 80-87 if i force it to 100%


I'm using a large case (Fractal Design Define XL Black). The cable management is excellent and there's a lot of airflow in the case. I also had both panels of the case open, but I will admit the fans in the case are made for reduced noise, not maximum airflow. I've never had a problem with heat before this; however, I was using an HD 6850 previously, so that is to be expected. The case fans (one 180mm on top (exhaust), plus two 140mm fans, one in front (intake) and one in back (exhaust)) I have not touched in terms of speed.

As for the rest of my setup, I went into the rig builder and made my system, but I guess I didn't make it show in my sig. I'll try to get that to show, but for the time being my system is:

EVGA X58 SLI -3 Motherboard
Intel Core i7 950 @ 3.06 GHz (stock), stock cooler as well
24 GB G.Skill DDR3 1600 MHz RAM
EVGA nVidia GTX 590 Classified 3GB


----------



## MKHunt

Quick question and a little bit of a pickle. So my EK 590 block came today, and for some reason I was under the impression that the EVGA backplate would fit w/o doing anything. But it doesn't: the holes are a bit too small for the screws EK uses. So if I were to go backplateless, how much do you guys think that would change things?

On the other side of the options list is opening the EVGA backplate holes to 3mm diameter. The downside to this is that it's more time and I'd have to pay $$ for damages if/when I have to RMA the card. Upside is VRAM cooling (though the vram is just a few watts) and aesthetics.

Been without my computer since Oct 14 and I'm itching to get it back together. The EK backplate is another $35-40+shipping which is quite difficult to justify when I already have a backplate.

What would you do? Do you think EVGA's charge would be more than the EK backplate + shipping?

As a side note, this block is a looker.


----------



## Masked

Quote:


> Originally Posted by *MKHunt*
> 
> Quick question and a little bit of a pickle. So my EK 590 block came today and for some reason I was under the impression that the EVGA backplate would fit w/o doing anything. But it doesn't. The holes are a bit too small for the screws EK uses. So if I were to go backplateless how much do you guys think that would change things?
> On the other side of the options list is opening the EVGA backplate holes to 3mm diameter. The downside to this is that it's more time and I'd have to pay $$ for damages if/when I have to RMA the card. Upside is VRAM cooling (though the vram is just a few watts) and aesthetics.
> Been without my computer since Oct 14 and I'm itching to get it back together. The EK backplate is another $35-40+shipping which is quite difficult to justify when I already have a backplate.
> What would you do? Do you think EVGA's charge would be more than the EK backplate + shipping?
> As a side note, this block is a looker.


Personally, after Eddy's responses, I would not have gone EK but, to each their own.

Realistically, you're talking maybe half a °C... so, 2-3°F ~ That's literally all a back-plate dissipates, so it's not even worth worrying about.

$40 for an aesthetic? Hell no.

I wouldn't worry about it...Just go forward without a back-plate.


----------



## MKHunt

Quote:


> Originally Posted by *Masked*
> 
> Personally, after Eddy's responses, I would not have gone EK but, to each their own.
> 
> Realistically, you're talking maybe half a °C... so, 2-3°F ~ That's literally all a back-plate dissipates, so it's not even worth worrying about.
> 
> $40 for an aesthetic? Hell no.
> 
> I wouldn't worry about it...Just go forward without a back-plate.


EK wasn't my first choice, but they were the only company that guaranteed fitment with the new inductors. I tried to get an HC from EVGA, but they would only sell an entirely new card, so that route dead-ended as well. KL said their Rev. 1.1 hasn't had any reports of not fitting a 590, but also had not heard about the new circuitry. I also have yet to see anyone else with a card with the new circuitry, so I wasn't sure if it was worth the risk. After everything was ordered and shipped, KL approved my RMA on the original block and said the replacement would be the Rev. 1.1. However, their fins, while extremely effective, are just a ticking time bomb when it comes to losing nickel with low-viscosity fluids at high flow rates. It's not an issue of nickel quality with the fins, just the design. They work fantastically with premixes due to the higher viscosity and more gel-like fluids, but I use distilled + PT Nuke because it offers a few "free" degrees and doesn't cost $40 plus shipping every time I drain my half-gallon loop.

In the short time it took to get a reply, I'll admit I caved. Hellfire-PC had it for $41 with free 2-day shipping on eBay. After posting I put the naked card in my PC and just looked at it for a while. The brown board did all kinds of clashing with my GB mobo (actual black PCB) and it bugged me to no end. Plus the heads of the EK screws ended up not fitting the EVGA backplate, and the EK plate cools the solid-state capacitors on the back, which got quite toasty when I would fold (I went through a feeling-heat-dissipation-with-my-fingers phase).

I guess in the end I'm a complete whore for aesthetics. Besides, my friends bought me almost eighty bucks in drinks last night for my birthday so I'm rationalizing it by telling myself we went 50/50.

Good to know that it will do almost nothing but stress my PCIe socket more though. Thanks for the info.


----------



## Not A Good Idea

The XSPC Razor fits the cards with the taller inductors.


----------



## MKHunt

Quote:


> Originally Posted by *Not A Good Idea*
> 
> The XSPC Razor fits the cards with the taller inductors.


Brushed alu and Cu didn't fit build aesthetics at all. When in close proximity with black, gunmetal, and green anything copper looked awful.

So you have a newer card too? Awesome. Any coil whine? My older 590 had a tiny dentist trapped inside.


----------



## mojobear

hey not a good idea,

Your GTX 590 fit the Razor block easily? Man, I totally don't get why my second GTX 590 with the taller inductors would not fit properly. I bought my first GTX 590 a while back in June, which had the lower inductors, and that fit perfectly with the XSPC block. The second one, no matter what, would cause high temps on one of the GPUs. It had the new inductors like the picture from MKHunt... I tried a new Razor block, but had the same issue! I tried switching the positions of the blocks in the loop, but the same GPU would skyrocket in temps (70-80s). Finally I replaced the GTX 590, with the RMA'd one having the same taller inductors, and again... same GPU, same high temps. I have seriously repositioned the block on the PCB over 10 times, with the same result, so it can't be the mounting. Masked mentioned it could be air bubbles, but I switched the positions of the blocks and it was the same GPU with the problem... it went from GPU3 to GPU1.

If your Razor blocks work fine, then there must be something I am doing wrong; for the life of me I cannot think of a good reason! Are you using the backplate provided by EVGA and the washers for the screws? Any advice, or did you come across similar issues?

Thanks!


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> I swear to god...We need a FAQ that PERMANENTLY answers this.
> The TOTAL ram on the card is 3gb.
> There are 2 CORES that share the TOTAL ram.
> So 2 cores share 3gb...Which means each core gets 1.5gb of Vram total...There is no waste.
> I do not have ANY issues in Caspian or any other map in BF3 on 1900+...The ONLY issue is if you tri-monitor and game.
> I'm glad you're not waiting for Kepler because the 690 (If there will be one) isn't due until December (The 2012 one)
> I would recommend future proofing and getting 2x 580 3gb's however, the 590 is a limited edition and will have a better resale value...


Hey Masked,

From what you're seeing, do you think games are going to get to a state where we'll still need 2x 580 3gb even for 1920x1080 before Kepler comes out? Or will the 590 @ 1.5gb hold up at 1920x1080 for a whole other year until 12/12, if that's when Kepler debuts? If so, that's my plan: to stick with 1080p @ 120Hz and the 590s until Kepler. If not, then I'll either sell them for 2x 580 3gb or maybe even keep them for a second rig. I'll admit I'm a little more attached to these cards than I normally am. I'm pretty sure it's due to the LEDs on the cards and the kick-ass aesthetic difference they give my rig, which shouldn't really be a deciding factor in whether to keep or sell them, but I admit that for me it does a bit. I'm trusting more companies will adopt similar aesthetics for their ultra high-end cards.

On a tangential note for all of the Jedis hanging out here, I'm having a real blast playing Batman: AC in 3D. It's some of the best 3D I've ever seen, and it's done so well that it doesn't seem to induce any headaches or fatigue over a couple of hours like many other games do.

Playing in 3D also seems to really make use of all my GPUs and really utilizes the 3rd and 4th ones. Maintaining 60fps with everything turned up is pretty much the reason I went quad SLI, other than max FPS for Battlefield. And playing BF3 in 3D.

On an unrelated note that only this crowd can appreciate, I'm totally stressing over the arrival of my SW:TOR Collector's Edition, which I was able to snag weeks after they were reportedly sold out because Walmart.com still had pre-orders available at retail price.

What's bumming me out is that the actual collector's package, and the code to play the game after the early release period, is scheduled not to arrive until 1/3/12.









Worse, after not being able to find a record of my order on their website and dealing with their customer service to verify the order, I was informed that I may receive it before then, since the SKU technically releases 12/20. Then I got a strange email telling me that if my order showed up "cancelled" I should disregard it as a system error, but if I got an actual "cancellation email" then my order was cancelled (thankfully, I didn't), and to expect my item to ship in a few days...

So what really sucks about the whole thing is that my parents want my wife and me to go from Los Angeles to Scottsdale, AZ for the holidays (about a 6-hour drive), and I'm totally hesitant to go because my apartment is a 4-plex in a really old building, so my front door opens directly onto the street. It's not an apartment in a bigger building that has a front lobby to pass through.

And being so, any packages I get are left right on my doorstep on the street for anyone to jack, should they be inclined to. (Fortunately, I live right next door to a busy 99c store, so there are always people around, and I haven't had any packages stolen in the 6 years I've lived here.)

But between the robbery I ran into on my street last week at 4am while walking my dog, and there being no one in my building to grab it for me because everyone will be gone for at least a week, I am sketched out about traveling to Scottsdale to visit my parents and having that package arrive and sit out front for a week only to get stolen. And we all know how sold out the collector's edition is, so it's not like I can just go to a brick-and-mortar store and get another should my delivery get jacked. So, because Walmart.com is unable to provide the normal pertinent information most online stores do so I can schedule my life, it's been decided that my parents are most likely coming out to LA instead, like they normally do every year.

All for this collector's edition of SW:TOR. Gawd, I feel like such a selfish arse... I figure if anyone can understand, it'd be this crowd. (Fortunately my Jedi parents do too. The wife, though, she wants to go to AZ and doesn't get it, which makes sense when you factor in that she was born in '83 and has never seen the original holy trilogy. To be fair, she is a big Farscape and Dr. Who geek, to the shock of all the other girls in the fashion biz she works with.)

Thanks for letting me vent my guilt!


----------



## Scorpion49

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> And being so, any packages I get are left right on my doorstep on the street for anyone to jack, should they be inclined enough to. (fortunately, I live right next door to a busy 99c store so there's always people around and haven't had any packages stolen in the 6 years I've lived here)
> 
> But between the robbery I ran into on my street last week at 4am while walking my dog, and with there being no one in my building to grab it for me because everyone will be gone for at least a week, I am sketching about traveling to Scottsdale to visit my parents having that package arrive and sit out front for a week only to get stolen. And we all know how sold out the collector's edition is, so its not like I can just go to a brick and mortar store and get another should my delivery get jacked. So now, because of Walmart.com being unable to provide any normal pertinent information like most online stores so I can schedule my life, has decided that instead that my parents are most likely coming out to LA, like they normally do every year.
> 
> Thanks for letting me vent my guilt!


There are few instances when a shipping company will actually just leave the stuff at the door. Usually they will try, and if you're not there they take it back and try again the next day, up to 3 times (UPS, FedEx); USPS will just keep it and leave you a note to come pick it up. I have the same issues in my complex, and even items that aren't signature-required don't get left at the door. It's up to the driver's discretion; if there seems to be no "safe" place to stick it, they will take it back.


----------



## Not A Good Idea

Quote:


> Originally Posted by *MKHunt*
> 
> Brushed alu and Cu didn't fit build aesthetics at all. When in close proximity with black, gunmetal, and green anything copper looked awful.
> So you have a newer card too? Awesome. Any coil whine? My older 590 had a tiny dentist trapped inside.


lol, sucks about the coordination. I have one newer card and an older card. I have about an equal amount of coil whine out of both cards, only while I was benchmarking Unigine. Kind of scared me until I realized what it was.
Quote:


> Originally Posted by *mojobear*
> 
> hey not a good idea,
> Your GTX 590 fit the razor block easily? Man I totally dont get why my second GTX 590 with the higher inductor would not fit properly. I bought my first GTX 590 a while back in June, which had the lower inductor, and that fit perfectly with the XSPC block. The second one, no matter what...would cause high temps on one of the GPUs. It had the new inductors like the picture from MKHunt....I tried a new razor block, but had the same issue! I tried switching positions of the blocks in the loop, but the same GPU would sky rocket in temps (70-80s). Finally I replaced the GTX 590, with the RMAed one having the same higher inductor, and again...same GPU, same high temps. I have seriously repositioned the block on the PCB like over 10 times, with the same result...so it can't be the mounting, and Masked mentioned it could be air bubbles, but I switched the position of the blocks and it was the same GPU with the problem....it went from GPU3 to GPU1.
> If your razor blocks work fine then there must be something wrong I am doing. I for the life of me cannot think of a good reason! Are you using the backplate provided by EVGA and the washers for the screws? Any advice or if you came across similar issues?
> Thanks!


Yes, it fit with the stock backplate. I used the screws that came with the XSPC block, but not the little red washers. When I tried to just fit them together without actually screwing it in, it looked like I was going to have a 1mm gap. It didn't, and there is no card warping. At full load the max temp across both cards was 37°C.


----------



## MKHunt

Quote:


> Originally Posted by *Not A Good Idea*
> 
> lol, sucks about the coordination. I have one newer card and an older card. I have about an equal amount of coil whine out of both cards, only while I was benchmarking Unigine. Kind of scared me until I realized what it was.
> Yes, it fit with the stock backplate. I used the screws that came with the XSPC block, but not the little red washers. When I tried to just fit them together without actually screwing it in, it looked like I was going to have a 1mm gap. It didn't, and there is no card warping. At full load the max temp across both cards was 37°C.


Hmm, oh well. So long as it's not louder. My old card would only whine in WEI and folding... so, when I was trying to sleep, lawl. 6x 1800 RPM fans will drown it out though.


----------



## Glockshna

Here's my proof







Finally got my phone connected to the computer.

Anyway, any ideas on how I should proceed with the temperature situation?

I am 100% down for water cooling but I have no idea how or where to even start on that.

GPU-z for the skeptics.

http://www.techpowerup.com/gpuz/wmbgq/


----------



## Scorpion49

Quote:


> Originally Posted by *Glockshna*
> 
> Here's my proof
> 
> 
> 
> 
> 
> 
> 
> Finally got my phone connected to the computer.
> 
> Anyway, any ideas on how i should proceed with the temperature situation?
> 
> I am 100% down for water cooling but I have no idea how or where to even start on that.
> 
> GPU-z for the skeptics.


Have you tried a custom fan profile for your card? My temps on air were fine; I went water-cooled for fun. For water, be advised that most 590 GPU blocks have instructions for reference-style (Asus/Nvidia) 590s, and the EVGA is slightly different as far as screw holes and the backplate.


----------



## jdangond

Quote:


> Originally Posted by *Glockshna*
> 
> Here's my proof
> 
> 
> 
> 
> 
> 
> 
> Finally got my phone connected to the computer.
> Anyway, any ideas on how i should proceed with the temperature situation?
> I am 100% down for water cooling but I have no idea how or where to even start on that.
> GPU-z for the skeptics.
> http://www.techpowerup.com/gpuz/wmbgq/


you can always go hydro copper







or just get a Koolance water block



GPUs idle at 28°C and 27°C in a 20°C ambient; I've been playing Skyrim and haven't gone past 48°C.


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> Hey Masked,
> From what you're seeing, do you think games are going to get to a state where we'll still need 2x 580 3gb even for 1920x1080 'before' Kepler comes out? Or, will the 590 @ 1.5gb hold up at 1920x1080p for a whole another year until 12/12 if that's when Kepler debuts. If so, that's my plan. To stick with 1080p @ 120hz and the 590's until Kepler. If not, then I'll either sell them for 2x580 3gb or maybe even keep them for a second rig. I'll admit, I'm a little more attached to these cards than I normally am. I'm pretty sure its due to the LED's on the cards and the kick ass aesthetic difference it gives my rig, which shouldn't really be a deciding factor in whether to keep or sell them, but I admit that for me it does a bit. I'm trusting more companies will adopt some similar aesthetics for their ultra high end cards.


You KNOW I'm not allowed to actually answer that









Personally, yes, it's the trend...This is why I consistently refer people to the 580 3gb.

It's hard to explain because we break a barrier, games become more demanding, apps become more demanding but, over time...things rationalize and then we break another one and the process repeats.

Right now, games are at a standstill because the tech really just isn't there to do what they want to do... So, we're limited... DX11 is a LIMITATION more than anything else.

I feel that in the next year or two, you'll see this Vram barrier just fall over...Things in this industry also grow exponentially...

I don't think Kepler is going to be the bringer of change, I think it will be Maxwell.

For the next year you're safe, but it's a very smart idea, if you game at max resolution, to start future-proofing.


----------



## kzinti1

I've been thinking about putting a waterblock on my EVGA GTX590 Classified but have no idea if it's a new or old version. How can I tell?

I've been taking a hard look at the AlphaCool NexXxoS: http://www.alphacool.com/product_info.php/info/p1002_Alphacool-NexXxoS-NVXP-NV-590-Serie.html?language=en&XTCsid=k2b7mj92sdhhh885amofj3qdvn01lktm, since I really didn't like the way EK Waterblocks blamed everybody else, especially SidewindersPC and Petra's, for what turned out to be EK's own poor electroplating. TBH, it pissed me off to no end.

I wanted to buy the HydroCopper version but that's just about the ugliest block I've ever seen. Nowhere near as nice looking as the GTX580 FTW HydroCopper I bought last Christmas.


----------



## Saizer

Will that waterblock be compatible with all 590s?


----------



## mojobear

thanks for your reply,

Does anyone know why this temp difference could be happening? It's always the same GPU on the GTX 590 with the new inductors. I am open to any suggestions! The blocks are in parallel with two Koolance VID connectors; it looks like this.


----------



## mojobear

The black arrows highlight the water flow and the red circle is the GPU which gets hot... I'm assuming that is GPU 1 or 3, depending on where it is in the loop.


----------



## Image132

So I don't want to start a new thread for this but I have to ask for your opinion.

I have 1 Asus 590 and it hits the usual temps (70-90°C) in my case. Now, I've wanted to get into water cooling, so I thought: what better way than to cool my 590? I got all geared up, only to find out that water cooling my 590 would cost the same as buying another 590.

So now I have the dilemma of choosing between water cooling or getting another 590.

What do you guys think? I've heard people have problems with quad SLI on the 590s?

any advice would be awesome!


----------



## kzinti1

Quote:


> Originally Posted by *Saizer*
> 
> Will that waterblock be compatible with all 590s?


Supposedly, yes. I can't say the same for all waterblocks, because EVGA, supposedly, made 2 versions of the GTX 590.

However, if this block was produced before EVGA changed the design, parts or whatever, it may not be compatible with the newest version of my card, assuming I actually have the newer version and not the original, which would be compatible.

Almost every single videocard I have ever bought turned out to be different from the majority of the same cards, like my Galaxy GTX 470s. They turned out to be non-reference (my fault, because I didn't know what non-reference meant back then). I had no idea they wouldn't be able to be watercooled, and they run way too hot to be run in SLI.

Now I find my GTX 590 may also, possibly, be non-reference. When I found out that there really are 2 versions of this card, I asked what the difference is and how I can tell, and never received a response.

I've searched the Forum for this block and only found references to AquaCool Aqueros. Not the same thing as the one I want.

I'm also posting this in the watercooling threads to see if anyone there knows the answers.


----------



## Image132

Quote:


> Originally Posted by *mojobear*
> 
> the black arrows highlight the water flow and the red circle is the gpu which gets hot...Im assuming that is gpu 1 and 3 depending on where it is in the loop.


I'm new to watercooling, but I'm pretty sure there's a reason why just about everyone puts one connector between the 2 graphics cards.

Water will take the path of least resistance, which means there's a chance it won't get to one of your cores. Put one connector between them and suddenly the water is forced through all the cores.

So it should be like this:



Thats just my understanding. I'll let one of the WC vets step in here.


----------



## Juggalo23451

There is only one version of the GTX590, not 2. There has been no revision either


----------



## kzinti1

Quote:


> Originally Posted by *Juggalo23451*
> 
> There is only one version of the GTX590, not 2. There has been no revision either


Thanks Juggalo! Here are the numbers of the posts in this thread that got me all confused: Posts #3184-#3187 and post #3190.

If EVGA used different parts of different sizes, then isn't that the same as a different revision? These people said they had difficulties getting their waterblocks to fit properly, and that is the basis of my concern about my own videocard and the AlphaCool NexXxoS that I want to use. http://www.performance-pcs.com/catalog/index.php?main_page=product_info&cPath=59_971_240_580&products_id=32269

As I said earlier, I'd rather avoid EK Waterblocks because of their accusations against SidewindersPC and especially Petra's: they blamed the PT Nuke biocide for the erosion of the electroplating on EK's blocks when, in fact, EK's own poor electroplating was the actual problem. I just can't put up with a manufacturer that blames innocent companies for its own faults.

Thanks again.


----------



## Masked

Quote:


> Originally Posted by *kzinti1*
> 
> Thanks Juggalo! Here are the numbers of the posts in this thread that got me all confused: Posts #3184-#3187 and post #3190.
> If EVGA used different parts of different sizes then isn't that the same as a different revision? These people said they had difficulties in getting their waterblocks to fit properly and that is the basis of my concern about my own videocard and the AlphaCool NexXxoS that I want to use. http://www.performance-pcs.com/catalog/index.php?main_page=product_info&cPath=59_971_240_580&products_id=32269
> As I said earlier, I'd rather avoid EKWaterblocks because of their accusations against SidewindersPC and especially Petra's, for accusing their PT Nuke biocide for causing the erosion of the electroplating on EK's blocks, when, in fact, it was the poor electroplating of EK themselves that was the actual problem. I just can't put up with a manufacturer that blames innocent companies for EKWaterblocks own faults.
> Thanks again.


Okay, since I happen to know the details about this, I'll answer.

There was not, is not, and has never been an "official" revision of this card, and the PCB has never been revised.

What does that mean? According to anyone you ask, there has never been a revision.

What Nvidia does, frequently actually, is occasionally fix an "issue" on the line, and that's that.

In the case of the 590, it looks as if, halfway through the second run, they "upgraded" some of the core components.

Now, if you know anything about EE, it's not really an upgrade; it's more or less changing a pathway.

In no way is this a revision, and in no way does it change the performance of the card.

What it DID do is change the height of some of the components by 0.5mm. That throws off any waterblock because, like I said, there's never been an official revision.

Now, there are a couple of brands that noticed a "change" had been made and revised their blocks accordingly.

If you plan on using water, it's best to check with the manufacturer and make sure your block will fit.


----------



## Scorpion49

ARG. Stupid new forums strike again: I click the subscription link and it dumps me 40 pages back, so I look like a fool replying to something that happened 6 months ago.









Quote:


> Originally Posted by *Image132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mojobear*
> 
> the black arrows highlight the water flow and the red circle is the gpu which gets hot...Im assuming that is gpu 1 and 3 depending on where it is in the loop.
> 
> 
> 
> I'm new to watercooling but I'm pretty sure there was a reason why just about everyone puts one connector between 2 graphics cards.
> 
> Water will take the path of least resistance which means there is the chance that it won't get to one of your cores. You put one connector between them and suddenly the water is forced to cool all cores.
> 
> So it should be like this:
> 
> 
> 
> That's just my understanding. I'll let one of the WC vets step in here.

You can cool them in serial or parallel. Parallel causes less restriction; however, with those particular blocks I think serial would be better, because the water ports inside the threads are so close to the fitting when it's screwed in that it may be causing an issue. The XSPC block instructions explicitly state to use short-threaded barbs/fittings for exactly this reason: the thread can partially cover the ports into the block. That's where I would look for the problem first.


----------



## mojobear

Argh. Thanks for your responses, guys. It's getting pretty frustrating...

So I thought about the serial versus parallel idea, but I had planned the parallel blocks so that whichever way the water goes, there is the same restriction.

Scorpion49 - I have checked the threads, and looking at the XSPC block I can see that there is tons of clearance between the parallel connectors and the water holes on the block.

I just don't see how you guys with the new inductors could place the XSPC block so perfectly with excellent temps, but my block, with 10+ remounts and position changes, gives me the same issue!


----------



## mojobear

Also... I'm assuming that GPU1 and GPU3 are the ones closest to the display ports, right?


----------



## mojobear

I have also tried the blocks in series, with the same high temps on that one GPU, although for the series test I didn't remount but just took out one of the Koolance VID connectors.


----------



## Scorpion49

Do you have one of the cards that doesn't have a heatspreader on the GPU?


----------



## MKHunt

Quote:


> Originally Posted by *kzinti1*
> 
> Thanks Juggalo! Here are the numbers of the posts in this thread that got me all confused: Posts #3184-#3187 and post #3190.
> If EVGA used different parts of different sizes then isn't that the same as a different revision? These people said they had difficulties in getting their waterblocks to fit properly and that is the basis of my concern about my own videocard and the AlphaCool NexXxoS that I want to use. http://www.performance-pcs.com/catalog/index.php?main_page=product_info&cPath=59_971_240_580&products_id=32269
> As I said earlier, I'd rather avoid EKWaterblocks because of their accusations against SidewindersPC and especially Petra's, for accusing their PT Nuke biocide for causing the erosion of the electroplating on EK's blocks, when, in fact, it was the poor electroplating of EK themselves that was the actual problem. I just can't put up with a manufacturer that blames innocent companies for EKWaterblocks own faults.
> Thanks again.


AlphaCool was not a company I contacted when searching for blocks with inductor-area clearance, so I cannot confirm. Telling whether your card has the new components or the old is really simple: just look straight down through the fan blades. Can you see square parts? Do they have a big "C" on them? If so, you have the model with the updated inductors. For reference, here's a picture showing them.
The inductors are the five 'cubes' inward of the capacitors and PWM MOSFETs. There is a row of five directly to the left of the purple-topped capacitor.


The only company I received confirmation of fitment from was EK. Koolance said that nobody has ever reported their Rev 1.1 block not fitting, but did not confirm that they had been made aware of the taller inductors and had revised the block for them. However, I suspect that's exactly what happened. On the models with the original inductors, you cannot see them through the fan blades. With the new inductors came an entirely new heatsink base-plate design.
Quote:


> Originally Posted by *Masked*
> 
> Okay, since I happen to know the details about this, I'll answer.
> There was, is no, and has never been an "official" revision of this card and the PCB has never been revised.
> What's that mean? According to anyone you ask, there has never been a revision.
> What Nvidia does, frequently, actually...Is they'll occasionally fix an "issue" on the line and that's that.
> In the situation of the 590, it looks as if that half way through the second run, they "upgraded" some of the core components.
> _Now if you know anything about EE, it's not really an upgrade it's more/less changing a pathway._
> In no way is this a revision and in no way does this change the performance of the card.
> What it DID do is change the height of some of the components by .5mm ~ This throws off any waterblock because like I said, there's never been an official revision.
> Now, there are a couple brands that actually revised their blocks upon noticing that a "change" had been made and made the necessary changes.
> It's best if you plan on using water, to check with the MFCTR and make sure your block will fit.


Well, the italicized part isn't quite true. In no way whatsoever did this change a pathway. If a pathway were changed, a new PCB design would be required, since the copper traces are what determine the pathing. Changing inductors for beefier models (what Nvidia did in this case) is like changing a resistor: no pathways were altered, merely a better-quality component put in the old one's place. Many people were complaining (wrongly) about an inadequate VRM scheme, when all but one documented failure has been a resistor or capacitor. The failure where the VRM MOSFET blew, however, came after the reviewer bypassed a fail-safe resistor on the input line and then overvolted. The MOSFETs themselves aren't a problem, though; they're rated for 12V at an average current of 35A each (per datasheet). The previous inductors were rated for about half the amperage of the current ones. I'm not sure what Nvidia's logic behind changing them was.

In everyday usage, the card is no different. What you get, however, are components that run cooler and have a higher tolerance for heat and voltage. The chance of a card having coil whine (inductors = coils) is also lower, though in no way eliminated. Really, what the inductors do is protect the silicon, which is why I am a tad confused about why they felt it needed to be beefed up.

I have read SO MANY datasheets for parts on this card. I don't particularly enjoy thinking about how many hours/days were put into that research.

ETA: This WB also just looks _so good_. I'm a chump for aesthetics, and this delivers.


----------



## Recipe7

Still awaiting your results, MKHunt


----------



## Image132

Quote:


> Originally Posted by *MKHunt*
> 
> ETA: This WB also just looks _so good_. I'm a chump for aesthetics, and this delivers.


I have to agree, Hunt. Best-looking waterblock for the 590 by far. After the new year I'm going to order one for my 590.









It'll make my card look beast.


----------



## Masked

@Mk

They told us it was for a "VRM pathway", according to our email... I basically had to dig for a week to even get that.

I do understand that may not be what happened... I could pull out 5 revisions from the 480 series just to prove that one (haha).

That's interesting, though, because I thought if you changed a pathway (i.e. the order) the entire PCB didn't need to be changed, just the core lanes? No?

Learn something new every day.


----------



## MKHunt

Quote:


> Originally Posted by *Masked*
> 
> @Mk
> They told us it was for a "VRM Pathway" according to our email...I had to basically gauge for a week to even get that.
> I do understand that may not be what happened...Could pull out 5 revisions from the 480 series just to prove that one (Haha)
> That's interesting though because I thought if you changed a pathway (I.E. order) that the entire PCB didn't need to be changed just the core lanes? No?
> Learn something new every day.


Oh man, that is truly ambiguous! XD They may have changed something before the MOSFET/inductor area. That I'll never know, because the traces are way too fine for me to follow by eye, and Nvidia will never release a nice circuitry diagram.







Of course, you could also read it as "the new inductors are for a VRM pathway," which is true. It would be about like saying "the knife by the bananas is for the bananas." If you got it from Nvidia themselves, there could be some translation fog as well.

I help correct translations for a friend high up in a clothing manufacturing company based in Japan (Descente), and I've noticed that he likes to omit words that designate purpose. No idea if Chinese is the same (IIRC they speak Chinese in Taiwan?). English over there is taught by correlating it to the native grammar (CN, JP, KR) rather than re-teaching English grammar itself, which is where we get 'Engrish' from!









However I can say with certainty that the MOSFET/Inductor area is pretty much unchanged except for a component swap. For all I know, nvidia may have lost the original inductor manufacturer to a secret alien attack.


----------



## kzinti1

Quote:


> Originally Posted by *MKHunt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kzinti1*
> 
> Thanks Juggalo! Here are the numbers of the posts in this thread that got me all confused: Posts #3184-#3187 and post #3190.
> If EVGA used different parts of different sizes then isn't that the same as a different revision? These people said they had difficulties in getting their waterblocks to fit properly and that is the basis of my concern about my own videocard and the AlphaCool NexXxoS that I want to use. http://www.performance-pcs.com/catalog/index.php?main_page=product_info&cPath=59_971_240_580&products_id=32269
> As I said earlier, I'd rather avoid EKWaterblocks because of their accusations against SidewindersPC and especially Petra's, for accusing their PT Nuke biocide for causing the erosion of the electroplating on EK's blocks, when, in fact, it was the poor electroplating of EK themselves that was the actual problem. I just can't put up with a manufacturer that blames innocent companies for EKWaterblocks own faults.
> Thanks again.
> 
> 
> 
> AlphaCool was not a company I contacted when searching for blocks with inductor area clearance so I cannot confirm. Telling if your card has the new components or old is really simple. Just look straight down through the fan blades. Can you see square parts? Do they have a big "C" on them? If so, you have the model with the updated inductors. For reference, here's a picture showing them.
> The inductors are the five 'cubes' inward of the capacitors and PWM MOSFETs. There is a row of five directly to the left of the purple-topped capacitor.
> 
> 
> The only company I received confirmation of fitment from was EK. Koolance said that nobody has ever said that their Rev 1.1 block hasn't fit, but did not confirm that they had been made aware of the taller inductors and had revised the block for those. However I suspect that's exactly what happened. On the models with the original inductors, you cannot see them through the fan blades. With the new inductors came an entirely new heatsink base-plate design.
> Quote:
> 
> 
> 
> Originally Posted by *Masked*
> 
> Okay, since I happen to know the details about this, I'll answer.
> There was, is no, and has never been an "official" revision of this card and the PCB has never been revised.
> What's that mean? According to anyone you ask, there has never been a revision.
> What Nvidia does, frequently, actually...Is they'll occasionally fix an "issue" on the line and that's that.
> In the situation of the 590, it looks as if that half way through the second run, they "upgraded" some of the core components.
> _Now if you know anything about EE, it's not really an upgrade it's more/less changing a pathway._
> In no way is this a revision and in no way does this change the performance of the card.
> What it DID do is change the height of some of the components by .5mm ~ This throws off any waterblock because like I said, there's never been an official revision.
> Now, there are a couple brands that actually revised their blocks upon noticing that a "change" had been made and made the necessary changes.
> It's best if you plan on using water, to check with the MFCTR and make sure your block will fit.
> 
> 
> Well, the italicized part isn't that true. In no way whatsoever did this change a pathway. If a pathway were changed, a new PCB design would be required since the copper traces are what determine the pathing. Changing inductors for beefier models (what nvidia did in this case) is like changing a resistor. No pathways were altered, merely a better quality component put in the old one's place. Many people were complaining about an inadequate VRM scheme (wrongly) when all but one documented failure has been a resistor or capacitor. The failure where the VRM MOSFET blew, however, was after the reviewer bypassed a fail-safe resistor on the input line then overvolted. The MOSFETs themselves aren't a problem though. They're rated for an average current of 12V and 35A (per datasheet) each. The previous inductors were rated for about half the amperage of the current inductors. I'm not sure what nvidia's logic behind changing them was.
> 
> In everyday usage, the card is no different. What you get, however, are components that run cooler and have a higher tolerance for heat and voltage. The chance of having cards with coil whine (inductors = coils) is also lower though, in no way, eliminated. Really what the inductors do is protect the silicon; which is why I am a tad confused why they felt they needed to be beefed-up.
> 
> I have read SO MANY datasheets for parts on this card. I don't particularly enjoy thinking about how many hours/days were put into that research.
> 
> ETA: This WB also just looks _so good_. I'm a chump for aesthetics, and this delivers.

Thank you, MKHunt! That's exactly what I needed. It looks like I'll have to go with EK Waterblocks anyway, plus an EK backplate.

Your post and pics should be posted in the watercooling section of OCN and every other forum that addresses the watercooling public on the web. Maybe then ALL manufacturers will revise their blocks so that no GTX590 owner puts their card at risk by trying to install an improper waterblock again.

Then again, since VGA blocks cannot be returned after they're removed from their packaging, the manufacturers have no incentive to fix this problem. They'll still get their money. They don't have to worry about repeat business, because so few companies make these things that if they lose one customer, another 3 will have already bought from them. That's one helluva vicious circle.

Thanks again. You've given me the best help I've asked for in a long time.
BTW, as far as I'm concerned, any change whatsoever to a reference product is a new revision, whether it's marked as one or not. And it damned well should be marked that way.


----------



## MKHunt

Glad I could help. Oh, and as a note to EKWB users with the newer 590s: try putting a 0.5mm thermal pad on your NF200 chip. When dry fitting I could see a thin gap of empty space between the NF200 and the WB. EK recommends using TIM there, but I don't believe a super-thick layer of TIM will work as well as a nicely compressed thermal pad; TIM is meant to fill microscopic gaps, not bridge sections of open air. I don't really know how much heat that chip even generates, but it seems safer to make sure it has a well-established escape route. The only TIMs I would consider acceptable in this situation are Shin-Etsu or ICD, but I would still prefer a thermal pad.

I'll work on figuring out how much space there is between the inductor tops and the water block so I know how much pad to put up there (the instructions don't even say to put thermal material on them). EK also says to put a small blob of TIM on each PWM MOSFET _then_ add the thermal pad. Everywhere I've read says not to combine thermal pads and TIM, so if anyone more in the know can advise, I'd appreciate it. My old KL block used pads with no thermal paste, so if nobody can answer definitively, that's probably what I'll do.


----------



## MKHunt

Wow, EK played it super safe. There's about 1.2mm of space between the top of the new inductors and the WB. If you choose to cool them, I would suggest either a 1.0mm thermal pad plus TIM (between pad and block) or maybe a 1.5mm pad, though that might push the NF200 chip a little further from the block. It all depends on how squishy/stretchy your pads are.

Personally, I'm going to use the pink (1.0mm) Koolance pads with thermal paste on the pink side. It'll still save paste compared to the EK instructions, since I'm going to cut up some toothpicks to hold the spacers in place rather than waste TIM.


----------



## ReignsOfPower

I installed an EK waterblock on my GTX590 Classified only last week; it worked great. The NF200 chip touches the waterblock just fine with a small dab of thermal paste. Besides, I'm not too worried about that chip. The VRMs, memory and GPUs are what matter most.







Check out my build log in my sig if you're interested in some pictures


----------



## Wogga

Glad you all came around to EKWB in the end =)
I've used them for 3-4 months already, and I have to say the only problem is the hot backplate.


----------



## Xyphyr

Quote:


> Originally Posted by *iARDAs*
> 
> Ok guys
> 
> I currently have a MSI Twin Frozr III 570
> 
> It is factory OCed @ 770/2000
> 
> If i go for a 590 how big a performance increase would i have?
> 
> 20%?
> 50%?
> 100%?


Probably ~50% or more; on one monitor it whooped my 6850 CrossFire setup (better than a 580). Being able to max out any game is nice, but it takes that $$$.


----------



## OutlawNeedsHelp

I'm about to put my 590 under water. Any tips on how to take off the stock heatsink and install the waterblock? It's the EVGA Classified reference 590.


----------



## MKHunt

Start by watching all of Juggalo's videos. Here's number one.






Also, you have a PM.

Cliff notes:
Remove all visible screws from the backplate and back of the card.
Remove the backplate by gently pulling up.
Remove the hex-shaped DVI port screws by gripping them with pliers and unscrewing.
Remove the two Phillips head screws on the faceplate (the one with the DVI ports).
Remove the TWO Torx (T-4?) screws from each corner of the card's backside. One holds the port faceplate and the other holds one of the corners.
CAREFULLY pry the heatsink off.
Unhook the fan cable and LED cable.
Clean the card.
Follow the WB manufacturer's instructions.

Shouldn't take more than an hour or so, but go slow and be sure to look at everything you want to see. Oh, and still watch Juggalo's videos before you attempt my cliff notes. He disassembles the 'long way', but it's important to understand how things are held together.


----------



## OutlawNeedsHelp

Quote:


> Originally Posted by *MKHunt*
> 
> Start by watching all of Juggalo's videos. Here's number one.
> 
> 
> 
> 
> 
> 
> Also, you have a PM.
> 
> Cliff notes:
> Remove all visible screws from the backplate and back of the card.
> Remove the backplate by gently pulling up.
> Remove the hex-shaped DVI port screws by gripping them with pliers and unscrewing.
> Remove the two Phillips head screws on the faceplate (the one with the DVI ports).
> Remove the TWO Torx (T-4?) screws from each corner of the card's backside. One holds the port faceplate and the other holds one of the corners.
> CAREFULLY pry the heatsink off.
> Unhook the fan cable and LED cable.
> Clean the card.
> Follow the WB manufacturer's instructions.
> 
> Shouldn't take more than an hour or so, but go slow and be sure to look at everything you want to see. Oh, and still watch Juggalo's videos before you attempt my cliff notes. He disassembles the 'long way', but it's important to understand how things are held together.


I've watched his vids, TTL's video, one from Linus and one from Danger Den.


----------



## MKHunt

Just finished mounting the EK block. I have some recommendations that might help people out. There are several areas on the block with extrusions for contact, but the instructions say nothing about them, so I made a modified panel from the instructions.



-The red bits are REQUIRED but not mentioned in the instructions.
-The blue bit is highly recommended.
-Orange is a personal recommendation. When I tested dry fitting, there was a bit more space between the NF200 chip and the block than I cared for, about enough for two to three sheets of paper. I feel that's too large a gap for thermal paste, so I re-purposed my Koolance thermal pads. A well-compressed thermal pad will transfer heat better than loose TIM.
-The green is completely optional. My first Koolance block had direct-contact inductor cooling and I liked that. These components CAN handle up to 65A of current, so I want them cooled correctly.

Also, hooray for MS Paint.


----------



## Cooperdale

Hello guys, I've just joined the club with a stock Asus 590. It gave me some trouble the first few days, but that's just because I mistakenly used a single rail for both power connectors on the card; sometimes there would be no current, and BSODs ensued. Anyway, that's fixed now.

Nonetheless, I must say I am a bit disappointed: the performance is great, but I had read the card was particularly silent for a dual. Well, not for me. Playing Dragon Age 2, temperatures go sky-high, about 86-87°C at times, and with the default profile the fan is like a twister. I tried changing the profile with Afterburner, but then the National Guard gave the wind from my case a girl's name and had the town evacuated.

So the first thing I'm going to do is buy a side window for the case (BitFenix Colossus) with one or two silent fans; this lowered temps by almost 15°C in my old rig. I hope with lower temps I'll be able to bear the noise.

But I'm starting to think about watercooling too... I never got into it because I'm lazy; I like to build but I hate recurring maintenance. This card is almost convincing me, though: love the performance, hate the noise.


----------



## Tuthsok

Quote:


> Originally Posted by *Cooperdale*
> 
> But I'm starting to think about water-cooling too... I never got to it because I'm lazy, I like to build but I hate recurring maintenance: this card is almost convincing me though, love the performance but hate the noise.


Video card noise is the exact reason I started watercooling. I am running a pair of 590 cards (EVGA, with the stock Hydro Copper blocks), and during gaming I run the fans on the radiators at about 20%. The loudest thing in my system is the PSU. When running everything at full load (Folding@home), the fans get loud from running at 100% to keep things cool (all cores stay below 50°C).


----------



## Cooperdale

That's very tempting, I must say: full silence while gaming.

But it looks like I was rash to think the system is stable.

Even after using two rails to power the card, I can't run Unigine Heaven for more than a few minutes. It either crashes or there's that damn "driver stopped responding" error and then the PC freezes...

I don't know what the problem may be; I haven't even tried overclocking yet... Temps are high, but not so high as to crash the card.


----------



## RushMore1205

I think my cup summarizes how I live. Can I please join the club?

What happened to the 590? Can't buy one anywhere now.


----------



## rush2049

The 590 was a limited run of cards. Anything that is up for sale now is RMA stock being sold off.

We really need to edit the OP to include some of this info.


----------



## MKHunt

Quote:


> Originally Posted by *rush2049*
> 
> The 590 was a limited run of cards. Anything that is up for sale is RMA stock that is being sold.
> We really need to edit the op to include some of this info.....


I agree. A lot of learning has happened since the OP.


----------



## Image132

Quote:


> Originally Posted by *rush2049*
> 
> The 590 was a limited run of cards. Anything that is up for sale is RMA stock that is being sold.
> We really need to edit the op to include some of this info.....


So the card is reaching the end of its life already? Sad.

I was just planning on buying another one...


----------



## iARDAs

I can't understand what "limited run of cards" means.

Will I not be able to find any 590s in the future?


----------



## Smo

Quote:


> Originally Posted by *iARDAs*
> 
> i cant understand what limited run of cards mean
> Will i not be able to find any 590s around in the future?


Not brand new, no.


----------



## iARDAs

Quote:


> Originally Posted by *Smo*
> 
> Not brand new, no.


Hmmm.

In that case, sadly, I will lean away from getting a 590.

I was thinking of getting one now and one later, but I live in Turkey and it might be hard to find a second-hand one.

Sad.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Hmmm
> In that case sadly i will lean away from getting a 590.
> I was thinking of gettin 1 now and 1 later but i live in Turkey and it might be hard to find a 2nd hand one.
> Sad


The card was always a limited edition... the press releases in November 2010 even stated: "This is a limited edition card."

I really don't understand how this is shocking news to anyone, because it's been what, 14 months since the first press releases said it was a limited edition card?

The card has been OUT of production since... I believe February... because production of the 580 3GB ramped up just AFTER that.

Right now, the stock that you see is unused RMA stock that's been authorized for sale, OR they've held some back... retailers often hold onto stock and then release it during the holidays hoping for a quick sale.


----------



## Mongol

I grabbed Newegg's last two EVGA HC 590s a couple of weeks ago (for my next build).

I actually mounted them in my current build for the time being, and I'll post proof when I get home.


----------



## MKHunt

Good lawd! Omg I am an aesthetics slut.



Leak testing, commence!


----------



## Shinobi Jedi

Quote:


> Originally Posted by *MKHunt*
> 
> Good lawd! Omg I am an aesthetics slut.
> 
> Leak testing,commence!


That is Super-Pimp-Jedi, MK.

For real.

That is the sexiest Hydro card I've ever seen anywhere. I'm ashamedly not a WC guy, as I don't have any experience with watercooling and am paranoid that, with my luck, I'd get a leak. And since Jacob at EVGA told me in a PM that watercooling leak damage is covered by warranty only under conditions so strict that they may as well not cover it at all (imho), it talked me out of it altogether.

Seeing that card makes me reconsider. It's that hot. Congrats.

I'd seriously consider buying a WC PC that was built for me. Especially from AW, since I've had a great experience with my M11x, and Masked's involvement here shows me that they employ passionate enthusiasts who truly know their product, rather than people who are just there to holla so they can make the ol' mighty dolla, dolla!

But building one? I don't think I could. Unless I won the lottery or came into some obscene amount of money, so I wouldn't care when I inevitably fried the first round of gear while learning what to do.









Or maybe I'll buy one for the next build and use this build or older parts as a guinea pig. I know they're not that complicated, but I wouldn't put it past myself to botch something anyway, since I have absolutely no personal experience with them.

I definitely want one though, no doubt.

Especially after seeing that Supafly card.

Congrats again MK! I know you've gone through a lot of headaches and hassles, but it looks to have turned out to be totally worth it!


----------



## MKHunt

Thanks man. I put it together this afternoon and made the me gusta face. Suddenly everything had come together and I felt really happy. It just looks so amazing. Oh, and KL approved my RMA and said they shipped a replacement block my way today. Not sure what to do with it. Maybe get another 590? But then the blocks wouldn't even come close to lining up for SLI, and the PSU won't support it.

The bend it makes going into the bottom res....

motherofgod.jpg.



Tubing wanted to kink a bit so the zips keep it in check until I can get the rig up to full temp for some days and reshape the tubes.


----------



## RushMore1205

what fans are you using?


----------



## MKHunt

Gelid Wing 12 PL (1850rpm version). They're pretty quiet (fan has to touch your ear before you hear the motor) but move a lot of air, so the air noise is higher than something like a CM R4.

That said, I can sleep with all six running at full tilt. Creates a good white noise


----------



## RushMore1205

New build incoming

GTX 590 Classified EVGA for sale, $600 SHIPPED, less than 5 hours of use!!!


----------



## Image132

Quote:


> Originally Posted by *MKHunt*
> 
> Good lawd! Omg I am an aesthetics slut.
> 
> Leak testing, commence!


Again absolutely sexy! I want it









Please take some more photos.


----------



## MKHunt

Quote:


> Originally Posted by *Image132*
> 
> Again absolutely sexy! I want it
> 
> 
> 
> 
> 
> 
> 
> 
> Please take some more photos.


Hahaha I can make more, I'm sure. I just don't know what else there is to take a picture of.

ETA: This card has zero coil whine whatsoever. Dead silence. And it has the new .938V BIOS, which will hopefully let me OC moar. I applied my old folding OC (685) and it's taking it like a champ. Maybe 700? Nah, that's probably asking for too much.


----------



## Wogga

oh, coil whine is the only thing that stops me from 24/7 folding

SLi with these blocks will look like this


----------



## MKHunt

Wogga I really like how you've accounted for the copper color in your sleeving. Not having any other copper was the only reason I got nickel.


----------



## Smo

Gorgeous!


----------



## OutlawNeedsHelp

Hunt, how did you get the block standoffs on?
NVM, got it.


----------



## RushMore1205

got a 590 Classified for sale if anyone interested!!!!!!!!!!!!!!!!!!!!!!


----------



## OutlawNeedsHelp

Quote:


> Originally Posted by *RushMore1205*
> 
> got a 590 Classified for sale if anyone interested!!!!!!!!!!!!!!!!!!!!!!


I would love to have another, but I can't justify the price. Sorry.


----------



## RushMore1205

Quote:


> Originally Posted by *OutlawNeedsHelp*
> 
> I would love to have another, but I can't justify the price. Sorry.


make me an offer


----------



## rush2049

Hey rushmore1205,

I've got a buddy that is interested in that card, sent you a pm.


----------



## OutlawNeedsHelp

Quote:


> Originally Posted by *RushMore1205*
> 
> Quote:
> 
> 
> 
> Originally Posted by *OutlawNeedsHelp*
> 
> I would love to have another, but I can't justify the price. Sorry.
> 
> 
> 
> make me an offer
Click to expand...

lol, I don't have more than $100 since I just put a lot of money into WC'ing my own 590.


----------



## Masked

I've also just recently put mine up for sale...

Can't justify it anymore :/.

Going to keep my other 1 but, having 2 just isn't justifiable anymore.


----------



## PeteMcTee

I'm interested Rushmore. This is rush2049's cousin, I'll send u a PM.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> I've also just recently put mine up for sale...
> 
> Can't justify it anymore :/.
> 
> Going to keep my other 1 but, having 2 just isn't justifiable anymore.


What are you going to get replace it with Masked, if anything?


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> What are you going to get replace it with Masked, if anything?


I have 2 samples at the moment that are working well.

Although, in the long run...Not sure yet.

I have a love/hate relationship with this card...Especially in SLI (4x)...

I'm still having 2nd thoughts about selling this one.

I guess a lot depends on what Nvidia decides to do in the next few rounds of driver releases...Then I'll more than likely make a final decision.


----------



## Image132

Quote:


> Originally Posted by *Masked*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Shinobi Jedi*
> 
> What are you going to get replace it with Masked, if anything?
> 
> 
> 
> I have 2 samples at the moment that are working well.
> 
> Although, in the long run...Not sure yet.
> 
> I have a love/hate relationship with this card...Especially in SLI (4x)...
> 
> I'm still having 2nd thoughts about selling this one.
> 
> I guess a lot depends on what Nvidia decides to do in the next few rounds of driver releases...Then I'll more than likely make a final decision.
Click to expand...

What's wrong with quad SLI? I was thinking of getting another 590. Is that a bad idea now?


----------



## MKHunt

Folding 700mhz stock volts. This is fantastic.


----------



## Wogga

PPD? Mine is making 15k per GPU @ 670MHz (can't get it higher, the coil whine blows my head off)


----------



## MKHunt

Quote:


> Originally Posted by *Wogga*
> 
> PPD? Mine is making 15k per GPU @ 670MHz (can't get it higher, the coil whine blows my head off)


V7 shows an estimated PPD of 15.6K. But I have a sneaking suspicion this card has a bit more left in it. I've just slowly been increasing. Haven't had to lower anything yet.

ETA: At 705 I pick up a coil whine I can only hear with the case side off. The new BIOS raises the volts to .938, so I'm thinking I can probably push decently into the 700s. Maybe 715 to 720? Slowly inching closer to those 580s, though the memory will never reach that speed. The IMC has too little juice.


----------



## Recipe7

Quote:


> Originally Posted by *MKHunt*
> 
> Folding 700mhz stock volts. This is fantastic.


When do you think I should RMA my ASUS, MKH?


----------



## teichu

I just bought a GTX 590. Does anyone know which drivers perform better for the 590? Thanks


----------



## MKHunt

Quote:


> Originally Posted by *Recipe7*
> 
> When do you think I should RMA my ASUS, MKH?


Lul.



This card... winrar. I'm hoping for 730.


----------



## Recipe7

You have the best 590 ever made. At least the best one ever documented.


----------



## MKHunt

But wait there's more.


----------



## Recipe7

But wait... there's more... much more, I hope.

All the way to the next page.


----------



## Krazeswift

3D mark 11 it! Shouldn't get any PDL at that voltage hopefully.


----------



## MKHunt

My CPU is so bone stock it hurts. But OK, expect to see a bottleneck. I see graphics score gains up to 4.5GHz.

Edit: Won't bench at those speeds. But F@H shows no errors in the WUs, so w/e. It's a folding clock anyway. I game at 680 core and below.


----------



## Krazeswift

Quote:


> Originally Posted by *MKHunt*
> 
> My CPU is at so bone stock it hurts. But Ok. Expect to see bottleneck. I see graphics score gains up to 4.5Ghz
> 
> Edit: Won't bench at those speeds. But F@H shows no errors in the WUs, so w/e. It's a folding clock anyway. I game at 680 core and below.


Damn, that's a shame. Reckon you could've been on to something with such low voltage.


----------



## MKHunt

It's working out to about 16.2k PPD per core (735), so I'm not complaining.
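For anyone wondering how PPD figures like these shake out, the arithmetic is just the credit per work unit times the number of units a card finishes in a day. A minimal sketch; the 925-point credit and ~82-minute completion time below are made-up illustrative numbers, not figures from this thread:

```python
# Rough Folding@home points-per-day estimate:
# PPD = credit per work unit * work units completed per day.

SECONDS_PER_DAY = 86_400

def estimate_ppd(credit_per_wu: float, seconds_per_wu: float) -> float:
    """Points per day from a WU's credit and its completion time."""
    wus_per_day = SECONDS_PER_DAY / seconds_per_wu
    return credit_per_wu * wus_per_day

# Hypothetical numbers: a 925-point WU finishing in ~82 minutes
# lands in the ~16k PPD range being quoted here.
print(round(estimate_ppd(925, 82 * 60)))
```

Bonus-point schemes change the credit term, but the per-day math stays the same.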


----------



## Krazeswift

I bet, and I imagine you're happy to have your system up and running again


----------



## MKHunt

Quote:


> Originally Posted by *Krazeswift*
> 
> I bet, and I imagine you're happy to have your system up and running again


The day it went live I said, "Finally! My computer is alive again! Now what should I do first?" *Set to folding* *put on a loop of every Futurama episode and click random* *get in bed and go to sleep*

Yup. That's how I justify my rig.


----------



## tehvampire

What kind of performance are you guys getting with Batman Arkham City?
With my GTX 590 I get about 60fps almost all the time in DX11 with everything on max (using 4x MSAA).

But when I turn on PhysX I'm getting about 30-40fps.


----------



## Mongol

Quad SLI EVGA GTX590 Classified HC has been installed into my primary rig as my new rig is in the works.
(after my 3 580's return from RMA, they'll be getting a 4th little brother as Kyre Banorg comes to life)


----------



## MKHunt

Gimme that sweet, sweet PPD.


----------



## Wogga

Ah... v7.
It shows lower PPD for me than 6.34, so I stopped using it.
Maybe I should try it now =) install drivers with voltage regulation and fold @ 800


----------



## MKHunt

Quote:


> Originally Posted by *Wogga*
> 
> ah..v7
> it shows lower PPD for me than 6.34 so i stopped using it
> maybe i should try now =) install drivers with voltage regulation and fold @ 800


Just remember that folding usually draws more power, which could expose a bad cap or resistor. I'd just flash the newest BIOS and see what happens.

ETA: I accidentally just gamed with the core at 730. Well... let's just say there were zero problems or hiccups at all.







Dirt 3 maxed out.


----------



## GLuGGy cOw

Quote:


> Originally Posted by *jdangond*
> 
> There's a 360 rad up top there


Hi, sorry, I'm a new user and am planning to run the exact same loop as you in a Corsair 800D. What temps are you getting after long periods of stress testing on the GPU and CPUs, and what are the idle temps, please?

Thanks


----------



## rush2049

Has anyone noticed that Asus put up a new BIOS?

Link to the page: http://www.asus.com/Graphics_Cards/NVIDIA_Series/ENGTX5903DIS3GD5/#download
Direct Link to Bios File: http://dlcdnet.asus.com/pub/ASUS/vga/nVidia/BIOS/GTX590SS_BIOSUpdate.rar

I extracted it and looked at the bin file and it says it is version number:
ASUS ENGTX590 VB Ver 70.10.42.00.AS01M
aka
70.10.42.00.01/02

Anyone notice this was up there?

My card is an Asus NA version. The stock BIOS is: 70.10.37.00.01/02
I don't know what the version number was for the patched BIOS (the one that limited voltage increases). I never flashed to it, as it was stupid and wasn't official from Nvidia. Plus I didn't want anything extra to fight with while overclocking.

I will maybe flash to this new BIOS here... in hopes it will increase stability at higher clocks.


----------



## L1eutenant

I'm currently playing Batman: Arkham City, and when I turn the PhysX to High I get serious fps issues.

When I look at my GPU usage in-game, the cards are only running at 50% each. Why would the cards not run @ 100% to achieve maximum performance?

Is there any way to force this?

Take BF3; when I play that, my 590 runs @ 95-99%.

Any advice/help would be appreciated.


----------



## emett

Have you patched bat man?


----------



## L1eutenant

Quote:


> Originally Posted by *emett*
> 
> Have you patched bat man?


Through Steam, yes, I think so. I verified the files and it was 100%.

So unless the update is third party and not through Steam, it should be up to date.


----------



## kzinti1

I only just noticed something. This is a damned BIG thread! Page 33 by my settings.

How many GTX590's did Nvidia produce for all of their vendors?


----------



## Masked

Quote:


> Originally Posted by *L1eutenant*
> 
> Im currently playing Batman Arkham City and when i turn the Phys X to high i get serious fps issues.
> When i look at my GPU usage in-game the cards are only running 50% each. Why would the cards not run @ 100% to achieve maximum performance?
> Is there anyway to force this?
> Take BF3; when i play this my 590 is running @ 95-99%
> And advice/help would be appreciated.


Batman has some known issues atm...It's not your card, it's the game.

Nothing you can do about it short-term.


----------



## Saizer

Quote:


> Originally Posted by *kzinti1*
> 
> I only just noticed something. This is a damned BIG thread! Page 33 by my settings.
> How many GTX590's did Nvidia produce for all of their vendors?


Millions, I guess...


----------



## Masked

Quote:


> Originally Posted by *kzinti1*
> 
> I only just noticed something. This is a damned BIG thread! Page 33 by my settings.
> How many GTX590's did Nvidia produce for all of their vendors?


Not allowed to answer that.

Far from millions...Promise you that.


----------



## iARDAs

How big is an MSI 590 compared to an MSI Twin Frozr III 570?

I'm asking to find out whether it will fit well in my HAF 922 case.


----------



## Saizer

Quote:


> Originally Posted by *iARDAs*
> 
> How big is a MSI 590 compared to a MSI Twin Frozr III 570 in terms of size
> I am asking it to know if it will fit in my Haf 922 case very well.


Sell that and get a HAF 932 Advanced/X; there you will fit a big radiator + 4 590s in SLI


----------



## OutlawNeedsHelp

Quote:


> Originally Posted by *Saizer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iARDAs*
> 
> How big is a MSI 590 compared to a MSI Twin Frozr III 570 in terms of size
> I am asking it to know if it will fit in my Haf 922 case very well.
> 
> 
> 
> Sell that and get a HAF 932 Advanced/X; there you will fit a big radiator + 4 590s in SLI
Click to expand...

You mean 2...


----------



## iARDAs

Quote:


> Originally Posted by *Saizer*
> 
> Sell that and get a HAF 932 Advanced/X; there you will fit a big radiator + 4 590s in SLI


Hey bro.

I wish i could but not really an option for me.

Also i am not going to go for a SLI

1 590 will be what i want for the next 2-3 years.

After 2 years I might get a second 590 or a 7xx card, and would then change the case for sure.


----------



## Image132

Quote:


> Originally Posted by *iARDAs*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Saizer*
> 
> Sell that and get a HAF 932 Advanced/X; there you will fit a big radiator + 4 590s in SLI
> 
> 
> 
> Hey bro.
> 
> I wish i could but not really an option for me.
> 
> Also i am not going to go for a SLI
> 
> 1 590 will be what i want for the next 2-3 years.
> 
> After 2 years I might get a second 590 or a 7xx card, and would then change the case for sure.
Click to expand...

Before you make that choice, know that the 590 is a limited-edition card and is already coming to the end of its life on the shelves.

So if you want to buy a second one in a year or two, it will have to be second-hand.


----------



## Shinobi Jedi

Did anyone catch the review of the AMD 7970 on Anandtech?

The 590 was the top score in almost all the tests.

ETA: In terms of the 590, the review also seems to give a great indication of how much better the performance has gotten from the more mature drivers compared to its performance at launch.

Unless some kick ass display technology comes out before Kepler like a 2560x1600 monitor that can do 120hz, I'll definitely be sticking with these cards.

I'm really psyched on these cards for SW:TOR as they seem to be one of the few cards that can handle AA when written into the config.ini file.

Never in my life have I appreciated AA as much as I do in SW:TOR. It makes it soooo much prettier. Anyone who rolled a Jedi and had to stare at the jaggy arse practice blade your character has on their back the first 10 levels will know exactly what I'm talking about.

I'm not much of an MMO player, but I am really, really loving it. Kotaku's take on it being KOTOR III, but with eight stories and characters instead of one, is a pretty good description imho.

I'm going to roll each toon to see each story play out, but I started with a Jedi Consular. If it's mutually agreed that the Consular's story is the most boring, then consider me psyched, because I'm actually enjoying it quite a bit.

My buddies who are hardcore WoW players and I agreed that even if you didn't want to get into the endgame content and just wanted to play it solo/SP style through the leveling cycles, the fact that you have 8 fully fleshed-out narratives to carry the leveling of each of those classes brings a frackload of content, even if you never wanted to play with anyone else.

Maybe it's because I stick to my single BenQ 120Hz 1080p display and don't mess with Surround, but I still marvel at the performance I get from these cards, having had pretty much not a single issue running them since I've owned them (save any game-side bugs).

SW:TOR
Skyrim
Batman: AC
Deus Ex: HR
BF3
The Witcher 2

Just those titles alone show what a great year this has been, not just for gaming, but for PC gaming especially.

And an even better year for us 590 owners
















Happy Holidayz, 590 Pimps!


----------



## iARDAs

Quote:


> Originally Posted by *Image132*
> 
> Before you make that choice, know that the 590 is a limited-edition card and is already coming to the end of its life on the shelves.
> So if you want to buy a second one in a year or two, it will have to be second-hand.


Hmmm

True that

Perhaps a 590 can handle games in 3D @ 1080p for some time, and I can buy a 7xx card when they come out.

I currently have a single 570 and it handles games at 1080p very well. I am sure it will do so for the next 2 years.

But in 3D, either a 570 SLI or a 590 should do the trick.


----------



## Ubeermench

Got my 590 last night but its DOA









Have to wait until after Christmas for my replacements.


----------



## marti69

Hi, I need BIOS 70.10.42.00.02, please help


----------



## Arizonian

Quote:


> Originally Posted by *iARDAs*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Image132*
> 
> Before you make that choice, know that the 590 is a limited-edition card and is already coming to the end of its life on the shelves.
> So if you want to buy a second one in a year or two, it will have to be second-hand.
> 
> 
> 
> Hmmm
> 
> True that
> 
> Perhaps a 590 can handle games in 3d @ 1080p for sometime and I can buy 7xx cards when they come out
> 
> I currently have a single 570 and it handles game at 1080p very well. I am sure it will do so for the next 2 years.
> 
> But in 3D either a 570 SLI or 590 should do the trick.
Click to expand...

If I were you, I'd SLI your GTX 570 TFIII since you're halfway there. It will be pretty close to the same performance. You'll have great gaming for the next couple of years anyway.


----------



## rush2049

Quote:


> Originally Posted by *marti69*
> 
> hi need a bios 70.10.42.00.02 plz help


Go back a few posts.... but you really should get the one from your manufacturer.....


----------



## iARDAs

Quote:


> Originally Posted by *Arizonian*
> 
> If I were you, I'd SLI your GTX 570 TFIII since you're halfway there. It will be pretty close to the same performance. You'll have great gaming for the next couple of years anyway.


I changed my mind SO MANY times.

First of all, my mobo is not SLI compatible, so I'd need to change that. It just seems a hassle for me.

I have a Z68 mobo but got one that is only CrossFire compatible (rookie mistake).

So if I want to go 570 SLI, then the mobo needs to be changed, and I don't want to replace a mobo I bought 3 months ago.

But I might have to anyway.

If I already had an SLI mobo I would never ask this. But right now grabbing a 590 seems a good option too.


----------



## Qu1ckset

Any of you running nvidia surround on your 590 can you help me in this thread?

http://www.overclock.net/t/1188532/gtx-590-nvidia-surround-help


----------



## kzinti1

Quote:


> Originally Posted by *iARDAs*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Arizonian*
> 
> If I were you, SLI you GTX 570 TFIII since your half way there. It will be pretty close to same performance. You'll have great gaming for the next couple years anyway.
> 
> 
> 
> I changed my ming SO MANY times
> 
> First of all my MOBO is not SLI compatible so i need to change that. It just seems a hassle for me.
> 
> I have a z68 MOBO but got one that is only crossfire compatible. (rookie mistake)
> 
> So if i want to go 570 SLI than mobo needs to be changed and i dont want to replace a MOBO i bought 3 months ago.
> 
> But i might have to as well.
> 
> If i already had the SLI mobo than i would never ask this. But right now grabbing a 590 seems a good option too.
Click to expand...

According to these people: http://www.asus.com/Motherboards/Intel_Socket_1155/P8Z68V/#specifications the motherboard you have listed definitely supports Quad-SLI. And Quad-CrossFire, and Lucid Virtu.


----------



## toX0rz

Quote:


> Originally Posted by *Masked*
> 
> More like 100% but...Okay.


No, do some research.

A GTX 590 will *not* fetch double the FPS of a GTX 570, not even close, especially when we're talking about an overclocked 570.
There might be some exceptions at very high resolutions where the 570 will hit its VRAM limit earlier, but that's it (the HardOCP review is a good example, since it benches at resolutions that only a minority run, and *even then* the 590 just ties a GTX 570 SLI setup, which is *not 100% over a single 570*; people always forget the 10-20% scaling loss).

And if you'd looked a little closer, you would have noticed that the person asking how much of an improvement the 590 would be is running a single 24" screen, and at 1920x1080 you won't see more than a 60-70% boost over a single 570.

As a matter of fact, I've seen several occasions where a GTX 590 did not outperform a GTX 560 Ti SLI setup (at 1080p in particular; compare the Guru3D reviews, the 560s even have the edge sometimes).

Also:
Quote:


> Originally Posted by *Masked*
> 
> Also don't forget they cost practically the same, the 590 takes up 1 slot and has a far better resale value than the 570's do.
> ...


570 SLI is over €100 cheaper than a GTX 590 here in Germany, which is about $130 USD.
Newegg also has them cheaper, by around $60-90 USD.

I wouldn't be sure about the resale value either.
The upcoming high-end single GPU of the Kepler series will most likely perform similarly to a GTX 590, making the latter pretty much obsolete, considering its expected VRAM disadvantage as well as the general SLI and scaling issues compared to a single card.
So if someone's planning to sell their GTX 590, I'd advise doing it soon.
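To put rough numbers on the scaling argument above: the effective speedup of a two-GPU setup is the ideal 2x minus whatever fraction the second card loses to overhead. A quick sketch with illustrative loss figures (the post quotes both a 10-20% scaling loss and a 60-70% real-world boost; the loop below shows what each loss level implies, it is not measured data):

```python
# Effective speedup of a multi-GPU setup over a single card,
# given a per-extra-GPU scaling loss (0.0 = perfect scaling).

def sli_speedup(num_gpus: int, scaling_loss: float) -> float:
    """Ideal speedup minus the fraction lost to SLI overhead."""
    return 1 + (num_gpus - 1) * (1 - scaling_loss)

# A 10-20% loss implies 1.9x-1.8x; a 60-70% boost over a single
# card corresponds to a 30-40% loss. Either way, well short of 2x.
for loss in (0.1, 0.2, 0.3, 0.4):
    print(f"{loss:.0%} loss -> {sli_speedup(2, loss):.2f}x over one card")
```

The same formula makes clear why quad SLI (4 GPUs) pays the overhead three times over.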


----------



## MKHunt

Bad temp sensor? 18 seems unreal. The 24 I believe, because my room is like 64F or below.


Nevermind, my CPU is at 16C. My room is just really cold; 1 core gets toasty, I guess. At load they are identical.


----------



## kzinti1

I still can't unlock the voltage in Afterburner Beta. Is there some trick to it?


----------



## L1eutenant

Quote:


> Originally Posted by *kzinti1*
> 
> I still can't unlock the voltage in Afterburner Beta. Is there some trick to it?


Click on Settings (near apply etc.) and there will be a tab with options to allow you to change them


----------



## iARDAs

Quote:


> Originally Posted by *kzinti1*
> 
> According to these people: http://www.asus.com/Motherboards/Intel_Socket_1155/P8Z68V/#specifications the motherboard you have listed is definitely Quad-SLi Supported. And Quad-Crossfire, and Lucid Virtu.


Unfortunately mine is

Asus P8Z68V-*LE*

i should have got the no LE version

Stupid stupid mistake.


----------



## Masked

Quote:


> Originally Posted by *toX0rz*
> 
> no, do some research.
> A GTX 590 will *not* fetch double the FPS than a GTX 570, not even close, especially when we're talking about an overclocked 570.
> There might be some exceptions at very high resolutions where the 570 will hit VRAM limit earlier but thats it (the HardOCP review is a good example of that since it benches at resolutions that a minority is running and *even then* it just ties a GTX 570 SLI setup which is *not 100% over a single 570*, people always forget the 10-20% scaling loss.........).
> And if you'd looked a little closer, you would have noticed that the person asking how much an of an improvement the 590 would be is just running a single 24" screen and at a 1920x1080 res you wont see more than a 60-70% boost over a single 570.
> As a matter of fact, I've seen several occasions where a GTX 590 did not outperform a GTX 560 Ti SLI setup (at 1080p in particular, compare the gurud3d reviews, the 560's even have the edge sometimes).
> Also:
> 570 SLI are over 100€ cheaper than a GTX 590 here in germany which is 130 USD.
> Newegg also has them cheaper for like 60-90 USD.
> I wouldn't be sure about the resell value either.
> The upcoming high-end single GPU of the Kepler series will most likely perform similarly to a GTX 590 making the latter card pretty much obsolete considering it's expected VRAM disadvantage as well as general SLI and scaling issues compared to a single card.
> So if someones planning to sell their GTX 590 I'd advice to do it soon.


You know... I had this whole thing typed up, but you're just so wrong I genuinely don't feel like debating with "blind justice".

Never once did I say double FPS; I said twice the AMOUNT, $$$$.

I just sold my 590 HC on eBay for $1,100... Cain sold his for $1k per... I'd suggest you do YOUR research before running your mouth.

Have a merry Xmas.


----------



## MKHunt

Well, I'm having a merry holiday season. I did manage to expose instability in my 730MHz clock: if I fold + run Minecraft 3D + watch Flash video simultaneously, it gets choppy.

Temps have deviated a bit now that ambients are lower. It's probably down to general variation in the silicon and IHS mounting. Sadly this 2600K runs about 7C hotter than my last one and likes moar volts. That old one wasn't quite golden, but it was a solid polished silver.



I ran with 740 for a day, but that seemed to be getting greedy, so 730 it is. It games fine and doesn't artifact, but won't 3DMark. Whatever. PPD, baby!


----------



## Wogga

SWTOR is ridiculous =)
Why should I have drops to 50 fps? Sometimes I feel like I'm still on my old X1950.


----------



## rush2049

Just flashed my 590 to the voltage-unlocked BIOSes. (Thanks RagingCain)
Even though I was the one who originally supplied the Asus 590 NA ROMs, I waited till now to flash to the modded ones. So far so good. I am getting much less jumping between power states, which in my mind is much better for the hardware. Also, the lag when launching a 3D app is gone... almost instant context switching; it's nice... and all from removing the low-power states.


----------



## iARDAs

Decided to start a new thread about my other questions

but I can ask this one here better

MSI 590 or ZOTAC 590?

Which one would you go with?


----------



## Saizer

Quote:


> Originally Posted by *iARDAs*
> 
> Decided to start a new thread about my other questions
> but I can ask this one here better
> MSI 590 or ZOTAC 590?
> Which one would you go with?


Between MSI and Zotac the only difference will be price (we are talking about 2 reference cards)... If you ask me, go for the Zotac (that's what I bought, and it's awesome). But if you can, get the EVGA 590 Classy, just for the warranty... Anyway, these cards are beasts, so whichever card you get will be a wise decision.


----------



## iARDAs

Quote:


> Originally Posted by *Saizer*
> 
> Between MSI and Zotac the only difference will be price here (we are talking about 2 reference cards)... if you ask me go after the Zotac (cause that's what i bought and it's awesome). But if you can, get the EVGA 590 classy. Just for the warranty... Anyways, this cards are beast, so any card you get will be a wise decision.


Hey there bro

Thank you for the response

Where I'm buying, MSI and Zotac have the same price tag.

Unfortunately, we don't have the EVGA model here. I could find the Asus one, but it's a bit more expensive.

So can I safely say Zotac it is?

Also, do reference cards have any downsides? Just wondering.


----------



## MKHunt

All 590's are reference. So really the brands are only different in warranty and cosmetics.


----------



## iARDAs

I read in 2-3 places that the MSI 590 is very OCable.

It says you can OC it about 30% without any danger.

Is that true?

http://nurhakim.com/asus-and-msi-gtx590.html

Also, how OCable is the Zotac?


----------



## MKHunt

They are all the same board. The only difference between OCs is the same gamble you take when buying a CPU; variations in the dies.


----------



## iARDAs

Hmm, I see.

One last question, then:

If I get the Zotac, would I OC it using MSI Afterburner or another program?

Also, would I need an additional cooler if I OC my card by, let's say, 10%?


----------



## MKHunt

Just use AB to overclock. I use AB for my EVGA 590. They're all based on RivaTuner IIRC.

Cooling needed depends mostly on voltage since it is the largest determinant of heat. If you can get a 10% OC on stock volts, stock cooling should suffice.
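The "voltage is the largest determinant of heat" rule of thumb follows from the usual dynamic-power relation: power scales roughly linearly with clock but with the square of voltage. A minimal sketch; the 0.875V stock baseline below is an assumption for illustration, not a measured GTX 590 figure:

```python
# Dynamic power scales ~linearly with clock and with the square of
# voltage (P ~ f * V^2), so voltage bumps raise heat much faster.

def relative_power(clock_ratio: float, voltage_ratio: float) -> float:
    """Power relative to baseline for given clock/voltage multipliers."""
    return clock_ratio * voltage_ratio ** 2

# A 10% core OC at stock volts: ~10% more dynamic power.
print(relative_power(1.10, 1.00))
# Same OC with a bump from an assumed 0.875 V to the 0.938 V BIOS:
# the squared voltage term more than doubles the power increase.
print(round(relative_power(1.10, 0.938 / 0.875), 2))
```

So a stock-volts 10% OC on stock cooling is a much smaller ask than the same OC with extra voltage.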


----------



## OutlawNeedsHelp

Quote:


> Originally Posted by *iARDAs*
> 
> Hmm I see
> 
> One last question than
> 
> If i get the Zotac, would i OC it by using MSI afterburner or another program?
> 
> Also would i need additional cooler if i OC my card lets say 10%?


Afterburner works with all cards.
Yea, probably.


----------



## iARDAs

OK, thank you guys.

So far I am leaning towards the Zotac,

but I might have a change of heart to MSI at the last minute.

I already ordered one, but can switch to MSI since the card won't ship for another 24 hours.


----------



## iARDAs

OK, I've got one last question.

My motherboard is not an SLI motherboard.

I know it sounds stupid, but some people say that since the 590 has 2 Fermis it's like SLI. Since I will be using it in just 1 slot (the 16x one), I won't have problems, will I?


----------



## MKHunt

No problems. All the connections the 590 needs are already there in one slot.


----------



## kzinti1

Quote:


> Originally Posted by *L1eutenant*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kzinti1*
> 
> I still can't unlock the voltage in Afterburner Beta. Is there some trick to it?
> 
> 
> 
> Click on Settings (near apply etc.) and there will be a tab with options to allow you to change them
Click to expand...

Yes, that's how the voltage is unlocked. As I said, it does NOT work. The voltage isn't even listed; there's just a blank where it's supposed to be.


----------



## Wogga

You need drivers with unlocked voltage control. As I remember, 280.19 is the latest version that allows you to change the voltage.


----------



## MKHunt

Quote:


> Originally Posted by *Wogga*
> 
> you need drivers with unlocked voltage control. as i remember 280.19 are the latest that allow you to change voltage


I can change voltage with 285.62. Not up, but down.

Are you really, really sure that you have AB Beta? Should be Beta 9.


----------



## Wogga

I mean increasing. Lowering isn't so interesting.


----------



## iARDAs

Guys, I ordered my Zotac 590. What do I need to do in order to be part of the club?



----------



## Ubeermench

Read the OP?
*To be added please include in your post:*
**Some PROOF that you own the card (photo, screenshot etc. preferably with your OCN name)*
**Please include the BRAND of your card*
**Please include the CLOCKS you are running your card at. With proof please.*


----------



## iARDAs

Will take a pic tomorrow when the card arrives. Thank u











----------



## PeteMcTee

Has anyone seen Rushmore around? I've tried PMing him and I still can't get a hold of him. I bought a video card off of him about 2 weeks ago, I know Christmas and stuff has been going on but I'm starting to get worried.


----------



## Recipe7

It shows he has been recently online... like seconds ago.

You there Rushmore?


----------



## MKHunt

Quote:


> Originally Posted by *Recipe7*
> 
> It shows he has been recently online... like seconds ago.
> You there Rushmore?


----------



## iARDAs

Hey folks,

Just got my Zotac 590.

Here is my first 3Dmark score

http://3dmark.com/3dm11/2431022

Is it good?
normal?
low?

I will be happy if you can share some feedback

I overclocked my i5 2500K to 4.0GHz.

Should I go for 4.5GHz?

EDIT: NVM, I went for 4.5GHz and the result was

http://3dmark.com/3dm11/2431103

A definite improvement.


----------



## Mongol

Ok, odd issue I'm having here.

2 EVGA HC 590s in quad-SLI. I used 2 Danger Den SLI bridge connectors to run the cards in parallel (flow). There's definitely water passing through, yet my top card loads nearly 30C+ higher than the bottom card. At idle there's only about a 10C difference... almost like the bottom card is stuck at 2D speeds during gaming... there is constant activity, though. Should I attach my monitors with 2 on the bottom card and 1 on the top?

Thoughts?


----------



## Masked

Quote:


> Originally Posted by ***********
> 
> Ok, odd issue I'm having here.
> 2 Evga HC 590s in quad-sli. Used 2 danger den sli bridge connectors to run cards in parallel (flow) There's definitely water passing through, yet my top card loads nearly 30c+ higher than the bottom card. Idle, only about 10c difference...almost like the bottom card is stuck at 2d speeds during gaming...there is constant activity though. Should I attach my monitors with 2 on the bottom card and 1 on the top?
> Thoughts?


There's not enough water hitting the 2nd core on the first card...Never run parallel in a dual core-card setup...Ever.


----------



## Wogga

Hm... I'm running 2x 590s in parallel and barely hit 43C under load on both cards. But! I have 2 DDC 1T+ pumps in serial and a MoRa3 (9x120); the CPU and motherboard mosfets are also in the same loop. What I can say is that it seems you need more flow.


----------



## iARDAs

Hey there guys

Add me to the club



It's a Zotac 590, running at stock clocks.


----------



## Mongol

Quote:


> Originally Posted by *Masked*
> 
> There's not enough water hitting the 2nd core on the first card...Never run parallel in a dual core-card setup...Ever.


That's what I was afraid of. First timer when it comes to dual core cards. Thanks.


----------



## Ubeermench

I've tried multiple drivers and it still won't work. I'm getting error code 43.


----------



## iARDAs

Should I be worried if my GPU goes above 80°C?

Also, do you guys have a custom fan profile that you are very happy with?


----------



## Recipe7

I'm out of the country (so away from my computer), so I can't tell you exactly what my fan profile is set at.

Either way, I set the fan at:
95% for 85C
90% for 80C
75% for 70C
50% for 55C
40% for 50C and lower.

I forget what it's called in MSI Afterburner (temperature hysteresis, I believe), but there is a setting in the fan section where you input the number of degrees the temperature has to drop before the fan steps back down to a lower speed on your applied fan curve.

Hard to explain, but I set that part to (I believe) 8 to 10.

Wish I could be of more help
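That "only step down after the temperature has dropped by N degrees" behaviour is a hysteresis band. A minimal sketch of the idea (hypothetical names and logic, just to show the concept; not MSI Afterburner's actual code):

```python
# Illustrative fan curve with a temperature-hysteresis band.
# Assumed model of the behaviour described above, NOT a real fan API.

CURVE = [(50, 40), (55, 50), (70, 75), (80, 90), (85, 95)]  # (temp C, fan %)

class FanController:
    def __init__(self, curve, hysteresis_c=8):
        self.curve = sorted(curve)
        self.hysteresis_c = hysteresis_c
        # Start at the lowest step.
        self.trigger_temp, self.fan_pct = self.curve[0]

    def update(self, temp_c):
        # Ramp up immediately whenever a higher step is reached.
        for step_temp, step_pct in self.curve:
            if temp_c >= step_temp and step_pct > self.fan_pct:
                self.trigger_temp, self.fan_pct = step_temp, step_pct
        # Only ramp down once the temperature has fallen hysteresis_c
        # degrees below the step that set the current speed; this keeps
        # the fan from hunting when the temp hovers around a boundary.
        if temp_c <= self.trigger_temp - self.hysteresis_c:
            lower = [s for s in self.curve if s[0] <= temp_c]
            self.trigger_temp, self.fan_pct = lower[-1] if lower else self.curve[0]
        return self.fan_pct
```

The point of the band: slowing the fan raises the temperature slightly, which without hysteresis would immediately re-trigger the higher step and make the fan oscillate.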


----------



## iARDAs

Quote:


> Originally Posted by *Recipe7*
> 
> I'm out of the country (so away from my computer) so I can't tell you exactly what my fan profile is at.
> Either way, I set it the fan at:
> 95% for 85C
> 90% for 80C
> 75% for 70C
> 50% for 55C
> 40% for 50C and lower.
> I forgot what it's called in msi afterburner, but there is a setting in the fan section where you input a number which will designate when the fan will go down to a lower speed based on how low the temperature has gotten based on your applied fan percentage.
> Hard to explain, but I set that part to (i believe) 8 to 10.
> Wish I could be of more help


Thank you for the input; I will adopt it if I ever see a value above 80 from now on.

The reason I hit 85 was a rookie mistake on my part.

A thick bunch of cables was covering the exhaust vent where the heat is supposed to leave the card. Now that I've routed those cables elsewhere, I saw a max of 70 degrees playing BF3 in 3D. Those cables were covering about 40% of the vent.

I've already created a custom fan profile with the settings you gave me, but haven't activated it yet. If I see anything above 80 I will, though.

Thank you.


----------



## Recipe7

Yeah, those cables can be a nuisance. You can see how much of a nut I am about cabling in my 'Jesktop' PC... haha. I have 10 Noctua fans keeping the case cool


----------



## iARDAs

Quote:


> Originally Posted by *Recipe7*
> 
> Yeah, those cables can be a nuisance. You can see how much of a nut I am with cabling in my 'Jesktop' PC... haha. I have 10 noctua fans keeping the case cool


Honestly, if people here weren't talking about airflow inside the case, I would never have thought of moving that cable away from the end of my 590. But hey, I did, and running 10-15 degrees cooler is just FANTASTIC.

You've got too many fans, bro

I only have 3 in my case, I guess: 1 on the GPU, 1 in the PSU and 1 on the CPU... in other words, all STOCK COOLERS


----------



## PeteMcTee

Rushmore, did you ship my card? This seems to be the only way to contact you posting through this forum... I still haven't received it yet.


----------



## Semedar

Hey guys, sorry for the trouble, but for the wired Nvidia 3D Vision glasses, will any USB to Mini / Micro USB cable work? Or do you need the official Nvidia USB cable for the glasses?


----------



## MKHunt

Quote:


> Originally Posted by *Semedar*
> 
> Hey guys, sorry for the trouble, but for the wired Nvidia 3D Vision glasses, will any USB to Mini / Micro USB cable work? Or do you need the official Nvidia USB cable for the glasses?


Can't speak from experience, but I'd bet any cable will work. It's pretty hard to call it USB if the plugs differ from the standard


----------



## johnz

Well I feel late to the party.

I just got my brand-new RMA'd eVGA GTX 590 Classifieds after the previous ones I had were malfunctioning, somehow permanently under-volted at 0.913V...









Anyways, everything is okay now. Time for the proof!











eVGA GTX 590 CLASSIFIEDs @ 700/1400/1800 (stock 630/1260/1728)


----------



## MKHunt

Quote:


> Originally Posted by *johnz*
> 
> Well I feel late to the party.
> I just got my brand new RMA'd eVGA GTX 590 Classifieds after the other ones I previously had were malfunctioning due to being permanently under-volted at .913v somehow..
> 
> 
> 
> 
> 
> 
> 
> 
> Anyways, everything is okay now. Time for the proof!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> eVGA GTX 590 CLASSIFIEDs @ 700/1400/1800 (stock 630/1260/1728)


Ah, that was not an undervolt, actually. That was the original shipping voltage of the 590s.

Welcome


----------



## johnz

Quote:


> Originally Posted by *MKHunt*
> 
> ah that was not an undervolt actually. That was the original shipping voltage of the 590s.
> Welcome


Thank you! It was frustrating: no matter what driver I installed, they were locked at 0.913V, and I wasn't about to flash a BIOS and void my warranty just yet. eVGA had really nice support


----------



## Semedar

Quote:


> Originally Posted by *MKHunt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Semedar*
> 
> Hey guys, sorry for the trouble, but for the wired Nvidia 3D Vision glasses, will any USB to Mini / Micro USB cable work? Or do you need the official Nvidia USB cable for the glasses?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can't speak from experience, but I'd bet any cable can work. It's pretty hard to call it USB when the plugs are different from the standard
Click to expand...

Thanks, hopefully any generic USB cable will work.







Just bought the last pair on NewEgg as an Open Box.









http://www.newegg.com/Product/Product.aspx?Item=N82E16814998034R


----------



## MKHunt

Quote:


> Originally Posted by *Semedar*
> 
> Thanks hopefully any generic USB cable will work.
> 
> 
> 
> 
> 
> 
> 
> Just bought the last pair on NewEgg as an Open Box.
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814998034R


Holy reasonable price batman. If I had dropped the $$ on a 120Hz monitor I would be sooooooooo jelly.


----------



## OutlawNeedsHelp

lol, I just realized that I never got accepted in the OP. Page 272, post 5 was my proof.


----------



## iARDAs

The 590 in 3D Vision runs like a charm.

Except BF3, where I have to turn some settings down. I personally play BF3 with everything on high and all AA disabled.

I get around 50 FPS.


----------



## johnz

Quote:


> Originally Posted by *iARDAs*
> 
> 590 in 3Dvision runs like a charm.
> Except BF3 where I have to take down some things. I personally play BF3 everything high and disable all the AA.
> Get around 50 FPS.


Nice man! I have wanted to get a 3D Vision monitor for a while now, but I want to wait until some more of the LightBoost compatible ones come out. I am very eager to get a 3D Vision 2 kit. I wanna play some 3D, although I'm not a fan of 3D on the TV. Are the games in 3D easy on the eyes? I haven't tested it at all, so I would be purchasing on the hope that I like it.


----------



## Image132

I personally feel 3D is a bit of a gimmick. I tried it on my Samsung SA950 and it's pretty good (with some games), but it's not worth halving your performance for. I prefer gaming at 120Hz; far more satisfying.

With everything on ultra I get about 40-60 fps depending on the map and where I am. I'm waiting for some watercooling parts and then I'm going to overclock my CPU. I read somewhere that my CPU is a bottleneck right now at stock speeds :/


----------



## iARDAs

Quote:


> Originally Posted by *johnz*
> 
> Nice man! I have wanted to get a 3D Vision monitor for a while now, but I want to wait until some more of the LightBoost compatible ones come out. I am very eager to get a 3D Vision 2 kit. I wanna play some 3D, although I'm not a fan of 3D on the TV. Are the games in 3D easy on the eyes? I haven't tested it at all, so I would be purchasing on the hope that I like it.


Never game on a TV, as TVs don't support 3D Vision, only 3DTV Play, which is mediocre at best. Due to HDMI 1.4a restrictions, in 3DTV Play you can game at 1080p @ 24fps or 720p @ 60fps; both are horrible.

Of course, if you are going to get a monitor, wait for a good LightBoost one. The new 27-inch Asus is supposed to be good, but some of them are broken (the built-in emitters don't work), so it's better to wait a bit longer.

When I started playing in 3D Vision I set the depth at 15%, which is the default, and over time I took it up to 100%. I have no eye problems or anything else; it's just about getting used to it.

Also, not every game has a good 3D rating. Check this out: http://www.nvidia.com/object/3d-vision-games.html

Definitely aim for games rated Good or higher. I've actually never played one rated Poor, but I have played some that were Not Recommended, and yeah, they are not recommended for a reason.

So far I have personally played

Just Cause 2, Battlefield 3, Bad Company 2, Black Ops, Trine, Homefront, Fallout: New Vegas, The Witcher and Orcs Must Die, and let me put it this way: when I switch back to 2D, I can't stand those games anymore


----------



## iARDAs

Quote:


> Originally Posted by *Image132*
> 
> I personally feel 3D is a bit of gimick. I tried it on my samsung SA950 and it's pretty good (With some games) but it's not worth halving your performance for. I prefer gaming at 120Hz, far more satisfying.
> With everything on ultra I get about 60-40 fps depending on the map and where I am. I'm waiting for some watercooling stuff and then I'm going to overclock my cpu. I read somewhere that my cpu is a bottle neck right now at stock speeds :/


I feel you, bro

Even playing games at 60 fps in 3D doesn't feel as smooth as 60 fps in 2D, but I just love the feeling it gives me.

It's an evolving technology and I hope it gets better in the future...

God bless my 590, really. With the single 570 I used to have, things were not as good.


----------



## Image132

Quote:


> Originally Posted by *iARDAs*
> 
> I feel you bro
> Even playing games in 60 fps in 3d it doesnt feel as smooth as 60 fps in 2d, but i just love the feeling it gives me.
> Its an evolving technology and i hope in the future it gets better...
> God bless my 590 really. With a single 570 i used to have things were not as good.


Yeah, you could be right. I mean, when I first tried 3D I wasn't expecting anything amazing, but I tried Mass Effect 2 and New Vegas and I literally went "wow". Then I got lag (The Witcher 2) and sort of left it. 120Hz smoothness is good enough for me

Maybe when GPUs get enough horsepower to run it and it evolves a bit more, I'll give it another go.


----------



## MKHunt

Just noticed something interesting.

285.62 drivers: .938V max

290.53 drivers: .925V max

I can still fold at 730 though so... ?


----------



## Ubeermench

Finally got my 590 to work. I hate Windows sometimes... Now I'm just waiting for Newegg to get my other 590 in stock so they can ship a replacement.


----------



## OutlawNeedsHelp

My watercooling has worked: 26-28C idle and a max of 34C on one of the cores while playing BF3.


----------



## Cooperdale

Man I just can't stand the noise from this card. Playing Dragon Age 2 I feel like I'm on an airplane. I still haven't had the courage to even install Skyrim, I've got the Collector's Edition but all it's collecting is dust.

Is there no alternative to water cooling, like changing the stock fan or something? I tried putting a side-fan blowing right towards the card and all I gained was 2°C on full load! I wonder how that is even possible, since this move made me gain about 10°C on my previous rig...

I'm short on money, but I'll stop eating for a month if I can get this card to shut up. I may go for a 360 radiator to cool the card and the cpu (I have no overclocking aims for now) if there's no other way.


----------



## Image132

Quote:


> Originally Posted by *Cooperdale*
> 
> Man I just can't stand the noise from this card. Playing Dragon Age 2 I feel like I'm on an airplane. I still haven't had the courage to even install Skyrim, I've got the Collector's Edition but all it's collecting is dust.
> Is there no alternative to water cooling, like changing the stock fan or something? I tried putting a side-fan blowing right towards the card and all I gained was 2°C on full load! I wonder how that is even possible, since this move made me gain about 10°C on my previous rig...
> I'm short on money, but I'll stop eating for a month if I can get this card to shut up. I may go for a 360 radiator to cool the card and the cpu (I have no overclocking aims for now) if there's no other way.


You can do what I did. I got an AC unit and ran some ducting pipe from the AC to the card. Cranked it up, my card went down 5°C, and I nearly lost my toes to frostbite.

I've given up and am now going WC.


----------



## Cooperdale

Quote:


> Originally Posted by *Image132*
> 
> You can do what I did. I got a AC unit, put some ducting pipe from the AC to the card. Cranked it up and my card went down 5°C and I nealy lost my toes to frostbite.
> I've given up and am now going WC.


Jesus... The watercooling configuration I've come up with would cost me about €700; I wonder if it's worth it. I also wonder why this card has to hit such high temps in some games. I had a 5870 up to a month ago: Dragon Age 2 fps were about 50% of what I get now at the same settings, but there was no noise in comparison.


----------



## emett

What temp is the card running at cooper?


----------



## Cooperdale

In Dragon Age 2 it goes to about 85-86C; from what I've read in forums and reviews, these are the usual temps at full load. It's the same result I get in Furmark or Unigine Heaven.

In the other games I've tried, noise and temps are no issue, but I'm paving the way for Skyrim and I don't wanna play it with a Wookiee howling inside my case. Unless it's a DA2-specific problem: I didn't think a game could squeeze 100% out of this card with all these ****ty console ports going around.

Luckily, when I work with GPU-enabled software I get almost no noise. The PC is otherwise the quietest I've built on air. I'm short on money, so the watercooling option has to be the last resort; moreover, this is the machine I work on too, and I don't know if I would be comfortable water-cooling it.


----------



## iARDAs

I decided on the following fan curve for my Zotac:

50C --- 40%
60C --- 50%
65C --- 55%
70C --- 65%
75C --- 75%
80C --- 85%
85C --- 95%

With this I never get above 80 degrees,

but I agree that when the fan is around 75% it gets loud.

My MSI Twin Frozr III 570 was much, much quieter under full load, though of course the 590 is more powerful.

I just wish not all 590s were reference cards and that companies modified them like they do the 570 or 580, with dual fans as well.
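For what it's worth, a profile like that is just a step function from temperature to fan duty. A minimal sketch (illustrative helper with an assumed 30% floor below the first step; not tied to any real fan-control API):

```python
# Hypothetical helper mirroring the fan table above: pick the fan
# duty of the highest temperature step that has been reached.

PROFILE = [(50, 40), (60, 50), (65, 55), (70, 65), (75, 75), (80, 85), (85, 95)]

def fan_percent(temp_c, profile=PROFILE, floor_pct=30):
    """Return the fan duty (%) for a core temperature, as a step function."""
    pct = floor_pct  # assumed idle speed below the first step
    for step_temp, step_pct in profile:
        if temp_c >= step_temp:
            pct = step_pct
    return pct
```

So at, say, 72C the curve holds the 70C step's 65% until the card crosses 75C.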


----------



## Cooperdale

Quote:


> Originally Posted by *iARDAs*
> 
> I decided to have the following fan control for my Zotac
> 50C --- 40%
> 60C --- 50%
> 65C --- 55%
> 70C --- 65%
> 75C --- 75%
> 80C --- 85%
> 85C --- 95%
> With this i never get above 80 degrees
> but i agree that when the fan is around 75% its getting loud.
> My MSI Twin Frozr III 570 was much much quiter under full load, though of course 590 is more powerful
> I just wish not all the 590s were reference cards and companies had modified it like how they do to 570 or 580. Wİth dual fans as well.


Yeah, they should have. Anyway, right now I don't care much about temps, since the card is stable: it's the noise accompanying those temps I'm having trouble with... I tried changing the fan profile, but you get lower temps at even higher noise.
I guess I only have 2 options: get used to it or go water.

Honestly, I was an idiot. I didn't do enough research; had I known the HD 6990 has a custom Accelero cooler of stellar quality, it would have been a no-brainer.


----------



## CapitanPelusa

Quote:


> Originally Posted by *Cooperdale*
> 
> In Dragon Age 2 it goes to about 85-86, from what I've read online in forums and reviews these are the usual temps at full load. It's the same result I get in Furmark or Unigine Heaven.
> In the other games I've tried noise and temps are no issue, but I'm paving the way for Skyrim and I don't wanna play it with a Wookiee howling inside my case. Unless it's a DA2 specific problem: I didn't think a game could squeeze 100% out of this card with all these ****ty console port going around.
> Luckily when I work with gpu-enabled software I get almost no noise. The pc in itself is the quietest I've built on air, I'm short on money so the water-cooling option has to be the last resort: moreover this is the machine I work on too, I don't know if I would be comfortable water-cooling it.


Just use MSI Afterburner and force something like 65% fan speed and roll with it. The 590 is designed to work at 85-90C anyway; it'll be quiet at those fan speeds.


----------



## Cooperdale

Quote:


> Originally Posted by *CapitanPelusa*
> 
> Just use MSI afterburner and force like 65% fan speed and just roll with it. 590 is designed to work at 85-90c anyways. itll be quiet at those fan speeds.


Thanks, I'll try that.


----------



## Smo

My EVGA 590 is for sale *here* if anybody's interested.


----------



## MKHunt

Quote:


> Originally Posted by *Smo*
> 
> My EVGA 590 is for sale *here* if anybody's interested.


whaaaa? Why?


----------



## rush2049

He's switching camps..... they must be using something shiny over there.....


----------



## Recipe7

He will be back in due time


----------



## Arizonian

Quote:


> Originally Posted by *Recipe7*
> 
> He will be back in due time


Most likely when Nvidia launches the 680 IF it beats the 7970 in performance.


----------



## Recipe7

It will definitely beat it... by a percentage or two, hehe.


----------



## iARDAs

So I am assuming the 680 will beat the 590 too?


----------



## Recipe7

It's hard to make any claims, but from what I have witnessed, especially in past years, the new models always edge out the competition by an inch. Not always the case, but it has been lately.

From what the 7970s have been showing, their power will be equivalent to a 590 at high overclocks. The 680 should at least match the 590, and at most beat it by an inch (with overclocking).


----------



## iARDAs

So there would not be any need to upgrade my 590 until the 7xx series, if that's the case.

Unless I see a 50% increase from a single card or a dual-GPU card, I will not be interested in upgrading.

The 590 is my beast, really, and I love it.

I just hoped that in BF3 at ultra I would always have FPS higher than 60;

in some maps I can see low 40s (like the Metro map at the beginning) and, to be honest, that disappoints me.

Hopefully future drivers will address these issues.


----------



## Smo

Quote:


> Originally Posted by *MKHunt*
> 
> whaaaa? Why?


So far I've been very impressed with the results from the new AMD 7970s. On top of that, they have 3GB of usable VRAM compared to our 1.5GB. I'm moving my rig away from Surround and would rather have a central 27/30" to game on with two 22"s as slaves.

With this in mind, I'm looking at running two overclocked 7970s in CrossFire, under water.

Quote:


> Originally Posted by *rush2049*
> 
> He's switching camps..... they must be using something shiny over there.....


Initial benchmarks and overclocking results are very impressive, especially considering supported voltage control hasn't reared its head yet.

Quote:


> Originally Posted by *Recipe7*
> 
> He will be back in due time


Possibly, if the GTX 780 is a good performer. I don't care which 'side' I'm on, be it AMD or NVIDIA. All I'm interested in is performance









Quote:


> Originally Posted by *Arizonian*
> 
> Most likely when Nvidia launches the 680 IF it beats the 7970 in performance.


FYI, the GTX 680-series names are the 'M' cards for laptops; the desktop range is 7xx. However, it may well beat the 7970, and if it does, I'll be back!

Quote:


> Originally Posted by *Recipe7*
> 
> It will definitely beat it... by a percentage or two, hehe.


Who knows bud!

Quote:


> Originally Posted by *iARDAs*
> 
> SO i am assuming 680 will beat 590 too?


I doubt the 780 will beat the 590, but I imagine it will be right on its heels.

Quote:


> Originally Posted by *Recipe7*
> 
> It's hard to make any claims, but from what I have witnessed, especially the past years, the new models always win over the competition by an inch. Not always the case, but it has been lately.
> From what the 7970s have been showing, their power will be equivalent to a 590 at high overclocks. The 680 should exceed the 590 at the least, and at the most beat it out by an inch. (with overclocking)


I think it's unlikely that the new single-GPU cards will outperform the current dual-GPU cards at stock; however, from what we've seen of the overclocking potential of AMD's new architecture, they may well beat them! Here's hoping the 780 can take its clocks to some high levels.


----------



## Recipe7

I just noticed I made a mistake... I meant that the 780 (another mistake, haha) will at the least match the power of the 590 and at the most exceed it; this should be true with high overclocking.

Thanks for clarifying SMO. Hope you stick around here on the 590 thread, it's good to have you around.


----------



## Recipe7

Quote:


> Originally Posted by *iARDAs*
> 
> So there would not be any need to upgrade my 590 until the 7 series if thats the case.
> Unless i see a 50% increase at a single card or a dual gpu card i will not be interested in upgrading.
> 590 is my beast really and i love it.
> I just hoped that in BF3 at ultra i should have FPS always higher than 60
> in some maps i can see low 40s (like metro map in the beginnin) and to be honest that dissapoints me.
> Hopefully further drivers will adress these issues.


I'm a casual gamer, since I'm so swamped with life's obstacles. If I had it my way, I'd play my games every day for hours and buy the best of the best hardware.

The 590 will last me at least until the end of its warranty, 2014. I just have to get it out of my head that I have to game at max settings with upped anti-aliasing. I'm hoping that when I call ASUS for an RMA in 2013/2014, they will have no choice but to give me a 7xx, haha.

I, too, am waiting for the drivers that will max out the 590... can always hope for the best.


----------



## iARDAs

Quote:


> Originally Posted by *Recipe7*
> 
> I'm a casual gamer, due to me being so swamped with life's obstacles. If I had it my way, I'd play my games everyday for hours and buy the best of the best hardware.
> The 590 will last me at the least until end of warranty, 2014. I just have to get it out of my head that I don't have to game at the max settings with upped Anti-aliasing
> 
> 
> 
> 
> 
> 
> 
> . I'm hoping that when I call ASUS for an RMA in 2013/2014, they will have no choice but to give me a 7xx, haha.
> I, too, am waiting for the drivers that will max the 590... can always hope for the best.


We are in the same boat

I got married 12 months ago and my gaming time has decreased. Hopefully we'll have a baby in the next 12 months, which will take my gaming time even lower...

I am just lucky that my wife lets me play games 1-2 hours per day at night and a few hours on the weekend.

So a 590 should last me a good while, and at least 2 years with everything maxed... I just can't see a game beating BF3 in terms of graphics in the next 2 years...

In 2013-2014 the Maxwell cards are coming, so I can easily skip Kepler.

And yes, we definitely need better drivers for a few games. The latest beta driver did not mention anything about a Skyrim boost for the 590, though I am sure there is some.


----------



## Recipe7

Quote:


> Originally Posted by *iARDAs*
> 
> We are on the same boat
> I just got married 12 months ago and my gaming time has decreased. Hopefully we can have a baby in the next 12 months which will take my gaming time even lower...
> I am just lucky that wife lets me play games 1*2 hours per day at night and few hours in the weekend.
> 
> 
> 
> 
> 
> 
> 
> 
> So a 590 should last me a good while and at least for 2 years everything maxed... I just cant see a game beating BF3 in terms of graphics in the next 2 years...
> in 2013-2014 the Maxwell cards are coming. So i can easily skip Kepler.
> And yes we definitely need better drivers for few games. The latest beta patch did not mention anything about 590 in terms of SKYRIM boost though i am sure there is some.


We may be sitting next to each other on this boat... haha. I myself got married almost 2 years ago. My playing time comes when she is working, or when she is busy on her laptop.

Looks like we may be getting Maxwell cards together


----------



## iARDAs

Quote:


> Originally Posted by *Recipe7*
> 
> We may be sitting next to each other on this boat... haha. I myself got married almost 2 years ago. My playing time comes when she is working, or when she is busy on her laptop
> 
> 
> 
> 
> 
> 
> 
> .
> Looks like we may be getting Maxwell cards together


I am 99% sure that's what's going to happen.

Although being married rocks and everything, I just wish I could go back a few years, to when I could game 12 hours a day if I wanted to

Also, I just got my wife every season of Desperate Housewives, and she watches 1 episode a day. Which means 40 more minutes of gaming freedom

So Maxwell it is.

Expect a PM from me in late 2013


----------



## Recipe7

Haha, my wife has already finished DH.

Introduce your wife to pinterest.com, she might like it. She can find nice cooking recipes. My wife has found a few already and likes to browse there often.


----------



## Saizer

I'm going to skip Kepler and jump straight into Maxwell.


----------



## Image132

Quote:


> Originally Posted by *Saizer*
> 
> I'm going to skip Kepler and jump straight into Maxwell.


Hear, hear. Just like I'm going to skip Ivy Bridge and go for Haswell. I dislike Intel's tick processors; a waste of money in my opinion. :/


----------



## Arizonian

I'd have to agree with those not upgrading to the new cards if you already have a GTX 590. It's an awesome card and will easily carry you over to Maxwell in 2013. Upgrading would only be an e-peen move.

One thing I would suggest is starting a secret fund where you set aside a little cash every now and again. By the time Maxwell does come out, you'll be ready. One thing I learned from being married a long time is that wives don't understand boys and their toys.

We don't get older; our toys get more expensive. There should be a club support group on OCN for such members. Throw in kids and diapers and upgrading seems bleak; I didn't get a chance during those years myself. I'm talking from experience, gentlemen. Take heed.


----------



## MKHunt

Quote:


> Originally Posted by *Arizonian*
> 
> I'd have to agree with those not upgrading to the new cards if you already have a GTX 590. It's an awesome card and will easily carry you over to Maxwell in 2013. Upgrading would only be an e-peen move.
> One thing I would suggest is starting a secret fund where you set aside a little cash every now and again. By the time Maxwell does come out, you'll be ready. One thing I learned from being married a long time is that wives don't understand boys and their toys.
> We don't get older; our toys get more expensive. There should be a club support group on OCN for such members. Throw in kids and diapers and upgrading seems bleak; I didn't get a chance during those years myself. I'm talking from experience, gentlemen. Take heed.









I need to hear this from time to time to resist the urge to buy whatever NVIDIA's next monster is. If I survived on a 6200 (with TurboCache!) and a 9600M until 2011, there's really no room for me to complain.


----------



## Calado90

Well, my Asus GTX 590 arrived today, so I tried to overclock it a little, but in MSI Afterburner I can't set the voltage higher than 0.938V. My card is the Voltage Tweak version. What can I do?


----------



## MKHunt

Quote:


> Originally Posted by *Calado90*
> 
> Well my Asus Gtx590 arrives today so i tried to overclock it a litle but in the msi afterburner i can´t put the voltage more than 0.938v. My card is the voltage tweak version. What can i do?


Voltage is driver-locked. You can apply the BIOS hack or roll back to 280.19; the 290 drivers lock the voltage down further, to 0.925V.


----------



## Shinobi Jedi

Hey all,

Are there some reviews I'm missing? Because according to the AnandTech review, the 590 is beating the 7970 in most benches...

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/16

I still get frustrated enough dealing with AMD drivers on my Asus G73JH to convince me to stick with Nvidia for a while. Too much work: uninstall the driver, reboot, run Driver Sweeper, reboot, clean out the registry, reboot, install the new driver...

Nvidia? Just hit the custom option and check "clean install".

That right there is keeping me green for now.

I'm also surprised an air-cooled 590 owner is unhappy with the noise. I'm running two in SLI and it's got to be one of the quietest air-cooled systems I've ever owned. Albeit I don't overclock (because I still don't see a need to).

I thought the 6990s were the leaf blowers? Perhaps the custom cooler mentioned negates that?

Like someone else said, unless I see a 50% increase in gaming, I'm sticking with my 590s. Besides, the 7970s still don't have the pimpin' aesthetics that the 590s have.









SMO, if you're going to get a 27" 120hz/3D display, you don't need the 3gb Vram as they're still capped at 1080p. If you want a 27" IPS monitor that does 2560x1600, then that makes sense.

I'll check around the net for other reviews. Because according to AnandTech, the 7970's do not seem to be worth the headache of selling cards, installing new ones, etc. just for gaming. If someone wants to switch to try and push the OC limits on their rig for folding or whatever, then that makes sense.

But for gaming? Until I see some other reviews, it seems that going to the 7970's is a step down rather than a step up.

Happy New Year all!


----------



## MKHunt

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> -snip-
> Like someone else said, unless I see a 50% increase in gaming, I'm sticking with my 590's. Besides, the 7970's still don't have the *Pimpin' aesthetics that the 590's have.*
> 
> 
> 
> 
> 
> 
> 
> 
> SMO, if you're going to get a 27" 120Hz/3D display, you don't need the 3GB of VRAM, as those are still capped at 1080p. If you want a 27" IPS monitor that does 2560x1600, then that makes sense.
> I'll check around the net for other reviews. Because according to AnandTech, the 7970's don't seem worth the headache of selling cards, installing new ones, etc. just for gaming. If someone wants to switch to try and push the OC limits on their rig for folding or whatever, then that makes sense.
> But for gaming? Until I see some other reviews, it seems that going to the 7970's is a step down rather than a step up.
> Happy New Year all!


Seriously. This is aftermarket, but I have yet to see an aftermarket solution for the 7970 that looks anywhere near as good. Since it's a single-GPU card, I don't see full-length blocks in the future either.



Also, AMD cards for folding is silly, but I don't think Smo plans on folding.

If Kepler can saturate a PCIe 2.0 x16 bus, I might consider it. But the 590 only just saturates an x8 bus, so whatever.


----------



## kzinti1

Is there a waterblock that retains the LED-lit "NVIDIA" lettering and the Nvidia logo?

The EK block above is just about the classiest there is, but with the Nvidia logo seemingly floating in mid-air against the black background it would look 10 times better.


----------



## MKHunt

Quote:


> Originally Posted by *kzinti1*
> 
> Is there a waterblock that retains the LED lit "NVIDIA" and the Nvidia logo?
> The EK block above is just about the classiest there is, but with the Nvidia logo seemingly floating in mid-air against the black background it would look 10 times better.


Closest I'm aware of is the EVGA Hydro Copper, which has the EVGA lettering in blue-white at the fitting extrusion.


----------



## kzinti1

I was afraid of that. I wanted the Hydro Copper, but it's about the ugliest waterblock I've ever seen, so I bought my plain Classy instead. I think I'll just do without a block. Mine doesn't run too hot anyway; it's voltage-locked too low to matter much, and I'm too paranoid to go messing with its BIOS. It ain't like I can just go out and buy another if I screw it up.

I wonder if there'll be a GTX 690? Or a GTMax 790? Now there's a new designation for the upcoming Maxwells!


----------



## iARDAs

Guys I have a question

I am playing Divinity 2 (great game, btw), and in 3D Vision both of my GPUs are used and I get a steady 60 FPS, no problem.

When I game in 2D, just one of my cores is being used and the other one sits at 0 or 1% while gaming. Only in cinematics are both cores used equally.

I get 100 FPS easily, so no problem with performance. (It also has a 100 FPS lock.)

So does this mean that the game doesn't require much power, so one of the GPUs is resting?

Or does the game not have an SLI profile?


----------



## Calado90

Guys, I just installed MSI Afterburner on my PC, and I can see that GPU2 of my 590 keeps going up and down. What is the reason for that?


----------



## iARDAs

Quote:


> Originally Posted by *Calado90*
> 
> Guys, I just installed MSI Afterburner on my PC, and I can see that GPU2 of my 590 keeps going up and down. What is the reason for that?


While gaming only? Or does it do it on the desktop as well?


----------



## Calado90

Quote:


> Originally Posted by *iARDAs*
> 
> While gaming only? Or does it do it on the desktop as well?


On the desktop only, friend. It's at 0% usage and a second later at 5%. I will post a pic.


----------



## Masked

Quote:


> Originally Posted by *Calado90*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iARDAs*
> 
> While gaming only? Or does it do it on the desktop as well?
> 
> 
> 
> On the desktop only, friend. It's at 0% usage and a second later at 5%. I will post a pic.
Click to expand...

It's normal and nothing special...nothing to worry about.

Everyone's primary or secondary core runs these functions during light tasking...


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Guys I have a question
> 
> I am playing Divinity 2 (great game, btw), and in 3D Vision both of my GPUs are used and I get a steady 60 FPS, no problem.
> 
> When I game in 2D, just one of my cores is being used and the other one sits at 0 or 1% while gaming. Only in cinematics are both cores used equally.
> 
> I get 100 FPS easily, so no problem with performance. (It also has a 100 FPS lock.)
> 
> So does this mean that the game doesn't require much power, so one of the GPUs is resting?
> 
> Or does the game not have an SLI profile?


Over half the games on the current market don't have SLI profiles...that's nothing new...the 295 had the same issues.

It also obviously doesn't require that much GPU power, since you're maxing out at only 50%...


----------



## MKHunt

Quote:


> Originally Posted by *Masked*
> 
> Over half the games on the current market don't have SLI profiles...that's nothing new...the 295 had the same issues.
> It also obviously doesn't require that much GPU power, since you're maxing out at only 50%...


I wish MineCraft had an SLI profile...


----------



## Masked

Quote:


> Originally Posted by *MKHunt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Masked*
> 
> Over half the games on the current market don't have SLI profiles...that's nothing new...the 295 had the same issues.
> It also obviously doesn't require that much GPU power, since you're maxing out at only 50%...
> 
> 
> 
> I wish MineCraft had an SLI profile...
Click to expand...

I can teach a few of you how to make them if you want -- not going to do it on my iPad, but when I get home from this business trip I'd be more than happy to.

You still won't see the kind of performance increase you all think you will, because the technology itself just doesn't work that way...an increase, though, yes.


----------



## MKHunt

Hahahahaha, I was being silly. Not sure if SLI in MC is even possible since it's Java.


----------



## Masked

Quote:


> Originally Posted by *MKHunt*
> 
> hahahahaha I was being silly. Not sure if SLI in MC is even possible since its java.


I was speaking in general...anyone who's interested can PM me...it will have to be after the 16th.


----------



## Shinobi Jedi

Going to CES, Masked? If so, have a blast! Excited to see any new products coming from AW. I just got back from a road trip and I'm still impressed with how my little M11x-R2 performs. Getting the itch to upgrade it, though. Here's hoping there will soon be an announcement of an R4.









Though I know you can't comment on such matters


----------



## GDP

Do I have to post a picture of the actual card, or can I do GPU-Z + my name in Notepad? 'Cause I'm lazy. Plus I'm sure most would rage at the sight of an Alienware PC, lol!

Here is the validation: http://www.techpowerup.com/gpuz/czc27/


----------



## Ubeermench

To be added please include in your post:
*Some PROOF that you own the card (photo, screenshot etc. preferably with your OCN name)
*Please include the BRAND of your card
*Please include the CLOCKS you are running your card at. With proof please.


----------



## GDP

So it has to be a photo? I don't have a branded card; it's a Dell/Nvidia card.


----------



## kzinti1

Quote:


> Originally Posted by *Ubeermench*
> 
> To be added please include in your post:
> *Some PROOF that you own the card (photo, screenshot etc. preferably with your OCN name)
> *Please include the BRAND of your card
> *Please include the CLOCKS you are running your card at. With proof please.


What kind of screenshot are you referring to? I posted a GPU-Z validation with my name and OCN on it (BTW, kzinti1-OCN is my GPU-Z and CPU-Z name, among other sites, too).

I have no use for a camera. The only one I ever owned was a Polaroid Swinger somebody gave me when I was about 12. That would make it ca.1968. I used one pack of film and now I don't even know, or really care, where the camera is. I've moved since then and it could easily have been trashed.

So, I'll repost the validation: http://www.techpowerup.com/gpuz/6mn5w/

I can tell you that it's an EVGA GTX 590 Classified. Most likely Revision 2, as shown in this link: http://vr-zone.com/articles/revised-geforce-gtx-590-cards-in-june/12222.html

And that it's running at the same clocks listed in the GPU-Z validation.

I guess I could post a 3DMark 11 validation, http://3dmark.com/3dm11/2097030

The only other "proof" is my receipt from Amazon. Don't hold your breath waiting on me to scan and post that one.

I guess I could cut the label off the box and scan that into my computer and post that, as well, but I really don't care to mutilate the box I may one day have to use for a warranty repair.

BTW: the link proving that there are actually 2 revisions of the GTX 590 (http://vr-zone.com/articles/revised-geforce-gtx-590-cards-in-june/12222.html) would be very good to post in Alatar's OP. It would prove to everybody who swears otherwise that there really are 2 revisions of this card, and that everybody thinking about an aftermarket watercooling block should take this into consideration before ordering something that they either can't use or possibly can't even return for a refund.

I also just pre-ordered another of these cards from Amazon. Whether they actually get them back in stock is anybody's guess. JIC they do, I have my order in. Do you?


----------



## grifers

Hi!! Please, can you run the Metro 2033 benchmark with these settings?:

- 1080p, DirectX 11, Very High, DOF and tessellation enabled, MSAA 4X enabled, and PhysX disabled

Stock clocks on the GTX 590, and one GTX 590 only, please









Thanks a lot, and sorry for my language!!

P.S. - Show results (with GPU-Z) as an image capture like this: http://www.overclock.net/t/817064/metro-2033-benchmark-thread-using-official-tool/990#post_16101415


----------



## Ubeermench

Quote:


> Originally Posted by *GDP*
> 
> So it has to be a photo? I don't have a branded card; it's a Dell/Nvidia card.


Just take a screenshot with GPU-Z open and Notepad with your name.

Something like this, but with your OCN name included in the screenshot.


I'll take a better screenshot example when I get home.


----------



## Arizonian

Quote:


> Originally Posted by *Ubeermench*
> 
> Quote:
> 
> 
> 
> Originally Posted by *GDP*
> 
> So it has to be a photo? I don't have a branded card; it's a Dell/Nvidia card.
> 
> 
> 
> Just take a screenshot with GPU-Z open and Notepad with your name.
> 
> Something like this, but with your OCN name included in the screenshot.
> 
> 
> I'll take a better screenshot example when I get home.
Click to expand...

A Windows 7 'sticky note' would work great in that example screenshot.


----------



## GDP




----------



## grifers

Quote:


> Originally Posted by *grifers*
> 
> Hi!! Please, can you run the Metro 2033 benchmark with these settings?:
> 
> - 1080p, DirectX 11, Very High, DOF and tessellation enabled, MSAA 4X enabled, and PhysX disabled
> 
> Stock clocks on the GTX 590, and one GTX 590 only, please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks a lot, and sorry for my language!!
> 
> P.S. - Show results (with GPU-Z) as an image capture like this: http://www.overclock.net/t/817064/metro-2033-benchmark-thread-using-official-tool/990#post_16101415


Can anyone help me?

Bye!


----------



## Ubeermench




----------



## Recipe7

Quote:


> Originally Posted by *grifers*
> 
> Quote:
> 
> 
> 
> Originally Posted by *grifers*
> 
> Hi!! Please, can you run the Metro 2033 benchmark with these settings?:
> 
> - 1080p, DirectX 11, Very High, DOF and tessellation enabled, MSAA 4X enabled, and PhysX disabled
> 
> Stock clocks on the GTX 590, and one GTX 590 only, please
> 
> Thanks a lot, and sorry for my language!!
> 
> P.S. - Show results (with GPU-Z) as an image capture like this: http://www.overclock.net/t/817064/metro-2033-benchmark-thread-using-official-tool/990#post_16101415
> 
> Can anyone help me?
> 
> Bye!
Click to expand...

When I get the time, I will help you. I will post the benches I have; I have to locate them first.


----------



## H4rd5tyl3

Forgot to put my name down here. My ASUS GTX 590 is running @ stock and has been running fine with a sexy nickel/plexi EK block since install. Quick question, though: I bought it from someone who bought it brand new within a few days of its release, and it was kept in the shrink wrap and whatnot since, so it was new, unused, etc. But I'm guessing it still has the old BIOS they first shipped with. Would I be able to flash it to the newer BIOS, the one that addressed the nuclear VRM explosion issue, without voiding the warranty?


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> Going to CES, Masked? If so, have a blast! Excited to see any new products coming from AW. I just got back from a road trip and still get impressed with how my little M11x-R2 performs. Getting the itch to upgrade it though. Here's to hoping there will soon be an announcement on an R4
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Though I know you can't comment on such matters


I cannot confirm nor deny my presence at CES.









The R2's my favorite, actually...not so much into the 17s as I used to be...now it's all about size and performance...the smaller the better.


----------



## grifers

Quote:


> Originally Posted by *GDP*
> 
> Do I have to post a picture of the actual card, or can I do GPU-Z + my name in Notepad? 'Cause I'm lazy. Plus I'm sure most would rage at the sight of an Alienware PC, lol!
> 
> Here is the validation: http://www.techpowerup.com/gpuz/czc27/


Ok, thanks!!!

Bye!


----------



## VegasGT

Today's my birthday, and what a great birthday it is. Here is my new Asus 590; I hope this works for proof. Add me to the list of owners! Thanks! Time to hook it up and game all day!!!


----------



## Arizonian

Quote:


> Originally Posted by *VegasGT*
> 
> Today's my birthday, and what a great birthday it is. Here is my new Asus 590; I hope this works for proof. Add me to the list of owners! Thanks! Time to hook it up and game all day!!!


Got to love it. Happy GTX 590 birthday







Also welcome to OCN bud.


----------



## johnz

Quote:


> Originally Posted by *VegasGT*
> 
> Today's my birthday, and what a great birthday it is. Here is my new Asus 590; I hope this works for proof. Add me to the list of owners! Thanks! Time to hook it up and game all day!!!


Happy Birthday VegasGT! Have fun gaming with that 590


----------



## GDP

Quote:


> Originally Posted by *grifers*
> 
> Ok, thanks!!!
> Bye!


Really? Stay classy buddy.


----------



## kzinti1

Amazon just sent me an estimated arrival date for another EVGA GTX 590 Classified: somewhere between March 8th and April 4th.

If any of you want another, or your 1st, I'd advise you to pre-order now!


----------



## grifers

Quote:


> Originally Posted by *GDP*
> 
> Really? Stay classy buddy.


Don't understand you, XD.


----------



## MKHunt

IT KNOWS



'bout my new folding clock.


----------



## iARDAs

Hey folks

I had 2 issues playing BF3:

1) Yesterday I had a DirectX runtime error while playing BF3.

2) Today I had my first BSOD ever since I started gaming on PC after the PS3.

The BSOD said something like hardware failure, but my hardware works fine in every other game.

Any suggestions?


----------



## iARDAs

Quote:


> Originally Posted by *iARDAs*
> 
> Hey folks
> I had 2 issues playing BF3:
> 1) Yesterday I had a DirectX runtime error while playing BF3.
> 2) Today I had my first BSOD ever since I started gaming on PC after the PS3.
> The BSOD said something like hardware failure, but my hardware works fine in every other game.
> Any suggestions?


EDIT: Could this be related to the voltage of my CPU overclock?


----------



## Platinum Rook

here we go!

http://www.techpowerup.com/gpuz/menfc/

Incorrect pixel fillrate with GPU-Z 5.7; I know some other people are having the same problem as well.

Zotac version


----------



## bruflot

Quote:


> Originally Posted by *iARDAs*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iARDAs*
> 
> Hey folks
> I had 2 issues playing BF3:
> 1) Yesterday I had a DirectX runtime error while playing BF3.
> 2) Today I had my first BSOD ever since I started gaming on PC after the PS3.
> The BSOD said something like hardware failure, but my hardware works fine in every other game.
> Any suggestions?
> 
> 
> 
> EDIT : Could this be related with the voltage of my CPU overclock?
Click to expand...

Definitely. You need to adjust the voltage to make it stable.

Sent from my GT-I9100 using Tapatalk


----------



## GDP

Sorry, go ahead and take me out of the club; I no longer have my GTX 590, traded it for a 580.


----------



## kzinti1

Amazon just contacted me again with an order update. It's now supposed to be here Thursday, Jan. 12th, not between March 8th and April 4th as they originally said.

If you want one you'd best hurry.

I do hope these run okay in SLI. Anything special I need to know about doing this?


----------



## Calado90

Hi guys, it seems like the card I just bought (second-hand, but never used) is one of the first 590s to come out (the card is about 6 months old). I was surfing the web when I saw that the first 590s blew up. Should I do something to prevent that? I'm using the 290.53 drivers and I haven't put another BIOS on the 590. I'm at stock voltage and core speed.

Thanks, Calado90


----------



## iARDAs

I believe those were issues with the drivers, and the recent drivers should never make your card blow up since you are at stock everything.


----------



## Image132

Quote:


> Originally Posted by *iARDAs*
> 
> I believe those were issues with the drivers, and the recent drivers should never make your card blow up since you are at stock everything.


Right, if you don't overclock. But if you overclock, you know the risks you take. Under stock use it will not blow up. Just make sure you are ready for the temps this card gets to normally. Mine goes to 68-80 degrees C while playing BF3 on Ultra.


----------



## toX0rz

Has anyone else encountered a weird problem where the fan randomly spins up to 100% at idle?


----------



## iARDAs

Quote:


> Originally Posted by *toX0rz*
> 
> Has anyone else encountered a weird problem where the fan randomly spins up to 100% at idle?


Yep, happens to me on the desktop.

Full fan for a second or two, then back to normal.


----------



## iARDAs

Quote:


> Originally Posted by *Image132*
> 
> Right, if you don't overclock. But if you overclock, you know the risks you take. Under stock use it will not blow up. Just make sure you are ready for the temps this card gets to normally. Mine goes to 68-80 degrees C while playing BF3 on Ultra.


I am not thinking of OCing this beast.

I run my games at 1080p and everything is peachy.

When I feel like OCing, I will get a card that is stronger than the 590, which will probably come out more than a year from now.

But I agree that the 590 could be the most dangerous card for OCing.


----------



## GDP

Should I be worried @ 92C?

....also, I traded back with my friend, so I have the 590 again.


----------



## Galactipuss

I have 2 of these baddies in quad SLI, waiting to get wet with EK acrylic electroless-nickel blocks. This is still under construction; I'll post results and opinions later.

Full build found here: http://www.overclock.net/t/1195617/project-fear-factory-liquid-inside


----------



## MKHunt

Quote:


> Originally Posted by *Galactipuss*
> 
> I have 2 of these baddies in quad SLI, waiting to get wet with EK acrylic electroless-nickel blocks. This is still under construction; I'll post results and opinions later.
> Full build found here: http://www.overclock.net/t/1195617/project-fear-factory-liquid-inside
> -snip-


You got the backplates too, right? The backplate looks amazing on the EK. Plus the EK backplate cools better than the EVGA backplate. And the EVGA plate won't fit with the EK block screws.

As for OCing the card, I don't see it as dangerous as long as you leave the volts alone. Without altering volts it seems unreasonably safe for a card that targets the enthusiast demographic.


----------



## Galactipuss

Yeah, I got the backplates. I'm still waiting on parts to arrive, like the SLI bridge.


----------



## toX0rz

Quote:


> Originally Posted by *iARDAs*
> 
> yep
> Happens to me in desktop
> full fan for a second or two than back to normal.


When it happens to me, it won't stop until I either reboot or start MSI Afterburner, which automatically throttles it thanks to my customized fan profile.
This happens only when watching live streams, though (YouTube and other stuff).


----------



## iARDAs

Quote:


> Originally Posted by *toX0rz*
> 
> When it happens to me, it won't stop until I either reboot or start MSI Afterburner, which automatically throttles it thanks to my customized fan profile.
> This happens only when watching live streams, though (YouTube and other stuff).


Interesting.

What temperature is it at when you're having this full-fan issue?


----------



## Wogga

OCing a card like the 590, known as a bad OCing card, could be more interesting than OCing something like a 580.

BTW, I've seen a nickel/plexi EK waterblock on a 6990 (looks the same as the 590's) with a nickel backplate - looks awesome. Nearly as awesome as the black backplate.


----------



## Pmerkert

Hello everybody,

I have the same issue from time to time. My fan goes up to 100% no matter what the temp. To me it seems like the card gets bored when I watch a vid, etc.








A customized fan profile won't help; I tried it with Afterburner and nTune, nothing worked.
But when I play a game and my 590 has a lot to do, it won't happen... a miracle.

PS: Sorry if you don't understand me, I am from Germany ;-)

Cheers

Pmerkert


----------



## iARDAs

Those of you having the fan issue, which drivers are you using?


----------



## Image132

Are you sure it's an issue and not some sort of maintenance the card performs periodically? It happens to me, but not often, and only for a second or two, and then it spins down again.

I'm using Nvidia 285.62. I can't use the newer beta drivers because they don't support DisplayPort for some reason. :/


----------



## Pmerkert

I use the 285.62 WHQL too. Before that I used the 290.xx beta, but it was no help in that case.
No, it is not really an issue, but it wakes me up every time I'm sleeping... that sucks.

What about a firmware update from Gainward? Has anyone tested it?
I think my firmware is from May.


----------



## Wogga

It's a known issue. There is a thread about it somewhere on the Nvidia forum; they stated that it is OK.


----------



## iARDAs

One of the cores in my 590 runs 7-8 degrees hotter than the other one.

Is that normal?

They are always below 80, though.

One is 78 and the other is 70.

And it's always the same core that gets hotter.


----------



## MKHunt

Quote:


> Originally Posted by *iARDAs*
> 
> One of the cores in my 590 runs 7-8 degrees hotter than the other one.
> Is that normal?
> They are always below 80, though.
> One is 78 and the other is 70.
> And it's always the same core that gets hotter.


100% normal


----------



## iARDAs

Ah, great to hear, thanks.

I believe this is due to the fact that it's SLI, right?

Never had an SLI rig before; that's why I am asking.


----------



## MKHunt

Quote:


> Originally Posted by *iARDAs*
> 
> Ah, great to hear, thanks.
> I believe this is due to the fact that it's SLI, right?
> Never had an SLI rig before; that's why I am asking.


Differences in silicon and cooling solutions. One of the cores doesn't have full contact with the stock heatsink due to the 8-pin connectors. Similarly, under water that core is usually cooled after the water has passed through the other core's fins.


----------



## rush2049

Just a short update; I currently have my 590 running at:

700MHz core (1400 shader), memory still at stock... all with a core voltage of 963mV.

I am still on air, only hitting 70C and 78C at 100% load for 3+ hours of Heaven.

It is Heaven- and BF3-stable... good enough for me.


----------



## squishysquishy

Hello everyone. I am currently tinkering with my new build. I have 2 590's for SLI in my mATX case. Naturally, I only have one installed due to outrageously high temps (85+C in game). I have all the necessary components for liquid cooling except for a pump.

Do you all think that either of the following pumps will push enough fluid to cool 2 590's?
Danger Den DD-CPX1
Phobya DC12-400

I have a huge space issue in this case, because I want stock appearance. I only have (LxWxH): 3in x 2in x 3.5in for a pump.

What's in my box:
1x 180mm rad.
2x 180mm Phobya fans (push-rad-pull).
Feser tubing.
XSPC single-bay reservoir.
Distilled water + kill coil.
Koolance VID-NX590.
And the necessary number of barbs and misc fittings.

Only the GPUs are in this loop. Temps preferably under 70C at load?

Please help; I would love to play some BF3 in Surround...


----------



## MKHunt

180mm rad will not be enough to cool two 590s. Usually a 2x120mm rad is recommended PER 590. Keeping a stock appearance while cooling two 590s in a mATX case will be difficult to say the least.

Personally, I would avoid both of those pumps. I'd stick to the tried and true MCP-655 Vario or an MCP-355 or MCP-35X. I use the MCP-35X and it's one hell of a pump.


----------



## squishysquishy

Quote:


> Originally Posted by *MKHunt*
> 
> 180mm rad will not be enough to cool two 590s. Usually a 2x120mm rad is recommended PER 590. Keeping a stock appearance while cooling two 590s in a mATX case will be difficult to say the least.
> Personally, I would avoid both of those pumps. I'd stick to the tried and true MCP-655 Vario or an MCP-355 or MCP-35X. I use the MCP-35X and it's one hell of a pump.


I will try to use the MCP-35X; that would be a really tight fit...hopefully I can make it fit without too much case modding.

Do you think a single 180mm rad would be sufficient at stock clocks? I might be able to cram 2x 180mm in there...one for each.


----------



## MKHunt

Quote:


> Originally Posted by *Crackheadkid*
> 
> I will try to use the MCP-35X; that would be a really tight fit...hopefully I can make it fit without too much case modding.
> Do you think a single 180mm rad would be sufficient at stock clocks? I might be able to cram 2x 180mm in there...one for each.


One for each would be better; for two on one rad I wouldn't chance it. A 2x120mm rad is 288 square centimeters of frontal area and a 180mm is 324 square centimeters. The 180mm fan also most likely has lower static pressure and will push less air through.

At stock volts and clocks with a top end of 70C, you can probably get away with 2x 180mm rads as long as they both get cool air. The 590 is definitely not a card you want running any hotter than it already does on air.
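For anyone who wants to double-check those frontal-area numbers, here's a quick sketch. It assumes the nominal fan-size rectangles (a 2x120mm rad as 120x240mm, a 180mm rad as 180x180mm) and ignores the actual fin/core area, which varies by rad:

```python
# Rough frontal-area comparison of radiator sizes (nominal fan footprints only).

def rad_area_cm2(width_mm: float, height_mm: float) -> float:
    """Frontal area of a radiator in square centimeters (1 cm^2 = 100 mm^2)."""
    return (width_mm * height_mm) / 100.0

dual_120 = rad_area_cm2(120, 240)    # 2x120mm rad
single_180 = rad_area_cm2(180, 180)  # 1x180mm rad

print(f"2x120mm rad: {dual_120:.0f} cm^2")    # 288 cm^2
print(f"1x180mm rad: {single_180:.0f} cm^2")  # 324 cm^2
```

So on paper a single 180 actually has a bit more area than a 2x120; the catch, as noted above, is the lower static pressure of typical 180mm fans.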


----------



## kzinti1

There's something completely screwy going on with the image upload. I can't upload from my computer through OCN, and OCN can't accept the link from ImageShack.

I guess I'm not meant to show that I actually own a pair of EVGA GTX590's, so to hell with it.

Check out these scores, whether you believe me or not: three 3DMark 11 runs and one 3DMark Vantage run.

http://3dmark.com/3dm11/2531115 P10000-Old Card

http://3dmark.com/3dm11/2531329 P10011-New Card

http://3dmark.com/3dm11/2531471 P15938-Quad SLi.

http://3dmark.com/3dmv/3811772 P47971- 3DMark Vantage

Oh. Friday the 13th. I get it now.


----------



## kzinti1

I'll be damned, the pic finally shows. I think this is everything that was asked for.


----------



## Swolern

Hey what's up guys. So I'm looking into a dedicated PhysX card for my EVGA 590 / i5 2500k / 1050w psu. I'm trying to play Batman Arkham City. With all settings & tessellation maxed and PhysX off I get 60-80 FPS but when I turn PhysX on high I get 25-50 FPS.

-Has anyone seen any benchmarks for a 590 & dedicated PhysX card?
-Anyone have a dedicated PhysX card setup with the 590?

I have been searching everywhere and can't find any benchmarks, and I don't know which card I should go with that won't be overkill. Batman AC states that for PhysX on high, a dedicated PhysX card needs to be a GTX 460 or higher. Thanks.

BTW, Batman AC's PhysX is the best I've seen, and it's definitely at least 3x as complex as Batman AA's PhysX. It's gorgeous!


----------



## Image132

Quote:


> Originally Posted by *Swolern*
> 
> Hey what's up guys. So I'm looking into a dedicated PhysX card for my EVGA 590 / i5 2500k / 1050w psu. I'm trying to play Batman Arkham City. With all settings & tessellation maxed and PhysX off I get 60-80 FPS but when I turn PhysX on high I get 25-50 FPS.
> -Has anyone seen any benchmarks for a 590 & dedicated PhysX card?
> -Anyone have a dedicated PhysX card setup with the 590?
> I have been searching everywhere and can't find any benchmarks and don't know which card I should go with that won't be overkill. Batman AC states for PhysX on high a dedicated PhysX card needs to be a GTX 460 or higher. Thanks.
> BTW, Batman AC's PhysX is the best I've seen, and it's definitely at least 3x as complex as Batman AA's PhysX. It's gorgeous!


This is a known issue if you haven't patched your Batman.

Here you go:

http://www.joystiq.com/2011/11/25/batman-arkham-city-pc-performance-issues-linked-to-dx11-patch/

and

http://www.joystiq.com/2011/12/07/batman-arkham-city-pc-patched-dx-11-fixed-for-some/

Have you patched it?


----------



## Galactipuss

The SLI bridge finally came in. The center screws were short, *** EK. The two blocks on the bridge feel sturdy and heavy; no cards on them at this point. On the top block, the plug closest to the bridge fitting wasn't clearing as-is, like with most 1/2x3/4 fittings, so I had to file off a bit. I'd rather do this than have used a barb fitting.


----------



## Swolern

Quote:


> Originally Posted by *Image132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Swolern*
> 
> Hey what's up guys. So I'm looking into a dedicated PhysX card for my EVGA 590 / i5 2500k / 1050w psu. I'm trying to play Batman Arkham City. With all settings & tessellation maxed and PhysX off I get 60-80 FPS but when I turn PhysX on high I get 25-50 FPS.
> -Has anyone seen any benchmarks for a 590 & dedicated PhysX card?
> -Anyone have a dedicated PhysX card setup with the 590?
> I have been searching everywhere and can't find any benchmarks and don't know which card I should go with that won't be overkill. Batman AC states for PhysX on high a dedicated PhysX card needs to be a GTX 460 or higher. Thanks.
> BTW, Batman AC's PhysX is the best I've seen, and it's definitely at least 3x as complex as Batman AA's PhysX. It's gorgeous!
> 
> 
> 
> This is a known issue if you havent patched your batman.
> 
> Here you go:
> 
> http://www.joystiq.com/2011/11/25/batman-arkham-city-pc-performance-issues-linked-to-dx11-patch/
> 
> and
> 
> http://www.joystiq.com/2011/12/07/batman-arkham-city-pc-patched-dx-11-fixed-for-some/
> 
> Have you patched it?
Click to expand...

Yes, I have patched it. Before the patch I was getting framerates in the single digits, so DX11 is much better now. It's when I turn on PhysX that it bogs down. But I can see why; it's the most complex PhysX engine I've seen.

Has anyone used a dedicated PhysX card with a 590, or seen benchmarks for Batman AC running with a dedicated card? Thanks.


----------



## heyskip

Quote:


> Originally Posted by *Swolern*
> 
> Yes, I have patched it. Before the patch I was getting framerates in the single digits, so DX11 is much better now. It's when I turn on PhysX that it bogs down. But I can see why; it's the most complex PhysX engine I've seen.
> Has anyone used a dedicated PhysX card with a 590, or seen benchmarks for Batman AC running with a dedicated card? Thanks.


I'm running a 560 Ti as dedicated PhysX with my 590. My benchmark results (vsync on, all settings maxed, latest patch) were MIN 18, MAX 60, AVG 43.

Actual gameplay is quite smooth with no severe slowdown yet; I've only played for an hour so far. PhysX card usage peaks at around 60%, so the 560 Ti is a bit of overkill (I had it left over from my old build).

It seems to me that the game still requires a bit of performance patching.


----------



## MKHunt

A bit? I got better framerates on BF3.


----------



## heyskip

And by a bit I mean a lot









It's the reason I haven't played much yet. I couldn't put the original Arkham Asylum down, but for this one I'm going to wait for the next patch before jumping in.


----------



## L1eutenant

I was getting the same issue with Batman, but I was always getting above 30fps with PhysX on, so I was happy. Except for one boss fight, where I was down to 1-5 fps; there were a lot of papers flying around.


----------



## Swolern

Quote:


> Originally Posted by *heyskip*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Swolern*
> 
> Yes, I have patched it. Before the patch I was getting framerates in the single digits, so DX11 is much better now. It's turning on PhysX that bogs me down. But I can see why; it's the most complex PhysX engine I've seen.
> Has anyone used a dedicated PhysX card with a 590, or seen benchmarks for Batman AC running with a dedicated card? Thanks.
> 
> 
> 
> I'm running a 560ti as dedicated PhysX with my 590. My benchmark results (vsync on, all settings max, latest patch) were MIN 18, MAX 60, AVG 43.
> 
> Actual gameplay is quite smooth with no severe slowdown yet, only played for hour so far. PhysX card usage peaks at around 60% so 560ti is a bit of overkill (had it leftover from old build).
> 
> It seems to me that the game still requires a bit of performance patching.
Click to expand...

I'm trying to figure out if a dedicated PhysX card is worth it for Batman AC. How much did your framerate increase with all settings maxed, PhysX on high, and vsync off, with just the 590 compared to the 590 + 560 Ti? Thanks a bunch for your input.


----------



## heyskip

Quote:


> Originally Posted by *Swolern*
> 
> I'm trying to figure out if a dedicated PhysX card is worth it for Batman AC. How much did your framerate increase with all settings maxed, PhysX on high, and vsync off, with just the 590 compared to the 590 + 560 Ti? Thanks a bunch for your input.


I ran some benchmarks and with the game in its current state a dedicated PhysX card doesn't seem worth it (on my system anyway).

560ti dedicated physx - MIN 18 MAX 100 AVG 53

590 one core dedicated physx - MIN 23 MAX 71 AVG 52

590 physx active but no cores dedicated - MIN 19 MAX 107 AVG 52

CPU dedicated physx - MIN 20 MAX 114 AVG 49

If I get curious I may try my 560 Ti in the x8 slot instead of the x4 to see if there's any difference. I'll also retest once the game is patched, because I expected a bigger difference with a dedicated 560 Ti. If it doesn't make a difference in a PhysX-heavy game like AC, then it's probably not worth the extra power usage and increased heat.
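For anyone skimming, the spread in heyskip's numbers is small; a quick sketch tallying the average FPS of each configuration against the CPU-PhysX baseline (figures copied from the post above):

```python
# heyskip's Batman: Arkham City results (MIN, MAX, AVG FPS), from the post above
results = {
    "560 Ti dedicated PhysX":           (18, 100, 53),
    "590, one core dedicated":          (23, 71, 52),
    "590, PhysX on, no core dedicated": (19, 107, 52),
    "CPU PhysX":                        (20, 114, 49),
}

baseline = results["CPU PhysX"][2]  # average FPS with PhysX running on the CPU
for config, (mn, mx, avg) in results.items():
    gain = (avg - baseline) / baseline * 100
    print(f"{config}: avg {avg} FPS ({gain:+.1f}% vs CPU PhysX)")
```

Average FPS varies by well under 10% across the four configurations, which supports the "not worth it" conclusion, at least on this patch of the game.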


----------



## Image132

Quote:


> Originally Posted by *heyskip*
> 
> I ran some benchmarks and with the game in its current state a dedicated PhysX card doesn't seem worth it (on my system anyway).
> 560ti dedicated physx - MIN 18 MAX 100 AVG 53
> 590 one core dedicated physx - MIN 23 MAX 71 AVG 52
> 590 physx active but no cores dedicated - MIN 19 MAX 107 AVG 52
> CPU dedicated physx - MIN 20 MAX 114 AVG 49
> If I get curious I may try my 560 Ti in the x8 slot instead of the x4 to see if there's any difference. I'll also retest once the game is patched, because I expected a bigger difference with a dedicated 560 Ti. If it doesn't make a difference in a PhysX-heavy game like AC, then it's probably not worth the extra power usage and increased heat.


Something that I found very interesting and very relevant to what you want to do:











Kinda makes the speed of PCIe 3.0 useless, at least for graphics cards.


----------



## Swolern

Quote:


> Originally Posted by *heyskip*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Swolern*
> 
> I'm trying to figure out if a dedicated PhysX card is worth it for Batman AC. How much did your framerate increase with all settings maxed, PhysX on high, and vsync off, with just the 590 compared to the 590 + 560 Ti? Thanks a bunch for your input.
> 
> 
> 
> I ran some benchmarks and with the game in its current state a dedicated PhysX card doesn't seem worth it (on my system anyway).
> 
> 560ti dedicated physx - MIN 18 MAX 100 AVG 53
> 
> 590 one core dedicated physx - MIN 23 MAX 71 AVG 52
> 
> 590 physx active but no cores dedicated - MIN 19 MAX 107 AVG 52
> 
> CPU dedicated physx - MIN 20 MAX 114 AVG 49
> 
> If I get curious I may try my 560 Ti in the x8 slot instead of the x4 to see if there's any difference. I'll also retest once the game is patched, because I expected a bigger difference with a dedicated 560 Ti. If it doesn't make a difference in a PhysX-heavy game like AC, then it's probably not worth the extra power usage and increased heat.
Click to expand...

Hmm, it looks as if a dedicated PhysX card does nothing for performance when added to the 590. Thanks for the reply. Rep+1


----------



## Swolern

Quote:


> Originally Posted by *Image132*
> 
> Something that I found very interesting and very relevant to what you want to do:
> 
> 
> 
> 
> 
> 
> 
> 
> Kinda makes the speed of PCIe 3.0 useless, at least for graphics cards.


Interesting, thanks for sharing. I don't think we will fully use PCIe 2.0 bandwidth for many years, until the 700 series cards or higher are released.


----------



## GDP

So I took my 590 apart last night and reapplied thermal paste, IC Diamond 7 to be precise. That stuff is like Play-Doh, lol. Then again, I didn't heat it up like recommended, and ironically it didn't really help with temps at all. I guess at least I didn't mess anything up either, hah. Also, I have to say: who the hell designed the second heatsink contact? lol


----------



## Masked

Quote:


> Originally Posted by *GDP*
> 
> So I took my 590 apart last night and reapplied thermal paste, IC Diamond 7 to be precise. That stuff is like playdough lol. Then again I didnt heat it up like recomended and ironically it didnt help with temps really at all. I guess at least I didnt mess anything up either hah. Also I have to say who the hell designed the second heat sink contact lol.


If you have an Alienware and thus an AW ref 590, it's revision 2 so, it's actually recommended to keep the heat sink pads on...

I believe I even added a page to the 590 orders in your user manual about NOT doing the above...

It really doesn't help and it actually voids your warranty...


----------



## emett

I have no heat issues with my 590, although I run an automatic custom fan profile.
That would have been what I tried long before I started taking the card apart.


----------



## DazzaS

Greetings,

Here is a pic of my GTX 590 on water, no overclocking yet. The case is a Silverstone FT02.










Temps max out at 50C on a HOT day running BF3.










Previously I was running two GTX 480s.


----------



## kzinti1

When I run GPU-Z, which GPU is listed as #2? Since this is always the hottest under load, I'm guessing it's the one furthest from the power connectors at the front of my cards. I figure this because the airflow in the card goes from the front, at the power connectors, mainly to the rear of the card. Is this right?

I run two instances of GPU-Z all the time, usually minimized to the tray, and have been choosing GPUs 2 & 4 to monitor.

I really need to get these cards under water ASAP. There's a huge amount of hot air being pulled out of the top of my case. If I leave this (my sig) computer running 24x7 I don't need to run a heater in my room.

As I'm typing this, the cards are running at 38C on the top card and 34C on the lower one, at about 23C ambient. The hottest I've run them so far is the low 90s. That's running Stone Giant at max settings, with a very mild OC.

Unfortunately, no, there is no overvolting available, even though the option is listed in the EVGA Precision v2.1.1 program. I can move the slider, but it instantly reverts to stock voltage when I release the mouse button. The cards run at 0.8750V at idle and 0.9380V max. I think adding about 0.025V, or even more, would be best for these cards. Under water, of course. I'd like to volt-mod the BIOS but don't know how, and I certainly don't want to brick these expensive cards. They can easily run any game I own at max settings, so I guess I'd best just leave the BIOS alone.


----------



## GDP

Quote:


> Originally Posted by *Masked*
> 
> If you have an Alienware and thus an AW ref 590, it's revision 2 so, it's actually recommended to keep the heat sink pads on...
> I believe I even added a page to the 590 orders in your user manual about NOT doing the above...
> It really doesn't help and it actually voids your warranty...


Meh, whatevs. But to clarify, this particular 590 only had pads on the memory chips and whatnot, which of course I kept on. It said 503b, I believe, on the sticker. So when you say it voids the warranty, you're talking about the graphics card only, I assume? And actually I think I've seen a 2C drop, certainly not worth it though. Also, do you have any tips on the airflow of this case with a 590, given how hot the 590 gets?
Quote:


> Originally Posted by *emett*
> 
> I have no heat issues with my 590. Although i run an automatic custom fan profile.
> That would have been what i tried long before i started taking the card apart.


I do run an automatic fan profile. And I'm happy for you that you have no heat issues.


----------



## Masked

Quote:


> Originally Posted by *GDP*
> 
> Meh whatevs. But to clarify this particular 590 only had pads on the memory chipsets and whatnot, which of course I kept those on. It said 503b I believe on the sticker. So when you say it voids the warranty, youre talking about the gfx card only I assume? And actually I think ive seen a 2c drop, certainly not worth it though. Also do you have any tips on the airflow of this case with a 590 given how hot the 590 gets?
> I do run an automatic fan profile. And im happy for you, that you have no heat issues.


I know...I ordered them that way.

I'm simply clarifying that it's a rev.2 (b) and it voids your warranty to do what you did.

I doubt we'd void the whole PC, as I don't think it's necessary but, your card is definitely void if you send it in that way and have an issue with it.

It depends on which of our cases you have...The Aurora is a different case from the rest, etc...So it really depends on what you ordered...If you let me know, I can better answer that.


----------



## Dennybrig

Guys, I come to you since I think you are the only ones who can help me...
Look, I just bought a GTX 590 from another guy via eBay, but when I connect it to my mobo the light on the side comes on (the GEFORCE logo) and the fan starts too, yet I cannot get an image out of the DVI ports! I already tried connecting it to the other PCI Express slot, but it does not work either.
I also connected it to another computer and it shows video there (I tried it at a computer store near home, on a board that just got to the BIOS screen).
I have a Maximus IV Gene-Z, and when I connect the card and go into the BIOS, the GPU.DIMM Post shows no cards connected.

What do you guys think the problem is? In theory, if the video card outputs video, that's an indication it works, right? Also, I was using a DVI-D to HDMI adapter on the DVI-I port of the GTX 590. I hope you can help me with that!

Many thanks, friends!


----------



## rush2049

Check the cable you were using; switch the one from the working computer to the other. It might be the monitor not 'waking' from sleep mode very quickly....

Don't forget to plug in the two 8-pin power connectors.... in case you didn't...


----------



## Dennybrig

But what about my MOBO not recognizing it?


----------



## firestorm1

hey guys, wondering if any of you can answer a question or two for me. i have an evga 590 classified coming to me in a couple of weeks. i was wondering if nvidia still has the voltage locked on these cards? if so, is there a modded or unlocked bios floating around, or would i be better off just seeing how far i can go on the stock voltage? also, what about fan control? is there manual adjustment for it, or is it stuck on automatic?

thanks.


----------



## Masked

Quote:


> Originally Posted by *Dennybrig*
> 
> But what about my MOBO not recognizing it?


If it's powered but, will not turn on then ultimately, the card may be dead...
Quote:


> Originally Posted by *firestorm1*
> 
> hey guys. wondering if any of you can answer a question or 2 for me. i have an evga 590 classified comming to me in a couple weeks. i was wondering if nvidia till has the voltage locked on these cards? if so, is there a modded or unlocked bios floating around or would i be better off just seeing how far i can go on the stock voltage. also, what about fan control? is there manual adjustment for it or its it stuck on automatc?
> thanks.


Voltage is locked...There are BIOSes floating around but, the 10%-15% gain really isn't worth it, IMHO.


----------



## firestorm1

for that little, no, not really. i know every card OCs differently, but what clocks are people able to get on the stock voltage?


----------



## Masked

Quote:


> Originally Posted by *firestorm1*
> 
> for that little, no not really. i know every card oc's different, but what clocks are people able to get on the stock voltage?


700MHz in some cases...Nothing crazy ~ even that's only about a 5% increase in overall performance.

It's really not worth the effort...Especially considering that for most who have Gigabyte/Asus cards, it instantly voids the warranty.

Personally, I don't care if our (AW) customers do it, and I'm sure EVGA doesn't but, it's still just not worth it.

IMHO, rock it stock.


----------



## firestorm1

ok. well thanks for the info.


----------



## firestorm1

one more question. the card i'm getting is less than 5 months old and i want to replace the stock TIM and pads with something different. i know it doesn't need it, but that's just how i am. i have some prolimatech pk-1 i'm going to use for the gpus, but what about the thermal pads? what thickness should i get?


----------



## rush2049

I am selling my 590 along with my entire desktop. If anyone from the PA area is interested, see this ad on craigslist: http://lancaster.craigslist.org/sys/2807393070.html

And I know the ad is not super technical.... if you have questions, feel free to contact me.


----------



## MKHunt

Totally worth stock-voltage overclocking for folding. I fold rock solid at 730MHz/0.925V. 17.6k PPD.


----------



## GDP

Quote:


> Originally Posted by *Masked*
> 
> I know...I ordered them that way.
> I'm simply clarifying that it's a rev.2 (b) and it voids your warranty to do what you did.
> I doubt we'd void the whole PC, as I don't think it's necessary but, your card is definitely void if you send it in that way and have an issue with it.
> Depends on which of our cases you have...The Aurora is a different case from the rest etc...So, it really depends on what you ordered...If you let me know I can better answer that.


Yep, I'm aware of your affiliation with AW; that's why I asked you the question directly. And trust me, I'm not the type to expect warranty coverage if it breaks any time after modifying it in that manner. Hell, I'll even give you my service tag if you want to add a note to my system, lol. I actually have the Aurora R3. I think it's a decent design for what it is; thought was definitely put into cooling. But I don't think it was ever designed for a card like the 590 (unavoidable with a dual GPU). I already removed the bracket from the card that holds it to the PCI fan (impressed with the forethought on that, in regards to shipping). I also run both the HDD and PCI fans @ 35%.

What are your thoughts on the PCI fan and the 590's back exhaust hitting one another? Do you think the PCI fan could overpower the 590's fan, causing hotter temps?

Lastly, why the hell does the GPU2 heatsink only cover 85-ish% of the GPU? lol.

Overall I'm happy with the R3. I added a second fan for push/pull on the CPU, though. Not that I think it was really needed overall, but I did it to get more static pressure through the rad and to pull more hot air out. Anyway, getting off topic lol...


----------



## Masked

Quote:


> Originally Posted by *GDP*
> 
> Yep Im aware your affiliation with AW, thats why I asked you the question directly. And trust me im not the type to expect warranty if it breaks anytime after modifying it in that manner. Hell I will even give you my tag if you want to add a note to your system lol. I actually have the aurora R3. I think its a decent design for what it is, thought was deffinately put into cooling. But I dont think it was ever designed for a card like the 590 (unavoidable with dual GPU). I already removed the bracket from the card that holds it to the PCI fan (impressed with the forethough on that though in regards to shipping). I also run both HDD and PCI fan @ 35%.
> What are your thoughts on the PCI fan and the 590 back exhaust hitting one another? do you think the PCI fan could overpower the 590 fan causing hotter temps?
> Lastly why the hell does the GPU2 heatsink only cover 85ish % of the GPU? lol.
> Overall im happy with the r3. I added a second fan for push/pull on the CPU though. Not that I think it was really needed overall, but I did it to get more static preassure through the rad and to suck more hot air out. Anyway getting off topic lol ...


When/if you return the PC, just put the thermal pads back on...Not that it's going to happen but, it'll save you a tremendous amount of drama.

Back exhaust is not really a problem since it won't exactly happen...The turbine exhausting outward on the 590 is more of a pressure area than the front fan...I mean, it's not ideal but, it's a 1-2C difference.

The GPU2 question is a good one and I don't have an answer for it.


----------



## MKHunt

The second vapor chamber covers less because it is physically smaller than the other, due to the PCIe power connectors. As you can see here, the baseplate/shroud cannot physically house a vapor chamber of similar size and configuration.

The temps on that GPU are higher not because of contact area, but because the vapor chamber is ever so slightly less capable due to its size. On water you'll notice the same difference, simply because the water hits that core second.

Don't worry too much about contact. The cooler still covers the entire core, just not the entire IHS.


----------



## GDP

Quote:


> Originally Posted by *Masked*
> 
> When/if you return the PC, just put the thermal pads back on...Not that it's going to happen but, it'll save you a tremendous amount of drama.
> Back exhaust is not really a problem since, it won't exactly happen...The turbine spinning outward in the 590 is more of a pressure area than the front fan...I mean it's not ideal but, it's a 1-2c difference.
> The GPU2 question is a good one and I don't have an answer for it.


What thermal pads? If you mean the thermal pads on the memory chips, I never removed those. If you mean on the GPU itself, there were no pads, just paste. Just curious, as I have no intention of returning the PC, but thanks for the info. I still have no idea why mine seems to run hotter than a lot of other 590 owners' cards; same voltage as everyone else, same speed. That's why I repasted it. I thought maybe it was the AW case, but I guess not.


----------



## iARDAs

My Zotac 590 runs at 80 degrees under FULL LOAD.

Is yours higher than that, bro?

I saw a video by Nvidia where the guy had set the fan profile to normal and his stock 590 was running at 85 degrees, and the Nvidia developer was 100% cool with that.


----------



## grifers

Quote:


> Originally Posted by *grifers*
> 
> Hi!! Please, can you run the Metro 2033 benchmark with these settings?:
> - 1080p, DirectX 11, Very High, DOF and Tessellation enabled, MSAA 4x enabled, PhysX disabled
> Stock clocks on the GTX 590, one GTX 590 only, please
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks a lot and sorry my language!!
> P.D - Show results (with GPU-z) image capture like this http://www.overclock.net/t/817064/metro-2033-benchmark-thread-using-official-tool/990#post_16101415


I'm waiting................

Thanks!


----------



## Dennybrig

Guys, I need some help with my build. I have a MicroATX rig and I'm going to add two GTX 590s, watercooled with a Koolance Exos-2 V2 and Heatkiller waterblocks. I'm calculating the number of fittings and the length of tubing I'll need, and a question came to mind.

I am planning on using the Heatkiller GPU-X² / GPU-X³ X Dual Link Bridge Block (link to the item:

http://www.frozencpu.com/products/14821/ex-blc-1035/Heatkiller_GPU-X_GPU-X_X_Dual_Link_Bridge_Block_-_1_Sot_10191.html )

My question is: how do the video card waterblocks attach to the Heatkiller Dual Link Bridge? Do I need to budget two additional fittings to attach each waterblock to the Dual Link, or do the waterblocks attach to it in some way I haven't yet seen?

The thing is, the more I look for a picture showing two GTX 590s connected in SLI with the Heatkiller waterblocks and the Dual Link, the less relevant info I find!

Please help! I'm hesitant to start the build; I'll post how it looks!


----------



## GDP

Quote:


> Originally Posted by *iARDAs*
> 
> my Zotac 590 runs at 80 degrees undero FULL LOAD.
> Is yours higher than that bro?
> i saw a video by Nvidia that the guy had set the fan profile at normal and his stock 590 was running at 85 degrees and the Nvidia developer was 100% cool with that.


I have seen it hit 92 consistently. That was before the new TIM; I don't see above 88 now. I guess it may be that it's all good until 97, but I just can't stand high temps, lol. I am ADD about it.


----------



## iARDAs

Quote:


> Originally Posted by *GDP*
> 
> I have seen it hit 92 consistantly. That is, before the new TIM. I dont see above 88 now. I guess it may be that its all good until 97 but I just cant stand high temps lol. I am ADD about it.


Same here, bro.

I see 80 max, usually around the low 70s though.

However, I wouldn't mind always gaming in the low 60s, though I can't be bothered with water cooling at this moment.


----------



## emett

I noticed over 1,000 3DMark 11 points going from 608MHz to 700MHz. That's a lot more than 5%.
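For what it's worth, the raw clock delta being described here is about 15%, so a 1,000-point 3DMark swing isn't surprising even if in-game gains end up smaller. A quick check of the arithmetic:

```python
# Clocks quoted in the post above: 608 MHz stock (the 590's reference
# core clock) versus a 700 MHz overclock on stock voltage.
stock_mhz, oc_mhz = 608, 700
clock_gain_pct = (oc_mhz - stock_mhz) / stock_mhz * 100
print(f"Core clock increase: {clock_gain_pct:.1f}%")
```

Whether that 15% clock headroom translates into FPS depends on whether the game is GPU-bound, which is the crux of the disagreement in the posts that follow.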


----------



## toX0rz

@Masked

Is it possible to run a 590 along with a 580 in tri-SLI, or why does your signature rig state GTX 590 + GTX 580?


----------



## Masked

Quote:


> Originally Posted by *emett*
> 
> I notice over 1,000 3dmark2011 points going from 608mhz to 700mhz. That's a lot more than 5%.


This is true; however, you don't actually see this gain in anything other than 3DMark...

In the VAST majority of games that are out there, you're already capped...

The only REAL reason to OC the card, or even to worry about it, is benching...Otherwise, it's simply a waste of energy.

The real-world benefit is ~5%, which is folly.
Quote:


> Originally Posted by *toX0rz*
> 
> @Masked
> Is it possible to run a 590 along with a 580 in Tri-SLI or why does your signature rig state GTX 590 + GTX 580?


My cards are not in tri-SLI.

All versions of Windows since XP have allowed the end user to have two different graphics driver profiles enabled on their system.

I use the GTX 590 on my primary monitor, and I use the GTX 580 for PhysX and to power the two outside monitors in my setup.

Since we've been over this many times, I'll mention it again ~ it is NOT possible to tri-SLI the GTX 590...Not even by customizing your drivers, because you lose sync at the BIOS level...

It is possible to use them separately.


----------



## Dennybrig

Guys, do you think the Corsair HX1000 is enough to run two GTX 590s in SLI?


----------



## andyvee

Going by my own usage I'd say no; in my case one GTX 590 was more than it could handle with all the other stuff being powered in my rig. The trouble with the HX1000 is that it's really two 500W PSUs, with a max of 40 amps per side. By the time I put my board/processor usage on one side and all my drives/fans etc. on the other, I didn't have enough amps left. It would work, but under stress the PC would just power down without warning. As mentioned, I only have one 590, and I upgraded to an HX1050 with a single 100-amp rail and all is now fine. If you're running two, you may need more.
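A rough back-of-the-envelope check of the 12V budget supports this. The sketch below assumes NVIDIA's 365W TDP figure for the GTX 590 and the 40A-per-rail split described above; real draw varies with load and overclock, and part of it arrives through the PCIe slot rather than a single rail:

```python
# Rough 12V amperage estimate for GTX 590s on a dual-rail 1000W PSU.
# Assumes the card's 365 W TDP spec; actual draw varies with load and OC.
TDP_WATTS = 365
RAIL_LIMIT_AMPS = 40   # per 12V side of the HX1000, per the post above

amps_per_card = TDP_WATTS / 12
total_amps = 2 * amps_per_card
print(f"~{amps_per_card:.1f} A per card, ~{total_amps:.1f} A for two cards")
print(f"Two cards alone would need {total_amps / RAIL_LIMIT_AMPS:.0%} of one rail")
```

Even before adding the CPU, drives, and fans, two 590s want roughly 60A at 12V, well past what either 40A side can supply on its own, which matches the shutdown-under-stress behavior described.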


----------



## Image132

Quote:


> Originally Posted by *andyvee*
> 
> Going by my own usage i'd say no , in my case 1 gtx 590 was more than it could handle with all the other stuff being powered in my rig . trouble with the HX1000 is its really 2 500w psu's with a max of 40 amps per side . By the time i took my board / proc usage on 1 side and all my drives / fans etc on the other side i did'nt have enough amps left . It would work but under stress the pc would just powerdown without warning . As mentioned i only have 1 590 and so upgraded to an HX1050 with a single 100amp rail and all is now fine , if your running 2 then you may need more .


I currently have 1x 590 in my case with an HX1000 and have NO PROBLEMS at all with my gfx, load or otherwise. Maybe you got a bad one?


----------



## andyvee

Quote:


> Originally Posted by *Image132*
> 
> I currently have 1x 590 in my case with a HX1000 and have NO PROBLEMS at all with my gfx, load or otherwise. Maybe you got a bad one?


Maybe; it was two years old, so maybe not 100%, but when I added up the amps being drawn by my parts I was over the 40-amp-a-side limit. It would work if I had one 8-pin power connector coming from each of 12V1 and 12V2, but that's not recommended and could easily damage the 590. The card I have is also pulling more power, I guess, as it's the POV Ultra, so it has slightly higher clocks (670). End result: I'm working fine with the HX1050, and the HX1000 is working fine in a friend's rig with a single 580, but the OP wants two 590s and I think that's not gonna happen with either of these PSUs.


----------



## Dennybrig

Guys, I come to you because I need your help watercooling my two GTX 590s. I recently got myself a HAF X case in order to have more space inside for the radiator, reservoir, etc., but I am short on budget, so here is the straight question:

What would be the best combination of components to let me water cool my two GTX 590 cards?

I already have the waterblocks for the cards (they are the Heatkiller Hole Edition ones, made out of copper).

I don't care if it is a bay reservoir/pump or an external one; I just need to keep that bank account in check!!!

Please let me know, and if by any means any of you have discount codes for the popular watercooling stores, it would be more than appreciated!

Thanks!!


----------



## w4rp1e

Palit GTX 590
Stock clocks; here's proof:


----------



## squishysquishy

Quote:


> Originally Posted by *Dennybrig*
> 
> Guys, i come to you because i need your help watercooling my 2 GTX 590s. Look i recently got myself a HAF X Case in order to have more space in its interior to place radiartor, reservoir, etc on the case but i am short on budget so here is the straight question:
> What would be the better combination of components that allow me to water cool my two GTX 590 cards?
> I already have the waterblocks for the Cards (are the Heatkiller Hole Edition ones and are made out of copper)
> I dont care if it is a Bay Reservoir Pump or an external one i just need to keep that bank account in check!!!
> Please let me know and if by any means any of you have discount codes for the popular watercooling stores will be more than appreciated!
> Thanks!!


FrozenCPU is where I ordered all of my stuff. I used the code "pcapex" and got 5% off, which covered my shipping. Unfortunately, I had to pay tax on a $600 order :'( since I live in NY. BTW, it was still valid as of 1/28/2012, 8:30pm EST ^__^

I wish I could have found a 10% off coupon...it would have covered tax and shipping. Drats.


----------



## Dennybrig

Thanks for the coupon!


----------



## squishysquishy

Quote:


> Originally Posted by *Dennybrig*
> 
> Thanks for the coupon!


No problem; be sure to shop around. I paid a little more there because they are closer to me, which saved me on shipping.


----------



## squishysquishy

On an unrelated note: what does everyone think is the best thermal paste to use with the waterblocks? I have enough MX-4 from my last build for my two 590s, but I only want to apply it once.

Information please ^___^


----------



## MKHunt

Quote:


> Originally Posted by *Crackheadkid*
> 
> On an unrelated note, What does everyone think is the best thermal paste to use with the water blocks? I have enough MX4 from my last build for my 2 590's. But I only want to apply it once.
> Information please ^___^


MX-4 is good. My personal favorite is IC Diamond.


----------



## Wogga

MX-2 is good too. 38C under load


----------



## squishysquishy

Quote:


> Originally Posted by *Wogga*
> 
> MX-2 is good too. 38C under load


I am assuming that you are talking about your processor...otherwise those are ridiculously low GPU temps.


----------



## Masked

On water, I don't think my 590 has ever even seen 40 on load...Max so far is 34c.


----------



## ProfeZZor X

I need some advice from some of you experts... and fairly quickly.









I'm in the process of building my first water-cooled rig, and my intention is to build a high-end rig to support 3D content (movies, gaming, pictures, etc). I'm also not the type to change my hardware components frequently, so whatever GPU I put in it will be there for a good 5-plus years. I've heard a lot of mixed reviews on the EVGA GTX 590 Hydro Copper, some good and some bad. The consensus seems to be that some hardcore gamers prefer dual 580s over a single 590, mostly due to their overclocking abilities, whereas the 590 has some bugs to work out in the overclocking department. I don't plan on overclocking my 590, and I have no problem buying an additional 590 sometime down the line if it comes to upgrading my rig. But the question of the day is whether I should go with a single 3GB 590 (for now), or go with the fan favorite of dual 3GB 580s and be done with it.

I'm buying one or the other this week, so any help would be greatly appreciated.


----------



## squishysquishy

Quote:


> Originally Posted by *Crackheadkid*
> 
> I am assuming that you are talking about your processor...otherwise those are ridiculously low GPU temps.


Quote:


> Originally Posted by *Masked*
> 
> On water, I don't think my 590 has ever even seen 40 on load...Max so far is 34c.


Wow, I am fairly excited to see how much lower my temps will be then, because right now they are 85-ish C in game with the standard air cooler. Oh, Thursday, you need to get here sooner ^__^


----------



## MKHunt

Quote:


> Originally Posted by *Crackheadkid*
> 
> Wow, I am fairly excited to see how much lower my temps are then. because right now they are 85ish C in game with the standard air cooler. ohh Thursday you need to get here sooner ^__^


In a GPU-only loop you can get ridiculous temps. My loop is shared between a 2600K and the 590. At max load (folding on CPU and GPU, both OCed), with 2x 240mm and 1x 120mm rads, I see 45C with the side panel on and 43C with the side panel off.


----------



## Wogga

Quote:


> Originally Posted by *Crackheadkid*
> 
> I am assuming that you are talking about your processor...otherwise those are ridiculously low GPU temps.


37/38/37/37 as I see it now. GPU temps, really. A single loop with the CPU, mobo MOSFETs, and 4 GPUs. It's just the raw power of a MoRa3 9x120.


----------



## MKHunt

Quote:


> Originally Posted by *Wogga*
> 
> 37/38/37/37 as i see now. gpu temp, really. single loop with cpu, mobo mosfets and 4 gpus. its just a raw power of MoRA3 *9x120*


rofl and suddenly everything makes sense.


----------



## iARDAs

New beta drivers are out

http://www.guru3d.com/news/geforce-forceware-29551-download-available/

*Key Bug Fixes and Enhancements

Fixes instances where the GeForce GTX 590 fan unnecessarily increased to 100%.*

That was a problem that some people, including me (rarely), were suffering from


----------



## toX0rz

Quote:


> Originally Posted by *iARDAs*
> 
> New beta drivers are out
> http://www.guru3d.com/news/geforce-forceware-29551-download-available/
> *Key Bug Fixes and Enhancements
> Fixes instances where the GeForce GTX 590 fan unnecessarily increased to 100%.*
> That was a problem where people including me (rarely) was suffering with


Yep, after almost a year, they finally fixed it.
Good news.


----------



## iARDAs

Quote:


> Originally Posted by *toX0rz*
> 
> yep, after almost a year, they finally fixed it.
> good news.


Yeah, I wonder why something like that took almost a year.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Yeah i wonder why something like that took almost a year.


Look at Battlefield 3 optimization and look at SWTOR's...Same will be true of GW2.

Apparently they've been so busy they no longer have time to roll out drivers...I seem to think differently.

Without optimization the function of your card basically halves and that's what a tremendous amount of us are seeing.

Personally, I feel it's apathy and the fact they want the majority of users to swap to Kepler...Which, they'll now have to, to get the best performance.


----------



## YP5 Toronto

Quote:


> Originally Posted by *Dennybrig*
> 
> Guys do you think te Corsair HX1000 is enough to run 2 gtx 590 in SLI?


Depends on the specs of the other equipment running on the PSU.

Based on what I can see in your sig, it's too borderline for my comfort. This is assuming you are OCing the 2700K.

http://extreme.outervision.com/psucalculatorlite.jsp
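For a rough sanity check on PSU headroom, a back-of-the-envelope estimate like the one that calculator performs can be sketched in Python. The wattage figures below are assumptions (NVIDIA's 365 W rated board power per GTX 590, a guessed ~130 W for an overclocked 2700K, and a flat allowance for the rest of the system), and the ~20% headroom margin is a common builder rule of thumb, not a Corsair spec:

```python
# Rough PSU headroom estimate for 2x GTX 590 + an overclocked i7-2700K.
# Wattage figures are assumptions based on published TDPs, not measured draw.

def estimate_load(components, overhead_watts=75):
    """Sum component board power plus a flat allowance for drives/fans/mobo."""
    return sum(components.values()) + overhead_watts

components = {
    "GTX 590 #1": 365,    # NVIDIA's rated board power for the GTX 590
    "GTX 590 #2": 365,
    "i7-2700K (OC)": 130, # assumed draw for a moderate overclock
}

load = estimate_load(components)
headroom = 1000 - load  # Corsair HX1000 is rated for 1000 W continuous

print(f"Estimated load: {load} W, headroom: {headroom} W")
# Leaves only ~65 W of headroom, well short of the ~20% margin many
# builders like to keep -- hence "borderline".
```

With these placeholder numbers the build lands around 935 W on a 1000 W unit, which is why the answer depends so heavily on the rest of the equipment on the PSU.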


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Look at Battlefield 3 optimization and look at SWTOR's...Same will be true of GW2.
> Apparently they've been so busy they no longer have time to roll out drivers...I seem to think differently.
> Without optimization the function of your card basically halves and that's what a tremendous amount of us are seeing.
> Personally, I feel it's apathy and the fact they want the majority of users to swap to Kepler...Which, they'll now have to, to get the best performance.


Perhaps you are right.

If a Kepler card exceeds the performance of a 590 I might go for it, but otherwise I will wait for the 7 series.

I just hope Nvidia releases better drivers.


----------



## Image132

Quote:


> Originally Posted by *iARDAs*
> 
> Perhaps you are right.
> if a kepler exceeds the performance of a 590 i might go for it
> but normally i will wait for 7 series
> I just hope Nvidia releases better drivers.


Yep, I'm in the same boat as you. Unless Kepler is surprisingly good I'm going to wait for Maxwell.

It just sucks that Nvidia haven't released any new non-beta drivers. Since I use DisplayPort, and none of the beta drivers I've tried support it, I'm stuck with these old drivers


----------



## ProfeZZor X

Looks like my 590 Hydro Copper is finally ordered and being shipped as of today. I don't plan on overclocking any of my components, so I'm not as concerned about that aspect of this card as some experienced users are. My main interest is the 3D Surround feature, which I'll be sure to take full advantage of when I get my displays later this summer. I feel more confident with my purchase of just the one 590 over buying the two 3GB 580 Hydro Coppers that I had the opportunity to buy. At least by the time I get my 3D displays, the price will probably have come down on the 590s, should I decide to get another one.


----------



## kzinti1

Quote:


> Originally Posted by *ProfeZZor X*
> 
> Looks like my 590 hydro copper is finally ordered and being shipped as of today. I don't plan on overclocking any of my components, so I'm not as concerned about that aspect of this card as some experienced users are. My main interest is that it has the 3D surround feature, that I'll be sure I take full advantage of when I get my displays later this summer. I feel more confident with my purchase of just the one 590, over buying two 3MB 580 hydro coppers that I had the oppotunity to buy. At least by the time I get my 3D displays, the price will probably have gone down on the 590's, should I decide to get another one.


Where did you find any EVGA GTX 590 Hydro Coppers? They're not even listed at EVGA anymore, since GTX 590s are no longer being made by anyone.

Since you found even one, you're quite lucky. It took me a couple of months of searching before I could buy my 2nd EVGA GTX 590 Classified.

I never even saw the Hydro Coppers in stock anywhere when I went shopping for my 1st GTX 590.

At this late date, I would've tried to wait a few months for Kepler, but that's your decision and none of my business. I don't like waiting either, but I'd certainly have tried to.


----------



## Masked

Quote:


> Originally Posted by *kzinti1*
> 
> Where did you find any EVGA GTX590 Hydro-Copper's? There not even listed at EVGA anymore since GTX590's are no longer being made, by anyone.
> Since you found even one, you're quite lucky. It took me a couple of months of searching before I could buy my 2nd EVGA GTX590 Classified.
> I never even saw the Hydro-Coppers in stock, anywhere, when I went shopping for my 1st GTX590.
> At this late date, I would've tried to wait a few months for Fermi, but that's your decision and none of my business. I don't like waiting, either, but I'd have certainly tried to.


I'm more than likely selling my last 590 HC which, actually has the overnight ARMA...If anyone is interested, PM me.

That being said, many people have asked me why there is a love/hate relationship with this card and I'll answer that publicly.

I'm very, very, intimate with the 590 because we had an exclusive for the Auroras so, quite frankly, I know much more than I should and/or would like to.

The love/hate stems mostly from Nvidia's stance on the card, itself.

When the card was initially released, it did surpass the 6990 in practically every aspect, un-optimized, because even capped, that's what it's capable of doing.

Some say it's 2x 570's...Which NOW that the 570 has been optimized as much as it has (over 60% performance gain since launch) is true but, it wasn't on release...Far from the truth at release. In fact, on the day of release (I remember this because our UPS guy showed up at 6:15 that morning with our stock and I almost peed myself) it crushed the 570's in SLI and even rivaled the 580's in SLI (was JUST tied)...Surpassed the 6990 in nearly every game/real-world aspect and at the end of the day, was a good buy at $700.

The 590, as Nvidia would have you believe, is the perfect card...There are no issues, no voltage locks, no overclocking problems...The card is absolutely perfect...THIS IS THE PROBLEM.

You as the end user CAN optimize the card (in which case it actually becomes 2x 580's slightly OC'd) but, the amount of work to do so just isn't worth the end result, because you still lack a driver profile. Still, it can be done and, of note, the card CAN be taken far beyond 2 OC'd 570's.

From a tech standpoint, the card is a dream for what it is...It's easily streamlined, the heat is less than 2 580's; it dissipates easily and has relatively no driver issues, aside from the performance drivers...Still no REAL issues.

I, as a tech/admin/whatever, love this card because it's simple, it runs cool and it gets by...Others have issues because it doesn't do what it promised, and I absolutely agree.

So the love is from the card itself and what some of us see it as...The hate stems from Nvidia, themselves and their overall apathy towards a good design.


----------



## MKHunt

Quote:


> Originally Posted by *Masked*
> 
> I'm more than likely selling my last 590 HC which, actually has the overnight ARMA...If anyone is interested, PM me.
> From a tech standpoint, the card is a dream for what it is...It's easily streamlined, the heat is less than 2 580's; it disspates easily and has relatively no driver issues, far be it from the performance drivers...Still no REAL issues.
> I, as a tech/admin/whatever, love this card because it's simple, it runs cool and it gets by...Others, have issue because it doesn't do what it promised and I absolutely agree.
> So the love is from the card itself and what some of us see it as...The hate stems from Nvidia, themselves and their overall apathy towards a good design.


This is pretty much how I feel. Nvidia themselves are so apathetic towards the card and owners of the card. As far as Nvidia is concerned, as long as it _can_ surpass the 6990 they don't need to do anything else.


----------



## ProfeZZor X

Quote:


> Originally Posted by *kzinti1*
> 
> Where did you find any EVGA GTX590 Hydro-Copper's? There not even listed at EVGA anymore since GTX590's are no longer being made, by anyone.
> Since you found even one, you're quite lucky. It took me a couple of months of searching before I could buy my 2nd EVGA GTX590 Classified.
> I never even saw the Hydro-Coppers in stock, anywhere, when I went shopping for my 1st GTX590.
> At this late date, I would've tried to wait a few months for Fermi, but that's your decision and none of my business. I don't like waiting, either, but I'd have certainly tried to.


I tried Newegg and Amazon, but no such luck... There are also a couple of the HCs floating around on eBay right now, some new, others used. The one I bought is brand new.

No offense, but I don't share everyone's jaded feelings on this card. This being my first water-cooled rig, I want the build to have as few cooling or overclocking issues as possible - which means that even though I may not be buying the absolute end-all-be-all of cards, it has a simplicity and reputation about it that make me feel confident in buying it. And with that, rather than upgrade again in another 3 to 5 years chasing the latest and greatest GPU, I'd rather have a card that won't be too grossly out of touch with today's and the distant future's games, 3D and HD media... If push comes to shove, I'll just double up on cards in a few years. And by then, they'll be significantly cheaper.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> I'm more than likely selling my last 590 HC which, actually has the overnight ARMA...If anyone is interested, PM me.
> That being said, many people have asked me why there is a love/hate relationship with this card and I'll answer that publicly.
> I'm very, very, intimate with the 590 because we had an exclusive for the Auroras so, quite frankly, I know much more than I should and/or would like to.
> The love/hate stems mostly from Nvidia's stance on the card, itself.
> When the card was initially released, it did surpass the 6990 in practically every aspect, un-optimized, because even capped, that's what it's capable of doing.
> Some say it's 2x 570's...Which NOW that the 570 has been optimized as much as it has (Over 60% performance gain since launch) is true but, it wasn't on release...Far from the truth at release. In fact, on the day of release (I remember this because our UPS guy showed up at 6:15 that morning with our stock and I almost peed myself) it crushed the 570's in SLI and even rivaled the 580's in SLI (Was JUST tied)...Surpassed the 6990 in nearly every game/real-world aspect and at the end of the day, was a good buy for 700$.
> The 590, as Nvidia would have you believe, is the perfect card...There are no issues, no voltages locks, no overclocking problems...The card is absolutely perfect...THIS IS THE PROBLEM.
> You as the end user CAN optimize the card (In which it actually becomes 2x580's slightly OC'd) but, the amount of work to do so just, isn't worth the end result because you still lack a driver profile. Still, it can be done and this is of note, the card CAN be taken far beyond 2 OC'd 570's.
> From a tech standpoint, the card is a dream for what it is...It's easily streamlined, the heat is less than 2 580's; it disspates easily and has relatively no driver issues, far be it from the performance drivers...Still no REAL issues.
> I, as a tech/admin/whatever, love this card because it's simple, it runs cool and it gets by...Others, have issue because it doesn't do what it promised and I absolutely agree.
> So the love is from the card itself and what some of us see it as...The hate stems from Nvidia, themselves and their overall apathy towards a good design.


That's really interesting Masked, and makes a lot of sense.

I also wonder if the misbegotten press the card got in the beginning, due to the reviewers' ineptitude, possibly scared Nvidia off from developing/optimizing the card the way they do with others?

The unfortunate thing is, while the usual move would be to switch over to AMD, there seem to be just as many prevalent issues on their end as well.

It feels a little like the U.S. political system. It doesn't matter if you go Democrat or Republican, you're fracked either way.

Kinda makes one wish for a 3rd GPU manufacturer to add some competition and get these two off their asses.

Nvidia does seem, going by the media, to be a little too focused on Tegra and leaving their high end customers hanging.

Masked, if you move your 590 Hydro before Kepler comes out, what are you going to use in the meantime?

Other than driver installation hassles, I have no problem with AMD. If their cards merit exploration over Nvidia's offerings, I'm open to them.

Here's to hoping the new beta driver somehow fixes performance in SWTOR. I can't play the game in SLI without bad crashes that require a full restart, and it always crashes to desktop when playing on a single GPU.


----------



## MKHunt

I got jelly of wogga's 2133 + 590 combo... so I got some of my own.

Of course with a nice 590 in the picture too.



Can the thread really have too many 590 pics? I think no.

Oh and sweet sweet low volts


----------



## Wogga

Nice!
BTW mine are stable only at [email protected]~1.69v,
so I have to be jealous now


----------



## toX0rz

Quote:


> Originally Posted by *Masked*
> 
> In fact, on the day of release (I remember this because our UPS guy showed up at 6:15 that morning with our stock and I almost peed myself) it crushed the 570's in SLI and even rivaled the 580's in SLI (Was JUST tied)...Surpassed the 6990 in nearly every game/real-world aspect and at the end of the day, was a good buy for 700$.


How's that?
It's essentially two heavily underclocked 580's, so how would it tie two regular 580s? That just doesn't make sense.
In fact, in every launch review I have seen, the card performed even slightly below two 570's (~5%), except at some insane triple-monitor 5760x1080 resolutions where the 570's would hit their VRAM limit earlier and fall behind by a small margin.

Also, the 6990 has always been faster by a small bit in the majority of games ever since the release of the 590...

Quote:


> From a tech standpoint, the card is a dream for what it is...It's easily streamlined, *the heat is less than 2 580's; it disspates easily* and has relatively no driver issues, far be it from the performance drivers...Still no REAL issues.
> I, as a tech/admin/whatever, love this card because it's simple, *it runs cool* and it gets by...Others, have issue because it doesn't do what it promised and I absolutely agree.
> So the love is from the card itself and what some of us see it as...The hate stems from Nvidia, themselves and their overall apathy towards a good design.


That's true.
It produces way less heat than a comparable SLI system and has an outstanding cooling solution on top of that.
In the guru3d review it pulled 70 watts less than a GTX 570 SLI setup, which is amazing.
I myself reach temperatures of 72°C - 76°C at load with the stock fan using a custom fan profile @ 70%, which is outrageous. Even stock 560 Ti's will have a hard time matching these values. (Yeah, I know that will seem very high to you watercooling gurus, but it's impressive from a stock-cooler perspective.)


----------



## Masked

Quote:


> Originally Posted by *toX0rz*
> 
> Hows that?
> It's essentially two heavily underclocked 580's, how would it rival two regular 580s then?
> In fact, in every launch review I have seen the car performed even slightly below two 570's (~5%) if it wasnt for some insane triple monitor 5760x1080 resolutions where the 570's would hit VRAM limit earlier.
> Is there any evidence of the 590 beating two 570s? I haven seen it. (Im talking stock clocks)
> Thats true.
> It produces way less heat than a respectively comparable SLI system and has an outstanding cooling solution on top of that.
> In the guru3d review it pulled 70Watts less than a GTX 570 SLI setup which is amazing.
> I myself reach temperatures of 72°C - 76°C at load with the stock fan using a custom fan profile @ 70%, it's outragous. Even stock 560 Ti's will have a hard time matching these values.


You forget that for the first 2 weeks post release the card was not only over-clockable but, performed well at 850mhz.

In fact, all mine were clocked at 850-900mhz and HANDILY beat the 6990's and all 570's in the office.

I believe in the first @ 15 pages you'll see my original stats and that of others.

Reviewers within the first 2 weeks took the cards overboard, hence not only the discrepancy but the "false positives".

In fact, post-driver-locking, you'll see Hard's review actually surpasses the 570's in SLI.

As I stated before, ON RELEASE, which means pre-optimization, the 590 STRONGLY beat the 570's...

I don't know what's so complicated to understand here...

The 590 has seen 0 optimization since release, in fact the first update, arguably since the voltage lock, occurred yesterday...

The 570 has seen 6 optimized driver releases, increasing its overall performance by about 60%.

Am I speaking Greek or something?

It's because the 590 is apparently "perfect" that there has not been ANY optimization for the card itself...It's arguably the same as it has been since 2 weeks after release...Not 1 SINGLE performance update.

So of course 570's in SLI NOW handily beat the card...However, on RELEASE when the 570 had no optimization, the 590 was the king.

When unlocking this card and taking it to 900mhz, it actually even beat the 580's in SLI ~ I believe Guru originally did that review but, it was archived a while back -- Anyone who is familiar with this review would agree.

There is a total LACK of support with this card and personally, I feel that's disgusting and the primary problem.

Had we still been able to take this card to 900mhz, we honestly wouldn't be having this conversation because the 590's dominance over even OC'd 570's would be beyond reproach...It still is actually.

As I stated before, if you go above and beyond, unlock the card and take it to 900mhz; write your own driver profile etc...It handily beats 570's OC'd ~ Evidence of this is RaginCain's thread/OC results...

I find it a little ridiculous that whenever this subject comes up, you come at me guns blazing when the facts have been out for over a year...Hell, I bet my 10 year old cousin could point out how optimized the 570 has become since launch compared to the 590.

I genuinely don't understand where the lack of comprehension here is...Shall I publish a paper on the 590 to formally get the above recognized or are we good with simple facts you can google?

I'm not attacking you, I genuinely don't understand or comprehend where the issue is...


----------



## toX0rz

Quote:


> Originally Posted by *Masked*
> 
> You forget that for the first 2 weeks post release the card was not only over-clockable but, performed well at 850mhz.
> In fact, all mine were clocked at 850-900mhz and HANDILY beat the 6990's and all 570's in the office.
> I believe in the first @ 15 pages you'll see my original stats and that of others.
> Reviewers within the first 2 weeks, took the cards overboard and thus, not only the discrepancy but, the "false positives".
> In fact, post-driver-locking, you'll see Hard's review actually surpasses the 570's in SLI.
> As I stated before, ON RELEASE, which means pre-optimization, the 590 STRONGLY beat the 570's...
> I don't know what's so complicated to understand here...
> The 590 has seen 0 optimization since release, in fact the first update, arguably since the voltage lock, occurred yesterday...
> The 570 has seen 6 optimized driver releases increasing it's overall performance by about 60%.
> Am I speaking Greek or something?
> It's because the 590 is apparently "perfect" that there has not been ANY optimization for the card itself...It's arguably the same it has been since 2 weeks after release...Not 1 SINGLE performance update.
> So of course 570's in SLI NOW handily beat the card...However, on RELEASE when the 570 had no optimization, the 590 was the king.
> When unlocking this card and taking it to 900mhz, it actually even beat the 580's in SLI ~ I believe Guru originally did that review but, it was archived a while back -- Anyone who is familiar with this review would agree.
> There is a total LACK of support with this card and personally, I feel that's disgusting and the primary problem.
> Had we still been able to take this card to 900mhz, we honestly wouldn't be having this conversation because the 590's dominance over even OC'd 570's would be beyond reproach...It still is actually.
> As I stated before, if you go above and beyond, unlock the card and take it to 900mhz; write your own driver profile etc...It handily beats 570's OC'd ~ Evidence of this is RaginCain's thread/OC results...
> I find it a little ridiculous that whenever this subject comes up, you come at me guns blazing when the facts have been out for over a year...Hell, I bet my 10 year old cousin could point out how optimized the 570 has become since launch compared to the 590.
> I genuinely don't understand where the lack of comprehension here is...Shall I publish a paper on the 590 to formally get the above recognized or are we good with simple facts you can google?


I did get you, hence why I specifically said *launch review*, as in a review on release date.
I also mentioned that I'm talking stock clocks, not OC comparisons.

And yes, even in the HardOCP review it performed just as well as, or even worse than, GTX 570 SLI.






Obviously OC'd it would look different; then again, I didn't expect it could run 900MHz with the weak power circuitry on this card. Was it a stable overclock that you used 24/7?


----------



## Masked

Quote:


> Originally Posted by *toX0rz*
> 
> I did get you hence why i specifically said *launch review* as in review on release date.
> I also mentioned that im talking stock clocks and not about OC comparisons.
> And yes, even in the HardOCP review it did perform just as good and even worse than GTX 570 SLI.
> 
> 
> 
> 
> 
> Obviously OC'd it would look different, then again i didnt expect it could run 900MHz with the weak power circuit on this card. Was it a stable overclock that you used 24/7 ?


At 1.0v the card itself is actually very stable.

Again, I didn't mean to insult you in my previous post, I just don't understand where the disconnect is/was.

When Hard did their review, the card had already been driver-locked from a manufacturer standpoint, thus why I cited Guru...Guru OC'd theirs and I believe it even won editor's choice? ~ I have so much going on right now, that fact is very fuzzy...But I do remember the release reviews being better than Hard's.

The issue is that there is a plethora of just plain bad reviews...Reviews that didn't take the original specs into consideration and/or blew up the card...Most of them, you'll find, blew up the card.

The 590, itself, performs WELL at 1.0v; very well, in fact...And I know Cain had his above 800mhz, mine ran spectacularly at 900mhz...

Hell, they actually RELEASED at a stock 700MHz...It was only after the "disaster" of Sweclockers and others that an IMMEDIATE (I mean overnight) driver was put out that not only locked the card, but they actually changed the boxes on the following runs to reflect the new clock speeds.

This is why EVGA's boxes feature no actual literature.

The problem, as I stated, in Greek -- is that the 570, even by that time, had had 2 MAJOR optimizations...Those reviews were done just after the 570 received its 2nd update.

You have to remember that the 570 was released in December and the 590, March. (Almost April)

That discrepancy is what gives the 570 the edge -- if you kick the 570 back to release, the 590 does easily defeat it...Again, just semantics.

This is bottom line semantics...Clock for clock, CURRENTLY, they're tied. That I will absolutely agree with...Right now, they're tied.

If the 590 had seen the same attention the 570 did, the 590, just based on historic Nvidia optimization, would be about 40-50% ahead of what it currently is...Which you yourself can actually accomplish...Only, it's an incredible amount of work.

I write our mobile drivers on occasion (You'll find several custom AW drivers out there) and it's an absolute nightmare to address the BSOD and base issues but, again, it is possible.

I believe the issue lies with the card being a limited edition...It's perfect and there just aren't enough of us to justify the manpower to actually address our issues.

Like I said, love/hate ~~ We love it because of how simple integration is and how few problems there are...We hate it because we know what it can be and Nvidia will never take it there.

Edit:

The VRMs aren't actually weak at all...They're the highest-quality VRMs that a board company has ever used...The issue lies in the delivery, and only the delivery.

The caps/lanes could not handle the load and thus they "blow out" at nearly the same location every time...the lowest 2nd-core cap and/or the physical casing itself; because the load is in such excess, it actually back-feeds.


----------



## iARDAs

Could someone recommend me a cooler for my 590?

Not water cooling, as it's very tough to find around here, and very expensive.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Could someone recommend me a cooler for my 590?
> not water cooling as its very tough to find around here. and very expensive.


There are no real aftermarket air coolers for the 590.

I'm sure there are 1 or 2, but the only real coolers for the card were water-based.


----------



## toX0rz

Quote:


> Originally Posted by *Masked*
> 
> If the 590 had seen the same attention the 570 did, the 590, just based on historic Nvidia optimization, would be about 40-50% of what it currently is...*Which, you, yourself can actually accomplish...Only, it's an incredible amount of work*.
> I write our mobile drivers on occasion (You'll find several custom AW drivers out there) and it's an absolute nightmare to address the BSOD and base issues but, again, it is possible.
> I believe the issue lies with the card being a limited edition...It's perfect and there just aren't enough of us to justify the manpower to actually address our issues.
> Like I said, love/hate ~~ We love it because of how simple integration is and how little problems there are...We hate it because we know what it can be and Nvidia will never take it there.
> Edit:
> The VRM's aren't actually weak at all...They're the highest quality VRM's that a board company has ever used...The issue lies in the delivery and only the delivery.
> The caps/lanes could not handle the load and thus they "blow out" at nearly the same location, every time...The lowest-2nd core based cap and/or the physical casing itself, because the load is in such excess, it actually back-feeds.


How exactly?
By re-writing own drivers or did I get you wrong?


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> There are no real aftermarket air coolers for the 590.
> I'm sure there are 1 or 2 but, the only real coolers for the card were water-based.


Thanx for the info...

I just thought maybe I would overclock my GPU from stock, but I don't want to do it with the stock fan. I am scared.

I will probably wait for water cooling to become more popular and cheaper here.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Thanx for the info...
> I just thought maybe i would overclock my GPU from stock but i dont want to do it with the stock fan. I am scared.
> I will probably wait for water cooling to be more popular and cheaper here.


If you want...As long as you cover cost, I wouldn't mind shipping to you -- apparently I keep the local UPS shop in business so, by all means, if you'd like to go on water, I'd be more than happy to send it your way.


----------



## Masked

Quote:


> Originally Posted by *toX0rz*
> 
> How exactly?
> By re-writing own drivers or did I get you wrong?


Yes.

You have to unlock the BIOS on the card, which is done through a re-flash.

So now you have an unlocked base, almost like jailbreaking your phone.

So now you have the ability to take it above and beyond what it was.

The cards are actually DRIVER and BIOS locked.

So you have to actually edit both, in order to see full functionality.

You can unlock your BIOS and still only see 900MHz; the card itself is capped because the "stock" drivers won't allow you to take it anywhere.

If you edit BOTH, you can actually have a fully functional, release GTX 590 operating at 900mhz with the optimization it "should" have.

However, now you have a major issue, because your custom driver doesn't sync with any game's graphics settings so, essentially, all you can actually do is bench.

I believe K1ngp1n unlocked his fully for a nitro run but, again, I've got so much whirling around right now, I can't give dates.

I know of 2 clients that have done the above and as an extra, I actually test/fix their drivers a little but, that's not something I can release...However, their results are...Where the card should/could be if Nvidia would put the time in.

Personally, I'm just looking to move on -- the AMD drivers are actually craptastic, much more so than Nvidia's, but I think I'm going to go with Kepler, ultimately.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> If you want...As long as you cover cost, I wouldn't mind shipping to you -- Apparently I keep the local UPS shop in business so, by all means, if you'd like to go on water, would be more than happy to send it your way.


That would actually be awesome.

However the problems are

1) Customs. You never know if an item will get stuck.

2) No idea how I can ever convert my system into a water-cooled one.

3) If something goes wrong, I don't know anyone near me who knows much about water cooling.

Though I will look into water cooling more. Perhaps it's something I can do myself.

Thank you for the offer though. If I change my mind I could ask your help in shipping. I would pay via PayPal instantly. But again, I need to know that it's something that I can deal with technically.


----------



## MKHunt

What drivers is everyone using? I'm still on 285.62 because it lets me go to .938V rather than .925V.

Any advantages to the newer drivers?


----------



## Wogga

290.53 w/ custom bios (0.963v)


----------



## rush2049

I am on the latest beta drivers (came out like 2 days ago?) and am on an unlocked bios.

Running 700MHz @ .963V with the stock heatsink.
Idle temp is 35C and load temp is 75C (though my case has great airflow)


----------



## Recipe7

Hmmm... where can I get me this custom BIOS?


----------



## rush2049

The 'GTX 590 Flashing and Overclocking' thread.

Do a search in the Nvidia section...

RagingCain was the main person that put it together (way, way back, that is, if I can say that in relation to 590 history)


----------



## Recipe7

Will look into it rush, thanks.

What driver are you running? The new 295, I presume? (I'm not sure if there are already two 295s around.)


----------



## rush2049

295.51

Only glitch I have seen so far: it has texture flickering issues in Civ 5.
(probably going to be fixed in the next version)


----------



## Recipe7

Ok, great. Thanks rush.


----------



## MKHunt

Quote:


> Originally Posted by *rush2049*
> 
> The "gtx 590 flashing and overclocking' thread
> do a search in the nvidia section....
> Ragincain was the main person that put it together (way way back, that is if I can say that in relation to 590 history)


Does the modded BIOS work with the newer revision cards? I'm hesitant because they swapped out parts for ones with better tolerances but included a different BIOS than before.


----------



## rush2049

If the BIOS that shipped with your 590 has a 40 in it somewhere (sorry, I don't have the data in front of me; it might be a 41 or 42), that is the newer BIOS/revision.

The old one is 70.10.37.00.0X, with fluctuations around there depending on your vendor-specific modifications.

Old-BIOS cards can flash to the higher BIOS, but the ones that shipped with the higher one I haven't had any success flashing down (tried with the one card I got my hands on that had it).

So if you have the newer revision, dig through that flashing thread; there was someone making newer modded BIOSes in the more recent pages.

If you have the original revision, feel free to use whichever BIOS you want (try to stay within your vendor).

Also, a small tip that I don't think is covered in the extensive write-up that RagingCain did:
You might have to run the --protectoff command for each core to disable protection on the BIOS chips before flashing (if the vendor protected them), then run the --protecton command for each core after flashing. For help with these and other commands, look inside the nvflash readme.
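That protect-off / flash / protect-on sequence can be sketched roughly like this. The adapter indices and filenames are hypothetical (a 590 exposes each core as its own adapter; confirm yours with nvflash --list), and flashing the wrong image can brick a card, so treat this as an outline, not a recipe:

```shell
nvflash --list                              # identify the adapter index of each core
nvflash --index=0 --save core0-backup.rom   # back up each core's existing BIOS first
nvflash --index=1 --save core1-backup.rom
nvflash --index=0 --protectoff              # disable EEPROM write protection, per core
nvflash --index=1 --protectoff
nvflash --index=0 modded.rom                # flash the modded BIOS to each core
nvflash --index=1 modded.rom
nvflash --index=0 --protecton               # re-enable protection afterwards
nvflash --index=1 --protecton
```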

Wow I feel like a hardware manual writer....


----------



## MKHunt

Larger inductors. It's definitely the newest revision, as nobody seemed to know about it before I had my card. I made a thread about the changes after researching/interpreting the datasheets and circuit diagrams.

I'll probably have to mod the bios, but hopefully not.


----------



## Masked

Quote:


> Originally Posted by *MKHunt*
> 
> Larger inductors. It's definitely the newest revision as nobody seemed to know about it before I had my card. I made a thread about the changes after researching/interpreting the datasheets and circuitry diagrams.
> I'll probably have to mod the bios, but hopefully not.


Well, come to find out, about 95% of cards are revision 2s, which are the majority of the market, so, ironically, the vast majority (93%) of total users actually have that revision.

What I find interesting is that EVGA got an entire stock of the Rev 2's and everyone else based their CNC on revision 1...

Makes you think.


----------



## MKHunt

Quote:


> Originally Posted by *Masked*
> 
> Well, come to find out it's about 95% of revision 2's which, are the majority of the market so, ironically, the vast majority (93%) of total users actually have that revision.
> What I find interesting is that EVGA got an entire stock of the Rev 2's and everyone else based their CNC on revision 1...
> Makes you think.


Interesting. Compared to my original card coil whine is significantly reduced (almost completely) but it is indeed interesting that EVGA got so many. I always suspected they were closer to Nvidia than the other companies since they seem to sell more reference cards.

ETA: Also, does that mean there is already a modded BIOS for the card? It folds 730 at .925V, so at .963V... PPD.


----------



## Masked

Quote:


> Originally Posted by *MKHunt*
> 
> Interesting. Compared to my original card coil whine is significantly reduced (almost completely) but it is indeed interesting that EVGA got so many. I always suspected they were closer to Nvidia than the other companies since they seem to sell more reference cards.
> ETA: Also does that mean there is already a mod bios for the card? It folds 730 at .925V so at .963... PPD.


All of our stock is post-EVGA stock to my understanding, so we have all Rev 2 stock as well.

As to whether they just ran out and used secondary stock, or slammed on the brakes and said "WOAH"... Who knows, but it's quite evident that right at the start, they made the swap.


----------



## ProfeZZor X

The anticipation is killing me to actually have mine in my hands and feel the weight of that thing - which I hear is unreal. So I totally look forward to unboxing all that hydro copper goodness when it arrives next week.


----------



## rush2049

You know how motherboards have that certain 'flex' feel, especially when holding one up by the large aftermarket heatsink you installed...

and you know how solid a new smartphone feels when you hold it...

well, this thing feels more like holding a clay brick, as opposed to that really sturdy feeling described above...


----------



## MKHunt

Quote:


> Originally Posted by *rush2049*
> 
> you know how motherboards have that certain 'flex' feel. Especially when holding it up by the large aftermarket heatsink you installed....
> and you know when you hold a new smartphone how solid they feel....
> well this thing feels more like holding a clay brick as opposed to that really sturdy feeling described above....


actually I don't know about that flex feel. Got Z68XP-UD4, feels like a smartphone. XD

I do know the brick feel though. We used to have the old gray brick cell phones.


----------



## rush2049

Quote:


> Originally Posted by *MKHunt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rush2049*
> 
> you know how motherboards have that certain 'flex' feel. Especially when holding it up by the large aftermarket heatsink you installed....
> and you know when you hold a new smartphone how solid they feel....
> well this thing feels more like holding a clay brick as opposed to that really sturdy feeling described above....
> 
> 
> 
> actually I don't know about that flex feel. Got Z68XP-UD4, feels like a smartphone. XD
> 
> I do know the brick feel though. We used to have the old gray brick cell phones.

I am talking like Scythe Mugen 2 heavy... the board flexes supporting all that weight...


----------



## Swolern

Quote:


> Originally Posted by *heyskip*
> 
> I ran some benchmarks and with the game in its current state a dedicated PhysX card doesn't seem worth it (on my system anyway).
> 560ti dedicated physx - MIN 18 MAX 100 AVG 53
> 590 one core dedicated physx - MIN 23 MAX 71 AVG 52
> 590 physx active but no cores dedicated - MIN 19 MAX 107 AVG 52
> CPU dedicated physx - MIN 20 MAX 114 AVG 49
> If I get curious I may try my 560ti in the x8 slot instead of x4 to see if theres any difference. Also once the game is patched I will retry because a I expected a bigger difference with a 560ti dedicated. If it doesn't make a difference in a physx heavy game like AC then its probably not worth the extra power usage and increased heat.


Ok I just got around to getting a dedicated PhysX card and wow what a difference in Batman AC. First of all my rig- Asus z68 Pro/ EVGA 590 Classified/ i5 2500k/ 8gb 1600 ram/ 1050w Corsair.

With just the 590 (all settings maxed and tessellation on high @ 1920x1080):
-PhysX off: 40-80 FPS. I'm not counting the small hiccups that bring FPS into single digits every so often
-PhysX on high: 24-45 FPS

With the 590 + 460 as dedicated PhysX (all settings same as before):
-PhysX on high: 40-80 FPS !!!

As you can see, high PhysX in Batman AC puts a huge strain on the 590 by itself; playing at the same settings with no PhysX makes a big difference. And I'm getting the same high FPS with high PhysX and a dedicated card as with no PhysX and no dedicated card.

So I can finally play Batman AC yay! I picked up the PNY GTX 460 on clearance at BestBuy (B&M only) for $124.

Now I have to try it on other Physx games!


----------



## toX0rz

Quote:


> Originally Posted by *Swolern*
> 
> Ok I just got around to getting a dedicated PhysX card and wow what a difference in Batman AC. First of all my rig- Asus z68 Pro/ EVGA 590 Classified/ i5 2500k/ 8gb 1600 ram/ 1050w Corsair.
> With just 590 (all settings maxed and tessellation on high @ 1920x1080)
> -PhysX off: 40-80 FPS. I'm not counting the small hiccups that bring FPS in single didgits every so often
> - PhysX on high- 24-45FPS
> With 590+ 460 as dedicated PhysX (all settings same as before)
> -PhysX on high: 40-80 FPS !!!
> As you can see high PhysX on Batman AC puts a huge strain on the 590 by itself bringing. While playing with same setting with no PhysX marks a big difference. And I'm getting the same high FPS with high PhysX and dedicated card as no PhysX and no dedicated card.
> So I can finally play Batman AC yay! I picked up the PNY GTX 460 on clearance at BestBuy (B&M only) for $124.
> Now I have to try it on other Physx games!


It won't make much difference in other games.
Batman AC is known to be by far the most demanding game regarding PhysX and thus the most extreme case.
Mafia 2 might be worth trying though.


----------



## Swolern

Quote:


> Originally Posted by *toX0rz*
> 
> It wont make much difference in other games.
> Batman AC is known to be by far the most demanding game regarding PhysX and thus the most extreme case.
> Mafia 2 might be worth trying though.


Ya, I'm loving my cheap 460 so far. I know it won't make a big difference in a lot of games, but Batman AC is one of my favorite games and I've been waiting to play it at max settings and a high FPS.

Looks like it pays off in Mafia 2 also! Ran the benchmark @ max settings and PhysX on high, 1920x1080:
590 + CPU PhysX = 16.2 FPS avg
590 only (no dedication) = 46.4 FPS avg
590 + 460 (dedicated) = 71.2 FPS avg
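To put those averages in perspective, here's a quick back-of-the-envelope ratio (just dividing the numbers above; awk is only used as a calculator here):

```shell
# Relative speedup computed from the Mafia 2 averages listed above
awk 'BEGIN { printf "460 dedicated vs CPU PhysX:  %.2fx\n", 71.2/16.2 }'
awk 'BEGIN { printf "460 dedicated vs 590 alone:  %.2fx\n", 71.2/46.4 }'
```

Roughly a 4.4x gain over CPU PhysX and about 1.5x over the 590 handling PhysX on its own.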

I guess the 590 spoils me but I want my games at a solid 60fps. The 460 helps me achieve this.


----------



## iARDAs

Why do games have to fully use one of the 590's cores for PhysX?

Why can't they use half of a core?


----------



## MasterVampire

So my GTX590 died the other day









When gaming in Skyrim or GTA, after maybe 15 minutes the game would crash.
Then a few days later, whenever I would go to launch a game, the screen would go black, I would get a no-signal message, and I would have to do a hard reset.
Then soon after, the same black-screen thing would happen if I tried to watch a movie or TV show.
And finally, it would black-screen crash right after booting up the PC and getting into Windows.

I tried formatting the HDD twice with different Nvidia drivers prior to the final death it now gets after booting into Windows.
I also replaced the PC's motherboard and CPU (had been meaning to upgrade for a while anyway), but that didn't make a difference.

The card itself never overheated, because I took the fan off the top and hooked it up to a watercooling setup, and my PSU is a 700W one.

So can it be concluded that the card needs to be RMA'd so I get it fixed or replaced? If so, does this happen a lot with these cards?


----------



## Swolern

Quote:


> Originally Posted by *MasterVampire*
> 
> So my GTX590 died the other day
> 
> 
> 
> 
> 
> 
> 
> 
> When gaming in Skyrim or GTA after maybe like 15 minutes the game would crash.
> Then a few days later whenever I would goto launch a game the screen would go back and I would get a no signal message and would have to do a hard reset.
> Then soon after the same black screen thing would happen if I tried to watch a movie or tv show.
> And finally after booting up the pc and getting into windows it would black screen crash.
> I tried formatting the hdd twice prior to the final death it gets after booting into windows with nvidia different drivers.
> Also replaced the PC's motherboard and cpu (had been meaning to upgrade for a while anyway) but that didnt make a difference.
> The card itself never overheated because I took the fan off the top and hooked it up to a watercooling setup and my PSU is a 700w one.
> So can it be conculded that the card needs to be RMA'ed so I get it fixed or a replacement? If so does this happen alot with these cards?


It could be the GPU or the PSU. A 700W PSU is the bare minimum for the 590, and if it's a lower-end brand it could be delivering even less to the system. The best way to test is to swap out the components and retest the system.

GPU death doesn't happen much on the 590, but it's possible just like with any other GPU. If it was unlocked for high overclocks, or even worse overvolting, then your risks are much higher.


----------



## squishysquishy

Hey guys, I installed koolance water blocks on my 590's last night. I have everything bought and happy, I just need to mount everything up. Hopefully I will have time for that this weekend.


----------



## Swolern

Quote:


> Originally Posted by *iARDAs*
> 
> Why do games have to use 1 of the cores of 590 fully in physx?
> Why cant it use half of a core in 590?


This only happens if you dedicate a core to PhysX. If no cores are dedicated, then PhysX will share your primary core. When SLI and PhysX are both enabled, the first GPU will do all the PhysX workload plus standard rendering, and every GPU thereafter will do basic SLI rendering operations. This means your primary GPU will be the bottleneck for your PhysX performance.


----------



## MasterVampire

Quote:


> Originally Posted by *Swolern*
> 
> It could be GPU or PSU. 700w psu is bare minimum for the 590. If its a lower end brand name psu it could be even less efficient for the system. Best way to test is swap out the components and test the system.
> GPU death doesn't happen too much on the 590, but it's possible just like any other GPU. If unlocked for high overclocks or even worse Overvolting then your risks are much higher.


Yeah, but the thing is I had been using the 590 fine with my current 700W PSU for the last 5 months before it died with these black-screen crashes.

I also forgot to mention that I also tested my ram with memtest for a few hours with no errors.


----------



## Recipe7

I've been using a Corsair HX650W with my 590. Your 700W should be enough, provided that you are using a reputable PSU.


----------



## RODBORA

Quote:


> Originally Posted by *MKHunt*
> 
> What drivers are everyone using? I'm still on 285.62 because it lets me go to .938V rather than .925V.
> Any advantages to the newer drivers?


MK,

I'm using the latest beta at the stock 630MHz, even with just .925V. The objective is to follow the GPU's behaviour and see if it will be 100% stable on the new beta.

Another point, guys: how much can I overclock my 590 for 24/7 use on the stock air cooler at .938V (with the 285.62 driver) and at .925V (with the new beta driver)? And will that overclock be 100% safe for the GPU?

Thanks!


----------



## Wogga

I believe the voltage on 285.62 reads 0.938V only because of a bug in monitoring software. A number of drivers between the 270.xx and 290.xx series showed the voltage raised by one step, but the real voltage was one step lower. So basically, on 285.62 or the latest drivers, the real voltage will be the same (0.925V).


----------



## MKHunt

Quote:


> Originally Posted by *Wogga*
> 
> i believe voltage on 285.62 is 0.938v only because of a bug in monitoring software. a few number of drivers between 270.xx and 290.xx series showed voltage rised by 1 step but real voltage was 1 step lower. so basically 285.62 or latest drivers - real voltage will be the same (0.925v)


You sure? Because at 285.62 I can run 745/1758, but on anything newer I can only go 730/1728.

GPU-Z and Afterburner both show .938V. Haven't tried Precision.



BIOS version 70.10.42.00.90


----------



## Dennybrig

Hey crackhead guy, congrats on your waterblocks! Don't forget to post pictures, and also, could you tell me what your watercooling setup will be? I mean the equipment you will use to water cool them...
Please let me know!


----------



## RODBORA

Quote:


> Originally Posted by *MKHunt*
> 
> ya sure? because at 285.62 i can run 745/1758 but on anything newer I can only go 730/1728
> GPU-Z and afterburner both show .938V. Haven't tried precision.
> 
> BIOS version 70.10.42.00.90


Mk,

I feel sure you are correct: the voltages are different between 285.62 and the new beta drivers. I will try to keep the new beta (290 series) just to follow the GPU's behaviour.

Talking about overclocking: is your 590 air cooled? If so, is it 100% safe to keep it overclocked at 730MHz? How long have you been running these frequencies?

thanks MKHunt.


----------



## MKHunt

Quote:


> Originally Posted by *RODBORA*
> 
> Mk,
> I'm feel sure you are correct: the voltage are different between 285.62 and new beta drivers. I will try to keep the new beta (290 series) just to follow the GPU behaviour.
> Talking about overclock: Your 590 are air cooled? If yes, is it 100% safe to keep the 590 overclocked at 730Mz? How long time are you using this frequencies?
> thanks MKHunt.


My 590 is under water with an EK EN+Acetal block. I use 730MHz to fold, and I game at stock since I have a 60Hz monitor.


----------



## kzinti1

Did any of you happen to write down the number on the top of your cards when you stripped them down for a waterblock?

I need to be absolutely certain if the pair of EVGA's I'm running are actually reference cards, before I waste a lot of money on a pair of blocks I can't use.

I've said several times that the original Nvidia pcb's that were shipped to the end-producers came in 2 revisions. http://vr-zone.com/articles/revised-geforce-gtx-590-cards-in-june/12222.html

The 2nd revision came out in June 2011, right around the time I bought my first GTX 590.

I read somewhere, maybe on this forum, maybe another, that somebody had a pair of XSPC waterblocks for their GTX 590s that weren't exactly the same, and one came within about 1/8 inch of fitting properly. I imagine he had both revisions of these cards. I can't find any waterblock manufacturer that will answer an e-mail asking which revision of the GTX 590 their blocks are for.

I'm not really concerned by the brand of blocks, as long as they have no exposed copper. I used a non-plated Danger Den block on an XFX 5850 and couldn't keep it from tarnishing. I should've known better.

Thanks,

k1


----------



## MKHunt

Koolance and EK I can confirm without any doubt will fit ANY 590.

I actually have a Koolance block for sale but not posted yet; BNIB, since I RMA'd my other one and couldn't wait. It does get better temps than my EK block, but the difference is like 3-4C so I have a hard time caring lol.


----------



## kzinti1

Quote:


> Originally Posted by *MKHunt*
> 
> Koolance and EK I can confirm without any doubt that they will fit ANY 590.
> I actually have a Koolance for sale but not posted yet; BNIB since I RMAed my other one and couldn't wait. It does get better temps than my EK block but the difference is like 3-4C so I have a hard time caring lol.


Thanks MK! I was thinking about a Koolance but there are no backplates listed for their blocks.

Now, even though it has exposed copper, I'm leaning more towards a pair of XSPC blocks. According to their install instructions, http://www.xs-pc.com/manual/razorgtx590.pdf, I can use the stock backplate. In fact, it says the block is mounted with the screws going through the stock backplates. They also have the wide-spaced ports I like so much.

But, I'm not ordering these until I can find some confirmation that they'll fit my specific cards.

They certainly aren't as flashy as other brands but who would know it besides me? The backplate is visible in most of our cases, not the waterblock itself. And the EVGA Classified backplate is just as classy looking as any other.

I'll sure miss the LED lit green Nvidia symbol and the matching GeForce logo, but I'd miss it even more if my cards lives were shortened because I failed to cool them the best I possibly could. I also saw a really fake-looking spec and price sheet for the upcoming GTX690's, if there actually will be such a thing, of course, and it listed them as $1000.00 each! I'll be keeping these 590's as long as possible and watercooling them is the only way to go. If I'm still alive I may try a pair of GTX790's, but that's a long ways down the road, so I kinda doubt it.

I'm still hoping EVGA will start selling the waterblocks made for them by SwifTech. They're quite ugly but it doesn't matter. But, I think, (not the least bit sure), that EVGA quit making their HydroCopper GTX590's when Nvidia came out with their new pcb's. I've even heard a couple of rumors that EVGA may have made other changes to the stock 590 pcb's that nobody else did. If so, then I think that they'd also have made a different BIOS to allow higher voltages. But, mine still idle at 0.875V and top out at 0.938V, or whatever it actually is. I really wish that EVGA had put mount points for a volt-meter like MSI did with my GTX580 Lightning XE's. I can't find a pcb layout for these EVGA's and I'd really rather not go fumbling around with a multi-meter unless I know exactly where to place the probes. There should be somebody here at OCN that knows how to do this.

Thanks again.


----------



## Masked

Quote:


> Originally Posted by *kzinti1*
> 
> I'm still hoping EVGA will start selling the waterblocks made for them by SwifTech. They're quite ugly but it doesn't matter. But, I think, (not the least bit sure), that EVGA quit making their HydroCopper GTX590's when Nvidia came out with their new pcb's. I've even heard a couple of rumors that EVGA may have made other changes to the stock 590 pcb's that nobody else did. If so, then I think that they'd also have made a different BIOS to allow higher voltages. But, mine still idle at 0.875V and top out at 0.938V, or whatever it actually is. I really wish that EVGA had put mount points for a volt-meter like MSI did with my GTX580 Lightning XE's. I can't find a pcb layout for these EVGA's and I'd really rather not go fumbling around with a multi-meter unless I know exactly where to place the probes. There should be _somebody_ here at OCN that knows how to do this.
> Thanks again.


That won't be happening.

The 590 HC's were a limited one run only with Swiftech and they already started CNCing the next release so, no more 590 blocks will be released to the public beyond what's actually on the cards.

The 580 was different because that wasn't a limited card release, and to cover the 1.5GB // 3GB cards, all they had to do was short 2/3 protruding contact points -- Bam, done.

So, unfortunately, this will not be a reality.

The rumors surrounding this card are incorrect. The PCB has never changed, not once.

The contact points ON the PCB have changed as have some components on the PCB but, the PCB was never changed.

As for the voltage issues, the only issue currently outstanding is the BIOS/driver lock on behalf of Nvidia, and it HAS FLUCTUATED over several releases. ~ Whoever said this, you're absolutely correct.

Nvidia is using driver releases almost as tests to see what's guaranteed safe for the public and this is far from the first time they've done this, no surprise.

I have multi-metered our cards which is what my blurb was about on the previous page.

Your card has no physical blockade UNTIL 1.05v, which is the point at which your doppler connector FAILS and actually rebounds the voltage BACK to the VRMs, blowing the VRM closest to the secondary caps.

1.0v is absolutely safe and as I've said many times...2 Weeks after release, MANY of us had these at 0.95v @ 850mhz or better.


----------



## squishysquishy

Quote:


> Originally Posted by *Dennybrig*
> 
> Hey crackhead guy congrats on your waterblocks! Dont forget to post pictures and also could you tell me what will your watercooling setup be? I mean the equipment you will use to water cool them...
> Please let me know!


I have a Koolance block, a 35X pump, a 180mm rad, 2 x 180mm Phobya fans, an XSPC single bay res, and compression fittings. And I might be putting in a Koolance inline drain port... but it would look soo ugly... I might leave it out.

I couldn't mod my case to fit another rad for the other 590, so I am converting the other one back to air and putting it in my brother's computer until games stop playing maxed out on a single 590. Then I will need to upgrade to a new case, probably a Silverstone Raven (still uses the 180mm fans).

I will post pics when I get everything finished; I still need to flush my rad before I can add coolant.


----------



## toX0rz

Quote:


> Originally Posted by *Masked*
> 
> That won't be happening.
> The 590 HC's were a limited one run only with Swiftech and they already started CNCing the next release so, no more 590 blocks will be released to the public beyond what's actually on the cards.
> The 580 was different because that wasn't a limited card release and to cover the 1.5 // 3gb cards, all they had to do was shorted 2/3 protruding contact points -- Bam, done.
> So, unfortunately, this will not be a reality.
> The rumors surrounding this card are incorrect. The PCB has never changed, not a once.
> The contact points ON the PCB have changed as have some components on the PCB but, the PCB was never changed.
> As for the voltage issues, the only issue currently withstanding is the Bios/Driver lock on behalf of Nvidia and it HAS FLUCTUATED over several releases. ~ Whomever said this, you're absolutely correct.
> Nvidia is using driver releases almost as tests to see what's guaranteed safe for the public and this is far from the first time they've done this, no surprise.
> I have multi-metered our cards which is what my blurb was about on the previous page.
> Your card has no physical blockade UNTIL 1.05v which is the point at which your doppler connector FAILS and actually rebounds the voltage BACK to the VRM's which blows the closest VRM to the 2ndary caps.
> *1.0v is absolutely safe and as I've said many times...2 Weeks after release, MANY of us had these at 0.95v @ 850mhz or better*.


Isn't the max possible voltage on current drivers like 0.96V-0.97V?
So 850MHz would still be possible?


----------



## squishysquishy

On a more serious note: my 590 is in my TJ08-E. I don't have room for my pump to sit underneath my 590 (due to hard drives and my H80 for my processor); would it be bad if I mounted the pump on top of the 590 block? I could mount it up against my PSU, but I would rather not.

Thanks for the input.


----------



## Masked

Quote:


> Originally Posted by *toX0rz*
> 
> Isnt the max possible voltage on current drivers like 0,96V - 0,97V?
> So 850mhz would still be possible ?


No, if you haven't modified your BIOS, the max voltage is lifetime-set under 0.94V ~

I believe it's 0.936V.


----------



## MKHunt

Quote:


> Originally Posted by *kzinti1*
> 
> Thanks MK! I was thinking about a Koolance but there are no backplates listed for their blocks.
> Now, even though it has exposed copper, I'm leaning more towards a pair of XSPC blocks. According to their install instructions, http://www.xs-pc.com/manual/razorgtx590.pdf, I can use the stock backplate. In fact, it says the block is mounted with the screws going through the stock backplates. They also have the wide-spaced ports I like so much.
> But, I'm not ordering these until I can find some confirmation that they'll fit my specific cards.
> They certainly aren't as flashy as other brands but who would know it besides me? The backplate is visible in most of our cases, not the waterblock itself. And the EVGA Classified backplate is just as classy looking as any other.


The Koolance blocks work with the stock backplate. As far as I know, only EK blocks don't work with the stock backplate.

Also, memory OCs weren't doing anything for my PPD, so I dropped my mem to stock and I'm eking out what I can in moar core.


----------



## heyskip

Quote:


> Originally Posted by *toX0rz*
> 
> Isnt the max possible voltage on current drivers like 0,96V - 0,97V?
> So 850mhz would still be possible ?


My stable clocks are [email protected] and [email protected], and I would say these are on the lower end of average. [email protected] sounds a little far-fetched to me. Maybe for surfing the net, but for completing a 3DMark11 run or playing a GPU-intensive game like BF3, I doubt it.

This is on the earlier cards; the later ones seem to be more capable.


----------



## Masked

Quote:


> Originally Posted by *heyskip*
> 
> My stable clocks are [email protected] and [email protected] and I would say these are on the lower end of average. [email protected] sound a little far fetched to me. Maybe for surfing the net, but completeing a 3DMark11 run or playing a GPU intensive game like BF3, I doubt it.
> This is on the earlier cards, the later ones seem to be more capable.


I then challenge you to read the first 20 pages of this thread.

850mhz was achieved easily pre-voltage lock on anywhere from .955v to 1.00v ~

There are no earlier cards...Revision 1 was a VERY limited run, then they hopped to revision 2.

I'd argue that everyone in the market has had a Rev 2.0 since release; they just didn't know it.

So, 850mhz is easily achieved but, you actually have to unlock the card to get it there.

Your issue is that your cards are DRIVER LOCKED...In fact, for everyone in here (because none of you have written your own drivers) we're all driver locked.

Once that dampener is removed, I'd almost bet you my left nut your card could do 850mhz pre-1.0v.


----------



## heyskip

Quote:


> Originally Posted by *Masked*
> 
> I then challenge you to read the first 20 pages of this thread.
> 850mhz was achieved easily pre-voltage lock on anywhere from .955v to 1.00v ~
> There are no earlier cards...Revision 1 was a VERY limited run, then they hopped to revision 2.
> I'd argue that everyone in the market has had a Rev 2.0 since release only, they didn't know it.
> So, 850mhz is easily achieved but, you actually have to unlock the card to get it there.
> Your issue is that your cards are DRIVER LOCKED...In fact, for everyone in here (because none of you have written your own drivers) we're all driver locked.
> Once that dampener is removed, I'd almost bet you my left nut your card could do 850mhz pre-1.0v.


Don't worry, I've pretty much read this entire thread and the overclocking/flashing thread. Don't tell my boss, because most of it was done while at work









I was actually responding to the bold text below.
Quote:


> Originally Posted by *toX0rz*
> 
> Isnt the max possible voltage on current drivers like 0,96V - 0,97V?
> *So 850mhz would still be possible ?*


With up-to-date drivers, which 99% of us have access to and need to run due to game optimizations, I don't think 850MHz is realistically achievable.


----------



## toX0rz

Quote:


> Originally Posted by *heyskip*
> 
> My stable clocks are [email protected] and [email protected] and I would say these are on the lower end of average. [email protected] sound a little far fetched to me. Maybe for surfing the net, but completeing a 3DMark11 run or playing a GPU intensive game like BF3, I doubt it.
> This is on the earlier cards, the later ones seem to be more capable.


Is it on air?
If so, what temps are you getting and most importantly how did the OC affect your temps in comparison to running it at stock?

After a year of running the card stock I pretty much decided to give the OC a shot aswell.
I'm aiming for a ~20% overclock so around -/+ 730 MHz, will check the 590 flashing thread later and try some BIOSes out.

Im on stock air cooler, but I reckon the cooling will be enough seeing how my GPUs only run between 72°C and 80°C at around 70% fan speed while gaming.

On another note @Masked:
How can I check which revision my card is? I realize you said the vast majority are on Rev 2 cards, but I got mine pretty much at launch, so it would be cool if there were a way to check.


----------



## Wogga

If rev. 2.0 is the one with different VRMs, then both of my cards are rev. 1.0, and my highest benched clock was [email protected] (it can be checked in the green vs. red thread) on the 280.19 drivers, which I believe are the latest voltage-unlocked drivers.
Even on the "burning" drivers, the ones that didn't throttle the card in 3DMark11, I could successfully bench only at [email protected]


----------



## H4rd5tyl3

Forgot to post my pic way back when lol.



Also, my card is from around launch day. I'm assuming I can flash the bios w/o voiding warranty?


----------



## Masked

Quote:


> Originally Posted by *heyskip*
> 
> Don't worry, I've pretty much read this entire thread and the overclocking/flashing thread. Don't tell my boss, because most of it was done while at work.
> 
> 
> 
> 
> 
> 
> 
> 
> I was actually responding to the bold text below.
> With up-to-date drivers, which 99% of us have access to and need to run for game optimizations, I don't think 850MHz is realistically achievable.


Not currently, no.

The driver lock alone won't let you get above 800mhz, anyway...At least that was true the last time we tried.

So, as of today, 850mhz is not achievable...unless you unlock the card and re-write the drivers.
Quote:


> Originally Posted by *toX0rz*
> 
> On another note @Masked:
> How can I check what Revision my card is? I realize that you said the vast majority is on Rev2 cards but I got mine pretty much at launch so would be cool if there'd be a way to check.


It is on the back of the card located just near the middle screw...
Quote:


> Originally Posted by *Wogga*
> 
> If rev. 2.0 is the one with different VRMs, then both of my cards are rev. 1.0, and my highest benched clock was [email protected] (it can be checked in the green vs. red thread) on the 280.19 drivers, which I believe are the latest voltage-unlocked drivers.
> Even on the "burning" drivers, the ones that didn't throttle the card in 3DMark11, I could successfully bench only at [email protected]


You wouldn't know which revision you have/had unless the serial number says B or Revision 2.

And having the card at 1.05v was bad ~ it means your secondary core was at 1.07v ~ so you could have actually hard-capped the card yourself.


----------



## toX0rz

Quote:


> Originally Posted by *Masked*
> 
> It is on the back of the card located just near the middle screw...


699-11020-0005-500

Which revision is it?


----------



## iARDAs

When I set my settings to LOW and play the game in 3D, the performance is awful. I am usually at 60 fps, but it drops to 40 fps.

I truly believe that BF3 needs a better SLI (590) profile and updated drivers.

I can play the game in 2D with ULTRA settings, averaging 60 or 70 fps.

If I turn everything off and set the game to LOW, in 3D I get a 40-50 average.


----------



## Masked

Quote:


> Originally Posted by *toX0rz*
> 
> 699-11020-0005-500
> S/N: 0****************
> Which revision is it


Posting your serial # is a very bad idea -- I'd edit that were I you.

And you actually have a vanilla...Like I said, there are very few of these out there...Especially those that haven't been RMA'd yet.

500 = Rev 1
501 = Rev 1(This is when they changed "lanes")
502 = Rev 2. (Started the 2nd run - Made a cap swap)
502b = Rev 2. (2nd run, made the VRM change and pathways)

Actually -- I think it went from 500 to 502...Because 500 was pre-order and 501 was sample stock...502 was standard and 502b happened very shortly after standard.

TL;DR: I actually don't have an answer for the order they were released in...It's entirely possible they went 500, 501, 502b...

I do know they tend to label sample stock and pre-order stock separately from retail...
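Masked's suffix-to-revision mapping above can be jotted down as a quick lookup. This is a sketch only; as he says, the ordering and details are uncertain, and the dictionary below just restates the list from the post:

```python
# Board-number suffix -> revision, per the (admittedly uncertain)
# mapping in the post above.
REVISIONS = {
    "500": "Rev 1 (pre-order / vanilla)",
    "501": "Rev 1 (lane change / sample stock)",
    "502": "Rev 2 (cap swap)",
    "502b": "Rev 2 (VRM and pathway changes)",
}

board_number = "699-11020-0005-500"  # toX0rz's board number
suffix = board_number.rsplit("-", 1)[-1]
print(REVISIONS.get(suffix, "unknown"))  # Rev 1 (pre-order / vanilla)
```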


----------



## toX0rz

I see, so I guess I'm pretty much bound to higher risks when OCing my card since I have the first revision..


----------



## Masked

Quote:


> Originally Posted by *toX0rz*
> 
> I see, so I guess I'm pretty much bound to higher risks when OCing my card since I have the first revision..


No, the limits never changed.

If you go beyond 1.05v on any 590, you're going to blow the VRMs out...There's no stopping that.

Whether or not you're better off in terms of performance is, as always, luck of the draw.


----------



## rush2049

I personally like my 590; it's the first version (500)..... It overclocks well, temps are low, and it accepted the modded BIOS without a problem.
(Just make sure your case supplies cool air and vacates both ends of the card quickly)


----------



## MKHunt

Hmm further testing has shown that the EK block runs about 5C hotter than the Koolance, which used to get to the exact same temp as the CPU when both were fully loaded.


----------



## toX0rz

Quote:


> Originally Posted by *rush2049*
> 
> I personally like my 590, its the first version (500)..... overclocks well, temps are low, accepted the modded bios without a problem.
> (Just make sure your case supplies cool air and vacates both ends of the card quickly)


What BIOS are you running? What clocks and voltages?


----------



## heyskip

Quote:


> Originally Posted by *toX0rz*
> 
> Is it on air?
> If so, what temps are you getting and most importantly how did the OC affect your temps in comparison to running it at stock?
> After a year of running the card stock I pretty much decided to give the OC a shot aswell.
> I'm aiming for a ~20% overclock so around -/+ 730 MHz, will check the 590 flashing thread later and try some BIOSes out.
> Im on stock air cooler, but I reckon the cooling will be enough seeing how my GPUs only run between 72°C and 80°C at around 70% fan speed while gaming.
> On another note @Masked:
> How can I check what Revision my card is? I realize that you said the vast majority is on Rev2 cards but I got mine pretty much at launch so would be cool if there'd be a way to check.


It's watercooled now. I didn't overclock while on air due to the summer ambient temps here of 35-40°C. The card was hitting 90°C+ at stock.


----------



## kzinti1

I decided to give the ForceWare 295.51 Beta drivers a shot today and found that GPU-Z 0.5.8 now reports my max voltage as 0.9250V instead of the 0.9380V it reported earlier.

This is the correct voltage now, isn't it?

Is there a simple set of instructions, somewhere at all, that explains how to mod the vBIOS so I can achieve a maximum 1.000V for my cards? That's an OV of 0.0750V. I've read through the thread about doing this, but I'm still not the least bit comfortable with the instructions I've pieced together.

This comes close, but I'd still prefer a written copy I can print out and follow, step by step.

It's still hard to believe such a minuscule voltage increase could have such a profound effect on GPU performance, even though it's been proven beyond a shadow of a doubt.


----------



## Shinobi Jedi

Is anyone else playing SWTOR and getting constant crashes to desktop or system lockups needing a hard reset?

It doesn't matter what driver I use, whether the WHQL or the betas, or whether I disable SLI and run it through one GPU - this game is constantly crashing to the desktop or worse.

And it's the only game I'm having issues with, every other game runs full blast flawlessly.

It happens on my two notebooks too, which is why I think it's the coding, but the haters on the SWTOR forums are trying to say it's all me and my system.

So, being one who tries to keep an open mind, I thought I'd ask here if anyone else is having stability issues with SWTOR and their 590?

It almost doesn't matter anyway; since I never play MMOs, my lvl. 41 Jedi Consular can't pass the lvl. 37 Class Quest/Mission. So, I guess I suck at them.

Good Times...


----------



## Wogga

Checked both cards: 500, so both are "vanilla" =D
Considering the short period of full load under 1.05v, the torn-off capacitor, and the good condition in terms of performance, I think the card itself wasn't too damaged by the high voltage.

Shinobi Jedi, I'm playing too and having some trouble. Same crashing, but it happens only when changing instances, when the loading screen is just one big ugly random skill icon. Hitting the "Win" key quickly helps avoid the driver crash.
Also, after approx. 2 hours I get some kind of texture glitch. Alt-Tabbing helps (a video memory leak, I think).


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> Is anyone else playing SWTOR and getting constant crashes to desktop or system lockups needing a hard reset?
> It doesn't matter what driver I use between the WHQL and the beta's, whether I disable SLI and run it through one GPU - this game is constantly crashing to the desktop or worse.
> And it's the only game I'm having issues with, every other game runs full blast flawlessly.
> It happens on my two notebooks too, which is why I think its the coding, but the haters on the SWTOR forums are trying to say it's all me and my system.
> So, being one who tries to keep an open mind, I'd thought I'd ask here if anyone else is having stability issues with SWTOR and their 590?
> It almost doesn't matter anyways; as that I never play MMO's, my lvl. 41 Jedi Consular can't pass the lvl. 37 Class Quest/Mission. So, I guess I suck at them.
> Good Times...


Well, in terms of the system -- It is you.

The game, itself, cannot logically cause the above.

What SWTOR DOES do is find errors where even memtest fails...Now, I'm not saying you have an error that would cause detriment to anything else; I'm saying it's highly likely you have a real error, it's just at an extreme you don't normally push your system to.

The above is actually pretty normal...Examples of this would be Photoshop (anyone remember those errors?), Doom 3 (I remember those crashes...Wonderful HD corruption tool, still to this day)...

Again, I'm not saying the above is your issue, but it's funny how some programs root out errors better than our stress tests.

What I can also see causing this issue is drivers -- Certain drivers just don't like the game...I have no idea what drivers I'm on at home, actually...But I have a HUGE memory leak on my ship the moment I touch the space terminal...It causes a double instance.

Personally, I'd write them an email with your DXdiags and explain the issue in depth...

As I said yesterday in the SWTOR forum, the diligent more are we, a better game, expect can we.
Quote:


> Originally Posted by *Wogga*
> 
> Checked both cards: 500, so both are "vanilla" =D
> Considering the short period of full load under 1.05v, the torn-off capacitor, and the good condition in terms of performance, I think the card itself wasn't too damaged by the high voltage.
> Shinobi Jedi, I'm playing too and having some trouble. Same crashing, but it happens only when changing instances, when the loading screen is just one big ugly random skill icon. Hitting the "Win" key quickly helps avoid the driver crash.
> Also, after approx. 2 hours I get some kind of texture glitch. Alt-Tabbing helps (a video memory leak, I think).


It doesn't matter how short the load is...1.05v on one core pushes the other above 1.05 -- by at least +0.03v ~ I'm honestly not kidding.

Past 1.05v is actually the stress point of the vast majority of those pathways...I wouldn't be surprised if there was damage and you just won't see it until down the line.

This is why it was expressed earlier in the thread that one should NEVER go beyond 1.02v ~ it's just bad for the card itself.
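The point above can be sketched in a couple of lines. The +0.03v offset is the figure quoted in the post; the function name is just for illustration:

```python
# Sketch of the point above: raising the primary core's voltage drags
# the secondary core higher still, so the "safe" ceiling has to leave
# headroom. The +0.03v offset is the figure quoted in the post.
SECONDARY_OFFSET_V = 0.03

def secondary_voltage(primary_v):
    # Illustrative helper, not a real driver/API call.
    return primary_v + SECONDARY_OFFSET_V

# 1.05v on the primary puts the secondary past the 1.05v stress point.
print(round(secondary_voltage(1.05), 2))  # 1.08
```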


----------



## emett

Masked, I am getting a "battlefield has stopped working" error when I turn the graphics up to ultra etc.
Could this be a similar cause? I ran memtest and there are no errors. What do you suggest I do?
I will be testing a friend's 590 in my PC on the weekend, but what else can you suggest?
The funny thing is that when BF3 was released I had no issues for a couple of months, which makes me think it's just a coding error.


----------



## Masked

Quote:


> Originally Posted by *emett*
> 
> Masked I am getting a "battlefield has stopped working error" when I turn the grfx up to ultra etc..
> Could this be a similar cause? I ran memtest and there are no errors. What do you suggest I do?
> I will be testing a friends 590 in my pc on the weekend, but what else can you suggest?
> Funny thing is when bf3 was released I had no issues for a couple months which makes my think it's just a coding error.


That's more or less a driver issue, unless your card is damaged.

When a pathway on your card is blocked...or a solder joint "goes bad", the relay doesn't actually work and it just stops the card.

Ironically, the same thing happens if your driver is corrupted or, just won't play nice with the game.

I'd remove and re-download your drivers -- Go from there.


----------



## emett

Ok, I'll give that a go. I normally just install the drivers over the top of each other.
I have only really overclocked a couple of times, when I ran the Palit custom BIOS; the max overclock I did was 700MHz. So I think the card is fine, but like I said, I will test this with my mate's card.
Should I use a driver sweep program to remove the drivers? Or is a custom install all that you'd recommend?


----------



## Masked

Quote:


> Originally Posted by *emett*
> 
> Ok, I'll give that a go. I normally just install the drivers over the top of each other.
> I have only really overclocked a couple of times, when I ran the Palit custom BIOS; the max overclock I did was 700MHz. So I think the card is fine, but like I said, I will test this with my mate's card.
> Should I use a driver sweep program to remove the drivers? Or is a custom install all that you'd recommend?


Yep, driver sweep and re-install.


----------



## iARDAs

I always wonder whether installing a new driver via Driver Sweeper actually makes a difference.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> I always wonder if installing a new driver via driver sweeper actually benefited.


There's so much residual data left when you just install over your previous driver...God only knows what sticks around.

I remember back in the day with the 280s ~ if you didn't uninstall the ENTIRE driver, the sound profile actually devoured the ascending sound files...Meaning you could be on driver 280-whatever but still have the sound profiles from driver 250-whatever.

An issue they still haven't fixed, btw.

There are also registry tweaks the drivers make that are auto-removed by Sweeper...

It does a good job and it's better to be safe rather than have an eon-old sound file cannibalizing half your audio profile.


----------



## iARDAs

Haha lol

I already had the latest beta, but now I've uninstalled it, used Driver Sweeper, and am installing it again.

From now on I will do it like this, to be on the safe side.

It's just weird that before, while my computer was booting, I hit F8 and was not taken to the Windows Safe Mode option.

I am using a UEFI BIOS, and when I hit F8 I had 3 options:

1-) the name of my DVD drive
2-) the name of my HDD
3-) enter setup

I started in safe mode via msconfig.


----------



## rush2049

Quote:


> Originally Posted by *toX0rz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rush2049*
> 
> I personally like my 590, its the first version (500)..... overclocks well, temps are low, accepted the modded bios without a problem.
> (Just make sure your case supplies cool air and vacates both ends of the card quickly)
> 
> 
> 
> What BIOS are you running? What clocks and voltages?

700MHz clocks, 963mV on both cores.

The BIOS is the ASUS volt-mod one, custom made for me by RagingCain. But I believe it is the first modded one listed under ASUS in the 'flashing and overclocking' thread.


----------



## Dennybrig

Guys, please help; I have tried to solve the following problem to no avail.

Look, I have my GTX 590 connected to my 50" plasma TV through a DVI-to-HDMI adapter. When I turn on the computer, the BIOS screen comes up fine (I have a Maximus IV Gene-Z MOBO), then the Windows spheres start to move and come together in the Windows 7 boot animation, and everything works fine to that point. Then, when the time comes to get into Windows itself, the screen turns black and my TV says "No Signal". So I disconnect the HDMI cable from my video card and connect it to the MOBO's HDMI out port, and the image comes up (the famous "warning, this may not be an original copy of Windows 7"). Then I disconnect the cable from the MOBO's HDMI port, connect it back to the video card, and the image appears on the TV.

What do you think is causing this issue? Could it be the drivers? I have installed the drivers several times with the same results (sometimes, even when I do the above procedure, the image does not appear on the screen once I go from the MOBO's HDMI port to the DVI port of the video card).

Please help; it is a pain in the *ss to be swapping cables every time I turn on my computer.

Thanks guys


----------



## iARDAs

Quote:


> Originally Posted by *Dennybrig*
> 
> Guys, please help; I have tried to solve the following problem to no avail.
> Look, I have my GTX 590 connected to my 50" plasma TV through a DVI-to-HDMI adapter. When I turn on the computer, the BIOS screen comes up fine (I have a Maximus IV Gene-Z MOBO), then the Windows 7 boot animation plays and everything works fine to that point. Then, when the time comes to get into Windows itself, the screen turns black and my TV says "No Signal". So I disconnect the HDMI cable from my video card and connect it to the MOBO's HDMI out port, and the image comes up; then I connect it back to the video card and the image appears on the TV.
> What do you think is causing this issue? Could it be the drivers? I have installed the drivers several times with the same results (sometimes, even when I do the above procedure, the image does not appear once I go from the MOBO's HDMI port back to the DVI port of the video card).
> Please help; it is a pain in the *ss to be swapping cables every time I turn on my computer.
> Thanks guys


Interesting.

Which driver are you using?

I am using the latest beta drivers. Today I uninstalled them, used Driver Sweeper, and reinstalled them just to have my drivers clean, as suggested in this thread.

However, after this process, when I had to reboot my PC so that the beta install would complete, I had the exact same problem as you.

The computer would turn on and I saw the Windows starting screen, but later I lost the image. The PC was on, but my monitor was saying NO SIGNAL.

I then switched my DVI cable to the other DVI port on the card and it worked.

Perhaps we just discovered an issue.


----------



## Dennybrig

Yep, it seems we did discover an issue. Can anybody else shed some light on the problem?


----------



## iARDAs

Quote:


> Originally Posted by *Dennybrig*
> 
> Yep, it seems we did discover an issue. Can anybody else shed some light on the problem?


http://forums.nvidia.com/index.php?showtopic=222491

I started this thread on Nvidia's forum as well.

No response so far.


----------



## emett

I have never used driver sweep before. Should I uninstall the nVidia chipset, nVidia display and nVidia PhysX drivers, or just the display?


----------



## iARDAs

Quote:


> Originally Posted by *emett*
> 
> I have never used driver sweep before, should I uninstall, nVidia chipset, nVidia display and nvidia phyX drivers? or just the display?


Today I came across a guide, and it said display and PhysX.


----------



## Masked

...Did either of you bother to pay attention to the fact that the only correlative denominator between you both was the single most important factor?

Your issue is the word BETA.

BETA drivers will not work 100% with any monitor application, especially one for which the monitor is not hard-written.

Try downloading an official whql.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> ...Did either of you bother to pay attention to the fact that the only correlative denominator between you both was the single most important factor?
> Your issue is the word BETA.
> BETA drivers will not work 100% with any monitor application especially for one in which the monitor is not hard-written.
> Try downloading an official whql.


I reverted back to the previous beta and now the problem seems solved.

I am just afraid that the latest WHQL is not so great for BF3,

but from now on I am sticking to WHQL.

It's been so long since we had one.


----------



## iARDAs

Quick question guys.

Are you comfortable having a 590 at around 80-85 degrees?

When I leave my fan on AUTO it is around 85 degrees under full load and the fan speed is about 60%.

If I use a custom profile I can take it down to the high 70s or 80, but then the fan speed goes up to around 75%, hence the GPU gets a bit louder.

Would I be perfectly safe at around 85 degrees if I use this card forever?


----------



## tonyjones

man quad sli gtx 590 is crazy


----------



## wardoc22

Quote:


> Originally Posted by *tonyjones*
> 
> man quad sli gtx 590 is crazy


Now where would you get the money for 4 GTX 590s and a big enough case?

EDIT: wait, I forgot that there are 2 cores in a 590. Lol, sorry.


----------



## Dennybrig

Masked, iARDAs, I already checked the driver version I have installed and it is 285.62; that is NOT a beta version, am I right?
I still faced the same issue today, only this time my cable swapping did not work and I had to connect my computer to my TV through the MOBO HDMI out, since my TV said "No Signal".

What do you guys think I should do?

Thanks in advance for your support


----------



## toX0rz

Quote:


> Originally Posted by *iARDAs*
> 
> Quick question guys.
> Are you comfortable having a 590 around 80-85 degrees?
> When i let my fan in AUTO it is around 85 degrees under full load and the fan speed is about 60%
> if i use a custome profile i can take it down to high 70s or 80 but than the fan speed gets more around 75% hence the GPU gets a bit louder.
> Would i be perfectly safe with around 85 degrees if i use this card forever?


I felt uncomfortable with it; that's why I made a custom fan profile via Afterburner.
In games I'm reaching temps between 75°C and 80°C with a 70% fan speed.
The axial fan of the 590 is relatively silent compared to the radial fans used on most reference cards, so 70% is still OK. I can't hear it when playing with a headset/headphones, which I always do anyway.
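A custom fan profile like the one described boils down to a temperature-to-speed lookup. A minimal sketch; the breakpoints below are assumptions for illustration, and only the ~70% speed in the 75-80°C gaming range comes from the post:

```python
# Rough shape of an Afterburner-style custom fan curve.
# Breakpoints are illustrative assumptions, not the poster's exact curve.
CURVE = [(60, 40), (70, 55), (75, 70), (85, 85)]  # (temp °C, fan %)

def fan_speed(temp_c):
    # Pick the speed for the highest breakpoint at or below temp_c.
    speed = CURVE[0][1]
    for threshold, pct in CURVE:
        if temp_c >= threshold:
            speed = pct
    return speed

print(fan_speed(78))  # 70, matching the 75-80°C gaming range above
```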


----------



## toX0rz

http://www.techpowerup.com/forums/showthread.php?t=158907

Has anyone seen this?

Some dude has modded the 290.53 driver in order to be able to SLI different cards together. The following combinations have already been tested and confirmed by some users:

GTX 460 + GTX 560 Ti
GTX 470 + GTX 560 Ti 448
GTX 580 + GTX 570

Anyone here that has a spare 580 left to test it out?
I'm pretty sure this would actually enable the 590 to run alongside a 580 in tri-SLI, which would be amazing.

The only downside would be that you'd be bound to this driver, unless he keeps updating it for newer drivers as well.


----------



## Masked

Quote:


> Originally Posted by *toX0rz*
> 
> http://www.techpowerup.com/forums/showthread.php?t=158907
> Has anyone seen this?
> Some dude has modded the 290.53 driver in order to be able to SLI different cards together. Following combinations have already been tested and approved by some users:
> GTX 460 + GTX 560 Ti
> GTX 470 + GTX 560 Ti 448
> GTX 580 + GTX 570
> Anyone here that has a spare 580 left to test it out?
> I'm pretty sure this will actually enable the 590 to run along with a 580 in tri-sli, would be amazing.
> Only downside would be that you'd be bound to this driver unless he will keep updating the newer drivers aswell.


This isn't true SLI and won't actually be compatible in real world apps.

The issue with the 590 is that each core is not an individual card, so even if you SLI, you're limited by the max bandwidth of the lesser card.

So in English, you're bottlenecking whatever the better card is.

It's also not a worthy risk, because when your better card ramps up, the lower one tries to catch up however it can.

I know what/how they're doing it and you will fry out a card, eventually.

This is why NVIDIA doesn't authorize it -- You blow out cards faster than the Patriots blew the super bowl...


----------



## Recipe7

Quote:


> Originally Posted by *Masked*
> 
> You blow out cards faster than the Patriots blew the super bowl...


----------



## ProfeZZor X

My new Hydro Copper 590 finally arrived today (along with my NZXT fan controller)!!! ...Damn, that sucker is heavier than I had expected. I just hope it lives up to its reputation and can handle the 3D content I'm hoping to get out of it.


----------



## emett

Hey Masked, the driver sweep and clean install worked! No more BF3 crashes. W00t, thx!


----------



## kzinti1

I've completely wiped and then redownloaded the drivers, sometimes 4 to 5 times a day, and I still keep losing the Nvidia Control Panel.

I click on it, it doesn't appear. I open the Task Manager and it's supposedly running, but nowhere to be seen.

This began happening with every driver set since, I think, ForceWare 275.33.

Sometimes I'm playing a game, usually RAGE or Hard Reset, glance up at the Afterburner display, and suddenly SLi isn't enabled. I close to the main screen and GPU-Z says SLi is enabled. A few of the times when the NCP actually appears, it sometimes says SLi is enabled, even though the SLi indicator says otherwise.

Yes, the bridge connectors are tight and the contacts clean, and I've tried different bridge cables with exactly the same intermittent results.

I've also tried a brand new ssd and full windows install. I have Malwarebytes PRO and Microsoft Security Essentials set to ignore anything Nvidia. When I install the drivers I perform a Custom Install, checking only the video drivers, PhysX and Clean Install. I have no use for 3D drivers or Nvidia sound drivers and prefer to update the drivers myself, so I don't install Nvidia Update.

Have any of you experienced the "Disappearing Nvidia Control Panel" before? If so, did you fix it? Permanently?

BTW, this also happens when I'm using a pair of MSI GTX580 Lightning XE's. And when I run just one card instead of SLi.

Personally, I think the Nvidia driver team has been infiltrated by some asswipe from AMD who is purposely sabotaging these latest drivers. Nvidia used to be most famous for their rock-solid drivers. Now people everywhere are complaining about their on-again, off-again drivers.


----------



## matherror

Hello guys. Can somebody give me the Zotac 590 edited BIOS? Thanks.


----------



## kzinti1

I spent a good deal of time yesterday (Sunday) erasing, downloading, and reinstalling the ForceWare 285.62 drivers.

In fact, I did it at least 6 times before the Nvidia Control Panel would appear so I could set up SLi and then set PhysX to run on my 3rd core instead of the CPU.

According to the NCP, PhysX was set to run on Core 3, but when I enabled both the SLi and PhysX indicators during benchmarks, PhysX was shown as being assigned to the CPU, even though the NCP showed it on Core 3. Or, as they show it, 2A: the 1st core of the 2nd card.

Also, going by the MSI Afterburner OSD, Cores 1, 2 and 4 were running at 0.938V while Core 3 was at 0.8750V.

The ASIC Quality test of GPU-Z also varies between the cores: 1=76.9%, 2=72.0%, 3=86.9%, 4=86.6%.

All (I think) I know about the ASIC Quality reading is that the lower the score, the more leakage your chip has. I'm wondering if my 1st card, with the most voltage leakage, should be RMA'd back to EVGA?

I'm also wondering if the disparity in ASIC Quality has anything to do with the problems I'm having with the NCP, with keeping SLi running, and with PhysX being shunted away from Core 3 and onto the CPU.

I don't remember the numbers, but there was a rather large discrepancy between the ASIC Quality of my pair of MSI GTX580 Lightning XEs.

Would someone please tell me what this is all about and what I can do to keep Quad-SLi running on these GTX590s? And about the problems I'm having with PhysX?

I can't even run FutureMark's 3DMark Vantage any more, since it crashes to a black screen just a few seconds into the 1st test.

Where the woman in the test should really start wearing a sports bra before those breasts start hanging down past her navel! (I've overheard women talking about how much it hurt when they 1st went jogging and didn't wear the proper support bra. It even hurts me just to look at that part of the program. I think they used real people to create that animation.)

Does FutureMark somehow make your cards run PhysX through the CPU instead of a GPU?

I know most of this post should really go on the FutureMark forums, but the people there seem not to fully understand English and have yet to give me any usable advice on anything I've asked there before. They're also completely dedicated fanboys of the FutureMark programs and know for a fact that their programs are never, ever at fault. They're just as bad as the fanboys on the Nvidia, MSI, EVGA and OCZ forums, who know for a fact that their products are also never at fault, so anybody with a problem with their products is just plain stupid, and sometimes they won't even answer a post. ASUS is also getting close to being the same way.

Sorry for such a long, rambling post. I have a doctors appointment today and can't sleep. This helps take my mind off of things. Probably the Valium, too.


----------



## Dennybrig

Guys, do you know if there is a backplate compatible with the Heatkiller X3 Hole Edition for the GTX 590? The bare back of the GTX 590 is fugly as hell.


----------



## iARDAs

Guys quick question

As I understand it, the cores of our cards have 1.5GB of VRAM each.

When SLI is enabled and the game supports SLI, does it use 3GB of VRAM, or are we still stuck at 1.5GB?

I am asking this because I read somewhere that a person had two 570s with 1.5GB of VRAM each, I believe, and even with SLI he always had 1.5GB, never 3.

Is he wrong? Or is our card different?


----------



## RushMore1205

Please add me; I have 3 EVGA GTX 590s, 1 Classified and 2 regular.

I love these cards; too bad they are no longer being made.


----------



## impalist

Apologies if this sounds relatively elementary, but are the three dual-DVI ports on the 590 identical, or are they meant for certain connections (I've heard things like SL-DVI and DL-DVI and was getting confused)?

I probably sound rather pedantic about this, as I'm not really in the know when it comes to these things. I guess the point of the question is whether my monitor's connection to the GPU is being fully optimised. I've been trying to find 590 schematics that could hopefully answer this for me, but to no avail.


----------



## derickwm

Quote:


> Originally Posted by *RushMore1205*
> 
> please add me, i have 3 of the gtx590 EVGA, 1classy, 2 regular
> i love these cards, too bad they are no longer being made


Ohhhh Rushmore








Quote:


> Originally Posted by *impalist*
> 
> Apologies if this sounds relatively elementary, are the three dual-DVI ports in the 590 identical or are they used for certain connections (heard things like SL-DVI and DL-DVIs and was getting confused)?
> I probably sound rather pedantic about this, as I'm not really in the know when it comes to these things - guess the point of the question is whether I'm using my monitor's connection to the GPU is being fully optimised. Been trying to look for 590 schematics which hopefully could've detailed an answer for me, but to no avail


Actually, surprisingly even to me, all three DVI ports are indeed DVI-D.


----------



## Recipe7

Quote:


> Originally Posted by *iARDAs*
> 
> Guys, quick question:
> As I understand it, each core on our cards has 1.5 GB of VRAM.
> When SLI is enabled and the game supports SLI, does it use 3 GB of VRAM, or are we still stuck at 1.5 GB?
> I'm asking because I read somewhere that a person had two 570s, each with 1.5 GB I believe, and even with SLI he always had 1.5 GB, never 3.
> Is he wrong, or is our card different?


Hi iARDAs, enjoying your 590?

To answer your question, just think of the 590 as two 570s in SLI. We know the 570 has 1.5 GB of VRAM, and the 590 operates the same way: it has two sets of 1.5 GB, which do not add up to 3 GB of usable memory.

Basically, the 590 is two cores, each with 1.5 GB of VRAM; not 3 GB total.
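To put that in plainer terms: under alternate-frame rendering (AFR) SLI, each GPU renders complete frames, so textures and buffers get mirrored into every GPU's memory and usable VRAM stays at the per-GPU figure. A tiny sketch of the arithmetic (the function name is mine, not from any real API):

```python
def effective_vram_gb(per_gpu_vram_gb, num_gpus):
    """Usable VRAM under AFR SLI: assets are mirrored on every GPU,
    so capacity does not scale with the GPU count."""
    assert num_gpus >= 1
    return per_gpu_vram_gb  # independent of num_gpus

# GTX 590: two GPUs with 1.5 GB each -> still 1.5 GB usable
print(effective_vram_gb(1.5, 2))  # 1.5
```

Same story for two 570s in SLI, or for quad-SLI 590s: the framebuffer count goes up, the usable capacity does not.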


----------



## ProfeZZor X

I am so looking forward to finishing this project. I'll be ordering more parts tomorrow.

Here's my proof of purchase:


----------



## iARDAs

Quote:


> Originally Posted by *Recipe7*
> 
> Hi iARDAs, enjoying your 590?
> To answer your question, just think of the 590 as two 570s in SLI. We know the 570 has 1.5 GB of VRAM, and the 590 operates the same way: it has two sets of 1.5 GB, which do not add up to 3 GB of usable memory.
> Basically, the 590 is two cores, each with 1.5 GB of VRAM; not 3 GB total.


Thank you so much for the response, my friend.

I'm enjoying my 590 so much, but I don't think games really take advantage of it. I'm talking about BF3, of course. Especially in 3D, the game's performance is awful. Other than that, the card is a beast.

I'm 95% sure I won't be buying a Kepler card, UNLESS a single-GPU card can be faster than the 590, which is a myth for now. Also, I doubt there will be a dual-GPU Kepler; if there is, it won't be soon.


----------



## Recipe7

Looks like we are both holding true to our word.

I don't play in 3D, so I wouldn't be disappointed in that department. However, I am able to play at a constant 60fps (vsync on) with everything maxed (including the FXAA injector mod) on BF3. Couldn't be happier.

I recently started playing the Diablo 3 beta. My GPU usage is around 50%... too easy for the 590, hehe.


----------



## Masked

Quote:


> Originally Posted by *Recipe7*
> 
> Looks like we are both holding true to our word.
> I don't play in 3D, so I wouldn't be disappointed in that department. However, I am able to play at a constant 60fps (vsync on) with everything maxed (including the FXAA injector mod) on BF3. Couldn't be happier.
> I recently started playing the Diablo 3 beta. My GPU usage is around 50%... too easy for the 590, hehe.


I've been in D3 for a very long while now, Alpha/Beta etc... 590s don't even bat an eye...

I don't see the need to get a new card with Kepler... Maybe after... But there's no need with the current market, truth be told.


----------



## Recipe7

Quote:


> Originally Posted by *Masked*
> 
> I've been in D3 for a very long while now, Alpha/Beta etc... 590s don't even bat an eye...
> I don't see the need to get a new card with Kepler... Maybe after... But there's no need with the current market, truth be told.


True.

The only reason to upgrade, as iARDAs said, is for a single-card solution that matches the 590 at stock clocks.

Personally, I'm running my rig on a 650 W PSU, so to get any performance over my 590 I would have to buy another PSU to handle CrossFire or SLI.

With my 1080p monitor there's absolutely no logical reason to upgrade from a 590 within the next two years, unless a single card more powerful than the 7970 comes out that I can SLI or CrossFire on my 650 W PSU... and I know that won't happen anytime soon.


----------



## iARDAs

@ Recipe7

I just did some searching in Nvidia's 3D Vision forums, and it seems Battlefield 3 has an issue in 3D itself. Although it's a 3D Vision-ready title, the engine apparently has problems.

I truly love the 2D performance in Battlefield 3, but even for that there should be better drivers.

However, it's very good to know that a 590 will pretty much max out any game over the next two years, no matter what's thrown at it, unless the game requires DirectX 11.1 features of course.

@ Masked

I agree with your assessment on 3D. Best is probably to skip Kepler for that purpose, but then again, if there is a single-GPU card that can outperform the 590 I might consider it. Then again, I might not.







The 3D market still needs a kick, I believe, as very few titles give you the full enjoyment of 3D; most have glitches that really put you off.

Sometimes I tell myself OK, no more 3D, it has annoying bugs in most games; but then I say, oh hell, I have a 3D Vision monitor, let me play games with it. I've been having this dilemma for the last month.

Still, I truly enjoy the 120 Hz capability of my screen and the power of my 590, which can give me 120 fps in most games, such as Skyrim.

Edit: *Ah shoot, Masked, you said D3 as in Diablo 3.* I thought you were talking about 3D from the beginning, lol, my mistake.







Anyway, at least the forum got my take on 3D with a 590.


----------



## ProfeZZor X

Quote:


> Originally Posted by *iARDAs*
> 
> Thank you so much for the response, my friend.
> I'm enjoying my 590 so much, but I don't think games really take advantage of it. I'm talking about BF3, of course. *Especially in 3D, the game's performance is awful.* Other than that, the card is a beast.
> I'm 95% sure I won't be buying a Kepler card, UNLESS a single-GPU card can be faster than the 590, which is a myth for now. Also, I doubt there will be a dual-GPU Kepler; if there is, it won't be soon.


This is not very encouraging to hear...







...seeing as I haven't even got my rig up and running yet. But like everyone's opinions here, I'll take it with a grain of salt upon experiencing the 590's 3DVision first hand.


----------



## RushMore1205

I'm going to let you guys in on a little secret: I witnessed a Kepler card that will cost $349 get higher scores in the Heaven benchmark than the 7970.


----------



## iARDAs

Quote:


> Originally Posted by *ProfeZZor X*
> 
> This is not very encouraging to hear...
> 
> 
> 
> 
> 
> 
> 
> ...seeing as I haven't even got my rig up and running yet. But like everyone's opinions here, I'll take it with a grain of salt upon experiencing the 590's 3DVision first hand.


Don't get me wrong, the 590 is a beast in 3D,

but Battlefield 3 unfortunately has issues in 3D Vision.

Outdoor scenes are capped at 40 FPS.

You can play it on ULTRA or LOW settings, but outdoor scenes are always at 40 fps.

No issues indoors, though.

On the Nvidia forums it has been discussed that this is an issue with the engine.

Even people running two 580s have this issue.


----------



## iARDAs

Quote:


> Originally Posted by *RushMore1205*
> 
> I'm going to let you guys in on a little secret: I witnessed a Kepler card that will cost $349 get higher scores in the Heaven benchmark than the 7970.


Is that going to be a high-end Kepler?

Also, how does the 7970 compare to the 590?


----------



## Masked

Quote:


> Originally Posted by *RushMore1205*
> 
> I'm going to let you guys in on a little secret: I witnessed a Kepler card that will cost $349 get higher scores in the Heaven benchmark than the 7970.


I've witnessed a more expensive card crush the 7970.


----------



## Recipe7

Quote:


> Originally Posted by *iARDAs*
> 
> @ Recipe7
> I just did some searching in Nvidia's 3D Vision forums, and it seems Battlefield 3 has an issue in 3D itself. Although it's a 3D Vision-ready title, the engine apparently has problems.
> I truly love the 2D performance in Battlefield 3, but even for that there should be better drivers.
> However, it's very good to know that a 590 will pretty much max out any game over the next two years, no matter what's thrown at it, unless the game requires DirectX 11.1 features of course.


Well then, no problem in our department, hehe.
Quote:


> Originally Posted by *Masked*
> 
> I've witnessed a more expensive card crush the 7970


Do tell!


----------



## emett

I must say I play a LOT of games in 3D and never have an issue, BF3 included; I turn some settings down and it's smooth as anything.
*edit* I just checked: I was on Caspian Border in 3D on medium settings and it was running @ 60fps. There were a lot of times it went into the 40s, but it's definitely not capped at 40.


----------



## iARDAs

Quote:


> Originally Posted by *emett*
> 
> I must say I play a LOT of games in 3D and never have an issue, BF3 included; I turn some settings down and it's smooth as anything.
> *edit* I just checked: I was on Caspian Border in 3D on medium settings and it was running @ 60fps. There were a lot of times it went into the 40s, but it's definitely not capped at 40.


The 40 fps thing, I believe, happens only outdoors when there is a lot of sun.

I can play BF3 perfectly in the METRO levels indoors, for example, but outdoors it does take an fps hit and lingers around 40.

Which drivers are you on?


----------



## iARDAs

Question:

What type of AA do you guys prefer with your 590?

MSAA?

CSAA?

QCSAA?

And I believe there is a newer AA method, FXAA.

I believe not every game supports every type of AA; if not, can we enable it via the Nvidia Control Panel?


----------



## ProfeZZor X

Quote:


> Originally Posted by *iARDAs*
> 
> Don't get me wrong, the 590 is a beast in 3D,
> but Battlefield 3 unfortunately has issues in 3D Vision.
> Outdoor scenes are capped at 40 FPS.
> You can play it on ULTRA or LOW settings, but outdoor scenes are always at 40 fps.
> No issues indoors, though.
> On the Nvidia forums it has been discussed that this is an issue with the engine.
> Even people running two 580s have this issue.


Playing a lot of 3D content is specifically why I bought this card in the first place, versus settling for two 580s. If push comes to shove, I may end up getting another one sometime down the line in hopes that it'll help fix the problem, just to be on the safe side; I hope that won't be the case, though. Playing video games in 3D is one thing, but playing feature-length 3D movies is another. And I certainly hope the 590 can hold up to that...


----------



## Arizonian

Quote:


> Originally Posted by *ProfeZZor X*
> 
> Playing a lot of 3D content is specifically why I bought this card in the first place, versus settling for two 580's. But if push comes to shove, I may end up getting another one sometime down the line in hopes that that'll help fix the problem, just to be on the safe side. But, I hope that won't be the case. Playing video games in 3D is one thing, but playing feature length 3D movies is another. And I certainly hope the 590 can hold up to that...


I may have misunderstood you, but watching 3D Blu-ray movies can be done with any Nvidia GPU; it doesn't affect watching a 3D movie.

It's 3D Vision gaming where GPU performance comes into play.

If you haven't purchased one yet, the best 3D movie out right now is Transformers: Dark of the Moon. It's downright spectacular. Movies shot in 3D are much better than movies converted from 2D to 3D.


----------



## Dennybrig

Masked/Rushmore:

I know you may not be allowed to tell us anything about the Kepler cards you witnessed, but let me ask you a straight question:
Do you think it makes sense to sell our GTX 590s now to get the new ones? I have two watercooled GTX 590s.
Please let me know, and thanks for everything.


----------



## emett

This is a quote from Masked on the last page.

"I don't see the need to get a new card with Kepler...Maybe after...But, there's no need with the current market, truth be told."


----------



## toX0rz

Quote:


> Originally Posted by *iARDAs*
> 
> Is that going to be a high-end Kepler?
> Also, how does the 7970 compare to the 590?


Judging by the current rumours floating around: no.
The card he is talking about is the mid-range (not high-end) GK104 Kepler, which is supposedly faster than the HD 7970.

As for how the 7970 compares to the GTX 590: stock for stock, the GTX 590 is around 20-25% faster.
If you OC the 7970 to around ~1200 MHz they perform about equally, and most 7970s can do 1200-1300 MHz.
I don't know how it compares to an overclocked 590, though.

That said, I'm almost inclined to say that from today's standpoint an HD 7970 would be the smarter choice, looking at all the driver limitations the GTX 590 has gone through. The 7970 also has twice the amount of usable VRAM.

Quote:


> Originally Posted by *Dennybrig*
> 
> Masked/Rushmore:
> I know you may not be allowed to tell us anything about the Kepler cards you witnessed, but let me ask you a straight question:
> Do you think it makes sense to sell our GTX 590s now to get the new ones? I have two watercooled GTX 590s.
> Please let me know, and thanks for everything.


I'd like to know that too.
If it truly turns out that GK104 is faster than the 7970 (which would put it around 590 level) at a $300-350 price point, it will make the GTX 590 obsolete and literally CRIPPLE its resale value.
It may be wise to get rid of the 590 as fast as possible IF the rumours turn out to be true, but there's too little evidence yet.
Some insight from insiders who know more would be nice







even if it's just a small hint


----------



## iARDAs

Honestly, if a new Kepler comes out for around 350 US dollars that outperforms the 590, I will be so angry.

I doubt something like that will happen, but hey, this is the business world.

The ATI HD 7970 seems a smart buy as well since it's a single GPU, but I still prefer the ~25% performance advantage at stock speeds.


----------



## Masked

I had this whole thing typed out and then...I said screw it.

You know what I'm going to say, all of you do.

I'll do a list of the facts of the 590 pros and cons...Which, you all know what's on there.

I'm not "blowing" another NDA this week after 5entinel already started a fun little brood-war between my office and Blizzard...Gotta love some of the mods here.

Kepler is coming out soon, make up your own minds.

I will say, if nothing currently challenges your card on this market...Why pay more to switch to something in which the gap is even greater?


----------



## ProfeZZor X

Quote:


> Originally Posted by *Arizonian*
> 
> I may have misunderstood you, but watching 3D Blu-ray movies can be done with any Nvidia GPU; it doesn't affect watching a 3D movie.
> It's 3D Vision gaming where GPU performance comes into play.
> If you haven't purchased one yet, the best 3D movie out right now is Transformers: Dark of the Moon. It's downright spectacular. Movies shot in 3D are much better than movies converted from 2D to 3D.


I have an arsenal of 3D movies in my collection. Both on blu-ray, and in digital form for my EVO 3D. And yes, Transformers: Dark of the Moon is one of them.

I would eventually like to get into the 3D gaming world in the next couple of months, so I'll have to research what's out there that appeals to me. I'll take your word that the only issue with the 3DVision will be with the fps on games. But honestly, since I don't have any 3D games as of yet, I suppose I shouldn't concern myself until I cross that bridge.


----------



## Dennybrig

Well said, Masked. I'd rather wait another two years to change them; after all, I just spent $246 to watercool my cards, so that would frustrate me a little bit....

But anyhow, thanks for the insight, Masked. Only here with you fellow overclockers can one really find answers to tough questions about the 590.

Kudos


----------



## Masked

Quote:


> Originally Posted by *Dennybrig*
> 
> Well said, Masked. I'd rather wait another two years to change them; after all, I just spent $246 to watercool my cards, so that would frustrate me a little bit....
> But anyhow, thanks for the insight, Masked. Only here with you fellow overclockers can one really find answers to tough questions about the 590.
> Kudos


The 590 will ALWAYS have issues unless Nvidia has an epiphany... and I don't see them having an epiphany... ANYTIME soon.

So, essentially, we're left with what you've got right now.

Kepler will come close; I can't say anything more... If you'd like to ditch the issues of SLI etc., more power to you... If not, don't.

Your resale value will drop regardless, as it always does -- these are things you guys know.

It's up to you how you want to tackle those issues, not for me to tell you how...

Your card is more than capable of handling anything currently on the market and will be capable of handling anything for a good while yet.

When/if it isn't, use that as your excuse to bump up, but -- it's going to be a while


----------



## iARDAs

Masked, thank you for the tips. You're spot on, that's for sure.

I want to ask you one question, though, since it seems you know your stuff.

Have you experienced DirectX 11.1?

Some people say it's just software, and others say you need new hardware.

Do we really need new hardware for DX11.1? If yes, is it something great that we should think about?


----------



## ProfeZZor X

Quote:


> Originally Posted by *Masked*
> 
> The 590 will ALWAYS have issues unless Nvidia has an epiphany... and I don't see them having an epiphany... ANYTIME soon.
> So, essentially, we're left with what you've got right now.
> Kepler will come close; I can't say anything more... If you'd like to ditch the issues of SLI etc., more power to you... If not, don't.
> Your resale value will drop regardless, as it always does -- these are things you guys know.
> It's up to you how you want to tackle those issues, not for me to tell you how...
> *Your card is more than capable of handling anything currently on the market and will be capable of handling anything for a good while yet.*
> When/if it isn't, use that as your excuse to bump up, but -- it's going to be a while


This is all the more reason why I'd be content with the 590 I have now, rather than chase waterfalls. After all, if what you say is true about this GPU's resale value, then it's just an incentive for me to pick up one or two more 590s to add to my collection. And chances are those cards combined will outperform the latest and greatest card everyone is chasing, which might not even perform as well or have any significant substance.

...That's just my worthless two cents.


----------



## Masked

Quote:


> Originally Posted by *ProfeZZor X*
> 
> This is all the more reason why I'd be content with the 590 I have now, rather than chase waterfalls. After all, if what you say is true about this GPU's resale value, then it's just an incentive for me to pick up one or two more 590s to add to my collection. And chances are those cards combined will outperform the latest and greatest card everyone is chasing, which might not even perform as well or have any significant substance.
> ...That's just my worthless two cents.


You cannot tri-SLI 590s -- you can only SLI two 590s, for a total of 4 GPUs.


----------



## toX0rz

Interesting stuff, kudos to you Masked









Yeah, I'm just generally skeptical, seeing how I already almost manage to max out the 590's VRAM at 1080p in some games.
I have a feeling 1.5 GB will not be enough to keep me above water for the next few months, and I'd imagine Kepler will OC way better too, seeing how Nvidia crippled this card's OC potential with driver limitations.

Not sure what to do


----------



## ProfeZZor X

Quote:


> Originally Posted by *Masked*
> 
> You cannot tri-SLI 590s -- you can only SLI two 590s, for a total of 4 GPUs.


Oops... Did I say "tri"?

I'll give the market about six months from now and see where the 590 stands. If prices go south, I'll most likely have another one in my rig before the end of the year... and that should be more than enough to keep me in the black for a good long time.


----------



## Calado90

Oh my god, my ASUS GTX 590 just sucks. I flashed the card to try some OC, but even at 0.968 V the card doesn't handle 670 MHz on the core.
It just sucks.


----------



## rush2049

Well, I just about saw $700 burn up in smoke....

I was fiddling with my hard drives, preparing for a new SSD that's on the way,
so I was moving SATA cables around. I finished what I was doing and booted back up. I waited for all the hard drives to be detected (6 SATA devices) and rebooted as Windows told me to after it installed drivers for them.

Then I was puttering around my desktop making sure everything was OK when I started to notice weird graphical glitches, like windows not refreshing when I moved them. I happened to glance at my GPU meter and it said one core was at 75C and the other at 80C, with the fan running at 100%. That was weird, because I normally idle at 35 or 36C.

I opened Afterburner: the fan said it was going at 100%, but the cores were not cooling off -- actually they were getting hotter.....

I opened GPU-Z and it confirmed the GPU temps were in fact that high, which was scary because at this point they were nearing 90C. But I did notice something: GPU-Z said my fan was in fact not spinning at all, at 0 rpm.

At this point I panicked and shut down!

I opened the case back up, and it turns out one of the SATA cables going to an e-SATA port on the back must have bunched up a bit and caught on one of the 590's fan blades, so my fan wasn't spinning at all.....

Did a bit of re-routing, and now I'm back to 35C idle temps... I thought it was a goner for a second....

edit: posted this here as well because it's a 590 story
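For anyone who wants an early warning against this kind of failure, temperature and fan duty can be polled on a loop with something like `nvidia-smi --query-gpu=temperature.gpu,fan.speed --format=csv,noheader`. A minimal sketch of the alert logic in Python (the baseline and margin are my guesses; tune them to your own card):

```python
def idle_temp_alarm(temp_c, baseline_c=36, margin_c=20):
    """Flag a core temperature far above the usual idle baseline,
    e.g. when a stray cable has stopped the fan (thresholds are guesses)."""
    return temp_c > baseline_c + margin_c

print(idle_temp_alarm(80))  # True  -> investigate: fan may be stalled
print(idle_temp_alarm(36))  # False -> normal idle
```

The reported fan *duty* can read 100% while the blades are physically blocked, as happened here, so a temperature-based check catches cases a fan-speed check would miss.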


----------



## iARDAs

That's some scary stuff, bro.

Glad you sorted it out.

I honestly wouldn't want to be in your shoes at that moment.

If my 590 died, I would be mad.


----------



## kzinti1

It seems the manufacturers could add a grill over these fans without going bankrupt from the 2 or 3 pennies (or even less) it would cost per card. They could mold it into the covers.


----------



## Dennybrig

Damn, Rush, I don't want to be in your shoes either....
I've been a little panicked too, since yesterday I started putting the watercooling blocks on my video cards, and those are some heavy m*therf*ckers...

I don't know if my Asus Maximus IV Gene-Z will support the weight of two cards with the waterblocks, but let's see what happens!!

I am just waiting for:

* Compression fittings from EK (the cheapest available)
* PrimoFlex UV-reactive tubing
* Heatkiller Dual Link SLI connector (to connect the waterblocks together)
* 280 mm radiator from Koolance (I will use that to cool one of the video cards; the other one and the processor will be handled by the XSPC RS360 radiator I have on top of the case, a HAF X)
* NZXT 8-pin PSU cable extender (the cables from my OCZ ZX 1000W were not long enough to reach the processor)
* Logisys UV cold cathodes

Guys, I have been following this thread since the early days and I'm finally about to finish my machine!!! I can't believe it..

Can you believe I started with a MicroATX case and just one GTX 590?? These kinds of threads can take you down some very weird paths....

But anyhow, thanks for your support guys!

I will post pictures as soon as I finish it (I hope to next week)


----------



## Mongol

^^^ They are heavy.

There is some serious sag that I'm in the process of remedying. I know you can use solid tubes to prop the top card up, but I actually came up with this:



It eliminates the sag of the top card, and I have another piece of carbon fiber that I'm going to shape to prop up the bottom card.


----------



## Dennybrig

Wow, nice mod, bro, it looks really good. You know, I think the Dual Link will do the trick for me, at least for the upper card; it looks like this:

http://www.frozencpu.com/products/14821/ex-blc-1035/Heatkiller_GPU-X_GPU-X_X_Dual_Link_Bridge_Block_-_1_Sot_10191.html

Check it out; it's for Heatkiller blocks.


----------



## Dennybrig

By the way *********, one question: how many radiators, and what type, do you use to cool the cards?

Please let me know! I've been looking for someone who has two watercooled GTX 590s!


----------



## Tuthsok

Quote:


> Originally Posted by *Dennybrig*
> 
> By the way *********, one question: how many radiators, and what type, do you use to cool the cards?
> Please let me know! I've been looking for someone who has two watercooled GTX 590s!


Hi,

I use two of these to cool a pair of EVGA 590 Hydro Coppers and an i7 990X:

http://www.koolance.com/water-cooling/product_info.php?product_id=2034

These keep all my GPU and CPU cores below 50C when running everything at 100% full load (Folding@home on all CPU and GPU cores).


----------



## Dennybrig

Hey Tuthsok, thanks for the advice!! I will try to cool them and my processor with one XSPC RS360 and a 280 mm rad from Koolance; do you think that will be enough??


----------



## Tuthsok

Quote:


> Originally Posted by *Dennybrig*
> 
> Hey Tuthsok, thanks for the advice!! I will try to cool them and my processor with one XSPC RS360 and a 280 mm rad from Koolance; do you think that will be enough??


Depending on what you are doing, that should probably be fine, or at least enough to start with and get an idea of your needs. You can always adjust later if things run hotter than you want.

When I first put in my second 590, I was running the CPU and both cards on just one of the 3x120 radiators I mentioned. Surprisingly, it held its own under moderate loads. But this was during fall, when ambient temperatures were cooler, and I did not do any folding or run games such as Metro 2033 at max settings until I added the second radiator.
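As a rough sanity check on radiator sizing, a common rule of thumb (an approximation, not a spec) budgets on the order of 100 W of heat per 120 mm radiator section at moderate fan speeds:

```python
import math

def rad_sections_needed(heat_watts, watts_per_120mm=100.0):
    """Rule-of-thumb count of 120 mm radiator sections for a heat load.
    The 100 W/section figure is a loose convention, not a datasheet value."""
    return math.ceil(heat_watts / watts_per_120mm)

# i7-990X (~130 W) plus two GTX 590s (365 W TDP each) at full load:
print(rad_sections_needed(130 + 2 * 365))  # 9
```

Nine 120 mm sections for the worst case squares with why a single 3x120 ran warm under heavy load, and why a second radiator (or keeping loads moderate) made the difference.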


----------



## Forty-two

So I just updated the driver on my ASUS ENGTX590 from the stock 267.91 to 285.62. I ran 3DMark11 right before I made the change and then tried to run it right after, and got this error message:

3DMark 11 has found DirectX 11 system files, but your graphics adapter reports that it supports only Direct X 9_3.

Does anyone know what went wrong and how to fix it? Rolling the driver back didn't fix it.


----------



## MKHunt

Quote:


> Originally Posted by *Forty-two*
> 
> So I just updated the driver on my ASUS ENGTX590 from the stock 267.91 to 285.62. I ran 3DMark11 right before I made the change and then tried to run it right after, and got this error message:
> 3DMark 11 has found DirectX 11 system files, but your graphics adapter reports that it supports only Direct X 9_3.
> Does anyone know what went wrong and how to fix it? Rolling the driver back didn't fix it.


Did you do a clean install? Nvidia drivers are notoriously bad at overwriting themselves.


----------



## Dennybrig

OK, Tuthsok!!! Wow, you were really brave putting one CPU and four GPUs on a single 360 rad!!! But it's good news knowing it did not explode under that load!!

I hopefully will assemble the entire rig on Monday.


----------



## RushMore1205

Quote:


> Originally Posted by *Dennybrig*
> 
> OK, Tuthsok!!! Wow, you were really brave putting one CPU and four GPUs on a single 360 rad!!! But it's good news knowing it did not explode under that load!!
> I hopefully will assemble the entire rig on Monday.


One 360 rad will handle anything; what we do is such overkill, man.


----------



## piskooooo

How are the drivers for this? I can get a Hydro Copper version for $700 which seems amazing but I dunno.


----------



## mironccr345

Hey guys, I was looking into upgrading from my 2GB 460s to one GTX 590. Will one GTX 590 be worth the upgrade? I run Nvidia Surround and I think one 590 will do the job. I figured it wouldn't hurt to ask before I made a purchase. Thanks for the help in advance!


----------



## squishysquishy

Hello Gents; I am done.

Front View


Cable Management Side


The Belly of The Beast


Closeup of Water Cooling


GPU Temps


Bunny


BTW, does this count as my proof?


----------



## iARDAs

Quote:


> Originally Posted by *piskooooo*
> 
> How are the drivers for this? I can get a Hydro Copper version for $700 which seems amazing but I dunno.


You mean the Nvidia drivers? Well, they're fine for me. It's just that we've been waiting about 4 months for Nvidia to release WHQL drivers, with no luck so far. Since this is an SLI card, there might be issues with newer games, but eventually they get sorted out. With my 590 I've never run into a problem that couldn't be fixed. They say Alan Wake has issues with Nvidia in general and there is no SLI profile, but I'm guessing that will be sorted out in the next patch.

Quote:


> Originally Posted by *mironccr345*
> 
> Hey guys, I was looking into upgrading from my 2GB 460s to one GTX 590. Will one GTX 590 be worth the upgrade? I run Nvidia Surround and I think one 590 will do the job. I figured it wouldn't hurt to ask before I made a purchase. Thanks for the help in advance!


The 590 is a great card for Nvidia Surround, or perhaps a 580 or 570 SLI setup. It will have 1.5 GB of VRAM per core, though. Anyhow, a 590 will beat 460 SLI easily, so go for it; or if you really want, wait for Kepler. However, if you're a person like me who just wants the GPU now, go for the 590 for sure. I love it personally, though I'm not running Surround. I wish I were; I just don't have the space for it.


----------



## iARDAs

Quote:


> Originally Posted by *Crackheadkid*
> 
> Hello Gents; I am done.
> Front View
> 
> Cable Management Side
> 
> The Belly of The Beast
> 
> Closeup of Water Cooling
> 
> GPU Temps
> 
> Bunny
> 
> BTW, does this count as my proof?


Your watercooling tubing looks like an Amazonian green poisonous snake crawled up in your case waiting for its prey.

Other than that, nice setup. What are your temps under load, like when playing BF3?


----------



## squishysquishy

Quote:


> Originally Posted by *iARDAs*
> 
> Your watercooling tubing looks like an Amazonian green poisonous snake crawled up in your case waiting for its prey.
> Other than that, nice setup. What are your temps under load, like when playing BF3?


Don't mock my green tubing; it looks fantastic.

At 100% load running 3DMark11 for 25 minutes, temps leveled off at 59C on GPU1 and 69C on GPU2, according to EVGA Precision. I might have the patience to redo the thermal paste, but not for a while.

Overall, I am happy. Beats the 85C I was getting at half power before liquid cooling.


----------



## Masked

Quote:


> Originally Posted by *Crackheadkid*
> 
> Don't mock my green tubing; it looks fantastic.
> At 100% load running 3DMark11 for 25 minutes, temps leveled off at 59C on GPU1 and 69C on GPU2, according to EVGA Precision. I might have the patience to redo the thermal paste, but not for a while.
> Overall, I am happy. Beats the 85C I was getting at half power before liquid cooling.


I have a 480 + a 360 sandwiched together -- never goes above 35C -- verrrrry happy.

Re-Doing my double wide soon enough -- Will have pics









Expecting even lower temps -- 30C at load is my prediction.

ALSO

I regret selling my other personal 590 -- if someone knows someone with an EVGA 590 Hydro Copper who's willing to part with it, please contact me -- thanks.

I have a 1.5GB 580 Hydro Copper that I'm willing to let go, plus some serious cash.


----------



## iARDAs

Quote:


> Originally Posted by *Crackheadkid*
> 
> Don't mock my green tubing; it looks fantastic.
> At 100% load running 3DMark11 for 25 minutes, temps leveled off at 59C on GPU1 and 69C on GPU2, according to EVGA Precision. I might have the patience to redo the thermal paste, but not for a while.
> Overall, I am happy. Beats the 85C I was getting at half power before liquid cooling.


Nice temps.

Stock speeds give me 85C under full load with the fan spinning at 75%.

I don't need to overclock, so I'm happy.


----------



## Dennybrig

Well, Rushmore, maybe you are right, but as one fellow overclocker once said, if you use only one 360 rad, you won't want to touch the hose with a 10-foot pole.

I think it would run too hot.


----------



## Dennybrig

Crackheadkid, congratulations on your rig!!!

I imagine it was a pain in the a*s to put it all together, since it looks REAAAAAALLY tight in there.

Badass nonetheless.


----------



## Dennybrig

I know of a guy who is selling his GTX 590 without a watercooling block. It is the EVGA GTX 590 and it is in perfect condition (I have seen it working).

Would you be interested in his card?

Please let me know


----------



## piskooooo

Quote:


> Originally Posted by *iARDAs*
> 
> You mean the Nvidia drivers? Well, they are fine for me. It's just that we have been waiting for Nvidia to release WHQL drivers for 4 months in general, but no luck so far. Since this is an SLI card there might be issues in newer games, but eventually they are sorted out. With my 590 I have never run into trouble that couldn't be fixed. They say Alan Wake has issues with Nvidia in general and there is no SLI profile, but I am guessing that this will be sorted out in the next patch.


Thanks, I'll probably get it then.


----------



## Tuthsok

Quote:


> Originally Posted by *Dennybrig*
> 
> Well, Rushmore, maybe you are right, but as one fellow overclocker once said, if you use only one 360 rad you will not want to touch the hose "with a 10-foot pole".
> I think it will run too hot.


That was the reason I kept loads down on my system while it only had the one rad in the loop after adding the second 590.

While the single 3x120 radiator would still keep the CPU and GPUs below air-cooled temperatures when I ran some quick benchmark tools, the coolant temperature went way higher than my comfort level would allow.
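For anyone weighing one rad against two, the back-of-envelope behind this can be sketched in a few lines. The per-section dissipation figure and the component wattages below are rough assumptions for illustration, not measured specs:

```python
# Back-of-envelope coolant heat-load check. The w_per_section figure
# is an assumed rule of thumb, not a spec for any particular radiator.

def loop_headroom(heat_sources_w, rad_sections, w_per_section=100):
    """Compare total loop heat against estimated radiator capacity.

    heat_sources_w -- list of component heat loads in watts
    rad_sections   -- total number of 120 mm radiator sections
    w_per_section  -- assumed watts one 120 mm section sheds at a
                      modest fan speed and a ~10 C water/air delta
    """
    total = sum(heat_sources_w)
    capacity = rad_sections * w_per_section
    return total, capacity, capacity - total

# Two GTX 590s (~365 W TDP each) plus an overclocked CPU (~150 W)
# on a single 3x120 radiator:
total, cap, headroom = loop_headroom([365, 365, 150], rad_sections=3)
print(total, cap, headroom)   # 880 W of heat vs ~300 W of capacity
```

Under these assumed numbers, a single 3x120 is several hundred watts short, which matches the experience above: temperatures stay below air cooling, but the coolant runs hot.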


----------



## PCModderMike

Quote:


> Originally Posted by *mironccr345*
> 
> Hey guys, I was looking into upgrading my 460's 2GB for one GTX 590. Will one GTX 590 be worth the upgrade? I run Nvidia Surround and I think one 590 will do the job. I figured it wouldn't hurt to ask before I made a purchase. Thanks for the help in advance!


Uh oh, stepping up to a 590!? That's a nice upgrade man







You thinking of buying the EVGA version with the block already installed? Or gonna go with a reference design and put your own block on? Oh, and do you think it will fit in your RV02 with the RX360 down there?


----------



## mironccr345

Quote:


> Originally Posted by *PCModderMike*
> 
> Uh oh, stepping up to a 590!? That's a nice upgrade man
> 
> 
> 
> 
> 
> 
> 
> You thinking of buying the EVGA version with the block already installed? Or gonna go with a reference design and put your own block on? Oh, and do you think it will fit in your RV02 with the RX360 down there?


Haha, thanks bro, but I'm shopping around. Definitely an EVGA version, but I'd take the Asus version too. I'll more than likely get the one with a reference cooler and then get a waterblock later down the road. I don't think it'll fit with my RX, so I'll have to switch it out for an EX 360 rad... or a 540 rad!


----------



## iARDAs

Guys, I have a Zotac version of this card.

What does installing a block do exactly?

It's only to do with water cooling, I suppose?


----------



## Shinobi Jedi

590 Pimps -

Forgive me if this has been covered; I tried searching RagingCain's BIOS thread and the last dozen pages but can't find the info I thought I read on here from Masked or Rush.

But in trying to pin down why my rig still crashes to desktop in SWTOR no matter what driver I try, I noticed that my voltage is 0.875 on GPU 0 and 0.913 on GPU 1.

My BIOS versions register as the same ones listed in Cain's BIOS thread under the following:

EVGA - Stock HydroCopper BIOS (Patched):
Version: 70.10.37.00.92 (GPU0) / 70.10.37.00.93 (GPU1)
630 Core / 1260 Shader / 864 Mem
GPU0 - 0.925v / GPU1 - 0.925v
Max Voltage = 1.063v

But I'm running those BIOS on the AC 590 Classified LEs...

TBH, I don't remember flashing Cain's modified BIOS (though I am drawing a blank), and if not, then these are the BIOS from EVGA to unlock the fan profile, I assume, because that's the only flashing I remember doing.

But if that's the case, shouldn't my voltage at least be running at 0.925? Or did I read somewhere on here that the newer drivers locked the voltage back down to 0.913? That's what I'm unsure of and confused about.
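For what it's worth, this kind of mismatch is easy to sanity-check in a script. A minimal sketch -- the tolerance and the example readings are assumptions, and `flag_voltage_mismatch` is a hypothetical helper, not a tool from this thread:

```python
# Compare each GPU's reported voltage against the BIOS default and
# flag mismatches. Tolerance and readings are illustrative only.

BIOS_DEFAULT_V = 0.925

def flag_voltage_mismatch(readings, expected=BIOS_DEFAULT_V, tol=0.005):
    """Return the GPUs whose reported voltage differs from the
    expected BIOS default by more than tol volts."""
    return {gpu: v for gpu, v in readings.items()
            if abs(v - expected) > tol}

# Readings like the ones described above -- both cores land below
# the 0.925 V BIOS default, so both get flagged:
print(flag_voltage_mismatch({"GPU0": 0.875, "GPU1": 0.913}))
```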

And is this my problem with the SWTOR crashes to desktop? It seems to be the only issue I'm having; every other game runs full blast, flawless.

The only other tricks that seem to help SWTOR are running it in WinXP compatibility mode or disabling SLI. Employing both did reduce the crashes, but they still occur.

Another interesting thing the game does that I don't understand: when playing in SLI with 4 GPUs going, the space-mission fps don't go above 70. If I disable SLI I get a constant 111-120fps.

The only other potential issue with my system is that my X58 mobo has a dead DIMM slot, so I've only been running 8GB of RAM and just haven't felt like tearing my rig apart to RMA it until, ideally, Ivy comes out and I do an upgrade, then slap my X58 into a backup build once it gets fixed.

But maybe that's causing my issues? I really have no idea as most of my knowledge comes from enthusiast hardware and rig assembling and I really don't know jack about programming.

Still love my cards, though. The only thing that would make me consider Kepler is if 3D 120Hz IPS displays that can do 2560x1600 come out and the 590's 1.5GB VRAM limit has trouble running games full blast at that resolution. Otherwise, I'm quite happy with what I have, playing 120Hz 1080p.

(Though I still haven't forgotten what Cain said months ago about how good movies can look at that res. That alone has me considering picking up a standard 2560x1600 display, just not really expecting to use it as a gaming display, and keeping my current BenQ for that.)

Thanks to all for taking the time to read this and any help offered.

ETA: Forgot to say, I've also been using the latest version of Driver Sweeper in safe mode between driver switches, along with Phyxion's registry-cleaning program, so I think (hope) my registry should be cool.

Cheers!


----------



## kzinti1

Same here. On my 1st card, core 2 runs at a lower voltage than core 1, and both cores of my 2nd card. I'm getting random crashes and hard freezes while playing RAGE, and no problems at all with Hard Reset or Quake 4, except for the same voltage difference in core 2 of the 1st card. Using stock gpu BIOS and ForceWare 285.62.

Core 1 = 0.9380V.

Core 2 = 0.8750V.

Core 3 = 0.9380V. (PhysX enabled)

Core 4 = 0.9380V.


----------



## iARDAs

The new WHQL drivers are out, guys. Give them a try.


----------



## kzinti1

Quote:


> Originally Posted by *iARDAs*
> 
> The new WHQL drivers are out, guys. Give them a try.


Where? Not at Nvidia or GeForce. Yet.


----------



## iARDAs

Quote:


> Originally Posted by *kzinti1*
> 
> Where? Not at Nvidia or GeForce. Yet.


http://www.nvidia.co.uk/object/win7-winvista-64bit-295.73-whql-driver-uk.html

Out in the UK.

They did not update all the regions, it seems.

They also just popped up on Guru3d.com, but they named the file wrong: it should be 295.73 and they have it as 295.53, but the file itself is actually 295.73.

http://www.guru3d.com/news/geforce-forceware-29553-64bit-whql-download/

This is the link, though I am sure it will be edited once they realize their mistake.


----------



## kzinti1

Thanks, iARDAs. I'm looking forward to your testing the new drivers for us.


----------



## iARDAs

Quote:


> Originally Posted by *kzinti1*
> 
> Thanks Iardas. I'm looking forward to your testing the new drivers for us.


Haha, I will for sure.

Once the international version is up.


----------



## Shinobi Jedi

You can get the download link through the U.S. site if you let it automatically detect your system, but the manual search link has not been updated yet.

I'm going to try them.

Good Times...


----------



## iARDAs

These drivers are AWESOME in BF3 for me.

Ultra settings. I get a minimum of 60fps with no micro stuttering or anything like that.

Smooth as butter.

I strongly recommend these drivers.


----------



## Wogga

Why no SWTOR SLI profile? =(
Shinobi Jedi, I'm getting crashes too, and right after a crash only one of the four GPUs continues to "work"; the other 3 sit at 0% load and 0.875v, and nothing can make them work again except a reboot. It happens roughly after 2-2.5 hours of playing, and the first symptom is disappearing textures.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Wogga*
> 
> Why no SWTOR SLI profile? =(
> Shinobi Jedi, I'm getting crashes too, and right after a crash only one of the four GPUs continues to "work"; the other 3 sit at 0% load and 0.875v, and nothing can make them work again except a reboot. It happens roughly after 2-2.5 hours of playing, and the first symptom is disappearing textures.


Wow. That is exactly how it goes down for me. Exactly...

Sorry you're getting it too, but I feel a lot better knowing it's not just me, especially with how exact the crash pattern is. I didn't realize that was the pattern until you pointed it out, and when you did I felt like I was staring into a mirror.

Well, maybe we can get them to fix it if we raise enough hell?

Or if not, maybe we either bug EVGA to write an SLI profile like they used to, or hope one of the software Jedis around here can write us one up.

But I agree, it's really disappointing that there's no SWTOR SLI profile...

ETA: I just put a post on the EVGA boards under the sticky for this new driver, asking Jacob to see if the EVGA SLI Enhancement team could write up a proper SLI profile for SWTOR. Hopefully we'll get a positive response!


----------



## Image132

ARGH! I hate Nvidia. Why do they put DisplayPort on their cards if their drivers don't support it? I mean, what the hell. Worst part is, their last drivers (285.62) supported it. They are going backwards!

Now I have to use DVI


----------



## Masked

Quote:


> Originally Posted by *Image132*
> 
> ARGH! I hate Nvidia. Why do they put DisplayPort on their cards if their drivers don't support it? I mean, what the hell. Worst part is, their last drivers (285.62) supported it. They are going backwards!
> Now I have to use DVI


I mentioned the connection thing at a meeting we had a few weeks back -- it's basically a driver issue involving an override on the DisplayPorts -- I don't know what's up with that but hopefully it'll get fixed soon.


----------



## Shinobi Jedi

Everyone on the new driver thread on the EVGA boards is blaming the game for the issues we're having in SWTOR, and advising us to just unsub...

My gaming notebooks have been getting some neglect, I'm going to try to play it a little more extensively on those and see if I get any similar issues.

I was getting game crashes on my notebooks, but I wasn't sure if they might be due to temps. The Asus G73's 5870M GPUs run hot, and SWTOR gave me my first overheat since I had my notebook GPU RMA'd and repasted with IC Diamond.

Mass Effect 3 comes out on 3/6. That will definitely pull me away from SWTOR and hopefully let my 590s stretch their legs a little (or a lot) more than running SWTOR through one GPU does.


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> Everyone on the new driver thread on the EVGA boards is blaming the game for the issues we're having in SWTOR, and advising us to just unsub...
> My gaming notebooks have been getting some neglect; I'm going to try to play it a little more extensively on those and see if I get any similar issues.
> I was getting game crashes on my notebooks, but I wasn't sure if they might be due to temps. The Asus G73's 5870M GPUs run hot, and SWTOR gave me my first overheat since I had my notebook GPU RMA'd and repasted with IC Diamond.
> Mass Effect 3 comes out on 3/6. That will definitely pull me away from SWTOR and hopefully let my 590s stretch their legs a little (or a lot) more than running SWTOR through one GPU does.


People on the EVGA boards are generally morons -- Why I don't go there anymore.

In fact, I don't even visit their website anymore.

In the case of the 590, it's an SLI issue, and the same goes for those that have SLI (hint -- about 80% of that forum)... A fix is coming, but it's in NVIDIA's hands.


----------



## Recipe7

Quote:


> Originally Posted by *Masked*
> 
> I mentioned the connection thing at a meeting we had a few weeks back -- it's basically a driver issue involving an override on the DisplayPorts -- I don't know what's up with that but hopefully it'll get fixed soon.


Has the recent WHQL driver fixed the displayport problem?


----------



## iARDAs

Hey folks, a question.

I had this before with a beta driver and had it once with the last WHQL:

while gaming, my core loads fall from 100% to 30 or 40%.

Why would that happen?

My FPS falls from 70 to the 30s and I need to quit the game.

Any ideas?
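One way to pin this down is to log core load over time and flag sustained drops, which makes it obvious whether it's throttling/OCP or a one-off driver hiccup. A minimal sketch of just the drop detection, with illustrative thresholds:

```python
# Flag stretches where GPU load falls from a "normal" level to a
# sustained low level. The 90%/50% thresholds and the minimum run
# length are illustrative assumptions, not measured cut-offs.

def find_load_drops(samples, normal=90, dropped=50, min_len=3):
    """Return (start, end) index ranges where load fell from at least
    'normal' to below 'dropped' for at least min_len samples."""
    drops, start = [], None
    was_normal = False
    for i, load in enumerate(samples):
        if load >= normal:
            # load recovered: close any sufficiently long drop run
            if start is not None and i - start >= min_len:
                drops.append((start, i))
            start, was_normal = None, True
        elif load < dropped and was_normal:
            if start is None:
                start = i
    # close a drop still open at the end of the log
    if start is not None and len(samples) - start >= min_len:
        drops.append((start, len(samples)))
    return drops

# Core pegged near 100%, then stuck in the 30-40% range:
print(find_load_drops([99, 100, 98, 35, 40, 33, 38]))  # [(3, 7)]
```

Fed with load samples from any monitoring tool (EVGA Precision logs, for instance), this would show whether the 30-40% stretches line up with the FPS drops.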


----------



## kzinti1

Here's a fairly interesting thread at Nvidia's Forums: http://forums.nvidia.com/index.php?showtopic=196244&st=0&gopid=1372178&#entry1372178

Well worth a read IMO. The OP is dated 24 March 2011 and my post, the last one, is #30.

I guess that indicates how much Nvidia actually cares about our problems with these cards.

They have our money so I guess we can all just go and piss off.


----------



## kzinti1

I just noticed that with the new drivers, 295.73, the voltage has dropped down to 0.9250 max., from 0.9380.

I installed a new copy of PCMark 7 and the scores dropped by about 6 points with the new drivers at the same settings.

If you haven't tried the new drivers yet, please run some benches before *and* after installing them.

Also try your regular OC settings and whatever your stock settings are/were.

As usual, there isn't one single optimization for any game I own and none for any game I even want to own.

BTW, the Nvidia Control Panel still disappears every single time I restart my computer.


----------



## Wogga

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> Wow. That is exactly how it goes down for me. Exactly...
> Sorry you're getting it too, but I feel a lot better knowing it's not just me. Especially with how exacting the crashing occurs. I didn't realize that was the pattern until you pointed it out. And when you did I felt like I was staring into a mirror.
> Well, maybe we can get them to fix it if we raise enough hell?
> Or if not, maybe either bug EVGA to write an SLI profile like they used to or hope one of the Software Jedi's around here can write us one up.
> But I agree, it's really disappointing that there's no SWTOR SLI profile...
> ETA: I just put a post on the EVGA boards under the sticky for this new driver asking Jacob to see if the EVGA SLI Enhancement team could write up a proper SLI profile for SWTOR.. Hopefully we'll get a positive response!


I've found only one way to make it run longer than 2-2.5 hours. I play in fullscreen, and every time I notice something weird with the textures I just alt+tab and then continue playing until the next texture glitch. Another symptom is when the loading screen looks like a random skill icon stretched from 100x100 pixels to 1920x1200; this can lead to a driver crash if you aren't quick enough to hit the Windows key or alt+tab.
I should try to check video memory usage, but I always forget about it because of EV and HH HMs.








Also, I haven't tried playing with SLI turned off.


----------



## Masked

The new drivers are such garbage.

I crash in BF3 after 20 mins and the voltage is so low it's ridiculous.

So much for "we're addressing those issues"...

What *******.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> The new drivers are such garbage.
> I crash in BF3 after 20 mins and the voltage is so low it's ridiculous.
> So much for "we're addressing those issues"...
> What *******.


I have not crashed in BF3 so far. Did you crash at stock speeds or with an OC?

Maybe these new drivers are not OC friendly, since they lowered the voltage.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> I have not crashed in BF3 so far. Did you crash at stock speeds or with an OC?
> Maybe these new drivers are not OC friendly, since they lowered the voltage.


Stock -- I gave up on the OC the other night.

Still crash every 20 minutes...Driver recovers -- Forced to restart.


----------



## toX0rz

so they lowered the max voltage again?


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Stock -- I gave up on the OC the other night.
> Still crash every 20 minutes...Driver recovers -- Forced to restart.


Ouch, that sucks bro.

I am sorry you are having a bad experience with these drivers.


----------



## Dennybrig

Guys, I need your help. Yesterday I did the first leak test on my liquid cooling setup and found no leaks, but the problem is that some parts of the tubing are completely devoid of water (imagine some mega-bubbles in the tubing). My setup is:

Pump and Reservoir (XSPC 750) > CPU block (XSPC) > 2 GTX 590s in quad SLI > 280mm radiator > 360mm radiator

Please let me know what to do to completely fill the entire loop. The reservoir is full and the pump is working, but for some reason the tubing has those mega-bubbles.

Thanks in advance


----------



## rush2049

With the custom firmware, the latest drivers didn't change a thing voltage wise..... just saying....


----------



## YP5 Toronto

Quote:


> Originally Posted by *Dennybrig*
> 
> Guys, I need your help. Yesterday I did the first leak test on my liquid cooling setup and found no leaks, but the problem is that some parts of the tubing are completely devoid of water (imagine some mega-bubbles in the tubing). My setup is:
> Pump and Reservoir (XSPC 750) > CPU block (XSPC) > 2 GTX 590s in quad SLI > 280mm radiator > 360mm radiator
> Please let me know what to do to completely fill the entire loop. The reservoir is full and the pump is working, but for some reason the tubing has those mega-bubbles.
> Thanks in advance


You will need to shake, rattle and roll the whole setup to get those air bubbles to move. Not much help can be provided as we can't see or touch your setup.

Edit: Create a separate thread on your problem.


----------



## Dennybrig

Thanks, I will post pictures so that you can see what I mean.


----------



## Dennybrig

One thing, though: shouldn't the water replace the air in the loop by itself? Why do I need to shake the setup?


----------



## Masked

Quote:


> Originally Posted by *Dennybrig*
> 
> One thing, though: shouldn't the water replace the air in the loop by itself? Why do I need to shake the setup?


Your water pressure isn't high enough to push through the air pocket -- you shake or rock every system because of this... always have, always will.
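The rough numbers behind the "shake it" advice can be worked out with P = rho * g * h. The pump head figure below is an assumed example, not the spec of any particular unit:

```python
# Static pressure of a water column vs. an assumed pump head rating.
# A pump's max head is the tallest water column it can push; a trapped
# air pocket near the top of the loop eats into that margin.

RHO_WATER = 1000.0   # density of water, kg/m^3
G = 9.81             # gravitational acceleration, m/s^2

def column_pressure_pa(height_m):
    """Static pressure of a water column of the given height (rho*g*h)."""
    return RHO_WATER * G * height_m

# Suppose the pump is rated for about 1.2 m of head (an assumption):
pump_pa = column_pressure_pa(1.2)
print(round(pump_pa))   # ~11772 Pa, roughly 0.12 bar
```

With flow losses through two GPU blocks, a CPU block, and two radiators already spent against that head, there is little pressure left to dislodge a pocket, so tilting or shaking the case lets buoyancy carry the bubble to the reservoir instead.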


----------



## Dennybrig

Guys, just to let you know that I gave my entire case a 360-degree turn and the air pockets are no longer in the tubing! Thanks for the advice, but I'm afraid the air pockets might now be inside one of my waterblocks or radiators. Is there any way to know this?


----------



## Masked

Quote:


> Originally Posted by *Dennybrig*
> 
> Guys, just to let you know that I gave my entire case a 360-degree turn and the air pockets are no longer in the tubing! Thanks for the advice, but I'm afraid the air pockets might now be inside one of my waterblocks or radiators. Is there any way to know this?


If you start "losing" water and have no leaks, then you have a pocket in the top of your rad...


----------



## YP5 Toronto

You will need to continue to fill, shake, rattle and roll. Do you have a reservoir?


----------



## Dennybrig

I have the XSPC 750 pump and reservoir combo; it's a bay unit...
OK, I think I will have to repeat the rolling and shaking... I'm leak testing with the components inside the case now; by 10pm I will know what it looks like.


----------



## Dennybrig

How can one know when one's system is completely bubble-free? Is it just visual?


----------



## ProfeZZor X

Quote:


> Originally Posted by *Dennybrig*
> 
> How can one know when one's system is completely bubble-free? Is it just visual?


I'd also like to know this too...

That, and how does one properly flush the water block of impurities on these types of GPUs?


----------



## Dennybrig

Guys, last dumb question! When I do the famous rattling and shaking, do I have the pump running or shut off?


----------



## YP5 Toronto

Quote:


> Originally Posted by *Dennybrig*
> 
> Guys, last dumb question! When I do the famous rattling and shaking, do I have the pump running or shut off?


both.


----------



## Dennybrig

What? I dont understand


----------



## Masked

Quote:


> Originally Posted by *Dennybrig*
> 
> What? I dont understand


You shake it while running and then do it while it's not -- It's a fairly easy concept - Both.


----------



## Dennybrig

Cmon man why the rude attitude


----------



## YP5 Toronto

Quote:


> Originally Posted by *Dennybrig*
> 
> Cmon man why the rude attitude


Not rude... it's a straight answer. You asked on or off; I said both. Very simple. As noted before, you should start a separate thread.


----------



## Masked

Quote:


> Originally Posted by *Dennybrig*
> 
> Cmon man why the rude attitude


Since when is sarcasm an attitude?

Lol...


----------



## Dennybrig

Ok bye


----------



## Arizonian

Though I'm not in the club, I've followed it since its release, and any discussion about the 590 is the right place for anything a 590 owner might have.

Denny, you're in the right place; just ignore comments that don't completely iterate the answer to your question, because one person doesn't speak for the club. I can attest that most of the 590 club members are very good people and have always been willing to help where their knowledge could assist another 590 member.








Can't we all just get along?


----------



## Recipe7

This isn't a loop thread, yet the 590 club delivers.


----------



## rush2049

590 club is fairly exclusive.... we can handle questions of any type... lol


----------



## Recipe7

Haha, right on... I feel special now


----------



## Dennybrig

Guys, I regret to inform you of very, very sad news indeed. As I finished my leak testing yesterday, I was extraordinarily happy when the time came to start testing with my computer turned on.

Everything went fine, but my computer did not recognize the second video card, and I said, "Well, it has to be that it does not have the drivers for it yet." So I started downloading the drivers, and the most bizarre thing happened. The computer shut itself down and then turned on again, but when it did, something fried on my slot 1 video card -- I saw smoke coming out of it... I immediately turned the computer off, but I think it was too late.

You don't know how bad I feel about this; I have been trying to set up this whole mess of water cooling and purchased a bunch of stuff so that I could be ready to watercool my system, and this happens....

Hard facts:

* I was using an OCZ ZX 1000W PSU (suggested to me by a fellow overclocker who has the same configuration as mine; it has worked fine for him for like a year now). Do you think the PSU was responsible for this?
* NO LEAKS were present when I did the testing, and my loop was completely well contained
* When I took apart the card that fried (I removed the waterblock), I found that there were close to no remains of thermal paste in the GPU area of the waterblock. I feel like the waterblock was not even touching the GPU right
* When I reconnected the video card to my computer with the air cooler, the LED logo of the video card started blinking (this did not happen before)

Guys, what do you think I did wrong?

Can the video card be repaired by someone?

Thanks for your support; I'm a little down about this.


----------



## Smo

Quote:


> Originally Posted by *Dennybrig*
> 
> Guys, I regret to inform you of very, very sad news indeed. As I finished my leak testing yesterday, I was extraordinarily happy when the time came to start testing with my computer turned on.
> Everything went fine, but my computer did not recognize the second video card, and I said, "Well, it has to be that it does not have the drivers for it yet." So I started downloading the drivers, and the most bizarre thing happened. The computer shut itself down and then turned on again, but when it did, something fried on my slot 1 video card -- I saw smoke coming out of it... I immediately turned the computer off, but I think it was too late.
> You don't know how bad I feel about this; I have been trying to set up this whole mess of water cooling and purchased a bunch of stuff so that I could be ready to watercool my system, and this happens....
> Hard facts:
> * I was using an OCZ ZX 1000W PSU (suggested to me by a fellow overclocker who has the same configuration as mine; it has worked fine for him for like a year now). Do you think the PSU was responsible for this?
> * NO LEAKS were present when I did the testing, and my loop was completely well contained
> * When I took apart the card that fried (I removed the waterblock), I found that there were close to no remains of thermal paste in the GPU area of the waterblock. I feel like the waterblock was not even touching the GPU right
> * When I reconnected the video card to my computer with the air cooler, the LED logo of the video card started blinking (this did not happen before)
> Guys, what do you think I did wrong?
> Can the video card be repaired by someone?
> Thanks for your support; I'm a little down about this.


Can I just get this straight: was this the first time you powered on the machine after building it or something? I ask because you say you were leak testing it (without the machine on?), then were happy to turn it on, and when you did, the machine didn't recognise the second 590.

I can only assume that the 'shut down and turning itself back on' was a driver crash, probably due to the primary card overheating.

I would assume that there was possibly no flow, or a large enough air pocket that it fried your top card (or, like you say, no TIM on the waterblock).

Were you monitoring the temperatures at all?


----------



## Masked

Quote:


> Originally Posted by *Arizonian*
> 
> Though I'm not in the club, I've followed it since its release, and any discussion about the 590 is the right place for anything a 590 owner might have.
> Denny, you're in the right place; just ignore comments that don't completely iterate the answer to your question, because one person doesn't speak for the club. I can attest that most of the 590 club members are very good people and have always been willing to help where their knowledge could assist another 590 member.
> 
> 
> 
> 
> 
> 
> 
> Can't we all just get along?


Why is everyone pointing at me -- it's a very simple concept of water pressure... If a big enough air pocket develops, the continual pressure on the back end cannot push through the pocket because of its residual difference with the location -- so you shake it and the bubble isn't "stuck" anymore.

That's literally like 6th grade science -- I wasn't being mean, I was simply pointing out exactly what to do... Besides the fact that the above should be common sense, I haven't been mean to anyone in a very long time... Not about to start with random person 01.
Quote:


> Originally Posted by *Dennybrig*
> 
> Guys, I regret to inform you of very, very sad news indeed. As I finished my leak testing yesterday, I was extraordinarily happy when the time came to start testing with my computer turned on.
> Everything went fine, but my computer did not recognize the second video card, and I said, "Well, it has to be that it does not have the drivers for it yet." So I started downloading the drivers, and the most bizarre thing happened. The computer shut itself down and then turned on again, but when it did, something fried on my slot 1 video card -- I saw smoke coming out of it... I immediately turned the computer off, but I think it was too late.
> You don't know how bad I feel about this; I have been trying to set up this whole mess of water cooling and purchased a bunch of stuff so that I could be ready to watercool my system, and this happens....
> Hard facts:
> * I was using an OCZ ZX 1000W PSU (suggested to me by a fellow overclocker who has the same configuration as mine; it has worked fine for him for like a year now). Do you think the PSU was responsible for this?
> * NO LEAKS were present when I did the testing, and my loop was completely well contained
> * When I took apart the card that fried (I removed the waterblock), I found that there were close to no remains of thermal paste in the GPU area of the waterblock. I feel like the waterblock was not even touching the GPU right
> * When I reconnected the video card to my computer with the air cooler, the LED logo of the video card started blinking (this did not happen before)
> Guys, what do you think I did wrong?
> Can the video card be repaired by someone?
> Thanks for your support; I'm a little down about this.


There is only 1 possibility...You somehow grounded your card.

How do you ground your card? Water was on the card OR you didn't put the block on properly.

It can't be anything else -- I happen to be the king of blowing cards up and been there, done that -- Nvidia gave me a T-Shirt somewhere (not joking).

I've found that regardless if the water is non-conductive or conductive, if there's enough spread out over a distance, it still allows a ground. -- I've tested this at length and...In my book, it's a fact.

Bottom line, it's either or...If there's no water on the card, you didn't put the block on right.

That's the only way what you explained could have happened, because the overheat cap is at 100C. You say it smoked, which means it sparked out, which means it's dead.

It was obviously an accident but, it still happened.

I'm leaning towards you not having put the block on properly -- this is what most people get wrong the first few times -- maybe you didn't use thermal paste where you should have... Who knows; I can't see it, but the card, for all intents and purposes, is dead.

If you'd like to call me, I can walk you through how to get an RMA; I'll more than happily do that. But if you have anything but an EVGA and say what happened above, your warranty will be denied.


----------



## Dennybrig

OK, Masked, thanks for the insight. You know, I was also thinking that I did something wrong with the waterblock, but the computer started and everything ran fine for the first minutes.
I was monitoring the temps of course, and they were in the 60s when this happened.
I think something went wrong; it's kind of sad. Maybe my little watercooling experiment was done with too much dough on the line (I mean, each card costs a frakking $600).

But well, I think that life goes on and maybe watercooling is not for me. I don't know; maybe I'm just too stupid to put the waterblock on, even though I followed the instructions to the letter.

From what I understand of your response, Masked, there is no way this card can come back to life again.


----------



## Masked

Quote:


> Originally Posted by *Dennybrig*
> 
> OK, Masked, thanks for the insight. You know, I was also thinking that I did something wrong with the waterblock, but the computer started and everything ran fine for the first minutes.
> I was monitoring the temps of course, and they were in the 60s when this happened.
> I think something went wrong; it's kind of sad. Maybe my little watercooling experiment was done with too much dough on the line (I mean, each card costs a frakking $600).
> But well, I think that life goes on and maybe watercooling is not for me. I don't know; maybe I'm just too stupid to put the waterblock on, even though I followed the instructions to the letter.
> From what I understand of your response, Masked, there is no way this card can come back to life again.


No, you fried out the secondary caps by the VRMs -- it's as dead as a doornail.

I strictly water-cool now and my first time, I friend out a 5k$ PC ~ It happens to everyone.

I don't mean to be a...blank...about this but, if you had put the block on right, it wouldn't be dead atm.

Thermal contact would not have caused this card to die at 60c -- It's just not possible --

Personally, I think it's more likely that you got water on the card, didn't know it and just fried it out that way -- that's much more likely than putting it on wrong...BUT, if you mounted it at an angle and something was making contact -- Same end.

ALSO, if you put an aftermarket back-plate on the card and didn't use the spacers -- The ground would only occur when you stressed the card, because for the voltage to carry over, it would have to be "stressed"...


----------



## YP5 Toronto

Did you leak test the system prior to powering up the entire system? (just run the pump and watch for leaks over an extended period of time)

Did you eventually get all the air out?


----------



## Dennybrig

Yes, I did the leak testing before putting power to the system.
And no, there is no 100 percent way to be sure you don't have air in the system.


----------



## YP5 Toronto

Quote:


> Originally Posted by *Dennybrig*
> 
> Yes, I did the leak testing before putting power to the system.
> And no, there is no 100 percent way to be sure you don't have air in the system.


Yes there is...but I'm not going to argue. (we are talking about air pockets and not tiny bubbles)


----------



## Arizonian

Quote:


> Originally Posted by *Masked*
> 
> Why is everyone pointing at me --


I didn't mean you at all Masked. You've always been very helpful not just on this club thread but many others. Sorry for any confusion.







All good?


----------



## Dennybrig

Damn guys, I can't get over it!
I keep thinking about what I did wrong!!!!!!

Damn, I'm afraid of putting my other GTX 590 under its respective waterblock again. I think I will stick with air cooling...

You know, I can take risks with the CPU (since it's not more than 200 bucks) but the GTX 590....
I don't know man...

DAMN IT!


----------



## MKHunt

Do you have pics of the water block? It might be a pre-revision water block which would cause poor contact on everything but the inductors. If you tightened it enough for contact, the card would flex a tremendous amount which could also cause problems.


----------



## Dennybrig

The waterblocks are Heatkiller X3s.
By the way, on the EVGA GTX 590 Classified edition video card, where can I find the serial number besides the sticker on the backplate?
Is it marked on the PCB somewhere?


----------



## squishysquishy

Quote:


> Originally Posted by *Dennybrig*
> 
> Guys, I need your help. Yesterday I did the first leak test on my liquid cooling setup and found no leaks, but the problem is that some parts of the tubing are completely devoid of water (imagine some mega-bubbles in the tubing). My setup is:
> Pump and Reservoir (XSPC 750) > CPU block (XSPC) > 2 GTX 590s in quad SLI > 280mm radiator > 360mm radiator
> Please let me know what to do to completely fill the entire loop. The reservoir is full and the pump is working but for some reason the tubing has those mega-bubbles.
> Thanks in advance


I had a similar problem; you need to 'air out' your system. I opened the (extra) ports on my radiator and let the water backfill until I had to cap it, ran the pump, then repeated. After a while the pump sucks all the excess air out just with the flow of water.

Otherwise I have no clue how you are going to fix that. ^__^


----------



## kzinti1

Quote:


> Originally Posted by *MKHunt*
> 
> Do you have pics of the water block? It might be a pre-revision water block which would cause poor contact on everything but the inductors. If you tightened it enough for contact, the card would flex a tremendous amount which could also cause problems.


This is *exactly* what I've been asking about my own pair of cards. Bought 3 months apart. How can I tell what waterblocks are the proper ones for my cards? All of the Product Descriptions only say that they're for "Reference Cards", whatever the Hell that means!


----------



## MKHunt

Quote:


> Originally Posted by *kzinti1*
> 
> This is *exactly* what I've been asking about my own pair of cards. Bought 3 months apart. How can I tell what waterblocks are the proper ones for my cards? All of the Product Descriptions only say that they're for "Reference Cards", whatever the Hell that means!


You can tell by looking down through the fan. If you can see grey cubes with a large 'C' logo on them, you have the newer hardware that requires more clearance. I know for a fact that EK and Koolance (Rev 1.1) waterblocks fit those cards. Others, I have no idea.


----------



## Dennybrig

What do you mean by "the new revision needs more clearance"? Do you mean that it is "farther" from making contact between the waterblock and the video card components?

Dude, you know this is an addiction when you are checking the GTX 590 owners club while attending a wedding, as I am right now!

Best regards to all by the way and thanks for all the support shown so far!


----------



## kzinti1

Quote:


> Originally Posted by *MKHunt*
> 
> You can tell by looking down through the fan. If you can see grey cubes with a large 'C' logo on them, you have the newer hardware that requires more clearance. I know for a fact that EK and Koolance (Rev 1.1) waterblocks fit those cards. Others, I have no idea.


Thank you. That's the answer I've been looking for all of these months.


----------



## Shinobi Jedi

There's a guy on the EVGA 500 boards in one of the 590 threads venting his frustrations with the voltage lockdown on the cards.

He is ratifying Masked's statements and saying that only he, and the few like him who know how to write drivers and are writing drivers themselves, are really tapping into this card's potential. And, I believe (don't quote me here, I'm going off of memory) that with a voltage of 0.975 they are able to hit base 580 overclocks..

Really makes me wish I knew how to write drivers...

Unfortunately, I'm just a gamer who knows how to put objects in slots, connect the right cables and tighten screws.

Poor Dennybrig pretty much scared me off of self-building a watercooled rig, for, hmmm.... like EVER! I freely admit that if I ever get a watercooled desktop I'll most likely get a pre-built one and pay the PC gaming version of the Apple Tax, for the convenience of not having to go through the headaches or time lost to break my watercooled-rig-building cherry, plus the warranty that goes with it.

I don't know, maybe if I watched one being built in front of me where I could ask questions, etc. I'd attempt it, or maybe hire some legit dude here to come build it with me and walk me through it. Otherwise, I'll just deal with the fan noise and slap on some gaming headphones when it's too intrusive. I had a biznatch of a time just cleaning the dust out of my 590's fans.

What's everybody's verdict on the new drivers?


----------



## Recipe7

They are fine with me.

I'm only playing BF3, D3, SC2, and Dota 2. I haven't crashed since I installed them, running with a 650 core, yey.


----------



## Wogga

The 295.73 drivers are fine for me. Either BioWare really improved SWTOR performance or the drivers did it.


----------



## iARDAs

I am loving the latest WHQL with all of my heart as well.


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> There's a guy on the EVGA 500 boards in one of the 590 threads venting his frustrations with the voltage lockdown on the cards.
> He is ratifying Masked's statements and saying that only he, and the few like him who know how to write drivers and are writing drivers themselves, are really tapping into this card's potential. And, I believe (don't quote me here, I'm going off of memory) that with a voltage of 0.975 they are able to hit base 580 overclocks..
> Really makes me wish I knew how to write drivers...
> Unfortunately, I'm just a gamer who knows how to put objects in slots, connect the right cables and tighten screws.
> Poor Dennybrig pretty much scared me off of self-building a watercooled rig, for, hmmm.... like EVER! I freely admit that if I ever get a watercooled desktop I'll most likely get a pre-built one and pay the PC gaming version of the Apple Tax, for the convenience of not having to go through the headaches or time lost to break my watercooled-rig-building cherry, plus the warranty that goes with it.
> I don't know, maybe if I watched one being built in front of me where I could ask questions, etc. I'd attempt it, or maybe hire some legit dude here to come build it with me and walk me through it. Otherwise, I'll just deal with the fan noise and slap on some gaming headphones when it's too intrusive. I had a biznatch of a time just cleaning the dust out of my 590's fans.
> What's everybody's verdict on the new drivers?


Wait till you see The Return of The Sith custom we're building









Gonna knock your socks off.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> Wait till you see The Return of The Sith custom we're building
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna knock your socks off.


Masked...

Wow!!

That is Super-Intergalactic-Pimp-Jedi...

Whether you end up making that an available product, or not, you can put me down as a customer. I was thinking of going AW just because I figured they had Pimps like you behind the curtains and I love my M11x.

But now that I see you've got your own operation going, that's out the window.

Congrats on the new venture! I've got a vibe it's going to be very successful


----------



## MKHunt

Quote:


> Originally Posted by *Dennybrig*
> 
> Wwhat do you mean with "the new revision needs more clearance"? do you mean that it is "farther" from making contact between the waterblock and the video card components?
> Dude, you know this is an addiction when you are checking the GTX 590 owners club while attending a wedding as i am right now!
> Best regards to all by the way and thanks for all the support shown so far!


The later revision cards have new inductors that are taller. If you have an old revision water block, the inductors will prevent the block from making good contact on the cores, VRAM chips, VRM MOSFETS, and NF200 chip. Basically everything but the inductors.


----------



## mironccr345

Hello everyone, I just bought myself a 590 and after running some tests, it gets a little toasty. I was wondering if the back plate will still be compatible if I were to add an EK water block?


----------



## MKHunt

Quote:


> Originally Posted by *mironccr345*
> 
> Hello everyone, I just bought myself a 590 and after running some tests, it gets a little toasty. I was wondering if the back plate will still be compatible if I were to add an EK water block?


EKWB screws are too large to fit the stock back plate. Koolance works 100% with the stock plate.

Of course, the EK plate is pretty good looking, though an extra $40.


----------



## NismoTyler

Quote:


> Originally Posted by *MKHunt*
> 
> EKWB screws are too large to fit the stock back plate. Koolance works 100% with the stock plate.
> Of course, the EK plate is pretty good looking, though an extra $40.


Do GTX 590 owners worry about 40 bucks?


----------



## mironccr345

Quote:


> Originally Posted by *MKHunt*
> 
> EKWB screws are too large to fit the stock back plate. Koolance works 100% with the stock plate.
> Of course, the EK plate is pretty good looking, though an extra $40.


Wow, that looks awesome! Guess I'll be getting one of those too! Thanks for the reply!


----------



## Tuthsok

Just an FYI for anyone considering the Koolance block for these cards, as I notice it's still being mentioned/suggested as an option: it seems to have been archived now as a discontinued product.

http://www.koolance.com/product_archive/product_info.php?product_id=1146


----------



## MKHunt

Quote:


> Originally Posted by *Tuthsok*
> 
> Just an FYI for anyone considering the Koolance block for these cards as I notice it still being mentioned/suggested as an option. It seems to have been archived now as a discontinued product.
> http://www.koolance.com/product_archive/product_info.php?product_id=1146


I'll continue to mention it because I have one NIB sitting in the Koolance box with Koolance tape just waiting for a new home


----------



## PCModderMike

Quote:


> Originally Posted by *MKHunt*
> 
> EKWB screws are too large to fit the stock back plate. Koolance works 100% with the stock plate.
> Of course, the EK plate is pretty good looking, though an extra $40.


Yes, it certainly does look good, wow


----------



## kzinti1

Quote:


> Originally Posted by *NismoTyler*
> 
> Do GTX 590 owners worry about 40 bucks?


Only when they think the original backplate looks a lot better.


----------



## YP5 Toronto

Speaking of backplates...it's been a few generations since I last did GPU cooling, but the backplates on my 2 590s are HOT. I am sure this is normal, just wanted to double check with you guys.

I have even thought of putting some heatsinks on the backplate to help dissipate heat.


----------



## Masked

They shouldn't be THAT hot but, if you only have 1 rad and you're pushing the cards above 40c ~ They will be.


----------



## YP5 Toronto

Quote:


> Originally Posted by *Masked*
> 
> They shouldn't be THAT hot but, if you only have 1 rad and you're pushing the cards above 40c ~ They will be.


I have 3 x XSPC EX 480s with Triebwerk TF-121 fans

but the backplates get quite hot during gaming.


----------



## Masked

Quote:


> Originally Posted by *YP5 Toronto*
> 
> I have 3 x XSPC EX 480s with Triebwerk TF-121 fans
> but the backplates get quite hot during gaming.


Are we talking 30c? 40c? ~ "Quite hot" isn't really giving me an idea, considering I more than likely see "quite hot" as something significantly hotter than what you're describing...


----------



## iARDAs

Hey folks

If I lower my 2500K's CPU overclock from 4.5GHz to 4.0GHz, would I bottleneck my 590?


----------



## ProfeZZor X

Quote:


> Originally Posted by *YP5 Toronto*


What is that white gelatinous-looking stuff settled inside the bottom of your reservoir?


----------



## YP5 Toronto

Quote:


> Originally Posted by *ProfeZZor X*
> 
> What is that white gelatinous-looking stuff settled inside the bottom of your reservoir?


A temporary filter...synthetic mesh. It didn't quite turn out the way I wanted it to. It will be coming out when I add a couple of items to the build.


----------



## emett

iARDAs, no, it will be fine even at the stock 3.3GHz.


----------



## iARDAs

Quote:


> Originally Posted by *emett*
> 
> iARDAs, no, it will be fine even at the stock 3.3GHz.


Thank you bro.


----------



## Smo

The 590 is gone. Sorry fellas - I've moved to the Red team. Could well be back if Kepler is a blinder though!


----------



## iARDAs

Quote:


> Originally Posted by *Smo*
> 
> The 590 is gone. Sorry fellas - I've moved to the Red team. Could well be back if Kepler is a blinder though!


Which GPU did you get?


----------



## Masked

Quote:


> Originally Posted by *Smo*
> 
> The 590 is gone. Sorry fellas - I've moved to the Red team. Could well be back if Kepler is a blinder though!


You know...Considering the "670" is already rumored to be better than the 7970 ~ I find it interesting that people are swapping in droves.

Especially given the driver issues ATI fans have had lately ~ I would've thought you guys would wait...

Very interesting.


----------



## Smo

Quote:


> Originally Posted by *iARDAs*
> 
> Which GPU did you get?


I decided on 2x MSI 7970s.

Quote:


> Originally Posted by *Masked*
> 
> You know...Considering the "670" is already rumored to be better than the 7970 ~ I find it interesting that people are swapping in droves.
> Especially given the driver issues ATI fans have had lately ~ I would've thought you guys would wait...
> Very interesting.


The only real reason is I want to give them a try mate, that's all. Like I say - if the Kepler cards are better performers then I'll be back. I just want to dip my toe in the water so that I have some experience with AMD GPUs. I also find driver issues interesting considering that not everybody is affected by the same issues with the same driver sets; it seems like a lottery!

After all, I can't really say which company I prefer until I've actually been there and done it, you know? I won't be terribly bothered if I can't get on with the 7970s, I'll just move on to something else and learn from it.

The thing is rumours aren't really enough for me to base a decision on. I chose the 7970s because the performance that they've been proven to give is pretty astounding! However I respect you and your knowledge greatly so I'm always interested in hearing what you have to say.

Just trying new things


----------



## Masked

Quote:


> Originally Posted by *Smo*
> 
> I decided on 2x MSI 7970s.
> The only real reason is I want to give them a try mate, that's all. Like I say - if the Kepler cards are better performers then I'll be back. I just want to dip my toe in the water so that I have some experience with AMD GPUs. I also find driver issues interesting considering that not everybody is affected by the same issues with the same driver sets; it seems like a lottery!
> After all I can't really say which company I prefer until I've actually been there and done it, you know? I won't be terribly bothered if I can't get on with the 7970s, I'll just move on to something else and learn from it.
> The thing is rumours aren't really enough for me to base a decision on. I chose the 7970s because the performance that they've been proven to give is pretty astounding! However I respect you and your knowledge greatly so I'm always interested in hearing what you have to say.
> Just trying new things


I'm not pointing fingers or moderately perturbed, I understand where you're coming from and I get it.

I just find it interesting that so many people are swapping on the cusp of a major release --

While Nvidia's currently having transistor issues, AMD had far more (they just weren't as public) and still, many people are swapping to the 7970s ~ I just find that to be very interesting.


----------



## Smo

Quote:


> Originally Posted by *Masked*
> 
> I'm not pointing fingers or moderately perturbed, I understand where you're coming from and I get it.
> I just find it interesting that so many people are swapping on the cusp of a major release --
> While Nvidia's currently having transistor issues, AMD had far more (They weren't as public) and still, many people are swapping to the 7970's ~ I just find that to be very interesting.


Yeah I see your point. I think it's because NVIDIA have been so hush-hush about the whole thing, and the release date, while drawing ever nearer, is still uncertain. You know what they say - people fear what they don't know. Maybe it's a comfort thing.

I'm going to keep a close eye on what NVIDIA bring to the table. As frustrated as I was at times with the 590, it's a fantastic piece of hardware. Even so, I was always slightly annoyed by the fact that it wasn't what it could be, and felt a little let down by it. On the flip side, the 7970 seems to be the card that keeps on giving at the moment. Even people who are disappointed with their overclocks are still getting ~200MHz over stock without a voltage bump.


----------



## iARDAs

Have fun with your new toys bro. It is still a good setup...

So far a single 590 really makes me happy. I gave up on 3D gaming because most games that I love are terrible in 3D or have minor issues that bug you so badly. So a 590 for 1080p 2D gaming is just fantastic for me.

I can't do 2D Surround as I don't have enough space. If I had space I would add another 590 but that's not going to happen.

Hey Masked, do you have any idea if Kepler will have dual-GPU single cards like a 690?


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Have fun with your new toys bro. It is still a good setup...
> So far a single 590 really makes me happy. I gave up on 3D gaming because most games that I love are terrible in 3D or have minor issues that bug you so badly. So a 590 for 1080p 2D gaming is just fantastic for me.
> I can't do 2D Surround as I don't have enough space. If I had space I would add another 590 but that's not going to happen.
> Hey Masked, do you have any idea if Kepler will have dual-GPU single cards like a 690?


You know I can't answer that









Maybe









Quite personally, I'm waiting to see the 7990 ~ A dual-GPU card, for me, is ideal...I MIGHT go with 3x Kepler but, I'm reserving my decision until I see what they push to retail.

You have to remember that the samples everyone has aren't always what gets pushed to the retail market so, it's a very solid possibility that what's being reviewed and "leaked" isn't actually going to make it.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> You know I can't answer that
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe
> 
> 
> 
> 
> 
> 
> 
> 
> Quite personally, I'm waiting to see the 7990 ~ A dual GPU card for me, is ideal...I MIGHT go with 3x Kepler but, I'm reserving my decision until I see what they push retail.
> You have to remember that the samples everyone has, aren't always being pushed to the retail market so, it's a very solid possibility that what's being reviewed and "leaked" isn't actually going to make it.


Haha lol great. I will be waiting for the 690 then









Though like we discussed earlier, a 590 will handle every game maxed for another 2 years, so unless I see a 50% improvement I will definitely stick with my 590.

If only 3D gaming was more compatible across all games, and did not have any issues.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Haha lol great. I will be waiting for the 690 then
> 
> 
> 
> 
> 
> 
> 
> 
> Though like we discussed earlier, a 590 will handle every game maxed for another 2 years, so unless I see a 50% improvement I will definitely stick with my 590.
> If only 3D gaming was more compatible across all games, and did not have any issues.


I've been giving a tremendous amount of thought to my gaming lately and I, quite honestly, feel as if I only need 2 monitors.

The 3 monitor thing is "nice" but, 2 is just so much easier.

Granted, gaming in 3 is difficult...I rarely, if ever, use my 3rd monitor.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> I've been giving a tremendous amount of thought to my gaming lately and I, quite honestly, feel as if I only need 2 monitors.
> The 3 monitor thing is "nice" but, 2 is just so much easier.
> Granted, gaming in 3 is difficult...I rarely, if ever, use my 3rd monitor.


Hmm, interesting.

How do you game with 2 monitors? I thought you could only game with 1 or 3 monitors lol

Hey Masked, I have one final question. More like a curiosity actually.

How do you feel about Windows 8? Will we be better off gaming on Win7? Or do you believe we should jump to Windows 8 for gaming?


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Hmm, interesting.
> How do you game with 2 monitors? I thought you could only game with 1 or 3 monitors lol
> Hey Masked, I have one final question. More like a curiosity actually.
> How do you feel about Windows 8? Will we be better off gaming on Win7? Or do you believe we should jump to Windows 8 for gaming?


Actually, the beta for that was rejected until a later date...So, I can't answer that.

And, you don't game on 2 monitors, you game on 1.

Surround gaming is great in concept but, in reality there's no true arc yet so, while you see somewhat peripherally, it's not nearly "perfected" enough to actually qualify as vision, and it really doesn't help in any game currently on the market...So, for me, it's fairly pointless.

I will game on 1 active monitor and have the other available for other things.


----------



## Sir_Gawain

Quote:


> Originally Posted by *Masked*
> 
> You know...Considering the "670" is already rumored to be better than the 7970 ~ I find it interesting that people are swapping in droves.
> Especially given the driver issues ATI fans have had lately ~ I would've thought you guys would wait...
> Very interesting.


I took the plunge as well and have been very happy with the performance over my 590. The 7970s are monsters indeed but, don't listen to the AMD fanboys when they say AMD driver issues are a thing of the past....They are not; it's such a complicated system! This reason alone will more than likely have me switching back to the green side for a pair of 680s, if the price isn't more than the 7970's.


----------



## Shinobi Jedi

I keep reading in different places that BF3 is well enough optimized that high-end 580 users with only 1.5GB of VRAM are getting 120fps on 2560x1440 displays. This has me seriously considering getting a 2560x1600 display and running it dual-style with my BenQ, switching back and forth between them depending on whether I want hi-rez or 3D/120Hz gaming.

First I've got to figure out which one to get in the 30" size. I hear conflicting reports about practically all of them.

I took a break from SWTOR to play some BF3 and Batman and Skyrim. The experience totally reminded me and reaffirmed why these cards rock so hard..


----------



## myrtleee34

Where is everyone getting their 590? New egg has none.


----------



## iARDAs

Quote:


> Originally Posted by *myrtleee34*
> 
> Where is everyone getting their 590? New egg has none.


In Turkey we have a few 590s left








I am pretty sure I can find a brand new one even a few months from now.


----------



## Smo

They're very hard to get hold of in the UK - last I looked, Scan.co.uk only had one listed on the site, from ASUS. Novatech also only have one in stock, which I think is the Gigabyte, and it's priced at £730!


----------



## ProfeZZor X

What is the consensus on using LED televisions for computer usage? Before I even started my build, I'd been longing to use a high-end 3D LED Samsung television to display everything on the gaming rig I was planning to build. Now that I'm halfway done, I would imagine that with a 590 strapped to my rig, my viewing senses would be in sensory overload... So far I haven't committed to any television/monitor just yet, but I want to be sure I'm not short-changing myself with another bad decision when I actually do get around to buying the visual portion of my build... To be exact, I was thinking of the Samsung UE55D8000 model. It has a thin border - perfect for near-seamless dual monitoring... Anyone care to answer?


----------



## iARDAs

Quote:


> Originally Posted by *ProfeZZor X*
> 
> What is the consensus on using LED televisions for computer usage? Before I even started my build, I'd been longing to use a high-end 3D LED Samsung television to display everything on my gaming rig. I would imagine that with a 590 strapped to my rig, my viewing senses would be in sensory overload... Anyone care to answer?


I have a 3D Samsung LED TV.

Samsung 8 series. I love this TV to death and it really gives a good picture overall, but I still prefer gaming on a monitor.

A 590 + Samsung LED is a great setup though.

However, if you game in 3D, you will be stuck with 3DTV Play or DDD. You can't use 3D Vision.

I love gaming in 3D, but I stay away from 3D gaming on a TV. 120Hz 3D Vision monitors are best for 3D gaming.

For me the pros of gaming on a Samsung LED TV are:

+ Huge screen
+ Beautiful colors
+ Can have sound from the TV

Cons:

- Not as comfortable as gaming at a desk in front of a monitor
- Can't get as focused
- Can't do 120Hz gaming, which is a huge letdown
- Input lag is a bit more than a very solid 120Hz monitor

Edit: Don't forget to change your input label on a Samsung LED TV to PC.


----------



## BiN4RY

A bit late for me to join the party, but I own a rig with GTX 590s in quad SLI, and both are watercooled.
I understand that NVIDIA completely blocked voltage tweaking months back; are there any solutions to bypass this as of now, other than using a previous driver?
I was able to OC the cards to about 800MHz with voltage tweaking, but now I cannot even get the cards to run 30MHz higher without it.


----------



## iARDAs

Quote:


> Originally Posted by *BiN4RY*
> 
> A bit late for me to join the party, but I own a rig with GTX 590s in quad SLI, and both are watercooled.
> I understand that NVIDIA completely blocked voltage tuning months back; are there any solutions to bypass this as of now?


Wow bro, you have 2 590s and you are still overclocking your rig?









If I had 2 590s I would probably downclock them to make sure they last longer


----------



## BiN4RY

Quote:


> Originally Posted by *iARDAs*
> 
> Wow bro, you have 2 590s and you are still overclocking your rig?
> 
> 
> 
> 
> 
> 
> 
> 
> If I had 2 590s I would probably downclock them to make sure they last longer


Why not, I would like to get the most out of them


----------



## iARDAs

Quote:


> Originally Posted by *BiN4RY*
> 
> Why not, I would like to get the most out of them


True true









If only I understood water cooling, I would probably add it to my system and overclock a bit.

I don't feel safe OCing the 590 with the stock cooler.


----------



## tehvampire

Hey, what kind of performance do you guys get in Serious Sam 3 maxed out with your GTX 590s?

I'm usually getting 60fps all the time but it will sometimes dip down to about 50fps in heavy action.

2600K @ 4.5GHz
GTX 590
16GB RAM


----------



## kzinti1

I was reading an old (Dec. 29, '11) (p)review of the new i7-3820, http://www.anandtech.com/show/5276/intel-core-i7-3820-review-285-quadcore-sandy-bridge-e/1, (now available without having to buy a combo BTW), by Mr. AnandTech himself, Anand Lal Shimpi, when I ran across the following quote from the Comments section of this article by chizow:

"RE: 16 vs 40 pcie lanes by chizow on Thursday, December 29, 2011

There's not much benefit when using 2xPCIE 2.0 single-GPU cards with PCIE 2.0 x8 slots. They just don't need that much bandwidth.

With multi-GPU cards however, that PCIE 2.0 x8 starts to choke them a bit and x16 starts showing its benefits.

Another thing to keep in mind too is that with X79 (and IB?) they support PCIE 3.0, so with PCIE 3.0 cards, PCIE 3.0 x8 is the equivalent of PCIE 2.0 x16 in terms of bandwidth. Should be beneficial for multi-card solutions especially with multi-GPU cards."

Is it true that we GTX 590 owners, and owners of other multi-GPU cards, are actually being limited by the x8 PCI-E lanes on P67 and Z68 motherboards?

If so, would it be a worthwhile investment to get an i7-3820 or another Sandy Bridge-E CPU and motherboard to keep our multi-GPU cards a little more future-proof?

Or, am I reading a little too much into this? Is it too close to Ivy Bridge (which has been delayed once again to, I believe, July 2012), or is this all just a load of crap?

Even at this seemingly late date with regards to IvyBridge, this is still quite a doable project at a fairly low cost.

The Egg is selling these 3820's for $319.99 with none of their "Combo Required for Purchase" nonsense. There are some quite good LGA 2011 motherboards at a fairly reasonable price, there are now adaptors for watercooling blocks so that would save some more money, and some quite good prices on XMP supported Quad memory modules.

What I'm trying to do is just skip Kepler altogether and wait for the generation after that to update my videocards. They're both still too new to be outdated before Gen. 3 PCI-E is made available in another year or two, or whenever. PCI-E 3.0 ain't a done deal until the units are available for sale. Nothing but vaporware until then.

I'm really not looking for an excuse for a new build. I don't need any excuse. If/when I decide on a new build then it's already halfway done. Or more. All I need for a completely new build is a cpu, mobo and memory. I always keep a new build in this half-completed state for just such things as this. It's already built for IvyBridge and everything has been tested with another cpu, mobo and memory. Ready to fire up within hours of the arrival of the 3 major components.

All I'm looking for is some comments as to whether this is a good idea or not.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *kzinti1*
> 
> I was reading an old (Dec. 29, '11) (p)review of the new i7-3820, http://www.anandtech.com/show/5276/intel-core-i7-3820-review-285-quadcore-sandy-bridge-e/1, (now available without having to buy a combo BTW), by Mr. AnandTech himself, Anand Lal Shimpi, when I ran across the following quote from the Comments section of this article by chizow:
> 
> "RE: 16 vs 40 pcie lanes by chizow on Thursday, December 29, 2011
> There's not much benefit when using 2xPCIE 2.0 single-GPU cards with PCIE 2.0 x8 slots. They just don't need that much bandwidth.
> With multi-GPU cards however, that PCIE 2.0 x8 starts to choke them a bit and x16 starts showing its benefits.
> Another thing to keep in mind too is that with X79 (and IB?) they support PCIE 3.0, so with PCIE 3.0 cards, PCIE 3.0 x8 is the equivalent of PCIE 2.0 x16 in terms of bandwidth. Should be beneficial for multi-card solutions especially with multi-GPU cards."
> 
> Is it true that we GTX590 owners and other multi-gpu cards are actually being limited by the x8 PCI-E lanes using P67 and Z68 motherboards?
> If so, would it be a worthwhile investment in an i7-3820 or other Sandy Bridge E cpu and motherboard to keep our multi-gpu cards a little more future-proof?
> Or, am I reading a little too much into this, it's too close to IvyBridge (which has been delayed once again to, I believe, July, 2012) or is this all just a load of crap?
> Even at this seemingly late date with regards to IvyBridge, this is still quite a doable project at a fairly low cost.
> The Egg is selling these 3820's for $319.99 with none of their "Combo Required for Purchase" nonsense. There are some quite good LGA 2011 motherboards at a fairly reasonable price, there are now adaptors for watercooling blocks so that would save some more money, and some quite good prices on XMP supported Quad memory modules.
> What I'm trying to do is just skip Fermi altogether and wait for the generation after that to update my videocards. They're both still too new to be outdated before Gen.3 PCI-E is made available in another year or two, or whenever. PCI-E 3.0 ain't a done deal until the units are available for sale. Nothing but vaporware until then.
> I'm really not looking for an excuse for a new build. I don't need any excuse. If/when I decide on a new build then it's already halfway done. Or more. All I need for a completely new build is a cpu, mobo and memory. I always keep a new build in this half-completed state for just such things as this. It's already built for IvyBridge and everything has been tested with another cpu, mobo and memory. Ready to fire up within hours of the arrival of the 3 major components.
> All I'm looking for is some comments as to whether this is a good idea or not.


According to Masked, you're only losing about 3-4% of bandwidth on your card with the x8 lanes. Essentially a negligible amount. Negligible enough that it didn't stop him from putting a 2700K into his personal rig, back when I believe he was running two 590s with that CPU.

That being said, I read the same review and have had one eye on EVGA's Z68, but I don't want to spend the money for an unlocked Sandy Bridge-E, and I don't know how much of a gain the locked version of the CPU for $385 would net over my Bloomfield running @ 4.0GHz. I have been considering getting that and then replacing it with an Ivy Bridge chip, as I believe the Z68 will be compatible with them, but I'm still not sure. Plus, the PS Vita keeps calling my name, so I'm not sure whether that may come first while I hold out until Ivy Bridge. July is not that far away. It'll creep up on us all before we know it.

However, I think my X58 mobo may be dying or dead before Ivy Bridge comes out, so I'm in a bit of a quandary and am still considering this Z68 option. But in your case, I don't think it sounds like you're missing much.


----------



## iARDAs

I am glad that I have only one 590 GPU and that it is in my x16 PCIe slot.

But yeah, I heard somewhere that the difference was about 5% as well.

The difference between x4 and x16 was about 10%, I believe.


----------



## Dennybrig

Guys, my dear Overclock fellows, it is me again, and I am bringing you a brand new problem (as always).

I'm writing this out of exhaustion after trying, and failing, to get my mobo (a Maximus IV Gene-Z) to detect my GTX 590 video card.

I have spent a week trying to solve this problem, to no avail.

Look, I have done everything one can possibly think of, such as:

* Disabled the multi-monitor option and the render option in the BIOS, and selected PCI-E as the priority video device
* Connected the video card to both PCI-E ports
* Changed the cables that feed the card from the PSU
* Tried every DVI port on the card
* Cleared CMOS and restarted the computer
* Used Driver Sweeper to erase every single piece of NVIDIA software that may have still been on the computer

The most frustrating thing is that I know for sure that the card works, since I tested it on another computer and it worked just fine, but my mobo simply does not detect it (it does not appear in Device Manager in Windows, nor in the GPU DIMM.POST option in the BIOS). Worst of all, this same video card WAS WORKING on this same mobo a week ago, but since I did a Windows reinstall (I had to, since I was receiving a fatal error whenever I tried to start the computer) it has stopped detecting the video card altogether.

Guys, please help me solve the problem. Do you believe it might be software related?
Or what else do you think I should do to make this work? I am really confused at this point, and where I'm from there are just no people who do gaming for a living, so it's not like I can take the computer to a specialist to get it running (believe me, at this point I would love to).

My setup is:

* Video Card: NVIDIA GTX 590 (obviously)
* MOBO: Maximus IV Gene Z
* Monitor: Panasonic Viera 50" Plasma TV and Samsung 37" LCD TV (have tried it in both TVs)
* Adapter: I'm using a DVI-to-HDMI adapter and an HDMI cable to connect it to the TV

Please help!!! Has anyone with a Maximus IV mobo experienced a similar issue before? I don't know what else to do! Please let me know.


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> According to Masked, you're only losing about %3-%4 of bandwith on your card with the x8 lanes. Essentially a negligible amount. Negligible enough that it didn't stop him from putting a 2700k into his personal rig I believe back when he was running two 590's with that CPU.
> That being said, I read the same review and have had one eye on EVGA's Z68, but I don't want to spend the money for an unlocked Sandy Bridge-e and I don't know how much of a gain that locked version of the CPU for $385 would net over my Bloomfield running @ 4.0ghz. Though I have been considering getting that and then replacing it with an Ivy Bridge chip as I do believe the Z68 will be compatible with them, but I'm still not sure. Plus, the PS Vita keeps calling my name, so I'm not sure if that may come first and I hold out until Ivy Bridge. July is not that far away. It'll creep up on us all before we know it.
> However, I think my X58 mobo may be dying or dead before Ivy Bridge comes out, so I'm in a bit of a quandary and am still considering this Z68 option. But in your case, I don't think it sounds like you're missing much.


I went for a 2700K because I felt it would be the easy button for a home solution -- come to find out, it's actually been quite the opposite but, the math is correct.

Essentially you lose about 10% total going from x16 down to x4 ~ So if you do the math from there, it's roughly 2.5% per card that you add to your setup ~ I know I'm off a little but, this is the easiest way to figure it out.

Now, you're not losing GPU power, you're losing processing speed, because you're essentially bottlenecking that pipeline.

So if you have 3 cards, you'll see up to a 10% overall loss which, in the long run, only shows up at 100% bus utilization, i.e. benching.

If you're an enthusiast, that's a mistake.

Now, personally, I'm going with Ivy -- I'm having flickering issues, random freezes etc -- All attributed to something other than my ram/hd/CPU so -- That's me.

Again, it's just simple math...And the loss isn't actually a loss until you're maxing out the slot...Which never happens anyway.
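Masked's rule of thumb can be written out as a toy model. To be clear, the ~10% figure and the linear split across four slots' worth of cards are his estimate, not a measurement, and his numbers don't reduce to one clean formula; this sketch takes the "2.5% per card" reading, and the function name is mine:

```python
# Back-of-the-envelope model of the estimate above: roughly a 10% total
# penalty at x4 vs x16, spread linearly, so about 2.5% per card added.
# This is a rough heuristic, only visible near 100% bus utilization.
def estimated_pcie_penalty(cards: int, max_loss: float = 0.10,
                           max_cards: int = 4) -> float:
    return max_loss * min(cards, max_cards) / max_cards

for n in (1, 2, 3, 4):
    print(f"{n} card(s): ~{estimated_pcie_penalty(n):.1%} worst-case loss")
```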


----------



## Masked

Quote:


> Originally Posted by *Dennybrig*
> 
> Guys, my dear Overclock fellows it is me again and i am bringing you a brand new fresh problem (as always).
> Im writting you this out of exahustion in trying to get my MOBO (a Maximus IV Gene Z) to detect my GTX 590 video card and have not being able to do so.
> I have a week trying to solve this problem to no avail.
> Look, i have done everything one can possibly think of such as:
> * Disabling the Multimonitor option and the Render option in the BIOS and selected PCI-E as the video priority device
> * Connecting my video card to both PCI-E ports
> * Changing the cables that feed the card from the PSU
> * Tried every DVI port of the card
> * Cleared CMOS, restarted the computer again
> * Used Driver Sweeper to erase every single piece of NVIDIA software that may have still been in the computer
> The most frustrating thing is that i know for sure that the card works since i tested it on another computer and it worked just fine but my MOBO just simply does not detect it (it does not appear in the device manager in Windows nor on the GPU DIMM.POST option in the BIOS) and worst thing of all is that this same video card WAS WORKING on this same MOBO a week ago but since i made a Windows re-install (had to do it since i was receivng a fatal error whenever i tried to start the computer) it just stopped detecting the video card alltogether.
> Guys, please help me solve the problem, do you believe it might be software related?
> Or what else do you think i should do to make this work? I am really confused at this point and from the place im from there is just NO guys who do gaming for a living so it not like i can take the computer to an specialist in order to get it running (believe me, at this point i would love to do it)
> My setup is:
> * Video Card: NVIDIA GTX 590 (obviously)
> * MOBO: Maximus IV Gene Z
> * Monitor: Panasonic Viera 50" Plasma TV and Samsung 37" LCD TV (have tried it in both TVs)
> * Adapter: Im using a DVI to HDMI adapter and an HDMI cable to connect it to the TV
> Please Help!!!! anyone with a Maximus IV MOBO has experienced a similar issue before? i dont know what else to do! Please let me know


Last week you claimed 3 things.

1) Your card "smoked up"
2) Your card was dead as a doorknob etc etc etc
3) That you had applied for RMA.

When a card smokes up like that -- You RMA it, period -- The fact that you didn't is more than likely why it's glitching.

Fast forward to today:
1) Card works on a friend's PC (did we try that PC with NO other video cards?)
2) Did we try a different slot on the motherboard?

There is no software involved in your card receiving a signal from the PCIe slot, period.

Asus is actually a company I refuse to use again after they denied us over $20k worth of RMAs ("we don't cover natural disasters") so, in that regard -- good luck with your motherboard RMA.


----------



## Dennybrig

Hahahahah, I have always liked your humor, Masked. OK, I failed to explain that, yes, one card is being RMA'd as we speak, and the other one is the one I'm having trouble with.

I wish I did not have to do ANOTHER RMA for an ASUS board (the one I have was the replacement for a P8P67 M-PRO I RMA'd for simply not working at all), but I think I will have to do it after all...

Jeez, I hate these mobo problems and the hassle of unmounting everything again and waiting for the new one to arrive.

Anyhow, yes, the other computer worked without the video card, but what is really weird is that, as I said in my post, my mobo DID detect my card one week ago, before the Windows reinstall. How weird is that?


----------



## Masked

Quote:


> Originally Posted by *Dennybrig*
> 
> Hahahahah i have always liked your humor Masked, ok, i failed to explain that yes, one card is being RMAd as we speak and the other one is the one im having troubles with.
> I wished i did not had to do ANOTHER RMA for an ASUS board (the one i have was the replacement of a P8P67 M-PRO i RMAd for simply not working at all) but i think i will have to do it after all...
> Jeez i hate this MOBO problems and the hassle it is to unmount everything again and wait for the new one to arrive.
> Anyhow, yes, the PC of my other computer worked without the video card but what is really weird is, as i said in my post, my MOBO DID detected my card one week ago before making the windows reinstall, how weird is that?


I'm very direct and extremely sarcastic so, I'm glad that someone else appreciates it.

I would RMA your board for the simple fact that when a card sparks out, it sends a direct feed into your PCI slot, sometimes it overloads, sometimes it doesn't.

In your case, it did.

Hardware dying is a typical process. For example, your video card isn't rated at a max temp of 105C because the silicon will burn out, melt or fry... it's because the solder they use is only rated for about 107C, so when you hit 100C+ the danger is that your card will fry out because a solder joint will "melt off"...

Same thing with your motherboard -- Hardware just...dies...all the time.

My office has about...30-40 PC's all current tech...I do about 4 RMA's a week for something...Be it Ram, Motherboards, Cards, HD's...It's always very random but, the one single company I refuse to deal with anymore, is Asus...

I'm sorry but, there were 2 storms in a row and I didn't even hear a peep from HP when I had to replace over $500k worth of servers... Yet Asus claimed Acts of God over $20k ~ So I'm sorry, but, I got nothin'.


----------



## Dennybrig

I guess you are right, and also, how is your EVGA FTW MOBO working so far? Seems very interesting, right mix of price and features


----------



## kzinti1

Has anybody clicked on the GTX590's under "Tags" at the bottom of this page?

I was rather surprised to see it rated as #10. In "Power Supplies"!

That really doesn't say very much for OCN being a computer enthusiasts' site, when they don't know the difference between a videocard and a power supply.

#10 in Power Usage does sound about right.


----------



## YP5 Toronto

Quote:


> Originally Posted by *Masked*
> 
> I would RMA your board for the simple fact that when a card sparks out, it sends a direct feed into your PCI slot, sometimes it overloads, sometimes it doesn't.
> In your case, it did.


+1


----------



## Dennybrig

Quote:


> Originally Posted by *YP5 Toronto*
> 
> +1


The thing here is that I tested the video card AFTER the other card sparked, and it worked fine! That's why I'm saying I'm sure the mobo works.


----------



## Masked

Quote:


> Originally Posted by *Dennybrig*
> 
> The thing here is that i tested the video card AFTER the other card sparked and it worked fine! Thats why im saying that im sure the MOBO works


Again, I'll repeat -- It does not matter when the error occurred.

You more than likely split a connection that remained until last week.

RMA your motherboard -- The lane is gone.


----------



## Dennybrig

Yes, i think i will do so, thanks!


----------



## Dennybrig

Guys, you are not going to believe this, but I got another video card and proceeded to test it in my system, and guess what? It worked... So this is the most bizarre case I've ever seen: a video card that works fine in one motherboard but does not work in another (mine). I thought the mobo was the defective one, but discovered that it is not after all...

I tried changing the PSU and clearing CMOS the hard way (removing the battery), to no avail... I'm at a dead end now...


----------



## elfilipo

About a week ago, I purchased a new toy.

2012-03-03_17.22.36.jpg 3515k .jpg file

My problem:

Well, it seems there's a problem with my P2 port. When I first ran 2D Surround I noticed that one of my monitors (the one plugged into P2) was distorted, flickering slightly and filled with red pixels polluting the screen. It's always the monitor I plug into that P2 DVI port that has the distortion; mind you, I made sure to purchase 3 IDENTICAL LED 3D monitors and 3 DVI-D cables.

In setting up 2D Surround, Nvidia changed the monitors to 120Hz, so I changed it back to 60Hz and it works fine... (the S23A700D are 120Hz monitors, and all 3 have worked at 120Hz unless they are plugged into P2). So basically my question is: why can't I make the monitor plugged into P2 work at 120Hz? Why does the monitor say "NOT OPTIMUM MODE, RECOMMENDED MODE 1920X1080 DIGITAL" on occasion? Anyone have any ideas how to fix it? (I have the newest drivers, and switched the monitors around, so I'm sure it's not a cable problem or a monitor problem. That leaves the GPU; the DVI port itself being faulty is my bet, or, and I hope this is the case, I'm doing something wrong with the settings somewhere.)
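One thing worth ruling out with that P2 port: 1920x1080 at 120Hz needs a dual-link DVI connection, because the required pixel clock is well above the ~165 MHz single-link TMDS limit. A rough check (the ~15% blanking overhead below is an approximation of mine; exact CVT-RB timings differ slightly):

```python
# Does a given mode fit in a single DVI link? Single-link TMDS tops out
# around a 165 MHz pixel clock; dual link roughly doubles that.
SINGLE_LINK_MAX_MHZ = 165.0

def pixel_clock_mhz(width: int, height: int, hz: int,
                    blanking_overhead: float = 1.15) -> float:
    # blanking_overhead approximates horizontal/vertical blanking intervals
    return width * height * hz * blanking_overhead / 1e6

for hz in (60, 120):
    clk = pixel_clock_mhz(1920, 1080, hz)
    link = "single link OK" if clk <= SINGLE_LINK_MAX_MHZ else "needs dual link"
    print(f"1080p @ {hz}Hz: ~{clk:.0f} MHz -> {link}")
```

So if the P2 output, or the cable path into it, only negotiates a single-link connection, 60Hz would work and 120Hz would fail exactly as described. The card's documentation, not this sketch, is authoritative on which outputs are dual-link capable.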

I've been looking at forums all day long no one seems to have had this problem.
Don't want to have to send my card back to Gigabyte... I just bought it!

I would much appreciate some counsel.
Thanks in advance to the club!


----------



## Image132

Quote:


> Originally Posted by *kzinti1*
> 
> I was reading an old (Dec. 29, '11) (p)review of the new i7-3820, http://www.anandtech.com/show/5276/intel-core-i7-3820-review-285-quadcore-sandy-bridge-e/1, (now available without having to buy a combo BTW), by Mr. AnandTech himself, Anand Lal Shimpi, when I ran across the following quote from the Comments section of this article by chizow:
> 
> "RE: 16 vs 40 pcie lanes by chizow on Thursday, December 29, 2011
> There's not much benefit when using 2xPCIE 2.0 single-GPU cards with PCIE 2.0 x8 slots. They just don't need that much bandwidth.
> With multi-GPU cards however, that PCIE 2.0 x8 starts to choke them a bit and x16 starts showing its benefits.
> Another thing to keep in mind too is that with X79 (and IB?) they support PCIE 3.0, so with PCIE 3.0 cards, PCIE 3.0 x8 is the equivalent of PCIE 2.0 x16 in terms of bandwidth. Should be beneficial for multi-card solutions especially with multi-GPU cards."


I might be late to this party but here is something to chew on:
----------



## iARDAs

Hey folks

Today while playing BF3 my GPU usage dropped from 99% to 65% on many occasions.

What would your first diagnosis be?

I am running the latest WHQL drivers and did a clean install.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Hey folks
> Today while playing BF3 my GPU usage droped from 99% to 65% on many occasions.
> What would your first diagnose be?
> I am running the latest WHQL drivers and did a clean install.


I would think that, considering your FPS didn't change -- obviously Nvidia's SLI profile really doesn't suck that badly.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> ~ I would think that, considering your FPS didn't change -- Obviously Nvidia's SLI profile really doesn't suck that badly.


I had clocked my CPU down to 4.0 from 4.5, and have now clocked it to 4.5 again.

I will check it again after I am done with the match on the TV.

But I believe this is driver related, as it started after installing the latest WHQL one.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> I had clocked my CPU to 4.0 from 4.5 and now clocked it to 4.5 again.
> I will check it again after i am done with the match on the TV
> But I believe this is driver related as it started after downloading the latest WHQL one.


There's nothing wrong.

Your FPS didn't drop, which is a clear indication of the profile actually working.

So when you stare at a wall it goes from 90% usage to 65% usage -- your frames actually go UP -- that's 100% normal.

If your framerate had halved or quartered, okay, but it clearly didn't.

So...Less QQ and more pew pew imo.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> There's nothing wrong.
> Your FPS didn't drop which is a clear indication of the profile, actually working.
> So when you stare at a wall it goes from 90% usage to 65% usage -- Your frames actually go UP -- That's 100% normal.
> If you framerate had halved or quartered, okay but, it clearly didn't.
> So...Less QQ and more pew pew imo.


My FPS is always above 60, but the gameplay becomes choppy when the GPU usage falls from 99% to 65%, even though the FPS stays the same.

I don't know what QQ and pew pew mean.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> My FPS is always above 60 but the gameplay becomes choppy when the GPU usage falls from 99 to 65. although the FPS stays the same.
> I dont know what QQ and pew pew means


I was making a joke -- it means less complaining and more shooting... the QQ being the teary eyes of a complaining face... The interns infect me with their garbage.

It's a driver issue...The SLI profiles are apparently working...

I never reach 99% usage in BF3 and mine's on ultra...So, got me.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> I was making a joke -- It means less complaining and more shooting...Like the QQ are used for the eyes and the complaining face...The interns infect me with their garbage.
> It's a driver issue...The SLI profiles are apparently working...
> I never reach 99% usage in BF3 and mine's on ultra...So, got me.


Haha, lol. I honestly had not heard that before. Now I know and can use it.

Yeah, it seems a driver issue too. I am not going to revert back to the latest beta driver, as I don't play BF3 as much anymore; I'll probably wait for the next beta driver.


----------



## jv916

I have my ASUS GTX 590 water cooled. At idle GPU1 is 26-27C and GPU2 is 30-31C. However, when playing BF3 GPU1 is 59-60C and GPU2 is 48-49C. Both GPUs' load is at 97-99% in BF3. Water temp at load is 32-34C. Just wondering why GPU2 is ~4C warmer at idle while GPU1 is ~10C warmer at load? Does anyone have the same issue?


----------



## Smo

Quote:


> Originally Posted by *jv916*
> 
> I have my ASUS GTX 590 water cooled. At idle GPU1 is 26-27C and GPU2 is 30-31C. However, when playing BF3 GPU 1 is 59-60C and GPU 2 is 48-49C. Both GPUs load is at 97-99% in BF3. Water temp at load is 32-34C. Just wondering why GPU1 is +4C at idle and +10C higher at load than GPU2? Does anyone have the same issue?


Are you running dual monitors?


----------



## jv916

No...just running single monitor.


----------



## Smo

Quote:


> Originally Posted by *jv916*
> 
> No...just running single monitor.


Possibly inconsistent flow through the waterblock - a bubble perhaps?


----------



## Masked

Quote:


> Originally Posted by *Smo*
> 
> Possibly inconsistent flow through the waterblock - a bubble perhaps?


I concur with this good man.


----------



## iARDAs

@ Masked

My GPU fluctuation problem was solved by OCing my CPU from 4.0 to 4.5.

People mostly claim that even at stock a 2500K would never bottleneck a 590. Could that be wrong if the game is as intensive as BF3?

Or perhaps restarting my PC for the OC resolved the issue for now, but maybe it is still a driver problem and I could have the problem again later?


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> @ Masked
> My GPU fluctuation problem was solved OCing my CPU to 4.5 from 4.0
> People mostly claim that even at stock 2500k would never bottleneck a 590. could that be wrong if the game is as intensive as Bf3?
> Or perhaps restarting my PC for OCing resolved the issue for now but maybe it is still a driver problem and I can have the problem agian later?


I was actually going to PM you about that but, I got extremely busy with a special customer's PC ~ my apologies.

The 590 seems to bottleneck with anything under 4.3 ~ 4.3 is reachable by anyone, really -- Even on air.

For example, my OC at 4.5 is only there because I don't want any bottlenecking of any kind...

So, could that cause an issue? Absolutely...Especially in correlation with that driver...I'd leave it @ 4.5...


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> I was actually going to PM you about that but, I got extremely busy with a special customer's PC ~ my apologies.
> The 590 seems to bottleneck with anything under 4.3 ~ 4.3 is reachable by anyone, really -- Even on air.
> For example, my OC at 4.5 is only there because I don't want any bottlenecking of any kind...
> So, could that cause an issue? Absolutely...Especially in correlation with that driver...I'd leave it @ 4.5...


Aw, no worries my friend. No need for apologies.

Actually, I had realized that when I first got my 590.

My 3DMark11 score with the 4.0 OC was P8000-something and with the 4.5 OC was P9000-something. That's nearly 1000 points of difference between the two. So I had clocked my CPU at 4.5.

However, I thought to myself that this was a benchmarking tool and perhaps it would not matter in games, so I clocked the CPU back to 4.0, because at 4.5 my CPU would hit 72-73 degrees.

None of my games had an issue, but BF3 obviously did, so OCing the CPU back to 4.5 seems to solve this problem.

I believe I will change my mobo soon. When the next gen of mobos comes out I will get a good one that is Ivy Bridge compatible. The mobo I am using now is the LE version, which sucks, to be honest. I can't OC it over 4.5; it is always unstable.

I so regret this mobo every day. I should have bought an Asus P8Z68 Deluxe or Pro. This LE version is not even SLI ready, can you believe that? Terrible purchase.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> I so regret this MOBO everyday. I had to buy a Asus P8Z68 deluxe or pro. This LE version is not even SLI ready can you believe that. Terrible purchase


Asus is actually getting a very bad insider rep because of issues like the above.

Plus their entire RMA system is bogus -- Aside from my personal bias, it was a fight to get an RMA even BEFORE our issues with Irene + Whatever that blizzard was in October.

I outlawed ANY Asus part in this building, actually; since then, no problems whatsoever.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Asus is actually getting a very bad insider rep because of issues like the above.
> Plus their entire RMA system is bogus -- Aside from my personal bias, it was a fight to get an RMA even BEFORE our issues with Irene + Whatever that blizzard was in October.
> I outlawed ANY Asus part in this building, actually; since, no problems what-so-ever.


I will probably get an MSI or a Gigabyte mobo next time. I just want to buy one that is Ivy Bridge ready and can be OC'd easily. This LE version also OCs very easily, but some of the UEFI features are not there and I could not achieve an OC higher than 4.5 on air.

I can't believe I got this model just because it was like 40 bucks cheaper.


----------



## MKHunt

Quote:


> Originally Posted by *iARDAs*
> 
> @ Masked
> My GPU fluctuation problem was solved OCing my CPU to 4.5 from 4.0
> People mostly claim that even at stock 2500k would never bottleneck a 590. could that be wrong if the game is as intensive as Bf3?
> Or perhaps restarting my PC for OCing resolved the issue for now but maybe it is still a driver problem and I can have the problem agian later?


I run at 4.4, and in BF3 (at least in SP, where I actually alt+tab to monitor) it runs mid-70s to low 80s on both cores and very spiky. :/


----------



## iARDAs

Quote:


> Originally Posted by *MKHunt*
> 
> I run at 4.4, and in BF3 (at least in SP, where I actually alt+tab to monitor) it runs mid-70s to low 80s on both cores and very spiky. :/


Are you on the latest WHQL drivers?


----------



## MKHunt

Quote:


> Originally Posted by *iARDAs*
> 
> are you on the latest WHQL drivers?


Actually, no. 285.62, because the latest drivers drop the volts to 0.925V, which is... unpleasant after experiencing 0.938V.


----------



## iARDAs

Quote:


> Originally Posted by *MKHunt*
> 
> actually no. 285.62 because the latest drivers drop volts to .925V which is.... unpleasant after experiencing .938V


Hmmm, that's strange. Why do we both have this fluctuation on our cards, then, since we are using totally different drivers?

Maybe it's the game's fault?


----------



## MKHunt

Quote:


> Originally Posted by *iARDAs*
> 
> Hmmm thats strange. Why do we have this fluctuation on our cards than since we are both using totally different drivers.
> Maybe its the games fault?


It certainly _is_ poorly optimized. And as far as I know it still has memory leak issues.


----------



## iARDAs

Now I am getting mad at Witcher 1.

I want to give the series a go once more, since I played Witcher 1 a few months back but my save was corrupted. I played it on a laptop then (460M GPU).

Now i turned it on with my 590.

With everything at max I get 70-80 FPS, but at times it falls to 40 FPS.

I disabled SLI and still got the same thing.

When SLI is disabled, GPU usage is 40%.

When SLI is enabled, the GPU usage is 25% each.

I get stuttering and frame drops.

Why the hell isn't my GPU using its full potential? I might just revert drivers and see if it helps.


----------



## Masked

Witcher 1 is and always has been, very poorly optimized.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Witcher 1 is and always has been, very poorly optimized.


Yeah, as I advance through the game I get that feeling. Why do developers do that?

It's still a gorgeous-looking game, so beautiful. Can't wait to play Witcher 2 after completing this one.

When I played the game on my 460M I thought the graphics were good (I had to lower a few settings), but at MAX things look even prettier. I just love my 590.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> I was actually going to PM you about that but, I got extremely busy with a special customer's PC ~ my apologies.
> The 590 seems to bottleneck with anything under 4.3 ~ 4.3 is reachable by anyone, really -- Even on air.
> For example, my OC at 4.5 is only there because I don't want any bottlenecking of any kind...
> So, could that cause an issue? Absolutely...Especially in correlation with that driver...I'd leave it @ 4.5...


Does this also apply to older Bloomfield CPUs? The highest I can get mine stable is 4.189GHz...

I assume it does, going by your info here... Bummer if so. That makes the Ivy Bridge wait that much harder.


----------



## iARDAs

I am having a major issue with my 590
bye bye card?

I will try to reinstall the drivers now with a clean install.

This only happens when a game is launched.

Never on desktop, never when watching videos etc...

EDIT 1: I turned off the PC, unplugged it, took the card out, blew out all the dust, put it back in, and now I don't have the issue.

However, I do not feel so confident. If there was a problem with the GPU and its slot (like dust or whatever), wouldn't I have to get these artifacts even on the desktop? I only got these artifacts when launching a game or running a benchmark.

I do not believe I have solved the issue; my card probably has some fault, and I am afraid that it might come back to haunt me.

What do you guys think?


----------



## Saizer

Quote:


> Originally Posted by *iARDAs*
> 
> I am having a major issue with my 590
> 
> 
> 
> bye bye card?
> I will try to reinstall the drivers now with a clean install.
> This only happens when a game is launched.
> Never on desktop, never when watching videos etc...
> EDIT 1: I turned off the PC, unplugged it, took the card out, blew all the dust off, put it back in, and now I don't have this issue.
> However, I do not feel so confident. If there was a problem with the GPU and its slot (like dust or whatever), wouldn't I get these artifacts even on the desktop? I only got them when launching a game or running a benchmark.
> I do not believe I've really solved the issue; my card probably has some fault and I am afraid it might come back to haunt me.
> What do you guys think?


I was having similar problems, but with BF3. I did the same as you did and the problem was fixed. Only God knows what happened...


----------



## iARDAs

Quote:


> Originally Posted by *Saizer*
> 
> I was having similar problems, but with BF3. I did the same as you did and the problem was fixed. Only God knows what happened...


Strangely enough, for me this problem first happened when playing BF3.
I turned the PC off, launched BF3 again, and the problem occurred once more.

I decided to launch 3DMark11 and the problem occurred with 3DMark11 as well.
I did not try any other games; I immediately turned off the PC again and did the same procedure.

It DOES seem that only God might know what happened.

I don't have the issue now whatsoever, though it is too early to say.


----------



## Masked

Dust can create that issue -- So can a lot of things.

I'd call that a fluke because no outside issue actually occurred.

Eh, it happens.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Dust can create that issue -- So can a lot of things.
> I'd call that a fluke because no outside issue actually occurred.
> Eh, it happens.


That's what I will do now.

I had a few random freezes today and last night with my PC. Perhaps this was a sign. If the problem occurs again I will be sending the card to be repaired or replaced.

Also, remember my issues with GPU usage fluctuation. Maybe they were all related to this, though again maybe not.

Anyway, I am so glad that the problem is gone now, though I am keeping a close eye on it.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> That's what I will do now.
> I had a few random freezes today and last night with my PC. Perhaps this was a sign. If the problem occurs again I will be sending the card to be repaired or replaced.
> Also, remember my issues with GPU usage fluctuation. Maybe they were all related to this, though again maybe not.
> Anyway, I am so glad that the problem is gone now, though I am keeping a close eye on it.


Freezes are a tricky thing to diagnose...For example, my PC freezes about 2-3 times a week and emits a very loud BUZZZZZzzzzRRRZZRRZZZZ through my speakers...Doesn't BSOD, just sits there.

I happen to know it's my crap Creative audio card causing the error but, in your case, it may very well be the card.

Always check twice before making a decision -- Saves a tremendous amount of time and effort.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Freezes are a tricky thing to diagnose...For example, my PC freezes about 2-3 times a week and emits a very loud BUZZZZZzzzzRRRZZRRZZZZ through my speakers...Doesn't BSOD, just sits there.
> I happen to know it's my crap Creative audio card causing the error but, in your case, it may very well be the card.
> Always check twice before making a decision -- Saves a tremendous amount of time and effort.


Thanks for the tip bro.










I know that sometimes random components can make your PC crash. Funny though, when I had a PC before I switched back to PS3, a Creative sound card I had also made my system crash frequently.









Though I am happy with my Creative X-Fi Titanium now. Unless there is something much better for gaming, I will probably stick with it.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Thanks for the tip bro.
> 
> 
> 
> 
> 
> 
> 
> 
> I know that sometimes random components can make your PC crash. Funny though, when I had a PC before I switched back to PS3, a Creative sound card I had also made my system crash frequently.
> 
> 
> 
> 
> 
> 
> 
> 
> Though I am happy with my Creative X-Fi Titanium now. Unless there is something much better for gaming, I will probably stick with it.


Yeah, I just suck it up.

I know for a fact it's the card-based audio conflicting with the video-card's audio but, at this point it's a question of...Do I want better sound or would I rather have my video card do it and suck performance?

It's not a big deal really.

In your situation, that's definitely dust...It happens but, remember that dust can also cause issues so, this could very well be a foreshadowing of a real issue.

Just keep an eye on it and if more artifacting occurs, especially when not gaming, it's time to RMA that sucker.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Yeah, I just suck it up.
> I know for a fact it's the card-based audio conflicting with the video-card's audio but, at this point it's a question of...Do I want better sound or would I rather have my video card do it and suck performance?
> It's not a big deal really.
> In your situation, that's definitely dust...It happens but, remember that dust can also cause issues so, this could very well be a foreshadowing of a real issue.
> Just keep an eye on it and if more artifacting occurs, especially when not gaming, it's time to RMA that sucker.


True that bro.

Funny thing is, I was thinking of cleaning inside my PC this weekend; well, at least I blew over the GPU and the mobo socket, so I am 20% done.









At the next sign of artifacting, I will send the GPU back to the place I bought it. Hopefully they can send me a new one if they have a 590 in stock, but I would still rather have no more issues and stick with this GPU, as I know it will be at least a week for the new GPU to arrive if I ever send it.









EDIT: I also emailed Zotac, at their European branch. I attached the YouTube video and am waiting for a response from them too.


----------



## Image132

Yeah, dust can be a major issue. I remember my old PC used to BSOD randomly until I did a good clean; then the BSODs went away.

I'll let you in on a secret: buy fan filters. Living in a house with A LOT of dust, I used to have to clean my old PC out every month or two. My new case came with filters for all my intake fans, and now I clean my PC about two or three times a year, and that's only really the filters. Inside is still good to go.


----------



## iARDAs

I just played a round of BF3, a 64-player map on ULTRA. The highest my GPU got was 79 degrees; before, it hit 85.

Interesting.


----------



## iARDAs

Quote:


> Originally Posted by *Image132*
> 
> Yeah, dust can be a major issue. I remember my old PC used to BSOD randomly until I did a good clean; then the BSODs went away.
> I'll let you in on a secret: buy fan filters. Living in a house with A LOT of dust, I used to have to clean my old PC out every month or two. My new case came with filters for all my intake fans, and now I clean my PC about two or three times a year, and that's only really the filters. Inside is still good to go.


I took note of this, bro, thank you.

How do you clean the inside of your PC? I would like some insight.


----------



## Recipe7

Quote:


> Originally Posted by *iARDAs*
> 
> I took note of this, bro, thank you.
> How do you clean the inside of your PC? I would like some insight.


http://www.amazon.com/Metro-Vacuum-ED500-500-Watt-Electric/dp/B001J4ZOAW/ref=sr_1_1?ie=UTF8&qid=1331143737&sr=8-1

I use the same thing. It's also good for other things around the house that need a good blowin'.


----------



## rush2049

Best way to clean a dusty PC case?

Carry it outside, power up the air compressor, and blow it out from every possible angle... It helps if you have a hand-activated nozzle for the air hose.


----------



## iARDAs

I just received an email from Zotac about the issue I had.

They told me that I should return the card for replacement since it is faulty. I wonder if I should do this or wait to see if the problem happens again.

The thing is, as Masked said, this might be a fluke, and maybe when they receive the card they won't see a problem in their tests and will send the same card back.

I just sent them an email telling them that the problem is not happening now, and I'm waiting for their response.

Also, about cleaning: I will be buying an air compressor unit this weekend. It is hard for me to purchase that product on Amazon and have them ship it here.


----------



## Image132

Quote:


> Originally Posted by *rush2049*
> 
> Best way to clean a dusty PC case?
> Carry it outside, power up the air compressor, and blow it out from every possible angle... It helps if you have a hand-activated nozzle for the air hose.


+1

Air compressors are the best. Period. Stay away from any kind of vacuum cleaner, even if it says "ESD safe" or whatever ~ all that dust flying up into the nozzle is a static storm. I personally have a mid-sized air compressor in my workshop that I use, and I have to say, with a combination of a brand-new paint brush and an air compressor there is no dust that can hide from you.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Recipe7*
> 
> http://www.amazon.com/Metro-Vacuum-ED500-500-Watt-Electric/dp/B001J4ZOAW/ref=sr_1_1?ie=UTF8&qid=1331143737&sr=8-1
> I use the same thing. It's also good for other things around the house that need a good blowin'.


+rep for the link!


----------



## iARDAs

So I got another reply from Zotac, basically telling me that if the problem isn't happening now I should keep using the card, but stress test it with FurMark or ATITool.

I have never used either of these programs before. Do I have to use both of them, or will just one be enough?


----------



## kzinti1

I found the following for ATITool at TechPowerUp's download page: "*ATITool will only work on Windows 2000/XP/2003 (64 bit versions are supported).*"

If this is so, and they have no reason to lie about it, then this program is worthless to both you and most of the rest of us since we mostly use modern operating systems.


----------



## iARDAs

Quote:


> Originally Posted by *kzinti1*
> 
> I found the following for ATITool at TechPowerUp's download page: "*ATITool will only work on Windows 2000/XP/2003 (64 bit versions are supported).*"
> If this is so, and they have no reason to lie about it, then this program is worthless to both you and most of the rest of us since we mostly use modern operating systems.


Exactly. I looked up ATITool as well, and the releases were all from 2006. I am not sure it will work at all. The guy told me this: "stress test with FurMark and scan for artifacts with ATITool".

Anyhow, I stress tested for 15 minutes at 1080p with FurMark. The highest I had was 84 degrees on one core and 79 on the other. Also, my burn-in score is 2456. No idea what that means.
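If anyone wants to log temps during a run instead of eyeballing the readout, here's a quick sketch. It assumes your driver's `nvidia-smi` tool is installed, on your PATH, and exposes temperature readings for GeForce cards; the `parse_temps` helper is mine, not part of any tool:

```python
# Quick helper for reading per-GPU temperatures during a FurMark run.
# `nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader` prints one
# plain integer (deg C) per GPU; a dual-GPU card like the 590 shows two lines.
def parse_temps(csv_output: str) -> list:
    """Return one temperature in deg C per GPU line of nvidia-smi output."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]

# Canned sample standing in for a real nvidia-smi call
# (e.g. subprocess.check_output([...]) in a loop, sleeping between reads).
sample = "84\n79\n"  # the two cores of a 590
temps = parse_temps(sample)
print(max(temps))  # prints 84 -- the hotter of the two cores
```

Run something like this in a loop while the stress test is going and you'll catch spikes the on-screen counter misses.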


----------



## aceifeR

Hey guys, I've just recently built the rig in my sig and have been having issues, which can be found in this thread.
This time around when I crashed I got the same event ID error, but now I got the "Windows has recovered from an unexpected shutdown" message with a BCCode 116.
I looked around the internet and found out that the 116 error code is due to a graphics card error.
I also found a thread on the EVGA site where someone had a similar problem with his two GTX 590s; they were being undervolted from the factory to 0.913v and mine are at the same.
Could this be the reason why my PC would hang during gaming?

P.S: I've run FurMark for a good 30 min without any hiccups.
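For reference, BCCode 116 is Windows' VIDEO_TDR_FAILURE bugcheck ~ the display driver stopped responding and got reset. While diagnosing, some people raise the TDR timeout from its 2-second default so a slow-but-alive driver isn't killed mid-game. A sketch of the registry tweak (the keys are documented by Microsoft; the 10-second value is just an example, and this masks symptoms rather than fixing a faulty card):

```
Windows Registry Editor Version 5.00

; Raise the GPU watchdog (TDR) timeout to 10 seconds. Reboot to apply;
; delete the value to restore the default behaviour.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"TdrDelay"=dword:0000000a
```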


----------



## Masked

Quote:


> Originally Posted by *aceifeR*
> 
> Hey guys, I've just recently built the rig in my sig and have been having issues, which can be found in this thread.
> This time around when I crashed I got the same event ID error, but now I got the "Windows has recovered from an unexpected shutdown" message with a BCCode 116.
> I looked around the internet and found out that the 116 error code is due to a graphics card error.
> I also found a thread on the EVGA site where someone had a similar problem with his two GTX 590s; they were being undervolted from the factory to 0.913v and mine are at the same.
> Could this be the reason why my PC would hang during gaming?
> P.S: I've run FurMark for a good 30 min without any hiccups.


Technically, yes...I'd revert to an older driver -- 280-series...

Maybe the 287 driver would serve you best.


----------



## aceifeR

Quote:


> Originally Posted by *Masked*
> 
> Technically, yes...I'd revert to an older driver -- 280-series...
> Maybe the 287 driver would serve you best.


I only see the 285.62 driver. Will that do?


----------



## Masked

Quote:


> Originally Posted by *aceifeR*
> 
> I only see 285.62 driver. Will that do?


Sorry, 287 wasn't public -- Do 285.62 ~ I have to start checking the site before I recommend drivers.

Also:

It has come to my attention that several people asked on the EVGA forums for our custom driver profiles...I'll explain this a bit.

[email protected] was originally my account but is now used by our JSA ~ It's used to keep RMAs in order (our office's RMAs are all on that s/n) and Brian also enjoys that community so he posts there.

I am in the process of collaborating with several other manufacturers to create a driver for the 590 that will have no voltage lock and/or stutter issues.

It's a slow process because of how these drivers were coded and I literally have to dissect large portions of the driver on a whiteboard -- Then translate them over, only to find out they conflict with string 2.

I'm about half way done.

Am I talking a few weeks done? No, not by a long shot.

Am I talking a month or two done? Perhaps, if I don't find any more conflicting strings.

I'm not a programmer, obviously but, after writing about 10-15 of our mobile drivers, I thought this would be easy...Ha-ha-ha; guess what? It's not.

I do welcome some assistance if anyone would like to "chip in" but, keep in mind that the 680 will be out, long before we finish.

PM me if interested but, keep in mind also, we will need testers that understand these are CUSTOM DRIVERS...So, I'll be making a post about that too, eventually.

Thanks guys.


----------



## Recipe7

Good luck Masked (and thanks!). I hope someone on the forum can give you a hand, God knows I can't


----------



## rush2049

I'm a programmer and could help. I am not very familiar with drivers per se, but I learn real quick....
Are you working with decompiled drivers, or do you have access to the source code in some form?

Sent a PM regardless, with this post in case you miss it.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> Sorry, 287 wasn't public -- Do 285.62 ~ I have to start checking the site before I recommend drivers.
> Also:
> It has come to my attention that several people asked on the EVGA forums for our custom driver profiles...I'll explain this a bit.
> [email protected] was originally my account but is now used by our JSA ~ It's used to keep RMAs in order (our office's RMAs are all on that s/n) and Brian also enjoys that community so he posts there.
> I am in the process of collaborating with several other manufacturers to create a driver for the 590 that will have no voltage lock and/or stutter issues.
> It's a slow process because of how these drivers were coded and I literally have to dissect large portions of the driver on a whiteboard -- Then translate them over, only to find out they conflict with string 2.
> I'm about half way done.
> Am I talking a few weeks done? No, not by a long shot.
> Am I talking a month or two done? Perhaps, if I don't find any more conflicting strings.
> I'm not a programmer, obviously but, after writing about 10-15 of our mobile drivers, I thought this would be easy...Ha-ha-ha; guess what? It's not.
> I do welcome some assistance if anyone would like to "chip in" but, keep in mind that the 680 will be out, long before we finish.
> PM me if interested but, keep in mind also, we will need testers that understand these are CUSTOM DRIVERS...So, I'll be making a post about that too, eventually.
> Thanks guys.


+rep Masked!

Funny, I was just going to PM you about this very thing, as I was definitely one of the cats that asked Brian for a copy of those drivers..

Any way I can help, or anywhere to learn so I may be able to help for real at some point, I'm down.


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> +rep Masked!
> Funny, I was just going to PM you about this very thing, as I was definitely one of the cats that asked Brian for a copy of those drivers..
> Any way I can help, or anywhere to learn so I may be able to help for real at some point, I'm down.


Quote:


> Originally Posted by *Recipe7*
> 
> Good luck Masked (and thanks!). I hope someone on the forum can give you a hand, God knows I can't


Quote:


> Originally Posted by *rush2049*
> 
> I'm a programmer and could help. I am not very familiar with drivers per se, but I learn real quick....
> Are you working with decompiled drivers, or do you have access to the source code in some form?
> Sent a PM regardless, with this post in case you miss it.


I thank you all for the support and the PM's...I'll be contacting most of you guys next week.

I'm dissecting the drivers in 2 forms ~ First is the profile itself ~ While I originally stripped the driver of //ALL// other support, I came to the realization that I cannot do that, because many of us use a 2nd card for PhysX ~ So, that tossed away 2 weeks of work, instantaneously.

The biggest issue I'm having is finding where the voltage lock actually correlates with the bios.

I've done 100s of these, especially when we release a custom card, and never once have I had this big of an issue.

For those of you that want to help a bit:

http://forums.guru3d.com/showthread.php?t=171843

Read that thread and do a bit of your own dissecting ~ It will not only be a learning experience but, it will put you on the same page.

I think everyone who wants to use this will have to re-flash the BIOS on their cards, and we'll just have to implement the custom driver ON TOP of that BIOS.

That takes a bit more work at the base but, it's more than likely, necessary.


----------



## rush2049

Looking at that thread... I assume you tried just telling the driver our 590 is actually a 580 to see what happens... because that would solve it real quick if it worked.

Also take a look at what these guys do: http://forums.laptopvideo2go.com/topic/29163-v29551-windows-7vista-64bit-nvidia-desktop/
(That's a link to the newest driver thread with the modded INFs at the top.)
I modded the INFs to support my 8400M back when Nvidia didn't even provide unofficial mobile drivers....
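For anyone curious what an INF mod actually involves: it's mostly pasting your card's PCI IDs into the right sections of the driver's INF. A purely illustrative fragment (section names vary by driver release, so treat them as placeholders; 10DE is NVIDIA's vendor ID, and 1088 should be the GTX 590's device ID, but check yours in Device Manager before editing):

```
; Hypothetical nv_disp.inf fragment -- section names are placeholders.
[NVIDIA_Devices.NTamd64.6.1]
%NVIDIA_DEV.1088% = Section001, PCI\VEN_10DE&DEV_1088

[Strings]
NVIDIA_DEV.1088 = "NVIDIA GeForce GTX 590"
```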


----------



## iARDAs

After my initial problem a few days ago, I now have this issue:




I will be sending my GPU tomorrow. God knows when I will get my new GPU back. This sucks.


----------



## Smo

Quote:


> Originally Posted by *iARDAs*
> 
> After my initial problem a few days ago, I now have this issue:
> 
> 
> 
> I will be sending my GPU tomorrow. God knows when I will get my new GPU back. This sucks.


Ouch!

Yeah, she's had it.


----------



## iARDAs

It's like 2.5 months old.


----------



## Smo

Quote:


> Originally Posted by *iARDAs*
> 
> It's like 2.5 months old.


Brand new or back from RMA?


----------



## iARDAs

Quote:


> Originally Posted by *Smo*
> 
> Brand new or back from RMA?


This is a brand new GPU; it has never been sent for RMA.

Now I will get a crappy refurbished one, right?


----------



## iARDAs

I am now thinking that all of these could be a driver issue.

The symptoms of my GPU for the last week were:

1-) When going into the Nvidia Control Panel for the 1st time after turning on my PC and clicking on Manage 3D Settings, my computer would freeze for a few seconds.

2-) TDR issues. I never had a TDR issue before this latest WHQL driver.

3-) Random and frequent freezes, especially in games while quitting or starting an application. I never had this issue before either.

4-) Major artifacting in games that goes away if I turn the PC off and back on, but never goes away if I just click restart.

So I went back to the last beta driver, 295.51 I guess.

So far I don't have the 1st problem I mentioned above. I will not sleep much tonight; I will game and try to see if I get freezes or TDR issues.

Can bad drivers give a person major artifacting like what's shown in my videos?


----------



## Smo

I was secretly hoping that your card was already a refurbished RMA. I've never seen artifacts caused by drivers before. Maybe someone with a bit more experience can shed some light.


----------



## iARDAs

Quote:


> Originally Posted by *Smo*
> 
> I was secretly hoping that your card was already a refurbished RMA. I've never seen artifacts caused by drivers before. Maybe someone with a bit more experience can shed some light.


Me neither.

Maybe things like small flickering and such, but I would always think that artifacting has to be hardware related.

So far I am on the last beta and things are fine overall, but it's been such a short period of testing.


----------



## Masked

I'm stuck in a meeting and I'll only have my iPad this evening ~ I'll give you my suggestions at around 9pm EST ~ I apologize they're not instant like they normally are -- It's been a long day.


----------



## Smo

Quote:


> Originally Posted by *Masked*
> 
> I'm stuck in a meeting and I'll only have my iPad this evening ~ I'll give you my suggestions at around 9pm EST ~ I apologize they're not instant like they normally are -- It's been a long day.


Get back to your boring 'agree with the boss' session you cheeky git


----------



## Masked

Quote:


> Originally Posted by *Smo*
> 
> Get back to your boring 'agree with the boss' session you cheeky git


Wow, 3 hours of an HTML base protocol meeting ~ I literally wanted to ice-pick my head after 10 minutes...They could've just cut out the .net security correlations and dropped it on my desk...6 interns were passed out and drooling while the other 3 blatantly raced each other on iPads...Such useless crap.

Anyway, that looks to be a driver issue atm -- Although, I wouldn't rule out artifacting.

Roll back, try a different driver, the 270- I recommended earlier.

If the problem repeats, time to RMA the card...And believe it or not but, the RMA's are actually better than your vanilla.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Wow, 3 hours of an HTML base protocol meeting ~ I literally wanted to ice-pick my head after 10 minutes...They could've just cut out the .net security correlations and dropped it on my desk...6 interns were passed out and drooling while the other 3 blatantly raced each other on iPads...Such useless crap.
> Anyway, that looks to be a driver issue atm -- Although, I wouldn't rule out artifacting.
> Roll back, try a different driver, the 270- I recommended earlier.
> If the problem repeats, time to RMA the card...And believe it or not but, the RMA's are actually better than your vanilla.


Sorry about the meeting, Masked.







I have always hated those kinds of protocol meetings where people get together for the sake of getting together and nothing else. lol

The 4 symptoms I listed a few posts above also suggest to me that this could very well be a driver issue. I rolled back to the 295.51 beta, which was extremely stable for me, and things are stable for now. I had zero problems with the WHQL driver at first, but later on things started to happen. I will monitor the situation very closely once more, and IF I get artifacting again, this time the card will be sent for RMA.


----------



## Smo

Quote:


> Originally Posted by *Masked*
> 
> Wow, 3 hours of an HTML base protocol meeting ~ I literally wanted to ice-pick my head after 10 minutes...They could've literally cut out the .net security correlations and dropped it on my desk...6 interns were literally passed out and drooling while the other 3 blatantly raced each-other on Ipads...Such useless crap.
> Anyway, that looks to be a driver issue atm -- Although, I wouldn't rule out artifacting.
> Roll back, try a different driver, the 270- I recommended earlier.
> If the problem repeats, time to RMA the card...And believe it or not but, the RMA's are actually better than your vanilla.


Sounds pretty tedious bud, I must admit! Although meetings at our place are genuinely interesting most of the time as I'm actually interested in what they're about! Apart from the last couple that were about staying late the rest of the week and working the weekend to get this game sorted for release.

Coffee > *


----------



## iARDAs

Forceware 296.10 is out, guys.

I will check it right now.

I hope I won't have the above problems.


----------



## iARDAs

So far I have no issues. I am glad that Forceware 296.10 has been extremely stable for me. *knocks on wood*

Do I need an SLI motherboard to run a dedicated PhysX GPU?

Also, what is a good PhysX GPU that you could recommend alongside my 590?


----------



## AlbertMwugabi

Don't own a GTX 590, just wanna say how jelly I am right now; this thread made me so nostalgic and I miss my old GTX 295 so bad. The looks of these cards are just perfect.

@iARDAs, your motherboard does not need to be SLI capable to run a PhysX card. Not really sure about the second question, but maybe a GTX 460? It should be easy to find pretty cheap, and it would do the job great.


----------



## iARDAs

Quote:


> Originally Posted by *AlbertMwugabi*
> 
> Don't own a GTX 590, just wanna say how jelly I am right now; this thread made me so nostalgic and I miss my old GTX 295 so bad. The looks of these cards are just perfect.
> @iARDAs, your motherboard does not need to be SLI capable to run a PhysX card. Not really sure about the second question, but maybe a GTX 460? It should be easy to find pretty cheap, and it would do the job great.


Thank you for the answer, bro. Yeah, to be honest, we love our GPUs.









Will you be sticking with your 480, or SLI it later down the road?

So I will grab myself a second GPU. All I have to do is install it in my motherboard, it will come up in the Nvidia Control Panel, and I select that card for PhysX?

I am guessing I don't have to connect that GPU to my 590 with an SLI bridge either?

Edit: I might go for a 550 Ti if it can do the job.


----------



## Arizonian

Quote:


> Originally Posted by *iARDAs*
> 
> Thank you for the answer bro. Yeah to be honest we love our GPUs
> 
> 
> 
> 
> 
> 
> 
> 
> Will you be sticking with your 480, or SLI it later down the road?
> So I will grab myself a second GPU. All I have to do is install it in my motherboard, it will come up in the Nvidia Control Panel, and I select that card for PhysX?
> I am guessing I don't have to connect that GPU to my 590 with an SLI bridge either?
> Edit: I might go for a 550 Ti if it can do the job.


Exactly. No SLI bridge. Just plug it into PCIe slot #2 or #3. Go to Nvidia Control Panel / Set PhysX Configuration / then choose the GPU in the drop down menu for dedicated PhysX card.

You get higher FPS in games that use PhysX, since the dedicated PhysX card frees up your 590 to handle all the graphics rendering separately.


----------



## Image132

Quote:


> Originally Posted by *Arizonian*
> 
> Exactly. No SLI bridge. Just plug it into PCIe slot #2 or #3. Go to Nvidia Control Panel / Set PhysX Configuration / then choose the GPU in the drop down menu for dedicated PhysX card.
> You get higher FPS in games that use PhysX, since the dedicated PhysX card frees up your 590 to handle all the graphics rendering separately.


In theory, yes. I tried it using one of my older cards as the PhysX card with my 590 as the render horse. Well, I got lower FPS with a dedicated PhysX card in Batman: Arkham City than with just my 590... go figure.


----------



## AlbertMwugabi

Quote:


> Originally Posted by *iARDAs*
> 
> Thank you for the answer bro. Yeah to be honest we love our GPUs
> 
> 
> 
> 
> 
> 
> 
> 
> Will you be sticking with your 480, or SLI it later down the road?
> So I will grab myself a second GPU. All I have to do is install it in my motherboard, it will come up in the Nvidia Control Panel, and I select that card for PhysX?
> I am guessing I don't have to connect that GPU to my 590 with an SLI bridge either?
> Edit: I might go for a 550 Ti if it can do the job.


Just as Arizonian said, no need for an SLI bridge when running PhysX.

I'll be sticking with my 480s; they do perform well, but I would like cards that draw less power than these. So I will see what Kepler brings and maybe upgrade.


----------



## iARDAs

Thank you for the answers, guys. I will be playing some games that support PhysX first and then see if I need to get a GPU for PhysX.

However, Image132 said that having a dedicated GPU next to a 590 actually gave him lower FPS. This is making me rethink the whole thing from the beginning.

Can anyone else with a 590 and a dedicated PhysX GPU share their own experience?

Also, I hear that a 590 + PhysX GPU will make the 590 run HOTTER. Is that true?

Also, @Image132, which GPU did you use as a PhysX card?


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Thank you for the answers, guys. I will be playing some games that support PhysX first and then see if I need to get a GPU for PhysX.
> However, Image132 said that having a dedicated GPU next to a 590 actually gave him lower FPS. This is making me rethink the whole thing from the beginning.
> Can anyone else with a 590 and a dedicated PhysX GPU share their own experience?
> Also, I hear that a 590 + PhysX GPU will make the 590 run HOTTER. Is that true?
> Also, @Image132, which GPU did you use as a PhysX card?


The idea of a secondary card for PhysX is that it takes the load off the primary card...So of course it works...It would work with any other card...

Of course it will be hotter...You're now running 2 cards in the same area where you previously had 1 card...Your ambient will increase.

I have a 590 and a 580 as a dedicated Physx ~ I'm on water so, I don't care about the ambients but, my rig never goes above 35, anyway so, for me, it's not a big deal.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> The idea of secondary card for Physx is that it takes the load off the primary card...So of course it works...It would work with any other card...
> Of course it will be hotter...You're now running 2 cards in the same area where you previously had 1 card...Your ambient will increase.
> I have a 590 and a 580 as a dedicated Physx ~ I'm on water so, I don't care about the ambients but, my rig never goes above 35, anyway so, for me, it's not a big deal.


Thank you for the response.

I am using a custom fan profile for my GTX 590, and the hottest I see is around 80-81 at full load. Maybe sometimes a little spike up to 83, but it always comes back down. The fastest my GPU fan goes under this custom profile is 75%.

Should I be expecting temperatures around 90 degrees with a 2nd GPU?

If that's the case, I will not bother with a 2nd GPU for PhysX.

Also, how do you like your setup in PhysX games? Is it really worth it, Masked? Not that I will get a 580 as a PhysX GPU, but I found a brand new 550 Ti with a lifetime warranty that is factory OCed to 975MHz. I am seriously thinking of getting it if there is a performance increase.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Thank you for the response.
> I am using a custom fan profile for my 590 GTX and the hottest i see is around 80-81 on fullload. Maybe sometimes a little spike until 83 but it always comes back. The fastest i see my GPU fan is 75% under this custom profile.
> Should i be expecting a 90 degrees kinda hot when having a 2nd GPU?
> if thats the case i will not be bothered with a 2nd GPU for Physx.
> Also how do you like your setup in Physx games? Is it really worth it masked? Not that i will get a 580 for Physx GPu but i found a brand new 550Ti with life time warranty that is OCed factory at 975mhz. I am seriously thinking of getting it if there is a performance increase.


I don't know how your computer is cooled, so I can't comment on your temperatures.

You're adding a 2nd card to the mix, in the same area, so of course it will generate heat.

I don't really play any PhysX games in general and again, I can't comment on heat but, there is a performance increase...


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> I don't know how your computer is cooled so I can't comment on your temperatures.
> You're adding a 2nd card to the mix...In the same area so, of course it will generate heat.
> I don't really play any Physx games in general and again, I can't comment on heat but, there is a performance increase...


Thank you.

I am actually thinking of taking some detailed photos of my RIG and listing every single component that I have. I wonder if I can do something about the cooling of my rig without water cooling. Maybe I can add some more FANS that could help.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Thank you.
> I am actually thinking of taking some photos of my RIG in detail and tell every single component that i have. I wonder if i can do something about the cooling of my rig without water cooling. Maybe i can add some more FANS that could help.


Np, sorry I'm being so short ~ Talking on Skype w/a friend in Sweden while I answer your Q's...

If you throw up some photos, I can tell you what you can do to keep the heat down without W/Cing -- That's not a challenge.


----------



## iARDAs

@ Masked

My friend, please take your time; you can respond whenever you have time, as my questions are not urgent at all. Thank you so much for taking the time though.

Here are some pics I took. If more are needed, please let me know.

I took the pics with my iPhone; if they are not clear enough I can retake them with my Canon.

So here we go

This is how my desktop sits under my desk:



Here are the pics of the front, back and top of my case:





Now on to the interior photos






I have a feeling that, as things are right now, it's not so smart to add a second GPU to the system.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> @ MaskedI have a feeling that as things are right now, its not so smart to add a second GPU to the system.


Get some stand-offs so your case is raised @ 1-2in and add a fan blowing IN next to your PSU ~ That will lower your temperatures substantially.


----------



## iARDAs

I have my interior designer coming in 30 mins to do some work at the house. Getting a new aquarium and having custom things built for it.

The stand under my PC was built by the same guy, but for my older PC. I can now have him build me a new one, as he can design it.

So basically I should tell him that I want something that will make my case stand 1-2 inches above the wooden thing, so it doesn't touch it, right?

I always wanted to change it and this could be the perfect excuse.


----------



## iARDAs

Basically something like that?



Please don't be jealous of my incredible Paint skills


----------



## rush2049

Make it so your fans blow in the directions that I indicated with the green arrows. And if you are able to get a fan on the bottom next to the power supply (internally mounted, or externally), make it blow in as I indicated....



This will most likely be your best airflow solution without too much work.

In my opinion, unless you have an NVIDIA GPU sitting around doing nothing, don't buy a second card to do PhysX.... unless you are an extreme Batman: Arkham City fan and play it every day.....


----------



## iARDAs

This is the bottom of my PC



My guy is making a wooden block with wheels to go under it, so I can move it around. The wooden block will have 4 built-in stand-offs and the PC will stand on them, right under the circular legs of the case. The guy will come up with something that holds the PC securely. The stand-offs will be between 1 and 2 inches, around 3 cm to be exact.

About the fan near the PSU,

Thank you Rush for taking the time to draw the arrows. I have one question though, which I was also going to ask Masked: what kind of fan are we talking about here? Can I mount it on the case? If so, how?

Also, Rush, the way I understand your drawing is that if I make room at the bottom of my case, air will come in from there, mix with the CPU cooler's hot air, and exit from the top of my case? And I believe the arrow pointing right (the one next to the CPU cooler) is air coming into my case; that cold air will also heat up and leave from the top, am I right? I wonder whether that fan currently pulls air in from outside or blows it out. I've got to check.









EDIT: Some people claim that when the PSU is mounted at the bottom of the PC, they benefited from having its fan facing upwards. What do you guys think? Mine faces down.


----------



## Image132

Hey guys. I was using a 9600 as a dedicated PhysX card. I know, I know, it's old and that's probably why. I personally don't think it's worth it, iARDAs; the list of games that use NVIDIA PhysX is so short that it's not worth the money for another card. Rather save and buy another 590.

Here is the list of games that support physx:

http://www.geforce.com/Hardware/Technologies/physx/games

The bottom fan will help with 2 things. First, it will bring cool air to your GPU. Secondly, it will help move the hot air that your GPU dumps into your case up to the top to be vented.

The rear fan (or left fan, which will now be pulling air into your case) should help bring cool air to your CPU and cool it down. I'll do a diagram now, though I'm not so good at art









*EDIT


----------



## rush2049

Yes, we drew the same diagram.

If you are able to mount a new fan next to the power supply without the cables in the way, then do that. If you aren't able to, since you have someone building you a stand, you could mount it on the outside.

I am not familiar with that case, but find out what size that fan grill on the bottom is (probably 120mm or 140mm) and get the highest-CFM fan your ears can stand.....

For reference here is my case:


Every fan has a red arrow pointing in the direction it blows. (The only confusing one is the one on the far right; it blows away from the camera.)


----------



## Masked

Quote:


> Originally Posted by *rush2049*
> 
> Yes, we drew the same diagram.
> If you are able to mount a new fan next to the power supply without the cables in the way then do that. If you aren't able, since you have someone building you a stand, you would be able to mount it on the outside.
> I am not familiar with that case, but find out what size that fan grill is on the bottom (probably going to be 120mm or 140mm) and get the highest CFM fan your ears can stand.....


Even with a 1000rpm, 15dB fan, he's better off than with the nothing he has now.

I'm eyeballing it but, that appears to be a 120mm.


----------



## iARDAs

OK guys, I think I get it now.

Next to my PSU there are a bunch of cables. So you are all suggesting that I try to put those cables somewhere else and install a fan there. It won't be mounted, as there is no place to mount it, but I will just put it there anyway, connect it, and then it will pull cold air in from outside and pump it into the case.

How is this fan, for instance?
http://www.vatanbilgisayar.com/So%C4%9Futma%20Sistemleri/xilence-coo-xpf140-140mm-kasa-fani/productdetails.aspx?I_ID=39119

That's a 140mm

http://www.vatanbilgisayar.com/So%C4%9Futma%20Sistemleri/xigmatek-xsf-f1252-xsf-serisi-120mm-extreme-silent-fan/productdetails.aspx?I_ID=58382

and that's a 120mm

So I should buy something like these, right?

The question is where to put all those cables.

+rep to everyone


----------



## rush2049

There is a place to mount it... look at that bottom picture; those four screw holes are for screwing a fan on. You'll have to measure to find out what fan size those holes are for.



Also there are threads around here about fan choice... give the search feature a try.


----------



## iARDAs

Thank you. I just saw those holes RIGHT NOW while I was fiddling with my cables

Here is my latest cable management



Is this space enough for a fan to be mounted?

Sorry guys, I am a noob when it comes to fans. I really am. I am doing all this to get better airflow in my case so my 590 will last longer.

I am also going to ask the guys on the HAF 922 thread about the fan; maybe they can recommend a good fan they have been using.

Here comes a very noob question

The fan on the top left, near my CPU cooler, seems to be pulling air from inside and blowing it outside. Or did I feel it wrong? If so, do I have to reverse its direction or something like that?









Edit: I will be cleaning my case tomorrow; there is so much dust under it. Unbelievable.


----------



## rush2049

Yes, that is enough space for a fan; the opening is literally the dimensions of those holes + a few mm. The thickness will be similar to your other fans.

Switching the direction of a fan is simple: unscrew it, flip it so the other side is against the metal, then re-screw it.

If you are unsure of the direction of a fan (insensitive hands), do the paper test. Hold a piece of paper dangling next to the fan: if it gets blown away, the fan is blowing in that direction; if it gets sucked against it... well, you probably understand.


----------



## iARDAs

Thank you

Should I switch the direction of that upper-left fan even though I don't have a fan on the bottom yet?

Or should I wait?


----------



## rush2049

Basically, like Image132 explained, the bottom fan is going to give the 590 cooler air, and switching the back fan gives the CPU cooler air to work with. Both are then exhausted out the top of your case.

Switching that fan will help with CPU temps regardless of whether you get a new fan for the bottom or not.


----------



## iARDAs

I switched the fan. It was very easy now that I've got the idea. Thank you guys.



I will now fiddle with my case cables some more and see if I can do some good cable management. I will ask about this in the HAF 922 case topic.

Again, thank you so much guys; I learned things I did not know


----------



## iARDAs

Thanks to the guys on the Cooler Master HAF X thread, this is my latest cable management

I suppose there is plenty of room to put the fan now, and it will be more efficient.



My custom stand for my PC case will arrive in 2 weeks, but I can get the fan tomorrow. Should I wait for the custom stand so there will be room between the fan and the ground, or should I just go ahead and install the fan now?


----------



## rush2049

Just install it now; it will only get better later when you have the stand-offs.


----------



## iARDAs

I am ordering it tonight, once I decide which fan to buy



Edit: I ordered an Akasa Viper 140mm fan. I will have it on Saturday and install it. I hope I see at least a 5 degree difference at FULL load. Maybe then I can leave the fan profile at STOCK levels and not use 75%.

So my case is a mid-tower. Is this enough for my setup, or should I go for a FULL TOWER case for my 590?


----------



## Smo

Quote:


> Originally Posted by *iARDAs*
> 
> I am ordering it tonight when i decide which fan to buy
> 
> 
> 
> 
> 
> 
> 
> 
> Edit : I ordered an Akasa Viper 140mm fan. I will have it on Saturday and install it. I hope i can have at least a 5 degree difference on FULL load. I can maybe leave the fan monitor on STOCK levels and dont use 75%.
> So my case is a mid tower case. Is this enough for my setup? Or should i go for a FULL TOWER case for my 590?


Glad to see you've listened to the advice offered to you - a fan there will definitely improve your temperatures, that's for sure! The 590 is an awkward card to cool efficiently. Different cases offer marginally better solutions, but at the end of the day, if the card is operating well and within recommended temperatures, there's no real need to completely overhaul your system.

As for your fan profile, if you can live with the noise then there's no real reason to change it. That said, I've never left a GPU on the standard profile!


----------



## iARDAs

Quote:


> Originally Posted by *Smo*
> 
> Glad to see you've listened to the advice offered to you - a fan there will definitely improve your temperatures, that's for sure! The 590 is an awkward card to efficiently cool. Different cases offer marginally better solutions but at the end of the day if the card is operating well, and within recommended temperatures then there's no real need to completely overhaul your system.
> As for your fan profile, if you can live with the noise then there's no real reason to change it. That said, I've never left a GPU on standard profile!


I agree that different cases cool the 590 differently, as a friend of mine had a 590 and the highest he saw his GPU reach was the low 70s at stock speed and fan settings.

I am excited about Saturday

I will clean the case's interior,
I will do more cable management, as I have purchased a few things for that job,
and I will install my fan.

I am also thinking of upgrading the fan on the top of the case, but some people say that it's not necessary.

But I have a feeling that in a few months I will be changing all of my case fans to much-improved versions.

All for my 590 to run cooler









I love this thing.


----------



## Recipe7

I just caught up with the thread; I wasn't able to read it for the past week.

iARDAs, I hope your situation truly is a driver issue. Your card is fairly brand new, and I know it brings you much graphical joy, hehe. Good luck with your case too; you will be very relieved once you have it the way you want.


----------



## iARDAs

Quote:


> Originally Posted by *Recipe7*
> 
> I just caught up with the thread, I wasn't able to read it for the past week.
> iARDAs, I hope your situation truly is a driver situation. Your card is fairly brand new, and I know it brings you much graphical joy, hehe. Good luck with your case too, you will be very relieved once you have it the way you want.


I did some cable management today and this is the result



I am very proud of myself, and believe it or not, I feel a 2-3 degree difference now with my GPU. Maybe it's because of the weather, though.

When my fan arrives I MIGHT OC my 590 without tweaking the voltage, though I wonder if it is necessary.

The last 2-3 days I have had an urge to play games in 3D once more, but for instance I can't play Crysis 2 on ULTRA with the High Res pack in 3D at a steady 60 fps.
I guess I need 580 tri-SLI or 590 Quad SLI to do that. Who knows.

About the issues with my GPU, it seems so far that it's a driver issue. I can't believe a driver gave me artifacting. Weird. 296.10 has not been so great either, but it's much better than 295.71


----------



## Xraze

Does anyone know why my temps are so high (around 94°C at load)? I have a reference GTX 590 @ 0.963V. I also have a case fan blowing air right at the card.


----------



## rush2049

It's not always bad intake; it can be bad exhaust. You might not be extracting the hot air fast enough.

Also, use MSI Afterburner to set up a custom fan profile for the card... the more aggressive the profile, the cooler it will run.
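A custom fan profile like Afterburner's is just a piecewise-linear map from GPU temperature to fan duty, interpolating between the points you drag on the curve editor. A minimal sketch of the idea (the breakpoints are made-up example values, not a recommendation for any particular card):

```python
# Piecewise-linear fan curve: GPU temperature (°C) -> fan duty (%).
# Breakpoints are illustrative only; tune them to your own card and noise tolerance.
CURVE = [(40, 30), (60, 50), (75, 75), (85, 100)]

def fan_duty(temp_c):
    """Linearly interpolate fan duty between the curve's breakpoints."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]      # below the curve: idle duty
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]     # above the curve: full duty
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(50))  # halfway between the 40°C/30% and 60°C/50% points
```

The "more aggressive" profile rush2049 mentions just means steeper segments: the duty climbs faster for the same temperature rise.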


----------



## iARDAs

Hey folks, I actually started a new thread for this, as not many people look into this topic and I would like to get back to him as fast as I can.

But my friend has a problem with his 590




The problem happens with all drivers, and only when MULTI GPU is selected. Any ideas?


----------



## Smo

How is he connecting to his monitor (what cable/converters) and what refresh rates is he using?


----------



## iARDAs

Quote:


> Originally Posted by *Smo*
> 
> How is he connecting to his monitor (what cable/converters) and what refresh rates is he using?


He said he has used DVI, VGA-to-DVI and also HDMI.

Let me ask about the refresh rate.

Edit: Weird that this only happens with MULTI GPU. When he disables it, the problem goes away.

His last option is to reformat Windows. I honestly can't think of anything else for that problem. A weird one.


----------



## Smo

Quote:


> Originally Posted by *iARDAs*
> 
> He said he used DVI, VGA to DVI and also HDMI
> let me ask about the refresh rate.
> Edit : Weird that this only happens with MULTI GPU. When he disables that the problem goes away.
> His last option is to format windows. I honestly cant think of anything else about that problem. Weird one.


Reinstalling Windows is a last resort.

Ask him to try looking in the NVIDIA Control Panel, in 3D Settings under the 'Global' tab. Has he got the card set to Multi-Display or Single Display mode?

I'd also recommend sticking to DVI for the time being.


----------



## iARDAs

Quote:


> Originally Posted by *Smo*
> 
> Reinstalling windows is a last resort.
> Ask him to try looking in the NVIDIA Control Panel and in 3D Settings under the 'Global' tab. Has he got the card set to Multi-Display or SIngle Display mode?
> I'd also recommend sticking to DVI for the time being.


Yeah, reinstalling is a last resort. He has his Windows on an SSD so it will be quick, but downloading and installing lots of games will be a pain.

I read all over the internet that a few people also had this problem, but theirs was solved by different sets of drivers.

Not in his case, though.

EDIT: Refresh rate is at 60. He also tried both multi- and single-display modes in the NVIDIA Control Panel, but no luck.


----------



## Smo

Quote:


> Originally Posted by *iARDAs*
> 
> Yeah reinstalling is a last resort. He has his windows on a SSD and it will be a fast business but downloading and installing lots of games will be a pain.
> I read all over the internet that few people also had this problem but the problem was solved with different sets of drivers.
> but not on his case.
> EDIT ; Refresh Rate is at 60. And he also tried with multi and single displays, under the Nvidia Control Panel but no luck.


Hmm. It's possible that there's a conflict in refresh rates between the Windows Control Panel and the NVIDIA Control Panel, might be worth checking they are the same.

Also - do you know if he has tried using the cable in either of the other two DVI ports?


----------



## iARDAs

Quote:


> Originally Posted by *Smo*
> 
> Hmm. It's possible that there's a conflict in refresh rates between the Windows Control Panel and the NVIDIA Control Panel, might be worth checking they are the same.
> Also - do you know if he has tried using the cable in either of the other two DVI ports?


Yeah, both refresh rates are 60,

and he has tried all 3 DVI ports...


----------



## Smo

Quote:


> Originally Posted by *iARDAs*
> 
> Yeah both refresh rates are 60
> and he used all 3 DVI ports...


Does the flickering go away if he uses windowed mode?


----------



## iARDAs

Quote:


> Originally Posted by *Smo*
> 
> Does the flickering go away if he uses windowed mode?


Yes, the flickering goes away in windowed mode. It only happens in fullscreen mode.


----------



## Masked

So, I'm actually leaving Alienware and, as of this evening, have sold most of my related stuff.

Unfortunately, that means my 590 driver write-up will not happen...For that, I'm sorry guys.

I'm actually going to start my own business; I already have a vendor backing me, and some of the interns are coming with me when their contracts expire, so not much will really change.

I wish everyone in here the best of luck ~ You can still PM me, ask questions etc -- Ironically, I can do significantly more elsewhere than I ever could with AW, and I'm actually very glad it ended positively.

I will still be doing my SWTOR build -- I kind of saw this coming.

Again, any questions, feel free to ask but, most of all -- good luck


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> So, I'm actually leaving Alienware and as of this evening, sold most of my related stuff.
> Unfortunately, that means my 590 driver write-up will not happen...To that, I'm sorry guys.
> I'm going to start my own business, actually and already have a vendor backing + some of the interns are coming with when their contracts expire so, not much will really change.
> I wish everyone in here the best of luck ~ You can still PM me, ask questions etc -- I can ironically, do significantly more elsewhere, than I ever could with AW and I'm actually very glad it ended, positively.
> I will still be doing my SWTOR build -- I kind of saw this coming.
> Again, any questions, feel free but, good luck, most of all -- Good luck


Man, first of all, thanks for everything so far and BEST OF LUCK TO YOU in your new venture.

Whatever you do, with your knowledge I AM SURE things will turn out as you hope.

So this means you won't be using a 590 anymore? The thread will miss your expertise, but it's great that you will still be on the OCN forums, so we can contact you via PM when there is a problem.

P.S. I can see you getting a 690 or 790 when it is released. I will follow in your footsteps


----------



## iARDAs

Wow, Crysis 2 is really pushing the card in DX11 Ultra with the High Res Pack on.

I can't really get above 70 fps, and it can drop as low as 50.

Also, I noticed something. When playing Crysis 2 my temps were around 82 degrees. I paused the game and when I came back the temperatures were around 87-88 degrees.

I had noticed this with other games before, too. For some reason, when I pause a game the GPU gets hotter.

Any ideas?


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Wow Crysis 2 is really pushing the card in DX11 Ultra with High Res Pack on.
> i cant really get above 70 fps. I can get as low as 50
> Also I noticed something. When playing Crysis 2 my temps were around 82 degrees. I paused the game and when i came back the temperatures were around 87-88 degrees.
> I had noticed this with other games before too. For some reason when i pause a game the heat of the GPU gets more.
> Any ideas?


Without the fan on the bottom, you're literally creating a vortex in the middle of your case.

It's a question of air dynamics within a closed area.

The hot air is circling and only so much of it is getting out -- it just continues in a constant cycle.

In fact, I bet if you played a game with the side of your case off -- you'd notice a big difference.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Without the fan on the bottom, you're literally creating a vortex in the middle of your case.
> It's a question of air dynamics within a closed area.
> So the hot air is circling and only so much hot air is getting out -- It just continues in a constant cycle.
> In fact, I bet if you played a game with the side of your case, off -- You'd notice a big difference.


You have no idea how badly I am dying to install my fan tomorrow. I am looking for more places to install fans, but so far I can't see any.

However, I might replace the 200mm fan on the top of my case with a Cooler Master MegaFlow fan.

I can take that panel of the case off, btw, but then my feet could end up inside my hardware at any time



as the case is sitting below my desk, right near my feet.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> You have no idea how i am dying to install my fan tomorrow. I am looking for more possibilities to install fans to but so far I cant see any.
> However i might change the 200mm fan on the top of my case with a coolermaster megaflow fan.
> I can take that part of the case out btw but my feet could anytime enter inside my hardware
> 
> 
> 
> 
> 
> 
> 
> as the case is sitting below my desk right near my feet.


I put my big toe into a 2500rpm Enermax once ~ Imagine how exciting that was



Your issue is that nothing is funneling air out of the case; you're just expecting it to exit, so it isn't.

And when the card idles and the fan speed slows down, nothing is pushing the hot air out as forcefully as before, so it vortexes.

With the fan on the bottom, you'll have a constant stream pushing the air to leave, and you'll cool the card down significantly.


----------



## Smo

Quote:


> Originally Posted by *iARDAs*
> 
> Wow Crysis 2 is really pushing the card in DX11 Ultra with High Res Pack on.
> i cant really get above 70 fps. I can get as low as 50
> Also I noticed something. When playing Crysis 2 my temps were around 82 degrees. I paused the game and when i came back the temperatures were around 87-88 degrees.
> I had noticed this with other games before too. For some reason when i pause a game the heat of the GPU gets more.
> Any ideas?


When playing a game, GPU usage is constantly changing due to differences in what is being rendered at any given time. In some games, if you pause when GPU usage is very high (90+%) and then walk off for, say, 20 mins, your card will have been working very hard that whole time, so temperatures will peak. Not all games are like this; some force v-sync in their menus to stop the GPU working unnecessarily hard.

StarCraft II was forced to implement this feature after people's GPUs ran at full pelt in the menu, overheating and dying!
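The fix those games adopted amounts to capping the frame rate while a menu or pause screen is up, so the GPU stops redrawing an identical image thousands of times per second. A rough sketch of the idea (pure timing logic, no real rendering; the 30 fps cap and `draw_frame` name are arbitrary examples):

```python
import time

PAUSED_FPS_CAP = 30  # arbitrary example cap for menu/pause screens

def frame_limiter(paused, frame_start):
    """Sleep out the rest of the frame budget while paused; no-op during gameplay."""
    if not paused:
        return  # render as fast as possible in-game
    budget = 1.0 / PAUSED_FPS_CAP
    elapsed = time.monotonic() - frame_start
    if elapsed < budget:
        time.sleep(budget - elapsed)

# Typical use inside a render loop (draw_frame() is a stand-in, not a real API):
#   start = time.monotonic()
#   draw_frame()
#   frame_limiter(game_is_paused, start)
```

With the cap in place the GPU sits mostly idle on the pause screen instead of pinned at 100%, which is exactly why the temperatures iARDAs saw climbed after pausing.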


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> I put my big toe into a 2500 rpm enermax, once ~ Imagine how exciting that was
> 
> 
> 
> 
> 
> 
> 
> 
> Your issue is that nothing is funneling air to exit the case, you're just expecting it to exit, so it's not.
> And when it idles and the fan speed slows down, nothing is pushing the hot air out as forcefully as it was so, it vortexes.
> With the fan on the bottom, you'll have a constant stream/push for the air to leave and you'll cool the card down, significantly.


Thank you so much. If it wasn't for you I would never have got that fan. Matter of fact, I never even saw that there was a place to put a fan there. Before, there was a bunch of cables there; now it's cleaned up and ready for a fan.

This is the fan I purchased

It's the best I could find here. They claim 30% more airflow than most top-of-the-line fans. We shall see.

Next month I will also change the fans that are on the rear top (120mm) and on the top (200mm). I hear great things about the MegaFlow fan.

Also, let's not forget that my stand is not here yet, so there is not enough airflow at the bottom of the case.

P.S. Thank God your big toe is alright


----------



## iARDAs

Quote:


> Originally Posted by *Smo*
> 
> When playing a game the GPU usage is constantly changing due to differences in what is being rendered at any given time. In some games if you pause when GPU usage is very high (90+%) then walk off for say 20 mins, your card will have been working very hard that whole time so temperatures will peak. Not all games are like this, and have forced v-sync in their menus to stop the GPU overworking unnecessarily.
> Starcraft II were forced to implement this feature after people's GPUs were working at full pelt in the pause menu, overheating and dying!


That's what I had thought, actually. I had paused Crysis 2 right when an explosion happened. But now I know better. It seems safer to exit a game unless you are just taking a toilet break or grabbing a soda from the kitchen.

I would hate to pause my game, watch a soccer match, and come back 2 hours later to find my GPU has burned out


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> thank you so much. If it wasnt for you i would have never ever got that fan. Matter of fact i never ever saw that there was a place to put a fan there. Before there were bunch of cables there, now it got cleaned and ready to have a fan.
> This is the fan i purchased
> Its the best I could find here. They claim to have 30% airflow than most top of the line fans. We shall see.
> Next month i will also change the fans that are on the rear top (120mm) and on the top (200mm). I hear great things about the megaflow fan.
> Also lets not forget that my stand is not here yet. so there is not enough airflow on the bottom of the case.
> P.S. thanx God that your big toe is alright


It actually fractured my big toe ~ Tells you how strong those fans are.

That being said, make sure you put the fan in so the air is blowing upwards ~ Pressure isn't something you really need to worry about in that case but, you may want a better top fan in the future; a higher CFM up there would create some suction -- Once you understand how fans work ~ static pressure ~ bla bla bla ~ you'll get where to put your fans etc.

Rush drew a fantastic picture that I think he deserves rep for


----------



## Smo

Quote:


> Originally Posted by *Masked*
> 
> It actually fractured my big toe ~ Tells you how strong those fans are.
> That being said, make sure you put the fan in so the air is blowing upwards ~ And, pressure isn't something you really need to worry about in that case but, you may want a better top fan in the future, a higher CFM would create some suction -- Once you understand how fans work, etc ~ Static pressure ~ bla bla bla ~ You'll get where to put your fans etc.
> Rush, drew a fantastic picture that I think he deserves rep for


Stay with us, I'll miss you


----------



## Masked

Quote:


> Originally Posted by *Smo*
> 
> Stay with us, I'll miss you


Actually, I have some good news too: the interns saved my progress on the drivers and have it on a USB stick for me.

Since I currently don't have the resources to putter around on a whiteboard all day, I'll be sending this out to Rush and a few others who asked to help.

If anyone else is interested, I'll shoot it to them too.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> It actually fractured my big toe ~ Tells you how strong those fans are.
> That being said, make sure you put the fan in so the air is blowing upwards ~ And, pressure isn't something you really need to worry about in that case but, you may want a better top fan in the future, a higher CFM would create some suction -- Once you understand how fans work, etc ~ Static pressure ~ bla bla bla ~ You'll get where to put your fans etc.
> Rush, drew a fantastic picture that I think he deserves rep for


Yep, I will make sure the fan blows air upwards. And I will get a better 200mm fan.
I came across this picture of my case in the HAF thread.

http://www.coolermaster-usa.com/product.php?product_id=2919&product_name=HAF%20922

It seems that I can install a 200mm fan on the side of the case. It says optional. Would this help lower my 590's temperatures as well?

Also, I repped most of the posts that helped me through this situation. I learned new things with every other post. Literally.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Yep i will make sure the fan blows hair upwards. And i will get a better 200mm fan.
> I came across this picture of my case in the Haf thread.
> http://www.coolermaster-usa.com/product.php?product_id=2919&product_name=HAF%20922
> It seems that i can install a 200mm fan on the side of the case. It says optional. Would this help lowering my 590s temperatures as well?
> Also i repped most of the posts that helped me through in this situation i am in. I learned new things with every other post. Literally.


Yes, but then you face a pressure issue... I'll look at it later, but I don't think that would really benefit you until you get a better "exit" fan.

The idea is to create a vacuum...ultimately.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Yes, but then you face a pressure issue... I'll look at it later, but I don't think that would really benefit you until you get a better "exit" fan.
> The idea is to create a vacuum...ultimately.


That's true. Maybe that side fan might ruin things.

What about two 120mm fans instead of the 200mm on top? I was told I can do that.

Would they perform better?


----------



## rush2049

What you want to look at is the CFM (cubic feet per minute). It measures the volume of air that a fan can move during a period of time.

Now, not all manufacturers measure it the same way, so if some third-party fan manufacturer claims significantly higher numbers, I question their validity....

But to answer your question: if your single 200mm fan has a higher CFM than the two 120mm fans added together, then don't change. But if that 200mm fan is one that came with your case (almost certainly true), I would be willing to bet that a pair of high-CFM fans would beat it soundly.
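The comparison above is just an airflow sum. As a minimal sketch (the CFM figures below are illustrative examples, not spec-sheet values; check your own fans' ratings):

```python
def total_cfm(*fans):
    """Sum the rated airflow (cubic feet per minute) across a set of fans."""
    return sum(fans)

stock_top = total_cfm(110)    # e.g. a typical stock 200mm case fan
pair_120 = total_cfm(86, 86)  # e.g. two high-CFM 120mm fans

print(pair_120 > stock_top)  # True: the pair moves more air on paper
```

Keep in mind rated CFM is measured at zero static pressure, so real-world results behind grilles and filters will be lower for both options.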


----------



## Agenesis

Hey guys, I'm just dropping by to ask if there have been any new revisions of the 590 since release, particularly around the VRM area. Thought some actual 590 owners might be of help.


----------



## Juggalo23451

Quote:


> Originally Posted by *Agenesis*
> 
> Hey guys, I'm just dropping by to ask if there have been any new revisions of the 590 since release, particularly around the VRM area. Thought some actual 590 owners might be of help.


there is no revision and there will be none


----------



## Agenesis

Quote:


> Originally Posted by *Juggalo23451*
> 
> there is no revision and there will be none


Damn, I did a search in the thread and there was some talk of revisions, so I got a little confused.


----------



## rush2049

Technically there were revisions, but officially the answer you are looking for is: No there were no revisions...

(there were revisions that people with access to review cards saw, and there is a revised card with a different bios as default, but the differences are slight)


----------



## Masked

Quote:


> Originally Posted by *Juggalo23451*
> 
> there is no revision and there will be none


There are and were several revisions of this card, and late in the game they swapped VRMs.

It's illustrated earlier in the thread, because there are also two different versions of water blocks ~ pre-revision and post-revision.

Rush is correct that there is no OFFICIAL revision, but there were several hardware revisions of the card.


----------



## Xraze

Count me in, 725/1800.









http://www.techpowerup.com/gpuz/a4dc6/


----------



## iARDAs

My fan will be shipped to me on Monday so i will have it on Tuesday. Sux.


----------



## kzinti1

Quote:


> Originally Posted by *Agenesis*
> 
> Hey guys, I'm just dropping by to ask if there have been any new revisions of the 590 since release, particularly around the VRM area. Thought some actual 590 owners might be of help.


You're probably still confused, so I can assure you that the answer given by Masked is absolutely correct.

Yes, there are at least two different revisions of these cards. Whether you can properly fit a waterblock on your specific card(s) is still a gamble.

I've already had to return a block that wouldn't fit. It was a rather large hassle that I chose not to go through again.

If, *IF* there is a GTX690 made available, watercooled and from EVGA, then both of my GTX590 Classified's are definitely going to be for sale.

Hell, I'm even considering GTX 680's in tri-SLI. Especially if Nvidia screws us over again as they did by undervolting all of the 590's just because a few dumbasses intentionally borked their cards just because they could, then posted it on Facebook to hurt both Nvidia and all the people who bought these cards.

I'm completely convinced this was done by some jealous AMD fanboys since the 590's blew their Radeon 6990's away.

BTW, I've posted this before: http://vr-zone.com/articles/revised-geforce-gtx-590-cards-in-june/12222.html, read it yourself and decide who's right.


----------



## Agenesis

Quote:


> Originally Posted by *kzinti1*
> 
> You're probably still confused, so I can assure you that the answer given by Masked is absolutely correct.
> 
> Yes, there are at least two different revisions of these cards. Whether you can properly fit a waterblock on your specific card(s) is still a gamble.
> 
> I've already had to return a block that wouldn't fit. It was a rather large hassle that I chose not to go through again.
> 
> If, *IF* there is a GTX690 made available, watercooled and from EVGA, then both of my GTX590 Classified's are definitely going to be for sale.
> 
> Hell, I'm even considering GTX 680's in tri-SLI. Especially if Nvidia screws us over again as they did by undervolting all of the 590's just because a few dumbasses intentionally borked their cards just because they could, then posted it on Facebook to hurt both Nvidia and all the people who bought these cards.
> 
> I'm completely convinced this was done by some jealous AMD fanboys since the 590's blew their Radeon 6990's away.
> 
> BTW, I've posted this before: http://vr-zone.com/articles/revised-geforce-gtx-590-cards-in-june/12222.html, read it yourself and decide who's right.


Glad to hear there are new, more stable revisions, because I've been thinking of replacing my 7970 with a 590. The only thing stopping me now is Kepler, but even then I doubt it'll outperform the 590, given that people are saying it trades blows with a 7970. Man, I hate this time of year, where you feel you need to upgrade even though you don't need to, and there is always something new and possibly better less than a month away.


----------



## ekindbest

Can I be in the 590 club? I need some advice from the pros on here (if this is not the right place, please kindly guide me to the right page).









It's an EVGA GTX 590 with an XSPC waterblock, installed the second the card came out of its EVGA box. The card idles at 41°C, and when I play BF3 the max would go to 90°C, until I added a second rad (480 Black Ice Stealth) that brought things down a few degrees to 35°C idle / 87°C max. It's on its own dedicated loop: 1/2" tubing, a DDC (MCP355) pump, EK coolant, and a 140mm Phobya radiator in push/pull, with the second 480 Black Ice in push/pull added to bring the max down to 87°C. BTW, the thermal paste I used is Cooler Master High Performance paste; that's the only thing I had lying around when I was installing.

Does anyone know if that's normal, or is something wrong? I do have a D5 pump still in the box, but I'm too lazy to change the whole loop, which would involve a new res and a new mounting location, because I'm using the dual-bay XSPC res with the dual pumps.

My PC spec (if some1 needs it):

Asus Rampage IV Extreme (WC'd)
i7-3960X @ 4.25GHz
Corsair Dominator 32GB RAM (WC'd)
EVGA GTX 590 (WC'd)
OCZ SSD 128GB
WD 1TB
CM 1200W PSU
Corsair 800D case


----------



## Smo

There's something very wrong there dude - possibly an air bubble in the block.


----------



## ekindbest

Are there any other factor(s)? BTW, should one 480 Black Ice rad be sufficient to cool the 590?


----------



## Smo

Quote:


> Originally Posted by *ekindbest*
> 
> Are there any other factor(s)? BTW, should one 480 Black Ice rad be sufficient to cool the 590?


Easily mate, yes. I would typically have suggested three things:

- poor flow.
- an air pocket in the loop.
- inefficient TIM/poor application.

Your idle temps don't really suggest that there's a flow problem, to be honest, but the load temp is ~30°C higher than I'd expect, which could be explained by air in a block. The TIM always plays a part, but a difference like that is too large, really. You're running just the one card, right?


----------



## ekindbest

Yup. Any suggestions on getting the air bubble out? I thought I had them all out by tilting the shT outta the case during filling lol


----------



## Masked

Quote:


> Originally Posted by *Agenesis*
> 
> Glad to hear there are new, more stable revisions, because I've been thinking of replacing my 7970 with a 590. The only thing stopping me now is kepler but even then I doubt it'll outperform the 590 given that people are saying it trades blows with a 7970. Man I hate this time of the year where you feel you need to upgrade even thought you don't need it and there is always something new and possibility better less than a month away.


It was never a question of stability -- There were 0 issues with stability, period.

The issue was that, mid-run, they decided to compensate for what we in the business call "owner-operator stupidity", or OOS.

They beefed up the back end so that, if someone DID decide to do what Sweclockers and others did (again, against a major press release), they wouldn't meet the same end.

Unfortunately, I think I've been through the most 590s on this forum







~ The beefed-up VRMs didn't really help all that much.

Since I now own my own business, I was asked by many potential customers to keep current, and thus I'm now running something else... I do miss my 590 tremendously, and with EVGA's new RMA policy, I miss it a little bit more.

Still, it should be noted that, aside from OOS, there were never any issues with this card from a hardware standpoint; it's always been a driver-related issue, especially when it comes to throttling.


----------



## Smo

Quote:


> Originally Posted by *ekindbest*
> 
> yup, any suggestions on getting the air bubble out? I thought I had them all out by tilting the shT outta the case during filling lol


Rocking the case from side to side (gently of course) while the loop is running could sort it.


----------



## ekindbest

That's been done so many times............ Appreciate the help; I'll try a few more times, and if that doesn't sort the problem out, I think I need to RMA something.


----------



## iARDAs

Hey folks

When I get my fan tomorrow and install it, if I can see some temperature decrease I am thinking of getting into OCing.

So can I OC the GPU around 5-10% without voltage tweaking?

I remember OCing my 460*M* 20% without a voltage tweak.

However, I couldn't OC my factory-OCed 570 card 5% without a voltage tweak.


----------



## rush2049

You can get about 630-635 MHz without any voltage tweaks;

at 0.963 V you can get 700 MHz....

I don't go higher than that, as I am on air....
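Those figures sketch a rough clock-versus-voltage curve. For scale, a minimal Python sketch of the headroom they imply, assuming the GTX 590's 607 MHz reference core clock (the clock/voltage points are one owner's anecdote, not a spec):

```python
# Clock/voltage points for a GTX 590 on air, taken from the post above.
# Every card differs, so treat these as anecdote, not a guarantee.
OC_POINTS = [
    (None, 635),   # no voltage tweak: ~630-635 MHz
    (0.963, 700),  # 0.963 V: ~700 MHz, the poster's ceiling on air
]

STOCK_MHZ = 607  # GTX 590 reference core clock

def headroom_pct(oc_mhz, stock_mhz=STOCK_MHZ):
    """Percent gain of an overclock over the stock core clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(round(headroom_pct(700), 1))  # ~15.3 % over the 607 MHz stock
```

So the no-tweak ceiling is only a ~4-5% bump, while the 0.963 V point buys roughly triple that, at the cost of extra heat on air.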


----------



## iARDAs

Quote:


> Originally Posted by *rush2049*
> 
> You can get about 630-635 MHz without any voltage tweaks;
> at 0.963 V you can get 700 MHz....
> I don't go higher than that, as I am on air....


Cool.

How hot were you running at stock, and how hot are you running now, since you are also on air?

Also, did running at 700 MHz actually benefit you in games?

I might follow your route but carefully.


----------



## Masked

Quote:


> Originally Posted by *rush2049*
> 
> You can get about 630-635 MHz without any voltage tweaks;
> at 0.963 V you can get 700 MHz....
> I don't go higher than that, as I am on air....


Yeah but, if you're already running a primary card -- on certain motherboards you can't OC one card without pumping the same voltage into the other PCIe slots...

Just be aware of that...


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Yeah but, if you're already running a primary card -- on certain motherboards you can't OC one card without pumping the same voltage into the other PCIe slots...
> Just be aware of that...


I will only be running this 590. I am not going to be purchasing a 2nd GPU for PhysX. At least for now.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> I will only be running this 590. I am not going to be purchasing a 2nd GPU for PhysX. At least for now.


I was just saying: if you intend on running a PhysX card, keep in mind that when tweaking the voltage on one slot, you're technically also doing it lane-wide.

There are SOME exceptions but, not many.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Cool.
> How hot were you running on stock and how hot are you running now since you are also on air?
> 
> Also did having 700 mhz actually benefit in games?
> 
> I might follow your route but carefully.


And to answer this, atm it's hit/miss ~ Sometimes, yes and sometimes not -- It can cause crashing, artifacting etc ~ Be sure to dial it back if you face any issues.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> And to answer this, atm it's hit/miss ~ Sometimes, yes and sometimes not -- It can cause crashing, artifacting etc ~ Be sure to dial it back if you face any issues.


Cool.

Taking it to 635 MHz from 605 is not really a great improvement, but I might go for it. And then I will probably tweak the voltage; however, I don't want to void my warranty.

In two years, when I build a new rig, I will make sure it is on water.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> I was just saying, if you intend on running a physx card, keep in mind that when tweaking the voltage on 1 slot, you're also, technically doing it lane-wide.
> There are SOME exceptions but, not many.


I actually didn't know that. Thank you


----------



## iARDAs

Hmmm interesting.

I got into Asus Suite to look at the fans for my case and selected TURBO for the Chassis Fan profile (before, it was Standard). I used to get 82 degrees in Crysis 2; now I get around 77. I wonder if it had a real effect, or maybe that level wasn't intensive enough.

Also, I have 3 fans in my chassis but there is 1 profile; am I right in guessing that that 1 profile is controlling all of them?

Wow, I can't wait to get my hands on my new fan.

Also, right now I am seriously thinking of replacing the 200mm fan on the top with two 120mm fans.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Hmmm interesting.
> I got into Asus Suite to look at the fans for my case and selected TURBO for the Chassis Fan profile (before, it was Standard). I used to get 82 degrees in Crysis 2; now I get around 77. I wonder if it had a real effect, or maybe that level wasn't intensive enough.
> Also, I have 3 fans in my chassis but there is 1 profile; am I right in guessing that that 1 profile is controlling all of them?
> Wow, I can't wait to get my hands on my new fan.
> Also, right now I am seriously thinking of replacing the 200mm fan on the top with two 120mm fans.


Just make sure you have the proper CFM or you're wasting your time.

Like I said before, it's all about pressure ~ Very simple stuff.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Just make sure you have the proper CFM or, you're wasting your time.
> Like I said before, it's all about pressure ~ Very simple stuff.


Unfortunately, all I can find about my top fan is that it is 700 RPM.

I cannot find the CFM value for it.

For example, the new fan that I will be mounting on the bottom is 110 CFM.
That is a 140mm one.

There is also a 120mm version of the same fan; I am guessing the CFM for it is around 110 too.

But for the love of God, I can't find out the CFM of my top fan.

EDIT: Just found out that the 200mm fan on the top is 110 CFM.

Also found out that the 120mm Akasa Viper is like 86 CFM. Will grab 2 of those puppies.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> So, I'm actually leaving Alienware and as of this evening, sold most of my related stuff.
> Unfortunately, that means my 590 driver write-up will not happen...To that, I'm sorry guys.
> I'm going to start my own business, actually and already have a vendor backing + some of the interns are coming with when their contracts expire so, not much will really change.
> I wish everyone in here the best of luck ~ You can still PM me, ask questions etc -- I can ironically, do significantly more elsewhere, than I ever could with AW and I'm actually very glad it ended, positively.
> I will still be doing my SWTOR build -- I kind of saw this coming.
> Again, any questions, feel free but, good luck, most of all -- Good luck


Awesome news for you Masked! Congratulations!

Hopefully your departure was as congenial as possible.

Definitely put me down as a potential customer, I trust you'll stick around on OCN so you'll be reachable when it's time to buy!

As for the drivers, while I don't have any experience, I am totally down with throwing a donation at Rush or whoever can work on them, for their effort and, especially, the time it takes to make them work. I'd spend good money to get a set of drivers that could really tap into this card and let her stretch her legs.


----------



## kzinti1

I know I can shut down unused PCIe slots on this mobo, but never bothered.

Should I? Or just leave those switches alone?


----------



## ekindbest

One more thing: I kinda just realized that GPU 2's temp will never go over 55°C while GPU 1 is around 73°C. Is that because of the air bubble, or something else? BTW, the water flows in this order: res, pump, to the 420mm rad, then into GPU 2 and out of GPU 1, then to the 140mm rad to cool the hot coolant off, and back to the res.


----------



## Saizer

Quote:


> Originally Posted by *ekindbest*
> 
> One more thing: I kinda just realized that GPU 2's temp will never go over 55°C while GPU 1 is around 73°C. Is that because of the air bubble, or something else? BTW, the water flows in this order: res, pump, to the 420mm rad, then into GPU 2 and out of GPU 1, then to the 140mm rad to cool the hot coolant off, and back to the res.


Hmmm, both GPUs are at the same % (percentage) of use when that happens? (You can check that with MSI AB)


----------



## ekindbest

Both GPUs working at 94-99%


----------



## Saizer

Quote:


> Originally Posted by *ekindbest*
> 
> Both GPUs working at 94-99%


Well, let me give you my humble opinion. I think the problem is either with the waterblock (one GPU is getting colder than the other due to a wrong installation of the kit, or some other unknown reason), OR maybe the temperature sensor is failing.

Either way, what I recommend you do before attempting to reassemble the waterblock onto the card is:

1. Try a different Nvidia driver, preferably the 185.
2. Try a different version of MSI Afterburner.
3. If you have your CPU OCed, turn it back to its stock values (believe me, I know why I'm telling you this).
4. Try the card in a different PCIe slot.

If all of the above fails, then check the installation of the waterblock and carefully check whether both GPUs are receiving the same amount of cooling. If you consider that's OK, then (for me) it's a problem with the temperature sensor.


----------



## jethsmart

Hi, I would like to join this GTX 590 club. I have two EVGA GTX 590s in a Quad SLI setup. Both video cards are running at stock settings.


----------



## Saizer

Quote:


> Originally Posted by *jethsmart*
> 
> Hi, I would like to join this GTX 590 club. I have two EVGA GTX 590s in a Quad SLI setup. Both video cards are running at stock settings.


mother of god


----------



## ekindbest

The temp sensor on the 590 card itself, right.......







.............I don't have MSI Afterburner; I'm using EVGA Precision.


----------



## Masked

Quote:


> Originally Posted by *ekindbest*
> 
> The temp sensor on the 590 card itself, right.......
> 
> 
> 
> 
> 
> 
> 
> .............I don't have MSI Afterburner; I'm using EVGA Precision.


It's an air bubble... Plus, core 2 is supposed to run a few °C hotter -- just shake your machine...

The only other issue with the block could be core 2 contact, so your other option is to re-seat the block; but by no means is this an error in any software beyond a temp sensor.

So at this point you have 3 possible factors, 2 of which actually make sense.

1) Air Bubble
2) Block seated incorrectly
or
3) Temp sensors are off (Which, they always are, anyway)


----------



## ekindbest

so there's still an air bubble on gpu 1....I'll give it a few more shakes.


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> Awesome news for you Masked! Congratulations!
> Hopefully you departure was as congenial as possible.
> Definitely put me down as a potential customer, I trust you'll stick around on OCN so you'll be reachable when it's time to buy!
> As for the drivers, while I don't have any experience, I am totally down with throwing a donation at Rush or whomever can work on them for their effort and, especially, time it takes to make them work. I'd spend good money to get a set of drivers that could really tap into this card and let her stretch her legs.


Thank you sir























The ultimate irony is that I'm now actually sponsored by the same companies that refused to sponsor AW because of Dell -- So, booya to that!

I'm not leaving OCN nor, this thread since I'm still working on the drivers.

Once I get settled and my work-space is built (The best companies in the world started in the basement) -- I'm going to apply to be a vendor, here...Actually.

I already have 20-30 builds lined up including some major players in NYC...

Will be PM'ing Rush in the upcoming weeks and we'll get started on this debacle...Don't worry, will have something for you all soon.

~~

GPU temp fluctuations on the same block are only caused by air bubbles or by the block not being seated properly -- you really don't have a whole list of options to choose from; it's a or b... It really is that simple.

Temp sensors have been wrong since the stone age -- they're just a semi-accurate "guess"...

A 20°C delta tells you it's either/or... If bleeding the air doesn't help -- the block isn't seated properly... Maybe you didn't use enough paste... Maybe you used too much -- it can only be so many things.


----------



## iARDAs

I wish I was living in the USA so you could build me a rig in the next 2 years, Masked 

Anyhow, my fan finally shipped today. I got the online invoice. I will never work with them again, as they first said I would have the fan on Saturday, then Tuesday; I will have it tomorrow. I bought my 590 from them too and had it the next day, but they screwed up this time.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> I wish i was living in USA so you could build me a rig in the next 2 years Masked
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyhow, my fan finally shipped today. I got the online invoice. I will never work with them again, as they first said I would have the fan on Saturday, then Tuesday; I will have it tomorrow. I bought my 590 from them too and had it the next day, but they screwed up this time.


Another irony is that UPS wishes to keep their international shipping contract with me, sir -- Which means, you don't need to live in the US for me to build you...Anything







.

Plus since EVGA now covers cross-vendoring AND warranty transferring...The possibilities are endless


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Another irony is that UPS wishes to keep their international shipping contract with me, sir -- Which means, you don't need to live in the US for me to build you...Anything
> 
> 
> 
> 
> 
> 
> 
> .
> Plus since EVGA now covers cross-vendoring AND warranty transferring...The possibilities are endless


Hmmmmm, interesting.. I will check on that possibility then. The problem is Turkish Customs will probably hold the PC, and it might be a pain in the arse to get the PC from them, plus tax. But still, it could be cheaper and much better.

You know what, I am content with my PC now, but in the future I might ask for a PC build from you, or perhaps components. Good to know









If I ever want to go water cooling, there isn't a place I can purchase good equipment here.

Ah, also we don't have EVGA products in Turkey either, which sux to be honest.

Edit: Would your company sell individual components as well, or do we have to get a complete PC from you? Like, can I purchase a new GPU + water cooling system alone?


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Hmmmmm, interesting.. I will check on that possibility then. The problem is Turkish Customs will probably hold the PC, and it might be a pain in the arse to get the PC from them, plus tax. But still, it could be cheaper and much better.
> 
> You know what, I am content with my PC now, but in the future I might ask for a PC build from you, or perhaps components. Good to know
> 
> If I ever want to go water cooling, there isn't a place I can purchase good equipment here.
> 
> Ah, also we don't have EVGA products in Turkey either, which sux to be honest.
> 
> Edit: Would your company sell individual components as well, or do we have to get a complete PC from you? Like, can I purchase a new GPU + water cooling system alone?


Sure -- That's something I'd normally do for free, though -- ...It's actually easier for me to get product, now; than it ever was with AW...So, I mean we could build/design an entire setup for you easily.

On another note...

Curiosity has gotten the better of me, and I don't actually know/have this answer.

Under EVGA's new policy, is the 590 actually covered?


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Sure -- That's something I'd normally do for free, though -- ...It's actually easier for me to get product, now; than it ever was with AW...So, I mean we could build/design an entire setup for you easily.
> On another note...
> Curiosity has the better of me and I don't actually know/have this answer.
> Under EVGA's new policy, is the 590 actually covered?


Great news, bro... Again, it's not going to be anytime soon, but one day I will be knocking on your door. I could even say, hey, build me something from scratch. If I hadn't built the rig I have now, I would probably ask you to build me one. Maybe with custom case paint too


----------



## iARDAs

Does having a PCIe 3.0 mobo improve 590 performance?


----------



## iARDAs

Quote:


> Originally Posted by *JassimH*
> 
> Explain any sort of logic behind that? The 590 doesn't support PCIe 3.0; however, on "crappy" mobos, which I doubt you'd be using it on, I can see an advantage, because PCIe 3.0 is specced to provide 150W while PCIe 2.0 provides only 75W. This could mean you won't ever get any errors on it? But I mean, who the _sprinkles_ runs a Foxconn motherboard with a GTX 590...?
> It won't, sorry to disappoint you :\.
> The next dual-GPU cards will benefit a LOT, though, due to their increased power saturating an x16 PCIe 2.0 lane with two high-power GPUs.


Ah, good then. My mobo is PCIe 2.0 and I have my 590 in the x16 slot... But I will set my eyes on a PCIe 3.0 motherboard when the next dual-GPU card arrives. I will probably purchase the next dual-GPU card on release day.


----------



## iARDAs

Well, seems like the 590 is still the king









The 680 performs 10% lower than the 590 so far, according to Tom's Hardware benches.

I am falling in love with the 590 more each day. Honestly.


----------



## Agenesis

Quote:


> Originally Posted by *iARDAs*
> 
> Well, seems like the 590 is still the king
> 
> The 680 performs 10% lower than the 590 so far, according to Tom's Hardware benches.
> 
> I am falling in love with the 590 more each day. Honestly.


Now wait for the overclocking benches


----------



## iARDAs

Quote:


> Originally Posted by *Agenesis*
> 
> Now wait for the overclocking benches


I am sure the 680 will OC better, I have no doubt.. Still, if this is Nvidia's biggest weapon in the 6xx series, as Masked mentioned a few weeks back, there is no need to upgrade to a 680 over a 590.

However, 680 SLI may be preferred over 590 Quad SLI, as I hear that some games have issues utilizing 4 GPUs.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> I am sure the 680 will OC better, I have no doubt.. Still, if this is Nvidia's biggest weapon in the 6xx series, as Masked mentioned a few weeks back, there is no need to upgrade to a 680 over a 590.
> 
> However, 680 SLI may be preferred over 590 Quad SLI, as I hear that some games have issues utilizing 4 GPUs.


Yes, this is correct.

I can't really say more but, with 1x590, you still have an edge AND the price will be dropping a bit more, soon enough.

The only real issue is that of the warranty -- Asus, MSI and Gigabyte have been under extreme pressure to go with a transferable policy...EVGA just did (effective obviously with *AHEM*)...So the only real reason for YOU guys to switch is for warranty purposes.

Plus, I actually have a feeling they'll be unlocking the 590 a bit more in the next few weeks which is mostly why, I'm holding back on writing our own version -- If they do as they've said ~ There will be some real changes...If not, time to get crunching.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Yes, this is correct.
> I can't really say more but, with 1x590, you still have an edge AND the price will be dropping a bit more, soon enough.
> The only real issue is that of the warranty -- Asus, MSI and Gigabyte have been under extreme pressure to go with a transferable policy...EVGA just did (effective obviously with *AHEM*)...So the only real reason for YOU guys to switch is for warranty purposes.
> Plus, I actually have a feeling they'll be unlocking the 590 a bit more in the next few weeks which is mostly why, I'm holding back on writing our own version -- If they do as they've said ~ There will be some real changes...If not, time to get crunching.


I got myself a Zotac 590, and I found out last month that I had to register online (within 15 days of purchase) to qualify for the extended warranty. Now I am stuck with the regular warranty. I will never purchase Zotac products again. A company should stand by their products and not require online registration within 15 days of purchase. Crap, if you ask me.

I can't wait to OC my GPU tomorrow. First time OCing my 590.










Right now at full load I get around 85°C, and with the fan installed I am hoping to lower this a lot. I hope I can get to 700 MHz on air with voltage tweaking and not go over 85 degrees.


----------



## iARDAs

I just did my first-ever OC today for my 590.

The core clock is now 637 MHz (5% more than stock).

My 3DMark11 score moved to 9560 from 9200.

I did not notice any temperature change.

I wonder if it is safe to say that this is a stable OC?
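For scale, here is a quick sketch of how the score moved relative to the clock bump, using only the numbers in this post (stability itself can only be proven by longer runs, not by one benchmark pass):

```python
# iARDAs' numbers: 605 -> 637 MHz core lifted 3DMark11 from 9200 to 9560.
# Benchmark scores rarely scale 1:1 with core clock.
def pct_gain(before, after):
    """Percent improvement from `before` to `after`."""
    return (after - before) / before * 100

clock_gain = pct_gain(605, 637)    # ~5.3 % core overclock
score_gain = pct_gain(9200, 9560)  # ~3.9 % benchmark gain

print(round(score_gain / clock_gain, 2))  # ~0.74 of the clock gain realized
```

Roughly three quarters of the clock increase showing up in the score is plausible for a GPU-bound benchmark with an unchanged memory clock.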


----------



## kzinti1

Quote:


> Originally Posted by *Masked*
> 
> Yes, this is correct.
> I can't really say more but, with 1x590, you still have an edge AND the price will be dropping a bit more, soon enough.


How can the price drop when the card is EOL and no longer produced? EVGA doesn't even list them any longer.

If you know where to buy more of these 590's then it's your obligation to tell your fellow Forumites here where they are.

Are you also saying Nvidia has plans to relax the voltage limits on these cards or are they going to re-release them with the proper components that can safely use higher volts?

We all know that these cards are capable already of so much more than Nvidia will allow, so if they are going to allow us to use what we've already paid for, then how are they going to go about it? Driver workarounds or new BIOS' straight from the factory?

As I've said before, Nvidia has screwed every single one of us over by making these cards underperform, and they also made liars of EVGA, since EVGA can't let us overvolt these GTX 590 Classifieds as they said we could when they first sold us these cards.


----------



## Recipe7

I am as disappointed as the next 590 owner in regards to the underwhelming performance of the 590. For myself, I take some pride in knowing that it still is the best single card solution out right now. It beats the 7970 and will edge out the 680. I have no more hope for any improvement of the 590, but it's ok, it still rocks!

What I am waiting for now is a dual-gpu card that can maintain 120fps in BF3 (might as well say BF4 in terms of time) so I can bump to a future max res monitor running 120hz.

I couldn't care less what GPU comes out this year. I run BF3 maxed at 1920x1080 at a vsynced 60fps. Give me a 2-slot GPU that will force me to get a 120hz monitor, and I will have money ready to throw at it.

I guess it's safe to say that the GTX790 is next on my plate.


----------



## Recipe7

Quote:


> Originally Posted by *iARDAs*
> 
> I just did my first ever OC today for my 590
> 
> Core Clock is now 637 (5% more than stock core)
> 
> my 3Dmark11 score moved to 9560 from 9200
> 
> I did not notice any temperature changes.
> 
> I wonder if it is safe to say that this is a stable OC?


Temps?

I can run at 685 core, but my temps hit 84C at 95% fan. I prefer a 650 core with a temp of 79C at 85% fan.


----------



## iARDAs

Quote:


> Originally Posted by *Recipe7*
> 
> I am as disappointed as the next 590 owner in regards to the underwhelming performance of the 590. For myself, I take some pride in knowing that it still is the best single card solution out right now. It beats the 7970 and will edge out the 680. I have no more hope for any improvement of the 590, but it's ok, it still rocks!
> What I am waiting for now is a dual-gpu card that can maintain 120fps in BF3 (might as well say BF4 in terms of time) so I can bump to a future max res monitor running 120hz.
> I couldn't care less what GPU comes out this year. I run BF3 maxed at 1920x1080 at a vsynced 60fps. Give me a 2-slot GPU that will force me to get a 120hz monitor, and I will have money ready to throw at it.
> I guess it's safe to say that the GTX790 is next on my plate.


You and me... we'll probably take that next step at the same time.

I need at least a 50% increase in performance to make the jump and get the "WOW" factor.


----------



## iARDAs

Quote:


> Originally Posted by *Recipe7*
> 
> Temps?
> I can run at 685 core, but my temps hit 84C at 95% fan. I prefer a 650 core with a temp of 79C at 85% fan.


My temps are actually the same; I didn't notice a single difference. Though I did not change my voltage... Are you at 685MHz with the stock voltage?

Also, do you OC the memory clock as well, or leave it be?


----------



## Masked

Quote:


> Originally Posted by *kzinti1*
> 
> How can the price drop when the card is EOL and no longer produced? EVGA doesn't even list them any longer.
> If you know where to buy more of these 590's then it's your obligation to tell your fellow Forumites here where they are.
> Are you also saying Nvidia has plans to relax the voltage limits on these cards or are they going to re-release them with the proper components that can safely use higher volts?
> We all know that these cards are capable already of so much more than Nvidia will allow, so if they are going to allow us to use what we've already paid for, then how are they going to go about it? Driver workarounds or new BIOS' straight from the factory?
> As I've said before, Nvidia has screwed every single one of us over by making these cards underperform and also made liars of EVGA since they can't let us overvolt these GTX590 Classifieds as they said we could when they first sold us these cards.


I was speaking about the market price, actually...Not NIB...

The 590 was a limited release, there aren't any more nib...

And beyond that, I can't comment...


----------



## Recipe7

Quote:


> Originally Posted by *iARDAs*
> 
> you and me... will probably do that next step at the same time.
> 
> i need at least a 50% increase in performance to make the jump and have the " WOW " factor.


As stupid as it is... I want that time to come now. I've dreamed of gaming at 120hz... but with today's hardware, you have to go tri or quad sli and spend 2000USD for some crazy setup with WC.
Quote:


> Originally Posted by *iARDAs*
> 
> My temp is actually the same, i didnt notice a single different. Though i did not change my voltage... Are you on 685 mhz with the stock voltage?
> 
> also do you OC the memory clock as well or leave it be?


Yes, stock volts (0.925V).

I run the clock to 1750 only. I don't see any improvement with bumping it higher than that.

The 685 runs smooth in all the games I've tried. However, it crashes in 3dmark11 50% of the time.


----------



## iARDAs

Quote:


> Originally Posted by *Recipe7*
> 
> As stupid as it is... I want that time to come now. I've dreamed of gaming at 120hz... but with today's hardware, you have to go tri or quad sli and spend 2000USD for some crazy setup with WC.
> Yes, stock volts (0.925V).
> I run the clock to 1750 only. I don't see any improvement with bumping it higher than that.
> The 685 runs smooth in all the games I've tried. However, it crashes in 3dmark11 50% of the time.


120hz gaming is amazing, it really is, but the problem is that some games are weird about letting you use 120hz. For example, in Black Ops the menu lets you say you have a 120hz monitor, but the game is locked at 90fps. Or in Witcher 1, which I am playing now, I get around 80-90fps but the GPU usage is only around 40-50%. I mean, why not push it higher so I can game at 120fps? Half Life 2 is great, Homefront is great at 120hz, but a few games are being jerks about it...

Do you have a 120hz monitor now? They are so great. I bought mine for 3D gaming but I rarely game in 3D now. 120hz 2D ftw.

Hmmm, I will try to OC ~10% to around 670MHz, but only tomorrow when my fan finally arrives.


----------



## Recipe7

Quote:


> Originally Posted by *iARDAs*
> 
> 120hz gaming is amazing, it really is, but the problem is that some games are weird about letting you use 120hz. For example, in Black Ops the menu lets you say you have a 120hz monitor, but the game is locked at 90fps. Or in Witcher 1, which I am playing now, I get around 80-90fps but the GPU usage is only around 40-50%. I mean, why not push it higher so I can game at 120fps? Half Life 2 is great, Homefront is great at 120hz, but a few games are being jerks about it...
> 
> Do you have a 120hz monitor now? They are so great. I bought mine for 3D gaming but I rarely game in 3D now. 120hz 2D ftw.
> 
> Hmmm, I will try to OC ~10% to around 670MHz, but only tomorrow when my fan finally arrives.


That's not so bad. I really wanna play counter-strike at 120fps. Can't wait till the new one comes out.

I don't have a 120hz monitor. I don't feel up to buying one just yet since I purchased my ASUS back in July (LED, 1920x1080). It was to replace my aged 1650-reso Dell.

I didn't consider getting a 120hz because of the price... now I obviously regret it. However, I may jump on a 2560 reso 120hz monitor if they ever come out. They will be crazy expensive though.


----------



## iARDAs

Quote:


> Originally Posted by *Recipe7*
> 
> That's not so bad. I really wanna play counter-strike at 120fps. Can't wait till the new one comes out.
> I don't have a 120hz monitor. I don't feel up to buying one just yet since I purchased my ASUS back in July (LED 1920x1080). It was to replace my aged 1650 reso dell.
> I didn't consider getting a 120hz because of the price... now I obviously regret it. However, I may jump on a 2560 reso 120hz monitor if they ever come out. They will be crazy expensive though.


Ah I see... but honestly, whenever you can switch to a 120hz one, it is pure joy. Some people say they can't feel the difference, but I guess my eyes are too sensitive. Are our 590s good enough for 2560 resolution? People suggest around 2GB of VRAM for that kind of resolution, which is why I'm asking.


----------



## Recipe7

I'm sure it is! My eyes are quite sensitive as well, I can spot fps changes easily. I know I would benefit immensely from a 120hz. In the future, sigh*.

I really don't know if it's enough as I don't have personal experience with it. But from various benchmarks, it seems the 1.5GB fares well enough. Not so much in BF3, but in any other game it isn't a problem.


----------



## iARDAs

Quote:


> Originally Posted by *Recipe7*
> 
> I'm sure it is! My eyes are quite sensitive as well, I can spot fps changes easily. I know I would benefit immensely from a 120hz. In the future, sigh*.
> I really don't know if it's enough as I don't have personal experience with it. But from various benchmarks, it seems the 1.5GB fares well enough. Not so much in BF3, but in any other game it isn't a problem.


Yeah, that's what I thought... In BF3 the VRAM could be an issue with 2560 gaming.

BTW, I OCed my GPU exactly 10%, making it 667MHz, and the memory is 1750. I played BF3 for a good while and had the exact same temperatures with just 10% more fan speed.

I used to get low 80s at 75% and now I get low 80s at 85% (I am also using my own fan profile).

I wonder if I can push it even higher.


----------



## teichu

How come my stock voltage is only 0.913V? Some people get 0.925V?


----------



## Recipe7

Quote:


> Originally Posted by *iARDAs*
> 
> Yeah, that's what I thought... In BF3 the VRAM could be an issue with 2560 gaming.
> 
> BTW, I OCed my GPU exactly 10%, making it 667MHz, and the memory is 1750. I played BF3 for a good while and had the exact same temperatures with just 10% more fan speed.
> 
> I used to get low 80s at 75% and now I get low 80s at 85% (I am also using my own fan profile).
> 
> I wonder if I can push it even higher.


When you get your new fan, do just that. Let us know what you come up with.


----------



## Recipe7

Quote:


> Originally Posted by *teichu*
> 
> How come my stock voltage is only 0.913V? Some people get 0.925V?


Are you using the latest driver?


----------



## iARDAs

OCing over 5% gave me stability issues in the end.

I am now at 637MHz.

I will tweak the voltage later, but I might not. I guess it voids the warranty.


----------



## rush2049

I posted this earlier, maybe you missed it:

Quote:


> Originally Posted by *rush2049*
> 
> You can get about 630-635MHz without any voltage tweaks;
> 
> at 0.963V you can get 700MHz....
> 
> I don't go higher than that, as I am on air....


Draw a graph with those two points as the endpoints... scaling between them is almost linear.....

While you can go higher, stability isn't maintained..... or you have stability but see graphical artifacts....
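Taking those two data points literally (roughly 635MHz on the ~0.913V stock voltage, 700MHz at 0.963V), the near-linear scaling gives a rough voltage target for any clock in between. A back-of-envelope sketch (the endpoints are from the quoted post; everything else is just interpolation, not a guarantee for any particular card):

```python
# Two reported points from the post: (core clock MHz, core voltage V)
lo = (635, 0.913)   # roughly stable without voltage tweaks
hi = (700, 0.963)   # upper limit the poster uses on air

def volts_for_clock(mhz):
    """Linearly interpolate the voltage needed for a target core clock."""
    slope = (hi[1] - lo[1]) / (hi[0] - lo[0])   # ~0.77 mV per MHz
    return lo[1] + slope * (mhz - lo[0])

print(round(volts_for_clock(667), 3))  # ~0.938 V for a 667 MHz OC
```

Individual cards will land above or below the line, so treat the output as a starting point, not a safe setting.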


----------



## iARDAs

Quote:


> Originally Posted by *rush2049*
> 
> I posted this earlier, maybe you missed it:
> Draw a graph with those two points as the end points... scaling between them is almost linear.....
> while you can go higher, stability isn't maintained..... or you have stability, but see graphical artifacts....


Yeah, I saw that, but I wanted to give it a shot, though you seem spot-on right.

I will go to bed now, and tomorrow I will increase my voltage and hopefully not smoke my 590.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> Thank you sir
> 
> The ultimate irony is that I'm now actually sponsored by the same companies that refused to sponsor AW because of Dell -- So, booya to that!
> I'm not leaving OCN nor, this thread since I'm still working on the drivers.
> Once I get settled and my work-space is built (The best companies in the world started in the basement) -- I'm going to apply to be a vendor, here...Actually.
> I already have 20-30 builds lined up including some major players in NYC...
> Will be PM'ing Rush in the upcoming weeks and we'll get started on this debacle...Don't worry, will have something for you all soon.
> ~~


That is Jedi news. Just out of curiosity, is your company going to be building gaming notebooks as well? I'm dying for a new one, especially an M11x killer with a proper panel.

Though to be honest, I don't know how much I'd need one right now after the wife bought me a PS Vita. I am so taken with that little console; it's pretty much the reason I haven't been around as much the last week and a half. She bought me the Vita and I bought her the new iPad, and I have to say, the Vita just kicks its ass in terms of games and gameplay. We both seem to be more taken with the Vita's OLED screen than the iPad's Retina, though that may be because we're iPad 2 owners.

I don't know how best to describe why the Vita kicks my ass so hard as a gamer, but there's something about the blend of the controls, touch screen and beautiful display that just makes it so captivating, especially as a portable. Of course I'll always need a notebook for games like SWTOR, but I wouldn't be worried at all if I was traveling and all I had was my Vita. If you're a gamer, don't need an iPad for work stuff, and are getting one for content consumption and entertainment, then seriously consider the Vita, especially if you already own an iPad. My only concern for the console is developer support. But of course, if they see the console selling...

Back on topic -

120hz gaming is a sensation like no other and you should definitely pick up one of those displays if you can afford one. That being said, I'm actually really torn right now about whether to play Mass Effect 3 @ 120fps/120hz on my 24" BenQ 3D Display, or @ 60fps/60hz on my 60" Pioneer Kuro plasma. I've been playing it on the Kuro, if that says anything. I might play it tonight on the BenQ.

Speaking of BenQ, SMO and I both use the same model, the XL2410T, which has recently been replaced by their new XL2420T. The new one has Nvidia's new 3D Vision 2 LightBoost whereas ours doesn't, but last year's model is still an excellent display and can probably be found heavily discounted. FYI, it will need to be calibrated (you can find calibration profiles online); it looks like arse out of the box, but is excellent once a proper calibration profile is applied.

I'm really, really psyched to hear that the 590 still outperforms the 680. And while I'm sure two 680s in SLI will normally outperform two 590s in quad SLI, I really would like to test that in the real world once Masked and the other Shinobi make these new drivers, if Nvidia doesn't come through. Somehow, I have a feeling the results may be different...


----------



## teichu

Quote:


> Originally Posted by *Recipe7*
> 
> Are you using the latest driver?


Yeah I did, it's the 296 driver.


----------



## iARDAs

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> Back on topic -
> 120hz gaming is a sensation like no other and you should definitely pick up one of those displays if you can afford one. That being said, I'm actually really torn right now about whether to play Mass Effect 3 @ 120fps/120hz on my 24" BenQ 3D Display, or @ 60fps/60hz on my 60" Pioneer Kuro plasma. I've been playing it on the Kuro, if that says anything. I might play it tonight on the BenQ.
> Speaking of BenQ, SMO and I both use the same model, the XL2410T, which has recently been replaced by their new XL2420T. The new one has Nvidia's new 3D Vision 2 LightBoost whereas ours doesn't, but last year's model is still an excellent display and can probably be found heavily discounted. FYI, it will need to be calibrated (you can find calibration profiles online); it looks like arse out of the box, but is excellent once a proper calibration profile is applied.
> I'm really, really psyched to hear that the 590 still outperforms the 680. And while I'm sure two 680's SLI normally will outperform 2 590's in Quad, I really would like to test that real world once Masked and the other Shinobi make these new drivers, if Nvidia doesn't come through. Somehow, I have a feeling the results may be different...


Hmm, Kuro plasmas are amazing, so that's a tough decision you are facing there. I like to game very close to my monitor, as it gets me more into the game, but again, Kuro displays are amazing.

So I guess you are skipping the 6xx series upgrade as well?


----------



## teichu

How come every time I try to flash the BIOS and drag gtx590a.rom onto nvflash, nvflash won't open it? It just pops up and closes right away. Anyone know the reason? Thanks.


----------



## rush2049

Quote:


> Originally Posted by *teichu*
> 
> How come every time I try to flash the BIOS and drag gtx590a.rom onto nvflash, nvflash won't open it? It just pops up and closes right away. Anyone know the reason? Thanks.


Your BIOS chip might have protect mode enabled; you are going to have to turn that off using nvflash's command-line switches before attempting to flash....

You really should type out the commands instead of just dragging the ROM over. That way the window stays open after it finishes and you can read whether it was successful.....
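For anyone following along, the usual sequence looks something like this. This is a sketch from memory of the Fermi-era nvflash switches; flag names varied between nvflash releases, so check `nvflash --help` on your version before running anything:

```shell
nvflash --list             # confirm the card and its EEPROM are detected
nvflash --save backup.rom  # always back up the current BIOS first
nvflash --protectoff       # disable the EEPROM write-protect, if it is set
nvflash gtx590a.rom        # flash the new BIOS; read the output before rebooting
```

Run these from an elevated command prompt (or DOS boot disk) in the folder containing nvflash and the ROM, so the window and its output stay visible.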


----------



## Masked

Quote:


> Originally Posted by *rush2049*
> 
> Your BIOS chip might have protect mode enabled; you are going to have to turn that off using nvflash's command-line switches before attempting to flash....
> You really should type out the commands instead of just dragging the ROM over. That way the window stays open after it finishes and you can read whether it was successful.....


Well said, sir.

Once I get my operation up and going, I'll be shooting you a PM.


----------



## iARDAs

Hey folks

My fan finally arrived. I attached it, but it did not have any effect on my GPU temperatures.

Here is the link to the thread I started about the issue. I was asked to turn the rear fan into an exhaust, but that didn't help either.

I am guessing I need those standoffs fast, right? Maybe the bottom fan is not really taking cool air in because it's so close to the wooden stand I have.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Hey folks
> 
> My fan finally arrived. I attached it, but it did not have any effect on my GPU temperatures.
> 
> Here is the link to the thread I started about the issue. I was asked to turn the rear fan into an exhaust, but that didn't help either.
> 
> I am guessing I need those standoffs fast, right? Maybe the bottom fan is not really taking cool air in because it's so close to the wooden stand I have.


The issue, as we've said before, is that no hot air is actually venting because your top fan isn't creating enough back-pressure...

You need new top fans, mate.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> The issue, as we've said before, is that no hot air is actually venting because your top fan, isn't creating enough back-pressure...
> You need new top fans mate.


Yeah, that's what I figured.

Maybe 2 120mm ones?

The thing is, I don't have the spare headers on my motherboard to install two more fans.

If I take out the 200mm I can put one of the 120mm fans on the motherboard, but where do I connect the second one?

I am looking for the best 200mm that money can buy, but the options are limited here.

EDIT: Also, the 200mm fan I have is 110CFM. The ones I find are sometimes 90CFM.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Yeah, that's what I figured.
> 
> Maybe 2 120mm ones?
> 
> The thing is, I don't have the spare headers on my motherboard to install two more fans.
> 
> If I take out the 200mm I can put one of the 120mm fans on the motherboard, but where do I connect the second one?
> 
> I am looking for the best 200mm that money can buy, but the options are limited here.
> 
> EDIT: Also, the 200mm fan I have is 110CFM. The ones I find are sometimes 90CFM.


Why are you installing fans on your headers? ~ If you are, that's bad mojo, dude ~ You allow for a dual-surge, especially if a fan shorts out.

Always go direct to your PSU or through a controller...Seriously.

The draw with too many fans is bad -- it can short out your motherboard and do all sorts of damage.

I'd make a guess that not a single fan you have, if they're all connected to the motherboard, is operating at 100%, either.

Do 2 120mm fans, put all of them on a controller but, get them off the motherboard and -- Go from there.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Why are you installing fans on your headers? ~ If you are, that's bad mojo, dude ~ You allow for a dual-surge, especially if a fan shorts out.
> Always go direct to your PSU or through a controller...Seriously.
> The draw with too many fans, is bad -- Can short out your motherboard and do all sorts of damage.
> I'd make a guess that not a single fan you have, if they're all connected to the motherboard, is operating at 100%, either.
> Do 2 120mm fans, put all of them on a controller but, get them off the motherboard and -- Go from there.


Hmmm, I will do that.

All my fans are connected to my motherboard:

Chassis 1 (200mm)

Chassis 2 (the new 140mm on the bottom)

CPU fan

Rear fan (120mm)

They are all connected to the motherboard, and I have all of them set to TURBO speed except the CPU fan, which is in silent mode.

If I take them off the motherboard and connect them to the CPU, would they always work at 100%?


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Hmmm i will do that.
> 
> All my fans are connected to my motherboard
> 
> Chasis 1 (200mm)
> Chasis 2 (the new 140mm on the bottom)
> CPU Fan
> Rear Fan (120 mm)
> 
> they are all connected to the motherboard and i have all of them set in TURBO speed except the CPU which is in silent mode.
> 
> If i take them out from the motherboard and connect them to the CPU, would they always work at 100%?


PSU ~ Power supply.

The issue with them being on the motherboard is that they can never operate at full draw ~ They're always competing with the other components on the motherboard.

This isn't a situation of powering things through a USB port ~ This is an issue of pass-through components that can randomly short/die, and that short then goes directly into your motherboard -- That's BAD.

You want either a fan controller or just plug them into your PSU ~ 100% on a separate redundancy is still better than 8 fans being DC'd into the motherboard...


----------



## iARDAs

Wow, first time I've ever heard of a fan controller.

So basically it's this, right?

What I am going to do is install it in my case below the DVD-RW,

connect it to the USB, and then connect all the fans to it?

Forgot to add a link

http://site.hardwarena.com/Video-Inceleme-Zalman-ZM-MFC3.html


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Wow, first time I've ever heard of a fan controller.
> 
> So basically it's this, right?
> 
> What I am going to do is install it in my case below the DVD-RW,
> 
> connect it to the USB, and then connect all the fans to it?
> 
> Forgot to add a link:
> 
> http://site.hardwarena.com/Video-Inceleme-Zalman-ZM-MFC3.html


Fans work on a voltage principle ~ Basically, voltage controls the speed of the fan ~ There's a max/min but, voltage is your control.

The issue with the fans being on the motherboard, like I said earlier, is that there's a constant fight for power -- And while you don't actually SEE this fight, with a good monitoring program you can actually see the conflicts, especially if you overclock.

My major issue with fans on the motherboard is that if you have too many and one shorts, you've essentially just given it a lead to fry your whole system.

If you have 1 fan on the motherboard, like your CPU fan, that's warranted, because 1 fan isn't going to cause any issues...So 1 fan isn't a conflicting issue...2/3/4/5 fans, yes, absolutely.

I've actually watched an intern fry her computer because the PSU shorted, fan came on...Click click click...SPARKS...Dead everything.

So what do I mean by a fan controller?

Any of these: http://www.xoxide.com/5fanco1.html
Many of these: http://www.sidewindercomputers.com/fancontrollers.html

Your goal is to control the fans off of a software-based protocol that can undervolt -- Turbo is BS and it throttles the fans tremendously.

For YOU: http://www.amazon.co.uk/Zalman-Rheobus-Channels-Aluminum-ZM-MFC1/dp/B000S0UOBW/ref=pd_cp_computers_0

Or maybe: http://www.amazon.co.uk/Lamptron-Controller-2-Channel-Anodized-Aluminium/dp/B002R84J28/ref=sr_1_18?s=computers&ie=UTF8&qid=1332340973&sr=1-18:
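The voltage principle described above can be sketched as a simple model: below a start-up threshold the motor stalls, and above it the RPM rises roughly linearly up to the rated speed at 12V. The threshold and rated RPM here are made-up illustrative numbers, not specs for any real fan:

```python
RATED_RPM = 1500   # hypothetical fan: rated speed at the full 12 V
START_V = 4.0      # hypothetical minimum voltage before the motor stalls

def fan_rpm(volts):
    """Rough linear model of a DC fan: 0 below start-up, scales with V up to 12 V."""
    if volts < START_V:
        return 0
    return round(RATED_RPM * min(volts, 12.0) / 12.0)

print(fan_rpm(12.0))  # 1500 rpm at full voltage
print(fan_rpm(7.0))   # 875 rpm on a typical 7 V controller step
print(fan_rpm(3.0))   # 0: under-volted below the start-up threshold
```

This is exactly what a rheobus-style controller like the ones linked above does with its knobs: it varies the voltage fed to each channel, trading airflow for noise.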


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Fans work on a voltage principle ~ Basically, voltage controls the speed of the fan ~ There's a max/min but, voltage is your control.
> The issue with the fans being on the motherboard, like I said earlier, is that there's a constant fight for power -- And while you don't actually SEE this fight, if you had a good monitoring program, you can actually see the conflicts, especially if you overclock.
> My major issue with fans on the motherboard, is that if you have too many and one shorts, you've essentially just given a lead to fry out your whole system.
> If you have 1 fan on the motherboard, like your CPU fan, that's warranted because 1 fan isn't going to cause any issues...So 1 fan, isn't a conflicting issue...2/3/4/5 fans, yes, absolutely.
> I've actually watched an intern fry her computer because the PSU shorted, fan came on...Click click click...SPARKS...Dead everything.
> So what do I mean by a fan controller?
> Any of these: http://www.xoxide.com/5fanco1.html
> Many of these: http://www.sidewindercomputers.com/fancontrollers.html
> Your goal is control the fans OFF of a software based protocol that's undervolting -- Turbo is BS and it throttles the fans, tremendously.
> For YOU: http://www.amazon.co.uk/Zalman-Rheobus-Channels-Aluminum-ZM-MFC1/dp/B000S0UOBW/ref=pd_cp_computers_0
> Or maybe: http://www.amazon.co.uk/Lamptron-Controller-2-Channel-Anodized-Aluminium/dp/B002R84J28/ref=sr_1_18?s=computers&ie=UTF8&qid=1332340973&sr=1-18:


I totally understand you now.

I was actually thinking, before reading your post, to leave the CPU fan on the motherboard and connect all the other fans to the fan controller.

Unfortunately here in Turkey we only have the Zalman Fan Controller.

http://www.zalman.com/eng/product/Product_Read.asp?idx=341

Will this do the job or should i try to import the ones you mentioned?

The Zalman is around 110 US dollars here.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> I totally understand you now.
> 
> I was actually thinking, before reading your post, to leave the CPU fan on the motherboard and connect all the other fans to the fan controller.
> 
> Unfortunately here in Turkey we only have the Zalman Fan Controller.
> 
> http://www.zalman.com/eng/product/Product_Read.asp?idx=341
> 
> Will this do the job or should i try to import the ones you mentioned?
> 
> The Zalman is around 110 US dollars here.


Ew ~ PM me...

We'll buy you a better, cheaper one and I'll ship it...

Can also get you 2 high CFM fans much cheaper...


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Ew ~ PM me...
> We'll buy you a better, cheaper one and I'll ship it...
> Can also get you 2 high CFM fans much cheaper...


Just PMed you.

Edit: Also, you were right. I changed the fan profile to SILENT, which is even lower than standard, and I have the exact same temperatures. Lol, no need for Turbo I guess.


----------



## teichu

Quote:


> Originally Posted by *rush2049*
> 
> Your BIOS chip might have protect mode enabled; you are going to have to turn that off using nvflash's command-line switches before attempting to flash....
> You really should type out the commands instead of just dragging the ROM over. That way the window stays open after it finishes and you can read whether it was successful.....


Hi, so how do I turn it off using the nvflash command line? Thanks.


----------



## rush2049

This will do ya up good: http://www.overclock.net/t/1063263/gtx-590-flashing-overclocking-thread/0_20


----------



## teichu

Quote:


> Originally Posted by *rush2049*
> 
> This will do ya up good: http://www.overclock.net/t/1063263/gtx-590-flashing-overclocking-thread/0_20


Sorry, I couldn't find it. Can you just tell me what command I should type before I drag the ROM? Thanks.


----------



## Masked

Quote:


> Originally Posted by *teichu*
> 
> Sorry, I couldn't find it. Can you just tell me what command I should type before I drag the ROM? Thanks.


You couldn't click on that link?

It's a full guide to flashing the 590...


----------



## teichu

Quote:


> Originally Posted by *Masked*
> 
> You couldn't click on that link?
> It's a full guide to flashing the 590...


I did, but I still don't know how to turn off the protect mode using a command....


----------



## rush2049

If you're having this much trouble, perhaps you shouldn't be looking into flashing your GPU BIOS.... but here ya go:


----------



## iARDAs

I told Masked via PM, and let me tell you too:

I am changing my case from a midtower to a FULL TOWER.

I ordered an Aerocool Xpredator Evil Black case.

I should have it by Monday.

Let's see if I will be able to assemble it.

Be sure to follow the log in my signature on Friday, as I will be adding lots of photos and maybe videos during the process.


----------



## Smo

Nice looking case!


----------



## iARDAs

Quote:


> Originally Posted by *Smo*
> 
> Nice looking case!


Masked will be happy to hear that


----------



## Image132

I gotta agree, that is an awesome looking case; if that doesn't cool your 590, nothing will.

I love my NZXT Phantom though.


----------



## Masked

Quote:


> Originally Posted by *Smo*
> 
> Nice looking case!


They borrowed/copied/cloned our whole backbone design...The Alienware Area-51 chassis...I helped design it, which is why our Turkish friend said I'd be happy to hear the compliment.

Copying is a form of flattery so, I guess so.

That being said, in 30 minutes I'll post some real evidence and give you guys some real stats -- So, hold tight.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> They borrowed/copied/cloned our whole backbone design...The Alienware 51 Chassis...I helped design it which is why our Turkish friend says I'd welcome the compliment.
> Copying is a form of flattery so, I guess so.
> That being said in 30 minutes, I'll post some real evidence and give you guys some real stats -- So, hold tight.


When I was searching for the case, tons of people also said that it was an exact copy of an Alienware case.

Gotta love the Taiwanese for cloning it. How much would that Alienware Area-51 case cost if it was on sale?

Besides that, how many people are using their 590s in a full tower case?


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> When I was searching for the case, tons of people also said that it was an exact copy of an Alienware case.
> 
> Gotta love the Taiwanese for cloning it. How much would that Alienware Area-51 case cost if it was on sale?
> 
> Besides that, how many people are using their 590s in a full tower case?


I think it's $300 retail -- Perhaps a bit more, but the Alienware case has hard drive "chassis" on the back section with proprietary stuff...

It's a good case but, non-boutique good? No, not really.

Off the top of my head, I believe there were 8 or 9 people here with dual 590s ~ I know 3 of them are on water -- The rest, I'd imagine, are in full towers with E-ATX or HPTX boards.


----------



## rush2049

I am in a full tower (Corsair 800D), one of the largest full towers.... With how often I am in there fiddling, all the extra space is nice. Though all my hard drive bays are currently occupied and I will need more space soon....

Just make sure you don't get lazy with the cable management....


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> I think it's 300$ retail -- Perhaps a bit more but, the Alienware case has hard drive "chassis" on the back section with propritary stuff...
> It's a good case but, non-boutique good? No, not really.
> I believe off the top of my head, there were 8 or 9 people that had dual 590's ~ I know 3 of them are on water -- The rest I'd imagine are full tower with Ex-ATX or HPTX.


The Aerocool one is half the price of that Alienware Chasis however I am 100% sure that Alienware used better products on building it. Aerocool is very heavy though. 16 kgs i believe.

Honestly cant a company take a patent for a chasis they come up with?

My mid-tower HAF 922 didn't really have lots of space. If I ever put in another GPU it would probably touch the PSU, so a full tower is the way to go.

Quote:


> Originally Posted by *rush2049*
> 
> I am in a full tower (Corsair 800D), one of the largest full towers.... with how often I am in there fiddling, all the extra space is nice. Though all my hard drive bays are occupied currently and I need more space soon....
> Just make sure you don't get lazy with the cable management....


When I put all of the hardware in it, I will handle the cable management before firing my rig up. I got lots of holders and stuff to help me with it. I want to get it done so I don't have to worry about it later.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> The Aerocool one is half the price of that Alienware chassis, however I am 100% sure that Alienware used better materials building it. The Aerocool is very heavy though, 16 kg I believe.
> 
> Honestly, can't a company patent a chassis design they come up with?
> 
> My mid-tower HAF 922 didn't really have lots of space. If I ever put in another GPU it would probably touch the PSU, so a full tower is the way to go.
> 
> When I put all of the hardware in it, I will handle the cable management before firing my rig up. I got lots of holders and stuff to help me with it. I want to get it done so I don't have to worry about it later.


Yes and no -- It's a very tricky subject, especially in this market...There's no real solid answer.

So here's the skinny from MY tests.

Non-OC'd the 590 is 10% ahead of the 680 ~ This is without the boost feature active on the 680.

Now, when the 680 is OC'd, yes, it beats a 590, but by about 10-15%.

In SLI and OC'd ~ I don't think I need to say it.

HOWEVER, with some driver improvements apparently coming down the pipeline, the 590 will now be shown some love.

We'll have to see what happens in the long run...Until then, hold onto your 590s.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> Yes and no -- It's a very tricky subject, especially in this market...There's no real solid answer.
> So here's the skinny from MY tests.
> Non-OC'd the 590 is 10% ahead of the 680 ~ This is without the boost feature active on the 680.
> Now, when the 680 is OC'd, yes, it beats a 590, but by about 10-15%.
> In SLI and OC'd ~ I don't think I need to say it.
> HOWEVER, with some driver improvements apparently coming down the pipeline, the 590 will now be shown some love.
> We'll have to see what happens in the long run...Until then, hold onto your 590s.


Great tests, Masked. Thank you. Honestly, if I ever go for a 680 that would only be for SLI reasons. However, if I can grab a second-hand 590 on the cheap side, I guess I shouldn't say no to that, right?

Any word on 3D Vision performance with the 6xx series? Does it perform better, or does the 680 still lose 50% of its performance when 3D Vision is enabled?

Also, what kind of driver improvements are we talking about? More voltage, or better performance altogether?


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Great tests, Masked. Thank you. Honestly, if I ever go for a 680 that would only be for SLI reasons. However, if I can grab a second-hand 590 on the cheap side, I guess I shouldn't say no to that, right?
> 
> Any word on 3D Vision performance with the 6xx series? Does it perform better, or does the 680 still lose 50% of its performance when 3D Vision is enabled?
> 
> Also, what kind of driver improvements are we talking about? More voltage, or better performance altogether?


Well, since there's no NDA on any of this anymore ~ I was told down the line, there would be improvements made to utilization since they now "have time" to really punch them out.

I can't comment on 3dvision just yet --

Overall I think you're talking about better SLI-Profiles and overall just better performance from profiles in general.


----------



## iARDAs

Thank you, Masked. Let's hope for better drivers from Nvidia to maximize the performance of our cards. I just hope they don't give up on the 590 now that it is no longer being produced.

I just saw that the 680 is now in stock here in Turkey.

A brand new 680 is 999 US dollars + tax.

A brand new 590 is 950 US dollars + tax.

Tax is roughly 18%.

This is the pricing of the most expensive retailer here in Turkey. They usually have 25%-off-everything sales once a month, so that's when you grab these GPUs.


----------



## iARDAs

Epic

I used a registry cleaner program yesterday (no idea why, I guess I was bored) and noticed that the Nvidia Control Panel was no longer in the desktop right-click menu. I had to reinstall my drivers.

It sucks that we don't have newer drivers and the 301 drivers are only for the 680.

Also, how are the other new features of the 680? Like that new VSync method they came up with, or the FXAA... Are these innovations the kind we can expect to get via driver update?


----------



## kzinti1

I just bought a pair of EVGA GTX 680's from the factory. We'll see how they compare to my 590's.

I hope. None of the 680's are pictured with SLI bridges or cables. Are all SLI cables compatible, or are they platform specific?

Why assume that Alienware uses better materials than AeroCool? They're just another company where the bottom line is at or near the top of their considerations, especially since being taken over by Dell. When you buy Alienware or any other boutique computer you're paying an ungodly amount just for the name.

You can easily build a far better computer than you can buy preassembled. I found that out the hard way when I bought a Falcon NorthWest around 6 years, or more, ago.

It had a no-name motherboard, cheap Seagate HDDs, no-name ODDs and the hottest running case I've ever had the misfortune of using, even though it's also the most beautiful case I've ever seen: a CoolerMaster WaveMaster, all dolled up with the case painted in an American flag design, a $1,400 option. I fought the U.S. Government for over 25 years and won, with back pay. My mistake was never a lack of persistence, just of judgement when I see something I really want. I should've researched a helluva lot more than I did and saved a couple thousand bucks.

If the new drivers, or whatever, for the GTX 590's make my pair of them faster than the GTX 680's I just bought (which I doubt, since the 680's are PCIe 3), I don't mind at all. I'll just have another set of cards, all ready and broken in for the next build.

I'm still a little worried about no SLI cables being packaged with these new cards. It just doesn't make any sense for them not to be included. There was no mention of 680's in SLI in Guru3D's review that was posted today, just that they would work together better than ever.

Why does Turkey price the 590's and 680's almost the same? That's just plain ridiculous.

That AeroCool is one fine looking case. Since I'm into watercooling though, my next one is going to be a CaseLabs of one form or another. I can't find any decent pics of them yet, comparing them side by side, or figure out what options I need since the options are also not pictured or explained very well. I need to visit the CaseLabs thread when I'm ready to bite the bullet.


----------



## iARDAs

Quote:


> Originally Posted by *kzinti1*
> 
> I just bought a pair of EVGA GTX 680's from the factory. We'll see how they compare to my 590's.
> 
> I hope. None of the 680's are pictured with SLI bridges or cables. Are all SLI cables compatible, or are they platform specific?
> 
> Why assume that Alienware uses better materials than AeroCool? They're just another company where the bottom line is at or near the top of their considerations, especially since being taken over by Dell. When you buy Alienware or any other boutique computer you're paying an ungodly amount just for the name.
> 
> You can easily build a far better computer than you can buy preassembled. I found that out the hard way when I bought a Falcon NorthWest around 6 years, or more, ago.
> 
> It had a no-name motherboard, cheap Seagate HDDs, no-name ODDs and the hottest running case I've ever had the misfortune of using, even though it's also the most beautiful case I've ever seen: a CoolerMaster WaveMaster, all dolled up with the case painted in an American flag design, a $1,400 option. I fought the U.S. Government for over 25 years and won, with back pay. My mistake was never a lack of persistence, just of judgement when I see something I really want. I should've researched a helluva lot more than I did and saved a couple thousand bucks.
> 
> If the new drivers, or whatever, for the GTX 590's make my pair of them faster than the GTX 680's I just bought (which I doubt, since the 680's are PCIe 3), I don't mind at all. I'll just have another set of cards, all ready and broken in for the next build.
> 
> I'm still a little worried about no SLI cables being packaged with these new cards. It just doesn't make any sense for them not to be included. There was no mention of 680's in SLI in Guru3D's review that was posted today, just that they would work together better than ever.
> 
> Why does Turkey price the 590's and 680's almost the same? That's just plain ridiculous.
> 
> That AeroCool is one fine looking case. Since I'm into watercooling though, my next one is going to be a CaseLabs of one form or another. I can't find any decent pics of them yet, comparing them side by side, or figure out what options I need since the options are also not pictured or explained very well. I need to visit the CaseLabs thread when I'm ready to bite the bullet.


I attended college in New York between 2000 and 2004, and in that period I actually got to know about Alienware. I was so in love with them; the cases looked good and the products seemed very good, but they were so pricey. I ended up getting a PC from TigerDirect in the end, but to this day I have always wanted to own an Alienware PC. I do agree that they unfortunately cost too much, and for that very reason I can never see myself getting a pre-built PC. Either I will build it myself or have it custom built to blow my mind away.

I wish the Aerocool case I ordered had the same quality as an AW case, because those cases always blow my mind away. They are perfect for gamers. Thank you, Masked, once again 

Strange that the 680 does not come with an SLI bridge, and to date I have yet to see an SLI review, but the card is new so we should have them soon. As I said earlier, I am 100% sure that if you have two 680s, you would want to SLI them instead of running 590 quad SLI. I am still undecided. I am sure I will not trade my 590 for a 680, though I might if I find the 680 to be really great; however, if I ever see a used 590 in good condition, I will just grab it and be done with this generation. It makes me happy that the 590 performs better than the 680 when the 680 is not OC'd or boosted, though I must admit I find the new features in the 680 quite good.

About the pricing: unfortunately Turkish vendors and the government like to rip us apart, so that's why the pricing is like that. The prices were announced 4 hours ago and people are in RAGE. It's so expensive; we get 1 GPU for the price of 2 GPUs. I am sure they will do something with the prices sometime next month, hopefully. Lots of people were waiting for Kepler, and the release of Kepler did not have any effect on the prices of 5xx cards, so everybody is disappointed here.

Yeah, I loved that Aerocool case. It looks so great and spacious. I hope that case will take me through at least 5 years; I know it would. But upgrading is a disease, so you never know what might happen in 2 years. If you ever saw my build log, when I first got my PC I said to myself that I would not upgrade anything, maybe do SLI but that's it. Boy was I wrong.

I had never heard of CaseLabs, but I had seen that case a few times before. How are they compared to a full tower case? Better airflow? More space? They look like a square as far as I can see.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> I had never heard of CaseLabs, but I had seen that case a few times before. How are they compared to a full tower case? Better airflow? More space? They look like a square as far as I can see.


The CaseLabs case I'm having built for my showcase build is costing me about $1,000 USD...So, 3-4x as expensive as yours, I'd venture a guesstimate.

Everything is more expensive in Turkey because of inflation/Tariffs -- That's never going to change.

It's cheaper for him to actually buy the cards here and then ship to Turkey but, then if he gets caught, he's slammed with taxes and the prices are virtually equal to those that he listed.

The Alienware case is of higher quality because it's actually an all-aluminum chassis weighing in at about 12-15 lbs...The one he's chosen has a significant amount of plastic and, from what I've ascertained, isn't of the same quality but, to each their own.


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> The case-labs case I'm having built for my showcase build is costing me about 1,000.00$ usd...So, 3-4x as expensive as yours...I'd venture a guesstimate.
> Everything is more expensive in Turkey because of inflation/Tariffs -- That's never going to change.
> It's cheaper for him to actually buy the cards here and then ship to Turkey but, then if he gets caught, he's slammed with taxes and the prices are virtually equal to those that he listed.
> The Alienware case is of higher quality because it's actually an all-aluminum chassis weighing in at about 12-15lbs...The one he's chosen has a significant amount of plastic and actually, from what I've ascertained, isn't of the same quality but, to each their own.


The thing is, if I have a 500 US dollar card shipped here, the shipping will be around 30-40 max, and if the item is stuck in customs, the most I will pay is like 100 US dollars. In short, it's still going to be cheaper. So I might actually import next time, or ask my brother-in-law to bring me one, as he works for Intel in San Francisco.

My in-laws are visiting him at the end of June. Last time they brought me the Asus G73SW laptop as a present, and this time I will probably ask for a good SSD, and IF Ivy Bridge CPUs come out, I will get one of them. My brother-in-law has a discount on Intel products such as SSDs and CPUs. Those items will not be held in customs either, as the customs officers don't really check luggage.

However, GPUs are too big, so it would be strange for me to ask my in-laws to bring one. But if my brother-in-law visits here in the winter of 2013, I might just ask him to bring me a GPU. Hopefully a 690 

That's what I actually wanted to say about quality. The Aerocool is around 30 lbs  Kind of heavy, but oh well.

Oh, by the way, Masked: the case comes with only 2 fans, one 230mm on the top and one 230mm on the front.

I can put 4 fans on the side, 1 on the bottom and 1 on the rear. I currently have that yellow fan. Should I put it on the bottom or the rear? What would you advise?

Last but not least, I wish one day I can have a CaseLabs case. I do love them; they really look more professional and classy.


----------



## Forty-two

So as people upgrade to the 680, I guess there will be a flood of 590s hitting eBay. I think I'll add a second ASUS GTX590 to my rig.


----------



## iARDAs

Quote:


> Originally Posted by *Forty-two*
> 
> So as people upgrade to the 680, I guess there will be a flood of 590s hitting eBay. I think I'll add a second ASUS GTX590 to my rig.


If I see a second-hand 590, I will definitely get one too, it seems.

It will be a much cheaper solution than selling my 590 and getting two 680s.


----------



## iARDAs

If I place one 200mm fan on the side here where the 4 fans go, would I benefit from this?

(The case does not come with the 4 fans; that area is optional. Either four 120/140mm fans, or one 200mm fan.)

I am guessing one 200mm fan can do the job and be quiet too.

What do you guys think?

Would it have effect on my 590?


----------



## rush2049

What does the airflow pattern inside the case look like, ignoring those side fans?


----------



## iARDAs

This is the supposed air flow in the case


----------



## rush2049

Is there going to be a fan at the bottom (near #5) blowing up? If there is, I would suggest only putting in a fan on the bottom left, or both left spots of those four side fan areas. You want a vacuum to pull the hot air from both your processor and video card out of the case.... having cold air forced against your motherboard is just going to cause problems when the exhaust has nowhere to go.

If there isn't a fan at the bottom blowing up, place 3 fans in all but the top right spot; hopefully it will create the correct pressure and force air up/out......


----------



## iARDAs

I have yet to get a bottom fan but i will be installing one for sure.

That's the number 1 priority before the side panel fans.

So here it is guys.

You can see my new case, and how i set it up in my LOG.

I will post just 1 picture here so as not to distract you guys from the topic



In IDLE the CPU is now 6-7 degrees cooler than my previous setup

In IDLE the GPU is now 2-3 degrees cooler than my previous setup ( a bottom fan and a side fan will definitely help)

I will talk about the LOAD temperatures when i do some testing.

So far.. GREAT!!!


----------



## Recipe7

Nice numbers bro! Definitely a great purchase.


----------



## iARDAs

Thanks, mate. I am loving it so far

FULL LOAD, The CPU is around 8-10 degrees cooler

FULL LOAD, the GPU is 2-3 degrees cooler, and my custom fan profile now runs 10% lower. Which is great.

Now, I would really like to get some advice on how to make my GPU even cooler.

I will grab myself a 3 pin 140mm fan for the bottom.

I am still undecided about the side panel.

Any suggestions?


----------



## Dennybrig

Guys, I'm leaving the club; I decided to jump ship for dual GTX 680s.

My PSU crapped out on me and failed, so I had to return it to Amazon.

I've really enjoyed following this thread and all your comments.

I will however enjoy my GTX 590s until I sell them; by the way, in Mexico I can get a lot more for them than in the US.

Thanks guys


----------



## iARDAs

Dear past or present 590 quad SLI owners,

I want to ask you something...

How great was or is your 590 quad SLI?

Do 4 GPUs give you guys lots of compatibility issues?

Would you recommend going quad SLI?

Also, could someone help me out with voltage tweaking? I am going to OC my 590's voltage, but I want to be safe.


----------



## Wogga

I have no compatibility issues except SWTOR; some games don't properly use all GPUs until new drivers or patches come out.
As for performance, I really can't say, because I came from an X1950 Pro (AGP).


----------



## Image132

Quote:


> Originally Posted by *iARDAs*
> 
> Dear past or present 590 quad SLI owners,
> 
> I want to ask you something...
> 
> How great was or is your 590 quad SLI?
> 
> Do 4 GPUs give you guys lots of compatibility issues?
> 
> Would you recommend going quad SLI?
> 
> Also, could someone help me out with voltage tweaking? I am going to OC my 590's voltage, but I want to be safe.


I've read it's not worth the money. The performance boost you get is nowhere near what you expect. I might need to be corrected here (Masked, care to fix my figures?) but it's something in the region of a 25-40% increase depending on the game. If you game with 2 or more monitors, that's a different story.

I personally don't think it's worth it. I'd rather save that cash for Maxwell and Haswell and upgrade to those. Not to mention the extra heat it would dump in your case if you don't watercool.


----------



## jethsmart

Quote:


> Originally Posted by *iARDAs*
> 
> Dear past or present 590 quad SLI owners,
> 
> I want to ask you something...
> 
> How great was or is your 590 quad SLI?
> 
> Do 4 GPUs give you guys lots of compatibility issues?
> 
> Would you recommend going quad SLI?
> 
> Also, could someone help me out with voltage tweaking? I am going to OC my 590's voltage, but I want to be safe.


I've had my retail EVGA GTX 590 Classifieds in quad SLI on air, running factory overclocked at 630 MHz, for about 5 months now... they've never given me any problems or compatibility issues in games or benchmarks. It's better than my previous 2 GTX 580s in SLI, for sure. If you have the money, go for it and get a 2nd GTX 590! It's a monster; just check out my rig.

I'm using my quad GTX 590s with my 3D Surround setup at 5760x1080 (triple-monitor 3D, Asus VG236H). NO complaints here. Microstutter is very minimal to a non-issue for me. To be honest, I hardly notice it even while playing games like BF3 or MW3 in 3D or 2D Surround. I really don't understand where people are getting these microstutter issues.

The only problem I'm having is the temperature hitting 87-90C at full load with a 95%-100% custom fan profile. Idle is 44-48C at 45% fan speed. Prior to adding a 2nd GTX 590, full load was 77-82C with 80% fan speed. I live in California with about a 29C ambient room temperature.

I sort of fixed the temperature issue on the quad setup by buying a portable AC and keeping my room ambient temperature at 24C or lower. This lowers the max temp on both my GTX 590s and the whole computer by 3-4C at FULL LOAD lol.
I'm still exploring other means to cool off my beastly quad GTX 590s; any suggestions will be appreciated. Water cooling is not an option.

Regarding voltage tweaking for OC, I wouldn't recommend it. Too much risk for too little gain in speed. The GTX 590 is already fast enough as it is. If you choose to overclock anyway, try NOT to touch the default voltage settings, just to be safe.


----------



## iARDAs

Thank you for the response folks.

I am still in a dilemma.

I can grab a second-hand 590 on the cheap side and be done with it, or sell my 590 and get two 680s.

I will wait a bit, though, and see if there will be a 690.

My 2 issues are as follows

1) My motherboard is not SLI compatible. I will change it of course, but grabbing a 690, if it ever comes out, will save me from this hassle.

2) My PSU is 1000W. I wonder if 2 590s, 1 SSD, 1 HDD and an i5 2500K OC'd at 4.5 GHz will work properly on a 1000W PSU. If not, I really don't want to have to change this PSU as well.

This decision is not a decision i will make soon so i still have time, but its nice to do some brainstorming.


----------



## jethsmart

I can try to answer your number 2 question:
I have an external wattage meter plugged into my computer. At full load (all 4 GPUs at 95%-100% in the Heaven benchmark), my meter goes up to 1080 watts. Check out my rig and compare it to the components you have to give you an idea. All my components are at factory stock settings: no manual OC and NO overvolting.

I myself am considering getting a 1500W unit, since I can feel my current Thermaltake Toughpower 1200W PSU get HOT when gaming. I would guess this is due to running close to its maximum output.
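One thing worth keeping in mind with those readings: a wall meter measures AC draw, while the PSU's 1200W rating is DC output, so the rig is pulling somewhat less from the PSU than the meter shows. A minimal sketch of the arithmetic, using the numbers from the post; the efficiency figure is an assumption (a typical value near full load), not a measurement:

```python
# Rough PSU load estimate from a wall-meter reading.
# wall_draw_w and psu_rating_w come from the post above;
# assumed_efficiency is a hypothetical typical value, not measured.
wall_draw_w = 1080         # AC draw shown on the external wattage meter
psu_rating_w = 1200        # Thermaltake Toughpower rated DC output
assumed_efficiency = 0.87  # assumption: typical efficiency near full load

dc_load_w = wall_draw_w * assumed_efficiency  # power the PSU delivers
load_fraction = dc_load_w / psu_rating_w      # share of rated capacity

print(f"Estimated DC load: {dc_load_w:.0f} W "
      f"({load_fraction:.0%} of the PSU rating)")
```

By this estimate the PSU is delivering roughly 940 W, close to 80% of its rating, which would be consistent with it running hot under gaming load.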


----------



## rush2049

Just a note to everyone: the latest Nvidia drivers (296.10) + my custom BIOS (stock voltage at 0.963 V)....

are letting me run my memory at GTX 580 speeds.
Stock the core is 612 MHz; I have it running at 690 MHz.
Stock the memory is 1710 MHz; I have it running at 2004 MHz.
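Those clocks work out to a sizeable bump over stock. A quick sketch of the percentages (numbers taken straight from the post; this is just arithmetic, nothing here touches the card):

```python
# Percentage overclock implied by the clocks quoted above
# (GTX 590 stock vs. the speeds reached on 296.10 + custom BIOS).
stock = {"core_mhz": 612, "memory_mhz": 1710}
overclocked = {"core_mhz": 690, "memory_mhz": 2004}

# gain in percent for each clock domain
gains = {d: (overclocked[d] / stock[d] - 1) * 100 for d in stock}

for domain, gain in gains.items():
    print(f"{domain}: +{gain:.1f}%")
```

That is roughly a 13% core and 17% memory overclock, which is why the memory result stands out: it is GTX 580 territory at GTX 590 voltage.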

I just ran 3dmark 11, got about 500 more marks than I usually get: http://3dmark.com/3dm11/3048328

Time to see if I can push the core speed any more than what I considered stable.....


----------



## rush2049

Nope, it seems the core needs more voltage to get up to 580 speeds.... which figures, because the default 580 voltage is 1.012 V or something.... realistically I wouldn't be able to get there with 0.962 V

But the memory results are interesting; the latest drivers must be improving stability on that front.... let's hope similar improvements continue!


----------



## Recipe7

Innnnnnteresting. I may just have to finally flash my 590.


----------



## rush2049

Someone else see if they can do it....... I am curious if it is just me or if the drivers really are getting better.


----------



## Recipe7

Rush, have you tried the new 301 drivers?


----------



## rush2049

I haven't, though that's because my CPU fan went kaput on me last night.... and I had to back off on my overclock....


----------



## Recipe7

I see. I will try to get around to doing it myself if I get the time.


----------



## iARDAs

Hey guys i added 2 more fans.

You can find them here

It did help by a few degrees, but I was hoping for more.

Also, running these fans at low RPM or high RPM does not make a drastic difference. I might just keep them running low.


----------



## heyskip

Quote:


> Originally Posted by *rush2049*
> 
> Someone else see if they can do it....... I am curious if it is just me or if the drivers really are getting better.


I can't really comment on whether the drivers are getting better because I haven't played with the memory clocks before. But with the latest drivers I am able to get to 1856 MHz before performance drops off. X3407 seems to be the best I can get at the moment; I couldn't complete the benchmark at anything above 1900 MHz. This is at 0.963 V and a 705 MHz core.


----------



## rush2049

Quote:


> Originally Posted by *heyskip*
> 
> Quote:
> 
> > Originally Posted by *rush2049*
> > 
> > Someone else see if they can do it....... I am curious if it is just me or if the drivers really are getting better.
> 
> I can't really comment on whether the drivers are getting better because I haven't played with the memory clocks before. But with the latest drivers I am able to get to 1856 MHz before performance drops off. X3407 seems to be the best I can get at the moment; I couldn't complete the benchmark at anything above 1900 MHz. This is at 0.963 V and a 705 MHz core.

Lower your core to 690 MHz or so and try higher memory...... My card was never able to go to 700 or above at 0.963 V; it might be limiting you.....


----------



## goku5868

I just got my EVGA GTX 590s in SLI...I want to waterblock them, please HELP








Here's the link; I need help. Thank you again, guys...









http://www.overclock.net/t/1236018/help-now#post_16837882


----------



## Shinobi Jedi

It's been crickets in here the last few days...

Is it because everyone's been sucked away deep into Mass Effect 3 like I have?

For those Gaming-Jedi who've already conquered it, super-please, no spoilers.

Yes, I know the ending sucks. I'm ready for this story and series to pull a Battlestar Galactica. Meaning, I expect Mass Effect 3 to ruin a great story that spanned three games in the course of a short playthrough, exactly the same way the explanation of Starbuck, or lack thereof, in the very last few minutes of the series finale completely and effectively douched a 4-year narrative in 5 minutes. I'm expecting ME3 to do something like that. But I've still managed to stay spoiler free, so thanks for keeping it that way.









Still.... It's crickets in here....


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> Yes, I know the ending sucks. I'm ready for this story and series to pull a Battlestar Galactica. Meaning, I expect Mass Effect 3 to ruin a great story that spanned over three games in the course of a short period/playthrough - Exactly the same way the explanation of Starbuck, or lack thereof in the very last few minutes of the series finale and completely and effectively douched a 4 year narrative in 5 minutes.. I'm expecting ME 3 to do something like that. But I've still managed to stay spoiler free.. So thanks for keeping it that way..
> 
> 
> 
> 
> 
> 
> 
> 
> Still.... It's crickets in here....


I liked the ending -- When we beta tested it, I remember the office, MANY TIMES, making suggestions on how to change/alter it to the point where it would've been "better", I guess?

I do know that all the beta testers recommended the ending be changed.

Personally, I think it goes with Shepard's whole montage and I was expecting a protagonist-like ending truth be told.

In fact, I even bought the Razer ME3 BWU because I needed a new keyboard!









Let me know what you think


----------



## Recipe7

I still have yet to play even the first installment of Mass Effect


----------



## ProfeZZor X

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> It's been crickets in here the last few days.......


I'm wondering if a lot of that has to do with the release of the 680. I have to admit that even I got caught up in posting in that thread. But seeing as I'm a newbie to watercooling, overclocking and PC trends, I'm in somewhat of a gullible position as to what's necessary to complete my build and what's just plain overkill.









I've found that Masked has provided some very sound and helpful information on the 590 so far. So I'll take that to the bank.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> I liked the ending -- When we beta tested it, I remember the office MANY TIMES, making suggestions of how to change/alter it to the point of where it would've been "better", I guess?
> I do know that all the beta testers recommended the ending be changed.
> Personally, I think it goes with Shepard's whole montage and I was expecting a protagonist-like ending truth be told.
> In fact, I even bought the Razer ME3 BWU because I needed a new keyboard!
> 
> 
> 
> 
> 
> 
> 
> 
> Let me know what you think


Nice, Masked! That gives me hope!

Mass Effect has become my personal favorite game/game series, specifically in terms of story/presentation, especially for an original story/franchise designed from the ground up for video games, compared to a great game based off an established property from another medium (KOTOR or Batman: AC, for example).

While the first game was definitely an RPG, I know it and its sequels have been a disappointment to many when judged by the parameters of that genre. But a good buddy of mine who works in the industry nailed it best, I think: the game isn't an RPG, but has actually morphed into an adventure game with some RPG overtones. And while I can understand why people who judge it by RPG genre conventions are disappointed with the series, I think when evaluated by the conventions of the adventure genre, it makes for an epic kick-ass game that deserves the same respect as the Uncharted series or Half-Life or any of the other worthy giants.

I'm trusting I'll be like you, Masked, and pick up on the subtle intent layered in by the story developers, which seems not to have communicated well to most audiences, to the point of being abrasive.

I'm still on the fence with how I feel about them adding DLC that amends the ending to appease fan reaction.


----------



## rush2049

Adding DLC for the ending is like....

Here is a $60 beginning and crescendo to a game; pay us another $20 and we will write an ending if enough people complain.....


----------



## iARDAs

I am planning on a 15 day visit to USA in mid June guys...

I will probably be picking up a GPU while I am there. I might grab myself a 680, or some higher card if one comes out by then. That is, of course, if the wife doesn't go crazy on the shopping. Also, purchasing an SSD is my top priority.

I might also take advantage of the OCN marketplace and see if there are any good products on sale.


----------



## Masked

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> Nice, Masked! That gives me hope!
> Mass Effect has become my personal favorite game/game series. Specifically in terms of story/presentation, especially for an original story/franchise designed ground up for video games, compared to a great game based off an established property from another medium. (KOTOR or Batman:AC for example)
> While the first game was definitely an RPG, I know it and its sequels have been a disappointment to many when judged by the parameters of that genre. But a good buddy of mine who works in the industry nailed it best, I think: the game isn't an RPG, but actually morphed into an Adventure game with some RPG overtones. And while I can understand why people who judge the series by RPG genre conventions are disappointed, I think that when evaluated by the conventions of the Adventure genre, it makes for an epic, kick-ass game that deserves the same respect as the Uncharted series or Half-Life or any of the other worthy giants.
> I'm trusting I'll be like you, Masked, and pick up on the subtle intent layered in by the story developers, which seems not to have come across to most audiences, to the point of being abrasive.
> I'm still on the fence with how I feel about them adding DLC that amends the ending to appease fan reaction.


Quote:


> Originally Posted by *rush2049*
> 
> Adding DLC for the ending is like....
> Here is a $60 beginning and crescendo to a game; pay us another $20 and we will write an ending if enough people complain.....


I mostly agree w/Rush on the DLC front...I mean, coming from a beta office -- That just doesn't make sense...Battlefield 3 did the same thing with BTK...It's just not good business much less good gaming...

I'm 100% for the artistry of a game and I personally feel that changing the ending to this game is flat out wrong...It's a work of art; the designers came together and created this, polished it enough to release it -- Respect that and let it be...If it sucks, tell the world it sucks and move on.

The cupcake thing, I also believe, was ingenious and one of the most intelligent things the gaming community et al, has ever done...So, while I disagree with the movement, you can't exactly attack their tactics...

Overall, it creates a stigma and a terrible precedent. Even releasing the DLC ending, IMO, is bad news because...What happens if I go play Zelda and I don't like the ending? Am I allowed to run out and sue Nintendo because Zelda: A Link to the Past has some MAJOR plot holes? Which it does but, I think I was 9 when I last played that game, maybe 10...What rights do I have? Should I have any? I mean, I paid $50...I want a different ending...So, while it is a terrible precedent, it brings some long overdue questions to the table.

There's a point to which the whole situation becomes irrational and I think it's gone there.

I can't say much about the ending but, for me, I read into things, so it made sense...I was disappointed in the ultimate outcome in that I didn't want Shepard to do what he did, but he did what he had to do. I had a gut feeling it would happen and, while I shouted NOOOOO at my computer monitor, I'm content with the way it ended...I thoroughly enjoyed the experience and, while I don't think the ending was blockbuster, it fits Shepard.

I think most people miss that...Look at who Shepard is and the decisions he makes, and it becomes extremely easy to rationalize the ending...That being said, there are some very small plot holes but, for the average, intelligent gamer, you'll put 2+2 together, which is what they wanted you to do.

HOWEVER, I can relate this to something VERY similar...I paid $850 plus shipping for a graphics card that was supposed to be the enthusiast's dream...It was supposed to be unlocked, overclockable and a work of genius.

It turns out to have locked me out of my fairy-tale ending...And now I have to rely on a different designer to write drivers that allow me to experience the card in full...

What rights do I have in regards to getting that fixed? Should I have to rely on this guy at Alienware for my information when Nvidia should be informing me of what's going on? He's abrasive, often a total d-bag and while he's often spot on, he is very arrogant sometimes...Why should I have to listen to him at all?

So, realistically, there are 2 sides to this coin and while I think Bioware did respond eventually, I feel like they dropped the ball...Just like Nvidia dropped ours.


----------



## squishysquishy

Hey guys. My BF3 client keeps crashing on me. I keep losing video and get the usual bad-driver buzzing sound. Then I have to shut off and reboot my system.

I was a couple of cycles behind on drivers, but it happens even when running the latest ones. Cooling is fine (upon reboot GPU temps were at 55C, processor temps at 58C), I did a mem test with no problems there, and it's annoying.

Any suggestions?


----------



## rush2049

What processor are you using? 58C is too high for an amd processor....

the heat of a gpu + cpu working at maximum might be enough to kill it and cause those types of issues.

Run a stress test on the gpu and cpu at the same time and watch the temperatures. (furmark and prime95 small fft)

BF3 in general is a very crash happy game though.... I get one at least every third day or so.....
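For anyone doing the combined FurMark + Prime95 run rush2049 suggests, the watching-the-temperatures step is just threshold checking; a toy sketch of that logic (the threshold numbers below are illustrative placeholders I picked, not vendor specs):

```python
# Toy temperature watchdog: flags readings that cross a per-component
# threshold during a combined CPU+GPU stress test.
# Thresholds are illustrative placeholders, not vendor specs.
THRESHOLDS_C = {"cpu": 62, "gpu": 90}

def over_limit(samples, thresholds=THRESHOLDS_C):
    """Return the (component, temp) pairs that exceed their threshold."""
    return [(name, t) for name, t in samples if t > thresholds[name]]

# Simulated readings taken once per second during the stress run
log = [("cpu", 55), ("gpu", 78), ("cpu", 58), ("gpu", 92), ("cpu", 64)]
print(over_limit(log))
```

In practice you would feed this from whatever monitoring tool you trust (HWMonitor, AIDA64, etc.) rather than hand-typed samples.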


----------



## squishysquishy

Quote:


> Originally Posted by *rush2049*
> 
> What processor are you using? 58C is too high for an amd processor....
> the heat of a gpu + cpu working at maximum might be enough to kill it and cause those types of issues.
> Run a stress test on the gpu and cpu at the same time and watch the temperatures. (furmark and prime95 small fft)
> BF3 in general is a very crash happy game though.... I get one at least every third day or so.....


I took the side cover off my PC, and everything was happy for an hour-long gaming session. I guess a 590, 4 HDDs and an i7 in a mATX case is too much for its thermals.

drats, time to cut a cpu inlet in the side of my tower.


----------



## Image132

Anyone know if the new nvidia drivers (296.10) support display port or do I still have to wait with that?

*EDIT Just found it does. At last.


----------



## iARDAs

No love in the new beta drivers for 590?


----------



## kzinti1

Has anyone tried the new Beta GeForce 301.24 drivers?

They support cards all the way back to GeForce 6100.


----------



## Wogga

i've tried. got new options for vsync and AA as stated in release. voltage is still locked. swtor now runs smoother with adaptive vsync


----------



## iARDAs

Did you guys benefit from adaptive sync?


----------



## Recipe7

All I know is that I still get ridiculous input lag in CS: GO and Source.

Have yet to try it on intense games.


----------



## heyskip

Quote:


> Originally Posted by *Wogga*
> 
> i've tried. got new options for vsync and AA as stated in release. voltage is still locked. swtor now runs smoother with adaptive vsync


I don't think there is a maximum voltage lock in these drivers as my 590 with modified bios runs at 1.025v


----------



## Wogga

Quote:


> Originally Posted by *heyskip*
> 
> I don't think there is a maximum voltage lock in these drivers as my 590 with modified bios runs at 1.025v


yeah, i mean you cant manually change voltage in AB for ex. like in some of the older drivers


----------



## Xraze

You can still set any voltage you want though, I just hate that stinking PDL.


----------



## heyskip

Quote:


> Originally Posted by *Wogga*
> 
> yeah, i mean you cant manually change voltage in AB for ex. like in some of the older drivers


Yeah true, but for the first time since 280.19 we can actually force a voltage higher than 0.975 which is a massive plus.
Quote:


> Originally Posted by *Xraze*
> 
> You can still set any voltage you want though, I just hate that stinking PDL.


PDL is definitely a pain but hopefully they keep most games off the trigger list.


----------



## Xraze

In what games are you hitting the PDL, heyskip? I'm running @ 1.000v, works perfectly in BF3. I hit the PDL in 3DMark & Heaven, unfortunately.


----------



## Wogga

Quote:


> Originally Posted by *heyskip*
> 
> Yeah true, but for the first time since 280.19 we can actually force a voltage higher than 0.975 which is a massive plus.
> PDL is definitely a pain but hopefully they keep most games off the trigger list.


i'm still on old bios (32), so 0.963v is my max, but anyways i dont need more. 2x590 @670 is overkill


----------



## Xraze

Quote:


> Originally Posted by *Wogga*
> 
> i'm still on old bios (32), so 0.963v is my max, but anyways i dont need more. 2x590 @670 is overkill


I can easily push to 710mhz @ 0.963v Stable.


----------



## heyskip

Quote:


> Originally Posted by *Xraze*
> 
> In what games are you hitting the PDL, heyskip? I'm running @ 1.000v, works perfectly in BF3. I hit the PDL in 3DMark & Heaven, unfortunately.


None yet. I have some stuttering in Crysis but that's to do with vsync/refresh rate issues. But yeah, 3DMark definitely.


----------



## FateZero

Hi all, I'm new here and I would like to enlist into the proud GTX 590 owners club








The brand of my GTX 590 is Nvidia itself, not a 3rd-party distributor, since I got the whole rig from Dell Alienware.

I would also like to ask, how come when I run the MSI Kombustor DX11 v2.0.0 it shows that only 1 GPU is doing the processing, particularly only GPU 2?
Is there something wrong with my settings, driver or card?


----------



## Smo

Quote:


> Originally Posted by *FateZero*
> 
> Hi all, I'm new here and I would like to enlist into the proud GTX 590 owners club
> 
> 
> 
> 
> 
> 
> 
> 
> I would also like to ask, how come when I run the MSI Kombustor DX11 v2.0.0 it shows that only 1 GPU is doing the processing, particularly only GPU 2?
> Is there something wrong with my settings, driver or card?


Welcome to the club dude. Kombustor has a known issue working with SLi - if you search Google there's a file you modify (kombustor.ini or similar) to enable it.
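For reference, the fix people describe is flipping a single flag in Kombustor's config file. The file name matches what Smo mentions, but the section and key names below are assumptions from memory and may differ by Kombustor version, so check your own install:

```ini
; kombustor.ini -- section/key names assumed, verify against your version
[settings]
; 0 = render on one GPU only (default), 1 = enable multi-GPU/SLI rendering
multi_gpu=1
```

Edit the file with Kombustor closed, then relaunch and both GPUs should register load.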


----------



## Masked

There have only been a few occasions in my life where I actually regret selling hardware...1 of them was the 9800GX2...The other is my 590...

The new drivers, at the base level, indicate that for the most part, the lock is gone...Obviously you're still limited by the TDP/OCP but, for the most part, 0.97v is safe.

For the VAST MAJORITY of you using (I'm about to generalize) anything copper rated and below ~ keep in mind the rails aren't exactly perfect, and on draw you may experience ±0.03v swings, so take that into account...

As Xraze pointed out ~ Nvidia basically "let go" of the lock and once re-flashed, you get rid of the bios limitations.

Truth be told, the 590 actually does have a good platform base in terms of the profiles//development...


----------



## Xraze

Quote:


> Originally Posted by *Masked*
> 
> There have only been a few occasions in my life where I actually regret selling hardware...1 of them was the 9800GX2...The other is my 590...
> The new drivers, at the base level, indicate that for the most part, the lock is gone...Obviously you're still limited by the TDP/OCP but, for the most part, 0.97v is safe.
> For the VAST MAJORITY of you using (I'm about to generalize) anything copper rated and below ~ Keep in mind the rails aren't exactly perfect and on draw, you may experience +/- .03 so, take that into account...
> As Xraze pointed out ~ Nvidia basically "let go" of the lock and once re-flashed, you get rid of the bios limitations.
> Truth be told, the 590 actually does have a good platform base in terms of the profiles//development...


While I'm glad we have control over voltage again, the PDL is still blocking us. For example my PDL starts hitting at 0.988v which sucks. I can't test benchmarks either to know if I'm even gaining performance in games that don't have PDL on the trigger. I'm really hoping there will be some way to modify the PDL.


----------



## Masked

Quote:


> Originally Posted by *Xraze*
> 
> While I'm glad we have control over voltage again, the PDL is still blocking us. For example my PDL starts hitting at 0.988v which sucks. I can't test benchmarks either to know if I'm even gaining performance in games that don't have PDL on the trigger. I'm really hoping there will be some way to modify the PDL.


If the PDL exists above the OCP, it can really only be done at the driver level...

There are also micro-stutter and fluctuation issues that exist over 1.00v so, I can certainly understand why the limit is 0.988...

Hrm...

In that case, I'd say don't go above 0.975v which, is still a big boost...

At 0.98, I had my cards at 800mhz...Which is about the point of the fluctuation issues etc...

I'll shoot out an email and get back to you guys


----------



## rush2049

Okay explain to me real quick....

the 301.24 drivers have the voltage lock gone? So if I re-flash to my default bios I can adjust voltage like normal?


----------



## Xraze

Quote:


> Originally Posted by *Masked*
> 
> If the PDL exists above the OCP, it can really only be done at the driver level...
> There are also micro-stutter and fluctuation issues that exist over 1.00v so, I can certainly understand why the limit is 0.988...
> Hrm...
> In that case, I'd say don't go above 0.975v which, is still a big boost...
> At 0.98, I had my cards at 800mhz...Which is about the point of the fluctuation issues etc...
> I'll shoot out an email and get back to you guys


You what? 800MHz @ 0.988v? That's insane. Mine only goes up to 740MHz on 0.98v. I need 1.025v for stable 780MHz.
Quote:


> Okay explain to me real quick....
> 
> the 301.24 drivers have the voltage lock gone? So if I re-flash to my default bios I can adjust voltage like normal?


Nope, you have to set it in the bios using NiBiTor.


----------



## YP5 Toronto

Quote:


> Originally Posted by *rush2049*
> 
> Okay explain to me real quick....
> the 301.24 drivers have the voltage lock gone? So if I re-flash to my default bios I can adjust voltage like normal?


yes what he said....someone simplify for us.


----------



## Xraze

Quote:


> Originally Posted by *YP5 Toronto*
> 
> yes what he said....someone simplify for us.


I'll post it here so you guys can see:

I found it! If you want a certain voltage with any drivers just use this method. Download http://www.overclock.net/attachments/1235
Use NiBiToR 6.4 to change the voltages to


Choose the voltage you want in Setting 2 and keep the other settings exactly as shown, and your card will undervolt itself in 2D mode (idle). Edit: works with the latest Nvidia 301.24 drivers. Happy overclocking! Note that 0.8225v can be too little voltage for your card at idle, so you can up that a bit.


----------



## Recipe7

Wow, this is some interesting news. This program can be used with the stock BIOS correct?


----------



## Shinobi Jedi

If everyone is pretty much using stock clocks when they game, my question is, will raising voltages help performance even at stock clocks? If I'm running EVGA fan bios, do I still need to re-flash? Or is that not necessary with this program?

Will this help with my still happening crashes to desktop in SWTOR?

...

I finished Mass Effect 3. If that's the ending, then I haven't been kicked that hard in the balls since 3rd grade outside of the school gym. And while I hope for a new DLC ending, I don't think DLC as an ending was ever intentional. I think the ending presented was what they intended. And if so, then my faith in them as storytellers, and a little as game designers, is severely shaken. As in, *** were they thinking, that that would fly? I'm convinced this is why the game was leaked. Someone on the team knew it was horrible and was hoping fan outrage would get the project decision makers to see reason. They didn't, so they didn't change anything, and just as those leakers surmised, the fans hated it and Bioware caught deserved heat for it.

And just to be clear, Bioware is my favorite studio. I really think it's a case of them spreading themselves too thin. But I guess Dragon Age 2 was a real disappointment, and a lot of people are unhappy with SWTOR, though I went straight back to it with the new patch. I'm going to reserve my final opinion until we all see what they come up with.

Regardless, this is one time where I have no problem paying for DLC for a better ending. Who else thinks Bioware is going to run with the wild fan theory floating around the web, since it makes more narrative sense than anything they came up with?

Though after finishing that and with SWTOR servers down on this rainy day in Los Angeles, I am feeling starved for games. I built this rig for BF3, but SWTOR sucked me away and I haven't felt the drive to go back since the game has reportedly become cheat city. Though I did log in not that long ago and was amazed how much performance in the game has improved with my 590's.

I'm totally stuck on Batman:AC and pissed the game has only checkpoints and not a save system where I can go back and beef up my Batman stats before getting stuck in the end mob fights when they start busting out stun sticks and shooting at you from the corners.

Skyrim is just too big and open for my story-craving side, but even more so, for some reason I just can't get into Old World Fantasy that much lately. Meaning, I really dig Modern Day Fantasy like Harry Potter. But I'm a geek who loves comics and space operas and anime, and I'm just kind of burnt out on old world fantasy settings, much the same as I am on Apocalyptic Wasteland settings.

I guess what I'm saying is, Max Payne 3 can't come soon enough.

Any game suggestions are welcome.

@Masked:

If it's any solace, your regrets make me feel reassured about holding onto mine for awhile with all the (deserved) 680 hype blowing up right now.

This unlocking kind of reminds me of the go-around on my last set of cards. I was interested in the GTX 295 at the time. But all the hate over the price and boutique-ness of the card, and the fear bombs about running dual GPUs on one card, convinced my super-noob self to go with two GTX 275s instead, much the same as many went with two 570s instead of a 590.

And then, lo and behold, when I was shopping for new cards a year ago and researching, I kept coming up on comment after comment about how great a card the GTX 295 turned out to be, and how many preferred it over the initial 400 series.

These new drivers and the unlocking of voltage feel like the beginning of that renaissance period the GTX 295 had. I guess only time will tell.

Lastly, any word on that SLI profile from Nvidia for SWTOR? Especially now that the Jesus patch went live?

These are good days to be a GTX 590 Shinobi... Believe it.


----------



## Tuthsok

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> If everyone is pretty much using stock clocks when they game, my question is, will raising voltages help performance even at stock clocks? If I'm running EVGA fan bios, do I still need to re-flash? Or is that not necessary with this program?
> Will this help with my still happening crashes to desktop in SWTOR?
> ...
> I finished Mass Effect 3. If that's the ending, then I haven't been kicked that hard in the balls since 3rd grade outside of the school gym. And while I hope for a new DLC ending, I don't think DLC as an ending was ever intentional. I think the ending presented was what they intended. And if so, then my faith in them as storytellers, and a little as game designers, is severely shaken. As in, *** were they thinking, that that would fly? I'm convinced this is why the game was leaked. Someone on the team knew it was horrible and was hoping fan outrage would get the project decision makers to see reason. They didn't, so they didn't change anything, and just as those leakers surmised, the fans hated it and Bioware caught deserved heat for it.
> And just to be clear, Bioware is my favorite studio. I really think it's a case of them spreading themselves too thin. But I guess Dragon Age 2 was a real disappointment, and a lot of people are unhappy with SWTOR, though I went straight back to it with the new patch. I'm going to reserve my final opinion until we all see what they come up with.
> Regardless, this is one time where I have no problem paying for DLC for a better ending. Who also thinks Bioware is going to run with the wild fan theory floating the web since it makes more narrative sense than anything they came up with?
> Though after finishing that and with SWTOR servers down on this rainy day in Los Angeles, I am feeling starved for games. I built this rig for BF3, but SWTOR sucked me away and I haven't felt the drive to go back since the game has reportedly become cheat city. Though I did log in not that long ago and was amazed how much performance in the game has improved with my 590's.
> I'm totally stuck on Batman:AC and pissed the game has only checkpoints and not a save system where I can go back and beef up my Batman stats before getting stuck in the end mob fights when they start busting out stun sticks and shooting at you from the corners.
> Skyrim is just too big and open for my story craving side, but even more so, for some reason I just can't get into Old World Fantasy that much lately. Meaning, I really dig Modern Day Fantasy like Harry Potter. But I'm a geek who loves comics and space opera's and anime, and I'm just kind of burnt out on old world fantasy settings much the same as I am on Apocalyptic Wasteland Settings.
> I guess what I'm saying is, Max Payne 3 can't come soon enough.
> Any game suggestions are welcome.
> @Masked:
> If it's any solace, your regrets make me feel reassured about holding onto mine for awhile with all the (deserved) 680 hype blowing up right now.
> This unlocking kind of reminds me of the go-around on my last set of cards. I was interested in the GTX 295 at the time. But all the hate over the price and boutique-ness of the card, and the fear bombs about running dual GPUs on one card, convinced my super-noob self to go with two GTX 275s instead, much the same as many went with two 570s instead of a 590.
> And then, lo and behold, when I was shopping for new cards a year ago and researching, I kept coming up on comment after comment about how great a card the GTX 295 turned out to be, and how many preferred it over the initial 400 series.
> These new drivers/unlocking of voltage, feels like the beginning of that renaissance period the GTX 295 had. I guess only time will tell.
> Lastly, any word on that SLI profile from Nvidia for SWTOR? Especially now that the Jesus patch went live?
> These are good days to be a GTX 590 Shinobi... Believe it.


Heh, got into reading this and thought I was back on BioWare's forums for a sec 

I am going to have to give those new Beta 300 series drivers a try! Sounds like they are working quite nicely for 590 owners.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Tuthsok*
> 
> Heh, got into reading this and thought I was back on BioWare's forums for a sec
> I am going to have to give those new Beta 300 series drivers a try! Sounds like they are working quite nice for 590 owners.


Yeah, I totally deserve that. My bad! There's just no one else in my world except this club to have an intelligent conversation about a game with. And I'm sure you can't have one on the BW forums.

I'm still holding my final opinion until after the DLC drops. I don't mean to jump on the hate wagon. But man, I went into it with really low expectations after the response blew up, and I guess I'm amazed at how disappointing (so far) it still was.


----------



## mywifeispist

Quote:


> Originally Posted by *heyskip*
> 
> Yeah true, but for the first time since 280.19 we can actually force a voltage higher than 0.975 which is a massive plus.
> PDL is definately a pain but hopefully they keep most games off the trigger list.


Weird thing with mine I noticed today: after I completely reformatted, I went to adjust the voltage on my 590 through the Afterburner beta and it was locked at 0.913. I modded my bios a while back to a default 650/1300/1800 and voltage to 0.963, and it will not move off of 0.913 now that I have redone everything....... Any ideas besides reflashing?


----------



## Tuthsok

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> Yeah, I totally deserve that. My bad! There's just no one else in my world except this club to have an intelligent conversation about a game with. And I'm sure you can't have one on the BW forums.
> I'm still holding out final opinion after the DLC drops. I don't mean to jump on the hate wagon. But man, I went into it with really low expectations after the response blew up. And I guess I'm amazed at how disappointing (so far) it still was.


LOL! That's ok I hear ya









On top of fixing up the ending I wish they would build for PC and then port to XBOX. That way the game may actually push my Quad SLI dual 590 setup! ( just to keep things on topic ^_^ )


----------



## emett

Play it in 3d.


----------



## FateZero

Quote:


> Originally Posted by *Smo*
> 
> Welcome to the club dude. Kombustor has a known issue working with SLi - if you search Google there's a file you modify (kombustor.ini or similar) to enable it.


Okay thanks







So I guess it isn't my card's problem.. what a relief.
I also found out why my AW2310 monitor's refresh rate will not go any higher than 60Hz..
I guess I have to get a dual-link DVI cable, huh? Currently using HDMI.
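The 60Hz cap is a link-bandwidth issue: single-link DVI tops out at a 165MHz pixel clock, while 1080p at 120Hz needs roughly double that, which is what dual-link DVI's second set of TMDS pairs provides. A back-of-the-envelope check (the blanking figures below are ballpark assumptions, not exact monitor timings):

```python
# Rough pixel-clock estimate: active pixels plus typical blanking overhead.
# The default blanking totals are ballpark figures, not exact EDID timings.
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=280, v_blank=45):
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

SINGLE_LINK_DVI_MHZ = 165  # per-link TMDS limit in the DVI spec

for hz in (60, 120):
    clk = pixel_clock_mhz(1920, 1080, hz)
    print(f"1080p@{hz}Hz needs ~{clk:.0f} MHz, fits single link: {clk <= SINGLE_LINK_DVI_MHZ}")
```

1080p60 comes out around 148.5MHz (under the 165MHz limit), while 1080p120 needs roughly 297MHz, hence the dual-link requirement for the AW2310's 120Hz mode.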


----------



## mywifeispist

Quote:


> Originally Posted by *Xraze*
> 
> I'll post it here so you guys can see:
> I found it! If you want a certain voltage with any drivers just use this method. Download http://www.overclock.net/attachments/1235
> Use NiBiToR 6.4 to change the voltages to
> 
> Choose the voltage you want in Setting 2 and keep the other settings exactly like shown and your card will undervolt itself in 2D mode ( Idle ) . Edit: Works with latest Nvidia 301.24 Drivers. Happy overclocking! Note that 0.8225v can be too little voltage for your card in idle so you can up that a bit.


i set mine in NiBiToR and made default clocks at 650/1300/1800
.913 - .913
.925 - .950
1.025 - 1.025
limit to: 1.063
Works really well with the new drivers, and stable at 725 core running in 3-way SLI on the first pass; gonna see how far it will go........ thanks for the info
http://3dmark.com/3dm11/3201163


----------



## Xraze

Quote:


> Originally Posted by *mywifeispist*
> 
> i set mine in NiBiToR and made default clocks at 650/1300/1800
> .913 - .913
> .925 - .950
> 1.025 - 1.025
> limit to: 1.063
> works real good with the new drivers and stable at 725 core on first core running in 3way SLi first pass, gonna see how far it will go........ thanks for the info
> http://3dmark.com/3dm11/3201163


How are you running 3 way 590 SLI?


----------



## mywifeispist

I dedicated GPU 2-A to PhysX and it will change to 3-way SLI. I was running quad SLI with 3 monitors, but when I wanna play on my 42" plasma in 1080p I run 3-way so it won't bottleneck in a single-monitor situation. Quad SLI is great for 5760x1080, but if you switch to a single 1920x1080 it's not so good. I actually like the plasma over the 3-monitor setup; I was running 3 Samsung 27" LEDs and it was just way too much to look at, honestly...... over 6 ft. wide. Also just tried 750 core stable in 3-way; gonna try a little more at a time just for the hell of it, even though my new default settings of 650/1300/1800 are more than enough.


----------



## mywifeispist

these are my ASUS GTX 590 bios if someone wants to use them
settings are: default clocks 650/1300/1800
voltages:
.913 - .913
.925 - .950
1.025 - 1.025
limit to: 1.063

590 bios oc.zip 92k .zip file


----------



## Masked

Quote:


> Originally Posted by *mywifeispist*
> 
> these are my ASUS GTX 590 bios if someone wants to use them
> settings are: default clocks 650/1300/1800
> voltages:
> .913 - .913
> .925 - .950
> 1.025 - 1.025
> limit to: 1.063
> 
> 590 bios oc.zip 92k .zip file


You guys realize that with the ±0.03 fluctuation, and the fact that the card has a max voltage of 1.05 before it blows, if you go to 1.02 or above you're basically going to blow up your cards, right?

Just sayin.
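Masked's warning in numbers: with the ±0.03 rail swing he describes, a 1.025 setpoint can transiently exceed the 1.05 ceiling he cites, while 0.975 stays clear. A quick check using only the figures quoted in this thread:

```python
# Worst-case transient voltage given the +/-0.03 rail swing and the
# 1.05 "max safe" ceiling quoted in the posts above.
RAIL_SWING = 0.03
MAX_SAFE_V = 1.05

def worst_case(setpoint, swing=RAIL_SWING):
    """Peak voltage the card may momentarily see for a given setpoint."""
    return setpoint + swing

for v in (0.975, 0.988, 1.025):
    peak = worst_case(v)
    print(f"set {v:.3f} -> peak {peak:.3f}, exceeds {MAX_SAFE_V}: {peak > MAX_SAFE_V}")
```

Under those assumptions, 1.025v peaks at 1.055v, over the quoted limit, which is exactly the scenario being argued about below.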


----------



## Xraze

Quote:


> Originally Posted by *Masked*
> 
> You guys realize that with the ±0.03 fluctuation, and the fact that the card has a max voltage of 1.05 before it blows, if you go to 1.02 or above you're basically going to blow up your cards, right?
> Just sayin.


The limit is closer to 1.063v and 1.025v is perfectly fine for a card under water.


----------



## Masked

Quote:


> Originally Posted by *Xraze*
> 
> The limit is closer to 1.063v and 1.025v is perfectly fine for a card under water.


Considering I'm the only one in this thread that's blown up 3+ ~ I strongly, vehemently, disagree.

Nvidia's press release stated 1.05, I've proven to myself it's 1.05 ~ And I've been under water for over 3 years so...Yeah.

Heed it or don't, I don't have a 590 to blow up anymore


----------



## mywifeispist

Both of my cards are on full waterblocks, and all my cards always are....... I raised my clocks to 750/1500/1800 and the cards are only using .988 of the 1.025 limit.


----------



## Masked

Quote:


> Originally Posted by *mywifeispist*
> 
> both of my cards are on full waterblocks and all my cards always are....... i raised my clocks to 750/1500/1800 and the cards are only using .988 of the 1.025 limit.


It doesn't matter if you're on water or not.

Heat dissipation has absolutely nothing to do with the electrical load on your mosfets...Thermally, maybe, but cooling has VERY little effect on voltage transfer etc.

1.05 is the max "safe" voltage for the card, which was noted back in March '11.

I've had five 590s and blown up 3, and all 3 were blown going above 1.05 ~ The VRM load just can't be handled by the hardware on the card.

I don't know in what world water-cooling somehow dissipates the load on VRM's but, as I said, I don't have a 590 anymore, don't work for a company that uses any and if you want to blow up your own, that's fine...But, 1.05 is the max safe and if you go above that, you do so at your own risk...

So have at it


----------



## heyskip

Quote:


> Originally Posted by *mywifeispist*
> 
> i set mine in NiBiToR and made default clocks at 650/1300/1800
> .913 - .913
> .925 - .950
> 1.025 - 1.025
> limit to: 1.063
> works real good with the new drivers and stable at 725 core on first core running in 3way SLi first pass, gonna see how far it will go........ thanks for the info
> http://3dmark.com/3dm11/3201163


.913 is a bit high for idle. Could safely change that to the stock .875 and save yourself a bit of power/heat.


----------



## mywifeispist

With that bios setting the cards do not use more than .988 at 750 core; I monitor it on another screen in AIDA64. Cards idle at 28-30C and hit 45-47C load, which I think is pretty decent.


----------



## mywifeispist

Quote:


> Originally Posted by *heyskip*
> 
> .913 is a bit high for idle. Could safely change that to the stock .875 and save yourself a bit of power/heat.


Made the change for idle from .913 to .875 and it made no difference for me, but it may for someone else. Also changed the 3D voltage from 1.025 to 1.005 and it still only needs .988 of that to hit 750 on the core.

Backed the clocks down in PrecisionX to 700/1400/1800 with those changes to the bios and it seems to run better, really; temps are the same all around but performance was better all around.

running in 3way SLi - http://3dmark.com/3dm11/3204366


----------



## emett

You realize one of those 590s isn't being used for the benchmark, right? Correct me if I'm wrong, but how is it possible to run 3 590s in SLI? Each card only has one SLI connector.


----------



## mywifeispist

Quote:


> Originally Posted by *emett*
> 
> You realize one of those 590s isn't being used for the bench mark right? Correct me if i'm wrong but how is it possible to run 3 590's in sli? Each card only has one sli connector.


Actually, it does see and use all three GPUs. I can run 2-way SLI in 3DMark 11 and I'm around 11k; in 3-way, 14k; and in quad, over 16k.....so yes, it sees and uses all three. I am running them in 3-way because I no longer use a 3-monitor setup.
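Taking those quoted scores at face value, the per-GPU scaling drops off quickly past two GPUs, which is typical of 3- and 4-way setups. A quick efficiency calculation from the approximate numbers in the post:

```python
# SLI scaling efficiency from the 3DMark 11 scores quoted above
# (approximate figures: 11k / 14k / 16k for 2 / 3 / 4 GPUs).
scores = {2: 11000, 3: 14000, 4: 16000}

def efficiency(n_gpus, scores=scores):
    """Score per GPU relative to the per-GPU score of the 2-GPU run."""
    baseline = scores[2] / 2
    return (scores[n_gpus] / n_gpus) / baseline

for n in (2, 3, 4):
    print(f"{n} GPUs: {efficiency(n):.0%} per-GPU efficiency")
```

By this rough measure the third GPU runs at about 85% of the 2-way per-GPU rate and the fourth at about 73%, so each extra GPU adds less than the last.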


----------



## Shinobi Jedi

Quote:


> Originally Posted by *mywifeispist*
> 
> Weird thing with mine I noticed today: after I completely reformatted, I went to adjust the voltage on my 590 through the Afterburner beta and it was locked at 0.913. I modded my bios a while back to a default 650/1300/1800 and voltage to 0.963, and it will not move off of 0.913 now that I have redone everything....... Any ideas besides reflashing?


I'm in the same boat.

I'd still like to know if upping the voltage while keeping stock clocks in gaming is worth it and/or improves performance. Because if not, then I'll just leave things be.


----------



## Masked

Quote:


> Originally Posted by *mywifeispist*
> 
> actually they do see all three and use all three gpu's, i can run 2way sli in 3dmark 11 and im around 11k in 3way 14k and in quad over 16k .....so yes it sees and uses all three. i am running them in 3way because i no longer use a 3 monitor setup.


I just got off the phone w/my buddy at Nvidia and considering an entire driver-recreation would be necessary for this to happen and a custom SLI-Link...I'm calling 100% shens...Total BS.

I want pictures...You're a new member, I've tried this at Alienware and failed...Prove to me it's possible.

Not through DirectX because even a noob can jack their scores -- I want picture perfect proof that the SLI is enabled VIA Bios/Driver and proof of the SLI working in co-op.

Keep in mind, for this to work -- You'd need an entirely custom DX as well since tri-sli 590's isn't supported at the core of the programming.


----------



## ProfeZZor X

This weekend I hooked up my wiring after flushing out all of my water cooling hardware, and the last thing I connected was my 590. Needless to say, after opening up the packaging I wasn't too happy with the length of the cables EVGA supplied, nor was I happy with the yellow wires sticking out of the two 8-pin connectors.

Does anyone know if there's a longer (matching pin) alternative 8-pin cable or extended power cables for the 590 out there? I tried the extended 8-pin at Fry's, but the pins didn't match.


----------



## khemist

delete


----------



## emett

Quote:


> Originally Posted by *Masked*
> 
> I'm calling 100% shens...Total BS.


Yeah thought so..


----------



## iARDAs

Hey folks.

I have to ask a question.

Is 680 OCed better than 590 for sure?

I am thinking of grabbing a 680 now and a second one later towards the end of summer.

Also, our 590 has a 768-bit (2 x 384-bit) memory bus and the 680 is 256-bit.

What kind of a difference does this have?

I AM happy with my 590, but a friend of mine really likes the 590 a lot and wants to get a dual-GPU card. A brand new one is out of stock, so he wants to get mine. He is in love with the 590s even more than I am.


----------



## emett

Quote:


> Originally Posted by *iARDAs*
> 
> Hey folks.
> 
> Is 680 OCed better than 590 for sure?


No..
Quote:


> Originally Posted by *iARDAs*
> 
> Also our 590 is 768 bit and 680 is 256 bit.
> 
> What kind of a difference does this have?


NFI, but your average Joe with an overclocked 680 is only scoring less (3DMark 11), if not the same, as an overclocked 590. Keep in mind that this is due to throttling of the 590 by the drivers in 3DMark 11. The 590 is a faster card.
Quote:


> Originally Posted by *iARDAs*
> 
> I AM happy with my 590 but a friend of mine really likes the 590 a lot and wants to get a Dual Gpu card and a brand new one is out of stocks so he wants to get mine. He is in love with the 590s even more than I am.


If I had someone ready to pay top dollar for my 590 I'd sell it and get a 680, only because SLI 680s use a hell of a lot less power.


----------



## iARDAs

Thank you emett for the answer...

I pretty much agree with everything you are stating. I ADORE my 590 and it is great for my needs but again the other party is really interested.

I can pretty much sell the 590 to him and get a brand new 680.

I am currently in a dilemma at the moment.

The thing is, I can sell my 590 for around 600 US dollars here in Turkey, and I will be visiting the USA in June, so I can grab myself a 680 for around 550 US dollars with tax etc.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Thank you emett for the answer...
> 
> I pretty much agree with everything you are stating. I ADORE my 590 and it is great for my needs but again the other party is really interested.
> 
> I can pretty much sell the 590 to him and get a brand new 680.
> 
> I am currently in a dilemma at the moment.
> 
> The thing is I can sell my 590 for around 600 US dollars here in Turkey and I will be visiting USA in June so I can grab myself a 680 around 550 US dollars with tax and etc.


I now have 3 680s and I'll be the first to tell you...They're not really worth the hype.

It is true that this is their "mid-range" so, would I sacrifice a 590 for it? No.

It's not mid-range in the sense of its scores, obviously but, it is mid-range in terms of the hardware on the card.

Quite personally, I'd wait -- Sept/Oct was the date I received for round 2 and that's when the "real" 680 will be released...Get it then but, not now.


----------



## mywifeispist

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> I'm in the same boat.
> I'd still like to know if upping the voltages while keeping stock clocks in gaming are worth it, and/or improve performance? Because if not, then I'll just leave things be.


I reflashed the cards, reinstalled the drivers and Precision X, and everything is good now.


----------



## Xraze

Masked, if my 590 was on water... what voltage would you recommend? Something that could push some good Mhz but not blow it up preferably.


----------



## Masked

Quote:


> Originally Posted by *ProfeZZor X*
> 
> This weekend I hooked up my wiring after flushing out all of my water cooling hardware, and the last thing I connected was my 590. Needless to say, after opening up the packaging I wasn't too happy with the length of the cables evga supplied, nor was I happy with the yellow wires sticking out in the two 8-pin connectors.
> Does anyone know if there's a longer (matching pin) alternative 8-pin cable or extended power cables for the 590 out there? I tried the extended 8-pin at Fry's, but the pins didn't match.


I know that Lutr0 could make you some if you wanted...

I might be able to but, I'd need the connectors...I do have the sleeving, though...Hrm.
Quote:


> Originally Posted by *Xraze*
> 
> Masked, if my 590 was on water... what voltage would you recommend? Something that could push some good Mhz but not blow it up preferably.


I'm still kind of hot/bothered over the fact that someone tried to butcher Ohm's law in regards to "water cooling"...Really?

Temperature creates no resistance or acceleration on a graphics card because it doesn't thermally reach a point in either direction that it would have //ANY// effect on the voltage transfer...

Obviously, Joule's first law applies in relation to the thermal conductivity producing heat but, again, this has 0 effect on the load the VRM's can handle...It doesn't mean if watercooled they can handle more...That's absurd.

I'd have a good debate with you about acceleration/resistance on N2 but, in no/way/shape/form is that card STOCK at 1.06 -- No...Kingpin had to use an EVbot to legitimately BOOST the voltage and bypass the VRM's altogether.

Since we've had this discussion at least 20x, I'll repeat what I can say...If there's something I shouldn't have said, PM me and I'll edit it.

When this card was released, within a day, before it even hit testers' labs, Nvidia discovered the VRM load was frying cards -- Actually, this was discovered in Feb but, only thoroughly tested on sample release...I was among the first to get samples (thank you UPS) and immediately following, 3 emails showed up in my box...All 3 stated the maximum SAFE voltage...It's 1.05...~ That's not 1.06v or 1.052 or 1.051, that's 1.050v.

Sweclockers had their cards at 1.2v ~ We've all seen the Vrms POP like a bag of popcorn ~ They just can't handle the load transfer at //ANYTHING// over 1.05v...

Like I said, I've blown up 3 of them...Sadly...And it was because the 1st core was overvolting (something I mentioned in a past post as well)...

While your primary core may be at 1.02v, there is a +/- 0.03 fluctuation, especially under load ~ Again, it is IMPOSSIBLE via Ohm's law for a +/- 10c change in temperature to affect the voltage transfer in this situation...So NO, watercooling does NOT prevent this from happening NOR does it raise your voltage roof.

When overclocking the card, that +/- 0.03 does change...Why? You're now demanding more load, which demands more power...Again, at 30c - 50c there is absolutely NO CHANGE to the amount of transfer the VRM's can handle.

If you're going to overclock this card, I would not SAFELY go over 0.95v -- Why?

It allows for max transfer, VRM load and overall, ensures you're not going to blow your card up.

I've blown 1 card at 1.00v because the load at 800mhz shot up over capacity and "snap crackle pop"...The other blew at 0.99v ~ and they actually blew within a week of each other. ~ When I later tested the NEW cards under load, 1.00v became 1.05v on the 2nd core...The load was just unreal...

So realistically, as I said, your load on the hardware can fluctuate and I wouldn't go over 0.95 ~ Again, it's your card...Your investment...Ultimately your decision...But, don't be ignorant and think that water cooling "dissipates voltage load" because that's just about the dumbest thing I've ever heard on this forum.
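To put rough numbers on why a small voltage bump costs so much headroom: dynamic power is commonly approximated as scaling with f * V^2. This is a back-of-the-envelope model, not a measurement; the clock/voltage pairs below are figures quoted in this thread (0.988 V at 750 MHz reported stable, 1.05 V as the claimed VRM ceiling), and the function name is illustrative:

```python
# Back-of-the-envelope: dynamic power scales roughly with f * V^2,
# so a modest voltage bump raises VRM load disproportionately.
def relative_power(f1_mhz, v1, f2_mhz, v2):
    """Power at (f2, v2) relative to (f1, v1) under the f * V^2 model."""
    return (f2_mhz / f1_mhz) * (v2 / v1) ** 2

# 0.988 V @ 750 MHz (reported stable above) vs. 1.05 V @ 800 MHz (the ceiling)
ratio = relative_power(750, 0.988, 800, 1.05)
print(f"{ratio:.2f}x the power for a ~7% clock gain")  # -> 1.20x ...
```

Under this model, roughly 20% more power flows through the same VRMs for a ~7% clock gain, which is consistent with the thread's experience of cards failing near the voltage ceiling.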


----------



## mywifeispist

Quote:


> Originally Posted by *Masked*
> 
> I just got off the phone w/my buddy at Nvidia and considering an entire driver-recreation would be necessary for this to happen and a custom SLI-Link...I'm calling 100% shens...Total BS.
> I want pictures...You're a new member, I've tried this at Alienware and failed...Prove to me it's possible.
> Not through DirectX because even a noob can jack their scores -- I want picture perfect proof that the SLI is enabled VIA Bios/Driver and proof of the SLI working in co-op.
> Keep in mind, for this to work -- You'd need an entirely custom DX as well since tri-sli 590's isn't supported at the core of the programming.


This should show something: four screenshots and four runs total, with Vantage and 3DMark 11. 3-way and 4-way runs in both, but I had to clock down to 650/1300/1800 for the quad run in 3DMark 11 today. It ran the other day at 700, so I will have to investigate why not today; the 3DMark 11 run is slightly less than 16k in quad. (Screenshots attached.)

3-way SLI was done by dedicating 2-a in the control panel to PhysX.

vantage both runs

vantage runs.png 340k .png file


3dmark 11 both runs but quad at 650/1300/1800, 3 way at 700/1400/1800

3dmark11both runs.png 341k .png file


screenshot nvidia driver 3-way sli.png 504k .png file


screenshot quad sli.png 1536k .png file


----------



## Masked

Quote:


> Originally Posted by *mywifeispist*
> 
> This should show something, two screen shots and four runs total with vantage and 3dmark 11. 3way and 4way runs in both, but had to clock down to 650/1300/1800 for the quad run in 3dmark 11 today, it ran the other day at 700 so i will have to investigate why not today so the 3dmark 11 run is slightly less then 16k in quad. (screen shots attached)
> 3way sli was done by dedicating 2-a in control panel to do physics
> vantage both runs
> http://3dmark.com/compare/3dmv/4027517/3dmv/4027499
> 3dmark 11 both runs but quad at 650/1300/1800, 3 way at 700/1400/1800
> http://3dmark.com/compare/3dm11/3207711/3dm11/3207598
> 
> screenshot nvidia driver 3-way sli.png 504k .png file
> 
> 
> screenshot quad sli.png 1536k .png file


That's NOT tri-SLI.

Tri-SLI is 3 cards in a row -- Thus, Tri-SLI.

On a dual-PCB the SLI is understood.

So you have SLI 590's ~ Period.

Re-read what you've said before and now understand, why everyone in here is/was confused...

And FYI, just because 1 core is dedicated to Phys-x, it still operates under the assumption that SLI is enabled, it just also handles computing tasks...So you're always active with 4 cores, period ~ Regardless of what you enable.

The only way to CHANGE that would be at the Bios level with a custom control panel...So, you have QUAD SLI with 1 core dedicated to Physx. Nvidia control panel works by card...Not by core, even though it may appear to do so.

I also, really don't care about screenshots ~ I've edited so many things, doctoring a DX result is literally child's play.


----------



## mywifeispist

Don't know what's so confusing here; not trying to argue at all, really. All I'm saying is that all the benchmarks see 3 GPUs, and when I don't dedicate one for physics it enables all 4 for quad, and the scores are different... sounds simple to me.

All I said in the beginning was that I run in 3-way mode with the single 1080p plasma, and when I ran 3x 27-inch monitors I enabled all 4 for 5760x1080. Using all 4 GPUs with one 1080p screen bottlenecks, so I simply use one for physics; it says 3-way right in the driver, the bottleneck is gone, and I get super smooth gameplay with high frames. Don't know what is so confusing here.


----------



## mywifeispist

Quote:


> Originally Posted by *Masked*
> 
> That's NOT tri-SLI.
> Tri-SLI is 3 cards in a row -- Thus, Tri-SLI.
> On a dual-PCB the SLI is understood.
> So you have SLI 590's ~ Period.
> Re-read what you've said before and now understand, why everyone in here is/was confused...
> And FYI, just because 1 core is dedicated to Phys-x, it still operates under the assumption that SLI is enabled, it just also handles computing tasks...So you're always active with 4 cores, period ~ Regardless of what you enable.
> The only way to CHANGE that would be at the Bios level with a custom control panel...So, you have QUAD SLI with 1 core dedicated to Physx. Nvidia control panel works by card...Not by core, even though it may appear to do so.
> I also, really don't care about screenshots ~ I've edited so many things, doctoring a DX result is literally child's play.


Why would I edit it? ...It doesn't make sense to do so.


----------



## Masked

Quote:


> Originally Posted by *mywifeispist*
> 
> dont know whats so confusing here, not trying to argue at all really. all im saying here is all the benchmarks see 3gpu's and when i dont dedicate one for physics it enables all 4 for quad, and the scores are different ..... sounds simple to me.
> all i said in the beginning was i run in 3 way mode with the single 1080p plasma, and when i ran 3x 27 inch monitors i enabled all 4 for 5760x1080 res., ....... using all 4 gpu's with one 1080p screen bottlenecks so i simply use one for physics and it says 3 way right in the driver and bottleneck is gone and get super smooth game play with high frames, dont know what is so confusing here.


Quote:


> Originally Posted by *mywifeispist*
> 
> why would i edit it...... doesnt make sense to do so


GPU stands for Graphics Processing Unit.

*You don't have 4 GPU's, you have 2.*

When you say you have tri-sli GPUs, it means, in THIS WORLD, that you have 3 physical GTX 590's in Tri-Sli which, you in fact do not.

You have 2 gpu's in SLI.

...I'm not about to go through a year of bugs/issues with you...It's not worth my time.

The simple fact is that if you have SLI enabled, all 4 cores are in SLI congruence, period...Even if 1 is dedicated for Phys-x ~ Your scores are different because the CPU load is off and it enables graphical processing from the CPU core + Since the 590 is actually moderately optimized, it boosts your score...You're still in SLI on all 4 cores.

Again, it's confusing because you're using the wrong terminology...And you're coming to a thread where everyone has been using the correct terminology for over a year...Confusing? Absolutely.

At work, I had the 590's in SLI on 3x40's and they performed, wonderfully...At home, I eventually gave up with SLI 590's and went with 1x590 and 1x580 ~ Was more than fine.


----------



## mywifeispist

Quote:


> Originally Posted by *Masked*
> 
> GPU stands for Graphics Processing Unit.
> *You don't have 4 GPU's, you have 2.*
> When you say you have tri-sli GPUs, it means, in THIS WORLD, that you have 3 physical GTX 590's in Tri-Sli which, you in fact do not.
> You have 2 gpu's in SLI.
> ...I'm not about to go through a year of bugs/issues with you...It's not worth my time.
> The simple fact is that if you have SLI enabled, all 4 cores are in SLI congruence, period...Even if 1 is dedicated for Phys-x ~ Your scores are different because the CPU load is off and it enables graphical processing from the CPU core + Since the 590 is actually moderately optimized, it boosts your score...You're still in SLI on all 4 cores.
> Again, it's confusing because you're using the wrong terminology...And you're coming to a thread where everyone has been using the correct terminology for over a year...Confusing? Absolutely.
> At work, I had the 590's in SLI on 3x40's and they performed, wonderfully...At home, I eventually gave up with SLI 590's and went with 1x590 and 1x580 ~ Was more than fine.


Actually I'm not... just reading what the driver says, period. I guess NVIDIA is wrong with their terminology or wording then; I called it 3-way like they did, am I right? So how am I using the wrong terminology?
I'm not upset by any means, and I'm not the one throwing insults here either; I just responded and posted what I see, that's all.

screenshot nvidia driver 3-way sli.png 504k .png file


I think you think I'm arguing with you, and that's not the case here... this is what NVIDIA puts in their control panel, that's all: 3-way SLI.
And yes, I understand what a GPU is and understand what you mean by 590s in SLI; yes, it is two physical cards, each card being dual GPU.


----------



## Xraze

Quote:


> Originally Posted by *Masked*
> 
> GPU stands for Graphics Processing Unit.
> *You don't have 4 GPU's, you have 2.*
> When you say you have tri-sli GPUs, it means, in THIS WORLD, that you have 3 physical GTX 590's in Tri-Sli which, you in fact do not.
> You have 2 gpu's in SLI.
> ...I'm not about to go through a year of bugs/issues with you...It's not worth my time.
> The simple fact is that if you have SLI enabled, all 4 cores are in SLI congruence, period...Even if 1 is dedicated for Phys-x ~ Your scores are different because the CPU load is off and it enables graphical processing from the CPU core + Since the 590 is actually moderately optimized, it boosts your score...You're still in SLI on all 4 cores.
> Again, it's confusing because you're using the wrong terminology...And you're coming to a thread where everyone has been using the correct terminology for over a year...Confusing? Absolutely.
> At work, I had the 590's in SLI on 3x40's and they performed, wonderfully...At home, I eventually gave up with SLI 590's and went with 1x590 and 1x580 ~ Was more than fine.


1 590 + 1 580? Do you mean the 580 for physics?


----------



## Masked

Quote:


> Originally Posted by *mywifeispist*
> 
> actually im not..... just reading what the driver says period........ i guess nvidia is wrong with there terminology or wording then, i called it 3 way like they did, am i right? so how am i using wrong terminology ?
> im not upset by any means and im not the one throwing insults here either, just responded and posted what i see thats all.
> 
> screenshot nvidia driver 3-way sli.png 504k .png file
> 
> i think you think im arguing with you and its not the case here....... this is what nvida puts in there control panel thats all....3way sli
> and yes i understand what a gpu is and understand what you mean by 590's in sli, yes it is two physical cards each card being dual gpu


Yeah -- That's something new and I'm not trying to insult you, I come off as being...Abrasive, I'm really not...In fact, my only objection thus far was/is the voltages you claim to differ on water -- That's a falsity, 100% ~ Ohm's law proves this is absolutely false so, your maximum safe voltage is 1.05v period.

It's interesting they changed core to GPU ~ I wonder how long that's been around -- Anyone notice that pre-680? ~ I know it wasn't there before but, every release they update the splash so, that may be the new "terminology" etc.

I'm also kind of miffed they now allow you to tri-sli cores on the GUI?

So, I formally withdraw, actually...

Someone sell me a pair of 590's, damn-it.

@ Xraze

No.

Windows 7, in fact every windows since XP has allowed for dual driver profiles...So, I had a 590 operating my middle monitor with both cores SLI'd and the 580 was operating the 2 wide monitors, solo.

Word on the street is that the 680's will be SLI-able with the 690 ~ Something we all look forward to.


----------



## Xraze

Yeah, I'm also pretty sure 7970 can be Crossfired with 7990, correct me if I'm wrong.


----------



## Smo

Quote:


> Originally Posted by *Masked*
> 
> Word on the street is that the 680's will be SLI-able with the 690 ~ Something we all look forward to.


Well now, that is interesting!

Quote:


> Originally Posted by *Xraze*
> 
> Yeah, I'm also pretty sure 7970 can be Crossfired with 7990, correct me if I'm wrong.


Yes you can, it's called CrossfireX. Two cards that are the same is Crossfire; when one is different, it's CrossfireX.


----------



## Masked

Quote:


> Originally Posted by *Xraze*
> 
> Yeah, I'm also pretty sure 7970 can be Crossfired with 7990, correct me if I'm wrong.


I can't speak about the 7990's so, I apologize but, in terms of the 6 series, yes.

In fact, the 6990 is a beast, so it actually performed almost equally to the 6970s.

You still suffered @ a 5-10% total loss in performance vs. 3x6970's but, for 2 slots, it's not a major loss.


----------



## emett

How can you run any system with a 7990 when it's not even been released?
Nubs..


----------



## Smo

Quote:


> Originally Posted by *emett*
> 
> How can you run any system with a 7990 when its not even been released?
> Nubs..


Wut?


----------



## mywifeispist

Quote:


> Originally Posted by *Masked*
> 
> Yeah -- That's something new and I'm not trying to insult you, I come off as being...Abrasive, I'm really not...In fact, my only objection thus far was/is the voltages you claim to differ on water -- That's a falsity, 100% ~ Ohm's law proves this is absolutely false so, your maximum safe voltage is 1.05v period.
> It's interesting they changed core to GPU ~ I wonder how long that's been around -- Anyone notice that pre-680? ~ I know it wasn't there before but, every release they update the splash so, that may be the new "terminology" etc.
> I'm also kind of miffed they now allow you to tri-sli cores on the GUI?
> So, I formally withdraw, actually...
> Someone sell me a pair of 590's, damn-it.
> @ Xraze
> No.
> Windows 7, in fact every windows since XP has allowed for dual driver profiles...So, I had a 590 operating my middle monitor with both cores SLI'd and the 580 was operating the 2 wide monitors, solo.
> Word on the street is that the 680's will be SLI-able with the 690 ~ Something we all look forward to.


I totally understand what you mean by voltage... voltage is voltage no matter what; if the VRMs can handle it, that's one thing, but if they can't, they can't. I water cool as much as I can on everything, as I don't like heat, and I believe it does help a lot with stressing things. One thing I have noticed with these 590s is that they vary more than every other card I have seen in a while; a lot of people report more differences with these cards from one to the other than with most.

If they blow, they blow; then I can say I had fun... lol. As for the maximum safe voltage of 1.05 V, I agree with you 100% on that, and my cards have never seen that voltage either. I used 1.02 V but never actually saw that; I saw 0.988 V at 750 core, and I have never attempted to take the cards past 750 either, as the chances of them going up in smoke are way more likely at those clocks. Like I give my CPU 1.35 V on the core at 4.626 GHz and it only uses 1.32 under load, but these things vary per chip, and the same goes for most things as well.

As far as the future 690 goes, I won't give up these 590s for one unless I see a major, major improvement over the 590. I know 590s have had a few flaws here and there, but they are great cards in the end, and I hold NVIDIA drivers mainly responsible for their flaws in the past of limiting... As for the VRMs, that's another story; people are just going to have to be careful, that's all you can do. NVIDIA gave warning on that already, and now that the voltage cap is gone, people should just be careful with it.

Also, I didn't take your tone real personal either; that's why I was telling you I wasn't mad about it. I was just trying to make you aware of what I was saying; sometimes it's easier for people to show what they mean than to say what they mean. These forums are for everyone to help each other in some way, and that's the way I look at it and took it... so at least we have become familiar with each other's views, and it will help all of us in the end. You gotta admit it was a good clean conversation back and forth. *Masked,* I have seen other posts by you and see you help people here and there by offering a hand with things, and again that's what this is about. I plan on being active here at this site, so consider us friends.


----------



## mywifeispist

Quote:


> Originally Posted by *Xraze*
> 
> Masked, if my 590 was on water... what voltage would you recommend? Something that could push some good Mhz but not blow it up preferably.


These are the settings I settled on in NiBiTor (if it helps you):
Default clocks are now 650/1300/1800.
Voltages (1.005 max setting at the top), then per level:
- 0.875 / 0.875
- 0.925 / 0.950
- 0.988 / 0.988
Here they are if you wanna copy them over to yours. I can run the cards to 750 core easy, but I won't go past that even though it will do it. My cards are ASUS.

590ocA-B.zip 92k .zip file
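Before flashing a table like the one above, it can be worth sanity-checking it against the 1.05 V VRM ceiling discussed repeatedly in this thread. A throwaway sketch (the level names and function are illustrative, not from NiBiTor):

```python
# Sketch: validate a candidate GTX 590 BIOS voltage table against the
# 1.05 V ceiling reported in this thread. Level names are illustrative.
MAX_SAFE_V = 1.05  # per the thread: VRMs reportedly fail above this

def check_voltage_table(levels):
    """levels: dict of level name -> (volts_a, volts_b).
    Returns a list of warnings; an empty list means the table looks sane."""
    warnings = []
    for name, (va, vb) in levels.items():
        for core, v in (("A", va), ("B", vb)):
            if v > MAX_SAFE_V:
                warnings.append(f"{name} core {core}: {v} V exceeds {MAX_SAFE_V} V")
    return warnings

# The values posted above (650/1300/1800 defaults)
table = {
    "idle":   (0.875, 0.875),
    "low3d":  (0.925, 0.950),
    "full3d": (0.988, 0.988),
}
print(check_voltage_table(table))  # -> [] (every level is under the ceiling)
```

A table like the one posted passes cleanly; something like 1.2 V, as in the Sweclockers incident mentioned earlier, would be flagged immediately.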


----------



## Recipe7

Good information the past day. Healthy arguments are always good.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *mywifeispist*
> 
> i reflashed the cards, reinstalled the drivers and precisionX and everything is good now


Which Bios did you reflash to? Same ones?

I'm running bios 70.10.37.00.92, fwiw


----------



## IronMaiden1973

Quote:


> Originally Posted by *Masked*
> 
> Yeah -- That's something new and I'm not trying to insult you, I come off as being...Abrasive, I'm really not...In fact, my only objection thus far was/is the voltages you claim to differ on water -- That's a falsity, 100% ~ Ohm's law proves this is absolutely false so, your maximum safe voltage is 1.05v period.
> It's interesting they changed core to GPU ~ I wonder how long that's been around -- Anyone notice that pre-680? ~ I know it wasn't there before but, every release they update the splash so, that may be the new "terminology" etc.
> I'm also kind of miffed they now allow you to tri-sli cores on the GUI?
> So, I formally withdraw, actually...
> Someone sell me a pair of 590's, damn-it.
> @ Xraze
> No.
> Windows 7, in fact every windows since XP has allowed for dual driver profiles...So, I had a 590 operating my middle monitor with both cores SLI'd and the 580 was operating the 2 wide monitors, solo.
> Word on the street is that the 680's will be SLI-able with the 690 ~ Something we all look forward to.


Mate, people like you piss me off. I signed up to this forum just to reply to all the **** you post. Yesterday you were asking mywifeispist for proof of what he was claiming and calling it all bull****; today you are "formally withdrawing"??? Is this your way of an apology? People like you are what give forums around the world a bad name. I guess with 2500+ posts you think yourself an Alpha in these parts. You're nothing but a fat **** without a girlfriend, and I bet I'm right. Apologize to the man properly and then pull your head in.


----------



## Masked

Quote:


> Originally Posted by *IronMaiden1973*
> 
> Mate, people like you piss me off. I signed up to this forum just to reply to all the **** you post.. Yesterday you were asking mywifeispist for proof as to what he was claiming and calling it all bull****.. Today you are " formally withdrawing"??? Is this your way of an apology? People like you are what give forums around the world a a bad name.. I guess with 2500+ posts you think yourself and Alpha to these parts.. You're nothing but a fat **** without a girlfriend.. And I bet I'm right.. Apologize to the man properly and then pull your head in ..


Haven't laughed this hard in a long time, thanks for the nerd rage














And the fact that you became a member here JUST to rage, I'm so flattered, it's insane...Wow









The recent updates/replies in this thread greatly rely on something you don't have, context.

Quad-SLI in the computer world means you have 4 physical GPUs linked together...and with the GTX 590, that's impossible...So of course there were immediate issues with what he said ~ You should read the flood of PMs I received asking if this was/is possible so, again, context ~ You have none.

So, first, subtract from this equation, NVIDIA's new GUI naming scheme. Prior to the release of the 680s, the terminology was that your CORES were SLI'd because, in fact your cores were and not the GPU's themselves...This causes extreme confusion. Again, context ~ You have none.

Also, prior to recently, you couldn't tri-sli with 1 core dedicated on Physx and there's a year of proof, vouching for that, in this thread. Again, context ~ You have none.

So him coming in, immediately post that update and posting what he said, goes against everything every single member here, has learned, for over a year. Again, context ~ You have none.

I wasn't being mean to him, we had a discussion and in fact, he was 100% incorrect on a scientific law...Stating it as fact ~ Are we supposed to let such misinformation thrive on a public forum? Absolutely not...But, without context, you're just as ignorant as the next laymen so, I'll give you a break.

Ohm's law exists for a reason, it's a LAW...And at no temperature in our operating world does the voltage fluctuate enough to allow a user to run a higher voltage and not blow a cap that's limited to 1.05v...I'm sorry but, that's been proven by other users, myself and Sweclockers...The VRM's are voltage limited.

Even Kingpin, the world's greatest overclocker, had to use an EVBot plus the substitute voltage VRMs to acquire the OC load that he did...Why? The VRMs cannot handle over 1.05v at //ANY// feasible operating temperature.

If you think I'm apologizing for that, I'd suggest you keep nerd raging because, again, I'm so flattered it's incredible.

I withdrew my protest against his Tri-SLI claim because, Nvidia, very recently, has changed their stance on the card and finally released these "updates"...Which, is great but, as I said, not everyone here, that has a 590, knew about those changes...There were no emails...No updates in the WHQL notes...So, what did you expect?

We were having an argument but, one that was adult-like, no insults were thrown and while I may come off as being arrogant, that's never going to change...I'll always come off as being such but, I'm not and in the past week, I've learned a lot.

And just for future reference...I weigh a solid 250lbs, I hit the gym 5x a week, I'm a former VPOP/Admin of Alienware and I now own/operate my own computer company out of Danbury, Connecticut...I'm available to meet you personally -- Meeting my haters in person does bring me such joy.

So please, send me a PM, I'd love to meet you in person and prove that I am who I say I am, 100% on the forums just as much as off...


----------



## Smo

Quote:


> Originally Posted by *IronMaiden1973*
> 
> Mate, people like you piss me off. I signed up to this forum just to reply to all the **** you post.. Yesterday you were asking mywifeispist for proof as to what he was claiming and calling it all bull****.. Today you are " formally withdrawing"??? Is this your way of an apology? People like you are what give forums around the world a a bad name.. I guess with 2500+ posts you think yourself and Alpha to these parts.. You're nothing but a fat **** without a girlfriend.. And I bet I'm right.. Apologize to the man properly and then pull your head in ..


Are you serious? Masked has been a huge help to many forum members and a great bloke.


----------



## mywifeispist

Quote:


> Originally Posted by *IronMaiden1973*
> 
> Mate, people like you piss me off. I signed up to this forum just to reply to all the **** you post.. Yesterday you were asking mywifeispist for proof as to what he was claiming and calling it all bull****.. Today you are "formally withdrawing"??? Is this your way of an apology? People like you are what give forums around the world a bad name.. I guess with 2500+ posts you think yourself an Alpha in these parts.. You're nothing but a fat **** without a girlfriend.. And I bet I'm right.. Apologize to the man properly and then pull your head in ..


We are all good, I think everything got worked out in my earlier post and his (Masked's)..... we are fine, just an innocent debate. Let's just not get this thread out of hand and keep it going in the right direction.


----------



## Shinobi Jedi

I flashed my BIOS again, and still no change. When I load up the Nvidia BIOS editor, nothing registers at all and I have to load a BIOS file from my Documents folder to make changes. Some site online said that the editor would recognize the BIOS on the card? But that doesn't make any sense. Is it that I need to reflash the card with the BIOS file in the Documents folder that has been changed with NiBiTor?

And I'd still like to know if this is even worth doing if I don't plan to do any overclocking and run things at stock?

Thanks!


----------



## mywifeispist

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> I flashed my Bios again, and still no change. When I load up the Nvidia Bios editor, nothing registers at all and I have to load a bios file from my documents tab to make changes. Some site online said then the editor would recognize the bios on the card? But that doesn't make any sense. Is it that I need to reflash the card to the bios file in the documents tab that has been changed with Nibitor?
> And I'd still like to know if this is even worth doing if I don't plan to do any overclocking and run things at stock?
> Thanks!


No overclock? Then the safer bet is to leave it alone if it's working fine.
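If you do decide to go ahead anyway, the usual NiBiTor round-trip looks roughly like the sketch below. This is a dry run only: the filenames are made up, nvflash switch names vary by version (check `nvflash --help` on yours), and the 590's two GPUs each need their own adapter index.

```shell
DRY_RUN=echo   # print the commands only; delete the $DRY_RUN prefixes to run for real

# 1. Back up the BIOS that is actually on the card (GPU 0 of the 590):
$DRY_RUN nvflash --index=0 --save stock_gpu0.rom

# 2. Open stock_gpu0.rom in NiBiTor, make your changes, save as modded_gpu0.rom.
#    NiBiTor edits the file on disk -- it does not read the card directly,
#    which is why the editor shows nothing until you load a saved ROM.

# 3. Flash the edited file back to the same GPU, then reboot:
$DRY_RUN nvflash --index=0 modded_gpu0.rom
```

That middle step is the answer to the "editor doesn't recognize the card" confusion: the edit happens in the file, and the card only changes when you flash that file back.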


----------



## Xraze

Quote:


> Originally Posted by *Shinobi Jedi*
> 
> I flashed my Bios again, and still no change. When I load up the Nvidia Bios editor, nothing registers at all and I have to load a bios file from my documents tab to make changes. Some site online said then the editor would recognize the bios on the card? But that doesn't make any sense. Is it that I need to reflash the card to the bios file in the documents tab that has been changed with Nibitor?
> And I'd still like to know if this is even worth doing if I don't plan to do any overclocking and run things at stock?
> Thanks!


If you're not gonna overclock just use the stock BIOS. It's the safest and easiest.


----------



## emett

He only had to ask 3 times


----------



## iARDAs

Quote:


> Originally Posted by *Masked*
> 
> I, now, have 3 680's and I'll be the first to tell you...They're not really worth the hype.
> It is true that this is their "mid-range" so, would I sacrifice a 590 for it? No.
> It's not mid-range in the sense of it's scores, obviously but, it is mid-range in terms of the hardware on the card.
> Quite personally, I'd wait -- Sept/Oct was the date I received for round 2 and that's when the "real" 680 will be released...Get it then but, not now.


Sorry for the late response. I was very busy in the last few days and had a lot going on.

I will definitely listen to your advice. To be honest, I look at guru3d.com daily and see their reviews on the new 680s, including the OC versions, and pretty much none of them beats the 590. An OC'd 680 performs very similarly to the 590 in some games, or the 590 still outperforms that card considerably. That's why I am definitely NOT going to buy these 680s but will probably hunt for a 690.


----------



## iARDAs

Quote:


> Originally Posted by *IronMaiden1973*
> 
> Mate, people like you piss me off. I signed up to this forum just to reply to all the **** you post.. Yesterday you were asking mywifeispist for proof as to what he was claiming and calling it all bull****.. Today you are "formally withdrawing"??? Is this your way of an apology? People like you are what give forums around the world a bad name.. I guess with 2500+ posts you think yourself an Alpha in these parts.. You're nothing but a fat **** without a girlfriend.. And I bet I'm right.. Apologize to the man properly and then pull your head in ..


Wow, amazing post. This dude deserves at least 10 reps from everyone...

I have a feeling about who you are, anyhow.


----------



## emett

I hope you are joking, iARDAs!

Masked, what can you tell us about that nVidia teaser pic? Is it gonna be the GTX 690 or just a GTX 680M?

Super pumped if it is the 690..


----------



## iARDAs

Quote:


> Originally Posted by *emett*
> 
> I hope you are joking iArdAs!
> Masked what can you tell us about that nVidia teaser pic? Is it gonna be the gtx690 or just a gtx680m?
> Super pumped if it is the 690..


About Iron Maiden's post?

Of course I am 

There seems to be an attack on the 590 by some individuals on this forum. I have seen 4-5 different users with 1 post each in various 680 threads bashing the 590.


----------



## iARDAs

Guys, how does our GPU perform at resolutions over 1080p?

Especially in BF3?

Also, is anyone using a 1440p monitor? How is it with our 590 in general? And is it a good upgrade over 1080p?


----------



## Evilrandy

Hi there, my name is Evilrandy and I would like to be recognized as a member of the 590 club. I'm currently running 2 GTX 590 EVGA Classifieds in Quad SLI on an Asus P8Z68 Deluxe Gen 3 board with an i7-2700K, H100 cooler, 16GB of G.Skill RAM and 2 120GB Corsair Force GT SSDs in RAID 0. I also have an MSI P67-GD65 (B3) board with the same components. Both are in Cooler Master HAF X cases, and both use an i7-2700K. I do not overclock but am interested in trying it out. Great site, by the way


----------



## Evilrandy

Here is a picture of the 4 590s on my kitchen counter the day they arrived................


----------



## mironccr345

Can I have one?


----------



## mywifeispist

Quote:


> Originally Posted by *Evilrandy*
> 
> Hi there, My name is Evilrandy and would like to be recognized as a member of the 590 club. Im currently running 2 GTX590 EVGA Classifieds in Quad SLI on a Asus P8Z68 Deluxe Gen 3 board with i7-2700k, H100 Cooler, 16g of GSkill ram and 2 120g Corsair Force GT SSD's in Raid 0. Also I have a MSI P67-GD65 (B3) board. with the same components. Both are in Coolermaster HAF X cases. Both was and is currently utilizing a i7-2700k. I do not overclock but am interested in trying it out. Great site by the way


love my 590's


----------



## tonyjones

How come the GTX 590's market price is like 25% higher than the Radeon 6990's when they have the same MSRP? Haha, I guess supply and demand; not enough going around. I just picked up a 6990 pretty much new for $400 shipped; I was trying to find a 590 but wasn't going to shell out $600.


----------



## Masked

Quote:


> Originally Posted by *tonyjones*
> 
> How come the GTX 590 market price is like 25% higher than Radeon 6990 but they MSRP the same price, haha I guess supply and demand, not enough going around. I just picked up a 6990 pretty much new for $400 shipped, was trying to find a 590 but wasn't going to shell out $600.


The 6990 was never a limited edition, and the hardware on the card itself is cheaper.

I'm not about to get into a VRM debate with anyone...Again...But, the actual quality of the hardware on your PCB is a significant step up from AMD's offerings.

That being said, I am not going the route of the 690...I can't say more, just stating my opinion.

The whole 590 situation, and waiting until the 680 launch to release MASSIVE updates -- Just leaves a very bad taste in my mouth...

I'm personally waiting for the 680, round 2 ~ Even though my current 2 are beasts.


----------



## tonyjones

Gotcha. Yeah, I'm just excited to see how much of a performance gain the 690 will be over the 590. If so, I'll just order one, and if I don't want it I can always resell it, ha


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> 6990 was never a limited edition and the hardware on the card, itself, is cheaper.
> I'm not about to get into a VRM debate with anyone...Again...But, the actual quality of the hardware on your PCB is quite significant in comparison to AMD's offerings.
> That being said, I am not going the route of the 690...I can't say more, just stating my opinion.
> The whole 590 situation and waiting until the 680 to release MASSIVE updates -- Just leaves a very bad taste in my mouth...
> I'm personally waiting for 680, round 2 ~ Even though my current 2 are beasts.


It will be interesting to hear your opinion on why you are going to avoid the 690 when you can talk about it.


----------



## iARDAs

If Masked says avoid the 690, then I believe him.

So I am waiting for you, 790


----------



## mywifeispist

I will absolutely look at the 690's; it just has to be worth me giving up my 590's, and I'm not sure I want to yet............ And believe me, I will be looking hard and doing my research on them


----------



## iARDAs

Hey guys,

The 690 is announced and it seems good, but I gave up on 3D gaming and 1080p will be the only resolution I'll be gaming at. The 690 would be too much for my purposes, right? The 590 will probably cut it for the next 2 years as well. What do you guys think?


----------



## Tslm

Quote:


> Originally Posted by *Masked*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tonyjones*
> 
> How come the GTX 590 market price is like 25% higher than Radeon 6990 but they MSRP the same price, haha I guess supply and demand, not enough going around. I just picked up a 6990 pretty much new for $400 shipped, was trying to find a 590 but wasn't going to shell out $600.
> 
> 
> 
> 6990 was never a limited edition and the hardware on the card, itself, is cheaper.
> 
> I'm not about to get into a VRM debate with anyone...Again...But, the actual quality of the hardware on your PCB is quite significant in comparison to AMD's offerings.

The 6990 PCB is exceptional. You may want to educate yourself before posting, especially when making a general observation about the way both companies build PCBs. Ignoring Nvidia's habit of skimping on power delivery for a moment (the 590 wasn't the only one), AMD PCBs are always extremely solid and no worse than their Nvidia equivalents.


----------



## rush2049

Quote:


> Originally Posted by *Tslm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Masked*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tonyjones*
> 
> How come the GTX 590 market price is like 25% higher than Radeon 6990 but they MSRP the same price, haha I guess supply and demand, not enough going around. I just picked up a 6990 pretty much new for $400 shipped, was trying to find a 590 but wasn't going to shell out $600.
> 
> 
> 
> 6990 was never a limited edition and the hardware on the card, itself, is cheaper.
> 
> I'm not about to get into a VRM debate with anyone...Again...But, the actual quality of the hardware on your PCB is quite significant in comparison to AMD's offerings.
> 
> 
> The 6990 pcb is exceptional. You may want to educate yourself before posting information, especially when making a general observation about the way both companies make pcbs in general. Ignoring nvidias habit of skimping on power delivery (the 590 wasn't the only one) for a moment, AMD pcbs are always extremely solid and are no worse than nvidia equivalents.

Tslm, I think you are rather the under-educated one.

I don't have nearly the experience with this hardware that I'm sure Masked has, but the one 6990 I have seen had overheating and driver issues. The 590 does not have power-draw problems, or even weak hardware on that front. Its power delivery was designed to be less than the 580's solution (probably size/layout constraints), and it performs very well when not artificially limited by the drivers.....


----------



## Tslm

Quote:


> Originally Posted by *rush2049*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Tslm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Masked*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tonyjones*
> 
> How come the GTX 590 market price is like 25% higher than Radeon 6990 but they MSRP the same price, haha I guess supply and demand, not enough going around. I just picked up a 6990 pretty much new for $400 shipped, was trying to find a 590 but wasn't going to shell out $600.
> 
> 
> 
> 6990 was never a limited edition and the hardware on the card, itself, is cheaper.
> 
> I'm not about to get into a VRM debate with anyone...Again...But, the actual quality of the hardware on your PCB is quite significant in comparison to AMD's offerings.
> 
> 
> The 6990 pcb is exceptional. You may want to educate yourself before posting information, especially when making a general observation about the way both companies make pcbs in general. Ignoring nvidias habit of skimping on power delivery (the 590 wasn't the only one) for a moment, AMD pcbs are always extremely solid and are no worse than nvidia equivalents.
> 
> 
> Tslm, I think rather you are the under-educated one.
> 
> I don't have nearly the experience with this hardware I am sure that masked has, but the one 6990 I have seen has overheating and driver issues. The 590 does not have power draw problems, or even weak hardware on that front. It was designed to be less than the 580's solution (probably size/layout constraints) and it performs very well, when not limited by the drivers artificially.....

He was referring to the PCB, not the cooler or the software the card uses.

Anyway, it was probably a bit rash to say uneducated, but this supposed difference in PCB quality doesn't actually exist. I wasn't having a go at the 590.


----------



## mywifeispist

Quote:


> Originally Posted by *iARDAs*
> 
> Hey Guys
> 
> 690 is announced it seems good but i gave up on 3D gaming and 1080p will be the only resolution that I will be gaming. 690 would be too much for my purposes right? 590 will probably cut it for the next 2 years as well. What do you guys think?


I also gave up the multi-monitor 3D as well, but I still own two 590's, so I just run them in 3-way SLI on my 42" Panasonic plasma now and love this setup for 1080p gaming...... Yes, a little overkill now, but it is a beast, and in 3-way with one monitor it's pretty darn nice. I'm good for a while with these things.


----------



## Masked

Quote:


> Originally Posted by *Tslm*
> 
> He was referring to the pcb, not the cooler nor the software the card uses.
> Anyway it was probably a bit rash to say uneducated but this supposed difference in pcb quality doesn't actually exist. I wasn't having a go at the 590.


I was referring to the PCB, and considering I see actual cost and am beyond aware of it (something you clearly aren't), I disagree entirely with your statement.

The VRMs the 590 used are the highest quality in the industry and cut Nvidia's profit margin per card down significantly...It was the DELIVERY of the power to the "hardware" itself that was the failure; in no way/shape/form was it a "cheap hardware issue"...Do your research.

Having been with Alienware when both cards were released, I have an intimate knowledge of both and...I'm sorry but, in terms of cost, the 6990 was cheaper...

I mean, you can sit there and whine, complain, B&M but, at the end of the day, AMD had a MUCH HIGHER profit margin on the 6990 than the 590 did, and it was because NVIDIA used more expensive hardware on the cards.

Another indication of this was/is how limited the 590 actually was ~ I can't give out unit numbers but, it was limited because of the hardware on the card ~ ONLY the best was used, even for the VRMs, and nothing was gimped at all.

The 6990, on the other hand, has about 10-15x the available units...Has been cited for extreme UNDERCLOCKING because its cores can't handle the load, and the heat output is staggering...Had AMD actually used quality caps etc., the dissipation would at least be manageable.

So unfortunately, I absolutely disagree, from a vendor perspective...Even from an enthusiast's perspective, actually...That the 6990 is a higher-quality card than the 590 ~ That's just a lie.


----------



## Shinobi Jedi

I'm not going to lie, I just woke up to the 690 press release a little while ago and am drooling.

But like Masked, I am also a little torqued and disappointed with Nvidia for not showing the 590 _any_ love during the entire release cycle. That is ridiculously bad customer service on Nvidia's part, considering we're the customers that bought their most expensive and boutique product. So I think I'll be voting with my dollar and sitting this hardware round out. The problem is, Nvidia probably won't get the message or care, since the probable limited quantities of the card almost guarantee a sold-out market.

Combine that with the fact that most PC games are ports of console builds, and even the ones that aren't don't tax these cards at all (120fps at full blast on a 120Hz display is achieved in almost every game), and the decision becomes even easier.

Also, and this may be the biggest deal breaker for me: it's only 2GB of VRAM per GPU. I was hoping for/expecting at least 3GB per GPU, considering how many 3GB 580 owners seem to insist that you need more than 2GB of VRAM and often seem to have the screenshots to prove it.

Masked, hopefully you can give us more insight tech-wise as to why you're holding off when the NDA is lifted on Thursday and the reviews come out? And is there any chance of a 690 round 2 that might be worth it then?


----------



## Masked

Since I'm getting like 2-3 PMs a day about this -- I'm going to answer as much as I can.

I'm not getting the 690 because I feel burned...For me, this is personal...And it's mostly personal because of things I can't really mention here but, the 590 wasn't what was promised...To anyone.

For ME, from a business standpoint and from a vendor standpoint, the 590 was an incredible success...Stock sold out, the cores were indeed cherry-picked and every component on the card was top dollar...I mean, as a former vendor, limited editions are always the prized pony and the 590 made vendors LOTS and LOTS of money, especially considering cost...An incredible success.

From an enthusiast standpoint, they didn't deliver what they promised. There's no "well, they sort of did" for me...We were all "burned" in one way or another by this card...Also, there was no mention, anywhere, of the updates in the 680 WHQL drivers, no foresight, no maybe bla bla bla, nothing...There was no communication that anything was changing...And I'm actually higher on the totem pole than I was before...Not even a whisper.

That's great for you guys, it really is...Many of the things we've "QQ'd" about all year finally happened...It's awesome, actually...But, read my previous posts...Every single thing they've said says they support this card 100%, it's the best thing since sliced bread but, we're really the ginger stepchild...Let's be real about that.

That being said, I'm still not dismissing the 690; I'm going to stock the card, it won't be as limited (not to the extent the 590 was) and I'm going to offer it to customers...But, as a personal user, I'm on the fence...Do I spend $1,200 to watercool the card and pray all year like I did with the 590? Do I spend 3 weeks trying to write my own drivers in the hopes the card will be what it can be, again?

I had the highest overclock on water...1.01v, 850MHz ~ I remember it like it was yesterday...That, to me, is what I'll remember when I think about this card because prior to the OCP and driver lock-downs, that was its moment of glory.

You release a card, stand by it stubbornly as the community points out its flaws...They make custom drivers to bypass these flaws...They actually still support the card...And you wait a fiscal year (11.2 months) to implement everything without a word...While it took the custom-driver folks a week to write the driver and release it on Guru3D...That just burns me, incredibly.

As a single-slot solution, you still can't beat the dual-PCB cards...

In terms of the 680 = 590: with driver updates, it will happen...I can OC one 680 about 10% past my 590 in terms of overall performance...Realistically, right now they're the same price as well.

Let's be honest, though ~ This year, especially if you're a gamer -- There's nothing REAL and groundbreaking coming until Sept/Oct/Nov. ~ GW2 is an amazing game...D3 is an amazing game but, they're all DX9 engines ported over to DX11...Meaning, they're just not that taxing...So with your single 590, you're still in the game, performance is still there, you're missing NOTHING by not upgrading this round.

I don't want to hold any of you back from upgrading...The 690 will be a good card...I personally just don't have faith, as a user, that they're going to follow through...Maybe I'll do the 790 eventually but, for now, tri-SLI 680's is my happy place.


----------



## iARDAs

Masked, thank you for a well-explained argument about whether to upgrade to the 690 or not.

I will probably pass on it as well, as my 590 is handling games in 2D perfectly well.

I will be traveling to the USA in June and I MIGHT sell my 590 for around 600 US dollars here and grab a 690 there, but I doubt I will do that. It all depends on the wife's shopping over there 

Btw, my card is hitting 89 degrees on one core because the weather is getting hot here in Turkey. My custom fan profile is at 85% when this happens.

Should I be worried? Or is 89 degrees still acceptable for this card on stock air cooling?


----------



## rush2049

I would say up to 90 is OK..... Technically they can go to 100 or so without problems.... but stability starts getting flaky around 85+.

Just use MSI Afterburner and set a slightly more aggressive fan profile curve.....
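For what it's worth, an Afterburner-style custom curve is just a piecewise-linear map from core temperature to fan duty. A minimal sketch of a "more aggressive" profile (the curve points here are made-up illustrations, not stock values):

```python
# Illustrative aggressive fan curve: (deg C, fan %) points, linearly
# interpolated between them, like an MSI Afterburner custom profile.
CURVE = [(40, 40), (60, 55), (75, 70), (85, 85), (90, 100)]

def fan_speed(temp_c, curve=CURVE):
    """Return the fan duty (%) for a given core temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # floor speed below the first point
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # pinned at 100% past the last point
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:      # linear interpolation on this segment
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(89))  # 97.0 -- already ramping hard just below 90C
```

The point of the steep last segment is exactly the advice above: by the time the core is near 89C, the fan should already be close to 100%.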


----------



## iARDAs

Quote:


> Originally Posted by *rush2049*
> 
> I would say up to 90 is ok..... technically they can go to 100 or so without problems.... but stability starts getting flaky around 85+
> Just use MSI Afterburner and set a slightly more aggressive fan profile curve.....


Actually, I turned on lots of features in the Nvidia Control Panel and this might be the reason for the higher temps. Also, the latest 3xx drivers are running hotter for me. It could very well be driver-related too,

but I will set a more aggressive fan profile right now.

Last but not least, I have 1 140mm fan on the side panel. I can have up to 4 there. I am thinking of moving the 1 I have to another spot to see if it helps.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> Since I'm getting like 2-3 PM's/Day about this -- I'm going to answer as much as I can.
> I'm not getting the 690 because I feel, burned...For me, this is personal...And it's mostly personal because of things I can't really mention here but, the 590 wasn't what was promised...To anyone.
> For ME, from a business standpoint and from a vendor standpoint the 590 was an incredible success...Stock sold out, the cores were indeed cherry picked and every component on the card was top dollar...I mean, as a former vendor, limited editions are always the prized pony and the 590 made vendors LOTS and LOTS of money, especially considering cost...Incredible success.
> From an enthusiast standpoint, they didn't deliver what they promised. There's no well; they sort of did for me...We were all "burned" in one way or another by this card...Also, there was no mention, anywhere of the updates for the 680 WHQL drivers, no foresight, no maybe bla bla bla, nothing...There was no communication that anything was changing...And I'm actually higher on the totem pole than I was before...Not even a whisper.
> That's great for you guys, it really is...Many of the things we've "QQ'd" about all year finally happen...It's awesome, actually...But, read my previous posts...Every single thing they've said says they support this card 100%, it's the best thing since sliced bread but, we're really the ginger step child...Let's be real about that.
> That being said, I'm still not dismissing the 690, I'm going to stock the card, it won't be as limited (not to the extent the 590 was) and I'm going to offer it to customers...But, as a personal user, I'm on the fence...Do I spend 1200$ to watercool the card and pray all year like I did with the 590? Do I spend 3 weeks trying to write my own drivers in the hopes the card will be what it can be, again?
> I had the highest overclock on water...1.01v, 850mhz ~ I remember it like it was yesterday...That to me, is what I'll remember when I think about this card because prior to the OCP and driver lock-downs, that was it's moment of glory.
> You release a card, stand by it stubbornly as the community points out it's flaws...They make custom drivers to bypass these flaws...They actually still support the card...And you wait a fiscal year (11.2 months) to implement everything without a word...While it took the custom driver folks, a week to write the driver and release it on Guru...That just burns me, incredibly.
> As a single slot solution, you still can't be the dual PCB cards...
> In terms of the 680 = 590, with driver updates, it will happen...I can OC 1 680 about 10% past my 590 in terms of overall performance...Realistically right now, they're the same price, as well.
> Let's be honest, though ~ This year, especially if you're a gamer -- There's nothing REAL and groundbreaking coming until Sept/Oct/Nov. ~ GW2 is an amazing game...D3 is an amazing game but, they're all based on the DX9 ported over to a DX11 engine...Meaning, they're just not that taxing...So with your single 590, you're still in the game, performance is still there, you're missing NOTHING by not upgrading this round.
> I don't want to hold any of you back from upgrading...The 690 will be a good card...I personally, just don't have faith, as a user, that they're going to follow through...Maybe I'll do the 790, eventually but, for now, tri-sli 680's is my happy place.


Masked,

Are those drivers you referred to on Guru3D worth downloading? I can't find them. I won't bother if they're not worth it.

As for the 690's, the only thing that has me potentially wanting them over my 590's is the extra VRAM, in case I wanted to get into 3D Surround gaming at 2560, or 3D stereo gaming on a surround setup. But the 690 only has 2GB of VRAM per GPU vs. the 590's 1.5GB per GPU.

Is that extra half a gig per GPU worth it, performance-wise or cost-wise? Maybe I'm trying to justify not upgrading, but I'm leaning towards no, especially if you're on a single monitor. I bought a second 590 and went SLI so I could play BF3 in 3D at full blast and stay at a constant 60fps, and it essentially does that just fine. TBH, I haven't played the game in months due to getting sucked into SWTOR, but every game I've run looking to play at 120fps on a 120Hz display has been meeting or exceeding that no problem. Save Metro 2033 in some spots, but it averages 100fps. So in terms of clock speeds, I'm not feeling the need to upgrade. Heck, I was getting in excess of 500fps in ME3.

So my concern is whether 2GB of VRAM per GPU on the 690 might still be short for 3-monitor gaming at 2560x1600 at full blast, and thus a pointless upgrade. Or will it be enough, and worth considering? Thoughts on this?

Is the upgrade from the 590 to a 690 worth it for the VRAM difference? Will that difference be enough to achieve 120fps on a 3D surround setup at high resolutions?
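As a rough sanity check on that, the display surfaces alone can be penciled out. This is back-of-envelope arithmetic with assumed buffer counts; textures and render targets (not counted) usually dominate, so it only gives a floor:

```python
# Back-of-envelope VRAM floor for 3x 2560x1600 surround. AFR SLI keeps a
# full copy of everything per GPU, so this counts against each GPU's 2GB.
def surface_mb(width, height, bytes_per_pixel=4):
    """One 32-bit screen-sized buffer, in MiB."""
    return width * height * bytes_per_pixel / 2**20

w, h = 2560 * 3, 1600              # 7680 x 1600 surround desktop
color = surface_mb(w, h)           # ~46.9 MiB per surface
msaa = 4                           # assume 4x MSAA samples

# MSAA color + MSAA depth, plus resolved front/back buffers:
total = color * msaa + color * msaa + color * 2
print(f"{color:.1f} MiB per surface, ~{total:.0f} MiB in framebuffers")
```

On that napkin math, roughly half a gig of each GPU's 2GB goes to display surfaces at 4x MSAA before a single texture is loaded, which is why the question isn't crazy.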

Or wait for the round 2 on the 680's?

Masked, can you comment on whether or not there might be a round 2 on the 690's?

For myself, I admit, I'm just being a Hardware junkie and caught up in the cycle. As it seems pretty pointless to upgrade if I'm staying on one monitor and not going to a 3 monitor surround setup. Which as far as I know, I'm not. Seeing as I don't have the room for it, and have yet to come up with a MacGyver solution.

I still wouldn't mind some clarification on this if anyone has any. I trust reviews on Thursday will also shed some more light.

I also like the LED's on the 590 better than the 690's, which is another reason why I'm probably going to stay where I am.


----------



## emett

Quote:


> Originally Posted by *Masked*
> 
> Since I'm getting like 2-3 PM's/Day about this -- I'm going to answer as much as I can.
> I'm not getting the 690 because I feel, burned...For me, this is personal...And it's mostly personal because of things I can't really mention here but, .


I had a feeling this was the case, thanks for clearing it up.


----------



## iARDAs

@ Shinobi Jedi

We are in the same boat. The only reason to upgrade to a 690 would be for 3D purposes. I LOVE 3D, but darn, the technology has so many issues. There are only a few titles out there that have 0 issues in 3D; most games have issues that bother you so much in 3D, and that's why I am giving it up for the moment. So a 690 for a single monitor might be overkill. I can't get a constant 120fps in BF3 with a 590 with everything on ULTRA, but it's OK. I get around 90, so I can still take advantage of my 120Hz screen.

I am skipping the 6xx series, and this seems to be my final answer. Masked saw this coming and told a few of us to hang on to our 590s, which I will be doing.


----------



## Xraze

I too want to know about that Guru3D driver, Masked.


----------



## Masked

I've edited my last post because conflicting information is coming down the pipeline...So, at this point, who knows (in regard to my edit).

I actually can't find it anymore :/ ~ I shot off an email but I don't have an answer yet.

I can't answer any performance-based questions about the card...


----------



## Recipe7

@ Masked.

The 590 was a big letdown. However, it was priced at 700 USD. That was essentially the price of 570 SLI, which is exactly how it performed.

The 690 is 1,000 USD, and is said to be 90-100% of 680 SLI.

Wouldn't the 590 be justified considering the price alone?

I like to think of the situation like this: I'm pissed that I can't clock to 800+ on the card, but it performs at its price range. If it had clocked to 800+ with the 'famed' voltage tweak...it would have been a marvel, and no one would have said a word against the 590.


----------



## Masked

Quote:


> Originally Posted by *Recipe7*
> 
> @ Masked.
> The 590 was a big letdown. However, it was marked at the 700USD price. It was essentially the price of 570 SLI, which was exactly as how they performed.
> The 690 is 1000USD, and is said to be 90-100% of 680 SLI.
> Wouldn't the 590 be justified if considering the price alone?
> I like to think of the situation as such. I'm pissed that I can't clock to 800+ on the card, but it performs at it's price range. If it did clock to 800+ with the 'famed' voltage tweak... it would have been a marvel, and no one would have opened their mouths against the 590.


Like I said, from the vendor standpoint, this card was an incredible success. -- The 590 was advertised as 2x cherry picked GTX 580's with the best hardware to date on any PCB...Which, it is actually factual however, the performance was equivalent to 2x570's that's absolutely correct.

The difference with the 590, is how it was marketed...And on it's release, it was more/less meant to rival the price of the 6990's.

Now in terms of real life performance, it actually beats the 6990, even with the OCP etc...Stock for stock.

I'm not necessarilly upset with performance, my beef is very personal in regards to the fact that, many of the things we've asked for, even vendors...Were ready months and months ago, only to be rolled out with the 680's.

Now, is the 690 2x 680s? You know I can't answer that.









Look at it from this perspective... your 590's are about 1x 680 (let's be very real about that, especially with soft OCs), so you essentially, right now, have a 680. (I mean no offense, but you guys know I'm a realist.)

So, you already have a card capable of playing every single game on the market... Witcher 2, Borderlands 2, D3, GW2, SWTOR... (notice some are yet to be released), so your performance is still there, and it will be until September.

Is it worth the upgrade from THAT standpoint? IMHO, no... I say no because you already have the performance of one 680, and that's more than enough for the upcoming lineup.

Is the 690 worth it if you have the cash? Sure... you have my blessing; I'll be stocking it, as I mentioned, so fire away... but if you play the games above, I'd wait.


----------



## Recipe7

I am really happy with my 590. Never considered a 6xx. Barely considering a 7xx, especially with the lineup of games this year, and maybe next (I'm sure the 590 can manage 60fps in Crysis 3 at 1080p).

At the end of the day, it comes down to the user's needs and wants. You always hit the nail on the head for the general population of gamers, which then lets those of us who are nit-picky develop some idea about pulling the trigger on specific hardware.

Thanks for your input Masked.


----------



## iARDAs

@ masked

Do you know if there will be games in 2013 that will challenge BF3 and require more beef?

The games you mentioned are all due this year, and the 590 will handle them easily.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> @ masked
> 
> Do you know if there will be games in 2013 that will challenge BF3 and require more beef?
> 
> The games you mentioned are all due this year, and the 590 will handle them easily.


I don't have an office that beta tests anymore, and unfortunately I don't currently have the time with the amount of orders I have, plus the fact that I'm still a one-man show, so I don't have an answer.

Bioshock Infinite will be hardcore, but they've done a great job, so it won't be that taxing... Metro LL will be the new standard for benching ~ I think that will be taxing... Prey 2 is really sexy... Prototype 2 was really cool... Devil's Third... Risen 2... D3...

Of everything that's being released, I think Metro LL will become your gold standard, and will it tax the 590? ~ More than likely...

From my personal perspective, as a hardcore MMO'er ~ Tri-SLI is going to last me at least a year... I'm still waiting on my custom blocks, so you'll get to see those babies when I get down to the nitty gritty.

I have to admit that at this point I'm very biased, because I only MMO on the PC anymore... it's so much easier for me to pick up a controller and play a console game vs. being at the PC.

In regards to the games above, practically every single game I mentioned is on the PS3, where I'll almost exclusively be playing them... granted, there are 1 or 2 I only want to play on the PC... in terms of my time, ease and enjoyment, I'd rather play them on a console.


----------



## Shinobi Jedi

Quote:


> Originally Posted by *Masked*
> 
> I don't have an office that beta tests anymore, and unfortunately I don't currently have the time with the amount of orders I have, plus the fact that I'm still a one-man show, so I don't have an answer.
> Bioshock Infinite will be hardcore, but they've done a great job, so it won't be that taxing... Metro LL will be the new standard for benching ~ I think that will be taxing... Prey 2 is really sexy... Prototype 2 was really cool... Devil's Third... Risen 2... D3...
> Of everything that's being released, I think Metro LL will become your gold standard, and will it tax the 590? ~ More than likely...
> From my personal perspective, as a hardcore MMO'er ~ Tri-SLI is going to last me at least a year... I'm still waiting on my custom blocks, so you'll get to see those babies when I get down to the nitty gritty.
> I have to admit that at this point I'm very biased, because I only MMO on the PC anymore... it's so much easier for me to pick up a controller and play a console game vs. being at the PC.
> In regards to the games above, practically every single game I mentioned is on the PS3, where I'll almost exclusively be playing them... granted, there are 1 or 2 I only want to play on the PC... in terms of my time, ease and enjoyment, I'd rather play them on a console.


Yeah, Metro LL will be the one. Probably even more so than Crysis 3.

I'm not even sure two 590's in SLI will be enough for those games with the 1.5GB VRAM cap. If the 590's had 2GB of VRAM per GPU like the 690's, we wouldn't even be having this conversation. But I doubt I'll be upgrading just for that game.

I've also gotten so sucked into SWTOR that it's been mostly all I've been playing. Considering I can get 60fps avg. with everything turned up except shadows on my G73 notebook, upgrading the GPUs in my desktop anytime soon seems excessive.

@iARDAs:

Really good points about 3D gaming too. It further emphasizes why these aren't worth the upgrade. If I somehow end up on 3 monitors in Surround and am limited by my 590's, I'll consider it then. But for now I'm way more focused on getting my CPU and mobo upgraded.

This is a good conversation though. It helps kill the hardware jones those 690's give to my admittedly hardware-addicted self.

Cheers!


----------



## MoleStrangler

Quote:


> Originally Posted by *Recipe7*
> 
> I am really happy with my 590. Never considered a 6xx. Barely considering a 7xx, especially with the lineup of games this year, and maybe next (I'm sure the 590 can manage 60fps in Crysis 3 at 1080p).
> At the end of the day, it comes down to the user's needs and wants. You always hit the nail on the head for the general population of gamers, which then lets those of us who are nit-picky develop some idea about pulling the trigger on specific hardware.
> Thanks for your input Masked.


Like yourself, I am happy with my 2 x 590s. I will skip the 6xx video cards and wait for the 7xx cards to see if they offer anything special.

The budget I would have spent on a 690 will go on a monitor upgrade. I've been looking at a decent NEC display for a while now; I'm on a crappy Dell screen right now, but I'm thinking that having spent plenty on video power, it's time to spend some on image quality.


----------



## Masked

Like I said about the 690... it's a good card, a great card even, but necessary for future gaming? I don't think so.

There are also limitations on the OC already in place, plus the same throttling technology that exists on the 680's.

Personally, I don't see a dual-PCB card on air being as "successful" as the 590 originally was, without all the driver/OCP brouhaha.


----------



## iARDAs

The weather is getting hot here in Turkey.

My 590 used to top out around 84 degrees at 80-85% fan speed.

Now I can only keep it at 84 degrees with 95% fan speed.

Any suggestions?

Should I be adding more side fans to my case?

Should I go with water cooling?

How much would it cost me to water cool my system? And will the watercooling kit I get only be compatible with the 590?

I will be in the USA in mid-June, so I might grab a watercooling kit, but I am not sure.


----------



## w4rp1e

hey guys, I'm just wondering if anyone has a core that sits 30 degrees over the others?

I'm currently running quad SLI with 2 590s. I just played a round of BF3 (ultra settings etc.) and all the cores bar one sat around 47, while one core sat at 70.
It's weird, though, because this still happens even at idle; right now my temps read 66, 42, 38, 41.
Why is one core way hotter than the others at all times?


----------



## kzinti1

If you're watercooling, it sounds like a stray bubble. If you're running them on air, I haven't a clue. But if on air, swap them around and see what happens.


----------



## w4rp1e

It's air, and I've swapped one card for the other and run them both by themselves; it's only the one core. It doesn't seem to affect the way they run. Maybe a faulty reading?


----------



## iARDAs

So I turned off my PC, took my 590 out, and saw that under the hood, right on the side where the FAN is visible, the places where the heat from the cores would come out are covered with layers of DUST.



I am talking about the places that I labeled in red.

I don't have anything to clean them with, so I got a toothpick, went in carefully through the fan, and got the layers of dust out with it. Luckily, each time the layer got stuck on the toothpick, and in 3-4 tries I could get all the layers of dust off those spots.

Now at idle my 590 is 5-6 degrees cooler.

I'd say those dust layers were covering maybe 70-80% of those places.

No wonder I was hitting the 90s.

Anyhow, although I keep my room clean, dust is everywhere in my PC. I am thinking of putting a wooden block under the case so the dust will not be able to reach my PC as much as it used to.

EDIT: At full load my GPU would hit 92-93 degrees before cleaning the 590;

now at full load it hits 78 max. MY GOD is all I can say.

I am using 95% fan speed. Would this shorten the lifespan of the fan?


----------



## w4rp1e

Quote:


> Originally Posted by *iARDAs*
> 
> Anyhow, although I keep my room clean, dust is everywhere in my PC. I am thinking of putting a wooden block under the case so the dust will not be able to reach my PC as much as it used to.


I just lifted my PC off the carpet with a big piece of ply; now there's room underneath for the PSU to bring in air, since I upgraded to a 1500W unit.
It's also made it more stable on the floor and much nicer to slide back and forth.


----------



## Xraze

By the way, am I the only GTX 590 user that has bouncy GPU usage in BF3 multiplayer? It's never at a constant 99% usage.


----------



## iARDAs

Quote:


> Originally Posted by *Xraze*
> 
> By the way, am I the only GTX 590 user that has bouncy GPU usage in BF3 multiplayer? It's never at constant 99% usage


I used to have it too

For me it was a driver issue.


----------



## Recipe7

Quote:


> Originally Posted by *Xraze*
> 
> By the way, am I the only GTX 590 user that has bouncy GPU usage in BF3 multiplayer? It's never at constant 99% usage


I'm with you.

Quote:


> Originally Posted by *iARDAs*
> 
> I used to have it too
> 
> For me it was a driver issue.


I still have it bouncing around 40-60% most of the time. I have the latest drivers too.

And regarding your dust issue, if you cleaned it properly and it's able to maintain 75-85 degrees at load, that should be perfectly fine. I'd avoid running the fan at 95% for extended periods of time (more than 30 minutes to an hour).

Just gotta blame the weather in Turkey, hehe.


----------



## iARDAs

Quote:


> Originally Posted by *Recipe7*
> 
> I still have it bouncing around 40-60% most of the time. I have the latest drivers too.
> And regarding your dust issue, if you cleaned it properly and it's able to maintain 75-85 degrees at load, that should be perfectly fine. I'd avoid running the fan at 95% for extended periods of time (more than 30 minutes to an hour).
> Just gotta blame the weather in Turkey, hehe.


Man, that dust layer was incredible. From now on I will clean the interior of my PC every month, worst case every other month. I've had my 590 for almost 5 months, maybe more, and I hadn't cleaned it until today.

I just changed my fan profile. The highest I set is 75%; let's see how this works out.

At 60 degrees 55%

70 degrees 65%

80 degrees 75%
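A profile like that is just a handful of (temperature, fan speed) points, with the speed held flat below the first and above the last point and interpolated in between. A quick illustrative sketch (plain Python, not actual Afterburner code; the helper name is made up, the curve points are the ones above):

```python
# The three-point fan profile above: (temperature °C, fan speed %).
# Afterburner does this internally; this just mirrors the idea.
CURVE = [(60, 55), (70, 65), (80, 75)]

def fan_speed(temp_c, curve=CURVE):
    """Return the fan speed (%) for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: hold the minimum
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: hold the maximum
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(60))  # 55
print(fan_speed(75))  # 70.0
print(fan_speed(90))  # 75
```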

I also turn off MSAA and only use FXAA.

I find FXAA to be very good and use it in every single game. I am loving the technology so far.

Ah, and the weather in Turkey will only get worse.

So I purchased my first ever SSD, and I will have it on Monday or Tuesday.

Right now this is how my case looks



Would having my soundcard right above my 590's fan cause an issue?

I believe I can also place my soundcard in the small blue slot on the motherboard right below my CPU fan (in the pic below, my 590 is not installed).



Since I also have side panel fans, would it be smart to put the soundcard right between the CPU fan and the 590?

Any thoughts?


----------



## YP5 Toronto

Or the water block isn't seated evenly. Happened to me... I just had to tighten a few screws and it normalized.


----------



## Recipe7

I have dust filters all around the outside of my case, particularly at the intake ports. Even with the filters my case gets quite dusty, even though I maintain my room well.

I have a DataVac, which makes cleaning dust easy. Once a month or so I take my case to the backyard and blow air throughout it, even through the fan vent of the 590. There is always an impressive puff of dust that comes out of both ends of the 590; it's crazy how much dust gets stuck there.

As for your soundcard, it doesn't seem to be too close to the intake of the GPU, since it looks like it's a good 3/4 of an inch away from the fan, and it's only covering about 30-40% of it according to your picture. Also, the soundcard doesn't produce any heat, so you should be OK with where it is now.


----------



## MKHunt

Sup, 590 friends. Long time, eh? I'm still running 285.27 or 67 or whatever it is, because anything 290+ makes Minecraft run at 15fps and makes all my clocks all sorts of screwy. Not sure if anyone else has experienced that.

Also, I discovered Firefall almost a month ago, and now I yearn to test it and discuss its flaws and strengths with every fiber of my being. Blizzard knows how picky I can be in betas.









So yeah, those are my updates. I've gotten back into cars; it switches between cars and computers. Current project: 1989 Rolls-Royce Silver Spur.


----------



## toX0rz

Quote:


> Originally Posted by *iARDAs*
> 
> So I turned off my PC, took my 590 out, and saw that under the hood, right on the side where the FAN is visible, the places where the heat from the cores would come out are covered with layers of DUST.
> 
> 
> 
> I am talking about the places that I labeled in red.
> 
> I don't have anything to clean them with, so I got a toothpick, went in carefully through the fan, and got the layers of dust out with it. Luckily, each time the layer got stuck on the toothpick, and in 3-4 tries I could get all the layers of dust off those spots.
> 
> Now at idle my 590 is 5-6 degrees cooler.
> 
> I'd say those dust layers were covering maybe 70-80% of those places.
> 
> No wonder I was hitting the 90s.
> 
> Anyhow, although I keep my room clean, dust is everywhere in my PC. I am thinking of putting a wooden block under the case so the dust will not be able to reach my PC as much as it used to.
> 
> EDIT: At full load my GPU would hit 92-93 degrees before cleaning the 590;
> now at full load it hits 78 max. MY GOD is all I can say.
> 
> I am using 95% fan speed. Would this shorten the lifespan of the fan?


Oh it's quite obvious that the heatsinks will eventually start catching dust at some point.

I had to clean my 590 three times already, dust was gathering at the exact same spots that you marked and yes, it does affect temperatures more than people would think. Mine was also running at the 90°C mark when it happened.

With 70% fan speed I'm mostly getting temps between 78°C and 80°C, so when the card starts going into the upper 80s I know it's time to clean the dust out.







The removable cover makes it practical and easy too.


----------



## iARDAs

Quote:


> Originally Posted by *toX0rz*
> 
> Oh it's quite obvious that the heatsinks will eventually start catching dust at some point.
> I had to clean my 590 three times already, dust was gathering at the exact same spots that you marked and yes, it does affect temperatures more than people would think. Mine was also running at the 90°C mark when it happened.
> With 70% fan speed I'm mostly getting temps between 78°C and 80°C, so when the card starts going into the upper 80s I know it's time to clean the dust out.
> 
> 
> 
> 
> 
> 
> 
> The removable cover makes it practical and easy too.


I was hitting 92-93 degrees. It was waaaaaaaaay too much, and this happened with 95% fan speed.

After cleaning it out, my temperatures are now 82 max with 80% fan speed. Incredible difference.

I actually could not clean it as much as I wished, but it still did the job. Ah, also, I did not take the cover off my 590. I am thinking about it, but I am too clumsy and would surely break something.

Today my SSD will arrive and I will be installing it in my case, and I am thinking of doing some cleaning as well.

I came across this guide on YouTube on taking a 590 apart:






I am guessing this is a detailed teardown video. Is it possible to just take the front cover off without too much hassle, so the 590 will look like the picture I posted a few posts back?


----------



## mironccr345

Hey everyone! I've been reading this thread since I got my 590 over two months ago; there's lots of information and it makes me feel like a noob again.









I just wanted to share some pics of my 590. It's been sitting in the box for over two months, and the only time I've used it was to test that it works (I bought it used from an OCN member). My main rig is down because I'm waiting for a 2500K that I bought from an OCN member, and it's going on three weeks now. I might have to report the dude since he hasn't responded to my emails.









But here's my 590 in its temporary home. I'm using the HTPC from my room; I switched out the PSU for an AX850 and slapped in the 590. I'm not gaming on it, just using it until I get my main rig up and running.

The case is a Fractal Design Core 1000.


----------



## vwmikeyouhoo

I have been getting strange hiccups lately with my card. I get frame rate dips at random times; even running 3DMark 11 the frames are not smooth, and I went from an 11k+ graphics score to a 9400 score. I completely wiped drivers and tried 4 different drivers with the same effect.

I noticed while playing BF3 yesterday that the GPU usage would sometimes be only 50%! I feel like I'm going crazy with this poor performance. Anyone have any clue why this may be happening?


----------



## Krazeswift

Hey guys quick question,

Any reason why my fan speed isn't changing in certain games? Does it have anything to do with games that aren't fully SLI supported?

I just tried the Sniper Elite and Ridge Racer demos from Steam and the fan didn't budge once; it just stayed at 40% whilst GPU1 was hitting 90°C! GPU2 was only at 50°C with minimal usage.

I'm using MSI Afterburner with a custom fan profile. I tried disabling Afterburner, but the fan still didn't kick in... any ideas?


----------



## iARDAs

Quote:


> Originally Posted by *mironccr345*
> 
> Hey everyone! I've been reading this thread since I got my 590 over two months ago; there's lots of information and it makes me feel like a noob again.
> 
> 
> 
> 
> 
> 
> 
> 
> I just wanted to share some pics of my 590. It's been sitting in the box for over two months, and the only time I've used it was to test that it works (I bought it used from an OCN member). My main rig is down because I'm waiting for a 2500K that I bought from an OCN member, and it's going on three weeks now. I might have to report the dude since he hasn't responded to my emails.
> 
> 
> 
> 
> 
> 
> 
> 
> But here's my 590 in its temporary home. I'm using the HTPC from my room; I switched out the PSU for an AX850 and slapped in the 590. I'm not gaming on it, just using it until I get my main rig up and running.
> The case is a Fractal Design Core 1000.


Do something about that guy. Did you pay him already? Good idea to get a better PSU, by the way, and ENJOY your 590. The 690 is the new king, but the 590 will carry you for another 1-2 years with everything maxed at 1080p.









Quote:


> Originally Posted by *vwmikeyouhoo*
> 
> I have been getting strange hiccups lately with my card. I get frame rate dips at random times; even running 3DMark 11 the frames are not smooth, and I went from an 11k+ graphics score to a 9400 score. I completely wiped drivers and tried 4 different drivers with the same effect.
> I noticed while playing BF3 yesterday that the GPU usage would sometimes be only 50%! I feel like I'm going crazy with this poor performance. Anyone have any clue why this may be happening?


I wonder if something else is conflicting. Also, are you running everything at stock settings? Maybe something is wrong with the overclock, even though you tried 4 different drivers. For me 301.24 was terrible and I was getting lots of stuttering; now I am back on the latest WHQL and everything works like a charm. I also had that GPU usage issue in BF3, but for me it was a driver issue. However, for some people it is not.

Quote:


> Originally Posted by *Krazeswift*
> 
> Hey guys quick question,
> Any reason why my fan speed isn't changing in certain games? Does it have anything to do with games that aren't fully SLI supported?
> I just tried the Sniper Elite and Ridge Racer demos from Steam and the fan didn't budge once; it just stayed at 40% whilst GPU1 was hitting 90°C! GPU2 was only at 50°C with minimal usage.
> I'm using MSI Afterburner with a custom fan profile. I tried disabling Afterburner, but the fan still didn't kick in... any ideas?


Did you try a different driver? With one driver in the past I had a similar issue with my fan: it would not start at all, or would get stuck at 40%. Also check your power cables and see if the GeForce LED on your 590 is lit and not flashing.


----------



## MKHunt

Quote:


> Originally Posted by *iARDAs*
> 
> Do something about that guy. Did you pay him already? *Good idea to get a better PSU, by the way*, and ENJOY your 590. The 690 is the new king, but the 590 will carry you for another 1-2 years with everything maxed at 1080p.


I was about to say the AX850 is one of the best PSUs in its class (absolutely ungodly ripple suppression), but then I realized you were complimenting, not suggesting.

Herp derp.

@mironccr345: Embrace the 590 for what it is. It's not perfect nor the king, but it's still a beast.


----------



## iARDAs

So guys, today I will be purchasing an air blower, the kind of equipment designed to blow dust out of a PC.

How should I use this product with my 590? I know a few of you have such equipment for blowing out dust, but where should I blow the air in on the 590? Also, how close should I get?

I will also be using this machine on my entire PC. Do I have to take components out one by one to use the air blower, or is it OK to leave them installed in the PC?

thank you folks.


----------



## Krazeswift

Thanks for the suggestions. It seems to be a problem with Afterburner; I've disabled my custom profile and am at least getting some movement on the fan now. I tried a few drivers, but it remained the same.

I'll keep playing


----------



## emett

iARD, just learn to pop the cover off; it's only four screws and one clip, from memory. It will allow you to clean it properly.

I'm selling my mighty Palit GTX 590 to a mate who's gonna run a quad-GPU setup. Yesterday I bought 2 GTX 680's after a couple of days trying to source a GTX 690. I'm sure I'll bump into you guys elsewhere on the forum. Thanks for all the help with overclocking etc. I'm sure it will come in handy when my mate realises his first 590 is underperforming.


----------



## iARDAs

Quote:


> Originally Posted by *emett*
> 
> iARD, just learn to pop the cover off; it's only four screws and one clip, from memory. It will allow you to clean it properly.
> I'm selling my mighty Palit GTX 590 to a mate who's gonna run a quad-GPU setup. Yesterday I bought 2 GTX 680's after a couple of days trying to source a GTX 690. I'm sure I'll bump into you guys elsewhere on the forum. Thanks for all the help with overclocking etc. I'm sure it will come in handy when my mate realises his first 590 is underperforming.


Well, the heavyweights of the 590 Club are leaving one by one. I might be the next in line around June-July, thinking of grabbing a GPU while I am on vacation in the USA. Normally I would not, but everything here in Turkey is 1.5 or 2 times more expensive, so I might just use the advantage of vacationing in the USA and grab a 690. I can sell my 590 here for around $610, so I will just add another $400.

I just bought one of those air blowers, and it worked like a charm a few minutes ago. For now I am skipping unscrewing the cover, but maybe I'll do that at the next cleanup.

So your friend is going to run a quad 590 setup? Tell him to hang around here.

See you later mate.


----------



## MKHunt

You can count on me being here for a while. I might not post, but until my 590 doesn't cut it, I have a feeling it'll stay safely ensconced in my case. I don't post often because I am content with it


----------



## Masked

Quote:


> Originally Posted by *MKHunt*
> 
> You can count on me being here for a while. I might not post, but until my 590 doesn't cut it, I have a feeling it'll stay safely ensconced in my case. I don't post often because I am content with it


420 Is such a legit number as well.

So as some of you have been asking...

670 =/= 680

670 OC'd == 680 (fewer shaders etc., but score-wise about the same)

670 OC'd =/= 680 OC'd (This is especially true on water...680 has a higher roof due to hardware)

590 = 680 ~ This is true and I've said it previously; you're within 5%, so a stock 590 is equivalent to a stock 680.

Now, I've laid out the performance above; you can obviously deduce that an OC'd aftermarket 670 = 590 ~ that is true.

The major issue that you'll have in upgrading, if you choose to, is availability... I, as a vendor, cannot even get more 690's than my initial order... there just aren't any right now... same goes for 670's and 680's.

So in terms of performance, believe it or not, the 590's are still in the game... I wouldn't really worry about upgrading until 110, to be honest, because all of the games coming out within the next year (2 were just pushed back to 2013) are easily handled by what you have...

I'm going to be unavailable on the forums over the weekend and for a while thereafter, because I was just asked (as a favor) to fix a mistake that someone made server-side with a major, major game, so I fly out in a few hours. If there are any questions about what I said above, or any other questions you'd like answered, PM me directly, or my work email is in the signature.


----------



## iARDAs

You are the man Masked. 

As we discussed earlier my reasons for upgrading are financial only. Otherwise 590 is great.

Let me tell others what I said to Masked.

Here in Turkey it is very hard to sell a GTX 590 because of the price, and this will be even harder next year. Right now I can get around $600 for a 590, therefore I will try to sell it for that price.

I will be going on vacation to the USA from mid-June until early July. I will be picking up either dual 670s or a single 690. The reason for dual 670s is that it is the same performance as a 690 but costs $200 less. However, my motherboard is not an SLI motherboard, so I will have to upgrade the motherboard as well. I can sell my motherboard and get a new one, paying $200 max for the difference, probably even less.

So in my exact situation:

2 x 670 + 1 motherboard = 1 x 690 in price.

I am a dual-GPU enthusiast, and I love taking the 590 in my hands. You can feel the power even when you hold it.

However, if I purchase a 690 it will again be harder for me to sell when the time comes.

I will be monitoring the market closely for the next month and will buy an ASUS 690 or 2 670s when they become available. The reason for choosing ASUS is that the warranty will be valid here in Turkey as well. We don't have EVGA here, so I won't be purchasing that.

So in the above 2 scenarios I will be paying around $400 extra after selling my 590, and I believe it is a good deal for me. The 690 will be priced around $1500 here in Turkey, I can assure you of that.

My 3rd option is to sell the 590 for $600, get a new motherboard and one 670 or 680 in the USA, and ask for a second one around Christmas time, when my brother-in-law will be coming to Turkey from San Francisco.

In this 3rd scenario I won't have to pay anything extra, as $600 will be enough for a 670 and a new motherboard (considering I will also get money from selling the motherboard I have now). Then I can wait until Christmas, when prices may come down, or maybe I can find a good used deal.

All of the options are good for me, but I am yet to decide.

Though, as Masked says, availability is an issue.
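Since the options boil down to simple arithmetic, here's a quick out-of-pocket comparison (every figure is an estimate from this post, in USD, not market data; the variable names are made up):

```python
# Rough upgrade-cost comparison using the estimates quoted above.
# All prices are the poster's own guesses, purely illustrative.
sale_590 = 600        # expected resale price of the GTX 590
price_690 = 1000      # single GTX 690
price_670 = 400       # single GTX 670 at US retail
mobo_upgrade = 200    # max net cost of moving to an SLI motherboard

# Option 1: sell the 590, buy a 690 (no motherboard change needed).
option_690 = price_690 - sale_590
# Option 2: sell the 590, buy two 670s plus an SLI motherboard.
option_670_sli = 2 * price_670 + mobo_upgrade - sale_590
# Option 3: one 670 plus the motherboard now, second 670 later.
option_670_now = price_670 + mobo_upgrade - sale_590

print(option_690)      # 400
print(option_670_sli)  # 400
print(option_670_now)  # 0
```

Which matches the post: the 690 and the dual-670 route cost about the same extra $400, while the staggered 670 purchase is roughly covered by the 590's resale value.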


----------



## toX0rz

Quote:


> Originally Posted by *iARDAs*
> 
> You are the man Masked.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As we discussed earlier my reasons for upgrading are financial only. Otherwise 590 is great.
> 
> Let me tell others what I said to Masked.
> 
> Here in Turkey it is very hard to sell a GTX 590 because of the price, and this will be even harder next year. Right now I can get around $600 for a 590, therefore I will try to sell it for that price.
> 
> I will be going on vacation to the USA from mid-June until early July. I will be picking up either dual 670s or a single 690. The reason for dual 670s is that it is the same performance as a 690 but costs $200 less. However, my motherboard is not an SLI motherboard, so I will have to upgrade the motherboard as well. I can sell my motherboard and get a new one, paying $200 max for the difference, probably even less.
> 
> So in my exact situation
> 
> 2 670 + 1 motherboard = 1 690 in price.
> 
> I am a dual GPU enthusiast and i love taking the 590 in my hands. You can feel the power even when you hold it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> However if i again purchase a 690 it will be harder for me to sell when the time comes.
> 
> I will be monitoring the market closely for the next month and will buy an ASUS 690 or 2 670s when they become available. The reason for choosing ASUS is that the warranty will be valid here in Turkey as well. We don't have EVGA here, so I won't be purchasing that.
> 
> So in the above 2 scenarios I will be paying around 400$ extra after selling my 590 and i believe it is a good deal for me. The 690 will be priced around 1500$ here in Turkey I can assure you that.
> 
> My 3rd option is to sell the 590 for 600$. Get a new motherboard and 1 670 or a 680 in USA, and I can ask for a second one around the christmas time when my brother in law will be coming to Turkey from San Francisco.
> 
> In this 3rd scenario I won't have to pay anything extra, as $600 will be enough for a 670 and a new motherboard (considering I will also get money from selling the motherboard I have now). Then I can wait until Christmas, when prices may come down, or maybe I can find a good used deal.
> 
> All of the options are good for me but i am yet to decide.
> 
> Though, as Masked says, availability is an issue.


Also thinking of retiring my 590, but I'm unsure. In my case it would be more or less a sidegrade. Here in Germany you can get around €400 for a used 590, which is around ~$515.
That's the same price a new 670 retails for here (yeah, hardware prices outside the US are so ****ed up). OC it and it matches the 590, with the advantage of running cooler and quieter, drawing less power, and having more VRAM. I am not OC'ing the 590 on air (there's not much headroom anyway nowadays). I could potentially go for a 2nd 670 in a few months then.
Still unsure though, I just love the 590 (and any dual-GPU card, for that matter).


----------



## pokpok

4 GTX 690s in SLI and the earth explodes


----------



## Smo

Quote:


> Originally Posted by *pokpok*
> 
> 4 GTX 690s in SLI and the earth explodes


It certainly would, because it's not possible


----------



## Masked

Quote:


> Originally Posted by *Smo*
> 
> It certainly would, because it's not possible


Yepperoo!

2x 690 in quad SLI is the most that's possible.

Also, you cannot SLI a lesser card with the 590/690.

I.e. you cannot SLI 580s with 590s, or 680s with 690s, or 680s with 590s.

Windows 7, and every version of Windows past XP, allows for dual driver profiles. Meaning you can operate a 590 and a 690 within the same computer, but not together, concurrently. They exist on their own, apart from one another, on whatever monitor you have them on.


----------



## iARDAs

I wish we could SLI two different products.


----------



## mojobear

hey guys!

I'm such an infrequent poster it's not even funny. I had an issue with my GTX 590 cooling using an XSPC waterblock... the one with the higher vregs. Because of the "unofficial revision", the block would not sit evenly and one of the GPU temps would be extraordinarily high. EVGA was nice enough to send me an RMA'd GTX 590 with the older-version vregs and voila! The second GTX 590 has a waterblock now too. Now I can officially join the club without hanging my head in shame over a half-completed watercooling system.


----------



## alancsalt

Say you have a hypothetical water-cooled 590 in a CM690 II case, and your son sits beef brisket on a cardboard box on top of that case while he keeps playing BF3, and the cardboard box sags at one end under the heat and weight of said beef brisket and deposits its contents onto the mesh top of the case, and gravy drips through until BF3 suddenly stops and there's a strong whiff of burnt electrics coming from the graphics card..... hypothetically of course. I'm supposing there is no way of repairing such a damaged card....?


----------



## rush2049

In such a hypothetically hypothetical situation you most likely hypothetically shorted something. Depending on your hypothetical gtx 590 warranty info you might be able to hypothetically get a replacement.....


----------



## Masked

Quote:


> Originally Posted by *alancsalt*
> 
> Say you have a hypothetical water-cooled 590 in a CM690 II case, and your son sits beef brisket on a cardboard box on top of that case while he keeps playing BF3, and the cardboard box sags at one end under the heat and weight of said beef brisket and deposits its contents onto the mesh top of the case, and gravy drips through until BF3 suddenly stops and there's a strong whiff of burnt electrics coming from the graphics card..... hypothetically of course. I'm supposing there is no way of repairing such a damaged card....?


Take it apart, clean it up -- Call them and say it fried out.

If it's EVGA, chances are you'll get a replacement, but make sure there are zero traces of gravy.

If it's anyone else...Good luck.


----------



## YP5 Toronto

Quote:


> Originally Posted by *Masked*
> 
> Take it apart, clean it up -- Call them and say it fried out.
> If it's EVGA, chances are you'll get a replacement but, make sure there are 0 traces of gravy.
> If it's anyone else...Good luck.


x2


----------



## iARDAs

Hey folks

I just sold my 590 to a friend and will be purchasing 2 670 OC or 2 680 OC. Have not decided yet.

It was great chatting with you guys all. I will drop by from time to time to this topic because of my love to the 590.

See you all over the forum guys.


----------



## Recipe7

Quote:


> Originally Posted by *iARDAs*
> 
> Hey folks
> 
> I just sold my 590 to a friend and will be purchasing 2 670 OC or 2 680 OC. Have not decided yet.
> 
> It was great chatting with you guys all. I will drop by from time to time to this topic because of my love to the 590.
> 
> See you all over the forum guys.


Sad to see you go iARDAs, you've been a valuable member. Please drop by from time to time =)


----------



## iARDAs

Thanks mate. I will drop by from time to time.

The only reason for me to let go of the 590 was that it is very hard to sell these GPUs second-hand here in Turkey. But single-GPU cards can be sold easily. It would be pretty much impossible for me to sell the 590 next year, so I had to sell it this year.

I will miss the 590 so much because it excited me A LOT and lived up to the hype for me. I am not so thrilled about a 680 OC edition, but I will be grabbing two soon.


----------



## alancsalt

MSI N590GTX-P3D3GD5, in Australia,
Quote:


> Graphics Card Product
> 3 years (except VN series 1 year)
> In accordance with the original manufacturer's product serial number/barcode, graphics card products manufactured from July 01 2003 onwards are warranted for 3 years by the original manufacturer, whereas graphics card products purchased before July 01 2003 are warranted for 1 year.


Hypothet.........


----------



## Masked

Quote:


> Originally Posted by *alancsalt*
> 
> MSI N590GTX-P3D3GD5, in Australia,
> Quote:
> 
> 
> 
> Graphics Card Product
> 3 years (except VN series 1 year)
> In accordance with the original manufacturer's product serial number/barcode, graphics card products manufactured from July 01 2003 onwards are warranted for 3 years by the original manufacturer, whereas graphics card products purchased before July 01 2003 are warranted for 1 year.

Call and pray...After cleaning the card.


----------



## MKHunt

Quote:


> Originally Posted by *alancsalt*
> 
> MSI N590GTX-P3D3GD5, in Australia,
> Quote:
> 
> 
> 
> Graphics Card Product
> 3 years (except VN series 1 year)
> In accordance with the original manufacturer's product serial number/barcode, graphics card products manufactured from July 01 2003 onwards are warranted for 3 years by the original manufacturer, whereas graphics card products purchased before July 01 2003 are warranted for 1 year.
> 
> 
> 
> Hypothet.........

Quote:


> Originally Posted by *Masked*
> 
> Call and pray...After cleaning the card.


I would put emphasis on the cleaning and praying. Don't clean off ALL the flux, though. Leave some flux in areas if there is any residual (chances are there is) on the PCB. Old flux is slightly tacky and can attract dust which, if it has the right makeup, can conduct electricity. The theory here is that if the card wasn't cleaned enough to be suitable for its typical working environment (some dust, even with weekly cleaning) then that is the manufacturer's problem, as it is their responsibility to ensure the longevity of the product throughout the warranty period. EVGA kind of does its own thing and tends to be awesome with warranties in general.


----------



## flerndip

I am running an EVGA GTX 590 Classified with the stock vapor-chamber cooler removed and replaced by EVGA's Hydro Copper water block, on a Danger Den Monsoon D5 Premium reservoir/pump with a 360mm radiator and six Ultra Kaze fans in a Silverstone TJ11 chassis. My load temps (BF3 on Ultra settings) seldom exceed the mid-40s, with idle temps around 30°C (around 23°C ambient). I am really happy with this setup, but of course I use a headset so the fan noise is not an issue. Honestly, only a dog could possibly hear the fans over the H80 anyway. I run everything on Ultra settings and have found no compelling reason to attempt overclocking with it. If you're wondering, that's a PCI-E OCZ RevoDrive hybrid SSD. CPU: i7-3930K, mobo: ASUS X79 Sabertooth.


----------



## mironccr345

Nice rig you got there! ^^^ I can't wait to get my 590 in my main rig and under water!


----------



## PCModderMike

Quote:


> Originally Posted by *mironccr345*
> 
> Nice rig you got there! ^^^ I can't wait to get my 590 in my main rig and under water!


Would really like to see that, hope it happens soon for ya!


----------



## mironccr345

Quote:


> Originally Posted by *PCModderMike*
> 
> Would really like to see that, hope it happens soon for ya!


Hopefully I'll have it in by the end of this week! I have a CPU on the way and pretty much have everything I need to get the build completed. Will post pics when I'm finished.


----------



## OccamRazor

Hi guys! I'm new on this forum and will be a proud owner of an ASUS 590 in 2 days!

My entire rig is watercooled and so will be my 590!

As I said, I'm new around here, but one thing came to my attention while reading through this thread: the lack of respect members like Masked get from people who don't do their homework!
Respect is earned through valid actions, and guys like Masked have earned it on several occasions, sharing experience and knowledge without asking anything in return! Some guys should really be ashamed!
OK! Having this off my chest feels good! I really hate to see good people being hassled!
Have a nice weekend, y'all!

Ed

P.S. @iARDAs - sad to see you go man, you always had good things to say! All the best to you!


----------



## iARDAs

Quote:


> Originally Posted by *OccamRazor*
> 
> Hi guys! I'm new on this forum and will be a proud owner of an ASUS 590 in 2 days!
> 
> My entire rig is watercooled and so will be my 590!
> 
> As I said, I'm new around here, but one thing came to my attention while reading through this thread: the lack of respect members like Masked get from people who don't do their homework!
> Respect is earned through valid actions, and guys like Masked have earned it on several occasions, sharing experience and knowledge without asking anything in return! Some guys should really be ashamed!
> OK! Having this off my chest feels good! I really hate to see good people being hassled!
> Have a nice weekend, y'all!
> Ed
> P.S. @iARDAs - sad to see you go man, you always had good things to say! All the best to you!


Masked is the man









Thank you for your wishes. I will always follow this thread. I will pick up two 670s or two 680s soon, but that will not give me the excitement of a dual-GPU card. It is very hard to sell a dual-GPU card here in Turkey; that's why I am going with an SLI setup.

You will have lots of good times with a 590.


----------



## OccamRazor

Yep, I agree with you, Masked is the man!







Recognition is always a good thing amongst realistic and true-to-the-bone people!
I just got a 590 from a friend who ordered two ASUS GTX 590s a couple of months ago for a project at his firm but never used them,
so he sold me one for €300 and I'm tempted to buy the other one...








It's my belief that the full Kepler will be the one to buy, whenever it comes out...
iARDAs, did you try any overclocking with the recent batch of 300-series drivers? Was the voltage unlocked?
Thanks for your reply!

Ed


----------



## iARDAs

Quote:


> Originally Posted by *OccamRazor*
> 
> Yap, i agree with you, Masked is the man!
> 
> 
> 
> 
> 
> 
> 
> recognition is always a good thing amongst realistic and true to the bone people!
> I just got a 590 from a friend that ordered a couple month ago 2 asus gtx 590 for a project in his firm but never used them,
> so he sold me one for 300€ and im tempted to buy the other one...
> 
> 
> 
> 
> 
> 
> 
> 
> Its my believe that the full kepler will be the one to buy, whenever it comes out...
> IARDAs, did you try any overclocking with the recent batch of the 300 drivers? was the voltage unlocked?
> Thanks for your reply!
> Ed


The only OC'ing I did was taking the clock speed to 630 from 607, and that did not need voltage tweaking.

The card is so darn fast that OC'ing was not necessary for me.

It runs BF3 flawlessly, but I wish the performance or the drivers were better for Crysis 2.


----------



## OccamRazor

iARDAs, did you see this article on Guru3D? http://guru3d.com/news/gk110based-surfaces--has-2880-shader-processors/
I think you should wait a while before you buy the 670/80...
What do you think?


----------



## iARDAs

Quote:


> Originally Posted by *OccamRazor*
> 
> iARDAs, did you see this article in guru3d? http://guru3d.com/news/gk110based-surfaces--has-2880-shader-processors/
> i think you should wait a while before you buy the 670/80...
> What do you think?


Yeah, I read about that, and those GK110s will be in play soon.

However, I am going on vacation to the USA, where GPUs are way cheaper than here in Turkey, and I will take advantage of the situation.

Worst-case scenario, I can always sell the GPUs I bought in the USA at the same price as a 2nd-hand GPU in Turkey.


----------



## OccamRazor

Quote:


> Originally Posted by *iARDAs*
> 
> Yeah i read about that and those GK110 will be in play soon.
> 
> However i am going for vacation to USA and GPUS are way cheaper than when you compare it here in Turkey and I will take advantage of this situation.
> 
> Worst case scenario I can always sell my GPUs that i bought in USA to the same price as a 2nd hand GPU in Turkey


I understand. Even here in Portugal, where the prices are not so bad, sometimes I get stuff from eBay, shipping fees included, cheaper than I can buy it here...








Harley exhaust pipes, for example, are €200 cheaper if I buy them overseas...








The only problem you might face in the US is the availability of the 670/80.


----------



## fr0st.

Okay, I don't have time to do an entire search of the thread, but what's the deal with the 590s? I know the whole blowing-up catastrophe was partly the drivers, but what are the maximum safe values? If I were to get one, it'd be under water, have a backplate, and be running very cool the entire time. Which voltage and clocks should I stay under? What's the safe range?

I don't want to blow up a $650 card when I'm on a student's salary :3


----------



## emett

Read the first post.
http://www.overclock.net/t/1063263/gtx-590-flashing-overclocking-thread


----------



## fr0st.

Quote:


> Originally Posted by *emett*
> 
> Read the first post.
> http://www.overclock.net/t/1063263/gtx-590-flashing-overclocking-thread


It's alright I just bought a 6990 :3


----------



## emett

Good luck with that..


----------



## tats

Hey all - thinking of picking up a 590 soon. What is the current state of the drivers? I realize that since this is way after launch, things have probably been sorted out, but I was wondering if there was a running list anywhere of any unresolved issues.

I currently have a 580 that is on its way out with no RMA options available, so I was hoping this would be a nice upgrade now that prices have dropped.

Any insight into what I'm getting into? I've read a bunch of threads, but so many were from over a year ago that I can't be certain whether things have been addressed in subsequent drivers.

Thanks


----------



## iARDAs

Quote:


> Originally Posted by *tats*
> 
> Hey all - Thinking of picking up a 590 soon. What is the current state of drivers? I realize since this is way after launch things have probably been sorted out but I was wondering if there was a running list anywhere with any unresolved issues.
> i currently have a 580 that is on its way out with no RMA options available so I was hoping that this would be a nice upgrade now that prices have dropped.
> Any insight into what I'm getting into? I've read a bunch of threads but so many were from over a year ago, I can't be certain if things have been addressed in subsequent drivers.
> Thanks


The latest beta drivers have been amazing for my 590.

Every game runs smooth. BF3 was exceptionally good.

Overall great purchase.

How much will you be paying for it?


----------



## alancsalt

Quote:


> Originally Posted by *alancsalt*
> 
> Say you have a hypothetical water-cooled MSI N590GTX-P3D3GD5 in a CM690 II case, and your son sits beef brisket on a cardboard box on top of that case while he keeps playing BF3, and the cardboard box sags at one end under the heat and weight of said beef brisket and deposits its contents onto the mesh top of the case, and gravy drips through until BF3 suddenly stops and there's a strong whiff of burnt electrics coming from the graphics card..... hypothetically of course. I'm supposing there is no way of repairing such a damaged card....?


Just an update: I did what a good father would, I think. Found him an EVGA Classified 590 to replace it.


----------



## tats

Quote:


> Originally Posted by *iARDAs*
> 
> The latest beta drivers had been amazing for my 590.
> 
> Every game was running smooth. BF3 was exceptionally good.
> 
> Overall great purchase.
> 
> How much will you be paying for it?


Thanks - I'm paying just about $500 for it. I figure I get a little more performance than a 680 for the same money and if it craps out I should be able to get something cool in RMA to replace it.

I also want to stick with a one card solution and not go SLI, so this seems like a good bet.


----------



## iARDAs

Quote:


> Originally Posted by *tats*
> 
> Thanks - I'm paying just about $500 for it. I figure I get a little more performance than a 680 for the same money and if it craps out I should be able to get something cool in RMA to replace it.
> I also want to stick with a one card solution and not go SLI, so this seems like a good bet.


The 590 is still in play... You will never regret the decision.

It is a card whose problems have been solved. The latest drivers run perfectly with the 590, but they seem to have occasional issues with the 6xx-series cards.

The 590 is very well built, so you will not need to RMA it, I am sure of it. I mean, it is a piece of technology and anything can break, but the 590 is a tough card.

At stock, the 590 outperforms the 680 by a small margin too, since it is a dual-GPU card.

Notice how the 590 outperforms a 680 or a 670 in Metro 2033 and Crysis 2.

In Battlefield the difference is smaller, but the 590 still leads.

The only reason I switched to Kepler was to run 670 SLI, which outperforms the 590, though of course it costs more.

After the 690, I can easily say that the 590 is the best-performing card out there (besides the ASUS Mars II, a rare card that performs like a 690).

Enjoy the card, bro.


----------



## tats

Thanks man. For the RMA, I was thinking that a year or so down the road, when they aren't making 590s anymore, there's a chance I can get a cool replacement.

My brother had a 295 that crapped out and EVGA gave him a 480 as a replacement - so if this happens in a few years, I wouldn't mind getting something like that.


----------



## toX0rz

Quote:


> Originally Posted by *iARDAs*
> 
> After 690, I can easily say that 590 is the best performing card out there. (besides Asus Mars II which is a rare GPU performs like a 690)


The Mars II performs around 20% better than a stock 590.
A 690 easily performs 50% - sometimes 70% - better than a 590.

And the 590 can be OC'd to Mars II clocks anyway.


----------



## mywifeispist

Quote:


> Originally Posted by *toX0rz*
> 
> The Mars II performs around 20% better than a stock 590.
> A 690 easily performs 50% - sometimes 70% better than a 590.
> And the 590 can be OC'd to Mars II clocks anyway.


I just put together another system for a friend using two 680s, and I am impressed with the performance of those cards; they perform just as well as, if not better than, my 590s.


----------



## iARDAs

Quote:


> Originally Posted by *toX0rz*
> 
> The Mars II performs around 20% better than a stock 590.
> A 690 easily performs 50% - sometimes 70% better than a 590.
> And the 590 can be OC'd to Mars II clocks anyway.


Are you sure the Mars II performs only 20% better than a 590? I thought it was at least 50%, but I am not sure. I have never touched a Mars II

Quote:


> Originally Posted by *mywifeispist*
> 
> I just put together another system for a friend using two 680's and I am impressed with the performance of those cards, they perform just as good if not better then my 590's.


It seems that although 590 = 680 (give or take),

680 SLI is better than 590 quad SLI.

Masked also said that in his tests a few pages back.

So I am not surprised.


----------



## Masked

Quote:


> Originally Posted by *iARDAs*
> 
> Are you sure that Mars II performs only 20% better than a 590? I thought it was at least 50% but i am not sure. I never ever touched a Mars II
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Masked has also said that in his tests few pages before.


You mean this, eh?

Quote:


> Originally Posted by *Masked*
> 
> 420 Is such a legit number as well.
> So as some of you have been asking...
> 670 =/= 680
> 670 OC'd == 680 (Lesser shaders etc but, score wise @ the same)
> 670 OC'd =/= 680 OC'd (This is especially true on water...680 has a higher roof due to hardware)
> 590 = 680 ~ This is true and I've said it previously, you're within 5% so, a stock 590 is the equivalent to a stock 680.
> Now, I've laid out the performance above, you can obviously deduce that an OC'd aftermarket 670 = 590 ~ That is true.
> The major issue that you'll have in upgrading, if you choose to, is availability... I, as a vendor, cannot even get more 690s than my initial order... There just aren't any right now... Same goes for 670s and 680s.
> So in terms of performance, believe it or not, the 590's are still in the game...I wouldn't really worry about upgrading until 110, to be honest because all of the games coming out within the next year (2 were just pushed back to 2013) are easily handled by what you have...
> I'm going to be unavailable on the forums over the weekend + for a while there-after because I was just asked (as a favor) to fix a mistake that someone made server-side with a major, major game so, I fly out in a few hours. If there are any questions about what I said above or, any other questions you'd like answered -- PM me directly or, my work email is in the signature.


I, personally, don't think the Mars 2 should be included in comparisons for 2 reasons...

1) They were extremely limited...Limited enough that my former office only got 1...That says an incredible amount to me alone...
aaaannnddddddd...
2) 1/3rd were RMA'd and of the rest that actually still exist...Only about 1/4 work as advertised.

So, the Mars 2, maybe working as intended, you have about 200...200 is not mainstream enough for an end user to connect with or even worry about...

That being said, it was 20-30% over a 590 at stock, 40% OC'd, but the driver errors were incredible... I mean you're talking about Vsync tessellation, sometimes the GPUs wouldn't scale... Notice you haven't heard how awesome they are since launch... And they keep very quiet about the card? *Whispers* There's a reason for that.

I actually have 2 clients that refused to listen to me, their cards came DOA and Asus basically called them liars...I'm not getting into that drama but, the Mars 2 was not by any means, a successful product...A successful launch for Asus? Yes...But, not for the customer.

So ultimately, I wouldn't factor the Mars 2 into any graph/conclusion or even comparison because it was not a quality product for the most part and isn't something mainstream enough to even have a successful comparison.


----------



## iARDAs

Hmm, good insider information, Masked. Thank you. I had always thought the ASUS Mars II was an incredible product.

It was way overpriced for 580 SLI, but that's the marketing portion of it. Shame that it turned out this way, though. I would be mad angry if I had purchased an ASUS Mars II and had to RMA it.

Anyhow, I am still waiting for my 670 SLI setup. By the way, your signature needs updating, bro.


----------



## tats

I'm excited for my 590 to come. Hopefully this will keep me content for another generation or two.


----------



## iARDAs

Quote:


> Originally Posted by *tats*
> 
> I'm excited for my 590 to come. Hopefully this will keep me content for another generation or two.


Do you intend to game on a single monitor @ 60 fps? Or something else?

If you are only going to be gaming on a single monitor at 60 fps, I am sure the 590 will be enough for you for the next 2 years with high quality settings. Unless a poorly optimized game comes out.


----------



## mywifeispist

Anyone interested in two XSPC water blocks for the GTX 590? $85.00 shipped in the USA lower 48, PM me... they are used, in good condition, and have thermal pads to go with them if you need. (PayPal)


----------



## tats

Quote:


> Originally Posted by *iARDAs*
> 
> DO you intend to game on a single monitor @ 60 fps? or some other thing?
> 
> If you are only going to be gaming in a single monitor with 60 fps I am sure the 590 will be enough for you for the next 2 years with high quality settings I am sure. Unless a poorly optimized game comes out.


Yep - Going to stick with my u2711 and stop throwing money into this pit


----------



## PCModderMike

Quote:


> Originally Posted by *iARDAs*
> 
> 590 is still in play... You will never regret the decision.
> 
> It is a card that the problems are solved. Latest drivers run perfect with the 590, but they seem to have rare issues with the 6xx series cards.
> 
> The 590 is very well built so you will not have the need to send it to RMA i am sure of it
> 
> 
> 
> 
> 
> 
> 
> I mean it is a technological thing and anything can be broken but the 590 is a tough card.
> 
> On stock 590 outperforms 680 by a small margin too. Since it is a dual GPU card .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Notice how 590 outperforms 680 or a 670 in Metro 2033 and Crysis 2.
> 
> In battlefield the difference in smaller but still 590 leads.
> 
> The only reason I switced to Kepler was to run 670 SLI. Which outperforms 590 though but of course costs more.
> 
> After 690, I can easily say that 590 is the best performing card out there. (besides Asus Mars II which is a rare GPU performs like a 690)
> 
> Enjoy the card bro.


Can you link me to the source for those benchmarks please? Just interested in reading the article if there is one associated with them. Thanks


----------



## iARDAs

Quote:


> Originally Posted by *PCModderMike*
> 
> Can you link me to the source for those benchmarks please? Just interested in reading the article if there is one associated with them. Thanks


Sure buddy

here you go

http://www.guru3d.com/article/geforce-gtx-670-2-and-3way-sli-review/1


----------



## PCModderMike

Quote:


> Originally Posted by *iARDAs*
> 
> Sure buddy
> here you go
> 
> http://www.guru3d.com/article/geforce-gtx-670-2-and-3way-sli-review/1


Thank you







rep given


----------



## iARDAs

Quote:


> Originally Posted by *PCModderMike*
> 
> Thank you
> 
> 
> 
> 
> 
> 
> 
> rep given


Thank you







rep taken


----------



## MasterVampire

Quote:


> Originally Posted by *iARDAs*
> 
> 590 is still in play... You will never regret the decision.
> 
> It is a card that the problems are solved. Latest drivers run perfect with the 590, but they seem to have rare issues with the 6xx series cards.


What driver version are you talking about?


----------



## mironccr345

I'm almost finished adding my 590 in my main rig.


----------



## iARDAs

Quote:


> Originally Posted by *MasterVampire*
> 
> What driver version are you talking about?


The 301 drivers were extremely solid for my 590.


----------



## mironccr345

What's the best driver for the 590 to date? I just got my 590 in my loop and installed the latest driver, 301.42. I then tried the 296.10 drivers and they seem to work better. Does anyone have suggestions on which would be the best driver to use? Also, if you're gaming at 5760x1080 or higher resolution, what are your settings?


----------



## alancsalt

Quote:


> Originally Posted by *mironccr345*
> 
> What's the best driver for the 590 to date? I just got my 590 in my loop installed the latest driver 301.42. I just installed 296.10 drivers and it seems to work better. Does any one have suggestions on which would be the best driver to use? Also, if you're gaming on 5760x1080 or higher resolution, what are your settings?


I think you just use what works for you. Some have trouble with the new drivers, some don't. I just put it down to hardware variation. I don't know that any one setup works for everybody. Different games make a difference too. If I never play Skyrim, I wouldn't know whether it had problems or not, but that's one game mentioned by people the new drivers don't work for.


----------



## mironccr345

Thanks for the info. I guess I'll try a couple more drivers and compare 3DMark 11 scores. I also noticed in MSI Afterburner that the core voltage is maxed out at 0.925 V; is that normal for this card?


----------



## alancsalt

I'm not finding which 590 you have in your sig rigs, but some have speed stepping with the voltages. You've unlocked the volts in settings? You can monitor them in the graphs if it's enabled in settings. You might find they rise on demand. Check the graph immediately after a benchmark or game.

It's my son that has a 590, but my dual 580s only go up to 1.15v...


----------



## mironccr345

Quote:


> Originally Posted by *alancsalt*
> 
> I'm not finding which 590 you have in your sig rigs, but some have speed stepping with the voltages. You've unlocked the volts in settings? You can monitor them in the graphs if it's enabled in settings. You might find they rise on demand. Check the graph immediately after a benchmark or game.
> It's my son that has a 590, but my dual 580s only go up to 1.15v...


I have an EVGA 590. I haven't updated my sig rig because I've just upgraded from AMD to Intel. I unlocked all the settings, and as I type this the GPUs are at 0.875 V. I'll run some benchmarks and post the results.

Nice, so you got a new 590 - or rather, your son did. (I read about the hypothetical incident)


----------



## mironccr345

OK, I ran the card at stock and with a small OC using 3DMark 11. The volts go up to 0.925 V every time I run 3DMark 11 and idle around 0.875 V. I guess that's normal? But the card seems to work better with the previous driver.

Sorry for the double post.


----------



## rush2049

The core voltage is locked down by the drivers. NVIDIA built a hook into the drivers to disallow voltage tweaking above 0.925 V. There are custom BIOSes floating around that will unlock it by setting all voltage profiles higher (max 0.963 V). See the 'GTX 590 flashing & overclocking' thread for more info.

Just recently, some people have claimed to get past even that voltage lock by using some other software whose name escapes me right now. I haven't played with that, as I am on air and 0.963 V is enough volts for me...


----------



## mironccr345

Quote:


> Originally Posted by *rush2049*
> 
> The core voltage is locked down by the drivers. Nvidia built a hook into the drivers to not allow voltage tweeking above .925 volts. There are custom bioses floating around that will unlock it by setting all voltage profiles higher (max .963). See the 'gtx 590 flashing overclocking' thread for more info.
> Really recently some people have claimed to even pass that voltage lock by using some other software that I forget the name of right now, I haven't played with that as I am on air and .963 is enough volts for me.....


That's what I figured, and I'd rather not mess with the BIOS. I think an OC of 655MHz is OK for me. I can't get it stable beyond that, but I don't think I'll need to go any higher. I get a 3DMark 11 score of 10k with my 2700K @ 4.8GHz and 655 on the card.


----------



## mojobear

For people with the EVGA GTX 590 and the newer BIOS versions 70.10.42.00.90 and 70.10.42.00.91: you are not locked at 0.963 V. Modify those BIOSes with NiBiTor, then force-flash your GTX 590 with nvflash from DOS and reinstall the driver.


----------



## mojobear

Oops... the force flash is for people with older BIOS versions. To update the BIOS on your EVGA GTX 590, you will need to force-flash using nvflash in DOS.
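For anyone hazy on the procedure, the steps above can be sketched roughly as below. This is a hedged outline only: the ROM file names are made up, the exact nvflash flags varied between versions, and on cards of this era the real flash was run from a DOS boot disk, so the script just echoes the commands by default instead of executing them.

```shell
#!/bin/sh
# Rough sketch of the GTX 590 force-flash sequence described above.
# ASSUMPTIONS: 'nvflash' is on the boot media and 'modified.rom' is a
# NiBiTor-edited BIOS; flag details may differ between nvflash versions.
# DRY_RUN=1 (the default) prints each command instead of running it.
DRY_RUN=${DRY_RUN:-1}
run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "$*"        # show the command without executing it
    else
        "$@"
    fi
}

run nvflash --save backup.rom      # always back up the stock BIOS first
run nvflash -4 -5 -6 modified.rom  # force-flash, overriding the ID checks
```

On a dual-GPU board like the 590 you would repeat the flash for the second GPU's BIOS as well; read the flashing thread linked earlier in this thread before touching a real card.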


----------



## mironccr345

Quote:


> Originally Posted by *mojobear*
> 
> oops...the force flash is for people with older bios versions. to update the bios on ur evga gtx 590 you will need to force flash using nvflash in dos.


Interesting, I'll have to look into that. Although I don't want to flash my BIOS, I'm dying to get the card to 10.5k-11k in 3DMark 11 (if that's possible?). What's your 3DMark 11 score with your quad SLI?


----------



## mojobear

Hey, actually I haven't benched 3DMark 11 with the quad. Because of the power draw limit (PDL), having an overclock and benching 3DMark is a huge pain with inconsistent results. For games, no problems whatsoever reported from people going to 0.988 V etc. in terms of the PDL. I myself have not hit the PDL at 0.988 V @ 710 core.


----------



## mojobear

Check out the GTX 590 overclock thread: http://www.overclock.net/t/1063263/gtx-590-flashing-overclocking-thread/660

Post 667 - one guy hit > 11000 with 0.988 V @ 740MHz.


----------



## mironccr345

Thanks for the info!


----------



## Wogga

With an SB CPU the highest scores for quad are around 16.5k, and up to 19k with an SB-E CPU (because of the higher CPU score).
My P16644, for example.


----------



## mironccr345

Quote:


> Originally Posted by *Wogga*
> 
> with SB CPU highest scores for quad are around 16.5k and up to 19k with SB-E CPU (because of higher cpu score)
> my P16644 for ex.


Nice score! Have you tried pushing it a little further?


----------



## mywifeispist

I don't have my 590s in this rig at the moment, but I figured I would post a 3DMark 11 screenshot with the two 680s I have in at the moment. Quad SLI was around or a little over 16k with the 590s.......... here is what the 680s did on the same rig.


----------



## MtheG

Guys, I am so glad I took the time to read these posts. I have just pre-ordered a pair of GTX 680 Hydro Coppers to replace my GTX 590s (one Hydro Copper and one with an EK waterblock).

Currently on an i7 3930K with an Asus Rampage IV Extreme with EK waterblocks, all on a single loop with a 480 and a 360 rad.

My processor is running at 4.8 and both cards at stock clocks. The CPU idles at less than 40 and averages around 60 when playing BF3; sometimes it hits 62. The cards get no hotter than 4 running Ultra on a Dell U27.

So after reading all of your posts, I have a good mind to cancel my pre-order and wait for the new 110s (or the next set of cards).

I'm thinking that would be money better spent.

http://3dmark.com/3dm11/3578773


----------



## mojobear

I finally ran 3DMark 11. Score: P15306. This is with my old i7 920 @ 4.3GHz.







http://3dmark.com/3dm11/3598563

Good news is that, looking at the sub-scores, graphics tests 1-4 are the same or a bit better than the Sandy Bridge cards'. I guess it's a double-edged sword, because I would really like to get some new hardware soon







just can't justify a gaming side-grade.... grrrr.


----------



## mironccr345

Quote:


> Originally Posted by *mywifeispist*
> 
> i dont have my 590's in this rig at the moment but figured i would post 3dmark 11 screen shot with two 680's i have in at the moment. Quad sli was around or a little over 16k with the 590's.......... here is what the 680's did on the same rig.
> [/URL]


Nice score! So are you getting the same results as your 590 SLI? And are you selling your 590s?

Quote:


> Originally Posted by *MtheG*
> 
> guys I am so glad i took the time to read these posts, I have just pre ordered a pair of GTX 680 Hydro Coppers to replace my GTX 590's Hydro Copper and one with EK Waterblock.
> Currently on i7 3930k with Asus Rampage iv Extreme with EK waterblocks all on a single loop with a 480 and 360 RAD.
> My Processor is running at 4.8 and both cards on Stock clock, CPU idles less then 40 and averages around 60 when playing BF3 sometimes it hits 62. The Cards get no hotter than 4 running Ultra and on a Dell U27.
> So after reading all of your posts I have a good mind to cancel my pre order and wait for the new 110's (or next set of cards)
> I'm thinking that would be money better spent.


Good Choice! After reading this thread, I decided to keep my 590 as well.









Quote:


> Originally Posted by *mojobear*
> 
> i finally ran 3dmark2011. Score P15306. This is with my old i7 920 @ 4.3 ghz
> 
> 
> 
> 
> 
> 
> 
> http://3dmark.com/3dm11/3598563
> Good news is looking at the sub score, graphics test 1-4 are the same or a bit better than the sandybridge cards. I guess its a double edge sword because I would really like to get some new hardware soon
> 
> 
> 
> 
> 
> 
> 
> just cant justify a gaming side grade....grrrr.


4.3 on a 920, that's a nice OC. What are your temps?


----------



## mojobear

Hey mironccr345,

I'm getting about 75°C with small FFTs in Prime95 and IntelBurnTest. The EK 420 and XSPC EX360 sure help things. Hope the overclock thread helps.


----------



## Canis-X

So does anyone still have GTX 590s anymore? Looks as though the 6xx series is really good; unfortunately I can't afford to switch over or I would..... bummer!!


----------



## iARDAs

Quote:


> Originally Posted by *Canis-X*
> 
> So does anyone still have GTX 590s anymore? Looks as though the 6xx series is really good; unfortunately I can't afford to switch over or I would..... bummer!!


Oh yeah, there are many people out there with 590s. To be honest, a person who has a 590 does not really need an upgrade, as a stock 670 or 680 still underperforms compared to the 590, though it beats the 590 when overclocked.

It is true that a few of this club's members, including myself, have switched over to Kepler, but the 590 is still such a great card.

The two reasons I switched to a Gigabyte 670 instead of keeping the 590 are:

1-) Since I live in Turkey, it is easier to sell single-GPU cards here. I had a tough time selling the 590, and if I had waited longer it would have been even harder to sell. I had to sell the card for almost half the price I bought it at, so it was not a great investment in my case.

2-) One 590 outperforms one 670, but since I will go SLI with my 670s, that setup will beat the 590 while staying quieter and cooler. 590 quad SLI is a possibility too, but not every game takes advantage of four GPUs.


----------



## mironccr345

Quote:


> Originally Posted by *Canis-X*
> 
> So does anyone still have GTX 590s anymore? Looks as though the 6xx series is really good; unfortunately I can't afford to switch over or I would..... bummer!!


I got my 590 in Feb 2012. Am I reading your sig rig right - you have four 590s?


----------



## Canis-X

No...LOL....just two. I tried out the "System Interrogator" feature in Rig Builder and that is how it populated the info. I'm going to fix it now....odd.


----------



## mironccr345

Cause I was like


----------



## iARDAs

Technically a person CAN own four 590s, but can only install two of them.


----------



## mironccr345

haha, I knew that.







I was just curious as to why he had four of them.


----------



## Canis-X

LOL.....so you would ask that question.








I also saw that it added an entry for each stick of my RAM....nice little app, but you gotta double check it when it's done.


----------



## mywifeispist

I wasn't going to sell my 590s, but someone who wanted them badly enough paid me $1000 for them, so I let them go. I'm using two 680s now; they are really, really good cards and seem much faster overall too.


----------



## Canis-X

Not bad! Were they on stock air or WB'd?


----------



## mywifeispist

Quote:


> Originally Posted by *Canis-X*
> 
> Not bad! Were they on stock air or WB'd?


He didn't want the waterblocks, so I took them off and put the stock coolers back on for him; the stock coolers were never used either.


----------



## StrayderGame

Hi all, I have one question for you guys. My friend uses a GTX 590, and after six months he is getting the following problem: he leaves his PC turned on (no sleep mode) when he goes to bed, for example, and when he comes back and starts a game he gets huge FPS drops, micro-stuttering, etc. He needs to restart the PC for everything to work fine again. Does anyone have any suggestions, and could it be related to the graphics card? For the first six months everything was fine. He has tried formatting the disk, reinstalling drivers, etc., but the problem remains. And he confirmed to me again that it happens 90% of the time when he comes back from idle...


----------



## mironccr345

I haven't had any issues with my 590 waking out of sleep mode. Maybe it's the drivers he's using? Ask him to test out different drivers and see if that works?


----------



## alancsalt

Does one of his notification-area / memory-resident programs have a memory leak?
Has he tried sfc /scannow?
Tried chkdsk?

Not a problem I've had, though... just the sort of possibilities I think of first up.


----------



## toX0rz

When I used the Windows 7 Sleep/Standby mode, my CPU would never clock up to full speed again after waking; it would always stay at the idle clock (800MHz - this was on my old AMD Phenom X6 system), which caused a huge performance loss. It always needed a reboot to fix the problem.

I never really dug into the problem; I have simply avoided using standby mode since then.
So the first thing I'd do is monitor the clock speeds (preferably of both the CPU and GPU) and see whether they run at full clocks in the first place.
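The clock check suggested above can be scripted. A minimal sketch, assuming a driver recent enough to ship nvidia-smi with the `--query-gpu` option (tooling from the 590 era may differ); the 800MHz idle figure is just the example from the post above:

```shell
#!/bin/sh
# Compare the reported graphics clock against a known idle clock.
# check_clock prints "stuck" when the GPU never left idle, "ok" otherwise.
check_clock() {
    current_mhz=$1
    idle_mhz=$2
    if [ "$current_mhz" -le "$idle_mhz" ]; then
        echo "stuck"
    else
        echo "ok"
    fi
}

# Live query (assumes a modern nvidia-smi; uncomment to use):
# clk=$(nvidia-smi --query-gpu=clocks.gr --format=csv,noheader,nounits | head -n1)
# check_clock "$clk" 800
```

If this reports "stuck" right after resume while a game is running, that points at the clock-up bug rather than the card itself.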


----------



## Masked

A tremendous number of people have had issues with their PCs in standby and sleep mode since the release of SB.

This hasn't really been addressed by Microsoft/Intel or anyone, really, but there is an underlying issue.

I believe there's a massive memory leak that takes place when you begin to use the PC again... so the 590 would be directly affected by this.

My computer at home takes 20 seconds to boot to Windows, less with QB on... so I just turn it off if I'm going out for a while.

I'd suggest anyone going to sleep for the evening do the same, unless you're priming.


----------



## heyskip

Quote:


> Originally Posted by *toX0rz*
> 
> When I used the Windows 7 Sleep/Standby mode, my CPU would never clock up to full speed again after waking; it would always stay at the idle clock (800MHz - this was on my old AMD Phenom X6 system), which caused a huge performance loss. It always needed a reboot to fix the problem.
> I never really dug into the problem; I have simply avoided using standby mode since then.
> So the first thing I'd do is monitor the clock speeds (preferably of both the CPU and GPU) and see whether they run at full clocks in the first place.


^ This. I had a 2500K on a Gigabyte board, and after resuming from sleep the CPU was stuck in its idle state at 1600MHz. Only a restart would fix it. I contacted Gigabyte but received no help at all. I replaced the board with an ASUS and never had the problem again.


----------



## dklic6

Well, I just won an Asus 590 on eBay with a Danger Den block on it that I'm not going to use. All I need is thermal tape and paste to put the stock cooler back on, correct? I'm moving from 6970s to a 590, basically so I can fold on it while I'm not gaming (I've been playing less lately). Does anybody have any tips for a first-time Nvidia / dual-GPU card guy? My sig rig is up to date with all current hardware.


----------



## Zeronrg

Can I get onto the chart...? I will update with pics soon. How do I add my sig rig???


----------



## dklic6

Go to the rigbuilder link at the top right of the page and add your hardware.


----------



## Canis-X

Got a question for you guys, for any ASUS GTX590 owners....

Has anyone attempted to RMA these cards because of the fraudulent advertising on the box they come in, which states "Voltage Tweak", "50% Faster", "Shift into Overdrive"?



Just curious


----------



## mywifeispist

I had two of them; they were awesome cards. Reflash them......... see the voltage-mod/flashing thread here. There are plenty of BIOSes on here too; I had 3-4 different Asus BIOSes uploaded.


----------



## Canis-X

How high could you get your voltage/GPU frequency in, say, 3DM11 before the PDL kicked in? The highest so far for me is 700MHz @ 0.963 V.

E: Grammar correction


----------



## mywifeispist

Not sure who you're asking, but I used to run mine at 750/0.988 easily; I didn't go over that by choice. I eventually found that they performed better for me at 720/0.988, but not all cards are the same. I don't own the cards anymore, as I sold both of them last month.


----------



## Canis-X

Quote:


> Originally Posted by *mywifeispist*
> 
> Not sure who you're asking, but I used to run mine at 750/0.988 easily; I didn't go over that by choice. I eventually found that they performed better for me at 720/0.988, but not all cards are the same. I don't own the cards anymore, as I sold both of them last month.


Yeah, I was speaking to you... LOL... sorry for not quoting to clear that up. Thanks for the info. I've been reading through that thread quite a bit over the last few days. I'll see if there is anything of use in it.


----------



## Zeronrg

Quote:


> Originally Posted by *dklic6*
> 
> Go to the rigbuilder link at the top right of the page and add your hardware.


But how do I add it to my posts??? Sorry, noob question I know....

http://www.overclock.net/lists/display/view/id/4362978

http://3dmark.com/3dm11/3701322

http://3dmark.com/3dm11/3765433


----------



## alancsalt

Quote:


> Originally Posted by *Zeronrg*
> 
> But how do I add it in the post??? Sorry noob question I know....


My profile/Signature/add list isn't it? (IIRC)


----------



## Zeronrg

Quote:


> Originally Posted by *alancsalt*
> 
> My profile/Signature/add list isn't it? (IIRC)


Thanks m8, much appreciated.


----------



## Sniffyy

Hi all, I need some advice.

I have a 6970 paired with a new i5 3570K. I've been looking at upgrading my GPU, as my 6970 is extremely loud and hot (it goes well over 90°C in the summertime). I've been looking at Nvidia, as I like their emphasis on quiet operation, and my roommate is pleased with his GTX 680. Naturally I looked at those, but I have also found a new GTX 590 for $620. Over here GTX 680s go for around $700, so the 590 caught my eye.

Would you say the 590 is worth it? I guess I'm concerned about noise and heat the most, as my 6970 is really off-putting (I think the TIM is dodgy or something; it's always been like this). Also, does it benefit from improvements made to other 5xx cards? That is, if the GTX 580 gets a boost in something, is that reflected in the 590's performance? Also, what can you tell me about reliability, etc.? Are they well made and do they last a long time?

Thanks.

edit: also, the brand is Zotac. I swear it's the last new 590 in my country, haha.


----------



## mywifeispist

I would go with a 680; it's cheaper and performance is much the same in a lot of games. I had two 590s and now have two 680s, and performance in what I play (BF3) is better with the 680s. I switch between playing on a 1080p plasma and three 27" Samsungs at 5760x1080, whichever I feel like using that day.


----------



## ProfeZZor X

So after 7 months of waiting, I finally get to the final stage of my build. Everything appeared to be going well, and my brother and I were making progress with connecting all the wires. Then as luck would have it, the harness that I had "professionally" extended for my 590 didn't work. I was pissed... One of the harnesses works, but it appears that any time I connect one of the 6-pins to the power supply, it shorts out everything. We've narrowed it down to one of the two 6-pin connectors, so I hope that this problem can be resolved before the holiday weekend. So close to the finish line and I trip... Well, I'll be taking it back to the shop that did it today, so they can fix it. If push comes to shove, I'll see if I can get another harness and start over. Hopefully EVGA carries them, or someone else is selling theirs... Although I'd still like to lengthen and sleeve it.


----------



## mironccr345

^ hope you work things out. Any pics?


----------



## ProfeZZor X

Quote:


> Originally Posted by *mironccr345*
> 
> ^ hope you work things out. Any pics?


I sent the cable back to the computer shop that did the work, so it's in their hands now. If they can't fix it, then I'll buy one off of ebay.

As for pictures, I have a couple, but I'd rather wait until it's fully operational. I don't want to jinx myself.


----------



## ProfeZZor X

Okay, the faulty harness has been fixed and is installed now. The PC is hooked up to the TV via a DVI-to-HDMI cable, and the television recognizes the physical connection between it and the PC, but nothing is happening beyond that. One thing I should note is that the EVGA logo is flashing on the side of the waterblock (when the PC is turned on)... Oh, and this is my initial start-up.

What am I doing wrong here? Shouldn't the first thing a new PC gives you be a video signal? Admittedly I am a newbie at this, so any constructive help would be appreciated. You can also PM me your answers. I just hate standing at the finish line and not being able to cross it.


----------



## mironccr345

I wouldn't know where to begin. The only time the lights would blink on my card is when one of the power cables wasn't plugged in. Did you test the card on air before you put it under water?


----------



## ProfeZZor X

Quote:


> Originally Posted by *mironccr345*
> 
> I wouldn't know where to begin. The only time the lights would blink on my card is when one of the power cables wasn't plugged in. Did you test the card on air before you put it under water?


No, I didn't. Should I have? And is that safe to run without any kind of cooling on it?


----------



## MKHunt

I find myself in need of a second monitor for productivity. I recall that the 590 had trouble back in the day driving two monitors when one is not used for gaming. I would like to keep the impact on gaming to a minimum, or none. Any advice?


----------



## Spizzy01

I'm hoping one of you guys can help here...

I've had a GTX 590 for about a year now, and it had been running perfectly the whole time... up until two weeks ago, when I updated my graphics drivers from 296 to 301 because Battlefield 3 was playing up.

During the initial installation the screen went black, as is normal when installing drivers, but it never came back on afterwards - so I had to hard reset.

My machine then proceeded to BSOD after about 5 minutes of use, which I figured was expected since the installation seemed incomplete. So I reinstalled the drivers again, and again the installation died and needed a hard reset.

I figured I'd boot into safe mode, remove the drivers manually and reinstall the new ones. This also didn't work; however, instead of dying on install, my machine simply BSOD'd after a few minutes of use. (All these BSODs are 'black screens', btw.)

So THEN I tried new drivers - specifically the beta 304 drivers... same results... but with these it wouldn't do it until I enabled my multi-monitor setup. (I run with 3 monitors at all times.)

Eventually I gave up and reverted back to 296, and I'm currently on that driver now. However, since the whole kerfuffle with the driver update, my machine now proceeds to BSOD when resuming from safe mode after a few minutes of use. Which is a bit annoying, as you can imagine!

I only have one 590, but I run across 3 screens at a resolution of 5760x1080. I'm using a first-gen i7 950 at stock clock speeds (my motherboard, a GA-X58A-UD5, seems to no longer be stable for very long when overclocked, so I reverted back - it also has a tendency to simply disconnect hard drives/controllers at will, stating it has run out of IRQs... but that's an entirely different issue!)

I've run a MemTest - all good in that department. And there's nothing showing up in the event log that could explain *** is going on.

If any of you guys could help at all, it would be greatly appreciated!

On an extra note, Windows Error Reporting initially stated the issue was my graphics drivers; however, since reverting to the 296 drivers it simply states that the issue is being looked into.


----------



## OccamRazor

Don't know if I can help, but something along those lines happened to me some time ago. I installed some drivers (can't remember which ones), and suddenly, in the middle of the installation: black screen and reboot. No go with that driver, so I installed the previous one, and everything was back to normal except for new and ever-increasing BSODs (I'd never had them before). I tried everything - removed all overclocks (GPU and CPU), set the BIOS to defaults - and nothing helped until I reformatted and reinstalled Windows. The BSODs were gone for good and everything was as smooth as before.
The drivers probably broke something in Windows. Hope I helped with my 2 cents!

Ed


----------



## OccamRazor

Well, I came across the chance to buy a second GTX 590 for less than $200 and go quad.
I just bought an i5 3570K and a Maximus V Gene, so no bottlenecking the quad.
I just use the rig for gaming, so:
Proud quad GTX 590 owners, tell me:
is it worth going quad just for gaming?
Do new games use the quad well? I know the scaling is not good, but is it worth it?
Thanks for your input, guys!

Ed


----------



## ProfeZZor X

Is insufficient power the only reason a blinking EVGA light would appear on the 590? Now that the shop I hired to extend my GPU power cables has fixed their snafu, the 590 seems to be powering up... sort of. Whenever I turn on the PC, I get a blinking light on the card. At this point I can't even get to the BIOS screen to install anything, so for now my PC is stuck being a water-cooled, LED-lit night light.... I did a little research and discovered that lack of power is the reason the light blinks instead of staying solid, and why the card does not turn on completely. I'm currently running an Apevia 1100W PSU, but within a couple of weeks I discovered that it's apparently one of the world's most overrated and underpowered PSUs. I should have done my research before I bought it online, but lesson learned. Nonetheless, I ordered a Rosewill Lightning 1300 yesterday, so I hope that will solve the problem. I've already tried the card in different PCIe slots, disconnected all other hardware, and made sure there are no shorts in the power cable connectors, but no such luck.

...If someone can PLEASE give me other reasons why the card won't work, I'd love to hear them.


----------



## mironccr345

I do remember getting the blinking EVGA light when I had a no-name 700W PSU hooked up to the card. With my Corsair 600W and AX850 PSUs, it powered on with no problems. Do you use cable extensions?


----------



## ProfeZZor X

Quote:


> Originally Posted by *mironccr345*
> 
> I do remember getting the EVGA light blinking when I had a no-name 700W PSU hooked-up to it. I used my Corsair 600w and AX850 PSU, it powered on with no problems. Do you use cable extensions?


If you consider cutting and adding length to the existing cables "cable extensions", then I suppose so. My initial goal was to conceal the black & yellow cables in the visible parts of my rig by having them lengthened to 12 inches and sleeved in black, then tucked behind the mobo... no different from any other modder here.

From what I've been told about the Apevia 1100, Apevia rates it at 1100 watts, but in real-world conditions it delivers around 900 watts. Call it impulse buying and/or a rookie mistake; my lesson has been learned over the months I've spent building this rig. That PSU was my second mistake (the first being my Antec case). My guess is that the Apevia just can't handle my D5, Corsair SSD, Seagate HDD, 9 LED Enermax fans, and 3 UV tubes. I'll eventually find a good home for it in my kids' rig, but my main concern is: IF the Rosewill 1300 still doesn't solve the problem, then what?


----------



## OccamRazor

Don't you worry, the problem in your rig was the PSU!
The Rosewill (a rebranded Super Flower) will solve it; the blinking light on the 590 does indeed indicate a lack of amps!


----------



## ProfeZZor X

Quote:


> Originally Posted by *OccamRazor*
> 
> Dont you worry, the problem in your rig was the PSU!
> The rosewill (superflower) will solve it, the blinking lights on the 590 are indeed lack of amps!


Well, my Rosewill Lightning 1300 arrived today, so I'll get started on removing the Apevia once I get off work and see if it solves the problem. I certainly hope so, because I'd feel like an ass for shelling out so much money on this rig only to fall and break a leg inches from the finish line. I'll keep you guys posted... One thing I have to admit is that it's packaged quite nicely.


----------



## mironccr345

Hope it works out for you. Keep us posted.

Sent from The Past
using Tapatalk


----------



## ProfeZZor X

Quote:


> Originally Posted by *mironccr345*
> 
> Hope it works out for you. Keep us posted.
> Sent from The Past
> using Tapatalk


What the heck... even with a brand-new 1300W PSU, it still blinks. Am I missing something here? It's pretty depressing to get to this point and have this happen. The rig behaves pretty much the same way as it did with the other PSU: I can turn it on, and all the lights, fans, and pump work, and the RIVE cycles through its usual boot-up codes, but that's as far as it goes.


----------



## mironccr345

Was it working before you added the block? I can't remember if you bought it with the block or the reference cooler. Also, what have you done troubleshooting-wise?


----------



## ProfeZZor X

Quote:


> Originally Posted by *mironccr345*
> 
> Was it working before you added the block? I can't remember if you bought it with the block or reference cooler? Also, what have you done trouble shooting wise?


I bought the hydro copper version. I didn't add a block.

As for troubleshooting, I don't know what else to do other than what I've already done, such as switching PCIe slots, disconnecting everything else, and so on. I bought it "new" on eBay a few months ago through a third party, so I don't know if EVGA will honor the warranty. At this point I just need to find out why it's still not working after troubleshooting all the other possible causes of the flashing logo.

The other thing I've noticed is that when the rig is connected to my display, the display recognizes that the PC is there, but there's no video signal.


----------



## OccamRazor

Try the card in a friend's rig. If it doesn't work there, you know you have a defective card, so RMA the sucker (if you have the original invoice they will accept it, as long as it's inside the warranty period). If it works, go back to your rig and try another graphics card to see whether it's the board or the PCIe lanes.
Good luck!


----------



## mironccr345

Quote:


> Originally Posted by *ProfeZZor X*
> 
> I bought the hydro copper version. I didn't add a block.
> As for trouble shooting, I don't know what else to do, other than what I've already done... Such as switching PCI-e slots, disconnecting everything else, and so on. I bought it "new" on eBay a few months ago through a third party, so I don't know if EVGA will honor it. At this point I just need to find out why its still not working after troubleshooting all the other possibilities of the flashing logo.
> The other thing that I noticed about my rig is that when it's connected to my display, the display recognizes that the PC is there, but there's no visual connection.


Any luck with the 590? Have you contacted the eBay seller, or even EVGA?


----------



## OccamRazor

@mironccr345

How is your setup with the 590 powering the 3x 24" monitors in the latest games?
Any slowdowns? I've got the Asus 27", and I was thinking about grabbing 2x 24" because the 27" models are too expensive.
But I was wondering about the gaming effectiveness of a 24+27+24 setup - any thoughts?

Thanks in advance









Ed


----------



## mironccr345

Quote:


> Originally Posted by *OccamRazor*
> 
> @mironccr345
> Hows your setup with the 590 powering the 3x 24" with the latest games?
> Any slowdowns? I've got the asus 27" and I was thinking about grabbing 2x 24" because the 27" are too expensive
> But was wondering the game efectiveness of having 24+27+24, any thoughts?
> Thanks in advance
> 
> 
> 
> 
> 
> 
> 
> 
> Ed


It's running alright. I was running EVGA 460 2GB cards (the 256-bit version) in SLI before my EVGA 590 - a big improvement when playing on 3x 24" monitors. I can run pretty much any game on high/extreme. I usually have to customize my settings with AA turned down, and I always have Vsync enabled, so I can't give you accurate FPS numbers. I still get some lag in demanding games like Crysis 2 or Metro 2033, but you can always customize the settings for smooth gameplay.

I'm sure you won't have any problems with those screens as long as they are 1080p monitors. You'll definitely have to use bezel correction with the 27" in the middle. I'm curious how it would look in a game with one 27" in the middle and two 24"s; it might take some time to get used to? Why not save up for two more 27"s?


----------



## OccamRazor

thanks for the reply!








I would, but one 27" is twice as expensive as the 24", so I can get 2x 24" for the price of one 27"...








I would also like to go portrait, but I don't think it can be done...


----------



## mironccr345

Well good luck either way. Take pics when you get it up and running.


----------



## ProfeZZor X

Quote:


> Originally Posted by *mironccr345*
> 
> Well good luck either way. Take pics when you get it up and running.


Absolutely, I'll keep you posted, and take pictures when it's finally done...

Say, does anyone have experience with EVGA guest RMAs? I'm curious about the procedure and what happens if the graphics card ends up being faulty. If it has to be replaced, how would they handle it if they no longer manufacture or sell my card? Would it be prorated down, or would I have to come out of pocket and upgrade? If any of you have experience with EVGA warranty situations, any info would be greatly appreciated, to put my mind at ease.


----------



## mironccr345

Quote:


> Originally Posted by *ProfeZZor X*
> 
> Absolutely, I'll keep you posted, and take pictures when it's finally done...
> Say, does anyone have any experience with EVGA Guest RMA's? I'm curious to know about that procedure and what happens if the graphics card ends up being faulty. If it ends up having to be replaced, how would they handle it if they no longer manufacture or sell my card? Would it be prorated down, or would I have to come out of the pocket and upgrade? If any of you have experience with EVGA warranty situations, it would be greatly appreciated, to put my mind at ease.


I don't, but I had a buddy return an X58 mobo about 4 months ago. They send you progress updates on your RMA, or give you a link to track it.


----------



## Pedropc

Hi all, sorry for my English. Gigabyte GTX 590 @ Asus MARS II BIOS;



This is with water cooling and it works perfectly.


----------



## mironccr345

Quote:


> Originally Posted by *Pedropc*
> 
> Hi all, sorry for my English. *Gigabyte GTX590 @ Asus MARS II*;
> http://imageshack.us/photo/my-images/259/3dmarkmarsii.png/
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This with RL and works perfectly.


Please explain?


----------



## Pedropc

I don't know if anyone has done this before me. I have a Gigabyte GTX 590 and could not do much overclocking with the latest drivers; the PDL kicked in. So what I did was flash the BIOS of the Asus Mars II. Now I can run the latest drivers and overclock, I have no PDL, and the card works fine with this BIOS. Greetings.


----------



## mironccr345

Quote:


> Originally Posted by *Pedropc*
> 
> Do not know if anyone has done this before me. I have a Gigabyte GTX590, could not do much overclocking with the latest drivers, jumped the PDL. So what I did was flash the bios of the Asus Mars II, I can put the latest drivers and overclock, and I have no PDL, the vga works ok with this bios, greetings.


Nice, do you have a guide or a link to a guide? Thanks!


----------



## Pedropc

Hi, here is what I did:

1. Download the two BIOSes from TechPowerUp (Asus Mars II).
2. Put the two BIOSes on a bootable USB stick.
3. Boot from the USB stick.
4. Run nvflash -r to unlock the BIOS; this must be done twice, once for BIOS 1 and once for BIOS 2.
5. Run nvflash -i1 -4 -5 -6 xxxxA.rom, where xxxxA.rom is BIOS 1.
6. Run nvflash -i2 -4 -5 -6 xxxxB.rom, where xxxxB.rom is BIOS 2.

Done.

When you try to flash, nvflash says the BIOS is not supported; ignore that, answer YES, and it flashes without trouble. Greetings.
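For reference, the sequence above can be collected into one place. This is a dry-run sketch only: it echoes the nvflash commands instead of executing them (the real tool runs from DOS, not a shell), and xxxxA.rom / xxxxB.rom are kept as placeholder names for the two Mars II BIOS images. Flash at your own risk.

```shell
#!/bin/sh
# Dry-run of the dual-BIOS cross-flash procedure for the GTX 590.
# ROM_A / ROM_B are placeholders for the two BIOS images from TechPowerUp.
ROM_A="xxxxA.rom"
ROM_B="xxxxB.rom"

flash_cmds() {
    # Unlock both BIOS chips first (step 4 above, run twice).
    echo "nvflash -r"
    echo "nvflash -r"
    # Flash each GPU: -i1/-i2 select the GPU index, and -4 -5 -6
    # override the ID-mismatch checks nvflash raises on a
    # cross-vendor BIOS (steps 5 and 6 above).
    echo "nvflash -i1 -4 -5 -6 $ROM_A"
    echo "nvflash -i2 -4 -5 -6 $ROM_B"
}

flash_cmds
```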


----------



## OccamRazor

Hi Pedro (português?), for how long have you run those chip voltages?
You know that due to design flaws the 590 can't handle too much, right?
But interesting anyway!









Ed


----------



## Wogga

It can with water cooling. Dunno for how long though.

BTW, HWiNFO shows that I have 4 CHiL CHL8266 chips and reports their temperature and voltage. Can those chips really report this info, or is it just some sort of bug?
They show 0.955 V and barely 30°C under load.


----------



## Pedropc

Good OccamRazor, I'm Spanish. I know this card is picky about its VRMs. For daily use I don't push the voltage, so I run it like this:



Even at reduced frequencies, the PDL never kicks in at any point.

Do you know the recommended voltage with water cooling? I had read that you could go up to 1.063 V; is that too much, or OK?


----------



## OccamRazor

Hi Wogga, unfortunately water doesn't help much, because it's the voltage that kills the card, not the heat;
The CHiL CHL8266 is a 6-phase digital synchronous buck controller for GPU regulation; it regulates voltage at the GPU (and also acts as a sensor, providing readings). More on this: http://www.chilsemi.com/products/multiphase-digital-power-controllers/chl8266/#tech


----------



## Wogga

Thx for the info on the CHL8266.
I used to run one of my cards at 1.05 V without any consequences, and it was "broken" at the time (I accidentally ripped out one of its capacitors).
Also, I think if you're lucky enough, your card can run 782 MHz at a voltage lower than 1.038 V; there have been examples.


----------



## Pedropc

Good Wogga, the card is stable at 782 MHz core with 1.025 V; as I said, this is with water cooling and an EK block. Is that voltage dangerous for the VRMs? Thanks and greetings.


----------



## OccamRazor

Wogga and Pedropc;
I think you should read this: http://www.overclock.net/t/974902/nvidia-gtx-590-owners-club/4180
The maximum voltage our card can handle is 1.050 V, and there's a voltage variation of ±0.030 V.
That means the maximum safe set voltage is 1.020 V! It's not the VRMs that are to blame but the delivery system, which cannot handle the load!
Read what "Masked" has to say about volts; he blew a few 590s, has extensive knowledge about this, and is one of the most praised members (if not the one)!
@pedropc
Very good, Pedrito; I'm Portuguese, but my great-grandmother was Spanish, from Seville!
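The margin arithmetic in the post above can be sketched as a tiny script. The 1.050 V ceiling and ±0.030 V deviation are the thread's numbers (attributed to Masked/Nvidia), not official specs:

```python
# Worked example of the voltage-margin arithmetic from the post above.
# Numbers come straight from the thread: 1.050 V absolute maximum the
# 590's power delivery tolerates, and a 0.030 V set-vs-measured deviation.
ABS_MAX_V = 1.050
DEVIATION_V = 0.030

def max_safe_set_voltage(abs_max=ABS_MAX_V, deviation=DEVIATION_V):
    """Highest voltage you can *set* so the worst-case measured
    voltage (set + deviation) still stays under the absolute max."""
    return round(abs_max - deviation, 3)

print(max_safe_set_voltage())  # 1.02
```

The same helper reproduces the other figure in the thread: with Nvidia's quoted 0.025 V tolerance on top of the 0.963 V default, `max_safe_set_voltage(0.988, 0.025)` gives back the 0.963 V default.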









Ed


----------



## mironccr345

Lots of good information, thanks guys. I definitely want to run at safe/recommended volts.


----------



## OccamRazor

I think we have a good thing here that deserves further investigation; the fact that we can use the Asus Mars 2 BIOS and tone down the volts to suit our cards is excellent!
We avoid the nasty limitations imposed by the BIOS and the drivers, but it has to be used with sense (voltage-wise), otherwise we'll start seeing blown cards again!
Another thing: as the 590 is two GPUs, remember that the voltages differ from GPU to GPU, so keep that in mind when setting volts! The second GPU might be overvolting, so be conservative with the voltage applied and keep an eye on whatever GPU monitor you're using for the voltages on both GPUs!









@ Pedropc:
can you send me a PM with all the steps you took flashing the Asus Mars BIOS, and the commands too?
Muchas Gracias!









Ed


----------



## OccamRazor

@ mironccr345

Thanks man, not just because you have a 590 but because you're a nice guy! Should be more like you around!
Cheers!


----------



## mironccr345

Quote:


> Originally Posted by *OccamRazor*
> 
> i think that we have a good thing here that deserves further investigation, the fact we can use asus mars 2 bios and tone down the volts to suit our cards its excelent!
> we avoid the nasty limitations imposed by the bios and the drivers, but its to be used with sense (voltage wise) otherwise well start seeing blown cards again!
> Another thing; as the 590 is 2 gpus its to be remembered that the voltages differ from gpu to gpu, so remember that when setting volts! the second gpu might be overvolting, so be conservative in the voltage applied and keep an aye out on the whatever gpu monitor youre using for voltages on both gpus!
> 
> 
> 
> 
> 
> 
> 
> 
> @ Pedropc:
> can you send me a PM with all the steps you took flashing the asus mars bios? and the commands too?
> Muchas Gracias!
> 
> 
> 
> 
> 
> 
> 
> 
> Ed


What @OccamRazor said about the flashing; I would love a step-by-step guide if you have the time.
Quote:


> Originally Posted by *OccamRazor*
> 
> @ mironccr345
> Thanks man, not just because you have a 590 but because you're a nice guy! Should be more like you around!
> Cheers!


Thanks man, much appreciated.


----------



## OccamRazor

We are here to help each other, and if we can make friends in the meantime, all the better, right?
We have a saying: "You can't choose your family, but you can choose your friends." I go a step further and say: "Friends are the family you can choose."









Ed


----------



## emett

Wow, 782mhz on a 590, kudos man.


----------



## Canis-X

I would definitely like to hear more about the MARS2 flash!! I would like to see what I can actually do in some benchmarks and games without NVidia's limitations!!!


----------



## OccamRazor

OK guys, I'm writing up a guide in English on Pedropc's behalf, because he's Spanish
and his English is not so good; but what he lacks in speech he makes up for in good heart and courage!
Kudos to you, Pedro!


----------



## Canis-X

Fantastic! Thanks so much for taking this on Occam!! +REP


----------



## mironccr345

Quote:


> Originally Posted by *OccamRazor*
> 
> Ok guys, im writting down a guide in english on Pedropc behalf cause hes spanish
> and his english is not so good but what he lacks in speech he has in good heart and courage!
> Kudos to you Pedro!


This is going to be very helpful! Thanks!


----------



## OccamRazor

590 TO MARS II

1º get the two BIOSes from TechPowerUp (http://www.techpowerup.com/vgabios/104118/Asus.GTX590.1536.110708.html and http://www.techpowerup.com/vgabios/104119/Asus.GTX590.1536.110708_1.html); it's TWO (2) BIOSes!
BIOS 1 and BIOS 2

2º create a bootable USB drive (here's a free tool: http://www.softpedia.com/get/System/Boot-Manager-Disk/Bootable-USB-Drive-Creator-Tool.shtml) and put the BIOS files on the USB drive along with nvflash (http://www.techpowerup.com/downloads/2165/NVFlash_5.127.html)

3º set your PC to boot from USB so it starts from your fresh bootable USB drive

4º the first thing to do is remove the BIOS protection, so type: nvflash -r. It will ask which BIOS you want to unprotect; three BIOSes appear: 0, 1 and 2. BIOS 0 belongs to the NF200 (don't mess with that one, that's the SLI bridge chip), so type 1, wait a moment, and it will remove the protection from BIOS 1. Now do the same for the second BIOS: nvflash -r, type 2, and it removes the protection from BIOS 2

5º now for the real flashing! Type nvflash -i1 -4 -5 -6 xxx.rom, where xxx.rom is the downloaded BIOS 1. It will say several times that this is the incorrect BIOS and that something does not match; don't pay attention to it, just type YES in capitals and it will flash it!

6º the same with BIOS 2: nvflash -i2 -4 -5 -6 xxx.rom, where xxx.rom is the downloaded BIOS 2. Same warnings as before; just type YES and it will do it!

7º remove the USB drive, reboot, and it's done: you now have a Mars II!

Now a WORD OF WARNING! Please keep in mind that doing this is a risk! It worked on Pedro's Gigabyte; it might not work on my Asus, your EVGA, or another one! And people, PLEASE be extra careful with the voltages you apply!
REMEMBER the burned cards and listen to what MASKED said about volts on the 590: http://www.overclock.net/t/974902/nvidia-gtx-590-owners-club/4180 ; the maximum voltage our card can handle is 1.050 V, and there's a voltage variation of ±0.030 V.
That means the maximum safe set voltage is 1.020 V (NVIDIA quotes a deviation range of 0.012-0.025 V, so default 0.963 V + 0.025 V = 0.988 V TOPMAX; MASKED burned a 590 at 1.000 V, so think about it). It's not the VRMs that are to blame but the delivery system, which cannot handle the load! Don't think that because it's on water you can raise the volts; it's not the heat but the voltage that KILLS OUR 590s!

Now enjoy!









Ed


----------



## Canis-X

Thank you Occam!! Has anyone attempted it yet? What were the results??







I must admit, I'm a little nervous...LOL


----------



## Wogga

check previous pages. there were some results


----------



## emett

Wish I still had my 590 :,(


----------



## OccamRazor

It's like a lingering feeling that you should have kept the 590? I had a flash-forward when I was about to buy a 680. A friend brought in the 590 for me at a good price ($300), and I thought of buying the 680 instead, since the price difference was only around $200 more. But suddenly I started to think of the 590 as a challenge, that it could perform better, that somehow someone could unleash the real 590. So I bought it instead of the 680, and now, thanks to the clever PEDROPC (kudos to you, my friend), we might have the better 590 it should have been from the beginning!

Quad Mars II anyone?









Ed


----------



## OccamRazor

@ Canis-X

If you go for it please tell us how your QUAD MARS II behaves!!!
I've got a second 590 in line for the taking...









Ed


----------



## mironccr345

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *OccamRazor*
> 
> 590 TO MARS II
> 1º get the two bios from techpowerup (http://www.techpowerup.com/vgabios/104118/Asus.GTX590.1536.110708.html and http://www.techpowerup.com/vgabios/104119/Asus.GTX590.1536.110708_1.html) its TWO (2) bios!
> bios 1 and bios 2
> 2º create a bootable USB (heres a free one: http://www.softpedia.com/get/System/Boot-Manager-Disk/Bootable-USB-Drive-Creator-Tool.shtml) and put the bios files inside the usb drive with nvflash(http://www.techpowerup.com/downloads/2165/NVFlash_5.127.html)
> 3º set your pc to boot from usb so it will read your fresh bootable usb drive
> 4º first thing to do is remove the bios protection so type: nvflash -r then it will ask you which bios you want to remove the protection, three bios apear, 0, 1 and 2, the 0 is the bios from NF200, (dont mess with that one, thats the SLI chip) so type 1 wait a moment and it will remove the protection from bios 1, now do the same for de second bios: nvflash -r type 2 and remove the protection from bios 2
> 5º now for the real flashing! type nvflash -i1 -4 -5 -6 xxx.rom, where xxx is the name of the downloaded bios 1, it will say several times that is the incorrect bios and that something does not match, dont pay attention to it, just type YES in capitals and it will flash it!
> 6º the same with bios 2, nvflash -i2 -4 -5 -6 xxx.rom, where xxx is the name of the downloaded bios 2, and the same warnings from before, just type YES and it will do it!
> 7º remove the usb drive and reboot and its done you now have a Mars II!
> Now a WORD OF WARNING! please keep in mind that doing this is a risk! it worked with Pedro's gigabyte it might not work with my ASUS or your EVGA or another one! and people PLEASE be extra careful with the voltages applied!
> REMEMBER the burned cards and listen to what MASKED said about volts on the 590: http://www.overclock.net/t/974902/nvidia-gtx-590-owners-club/4180 ,the maximum volts our card can handle is 1.050V and there's a voltage variation of 0.030+-,
> And that means the max safe voltage is 0.0950 (NVIDIA says range from 0,012-0,025 = default is 0,963 + 0,025= 0,988 TOPMAX, MASKED burned a 590 with 1.000v so think about it) It's not the VRMS to blame but the delivery system that cannot handle the load! don't think because its on water you can raise the volts, its not the heat but the voltage that KILLS OUR 590´S!
> Now enjoy!
> 
> 
> 
> 
> 
> 
> 
> 
> Ed






I appreciate this! I'm also nervous and excited to try it. I'll give it a try in a couple of days. Wish me luck!


----------



## Canis-X

I wonder... should we mod the Mars II BIOS voltages and frequencies before we flash the 590s, to ensure we don't go BOOM?


----------



## mironccr345

That's a good question. Who can do this?


----------



## OccamRazor

ok guys take a look at this:

ASUS MARS II

GPU Device Id: 0x10DE 0x108B
Version: 70.10.37.00.05
Mars II VB Ver 70.10.37.00.AS07S
Copyright (C) 1996-2011 NVIDIA Corp.
GF110 Board - 102015P0

Performance Level 0
Core Clk: 51.00 MHz
Mem Clk: 68.00 MHz
Shader Clk: 101.00 MHz
Voltage: 0.9625 V









Performance Level 1
Core Clk: 405.00 MHz
Mem Clk: 162.00 MHz
Shader Clk: 810.00 MHz
Voltage: 0.9625 V









Performance Level 2
Core Clk: 554.00 MHz
Mem Clk: 802.00 MHz
Shader Clk: 1107.00 MHz
Voltage: 0.9625 V









Performance Level 3
Core Clk: 782.00 MHz
Mem Clk: 1002.00 MHz
Shader Clk: 1564.00 MHz
Voltage: by ASIC (1.0375 V - 1.1000 V)








User Voltage limit: below 51.4566 V

The first three voltage levels are OK and, as you can see, they are the same default as our cards (0.963 V).
The other one, in performance level 3, you can lower via Afterburner or Precision!
OR
use NiBiTor and change the BIOS values before flashing! http://www.mvktech.net/component/option,com_remository/Itemid,0/func,download/id,3566/chk,dc53ae1785c755cd2681967a3a016e0a/
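The performance levels listed above can be captured as data for a quick sanity check. The values are copied straight from the post; the check itself is just an illustration of the point that only level 3 runs above the 590's default voltage:

```python
# The Mars II performance levels quoted above, as data. Levels 0-2 sit at
# the 590's default 0.9625 V; only level 3 carries the higher,
# ASIC-dependent voltage that needs lowering on a 590.
MARS_II_LEVELS = {
    0: {"core_mhz": 51.0,  "mem_mhz": 68.0,   "shader_mhz": 101.0,  "volt": 0.9625},
    1: {"core_mhz": 405.0, "mem_mhz": 162.0,  "shader_mhz": 810.0,  "volt": 0.9625},
    2: {"core_mhz": 554.0, "mem_mhz": 802.0,  "shader_mhz": 1107.0, "volt": 0.9625},
    3: {"core_mhz": 782.0, "mem_mhz": 1002.0, "shader_mhz": 1564.0, "volt": 1.0375},  # up to 1.1000 by ASIC
}

GTX590_DEFAULT_V = 0.9625

# Which levels run above the 590 default and therefore need attention?
risky = [lvl for lvl, p in MARS_II_LEVELS.items() if p["volt"] > GTX590_DEFAULT_V]
print(risky)  # [3]
```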










Ed


----------



## OccamRazor

Well, it seems NiBiTor doesn't recognize these BIOSes!
I've contacted MVKTech and submitted the BIOSes to their database; as soon as I have a reply from them I will post it!









Ed


----------



## CapitanPelusa

Going to flash my Asus GTX590 later tonight to confirm whether it works on those. I don't have a waterblock, so I won't try to reach 782 MHz, heh, but I'll just confirm whether the BIOS works OK on the Asus variants.


----------



## OccamRazor

Good man, CapitanPelusa!








I'll be online all night (night shift) until 0600 (that's your 0100), so we can talk about how it went,
as your card is the same as mine!
Thanks

P.S. Remember to tone down the voltage and the clocks with Afterburner, Precision, or whatever program you use, because if you try to game, the voltage ramps up and can damage the card; idle and desktop use is fine, as the voltage stays at the default (0.963 V)!


----------



## Canis-X

Cool! I may just give this a go this evening anyway. Until we can get NiBiTor to recognize the BIOS, I may set Afterburner to start with Windows at the safe voltage, to help protect my cards.









Cheers!


----------



## OccamRazor

That's a good idea, Canis-X; that way there will be no danger to the card!
As I said to CapitanPelusa, I will be here all night so we can talk about it!

Cheers

Ed


----------



## CapitanPelusa

Just flashed the Asus 590. As soon as the PC was turned back on, the BIOS POST showed heavy artifacts and corruption. It boots into Windows and is detected as 580s, but the driver doesn't load. Device Manager shows "Windows has stopped this device because it has reported problems. (Code 43)", the resolution is stuck at 800x600, and the artifacts are still present.

However, I should mention that I flashed this with nvflash for Windows. I'm going to reflash the stock BIOS, reboot, and reflash the Mars 2 BIOS via the USB boot disk to see if something went wrong.

I guess I'm paying for being lazy =/


----------



## CapitanPelusa

lol... seems I just bricked my 590. I seem to be an idiot.

I followed the same steps I have used in the past to flash it, and it seems this BIOS killed it, or the fact that I did it via nvflash for Windows killed it.

Now when I try to flash back the stock BIOS it says no EEPROM found, no matches found, and if I force the flash it gives me a "WARNING: Debug port will be disabled during EEPROM access" and does nothing.


----------



## CapitanPelusa

Well, after banging my head against the wall for a long time, I managed to unbrick the 590. I'm going to make the USB drive and attempt to flash it to the Mars 2 BIOS now via DOS.

Going to download the BIOSes again in case it was a corrupted or wrong file...


----------



## OccamRazor

Captain! It's probably the memory frequencies! They are too high for your card!
Perhaps it's better to wait for NiBiTor support, but if you make it work and see corruption again, lower the memory frequency in Afterburner and save it to start with Windows!
Fingers crossed for you!








Good luck, brother

Ed


----------



## Canis-X

You know... thinking about this... you probably should uninstall everything Nvidia and run Driver Fusion before the flash, then download the Nvidia install pack for the Mars 2... just a thought.


----------



## CapitanPelusa

OK, after 4 flashes (properly done this time), I can say these BIOSes aren't working on the Asus ENGTX590.

For some reason the first GPU's BIOS always gets ****ed up after the flash, and it's a big pain in the ass to get the GTX590 to work again.

All 4 attempts exhibited video corruption from the moment the BIOS posts all the way into Windows. All attempts resulted in Device Error 40 or 43 in Windows Device Manager.

I have reverted back to the stock BIOS until we can sort this out.

If anyone else happens to soft-brick their 590 while attempting this, let me know, because now I have the steps to follow to unbrick it, and it's a bit of a pain in the ass.


----------



## OccamRazor

Hmm... Perhaps these are European BIOSes; after all, they worked on a European card, PEDROPC's Gigabyte.
I remember reading something in the 590 flashing thread about not flashing European BIOSes to American cards and vice versa...
Let's see if we can find more Asus Mars II BIOSes and get to the bottom of this!

Ed


----------



## OccamRazor

And CapitanPelusa,
Thanks for your input!
Muchas gracias!









Ed


----------



## CapitanPelusa

Who has a North American Mars2? =(


----------



## Canis-X

Anyone have any ideas?


----------



## OccamRazor

Still waiting for a reply from the creator of NiBiTor. When I have the version of NiBiTor that can read the BIOSes, I'll mod them and try them on my 590. I'm also waiting for a check on the BIOSes to see if they're the same
as the ones Pedropc flashed his Gigabyte with (which worked); he's going to send them to me Friday!
I'll get to the bottom of this!









Ed


----------



## Pedropc

Final configuration of CPU, memory, and GPU frequency/voltage; with this setup I have been playing Battlefield 3 for several hours without any problems. Greetings.


----------



## OccamRazor

Hi Pedro, glad you're back!








Don't go over 0.988 V, it's dangerous; there were 590s that blew at 1.000 V!
Can you send me the BIOS please? I want to check if it's the same!









Saludos

Ed


----------



## Pedropc

Yes, I know it's not good to go past 0.988 V, but with 1.000 V it is stable at 700 MHz core. With the changes, the Mars II BIOS gives more performance but needs somewhat more voltage than the original BIOS; I guess that's because of the phase difference between a GTX590 and a Mars II. Greetings.

P.S. I've answered privately.


----------



## CapitanPelusa

As I told Occam via PM, I would like to update my report on the issue I had flashing my Asus GTX590 with the Mars 2 BIOS.

As I said, my Asus 590 didn't work at all with the Mars 2 BIOSes. But this is not my original 590.

I got my 590 in July 2011, and I flashed it a lot of times with many different BIOSes with volt mods etc.; I never had any problem, and I had mine overclocked with those BIOSes.

A month ago my brother spilled water on my rig; it fell on the 590 and blew it to hell, so I had to RMA it.

Asus sent me back a different card via RMA, and it seemed new and unused, but maybe it was refurbished. This is the card I tried flashing to Mars 2, and it simply will not work with those BIOSes; GPU1 always gets ****ed with the Mars 2 BIOS even though GPU2 doesn't present a problem.

However, a few days ago I tried flashing this RMA'd Asus 590 with ALL the other BIOSes I used to use on my original 590, and guess what: they don't work. They give me the same problem the Mars 2 flash did. The ONLY freaking BIOS this RMA'd card will take is the one posted on the Asus website as the BIOS update for the Asus 590.

So I guess my experience isn't indicative of what will happen with other Asus 590s. Maybe this card I got from RMA is remanufactured, ****ed up, or simply programmed not to take any other BIOSes; who knows.

So if anyone has an Asus 590, please try and see if it works for you. If you soft-brick it like I did, I can help you restore back to stock.

I hope someone can confirm it's working on Asus 590 variants, as mine seems to be a special case. Gonna try flashing more BIOSes onto this card to confirm that it's just ****ed up. It works perfectly as a stock card with the stock BIOS though; it just won't take any modded BIOSes =( Dunno why.


----------



## alancsalt

Don't know if this will help.
Cards get revised during manufacture. RAM can change, or other less major components.
Two of my Gainward GTX 580 cards use bios 70.10.17.00.01
I bought a third, thinking to tri-SLI. It uses BIOS 70.10.48.00.01.
If I flash it to 70.10.17.00.01, Windows says "Nvidia Card Not Found".

So what I thought:
What bios revision number does the Mars bios show? (Use GPUZ)
What bios revision number was originally on the 590s where this bios flash works? (Nibitor? Old GPUZ screenshots?)
What bios revision number was originally on the 590s where this bios flash did not work?

You could build up a table for those who come later showing which 590s will accept and which won't.....

Looks like Pedro has 70.10.37.00.05


----------



## CapitanPelusa

Man, I feel like such an idiot.

I didn't see which BIOS revision this RMA card had before I flashed, and I didn't back up the BIOS either. I just rushed to flash the Mars 2 BIOS, and when it failed and I bricked it, I just used the posted Asus BIOS update to revert back to stock, after a lot of trial and error.

I'm sorry I cannot provide detailed info =(. I can only say which BIOS my card currently has. It still won't take other BIOSes, but I'll keep trying.


----------



## CapitanPelusa

But definitely, if other Asus 590 users can confirm whether their cards take the Mars 2 BIOS, we can start from there and start comparing.


----------



## alancsalt

ASUS MARS - 70.10.37.00.05
GPU Device Id: 0x10DE 0x108B
GF110 Board - 102015P0

ASUS GTX590 - 70.10.37.00.01 - 613/1710
ASUS GTX590 - 70.10.37.00.02 - 613/1710

EVGA GTX590 - 70.10.37.00.90 - 630/1728
EVGA GTX590 - 70.10.37.00.91 - 630/1728
EVGA GTX590 - 70.10.37.00.92 - 630/1728
EVGA GTX590 - 70.10.37.00.93 - 630/1728
EVGA GTX590 - 70.10.42.00.90 - 630/1728

GAINWARD GTX590 - 70.10.37.00.02 - 608/1708

GIGABYTE GTX590 - 70.10.37.00.01 - 608/1708

MSI GTX590 - 70.10.37.00.02 - 608/1708

ZOTAC GTX590 - 70.10.37.00.02 - 608/1708

I am just guessing, but the EVGA with 42 in the revision number... I'm guessing that one wouldn't work. Just guessing. For those who try, please give your revision number...

Example:
-Version: 70.10.37.00.01 for GPU 1
-Version: 70.10.37.00.02 for GPU 2
For the flash to work, each version number has to be matched with that GPU, because one GPU is the master and one is the slave. Be aware they are different; you need both.
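alancsalt's guess above can be sketched as a small helper. The function names and the matching rule (the build field, "37" vs "42", must agree) are my illustration of his hypothesis, not a verified compatibility test:

```python
# Hypothetical helper illustrating the guess above: cross-flashing seems
# to work when the BIOS build field matches (70.10.37.xx.xx onto a "37"
# card) and to fail across builds ("37" BIOS onto a "42" card).
def bios_build(version):
    """Extract the build field from a version string like '70.10.37.00.01'."""
    return version.split(".")[2]

def likely_compatible(card_version, new_version):
    """True when the two versions share the same build field."""
    return bios_build(card_version) == bios_build(new_version)

print(likely_compatible("70.10.37.00.01", "70.10.37.00.05"))  # True
print(likely_compatible("70.10.42.00.90", "70.10.37.00.05"))  # False
```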


----------



## Pedropc

The bios I had my Gigabyte originally was; GIGABYTE GTX590 - 70.10.37.00.01 - 608/1708


----------



## Wogga

Hmm... now I don't see the point in flashing the Mars 2 BIOS. Mine can do [email protected] and I don't play 3DMark11 (at least for now =D)


----------



## Pedropc

Does anyone know the real Vcore measuring points on the GTX590? I want to measure it with a multimeter. Thanks and regards.


----------



## Canis-X

Anyone have any updates/thoughts/herps/derps?


----------



## OccamRazor

I cannot try it right now because I'm still waiting for hardware to arrive; my 590 is sitting on my desk looking at me as I type...








Confirmed the BIOS is 70.10.42.00.02, but I'm still gonna try!








@CapitanPelusa: you can see which BIOS version your card has on the back of the PCB; there's a black sticker with the BIOS version on it!
Can you confirm whether it's the .42, as alancsalt is guessing?
@alancsalt: there's that guy TiN (Russian overclocker); he knows for sure where the voltage measuring points are on the 590. Do you know a way to contact him?

thanks guys

Ed


----------



## alancsalt

Quote:


> Originally Posted by *OccamRazor*
> 
> i cannot try it right now because im still waiting for hardware to arrive, my 590 is looking at me as i type on my desk...
> 
> 
> 
> 
> 
> 
> 
> 
> Confirmed bios is 70.10.42.00.02 but still gonna try!
> 
> 
> 
> 
> 
> 
> 
> 
> @ CapitanPelusa: you can see which bios version your card is on the back of the pcb, there's a black sticker with bios version on it!
> can you confirm if its the .42 as alancsalt is guessing?
> @alancsalt: theres that guy TiN (russian overclocker) he knows for sure where the voltage measuring points are on the 590, do you know a way to contact him?
> thanks guys
> Ed


Unfortunately not, but I'll put a post in the voltmod section to see if anyone there knows anything. There were big arguments about the voltages back around page 110 of this thread...


----------



## CapitanPelusa

70.10.42.00.02 is the BIOS that my RMA'd Asus GTX590 shipped with.


----------



## OccamRazor

Thanks alancsalt for your help and input!
Appreciated!









Ed


----------



## alancsalt

Does anyone know voltage measuring points for GTX590?

Couple of answers talking about how to find them...join in, ask questions..


----------



## toX0rz

So I haven't checked here for a while.

What exactly does the Mars II BIOS do? Can't you simply overclock the 590 with the stock BIOS?


----------



## alancsalt

Here's an answer from just_nuke_em on the voltage measuring points..

http://www.overclock.net/t/1307373/does-anyone-know-voltage-measuring-points-for-gtx590#post_18188275


----------



## Wogga

Quote:


> Originally Posted by *toX0rz*
> 
> So I haven't checked here for a while.
> What exactly does the Mars II BIOS do? Can't you simply overclock the 590 with the stock BIOS?


"turns off" the PDL in some benchmarks (at least in 3DMark11 and Unigine Heaven)


----------



## OccamRazor

thanks alancsalt!








As lots of members have already stated in this thread since last year,
the voltage deviation/fluctuation on the 590 is 0.030 V (as stated by Nvidia), so if your card is set to 1.000 V, when you actually measure it, it's 1.030 V!
(Verified by Masked.) So for anyone who wants to verify it themselves, here's alancsalt with the how and the where!
Thanks man!








@PedroPc here you go, man!









Ed


----------



## Pedropc

OK, thank you very much. I bought a new EVGA backplate to replace the EK one; when it arrives and I swap it, my intention is to measure with the multimeter to find the real voltage deviation, and to check whether power management is different between the GTX590 BIOS and the Mars II BIOS. Greetings.


----------



## OccamRazor

Hi Pedro,
What was wrong with the EK? I've got one on my 590...









Saludos

Ed


----------



## Pedropc

There's nothing wrong; the EK one is great. I just like the EVGA one better. Greetings.


----------



## ProfeZZor X

Sent my RMA'd card out on Saturday, it arrived at EVGA on Monday, and got the replacement this morning (Wednesday)... Damn that was lightning fast.

I just hope that it'll work this time. If not, then you can only imagine the frustration and anger I'll be going through.

If anyone has any suggestions on what to do or what not to do before I test this replacement card out, I'm all ears.


----------



## ProfeZZor X

Well, I did another preliminary test last night and the EVGA logo was still blinking. The replacement that I received was in practically new condition, so I'm wondering if it might be something else, other than the unit itself: the motherboard (R4E), the power cables (I had them extended), or even the memory itself (Corsair Vengeance low profile).

I refuse to give up after coming so far, so I guess with a little more patience and the right people to guide me, I'll eventually get this rig up and running...


----------



## alancsalt

Quote:


> Originally Posted by *OccamRazor*
> 
> thanks alancsalt!
> 
> 
> 
> 
> 
> 
> 
> 
> As lots of members already stated in this thread since last year,
> the voltage deviation/fluctuation on the 590 is .03 (as stated by Nvidia), so if your card is at 1.000v actually if you measure it, its 1.030v!
> (verified by Masked) so anyone that want to verify by yourself, heres alancsalt with the how and the where!
> thanks man!
> 
> 
> 
> 
> 
> 
> 
> 
> @PedroPc aqui lo tienes hombre!
> 
> 
> 
> 
> 
> 
> 
> 
> Ed


just_nuke_em deserves all credit, honestly.


----------



## wermad

Bought a couple of pre-owned ones. Planning to get them nice and wet soon


----------



## MKHunt

Quote:


> Originally Posted by *wermad*
> 
> Bought a couple of pre-owned ones. Planning to get them nice and wet soon


I have a Koolance Rev 1.0 (final design) block BNIB collecting dust. I'll let it go for like... $80 + shipping? The box even has the Koolance tape on it still. Nickel + acetal.


----------



## alancsalt

Quote:


> Originally Posted by *MKHunt*
> 
> I have a Koolance Rev 1.0 (final design) block BNIB collecting dust. I'll net it go for like... $80+ship? Box even has the Koolance tape on it still. Nickel + acetal.


Knowing Wermad, he'll want two!


----------



## wermad

Picked up two ek blocks. Not my first choice but the price was pretty good. Just waiting on things to arrive in the next few days.


----------



## MKHunt

Quote:


> Originally Posted by *wermad*
> 
> Picked up two ek blocks. Not my first choice but the price was pretty good. Just waiting on things to arrive in the next few days.


EK blocks are nice; I'm running one on my 590. Flow is a little better than the Koolance, though it runs about 6-8°C higher at full load. Just a matter of channel and fin size, really. And 46 vs 52°C isn't really that big of a deal lol.

Plus with the EK backplate it looks amazing. Pretty sure this pic has been posted in here at least 3 times before, but yeah.


----------



## wermad

nice


----------



## OverClocker55

Would a GTX 580 3GB come close to the same FPS as the 590? Anyone think that if I offered a trade anyone would accept?


----------



## heyskip

So with the recent developments my curiosity got the better of me, and I flashed the Mars II BIOS to my EVGA GTX590. I'll give everyone a rundown of my experience in case you want to do the same.

After the flash I had to reinstall the graphics drivers, as Windows didn't recognise the new GPU. My audio output (DVI-HDMI cable to a video processor, then the amp) was inoperative after this; that might have been fixed by moving the cable to a different DVI output, but I didn't test it, as other issues were bothering me. The latest version of Afterburner was unable to adjust the voltage, so I installed the Asus GPU Tweak tool. This allowed voltage adjustment, which I set to 1.000 V. The Asus tool reported 1.000 V constantly, but GPU-Z showed the voltage moving between 0.963 V at idle and a high of 1.038 V.

I did do a quick graphics only run of 3DMark 11 at 750mhz 1.000v which seemed to be PDL free and scored X3636. This score was encouraging as my previous best was X3607 graphics only at 740mhz 0.988v. I stopped playing here as the voltage spikes to 1.038v were a bit worrying.

I reflashed back to my 0.988 V BIOS without drama. Once NiBiTor can read these Mars II BIOSes and Afterburner can adjust the voltage, I will attempt another reflash. But at this stage I don't trust the Asus tool to accurately adjust/monitor the voltage.

My EVGA card originally had the 70.10.37.00.90/91 or 70.10.37.00.92/93 (not quite sure as I never saved the original) bios but has been running a modified 70.10.42.00.90/91 bios for a while now.

I hope this info helps anyone who is considering this procedure.

EDIT: Another thing I noticed: HDCP was not supported with the original BIOS, but it was with the Mars II BIOS.


----------



## alancsalt

Interesting to me that you were able to run a 42 BIOS on a 37 card. On the 580, that rendered one of my cards inoperable. I would like to hear worked/not-worked reports from anyone else who has cross-flashed between these revisions.


----------



## Canis-X

Nice! Thanks Heyskip! So, we know that it will work on an EVGA card, has anyone else tried it on an ASUS? I haven't had the time myself to attempt it and then revert in case it borked my card like PedroPc's.


----------



## alancsalt

Where does it say it borked PedroPc's card? I haven't seen that posted. I thought it was working fine for PedroPc, hence the interest....


----------



## OccamRazor

No, it didn't bork Pedro's card; it borked CapitanPelusa's ASUS, though that's recoverable with ASUS Update. It is interesting that a 37 card took a 42 BIOS: CapitanPelusa's ASUS is a 42, and when he tried to flash the MARS II BIOS, which is a 37, it failed. Is there a way to edit the BIOS to change the 42 into a 37 so the card accepts it? Is there any other BIOS editor besides NiBiTor?
I've submitted the MARS II BIOS and sent a lot of messages to mkvtech kindly asking whether the next NiBiTor version might be able to edit this BIOS, but no answer. That was a week ago...









Heyskip can you share your intel?









Ed


----------



## alancsalt

Quote:


> Originally Posted by *OccamRazor*
> 
> No, it didn't bork Pedro's card; it borked CapitanPelusa's ASUS, though that's recoverable with ASUS Update. It is interesting that a 37 card took a 42 BIOS: CapitanPelusa's ASUS is a 42, and when he tried to flash the MARS II BIOS, which is a 37, it failed. Is there a way to edit the BIOS to change the 42 into a 37 so the card accepts it? Is there any other BIOS editor besides NiBiTor?
> I've submitted the MARS II BIOS and sent a lot of messages to mkvtech kindly asking whether the next NiBiTor version might be able to edit this BIOS, but no answer. That was a week ago...
> 
> 
> 
> 
> 
> 
> 
> 
> Heyskip can you share your intel?
> 
> 
> 
> 
> 
> 
> 
> 
> Ed


Only Fermi BIOS Editor 1.55, and that says "Unsupported Device"......

I suspect a hardware revision between 37 and 42, but I don't really know...

Now, this YouTube video says you can update the MARS II BIOS with GPU Tweak, but I can find no BIOS available for download on any ASUS site. So is GPU Tweak the only way to get BIOS updates for the MARS II?


----------



## OccamRazor

Let's see what heyskip has to say! He said he was running a 42 BIOS on a 37 card, so if that's true and the BIOSes are interchangeable, there was apparently no hardware revision. If we can confirm that, then with a revised version of NiBiTor (if and when they do it) maybe we can do the flashing! I've got the 42 version; my hardware only arrives Monday, so only then can I try to flash!

Cheers

Ed


----------



## Pedropc

Hi, my card has been working perfectly for a month or so: 700MHz core, 1.00v. It's a Gigabyte BIOS and it had the 37 revision.

When you change the BIOS and put in the MARS II one, GPU Tweak finds a BIOS update, and it updates without any problem.


----------



## Canis-X

Quote:


> Originally Posted by *alancsalt*
> 
> Where does it say it borked Pedropc's card? I have not seen that posted. Thought it was working fine for pedropc, hence the interest....


Quote:


> Originally Posted by *OccamRazor*
> 
> No, it didn't bork Pedro's card; it borked CapitanPelusa's ASUS, though that's recoverable with ASUS Update. It is interesting that a 37 card took a 42 BIOS: CapitanPelusa's ASUS is a 42, and when he tried to flash the MARS II BIOS, which is a 37, it failed. Is there a way to edit the BIOS to change the 42 into a 37 so the card accepts it? Is there any other BIOS editor besides NiBiTor?
> I've submitted the MARS II BIOS and sent a lot of messages to mkvtech kindly asking whether the next NiBiTor version might be able to edit this BIOS, but no answer. That was a week ago...
> 
> 
> 
> 
> 
> 
> 
> 
> Heyskip can you share your intel?
> 
> 
> 
> 
> 
> 
> 
> 
> Ed


Yep, I meant CapitanPelusa's, not PedroPc's... sorry for the confusion!!


----------



## OccamRazor

no problemo Canis-X!








So, has anyone else flashed successfully across different BIOS revisions, e.g. a card with 70.10.37.00.90 flashed to 70.10.42.00.90 (the 37 and 42 being different BIOS revisions)?
Can someone shine a light on this?

Ed


----------



## OccamRazor

Perhaps PedroPc is on to something. I found a size difference between the MARS II BIOS we can download and the BIOS I received from PedroPc, so the BIOS he sent me must be the update from ASUS GPU Tweak!
Maybe there are two ASUS MARS II BIOSes!








Can you confirm that, Pedro? I'll send you a PM in a minute!

Ed


----------



## OccamRazor

Strange... NiBiTor doesn't work on my computer. Even with administrator privileges it opens the BIOS files, but everything is blanked out: no values, speeds or voltages...
I tried earlier versions, but it's the same thing...


----------



## wermad

First 590 came in. Sadly, USPS "misrouted" my 2nd 590 and it's across town now. Hoping it gets delivered tomorrow.


----------



## alancsalt

Quote:


> Originally Posted by *OccamRazor*
> 
> Strange... NiBiTor doesn't work on my computer. Even with administrator privileges it opens the BIOS files, but everything is blanked out: no values, speeds or voltages...
> I tried earlier versions, but it's the same thing...


NiBiToR...
Open the file,
go to Tools,
use Fermi Clocks and Fermi Voltages.

It can read most BIOSes, but not the MARS II.


----------



## CapitanPelusa

So, since I can't flash to the MARS II BIOS at the moment (rev 42), and I can't seem to flash any of the old 37 modded BIOSes I've got, I want to tweak this 42 BIOS to increase the voltage a bit, to 0.963v or 0.988v.

In the Fermi Voltage Editor inside NiBiTor I've got:

Setting 0: 0 0.8755V 1 0.8755V
Setting 1: 0 0.8755V 1 0.9255V
Setting 2: 0 0.913V 1 0.963V

What do I need to change?


----------



## alancsalt

Quote:


> Originally Posted by *CapitanPelusa*
> 
> So, since I can't flash to the MARS II BIOS at the moment (rev 42), and I can't seem to flash any of the old 37 modded BIOSes I've got, I want to tweak this 42 BIOS to increase the voltage a bit, to 0.963v or 0.988v.
> In the Fermi Voltage Editor inside NiBiTor I've got:
> Setting 0: 0 0.8755V 1 0.8755V
> Setting 1: 0 0.8755V 1 0.9255V
> Setting 2: 0 0.913V 1 0.963V
> What do I need to change?


See:

http://www.overclock.net/t/974902/nvidia-gtx-590-owners-club/1080#post_13269285

for 590-specific instructions.

See

FERMI BIOS - NiBiToR - Fermi BIOS Editor guide.pdf 1468k .pdf file


For general instructions...


----------



## wermad

2nd 590 was dropped off just now!


----------



## heyskip

Quote:


> Originally Posted by *OccamRazor*
> 
> Let's see what heyskip has to say! He said he was running a 42 BIOS on a 37 card, so if that's true and the BIOSes are interchangeable, there was apparently no hardware revision. If we can confirm that, then with a revised version of NiBiTor (if and when they do it) maybe we can do the flashing! I've got the 42 version; my hardware only arrives Monday, so only then can I try to flash!
> Cheers
> Ed


My understanding was that you could flash up from 37 to 42, just not the other way. But yes, it definitely worked for me. I can't remember the exact version of the original BIOS, but I think it was 70.10.37.00.90/91. The 42 BIOS I modified was an EVGA one, which may have helped the process. There's a lot of good info over in the Flashing & Overclocking thread; towards the end of that thread is where we first realised that NVIDIA had finally removed the voltage lock for the 590s in the 300-series drivers.

With NiBiTor, I think you can donate on the home page and get early access to the new version. Whether he has added MARS II support in that version, I'm not sure. I may donate a little today to check it out.

Quote:


> Originally Posted by *CapitanPelusa*
> 
> So, since I can't flash to the MARS II BIOS at the moment (rev 42), and I can't seem to flash any of the old 37 modded BIOSes I've got, I want to tweak this 42 BIOS to increase the voltage a bit, to 0.963v or 0.988v.
> In the Fermi Voltage Editor inside NiBiTor I've got:
> Setting 0: 0 0.8755V 1 0.8755V
> Setting 1: 0 0.8755V 1 0.9255V
> Setting 2: 0 0.913V 1 0.963V
> What do I need to change?


Change both values under Setting 2 to the maximum voltage you want to run. I have mine at 0.988v.

It's not necessary, but I also change both Setting 0 values to 0.825v so the card runs a couple of degrees cooler on the desktop.


----------



## mywifeispist

I ran two 590s with 3x 27" monitors and had reflashed my cards as well. I ran them at 750 on the core with 0.988v as the max voltage, but found the sweet spot to be 740 at that voltage. It really made no difference between those clock speeds even when I flashed the voltage to 1.025v; the cards ran fine at those voltages, but it really wasn't worth the risk, so I settled at 0.988v permanently. About 4-5 months ago I sold my 590s and got 2x 680s, and I'm glad I did; the 680s are better by far, IMO. I play BF3 and that game is just sick with 680s. The two 590s were good and I liked them, but I have absolutely no regrets whatsoever about getting rid of them for these 680s. I wish you all luck in stretching the life out of those cards.


----------



## heyskip

Quote:


> Originally Posted by *OccamRazor*
> 
> Perhaps PedroPc is on to something. I found a size difference between the MARS II BIOS we can download and the BIOS I received from PedroPc, so the BIOS he sent me must be the update from ASUS GPU Tweak!
> Maybe there are two ASUS MARS II BIOSes!
> 
> Can you confirm that, Pedro? I'll send you a PM in a minute!
> Ed


Hey Occam, do you mind posting up the pair of BIOSes you got from PedroPc?


----------



## OccamRazor

Thanks alancsalt, I'll try it when I get home!
@heyskip: give me your mail and I'll send it to you!









Ed


----------



## wermad

2nd 590 not booting up. Dropped in the first one and she lights up just fine. Contacting the seller (eBay).


----------



## wermad

Quote:


> Originally Posted by *wermad*
> 
> 2nd 590 not booting up
> 
> 
> 
> 
> 
> 
> 
> 
> Dropped in the first one and she lights up just fine. Contacting seller (ebay)


The eBay seller is refunding me, and the search begins for #2. Tested the first 590 and it works awesome!


----------



## OccamRazor

@CapitanPelusa and heyskip:
OK guys, let's hear your feedback!!!








@wermad: too bad about the fluke, hope you find another one!









Ed

P.S. Does anyone know how to contact Mavke, a.k.a. Johan Leroux at mkvtech.net, the creator of NiBiTor? I tried, and heyskip did as well, but to no avail!


----------



## mywifeispist

Quote:


> Originally Posted by *alancsalt*
> 
> See:
> http://www.overclock.net/t/974902/nvidia-gtx-590-owners-club/1080#post_13269285
> for 590 specific instructions
> See
> 
> FERMI BIOS - NiBiToR - Fermi BIOS Editor guide.pdf 1468k .pdf file
> 
> For general instructions...


There could be write protection built into the ASUS BIOS... worth a try, as ASUS is famous for doing this with their BIOSes. I believe there are posts earlier in this thread that show you how to disable it; it's easy, I just can't remember the exact command off the top of my head.


----------



## OccamRazor

You're probably right. It's strange that CapitanPelusa cannot flash any other BIOS; perhaps they add the protection to all refurbished ones. Mine has the same BIOS as his but isn't refurbished. On Monday I'll see if I'm in the same boat!
Let's find out about that command to disable the protection!








Thanks for the hint, mywifeispist!









Ed


----------



## mywifeispist

Quote:


> Originally Posted by *OccamRazor*
> 
> You're probably right. It's strange that CapitanPelusa cannot flash any other BIOS; perhaps they add the protection to all refurbished ones. Mine has the same BIOS as his but isn't refurbished. On Monday I'll see if I'm in the same boat!
> Let's find out about that command to disable the protection!
> 
> Thanks for the hint, mywifeispist!
> 
> 
> 
> 
> 
> 
> 
> 
> Ed


Here is a link: http://www.techpowerup.com/forums/showthread.php?t=119955
Scroll down; just above step 6 is the command to disable the write protect on the GPU BIOS. Use the latest NVFlash for Windows and run CMD as admin... this should help, as it worked for me in the past.
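For anyone following along, that guide boils down to a short nvflash sequence. The flag names below are from the Fermi-era nvflash and are an assumption on my part; double-check them against `nvflash --help` on your version. The `run=echo` prefix makes this a dry run that only prints the commands; clear it to actually flash:

```shell
# Hypothetical Fermi-era nvflash sequence -- flag names are assumptions,
# verify with `nvflash --help` before flashing anything.
# run=echo prints each command instead of executing nvflash (dry run).
run=echo

$run nvflash --save backup.rom     # always back up the current BIOS first
$run nvflash --protectoff          # disable the EEPROM software write protect
$run nvflash -4 -5 -6 modded.rom   # flash, overriding ID/version mismatch checks
$run nvflash --protecton           # re-enable the write protect afterwards
```

Run from an elevated command prompt, and keep `backup.rom` somewhere safe in case you need to flash back.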


----------



## CapitanPelusa

Quote:


> Originally Posted by *mywifeispist*
> 
> Here is a link: http://www.techpowerup.com/forums/showthread.php?t=119955
> Scroll down; just above step 6 is the command to disable the write protect on the GPU BIOS. Use the latest NVFlash for Windows and run CMD as admin... this should help, as it worked for me in the past.


Erm... this is the usual write-protect-off command used whenever you flash a BIOS. It has nothing to do with the problem I experienced; without that command I wouldn't even have been able to flash the MARS II BIOSes onto the GTX 590.

I WAS able to flash both GPU EEPROMs with the MARS II BIOS, but GPU1's EEPROM never took it well for some reason. On reboot, the video output from POST through to Windows was corrupted and showed incredible amounts of artifacts. The device wouldn't start in Windows; in short, the flash rendered the card unusable.

What was interesting, though, was that when I tried to flash back to the stock 590 BIOS, nvflash reported that the first GPU's EEPROM was not supported, so it technically soft-bricked the card, since I couldn't flash it back with any BIOS. I was able to find a workaround and restore back to 590 stock, but for some reason the MARS II BIOSes posted in TechPowerUp's BIOS database do not work on my rev 42 ASUS GTX 590.


----------



## Michi

Hi!

I have a GTX 590, and since installing the latest NVIDIA 306.23 drivers it throws me back to the desktop while I'm playing.

What drivers are you using?

Could my GPU be the problem?

Thanks!

Thanks!


----------



## mywifeispist

I had a similar issue with one of my 590s, so I flashed back to the stock BIOS. I opened the stock BIOS in NiBiTor side by side with the BIOS I wanted, copied the settings over to a copy of the stock BIOS, and that worked for me. I just thought at first that you were having a write-protect issue. Flashing these sometimes comes with risks, so good luck to you.


----------



## mojobear

Hey guys,

has anyone tested the new 306.23 WHQL drivers with your lovely GTX 590s? I have them in quad SLI and I find performance has gone down in quite a few games, with peak usage dropping a lot. I'm on the 301.24 drivers, and although they have some issues, they have worked the best for me.

P.S. Also reading up on the MARS II BIOS... interesting! I don't do 3DMark, and with my EVGA BIOS I can get to 0.988v @ 710MHz for most games with no PDL or other downclocking of the GPUs.


----------



## mojobear

Some examples: Batman: AC FPS went down about 15%; Deus Ex: Human Revolution was at 80-100% GPU usage, now down to 40-50%, and it even crashed on me :S


----------



## heyskip

Well, I've donated again to the author of NiBiTor. It's bloody difficult, as the ChipIn widget on his front page doesn't show up in certain browsers; it seems to be OK at work, though. I dare say he'd get a lot more donations if people could do it easily.

When I get access to the latest version, I'll see if it reads the MARS II BIOS.


----------



## OccamRazor

Hi mate!
I received a message from Mavke yesterday; I don't think we'll get a new version of NiBiTor soon...

I'll PM you in a minute!
Have you tried the new BIOS already?

Cheers mate

Ed


----------



## wermad

Anyone interested in offloading their 590, send me a PM; I'm looking for a 2nd one.


----------



## OccamRazor

I might have one for you, but you're too far away; customs and transport fees might cost a lot... The guy is asking around $450 (€350) for it. It's an ASUS and was never used; I'm tempted to buy it for a quad setup!









Cheers

Ed


----------



## wermad

Quote:


> Originally Posted by *OccamRazor*
> 
> I might have one for you, but you're too far away; customs and transport fees might cost a lot... The guy is asking around $450 (€350) for it. It's an ASUS and was never used; I'm tempted to buy it for a quad setup!
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> Ed


Nice, tempting, but it's going to end up costing a lot. I bought one and it was DOA, and by the time I got this one, all the ones on the OCN marketplace had sold.


----------



## Canis-X

Anyone make any headway on the MARSII flash?


----------



## wermad

Guys running quad: any concerns running 4.5GHz on my 2700K? Mostly gaming (Surround). My old and trusty P67 WS doesn't like the nice clocks I got with the EVGA Z77 I had.


----------



## Canis-X

I'm running my 3930K at 5GHz (HT enabled) at 6144x1080 Surround playing BF3. No real problems, but running with HT disabled gives less video stutter.


----------



## wermad

Quote:


> Originally Posted by *Canis-X*
> 
> I'm running my 3930K at 5GHz (HT enabled) at 6144x1080 Surround playing BF3. No real problems, but running with HT disabled gives less video stutter.


Thanks Canis. I had it easy with the Z77, so I have to work a bit harder with this P67. My second card should finally arrive some time next week, so I'll know for sure by then.


----------



## Canis-X

Fantastic!!!!

http://files.enjin.com.s3.amazonaws.com/94425/modules/forum/attachments/bf3_2012_06_26_00_00_31_724_1340602082.png


http://files.enjin.com.s3.amazonaws.com/94425/modules/forum/attachments/bf3_2012_06_26_00_00_37_521_1340602118.png


----------



## wermad

Quote:


> Originally Posted by *Canis-X*
> 
> Fantastic!!!!
> http://files.enjin.com.s3.amazonaws.com/94425/modules/forum/attachments/bf3_2012_06_26_00_00_31_724_1340602082.png
> 
> http://files.enjin.com.s3.amazonaws.com/94425/modules/forum/attachments/bf3_2012_06_26_00_00_37_521_1340602118.png


That's awesome. The seller just shipped my #2 card; it arrives Monday and I can't wait to get it up and running. I paid a bit more for this one, but it's only a couple of months old.


----------



## wermad

This is some serious bad luck. Popped in the new card, which is just two months old, and only one core is being detected. My first ASUS 590 is working fine and dandy. This is my second defective card. What a waste.


----------



## Canis-X

Crap man... sending some luck your way!!


----------



## rush2049

Quote:


> Originally Posted by *wermad*
> 
> This is some serious bad luck. Popped in the new card, which is just two months old, and only one core is being detected. My first ASUS 590 is working fine and dandy. This is my second defective card. What a waste.


At that point I would be concerned that something else is killing the cards.

Maybe your first 590 just has slightly better power protection and can withstand whatever is causing the issue with the other cards...


----------



## wermad

Quote:


> Originally Posted by *rush2049*
> 
> At that point I would be concerned that something else is killing the cards.
> Maybe your first 590 just has slightly better power protection and can withstand whatever is causing the issue with the other cards...


I bought Hardstyle's 590, which came with the block on. That card is from the initial launch and has never let me down; I'd have imagined it had a better chance of being bad with the heavy block on it. I've removed the ASUS and am installing drivers for the EVGA (the newest one). I paid a bit more than normal for this EVGA since it was fairly new, but we'll see what happens in a bit.

Quote:


> Originally Posted by *Canis-X*
> 
> Crap man... sending some luck your way!!


Thanks


----------



## wermad

Well, this is super weird and I'm a little dumbfounded. Here's what I got (from initial steps):

- ASUS in PCIe x16 #1 & EVGA in PCIe x16 #2 = 3 GPUs detected.

- Removed the EVGA and ran the ASUS in PCIe x16 #1: both cores detected.

- EVGA in PCIe x16 #1 only: both cores detected.

OK, so at that point I was thinking my Steam SSD on the Marvell controller was stopping PCIe x16 #2 from working right. I changed it to the P67 controller; no change. Then I figured I had a bad motherboard. Good excuse to go X79. I did some more testing, as I swore I'd seen the ASUS work in both slots without an issue:

- ASUS in PCIe x16 #2: both cores detected.

- EVGA in PCIe x16 #1 and ASUS in #2: *four* cores detected!

But why is the EVGA not working right in PCIe x16 #2 when the ASUS is???

I'm going to do a few more rounds of testing, but if anyone has any more suggestions, I'd appreciate them. Obviously, if it works with the EVGA in PCIe x16 #1 and the ASUS in PCIe x16 #2, I'll leave it there, but it's very strange.

I did clear the CMOS, btw.


----------



## alancsalt

Quote:


> Originally Posted by *wermad*
> 
> Well, this is super weird and I'm a little dumbfounded. Here's what I got (from initial steps):
> - ASUS in PCIe x16 #1 & EVGA in PCIe x16 #2 = 3 GPUs detected.
> - Removed the EVGA and ran the ASUS in PCIe x16 #1: both cores detected.
> - EVGA in PCIe x16 #1 only: both cores detected.
> OK, so at that point I was thinking my Steam SSD on the Marvell controller was stopping PCIe x16 #2 from working right. I changed it to the P67 controller; no change. Then I figured I had a bad motherboard. Good excuse to go X79. I did some more testing, as I swore I'd seen the ASUS work in both slots without an issue:
> - ASUS in PCIe x16 #2: both cores detected.
> - EVGA in PCIe x16 #1 and ASUS in #2: *four* cores detected!
> But why is the EVGA not working right in PCIe x16 #2 when the ASUS is???
> I'm going to do a few more rounds of testing, but if anyone has any more suggestions, I'd appreciate them. Obviously, if it works with the EVGA in PCIe x16 #1 and the ASUS in PCIe x16 #2, I'll leave it there, but it's very strange.
> I did clear the CMOS, btw.


Do the BIOS version numbers match, and do both cards have the same settings?
This may not help you, but it may advance my understanding of how these BIOS versions affect things.
I have three 580 GPUs, and one had a different BIOS revision and different timings. They didn't default to the lowest clocks, but interfered with each other until the BIOS was edited to the same clocks.
Just wondering if this is similar.


----------



## MKHunt

I wonder if the VGA BIOSes are slightly different, and the ASUS is better equipped to handle double-stacked NF200 chips? The 590's cores are linked by an NF200, and the Z68 chipset uses an NF200 (or equivalent) to fake having 32 lanes. The boards are all reference and should be identical, as long as they're the same revision.


----------



## wermad

Quote:


> Originally Posted by *alancsalt*
> 
> Do the BIOS version numbers match, and do both cards have the same settings?
> This may not help you, but it may advance my understanding of how these BIOS versions affect things.
> I have three 580 GPUs, and one had a different BIOS revision and different timings. They didn't default to the lowest clocks, but interfered with each other until the BIOS was edited to the same clocks.
> Just wondering if this is similar.


Could be. The EVGA is a Classified; I think all the EVGA ones are. Everything is on default clocks atm. I picked up the P67 WS as I know it can do x16 on both blue slots as long as the grey slots aren't populated. I ran 3DMark 11 and the EVGA scored a little more than my ASUS. It's apparently a couple of months old according to the seller, but I need to check the serial; I'm thinking EVGA, like MSI, has a way to work out the date from the serial number. Then again, the 590 is getting close to EOL, so it might be stock from a few months back, if not last year. The card looks in pristine condition.
Quote:


> Originally Posted by *MKHunt*
> 
> I wonder if the VGA BIOSes are slightly different, and the ASUS is better equipped to handle double-stacked NF200 chips? The 590's cores are linked by an NF200, and the Z68 chipset uses an NF200 (or equivalent) to fake having 32 lanes. The boards are all reference and should be identical, as long as they're the same revision.


Thinking the same thing. Just very weird, though I suspect the NF200 on the P67 WS is not getting along with the EVGA, or with the NF200 on the EVGA. The second slot (where the EVGA isn't picking up both cores) is run by the NF200, and the top one by the CPU. I do get x16 on each card in GPU-Z.

Edit:

Checked Afterburner and the EVGA shows 630MHz. The ASUS site says their 590 is clocked at 612MHz. I'm sure I can get both cards (all four cores, to be specific) to 630MHz and hopefully a little more.

I'm thinking of making the jump back to X79 or picking up a WS Z77. We'll see how it goes. I'll be monitoring the EVGA for the next few days; I did notify the seller early on to cover myself.


----------



## wermad

Here they are guys


----------



## Canis-X

Such beauties they are!!


----------



## wermad

Quote:


> Originally Posted by *Canis-X*
> 
> Such beauties they are!!


I ordered an EK bridge system (expensive!) to tie them together. It's also more structurally secure, as the bottom card was sagging a bit and the SLI fittings are prone to leak with any movement. I'm also seeing 5°C more on one of the cores, so I suspect that block isn't making good contact. I've just configured Surround but have yet to game; I'm excited to see how it goes.


----------



## MKHunt

A thermal delta of 2-7°C between the cores is perfectly normal. I've mounted blocks six times, and every time I saw the same delta. It's just because the fins-to-water temperature difference is smaller at the second core, since the water has already absorbed heat from the first core. Even the stock coolers show this difference.


----------



## wermad

Quote:


> Originally Posted by *MKHunt*
> 
> A thermal delta of 2-7°C between the cores is perfectly normal. I've mounted blocks six times, and every time I saw the same delta. It's just because the fins-to-water temperature difference is smaller at the second core, since the water has already absorbed heat from the first core. Even the stock coolers show this difference.


Found the issue: either EK didn't thread their screw holes deep enough, or they should have included shorter screws. Both blocks have a lot of play at the washer on all the mounting screws, and it's worse on the second core. I found some spare 6mm screws (versus EK's 8mm) and got one block done, and the temps went down drastically: the first core dropped 5°C and the second 10°C! I don't have enough for the second block, so I'll have to leave that for now. I'll be ordering some custom screws from mcmastercarr.com.

Edit: Home Depot has the screws I need, so I'll make a trip there today.


----------



## OccamRazor

What temps do you have? With one 590 and an EK block I get 36°C idle and 60°C average, with 25°C ambient temps, and this with a custom watercooling system!


----------



## OccamRazor

OK, just tested the ASUS MARS II BIOS, and it's a no-go...

Screen corruption starting from the BIOS and on into Windows, and the system didn't recognise the card as a 590.
I had to flash back to the original BIOS. So, an ASUS with BIOS version 42 cannot be flashed with this BIOS. You lucky bastards with BIOS revision 37 can try, as it will surely work; those of us with rev 42 will have to wait for NiBiTor support to edit the BIOS and make it work with our cards!


----------



## MKHunt

Quote:


> Originally Posted by *OccamRazor*
> 
> What temps do you have? With one 590 and an EK block I get 36°C idle and 60°C average, with 25°C ambient temps, and this with a custom watercooling system!


I think you need to examine the mounting.


----------



## OccamRazor

I've got a passive watercooling system; that's why my temps are high compared with yours! What's your ambient temp?
Cheers

Ed


----------



## wermad

OK, it turns out the screws that worked are 8mm but flat-head, so the threads are slightly shorter (~6.5-7mm), which is just enough to grab. It seems no one makes a 7mm Phillips-head, and I've already placed an order with mcmastercarr.com for 6mm Phillips-head screws; if they don't fit, I'll order the flat-head ones. The second block wasn't as bad, and I was able to turn most of its screws a bit further to address the issue.

Temps on all four cores are now stable at ~34°C each. It's a bit hot in my area, so these will run a tad high. I haven't loaded the cards yet, but I'm hoping they stay under 50°C.


----------



## MKHunt

Quote:


> Originally Posted by *OccamRazor*
> 
> I've got a passive watercooling system; that's why my temps are high compared with yours! What's your ambient temp?
> Cheers
> Ed


24C ambient.

Passive you say? That sounds nearly silent. Do you have a link to your setup?


----------



## OccamRazor

No, but I can take pics and post them, and yes, it's nearly silent!

It's a double Zalman Reserator (not a double loop) with 600 l/h pumps on each reservoir, one for the CPU and the other for the VGA. The catch is that they're outside the house: I drilled holes in the wall to pass the tubes through and made a housing for the reservoirs! You'll understand better with the pictures!









Cheers

Ed


----------



## MKHunt

Quote:


> Originally Posted by *OccamRazor*
> 
> No, but I can take pics and post them, and yes, it's nearly silent!
> 
> It's a double Zalman Reserator (not a double loop) with 600 l/h pumps on each reservoir, one for the CPU and the other for the VGA. The catch is that they're outside the house: I drilled holes in the wall to pass the tubes through and made a housing for the reservoirs! You'll understand better with the pictures!
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> Ed


Oh, actually I get it from the text. Unfortunately, during the summer here we hit a solid 100°F+ with alarming regularity.

But that system is super awesome. During the winter I drop my fan volts to 5 and make ducting from the window to the case; keeps it quiet and super cold.


----------



## OccamRazor

Well, we get those temperatures here in the summertime as well; that's why I have a 220v 16" fan on the reservoirs!!! But in wintertime, with temperatures dropping to near 45°F, it's excellent for the system!
I remember a post some time ago with a duct from the window, I even remember pictures; was that you?

Ed


----------



## mironccr345

What's up guys? Looks like the MARS II flash is hit or miss, so I guess I won't be flashing my BIOS.

Lots of activity in here lately; glad to see this thread is still active.

Speaking of passive, my AP181 fans spin at 400rpm and my temps don't go past 45°C; when I turn the fans up to 1200rpm, the cards don't go over 40-42°C, depending on the game and the OC.

@Occam: Where are those pics of your rig?


----------



## OccamRazor

Hey man, what's up?

It's a no-go for my 590; rev 42 is incompatible with these BIOSes, so we'll have to wait for NiBiTor, I suppose...
I'm at work, but I'll post the pics tomorrow!









Cheers

Ed


----------



## wermad

Still a bit warm, but after 30 minutes of playing Crysis 2 they topped out at ~53°C each. I had my fans on 7v, so they ran a little hot.


----------



## Canis-X

Hmmmm... I have a flashed BIOS version on my cards [70.10.37.00.01]... so you guys think mine will take it OK then? I got my cards when I RMA'd my ASUS 5970s; they were brand new in the box though.

@wermad, I haven't seen mine go above 44°C, methinks, fully loaded with a decent OC on them.


----------



## alancsalt

Quote:


> Originally Posted by *Canis-X*
> 
> Hmmmm... I have a flashed BIOS version on my cards [70.10.37.00.01]... so you guys think mine will take it OK then? I got my cards when I RMA'd my ASUS 5970s; they were brand new in the box though.
> @wermad, I haven't seen mine go above 44°C, methinks, fully loaded with a decent OC on them.


Even if it does not, you can flash back to original. Save a copy before trying.


----------



## OccamRazor

Ok guys here it is:









http://i1067.photobucket.com/albums/u431/riddikfurian/F77F9017-5275-4222-BBFA-C43ABD2184B7-222-000000E3EB1E2CA3_zps9eafe700.jpg

and the dual reserator casing and cooling fan for summertime!














Cheers

Ed


----------



## Pedropc

First of all, excuse my bad English, sometimes I don't quite understand what you're saying.

I understand that some people have problems editing the Mars II BIOS. You can edit it, you just have to do the following:

When NiBiTor loads the BIOS and says it cannot read it, where it says "device id: 108D (unsupported)", change it to "PCI-E Nvidia GTX 590" and save the BIOS. Do the same with BIOS 2. Now you can edit the BIOS, and when flashing it is treated as a Mars II. Regards.
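The device-ID trick above works because a VBIOS dump stores its PCI IDs at a well-defined spot: the PCI Data Structure ("PCIR" block) of the standard PCI expansion ROM. As a rough illustration only (not NiBiTor's actual code), here's a minimal Python sketch that reads and rewrites that field in a ROM image; the 0x1088 value used as a stand-in GTX 590 device ID is an assumption for demonstration.

```python
# Sketch: locate and patch the PCI device ID in a VBIOS ROM dump.
# Offsets follow the standard PCI expansion-ROM layout; this is NOT NiBiTor.
import struct

def read_device_id(rom: bytes) -> int:
    assert rom[0:2] == b"\x55\xAA", "not a valid expansion ROM"
    # 16-bit little-endian pointer to the PCI Data Structure lives at 0x18
    pcir_off = struct.unpack_from("<H", rom, 0x18)[0]
    assert rom[pcir_off:pcir_off + 4] == b"PCIR", "PCI Data Structure not found"
    # vendor ID at PCIR+4, device ID at PCIR+6 (both little-endian)
    vendor, device = struct.unpack_from("<HH", rom, pcir_off + 4)
    return device

def patch_device_id(rom: bytearray, new_id: int) -> None:
    pcir_off = struct.unpack_from("<H", rom, 0x18)[0]
    struct.pack_into("<H", rom, pcir_off + 6, new_id)

# Tiny synthetic ROM for demonstration: header + PCIR block at offset 0x1C.
rom = bytearray(0x40)
rom[0:2] = b"\x55\xAA"
struct.pack_into("<H", rom, 0x18, 0x1C)             # pointer to PCIR
rom[0x1C:0x20] = b"PCIR"
struct.pack_into("<HH", rom, 0x20, 0x10DE, 0x108D)  # NVIDIA vendor, "unsupported" ID

print(hex(read_device_id(rom)))   # 0x108d
patch_device_id(rom, 0x1088)      # hypothetical GTX 590 device ID
print(hex(read_device_id(rom)))   # 0x1088
```

As always with BIOS edits: save an untouched copy of the original ROM before changing anything.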


----------



## mironccr345

Quote:


> Originally Posted by *OccamRazor*
> 
> Ok guys here it is:
> 
> 
> 
> 
> 
> 
> 
> 
> http://i1067.photobucket.com/albums/u431/riddikfurian/F77F9017-5275-4222-BBFA-C43ABD2184B7-222-000000E3EB1E2CA3_zps9eafe700.jpg
> and the dual reserator casing and cooling fan for summertime!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers
> Ed


That's an interesting set-up. Keeps all the heat outside.


----------



## heyskip

Quote:


> Originally Posted by *Pedropc*
> 
> Best of all, excuse my bad English, sometimes not quite understand what you speak.
> I seem to understand that some people have problems editing the bios of the Mars II, you can edit, you just have to do the following;
> When the BIOS loads the Nvitor and says he can not read;
> Where it says "device id: 108D (unsopported)" changed to "PCI-E Nvidia gtx590" and give to save the bios. Do the same with the bios 2. Now you can edit the bios and when flashing is taking her as a Mars II, a greeting.


I didn't even think of that. Thanks mate, good find. +rep


----------



## johnnyw

Hmm, does anyone else get weird clock/voltage fluctuation in 3DMark11 with their GTX 590?

I have the 590 Ultra Charged model with a stock GPU clock of 671MHz at a stock voltage of 0.963v. If I run 3DMark at that, it seems to stay at that clock during the tests, but if I raise the GPU clock to 700 then I can see in the MSI AB log that it briefly downclocked to 478MHz/0.875v during some of the tests.

I have the latest drivers (306.97 WHQL). Any ideas?


----------



## rush2049

Quote:


> Originally Posted by *johnnyw*
> 
> Hmm does anyone else get weird clock/voltage fluctuation in 3DMark11 with their GTX590?
> 
> I Have 590 Ultra charged model with stock gpu clock of 671MHz with stock voltage of 0.963v, if i run 3Dmark with that it seems to remain on that clock during tests, but if i raise gpu clock to 700 then i can see in msi ab log that it has shortly downclocked to 478MHz/0.875v during some of the tests.
> 
> I have latest drivers (306.97 whql) Any ideas?


Nvidia put a lock/limitation into the driver sets for certain programs; 3DMark and FurMark come to mind...


----------



## johnnyw

Quote:


> Originally Posted by *rush2049*
> 
> Nvidia put a lock/limitation into the driver sets for certain programs, 3dmark and furmark come to mind.........


Yeah, I know there's supposed to be some sort of power limiter, but I just can't understand why it's triggered when I'm at default voltage. I've been running games at 700/950 for a while now and they don't downclock, but in 3DMark11 anything above 692MHz on the GPU starts to downclock once during each 3D test.

Unigine maxed out with extreme tessellation won't downclock either, even at 720MHz on the GPU, so I'm a bit confused.


----------



## heyskip

Probably a stupid question, but have you got "prefer maximum performance" enabled in the NV control panel?


----------



## johnnyw

Quote:


> Originally Posted by *heyskip*
> 
> Probably a stupid question, but have you got "prefer maximum performance" enabled in NV control panel.


Well, you need to have that enabled, otherwise it would only use a single GPU. So yes, it's enabled


----------



## ProfeZZor X

My two replacement power harnesses arrived today, so I'll go home for lunch and plug them in to see what happens. I'm PRAYING that this solves the problem and no further steps are necessary, other than to install drivers, the OS and software... I'll keep you guys posted.


----------



## wermad

Quote:


> Originally Posted by *ProfeZZor X*
> 
> My two replacement power harnesses arrived today, so I'll go home for lunch and plug them in to see what happens. I'm PRAYING that this solves the problem and no further steps are necessary, other than to install drivers, the OS and software... I'll keep you guys posted.


Good luck


----------



## ProfeZZor X

Quote:


> Originally Posted by *wermad*
> 
> Good luck


Two words... IT LIVES!!!

You guys have no idea how happy I am right now. I pulled off those two modified power cables and snapped in the new ones, and it posted immediately. It's amazing how shoddy workmanship can cause such a chain reaction and affect an entire build. I know I've learned my lesson from this experience. Now I feel more confident building a rig for my kids.


----------



## wermad

Quote:


> Originally Posted by *ProfeZZor X*
> 
> Two words... IT LIVES!!!
> You guys have no idea how happy I am right now. I pulled off those two modified power cables and snapped in the new ones, and it posted immediately. It's amazing how shoddy workmanship can cause such a chain reaction and affect an entire build. I know I've learned my lesson from this experience. Now I feel more confident building a rig for my kids.












Was this an extension or cable-harness for your psu or something?


----------



## ProfeZZor X

Quote:


> Originally Posted by *wermad*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Was this an extension or cable-harness for your psu or something?


The PC shop that I hired basically took the standard harness that came with the 590, cut the entire thing in half, and crimped an additional 12 inches of wire onto each of the original wire ends to extend it, rather than remove them from each of the connectors and replace them with longer wire. And to top it off, instead of individually sleeving each wire, he used one massive sleeve to cover everything up. And instead of using heat shrink, he just wrapped it all up in electrical tape. Aside from that, I assume somewhere within that mess he had to have screwed up an electrical connection...

I'll be sure to post pictures of the hack job they did.


----------



## wermad

Quote:


> Originally Posted by *ProfeZZor X*
> 
> The PC shop that I hired basically took the standard harness that came with the 590, cut the enture thing in half, and crimped in an additional 12 inches of wire to each of those original wire ends to extend it, rather than remove them from each of the connectors and replace them with longer wire. And to top it off, instead of individually sleeving each wire, he used this massive sleeve to cover everything up. And instead of using heat shrink, he used electrical tape and just wrapped it up. Aside from that, I assume somewhere within that mess he had to of screwed up an electrical connection...
> I'll be sure to post pictures of the hack job they did.


This harness is for the lights on your block or something? I'm on EK blocks so the only wires I have are the psu power cables









Glad to hear you're up and running


----------



## ProfeZZor X

Quote:


> Originally Posted by *wermad*
> 
> This harness is for the lights on your block or something? I'm on EK blocks so the only wires I have are the psu power cables
> 
> 
> 
> 
> 
> 
> 
> 
> Glad to hear you're up and running


It's the harness that powers the block.

Upon closer inspection while snapping these shots, I discovered that one of the crimped wires had simply fallen out of the connector sleeve. That's probably one of the reasons why it didn't work. Anyway, here's the hack job these guys did:


























And here's the end result after using the replacement harness EVGA was so kind to send me:


----------



## wermad

Quote:


> Originally Posted by *ProfeZZor X*
> 
> It's the harness that powers the block.
> Upon closer inspection while snapping these shots, I discovered that one of the crimped wires simply fell out of the connector sleeve. That's porbably one of the reasons why it didn't work. Anyway, here's the hack job these guys did:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And here's the end result after using the replacement harness EVGA was so kind to send me:


hmmmm....I'm still confused why your block needs a PCIe 6-pin (or 4-pin) plug just for lights? Or does this do something else? My Classy came with the stock air cooler, though I noticed there's a second

Anyways, crimping on pins is super easy. You just need some pins and a crimper, which are cheap and easily available. Also, learning to solder wires is simple; there are many vids on YouTube that teach you how to do basic soldering. I learned quickly (as it was very simple) and now I can do my own custom wiring. Having spare pins and a crimper is a life saver in case I need to add new pins or re-crimp one.


----------



## ProfeZZor X

Quote:


> Originally Posted by *wermad*
> 
> hmmmm....I'm still confused why your block needs a pcie 6-pin (or 4-pin) plug just for lights, ? Or this this do something else? My Classy came with the stock air cooler, though I noticed there's a second
> Anyways, crimping on pins is super easy. You just needs some pins and a crimper which are cheap and easily available. Also, learning to solder wires is simple. There are many vids on youtube that teach you how to do simple soldering. I learned quickly (as it was very simple) and I can do my own custom wiring. Having spare pins and a crimper are a life saver in case I need to add new pins or re-crimp a new pin.


That cable isn't for lights, it's to power my video card. It's an 8-pin that splits into two 6-pin PCIe connectors. Isn't that standard for most 590 cards? What I had mentioned in previous posts is that a telltale sign that some EVGA cards are underpowered is a blinking LED logo light. My video card's logo was blinking for some time because these doofuses didn't know what they were doing, and I spent the greater part of two months on a long, treacherous, and expensive journey trying to figure out why my card didn't work properly. Now that it does, I can move on.


----------



## wermad

Quote:


> Originally Posted by *ProfeZZor X*
> 
> That cable isn't for lights, it's to power my video card. It's an 8-pin that splits into two 6-pin PCI connectors. Isn't that standard for most 590 cards? What I had mentioned in previous posts is that a tell tale sign that some EVGA cards are underpowered is if the LED logo light is blinking. My video card logo was blinking for some time because these doofus' didn't know what they were doing, and I spent the greater part of two months on a long, treaturous, and expensive journey trying to figure out why my card didn't work properly. Now that it does, I can move on.


Why you had a custom cable made in the first place is really where my confusion comes in. I'm sure your PSU had enough cables, or you could have easily purchased a pre-sleeved (individual) cable adapter/extension. So all this time, it was your PSU cable/adapter that was causing the issue? Very weird and perplexing.

edit: I have a bunch of these:

http://jab-tech.com/6-6-Pin-to-6-2-pin-extension-cable-Single-Braid-Sleeved-Blue-pr-4734.html

Lmk if you want a couple


----------



## ProfeZZor X

Quote:


> Originally Posted by *wermad*
> 
> Why do you have a custom cable done in the first place is really where my confusion comes in. I'm sure your psu had enough cables or you could have easily purchased pre-sleeved (individual) cable adapter/extension. So all this time, it was your psu cable/adapter that was causing the issue? Very weird and perplexing.


Yeah, go figure... When I initially got the card, it came with a harness that was about 5 inches long. So back in April this year I decided that I wanted something a little more professional looking and sleeved in black. So I hired this local PC shop to do the job, thinking they'd do a professional extension and sleeving job like most of the builders here on these forums. But in reality, I ended up with something horrific. I was already at the end of my build, and low on funds, so my only concern was to get the PC running. Then back in August, when I was done buying everything and had buttoned everything up, that's when I said I needed help figuring out why the video card wasn't working properly. It didn't occur to me that it was the cable, because the responses I was getting on these boards were that it might be something else... I even mentioned the cables being modified a number of times, but it was overlooked in the advice I got.

This is one of the power harnesses that came with the card when I bought it, which isn't long enough to tuck behind the motherboard. That was my main goal in all of this.


----------



## wermad

Quote:


> Originally Posted by *ProfeZZor X*
> 
> Yeah, go figure... When I initially got the card, it came with a harness that was about 5 inches long. So back in April this year I decided that I wanted something a little more professional looking and sleeved in black. So, I hired this local PC shop to do the job, thinking they'd do a professional extension and sleeving job like most of the builders here in these forums. But in reality, I ended up with something horrific. I was already at the end of my build, and low on funds, so my only concern was to get the PC running. Then back in August when I was done buying everything and buttoned everything up, that's when I stated that I needed help figuring out why the video card wasn't working properly. It didn't occur to me that it was the cable, because the responses I was getting on these boards was that it might be something else... I even mentioned the cables being modified a number of times, but it was overlooked in the advice I got.
> This is one of the power harnesses that came with the card when I bought it, which isn't long enough to tuck behind the motherboard. That was my main goal in all of this.


Are you running off the psu or still using the harness(s)? I checked your psu (Rosewill 1300) and it has eight 6-2 pin pcie lines. If you need an extension, get a presleeved 8-pin pcie extension:

http://www.performance-pcs.com/catalog/index.php?main_page=product_info&cPath=34_804_1145_1144&products_id=35614


----------



## wermad

Let me know if you need a new harness. I have some spares, so I might be able to whip one up, or I might have an extension I can send ya. I have no need for them anymore since I don't sleeve and don't mind my exposed BRY cables


----------



## ProfeZZor X

Quote:


> Originally Posted by *wermad*
> 
> Let me know if you need a new harness. I have some spares that I might be able to whip one out or I might have an extension I can send yah. I have no need for them any more since I don't sleeve and care for my exposed BRY cables


Each Rosewill power cord is long enough for my needs, and I have more than enough Piperock connectors that came with my Rosewill. It's the 8-pin connector on the video card itself that isn't the same as a straight 8-pin on the PSU. And the fact that it splits from an 8-pin into two 6-pins makes it that much more complicated. But I guess that's just the way EVGA made their cards... Honestly though, I'll be fine the way it is now, as long as it works. Where the PC sits at my desk, you really can't see the yellow and brown wires because they're so close to the hard drive cage and almost tucked away... And if push comes to shove, I can always spray paint the wires black.

I wouldn't want to put you through too much trouble. You guys have been most helpful as it is.


----------



## wermad

Quote:


> Originally Posted by *ProfeZZor X*
> 
> Each Rosewill power cord is long enough for my needs, and I have more than enough piperock connectors that came with my Rosewill. It's the 8-pin connector on the video card itself that isn't the same as a straight-8 on the PSU. And the fact that it splits from an 8-pin to two 6-pins make it that much more complicated. But I guess that's just the way EVGA made their cards... But honestly, I'll be fine the way that it is now, as long as it works. Where the PC sits at my desk, you really can't see the yellow and brown wires because they're so close to the hard drive cage and almost tucked away... And if push comes to shove, I can alway spray paint the wires black. But thanks though, it is appreciated.


That's weird that your PSU connector doesn't match your 590??? Both my EVGA and Asus took the stock 6+2 connectors off my Seasonic. Connectors can always be switched quite easily; you just need a pin removal tool, some patience, and a new PCIe 8-pin connector.


----------



## ProfeZZor X

Quote:


> Originally Posted by *wermad*
> 
> That's weird that your psu connector doesn't match your 590??? Both my EVGA and Asus took the stock 6+2 connectors off my Seasonic. Connectors can always be switched quite easily. You just need a pin removal tool, some patience, and a new pcie 8-pin connector.


First things first... I need to get some kind of enjoyment out of this PC. Then, as I get more familiar with pin removal, custom sleeving, and heat shrink, I'll revisit the old connectors and try it out.


----------



## wermad

Quote:


> Originally Posted by *ProfeZZor X*
> 
> First things first... I need to get some kind enjoyment out of this PC. Then as I get more familiar with pin removals, custom sleeving and heat shrink, I'll revisit the old connectors and try it out.


Cool









It's not too challenging, and OCN can help. Hit me up if you need any help


----------



## ProfeZZor X

Quote:


> Originally Posted by *wermad*
> 
> Cool
> 
> 
> 
> 
> 
> 
> 
> 
> Its not too challenging and ocn can help. Hit me up if you need any help


I most certainly will. Thanks for the offer.


----------



## derickwm

Little late to the party but...

EVGA Classified 590 LE








(Bottom right)

Running on air currently @700/1728.


----------



## OverClocker55

Quote:


> Originally Posted by *derickwm*
> 
> Little late to the party but...
> EVGA Classified 590 LE
> 
> 
> 
> 
> 
> 
> 
> 
> (Bottom right)
> Running on air currently @700/1728.


----------



## mironccr345

Quote:


> Originally Posted by *derickwm*
> 
> Little late to the party but...
> EVGA Classified 590 LE
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (Bottom right)
> Running on air currently @700/1728.


Did you flash the BIOS to get it to 700? My 590 won't go past 650, but I haven't tried flashing the BIOS yet. What drivers are you using?

I spy a Logitech diNovo Mini. I use two for my HTPC.


----------



## derickwm

I haven't flashed anything. 306.97.


----------



## mironccr345

Quote:


> Originally Posted by *derickwm*
> 
> I haven't flashed anything. 306.97.


Lucky!


----------



## OccamRazor

Hi derickwm!
great setups you got there!








I'm currently working with a VBIOS developer to mod the Asus Mars II BIOS to use on any 590!
Can you send me your Mars II BIOS?
Thanks

Ed


----------



## OccamRazor

Hey mironccr345!
How's everything?
Have you tried flashing to Mars II?
I'm modding the BIOS with the help of a developer!
Will have some news soon!

Cheers!

Ed


----------



## derickwm

Quote:


> Originally Posted by *OccamRazor*
> 
> Hi derickwm!
> great setups you got there!
> 
> 
> 
> 
> 
> 
> 
> 
> Im currently working with a Vbios developer to mod asus mars 2 bios to use in any 590!
> can you send me your mars 2 bios?
> Thanks
> 
> Ed


I would love to help ya out, unfortunately I sold my Mars II quite a while ago


----------



## OccamRazor

I see. Is there a way you can ask the people you sold them to? Do you remember the BIOS revision?
Was it 70.10.42.00.01 or 70.10.37.00.01?
Thanks for your reply, and thank you in advance if you can get those BIOSes!









Ed


----------



## wermad

Quote:


> Originally Posted by *derickwm*
> 
> Little late to the party but...
> EVGA Classified 590 LE
> 
> 
> 
> 
> 
> 
> 
> 
> (Bottom right)
> Running on air currently @700/1728.


Welcome buddy!

You're not alone









Took me ~3 weeks to find my 2nd, and I did end up paying a premium (EVGA Classy). I haven't clocked mine any higher than the Classy's stock/default 630 core.

Btw, what makes the "LE" different from the Classified version?


----------



## derickwm

Quote:


> Originally Posted by *OccamRazor*
> 
> i see, is there a way you can ask the people you sold them to? Do you remember the bios revision ?
> was it 70.10.42.00.01 or 70.10.37.00.01?
> Thanks for your reply and thank you in advance if you can get those bios!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ed


Probably not possible but I will try.
Quote:


> Originally Posted by *wermad*
> 
> Welcome buddy!
> 
> You're not alone
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Took me ~3 weeks to find my 3rd I did end up paying a premium (EVGA Classy). I haven't clocked mine any higher than the Classy's stock/default 630 core.
> 
> Btw, what makes the "LE" different from the Classified version?












I like my Classy, one of the sexier GPUs I've owned. I'm not sure on the LE; I just saw others using it, so I figured that all Classified 590s were "LE"


----------



## wermad

Quote:


> Originally Posted by *derickwm*
> 
> Probably not possible but I will try.
> 
> 
> 
> 
> 
> 
> 
> 
> I like my classy, one of the sexier GPUs I've owned. I'm not sure on the LE, I just saw others putting it so I figured that all classified 590s were "LE"


Hehe, I'll call mine SC (supa' charged!)









You getting her wet? Mine are super cool and stay under 45°C as long as the fans are in high mode. Scaling is pretty good in Surround, but recent drivers suck tbh.


----------



## derickwm

Probably not, unless I can get a block super cheap. I have my Ares to play with; I probably won't own the card for too long


----------



## wermad

Quote:


> Originally Posted by *derickwm*
> 
> Probably not unless I can get a block super cheap. I have my Ares to play with, probably won't own the card for too long


Ah, I see. Yeah, I'm sorely tempted to go with something else, but so far these guys have been working fine. I got two blocks with the Asus for a good deal (one was new but missing the pads). Definitely heavy on water. Keep an eye out on ppcs.com; they have 6990 blocks for $50 each, so I'm suspecting 590 blocks might follow suit


----------



## derickwm

What blocks do you have?

I like the 590, but after taking into account the relatively limited OC'ing and small amount of VRAM, I can't see it as a permanent solution. I am curious to see how it does against my quad fire Ares at higher resolutions.


----------



## wermad

Quote:


> Originally Posted by *derickwm*
> 
> What blocks do you have?
> I like the 590, but after taking into account the relatively limited OC'ing and small amount of VRAM, I can't see it as a permanent solution. I am curious to see how it does against my quad fire Ares at higher resolutions.


EK. I got Harstyles' Asus 590 w/ the EK block, and he sold me his spare EK block. I bought the EVGA off eBay. The Asus is a rock solid card, especially for being old (initial launch). The EVGA is a bit weird but I got it sorted out.

I'm gaming on three screens (3240x1920) and in some games vram does become a factor, but the 4th GPU really helps w/ the frames. Drivers are flaky right now for Nvidia, so stay with 301.xx WHQL. I tried the newest ones and my frames dropped dramatically. Even Nvidia's boasted "improved" 310.xx betas are sucking really hard for some ppl.

On your single 1440, I'd say it should perform pretty well. I'd just tone down or switch off some of the eye candy that is handicapped by vram.

I was tempted to go crossfire 6990s but I restrained myself. Looking forward to triple 7970s next year (or sooner).


----------



## ProfeZZor X

Sucks that I have to wait until the middle of this month to get internet service. I was hoping to update my VBIOS and other drivers soon.


----------



## wermad

Thanks for the help guys. I parted ways with my lovely 590s and I'm on the hunt for a new gpu setup w/ more vram


----------



## derickwm

Quote:


> Originally Posted by *wermad*
> 
> Thanks for the help guys. I parted ways with my lovely 590s and I'm on the hunt for a new gpu setup w/ more vram


----------



## ProfeZZor X

Quote:


> Originally Posted by *wermad*
> 
> Thanks for the help guys. I parted ways with my lovely 590s and I'm on the hunt for a new gpu setup w/ more vram


Just when I get my 590 up and running, you get rid of yours...


----------



## wermad

Quote:


> Originally Posted by *ProfeZZor X*
> 
> Just when I get my 590 up and running, you get rid of yours...


lol, Surround is a pita. You need a lot of hp and vram to get a triple screen setup going. I'm running some budget screens, not the high end ones. My gosh, these things sold in a matter of hours and for more than what I expected. I had wanted to trade for two 6990s but the guy refused and said his cards are worth more. Actually and technically, the 6990 is better, but the 590s are more desirable. These still sell for $400-500, whereas the 6990s are selling below $400. Oh well, that guy's loss I guess.

I'm not sure what to get, but it's definitely gonna have a lot more vram than 1.5gb


----------



## Canis-X

Where did you sell them at?


----------



## rush2049

Quote:


> Originally Posted by *Canis-X*
> 
> Where did you sell them at?


Also curious... if the 590 is still that desirable I may sell mine.


----------



## wermad

eBay. One sold for $510, the older one for $460. At least I can cover the eBay and PayPal fees. I got no interest here even though I had them listed cheaper, which, in the end, would be the same either way you sell it.
The one that sold for more was a few months old and its block was a month old, so I guess that's what made it more appealing (that, and being the Classified). The other card (Asus) was from the initial launch and its block was about a year old. I did buy-it-now since I didn't really want to wait for auction style.

I know Juggalo sold his quickly a few weeks ago here, so yeah, these guys are still highly desirable. Auction style they are hitting $400 w/out a block and $450+ with a block.

The 6990 is selling for <$400, and I've seen some go for $350, and $400 with a block.

It also shows that Nvidia cards have always held their value better than AMD's.

Ultimately, I'm sticking with Nvidia, as I can't afford a triple 7970 setup and the 69xx series has crossfire lag issues in quad fire. Hunting for three (and possibly a fourth) GTX 580 3gb cards.


----------



## ProfeZZor X

I may end up doing an SLI 590 setup, but I have yet to unleash the beast that I have now. I'll most likely buy a couple of games next week (Trine 1 & 2) and see how well the 3D works on my HDTV. I'll eventually get around to benchmarking it, but if I'm not happy with the results, maybe I'll just commit to getting another 590 by Christmas or tax time.


----------



## wermad

Quote:


> Originally Posted by *ProfeZZor X*
> 
> I may end up with doing an SLI 590 setup, but I have yet to unleash the beast that I have now. I'll most likely buy a couple of games next week (Trine 1& 2) and see how well the 3D works on my HDTV. I'll eventually get around to benchmarking it, but if I'm not happy with the results, maybe I'll just commit to getting another 590 by Christmas or tax time.


I have Trine 2 and I was able to max it out using three 560 448s on three screens. I'm sure one will be ok for a single screen. SLI on a single 1080/1200 screen will not see much benefit from the 4th GPU. I had high usage on all cores in some of the most demanding games, but the vram was hindering any further performance.


----------



## Samurai707

Hey guys, I'm about to join the club (hopefully after the mail lady gets here within the hour), and I'm just curious what the best performing drivers are for you all? I've been seeing a lot of talk about crashing with the most recent drivers, but I don't know if that is just for the 6xx... I have been out of the driver loop since my GTX 470, so any help is much appreciated!

(I'm seeing 306.97 as the most recent)

EVGA 590 Classy, gonna be paired with my mATX sig rig.


----------



## iARDAs

There is a beta driver, 310.53.

It has improvements in most games. Give it a try too.

But since this is your first time with the 590, test it with a WHQL driver first, and then move on to the 310.53 beta


----------



## Samurai707

For sure! Thanks, iARDAs!


----------



## mironccr345

Well guys, I sold my 590.


----------



## emett

Bigger and better things are on the horizon, my friend.
I loved my 590, but in all honesty I haven't looked back.


----------



## wermad

Quote:


> Originally Posted by *Samurai707*
> 
> Hey guys, I'm about to join the club (hopefully after the mail lady get's here within the hour), and I'm just curious as to what are the best performing Drivers for you all? I've been seeing a lot of talk about crashing with the most recent drivers, but don't know if that is just for the 6xx... I have been out of the driver loop since my GTX 470 so any help is much appreciated!
> (I'm seeing 306.97 as the most recent)
> EVGA 590 Classi, gonna be paired with my mATX Sig rig.


I used 301.xx WHQL with my SLI 590s. 306.xx WHQL was horrendous, as I lost ~40% of my GPU usage. 301.xx wasn't perfect either, as I had random shutdowns in some games.
Quote:


> Originally Posted by *mironccr345*
> 
> Well guys, I sold my 590.


Looks like many ppl are moving on, including me. I really needed the extra vram, and the 580 3gb was the perfect fit for my budget.

It's also a good time to buy a 590, since the many available will drive some prices down. I was lucky enough to sell mine and almost break even on my initial investment.
Quote:


> Originally Posted by *emett*
> 
> Bigger and better things are on the horizon my friend.
> I loved my 590, but in all honesty havent looked back.


I'm disappointed with the 6xx series, so the 7970 was my current-gen choice. Ultimately, the cost of 7970s outfitted with water was gonna break my budget. Four 580s w/ 3gb each should suffice in Surround and came in on budget


----------



## ProfeZZor X

Quote:


> Originally Posted by *wermad*
> 
> I have Trine 2 and I was able to max it out using three 560 448s on three screens. I'm sure one will be ok for a single screen. Sli on a single 1080/1200 screen will not see much benefit from the 4th gpu. I had high usage on all cores on some of the most demanding games but the vram was a hindering any further performance.


Trine just showed up in the mail yesterday from Amazon (along with Titan Quest Gold and Lost Planet 2), so I took it home, did the install, changed the desktop appearance to 3D in the Nvidia control panel, and expected Trine to be in 3D when I started the game.... But sadly, it only played in regular 2D. Am I missing something, or did I do something wrong? I changed the screen back to its original setting and tried using the 2D-to-3D converter on my monitor, but all that did was give the game a little depth inside the television, and not the fore- and background within the game. There are also no video adjustments in Trine's options menu to convert it to 3D, so I don't know how it's done.

Any and all help to solve this mystery would be greatly appreciated.


----------



## iARDAs

Quote:


> Originally Posted by *ProfeZZor X*
> 
> Trine just showed up in the mail yesterday from Amazon (along with Titan Quest Gold and Lost Planet 2), so I took it home, did the install, changed the appearance on the desktop in the Nvidia control panel to 3D, and expected Trine to be in 3D when I started the game... But sadly, it only played in regular 2D. Am I missing something, or did I do something wrong? I changed the screen back to its original setting and tried using the 2D-to-3D converter on my monitor, but all that did was give the game a little depth inside the television, and not the fore- and background within the game. There's also no video adjustment in the options menu in Trine to convert it to 3D, so I don't know how it's done.
> Any and all help to solve this mystery would be greatly appreciated.


I remember Trine had a bug.

Disable the option in the Nvidia Control Panel that makes every game automatically start in 3D, but make sure 3D in general is enabled.

Launch the game and turn on the 3D effect for the game via the shortcut.

It was CTRL + T if I am not mistaken.

I had the same issue back in the day.


----------



## mironccr345

Quote:


> Originally Posted by *wermad*
> 
> I used 301.xx WHQL with my SLI 590s. 306.xx WHQL was horrendous as I lost ~40% of my gpu usage. 301.xx wasn't perfect, as I had random shutdowns in some games.
> Looks like many ppl are moving on, including me. I really needed the extra vram and the 580 3gb was the perfect fit for my budget.
> It's also a good time to buy a 590, since the many available will drive prices down. I was lucky enough to sell mine and almost break even on my initial investment.
> I'm disappointed with the 6xx series, so the 7970 was my current-gen choice. Ultimately, the cost of 7970s outfitted with water was gonna break my budget. Four 580s w/ 3GB each should suffice in Surround and came in on budget


I would have kept my 590 if I was running a single 24/27" 1080p monitor, but I def need more vram since I'm running 5760x1080. So far, the 680 4GB is holding its own. I'm pretty happy with the results.


----------



## Donkey1514

Quote:


> Originally Posted by *mironccr345*
> 
> Well guys, I sold my 590.


and I bought it


----------



## mironccr345

Quote:


> Originally Posted by *Donkey1514*
> 
> and I bought it


How about returning the favor with some Trader Rating?


----------



## Canis-X

herp derp


----------



## wermad

590s rock but they can easily hit their vram limit in Surround. I kept reading 6990 vs 590 reviews and the comment about how the 590 suits a single screen came up a lot. So, I would have gladly kept my two 590s if I was running, say, a 1440 or a 27" 1080 screen. I had experience with 3gb 580s so I know four will handle 3240x1920 with ease.

I never got a chance to up the clocks on my two 590s. I guess it makes you wary to do so, even though some have done it successfully. The potential to kill one card w/ two GPUs on there was just too scary for me. I'm definitely gonna push my 580s though.


----------



## iARDAs

I also used to love my 590. Still the best GPU I've had to date. I liked it better than my current 670.

Anyway, Nvidia needs to think their dual-GPU cards through better. Even the 690 lacks vram in Surround.

690 quad SLI would be great if the darn GPUs had 4GB per core.

Same goes for the 590 too. It would be great to have 3GB-per-core 590s.

This VRAM decision is making dual-GPU cards less attractive for demanding users.


----------



## wermad

Quote:


> Originally Posted by *iARDAs*
> 
> I also used to love my 590. Still the best GPU I've had to date. I liked it better than my current 670.
> 
> Anyway, Nvidia needs to think their dual-GPU cards through better. Even the 690 lacks vram in Surround.
> 
> 690 quad SLI would be great if the darn GPUs had 4GB per core.
> 
> Same goes for the 590 too. It would be great to have 3GB-per-core 590s.
> 
> This VRAM decision is making dual-GPU cards less attractive for demanding users.


HIS and PowerColor make a dual 7970 gpu w/ 3gb of vram each. Pricey though (~$1000-1200). You can easily pickup two used 7970s for ~$600-700 or a used 690 for <$800.


----------



## iARDAs

Quote:


> Originally Posted by *wermad*
> 
> HIS and PowerColor make a dual 7970 gpu w/ 3gb of vram each. Pricey though (~$1000-1200). You can easily pickup two used 7970s for ~$600-700 or a used 690 for <$800.


Actually I was referring to Nvidia's dual GPUs. ATI's dual GPUs have enough vram, but they are somehow not so popular.


----------



## wermad

Quote:


> Originally Posted by *iARDAs*
> 
> Actually I was referring to Nvidia's dual GPUs. ATI's dual GPUs have enough vram, but they are somehow not so popular.


The Mars II was supposed to have 3gb per core but Asus went with 1.5gb. I'm sure they could have easily gone 3gb but the cost would have been higher. I know Asus won't do a Mars III (it's been showcased but no production plans unfortunately), which is sad honestly. A 690 w/ 8gb of vram (4gb per gpu) would have been awesome. Could be Nvidia's stranglehold on the AICs; no one wants to further piss Nvidia off and make the situation worse.








edit: AMD cards have improved drastically this generation. I will credit the 6990 as it edged out the 590 where it mattered (large resolutions), but as is typical, drivers held it back, especially Crossfire. Also, they were (slightly) more attainable and affordable than Nvidia's, so I wouldn't call them "not popular".









----------



## ProfeZZor X

Quote:


> Originally Posted by *iARDAs*
> 
> I remember Trine had a bug.
> 
> Disable that option in Nvidia Control Panel where every game automatically starts in 3D however make sure 3D in general is enabled
> 
> Launch the game and turn on the 3D effect for the game via the shortcut
> 
> I remember it was CTRL + T if I am not mistaken.
> 
> I had the same issue back in the day


I'll give it a try tonight and let you know if it worked. I bought this game for the sole reason that it's one of the few 3D games out there. It would suck if 3D isn't native to the game. Don't get me wrong though, it's still a nice looking game.


----------



## Donkey1514

Quote:


> Originally Posted by *mironccr345*
> 
> How about returning the favor with some Trader Rating?


----------



## mironccr345

Quote:


> Originally Posted by *Donkey1514*


----------



## ProfeZZor X

I tested out Lost Planet 2 the other day, and I have to say I was impressed by how smoothly it ran. The only thing I was worried about is the stuttering glitches I encountered once I tried playing the game itself. I thought it was part of the game intro, but it shouldn't have performed the way it did, with the blackouts and floating objects. I'm still learning about benchmarking, so I hope these numbers are some indication of what I should expect during normal gameplay.


----------



## kx11

i still have my zotac 590

solid card but noisy


----------



## Canis-X

Quote:


> Originally Posted by *OccamRazor*
> 
> Hey mironccr345!
> How's everything?
> Have you tried to flash to Mars 2?
> I'm modding the bios with the help of a developer!
> Will have some news soon!
> Cheers!
> Ed


Hey Occam, have you made any progress on this?


----------



## OccamRazor

Hey, what's up?
Actually got a breakthrough: got a modified bios going on the bench. I changed the Asus Mars 2 bios ID string so it will work on .42 cards (the original bios is .37) and changed the volts, core and memory speeds accordingly. I still have to test it properly; I will give you a heads up as soon as I can!

Cheers

Ed

P.S. @ mironccr345: How's it going, bro?


----------



## mironccr345

Quote:


> Originally Posted by *OccamRazor*
> 
> Hey, what's up?
> Actually got a breakthrough: got a modified bios going on the bench. I changed the Asus Mars 2 bios ID string so it will work on .42 cards (the original bios is .37) and changed the volts, core and memory speeds accordingly. I still have to test it properly; I will give you a heads up as soon as I can!
> Cheers
> Ed
> P.S. @ mironccr345: How's it going, bro?


What's up Ed?! Everything's alright over here. How's it going on your end?
I've recently sold my 590 and picked up a 680.









But I like creeping this thread because I still love the 590, it's such a great card. I would have held on to it if it had at least 2GB ram.


----------



## ProfeZZor X

Quote:


> Originally Posted by *mironccr345*
> 
> What's up Ed?! Everything's alright over here. How's it going on your end?
> I've recently sold my 590 and picked up a 680.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But I like creeping this thread because I still love the 590, it's such a great card. I would have held on to it if it had at least 2GB ram.


That's one sexy setup you got there.


----------



## MasterVampire

Hey guys, what kind of performance are you getting with your 590s in Assassin's Creed 3, Far Cry 3, and also Hitman?


----------



## hotrod717

Just crossed to the dark side. Lol! Waiting for delivery of my EVGA GTX 590 Classified. After looking at the members list, I'm hoping my psu will be up to the challenge. Seasonic X750. ??? Will I be ok? Checked several calculators and it seems like I may be on the edge. Had a PCS+ 6970, but understand that Nvidia does better with gaming AND CAD software. Any advice or info would be helpful. A little late to the game, I know, but it seemed right when comparing the cost of adding a second 6970. Not to mention getting the one I have for less than $300.00 used.


----------



## OccamRazor

I had 570 SLI and an i7 920 overclocked to 4.0GHz, 4 hard drives, a DVD-RW and 4 120mm fans on a Corsair TX750.
Had a power meter on, and the total system wattage rarely went over 550/600W!
But honestly (let's take into the equation the PSU's natural degradation and consequent loss of efficiency over time, plus the power meter's margin of error), it's cutting it pretty thin! I had my system for over a year and nothing bad happened, but that doesn't mean it won't happen to you! In dynamic systems like our computers,
the rule of thumb is: "no system is the same", there is no such thing!
But your psu is one of the best quality around; it should sustain a good wattage flow for a long time. My Corsair was already 2 years old and it held just fine, but if you start to experience random shutdowns and bad game framerates, it could be that you need more "juice"!
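The ballpark math Ed describes (rated wattage derated for age, versus measured draw padded for meter error) can be sketched as a quick calculation. All the component names and wattage figures below are illustrative assumptions, not measurements from his rig:

```python
# Rough PSU headroom estimator, in the spirit of the advice above.
# All wattage figures are illustrative guesses, not measured values.

def psu_headroom(psu_rated_w, component_draw_w, degradation=0.10, meter_error=0.05):
    """Return estimated headroom in watts after derating the PSU for age
    and padding the summed component draw for power-meter error."""
    effective_capacity = psu_rated_w * (1 - degradation)
    padded_draw = sum(component_draw_w.values()) * (1 + meter_error)
    return effective_capacity - padded_draw

# Hypothetical build along the lines described: OC'd i7, SLI cards, drives, fans.
build = {"cpu": 130, "gpus": 380, "drives": 40, "fans": 15, "board_misc": 60}

headroom = psu_headroom(750, build)
print(f"headroom: {headroom:.0f} W")  # positive means inside the budget
```

A negative result would be the "you need more juice" case; a small positive one is the "cutting it pretty thin" case.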








just my 2 cents!

cheers

Ed


----------



## rush2049

Quote:


> Originally Posted by *hotrod717*
> 
> Just crossed to the dark side. Lol! Waiting for delivery of my EVGA GTX 590 Classified. After looking at the members list, I'm hoping my psu will be up to the challenge. Seasonic X750. ??? Will I be ok? Checked several calculators and it seems like I may be on the edge. Had a PCS+ 6970, but understand that Nvidia does better with gaming AND CAD software. Any advice or info would be helpful. A little late to the game, I know, but it seemed right when comparing the cost of adding a second 6970. Not to mention getting the one I have for less than $300.00 used.


Well, may I welcome you to the club... it's getting rather exclusive nowadays...


----------



## MKHunt

Anybody have trouble with their DisplayPort? I got my PB278Q today and it refuses completely to send anything to the screen via DisplayPort. Right now it's on DVI-D, with my old ML series screen on the second DVI from card A.

I will say, this ASUS screen is one of the most beautiful I've laid my eyes on. It makes my IPS panel in the ML look trashy and puts my friend's iPhone 5 to shame.


----------



## hotrod717

Quote:


> Originally Posted by *OccamRazor*
> 
> I had 570 SLI and an i7 920 overclocked to 4.0GHz, 4 hard drives, a DVD-RW and 4 120mm fans on a Corsair TX750.
> Had a power meter on, and the total system wattage rarely went over 550/600W!
> But honestly (let's take into the equation the PSU's natural degradation and consequent loss of efficiency over time, plus the power meter's margin of error), it's cutting it pretty thin! I had my system for over a year and nothing bad happened, but that doesn't mean it won't happen to you! In dynamic systems like our computers,
> the rule of thumb is: "no system is the same", there is no such thing!
> But your psu is one of the best quality around; it should sustain a good wattage flow for a long time. My Corsair was already 2 years old and it held just fine, but if you start to experience random shutdowns and bad game framerates, it could be that you need more "juice"!
> 
> 
> 
> 
> 
> 
> 
> 
> just my 2 cents!
> cheers
> Ed


Quote:


> Originally Posted by *rush2049*
> 
> Well, may I welcome you to the club... it's getting rather exclusive nowadays...


Thanks guys, appreciate the feedback. When it rains it pours, and I'm definitely going to need a new psu. Bought a second GTX 590 Classified.







Couldn't help it, got an amazing deal! My girlfriend is going to freak!!! Less than a new 690 for both, though.







Any suggestions on which driver is best at this time?


----------



## ciceu4

Quote:


> Originally Posted by *MasterVampire*
> 
> Hey guys what kinda performance are you getting with your 590's on Assasins creed 3 and Far Cry 3 and also Hitman?


You can play on ultra settings with an average of 35-40 fps in Hitman Absolution at 1920x1080. I also played AC3 maxed out, but I didn't measure the fps.


----------



## hotrod717

Received the 1st GTX 590 today, and while the graphics are amazing, the pins have a blotch in the middle. Almost looks like a fingerprint that ate into the metal pins, about 1/2" wide, right in the center.








Any thoughts?


----------



## hotrod717

sorry for the lousy pics


----------



## MasterVampire

Guys how much can I get if I sell my GTX 590 with a waterblock already attached?


----------



## FateZero

Hi guys. I've been using the Nvidia GTX 590 that came with my Aurora R4 package for almost a year now. Occasionally though, my monitor goes into Power Saving Mode while I'm gaming, and I have to hard-reset my computer to recover. I've Googled for solutions and even followed all the instructions to turn off power saving in the Power Options control panel. I pretty much made sure nothing goes to Sleep or Hibernate.

DVI-I cable
OptX AW2310 monitor
i7-3930K (no overclock)
875W PSU.


----------



## Samurai707

do you have power options on your monitor's menu? Give those a look-see too


----------



## Canis-X

Quote:


> Originally Posted by *OccamRazor*
> 
> Hey, what's up?
> Actually got a breakthrough: got a modified bios going on the bench. I changed the Asus Mars 2 bios ID string so it will work on .42 cards (the original bios is .37) and changed the volts, core and memory speeds accordingly. I still have to test it properly; I will give you a heads up as soon as I can!
> Cheers
> Ed
> P.S. @ mironccr345: How's it going, bro?


How's it looking? Stable? How far have you taken the OC and how does 3DM11 look? When do you think that you can post something up?


----------



## OccamRazor

Hi guys, sorry for not giving you any news, but last month and this one have been terrible at work, so no time to fiddle with the files. I'll have vacation in January, so I'll keep y'all posted!
I wish all of you a Merry Christmas and a Happy new Year!

Cheers

Ed


----------



## OverClocker55

I want one of these


----------



## jpongin

Can anyone with GTX 590 Quad SLI post their benchmark experience with Far Cry 3?

I have everything maxed, VSYNC at 1 Frame, No MSAA, 2560x1600 res.

When just running around, I mostly get a smooth 60FPS.

When in any combat situation, my FPS drops as low as 8 FPS and as high as 30 FPS.

VRAM is consistently at around 1000 - 1200MB

Anyone else having the same issues?


----------



## alancsalt

A lot of your answers might be found in the Far Cry 3 thread. Try the advice in this post: http://www.overclock.net/t/1329630/official-far-cry-3-performance-thread-3rd-patch-out-no-real-improv/1330#post_18910104


----------



## jpongin

Thanks, I'm going to try that hack. I also wanted to see if it wasn't anything specific to GTX 590 quad SLI.

But I guess no one with GTX 590s is playing Far Cry 3?


----------



## alancsalt

I'd ask that same question there...anyone playing with GTX 590s...


----------



## ProfeZZor X

Can someone... Anyone, direct me to the most up-to-date driver for the 590. Or can it simply be found on EVGA's website?


----------



## Rei86

Quote:


> Originally Posted by *ProfeZZor X*
> 
> Can someone... Anyone, direct me to the most up-to-date driver for the 590. Or can it simply be found on EVGA's website?


Why are you going to EVGA's website for the driver updates when you should be going straight to nVidia's?


----------



## Kaapstad

Quote:


> Originally Posted by *ProfeZZor X*
> 
> Can someone... Anyone, direct me to the most up-to-date driver for the 590. Or can it simply be found on EVGA's website?


I have not done much testing but 306.23 seems to work well with my GTX 590 quad sli setup - gives the best 3dmark11 score.


----------



## Kaapstad

Can I join the club?

Kaapstad

2 * Asus GTX 590s quad sli running @stock



http://www.3dmark.com/3dm11/4883655

I have included the above 3dmark11 link as proof of ownership, if that's ok.


----------



## Tuthsok

Quote:


> Originally Posted by *alancsalt*
> 
> I'd ask that same question there...anyone playing with GTX 590s...


Yes!!









Far Cry 3 is running quite nicely for me. Cards are both at stock. I'm not benchmarking, tweaking or counting frames anymore... just playing.


----------



## Kaapstad

Is this thread still being updated, or is it too late to join the club?


----------



## Canis-X

PM Alatar and see what he's doing with it.


----------



## jbyron

Quote:


> Originally Posted by *jpongin*
> 
> Thanks, I'm going to try that hack. I also wanted to see if it wasn't anything specific to GTX 590 quad SLI.
> But I guess no one with GTX 590s is playing Far Cry 3?


Hi, I'm running quad SLI GTX 590s! I never had any slowdown or stuttering at 1920x1200, DX11, 4x AA until today, when it was really bad during an outpost fight. But I restarted the game and everything went back to silky smooth. I think it might've been some other external process, like a background program update, that slowed my system down.

I'm totally stock, not overclocked, no game tweaks. I'm using the latest Nvidia beta drivers.


----------



## jbyron

Picture of the rig!


----------



## OverClocker55

Where are the gpu's ??


----------



## jbyron

Lol, they're there, behind the fans. They needed some cooling help.

Here's a before pic


----------



## OverClocker55

Pictures without


----------



## jbyron

Quote:


> Originally Posted by *OverClocker55*
> 
> Pictures without


Oh, is that for the club? Sorry, I don't have any; I didn't even know the club existed when I put all this together and that proof was needed. I just wanted to answer that guy's question about Far Cry 3 performance. They are stock Asus cards and I'll take better pics tomorrow!


----------



## OverClocker55

Quote:


> Originally Posted by *jbyron*
> 
> Quote:
> 
> 
> 
> Originally Posted by *OverClocker55*
> 
> Pictures without
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh, is that for the club? Sorry, I don't have any; I didn't even know the club existed when I put all this together and that proof was needed. I just wanted to answer that guy's question about Far Cry 3 performance. They are stock Asus cards and I'll take better pics tomorrow!

Thanks


----------



## Macaiah

Any word on being able to flash or mod the BIOS for the ASUS GTX 590 w/ 70.10.42.00.02? These BIOSes suck!!!


----------



## Krazeswift

Quote:


> Originally Posted by *Macaiah*
> 
> Any word on being able to flash or mod BIOS for ASUS GTX 590 w/ 70.10.42.00.02? These BIOS suck!!!


Yup, here you go: Asus GTX 590 @ 963mV, OCed to 630/1800 with 100% fan.

If you want more or less voltage, you can adjust it in Nibitor.

.42 Voltmod.zip 92k .zip file


----------



## Macaiah

Thanks Krazeswift! I'll try this when I get home. How were you able to edit and flash the BIOS for .42?


----------



## Krazeswift

Quote:


> Originally Posted by *Macaiah*
> 
> Thanks Krazeswift! I'll try this when I get home. How were you able to edit and flash the BIOS for .42?


The bios came from someone else on the forum, although it had a maximum of 988mV, which is too high for air-cooled 590s. I just lowered it slightly using Nibitor, then flashed with nvflash.


----------



## Macaiah

Quote:


> Originally Posted by *Krazeswift*
> 
> The bios came from someone else on the forum, although it had a maximum of 988mV, which is too high for air-cooled 590s. I just lowered it slightly using Nibitor, then flashed with nvflash.


nvflash isn't letting me apply the new bios. I can't even enter commands...


----------



## Krazeswift

Quote:


> Originally Posted by *Macaiah*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krazeswift*
> 
> The bios came from someone else on the forum, although it had a maximum of 988mV, which is too high for air-cooled 590s. I just lowered it slightly using Nibitor, then flashed with nvflash.
> 
> 
> 
> nvflash isn't letting me apply the new bios. I can't even enter commands...

Drag and drop 590A onto nvflash.exe, then repeat for B once flashed.

You'll only need to use cmd for disabling flash protection, which you might need to do first.


----------



## Macaiah

Had to do it via DOS and I ended up using my own modded BIOS. 700 / 1399 / 1800...not the improvement I was expecting, but I don't want to go any higher on air!!

Thanks man!


----------



## Krazeswift

Quote:


> Originally Posted by *Macaiah*
> 
> Had to do it via DOS and I ended up using my own modded BIOS. 700 / 1399 / 1800...not the improvement I was expecting, but I don't want to go any higher on air!!
> 
> Thanks man!


No worries, welcome to the club


----------



## Canis-X

Alright, so break this down for me, as everyone seems to be able to OC this card but me... so I obviously must have missed something in this thread. How are you guys OCing it? I've tried flashing different BIOSes on the cards, but MSI Afterburner still shows the GPU core voltage to be 0.963, which obviously limits my OC.


----------



## Krazeswift

Quote:


> Originally Posted by *Canis-X*
> 
> Alright, so break this down for me, as everyone seems to be able to OC this card but me... so I obviously must have missed something in this thread. How are you guys OCing it? I've tried flashing different BIOSes on the cards, but MSI Afterburner still shows the GPU core voltage to be 0.963, which obviously limits my OC.


I take it you're using the .42 bios also? The one I uploaded had a maximum of 963mV. Thought I'd play it safe and stick with a safe voltage so nobody unwittingly fried their card.

The .42 bios is slightly different to the launch .37, as it doesn't let you scale the voltage down from the maximum set; it only stays at your max value. But using Nibitor you can change power state 3 to whatever and it will go. Hell, you can go as high as 1.2V if you want, but that would be suicide!!


----------



## heyskip

Quote:


> Originally Posted by *Canis-X*
> 
> Alright, so break this down for me, as everyone seems to be able to OC this card but me... so I obviously must have missed something in this thread. How are you guys OCing it? I've tried flashing different BIOSes on the cards, but MSI Afterburner still shows the GPU core voltage to be 0.963, which obviously limits my OC.


Post your bios and I can check if the settings are right to force higher voltage.


----------



## Canis-X

I'm on .37, see below. My problem is that no matter what I do, MSI AB only shows 0.963v (also shown in the ss below). So are you saying to use only Nibitor to flash the GPU voltage and clocks I want to run at, and leave MSI AB alone? Do I need to be on a specific Nvidia driver version... what is the trick?


----------



## heyskip

Quote:


> Originally Posted by *Canis-X*
> 
> I'm on .37, see below. My problem is that no matter what I do, MSI AB only shows 0.963v (also shown in the ss below). So are you saying to use only Nibitor to flash the GPU voltage and clocks I want to run at, and leave MSI AB alone? Do I need to be on a specific Nvidia driver version... what is the trick?


You need to flash the .42 bios if you've already tried a modified .37 bios and it doesn't work. I had the same problem with one of my cards.

I would use AB to control the clocks. Just change the setting 2 voltage with Nibitor.

There are also 2 different .42 bioses. There's a .42.00.02 for both master/slave and a .42.00.90/91 master/slave. I would use the 90/91 version.

I'll see if I've got a copy of that bios on my pc here at work and modify it.


----------



## Canis-X

Cool! Thanks so much! I really appreciate it! +REP


----------



## heyskip

590 bios.zip 93k .zip file


I couldn't find the 90/91 bios here at work or online, but I do have a copy at home. I modified the .42.00.02 bios and attached that. It should work fine; I just prefer the other one for my setup.

The settings in the attached bios are 0.850v idle and 0.988v at full load. This will be fine since you're watercooled. I was running up to 1.025v on mine for benchmarking.

As far as flashing goes, you will probably need to override the protections in nvflash. Just open a command prompt at a folder with both the nvflash files and the rom files in it. You will need to use the following switches:

nvflash -4 -5 -6 -i1 A.rom for the first card, first chip
nvflash -4 -5 -6 -i2 B.rom for the first card, second chip
nvflash -4 -5 -6 -i4 A.rom for the second card, first chip
nvflash -4 -5 -6 -i5 B.rom for the second card, second chip

Please pay special attention to the number after the -i, as this is the id of the chip you want to flash. It might be worth running nvflash -c first to see how it picks up both of your cards. On my system it goes like this:

0 is the first NF200 chip
1 is the GF110A
2 is the GF110B
3 is the second NF200 chip
4 is the GF110A
5 is the GF110B

You just need to make sure you don't flash the NF200 chips or your card will be toast. And make sure the cores labeled 102015P0 get the A.rom and the ones labeled 102015P2 get the B.rom. Nvflash is pretty good at letting you know if you're attempting to flash a slave bios to the master chip or vice versa.

Let me know how you go.
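The index-to-ROM mapping above is easy to get wrong, so here's a rough sketch (a hypothetical helper, not an official tool) of the flash plan described, assuming the device ordering shown by the `nvflash -c` listing, with a guard against ever touching the NF200 bridges:

```python
# Sketch of the flash plan described above. Indices 0 and 3 are the NF200
# bridge chips per the nvflash -c listing: those must never be flashed.

NF200_INDICES = {0, 3}
# Master (A.rom) and slave (B.rom) GF110 chips per card, per the listing above.
PLAN = {1: "A.rom", 2: "B.rom", 4: "A.rom", 5: "B.rom"}

def build_commands(plan):
    """Build the nvflash command lines, refusing any NF200 index."""
    cmds = []
    for idx, rom in sorted(plan.items()):
        if idx in NF200_INDICES:
            raise ValueError(f"refusing to flash NF200 bridge at index {idx}")
        cmds.append(f"nvflash -4 -5 -6 -i{idx} {rom}")
    return cmds

for cmd in build_commands(PLAN):
    print(cmd)
```

This only prints the command lines for review; the point is that a wrong `-i` number is the one mistake the procedure can't recover from.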


----------



## heyskip

Oh, and I like to reinstall the graphics driver using the clean-install switch after a reboot. Sometimes Windows will try to do it anyway on my system.


----------



## Canis-X

Thanks Heyskip! I'm not in a rush, if you can find the specific BIOS for the card that you like, I am more than willing to wait so I can work with that one.


----------



## Bezna

Been around for a while, finally got around to joining the club officially ;P

Alatar do I need to PM you ?


----------



## Bezna

BTW a few thoughts I've had about this card this past year...

Great card IMO; the noise isn't even close to the AMD cards I've heard in the past, cooling is pretty good considering I stuck it in the 800D, and the price wasn't bad either at the time of purchase (Dec 2011, $750) compared to GTX 580s in SLI (around $1000) at the time.

But now that I've owned it for over a year, I've noticed that heat buildup (especially running Prime95 / Folding / heavy gaming) is a con, since it dumps heat back into the case. And the 800D doesn't really help the situation (should have WC'ed the card from the get-go).

Overall I'm really happy with the performance of the card, although once the GTX 8XX series cards come out, I'ma pick up 2 GTX 880s and W/C them.
Till then this should suffice. I will definitely need the upgrade if I add 2 more screens (currently have a 27"). Either that or I'll go single 30" 2560 x 1600.

_*Quick question:*_ I noticed that I still get good performance in BF3 (currently the most graphically demanding game I play). Can I, and is it even worth trying to, mess with overclocking it? Is it safe to overclock these cards or not? I've heard different stories. Thanks in advance!


----------



## heyskip

Quote:


> Originally Posted by *Canis-X*
> 
> Thanks Heyskip! I'm not in a rush, if you can find the specific BIOS for the card that you like, I am more than willing to wait so I can work with that one.


 590AB.zip 93k .zip file


There you go. That's the 70.10.42.00.90/91 bios. I use that on my older EVGA card and it works well. The older card didn't seem to work as well with the 70.10.42.00.02 bios. It wouldn't always increase to the maximum set voltage.


----------



## Canis-X

Thanks Heyskip!! Finally got the voltage to go up. Now I'm doing some benching to test things out... really appreciate your help with this again!!!









Testing so far:

Dual GTX590 [quad-SLi] / 3930k / RIVE (all water cooled)

http://www.3dmark.com/3dmv/4519062


----------



## Bezna

bumpy bump bump... cough cough *
<---- want to join ze club


----------



## Bezna

Quote:


> Originally Posted by *vio2700k*
> 
> _*Quick question:*_ I noticed that I still get good performance in BF3 (currently the most graphically demanding game I play). Can I, and is it even worth trying to, mess with overclocking it? Is it safe to overclock these cards or not? I've heard different stories. Thanks in advance!


----------



## Canis-X

^^ You will most likely need to PM Alatar about being added (thread starter).

http://www.3dmark.com/3dm06/17087766


----------



## Bezna

I did already but I guess I'll have to wait until he replies back


----------



## Canis-X

He's around a lot, but as this is a last-gen card, and a PITA one at that, he may not do anything with it any more. Might as well just throw on the tags and be done with it.


----------



## heyskip

Quote:


> Originally Posted by *Canis-X*
> 
> Thanks Heyskip!! Finally got the voltage to go up. Now I'm doing some benching to test things out... really appreciate your help with this again!!!


That's awesome, glad it worked for you.


----------



## Pedropc

Hi, I have been playing all my games for 6 months with these speed and voltage settings and no problems with the vga; it's going great. I get no PDL and no low frequencies in any test or game. Regards.

http://imageshack.us/photo/my-images/824/3dmark11perfomance.png/


----------



## Canis-X

Download MSI Afterburner and run that bench again. Pay close attention to your GPU core and GPU mem frequencies during the first benchmark test. At those clocks (and on current drivers) I am willing to bet that your frequencies are fluctuating... resulting in a lower score. Happens to me still.


----------



## Pedropc

Hi Canis-X, I tried running the test with the original GTX 590 bios and the result is lower; it looks like the PDL kicks in. But with the Mars II bios the result is higher and the frames are constant at 1.038 V, with no PDL at all. Regards.

http://imageshack.us/photo/my-images/838/sinttulodyd.png/


----------



## Canis-X

Ahhhh....now I see!!! Which BIOS did you flash on there? Do you have a link to it and instructions?

Really sweet score man!


----------



## Pedropc

1. Download the two bioses from TechPowerUp; one is bios 1 and the other is bios 2.

2. Put the two bioses on a bootable USB along with nvflash; the latest version of that is also on TechPowerUp.

3. Boot your PC so that it reads the USB first.

4. The first thing is to remove the bios protection. Run nvflash -r and it will ask which bios you want to unprotect. There are three bioses: 0, 1 and 2. 0 is the NF200 bios; do not touch it, so choose 1. Wait a bit and it removes the protection from bios 1. Now do the same, running nvflash -r for 2, to remove the protection from bios 2.

5. Now the real flash: run nvflash -i1 -4 -5 -6 xxx.rom, where xxx is the name of the downloaded bios 1. It will repeatedly warn you that things do not match; ignore it, type YES in all caps, and it flashes.

6. The same with bios 2: nvflash -i2 -4 -5 -6 xxx.rom, where xxx is the downloaded bios 2. It will give the same messages as with bios 1, but ignore them, type YES, and it's done.

7. Restart, and voilà, you have a Mars II.

If you have any questions ask me, it's a pleasure to help. Regards.
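The steps above boil down to: unprotect bioses 1 and 2 (never bios 0, the NF200), then flash each in order. A tiny sketch of that sequencing, with placeholder ROM names standing in for the TechPowerUp downloads:

```python
# Sketch of Pedropc's sequence: unprotect bios 1 and 2, then flash each.
# "bios1.rom"/"bios2.rom" are placeholder names for the downloaded files.

def flash_sequence(roms=None):
    """Return the ordered nvflash steps; bios 0 (NF200) is never included."""
    if roms is None:
        roms = {1: "bios1.rom", 2: "bios2.rom"}
    assert 0 not in roms, "bios 0 is the NF200, never touch it"
    seq = [f"nvflash -r  # unprotect bios {i}" for i in roms]            # step 4
    seq += [f"nvflash -i{i} -4 -5 -6 {rom}" for i, rom in roms.items()]  # steps 5-6
    return seq

for line in flash_sequence():
    print(line)
```

The ordering is the point: both protection removals come before any flash, and bios 0 never appears in the list.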


----------



## Canis-X

Thank you for the information!! I'll most definitely have to give this a try!!







+REP


----------



## jeromeface

I've been toying with the idea of flashing my EVGA GTX 590 Classified cards to Mars II in quad SLI. Pedro, do you think there would be any incompatibilities doing this with quad SLI vs just 1 card?


----------



## SloaneRanger

hello Pedropc

As you know, the old .37 BIOS does not work on a .42 card. After an RMA on my Gigabyte 590 I received a .42; I tried the .37 and got a lot of artefacts. I thought the card was dead, but I managed to reflash the original .42 and all is OK.

So does your ASUS Mars II BIOS have a modded string to work with .42 cards?

Thanks for your great work


----------






## Pedropc

Canis-X;

With both of your VGAs converted, you stand to gain a lot.

jeromeface;

I think there is no problem, first try with one and then the other.

SloaneRanger;

It seems the cards with BIOS .37 can be converted; those with BIOS .42 cannot. Mine is a Gigabyte with BIOS .37.


----------



## jeromeface

So with both my EVGA cards having either bios 71.10.37.00.91 or 71.10.37.00.90 does that mean I'm in a prime position to do the flash?


----------



## Pedropc

I think so; mine is a .37 and it took the Mars II BIOS on the first try. Greetings.


----------



## Canis-X

@jeromeface: Both of my GTX590's original BIOS were .37 versions. I have them both running on a .42 version of the BIOS so that I can manipulate the voltage with Nibitor and flash the cards with NVFlash. They run great like this but are limited in some applications/benchmarks like 3DMark 11. If you set the voltage too high then the driver will force both the GPU core and GPU memory frequencies to throttle down, thus reducing your score.

@Pedropc: Do you have to make the USB drive bootable before you copy NVFlash and the GPU roms to it?

I just downloaded the MARS II BIOS and adjusted the Fermi clocks and voltages in Nibitor. I'm going to try the .988v setting first to see what happens in 3DMark 11 and then go from there if my results are good. I'm at work though so I can't do anything right now, I'll have to wait until this evening.....fingers crossed that all goes extremely well!!!


----------



## jeromeface

I know I'm excited to see your results Canis-X. If all goes well for you I think I might just have to take the plunge. Also I forgot to ask, will I be OK flashing to Mars if I'm just on air cooling or is this a water-cooled venture only?


----------



## Pedropc

Canis-X, Yes, you have to make a bootable USB, greetings.


----------



## Canis-X

Ok, so this is my thought process on how I'm going to do this....


Already have my thumbdrive handily set up to be bootable, with NVFlash and my VGA ROMs (x2).
Download the latest NVidia drivers for MARSII and have them on the desktop
Use NVFlash for windows and remove the protection from 1,2,4,5 from the list (*nvflash -r*).
0 is the first NF200 chip
1 is the GF110A
2 is the GF110B
3 is the second NF200 chip
4 is the GF110A
5 is the GF110B

Uninstall the NVidia drivers completely
Reboot into safe mode and run Driver Fusion with elevated privileges
reboot into the thumbdrive
flash the GPU's (run A.rom and B.rom for both cards)
*nvflash -4 -5 -6 -i1 A.rom* for the first card first chip
*nvflash -4 -5 -6 -i2 B.rom* for the first card second chip
*nvflash -4 -5 -6 -i4 A.rom* for the second card first chip
*nvflash -4 -5 -6 -i5 B.rom* for the second card second chip

reboot into the OS (hopefully no brick occurred)
install the new NVidia driver
Reboot a couple of times to let the driver finish configuring itself
test, test test

Anything sound out of whack there?
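The quad-SLI flash order in the list above could be generated by a small loop; a sketch that only echoes the commands (nothing is actually flashed), with A.rom/B.rom as the per-GPU image names from the post:

```shell
#!/bin/sh
# Map each GF110 index to its image: 1/4 are the "A" GPUs, 2/5 the "B" GPUs.
# Indices 0 and 3 are the NF200 bridge chips and are deliberately skipped.
flash() { echo "nvflash -4 -5 -6 -i$1 $2"; }   # echo only; not a real flash

for pair in "1 A.rom" "2 B.rom" "4 A.rom" "5 B.rom"; do
  set -- $pair             # split "index rom" into $1 and $2
  flash "$1" "$2"
done
```

Printing the four commands first makes it easy to double-check that no bridge-chip index sneaks into the list before doing the real thing from the bootable USB drive.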


----------



## SloaneRanger

jeromeface

No problem with your BIOS version, you can flash safely. My new Gigabyte is a .42 BIOS, which means I have to wait for Razor's ASUS Mars II version, since he will mod the string so a .37 BIOS is recognised as a .42 on a .42 card. He said he's too busy at work at the moment.

I wonder whether the .37 and .42 cards differ in components, but everybody who flashed an old BIOS onto a new card got full-screen artefacts.

No PDL is a dream for me too!!


----------



## MKHunt

Quote:


> Originally Posted by *SloaneRanger*
> 
> No problem with your BIOS version, you can flash safely. My new Gigabyte is a .42 BIOS, which means I have to wait for Razor's ASUS Mars II version, since he will mod the string so a .37 BIOS is recognised as a .42 on a .42 card. He said he's too busy at work at the moment.
> 
> I wonder whether the .37 and .42 cards differ in components, but everybody who flashed an old BIOS onto a new card got full-screen artefacts.
> 
> No PDL is a dream for me too!!


The .42 BIOS version does have new hardware. I made a thread on it when I had an unfortunate series of RMA's early in the 590's life cycle.

http://www.overclock.net/t/1168598/the-590-has-a-new-vrm-scheme-with-pictures


----------



## SloaneRanger

MKHunt

Thanks a lot for your explanation! The VRMs on the first 590s were the "maillon faible" (the weak link), if you understand those words.
And NVIDIA's thinking was: welcome to the PDL in the driver as the prime-time solution.


----------



## Canis-X

Quote:


> Originally Posted by *MKHunt*
> 
> The .42 BIOS version does have new hardware. I made a thread on it when I had an unfortunate series of RMA's early in the 590's life cycle.
> 
> http://www.overclock.net/t/1168598/the-590-has-a-new-vrm-scheme-with-pictures


You never got around to posting up your results on that thread, how did the card perform?


----------



## MKHunt

Quote:


> Originally Posted by *Canis-X*
> 
> You never got around to posting up your results on that thread, how did the card perform?


Hahaha I never finished testing it! That was the time when each 28X+ driver completely destroyed what was stable before, but each driver also gave me performance improvements in the games I was playing, so I kept changing it up. I also had a lot of random shutdowns and thought it was my overclock. Turns out that melting a 2600k and frying a 590 on one mobo was too much for it! I guess now that I have a new mobo I could go to messing with it again. Though I'm not very comfortable with messing with card BIOSes just to up the voltage. Or does MSI beta adjust volts with 310 drivers?


----------



## Canis-X

Ah-ha! LOL

Melting a 2600K? Is that why you had to RMA your other 590?

No, I am not aware of any application that will allow for OTF voltage adjustment on the GTX590's still. From my research the only way it can be accomplished is to flash the card with a .42 version BIOS whose voltage you have modified in Nibitor. Sucks, but I suppose it could be worse! =-/


----------



## MKHunt

Quote:


> Originally Posted by *Canis-X*
> 
> Ah-ha! LOL
> 
> Melting a 2600K? Is that why you had to RMA your other 590?
> 
> No, I am not aware of any application that will allow for OTF voltage adjustment on the GTX590's still. From my research the only way it can be accomplished is to flash the BIOS with a .42 version BIOS that you have modified the voltage with Nibitor. Sucks, but I suppose it could be worse! =-/


Nah, the other 590 was just horrible luck. The 2600k got replaced and literally the day after I put the new 2600k in the system, my 590 shorted and melted my PSU leads. I can't help but wonder which event led to the mobo taking a dump though.


----------



## Canis-X

Oh wow...you must have ticked Murphy off or something for all of that to happen to you! Glad to hear that everything is working better for ya now though.


----------



## jeromeface

I'm excited to see Canis-X's results!


----------



## Canis-X

Ok, so I did it. I haven't fully tested it yet but I have successfully flashed the BIOS as shown below. So far everything feels fine and I am not experiencing any issues. I followed my instructions above exactly (I restate them here for emphasis though).




From the screen shots above the BIOS (x2) that I flashed is attached below:

*ATTACHMENT REMOVED DUE TO VOLTAGE ISSUES EXPERIENCED BY OTHERS*

The steps that I took are as follows:
Quote:


> Already have my thumbdrive handily set up to be bootable, with NVFlash and my VGA ROMs (x2).
> Download the latest NVidia drivers for MARSII and have them on the desktop
> Use NVFlash for windows and remove the protection from 1,2,4,5 from the list (*nvflash -r*).
> 0 is the first NF200 chip
> 1 is the GF110A
> 2 is the GF110B
> 3 is the second NF200 chip
> 4 is the GF110A
> 5 is the GF110B
> 
> Uninstall the NVidia drivers completely
> Reboot into safe mode and run Driver Fusion with elevated privileges
> reboot into the thumbdrive
> flash the GPU's (run A.rom and B.rom for both cards)
> *nvflash -4 -5 -6 -i1 A.rom* for the first card first chip
> *nvflash -4 -5 -6 -i2 B.rom* for the first card second chip
> *nvflash -4 -5 -6 -i4 A.rom* for the second card first chip
> *nvflash -4 -5 -6 -i5 B.rom* for the second card second chip
> 
> reboot into the OS (hopefully no brick occurred)
> install the new NVidia driver
> Reboot a couple of times to let the driver finish configuring itself
> test, test test


To flash the cards you will need NVFlash. If you want to change the voltages you will need Nibitor. To create a bootable USB drive follow RagingCain's instructions here --> http://www.overclock.net/t/1063263/gtx-590-flashing-overclocking-thread/0_30#post_14179341, and you will need his USB-BOOTIMAGE folder and the HP DISK STORAGE FORMAT UTILITY below:

HP DISK STORAGE FORMAT UTILITY.zip 1812k .zip file


USB-BOOTIMAGE.zip 556k .zip file


----------



## Wogga

i'll try this evening. mine are also .37
now i have to remember what freqs were on 0.988v


----------



## Canis-X

Wow! I'm actually seeing my GPU's go over 40C for the first time!!! 42C is the highest I've witnessed so far monitoring MSI AfterBurner on my phone.









Edit:
Quote:


> P18732 with NVIDIA Dual GeForce GTX 580 - Asus Mars II(4x) and Intel Core i7-3930K Processor
> Graphics Score 21681
> Physics Score 14729
> Combined Score 11618


http://www.3dmark.com/3dm11/5654920



My previous best:
Quote:


> http://www.3dmark.com/3dm11/3749371
> P18724 with NVIDIA GeForce GTX 590(4x) and Intel Core i7-3930K Processor
> Graphics Score 21195
> Physics Score 16017
> Combined Score 11553


----------



## jeromeface

@Canis-X Grats on the successful flash. With your temps I'm assuming you are on water cooling. With me using the stock air coolers still would this cause me problems with this process if you were to guess?

@pedropc I'm assuming you are also using water blocks, would you not flash 590's to mars 2 if you were using air coolers?


----------



## Canis-X

Quote:


> Originally Posted by *jeromeface*
> 
> @Canis-X Grats on the successful flash. With your temps I'm assuming you are on water cooling. With me using the stock air coolers still would this cause me problems with this process if you were to guess?
> 
> @pedropc I'm assuming you are also using water blocks, would you not flash 590's to mars 2 if you were using air coolers?


Yes, I am all water cooled, so I wouldn't think you want to try this on air. I have a dual loop setup.

CPU loop = CPU, Motherboard cooled by a 3x120mm rad in push only
GPU loop = both GTX590's full board blocks cooled by a 4x120mm rad in push/pull

You should definitely read RagingCain's thread to get an idea of what you can do though. It is kinda old and references old drivers but you can get a really good idea of what you can try. Just remember to take small steps and test a lot. Watch your temps constantly....don't take your eyes off of them.

http://www.overclock.net/t/1063263/gtx-590-flashing-overclocking-thread/0_30

Edit:

Last run of the night folks....crashed and burned when I tried to run my CPU at 5.125GHz....LOL.....SOOOOOooooo 5.0Ghz is as good as I'm gonna attempt tonight. I killed my old score too!!








Quote:


> http://www.3dmark.com/3dm11/5655049
> *P19242* with NVIDIA Dual GeForce GTX 580 - Asus Mars II(4x) and Intel Core i7-3930K Processor
> Graphics Score *21921*
> Physics Score *15845*
> Combined Score *12064*


----------



## Wogga

Try lower memory clocks. For me mem clocks higher than stock lead to lower performance.


----------



## Canis-X

I know that it may sound like a silly question but, GPU or CPU memory clocks?


----------



## brettjv

Assuming he's talking to you Canis, I think he must mean system memory. However, this is really not a 'CPU overclocking' thread, so ... let's not get all off on a tangent


----------



## Canis-X




----------



## MKHunt

Quote:


> Originally Posted by *brettjv*
> 
> Assuming he's talking to you Canis, I think he must mean system memory. However, this is really not a 'CPU overclocking' thread, so ... let's not get all off on a tangent


Actually he means GPU. Cain, Wogga and I were among the few in the 6990 v 590 thread from way back when. I think technically I still have stewardship over that thread lol. Both Wogga and I noticed that memory clocks at the stock setting, or slightly over stock, were a real sweet spot. Also, try dropping the core a few MHz if you want to. The 590 seems to respond best at certain voltage and MHz numbers. You might be 100% stable at a certain core clock but dropping the hertz would, in some cases, make scores climb a bit.

Quote:


> Originally Posted by *Canis-X*


I am a bit jelly. Makes me wish someone would do this for the .42 BIOS now. If I could make my Sabertooth board not run my 2600k at full volts 24/7 I'd be tempted to O/C the beast again lol. Or maybe I'll just buy a 3770k and maybe that will fix it.


----------



## Pedropc

Canis-X, I'm glad that everything has gone well. You don't see the PDL in 3DMark11? Greetings.


----------



## Canis-X

Quote:


> Originally Posted by *MKHunt*
> 
> Actually he means GPU. Cain Wogga and I were among the few in the 6990 v 590 thread from way back when. I think technically I still have stewardship over that thread lol. Both Wogga and I noticed that memory clocks at the stock setting, or slightly over stock, was a real sweet spot. Also, try dropping the core a few MHz if you want to. The 590 seems to respond best at certain voltage and MHz numbers. You might be 100% stable at a certain core clock but dropping the hertz would, in some cases, make scores climb a bit.
> I am a bit jelly. Makes me wish someone would do this for the .42 BIOS now. If I could make my Sabertooth board not run my 2600k at full volts 24/7 I'd be tempted to O/C the beast again lol. Or maybe I'll just buy a 3770k and maybe that will fix it.


I thought that was what he was referring to, but just wanted to clarify.

Absolutely, I plan on testing this out some more this evening!







I usually run the memory at stock, but I will try bumping it slightly above stock and retesting.

Making this for the .42 BIOS would definitely be nice. I thought OccamRazor was working to do just that, but I haven't heard back from him for some time now.
Quote:


> Originally Posted by *Pedropc*
> 
> Canis-X, I'm glad that everything has gone well, you do not see the PDL in 3DMark11?, Greetings.


Pedro!!! You are awesome man!! Really appreciate you coming back in here and posting. It really spurred me to just do it!! No PDL, everything ran smooth and I beat my old score, so I can't complain!









Thanks again!!


----------



## SloaneRanger

Quote:


> Originally Posted by *MKHunt*
> 
> Actually he means GPU. Cain Wogga and I were among the few in the 6990 v 590 thread from way back when. I think technically I still have stewardship over that thread lol. Both Wogga and I noticed that memory clocks at the stock setting, or slightly over stock, was a real sweet spot. Also, try dropping the core a few MHz if you want to. The 590 seems to respond best at certain voltage and MHz numbers. You might be 100% stable at a certain core clock but dropping the hertz would, in some cases, make scores climb a bit.
> I am a bit jelly. Makes me wish someone would do this for the .42 BIOS now. If I could make my Sabertooth board not run my 2600k at full volts 24/7 I'd be tempted to O/C the beast again lol. Or maybe I'll just buy a 3770k and maybe that will fix it.


ah ! .42bios MARS II awaited; it would be awesome ! who can ? OccamRazor for sure !
wait & see


----------



## Pedropc

Canis-X;

If you install Asus GPU Tweak it lets you update the BIOS of the VGAs from AS07S and AS07M to AS08S and AS08M; there is no problem and they work better. Greetings.

http://imageshack.us/photo/my-images/856/55753959.png/


----------



## Canis-X

I did do the update last night. I then modded the BIOS to reduce the clocks and voltages with Nibitor and flashed the cards again using NVFlash. I'm too afraid that I'd forget to reduce the voltage and clocks before jumping into a game and fry the card, so I prefer to have them set lower.


----------



## Wogga

Flashed and haven't burned the cards yet =)


----------



## Canis-X

Nice Wogga!







I modded the voltage on mine further to 1.013v as depicted in Pedro's screenshots. Doing so allowed me to OC the GPUs up to 750Mhz so far. At .988v I was successful running 3DMark11 at 720Mhz but crashed at 730Mhz.

@ Pedro: You mentioned that you ran your card successfully at 1.038v with no issues? I only ask because your screenshots only show you running at 1.013v.


----------



## SloaneRanger

Quote:


> Originally Posted by *MKHunt*
> 
> The .42 BIOS version does have new hardware. I made a thread on it when I had an unfortunate series of RMA's early in the 590's life cycle.
> 
> http://www.overclock.net/t/1168598/the-590-has-a-new-vrm-scheme-with-pictures


Attention: as MKHunt said, the .42 cards have modified components. So, for people flashing their .37 bulk card to the ASUS Mars II: don't forget the VRMs are the weak point. With a .42 card there are no more VRM issues. Look at MKHunt's thread for more information before something smells burnt.









i had a .37 GIGABYTE that burned, got a .42 as RMA








What I see is that even with stock air cooling the new card overclocks really easily. I remember it was not the same with the old one, using the same process.


----------



## Wogga

lol, if earlier i could run [email protected], now 3DMark11 benches only at [email protected]
but still got higher result compared to [email protected]
*P16883* vs *P16644*


----------



## Canis-X

Quote:


> Originally Posted by *Wogga*
> 
> lol, if earlier i could run [email protected], now 3DMark11 benches only at [email protected]
> but still got higher result compared to [email protected]
> *P16883* vs *P16644*


Yeah, I observed the same thing on my rig; I had to run at much higher freqs to obtain equal scores. The only difference that I can think of, aside from the different BIOS, is that I am currently running a much newer driver. I really need to try installing the same driver that I got my best score with on the GTX 590 BIOS and see if my scores are better/same/worse than with my current driver.


----------



## Canis-X

Tried with less mem freq and got a slightly better total score...

http://www.3dmark.com/3dm11/5664484



^^ Trying a new thing with my name tag there....what do you think?


----------



## Pedropc

Yes, I've run 3DMark11 at 1.038 V and 782 MHz hassle-free; for 24/7 I have it at 750 MHz with 1.013 V and it is perfect. Greetings.


----------



## Canis-X

Nice Pedro!! Thanks for the update. Where are you from friend?


----------



## Pedropc

I'm from Spain.

For that run I did nothing but flash the BIOS, on air; now I'm on liquid cooling (RL).

http://imageshack.us/photo/my-images/401/3dmarkmarsii.png/


----------



## Canis-X

Sweet score Pedro!! Very nice! I'll try that voltage/clock out later this week!


----------



## Wogga

Quote:


> Originally Posted by *Canis-X*
> 
> ^^ Trying a new thing with my name tag there....what do you think?


better than previous


----------



## jeromeface

Quote:


> Originally Posted by *Pedropc*
> 
> I'm from Spain
> 
> This did nothing but flash the bios and air, now I'm with RL.
> 
> http://imageshack.us/photo/my-images/401/3dmarkmarsii.png/


So it runs on air without getting too hot gaming? Thanks in advance! +Rep


----------



## Pedropc

In games it runs at 76° on air; in 3DMark11, 88-89°. Greetings.


----------



## mironccr345

Quote:


> Originally Posted by *Pedropc*
> 
> I'm from Spain
> 
> This did nothing but flash the bios and air, now I'm with RL.
> 
> 
> http://imageshack.us/photo/my-images/401/3dmarkmarsii.png/


Miss my 590.







wish it had more vram, then I'd still have it!


----------



## Wogga

Tried to fold and the system shut off after 5-6 minutes. GPU voltage in HWinfo was 1.075v on each. Now the PC smells like burnt PCB but it works fine w/o load =D

No, it's not fine =(
Got artifacts with shadows in WoT


----------



## Canis-X

You used the BIOS that I attached? Did you change the voltage in Nibitor? What did you set it to?









RagingCain stated in his OC thread that the "hypothetical maximum" for a waterblocked card was 1.063v. Pedro stated that the max he tested with the stock air coolers was 1.038v. The BIOS that I provided was set to .988v and I have personally tested 1.013v with success.
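As a rough sanity check on why those voltage steps matter: dynamic power scales roughly with f·V². Plugging in the clock/voltage pairs reported earlier in the thread gives a back-of-envelope estimate (it ignores leakage and VRM losses, so these are rough ratios, not measurements):

```shell
#!/bin/sh
# Relative dynamic power (P ~ f * V^2) for settings reported in this thread,
# against the conservative 720 MHz @ 0.988 V profile as the baseline.
awk 'BEGIN {
  base = 720 * 0.988 ^ 2
  printf "750 MHz @ 1.013 V: %+.1f%%\n", (750 * 1.013 ^ 2 / base - 1) * 100
  printf "782 MHz @ 1.038 V: %+.1f%%\n", (782 * 1.038 ^ 2 / base - 1) * 100
}'
# prints roughly +9.5% and +19.9%
```

So the 1.038 V benchmark setting pulls on the order of 20% more dynamic power than the 0.988 V profile, which is why the cooling and PSU headroom discussed in this thread matters so much.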


----------



## Wogga

It was your BIOS without any modifications. Now it shows 0.988v. Maybe it's a bug in HWiNFO, but the power supply definitely turned the system off because of a power draw it couldn't handle.


----------



## D8V1D2012

Quote:


> Originally Posted by *Wogga*
> 
> It was your bios without any modifications. Now it shows 0.988v. Maybe its a bug of hwinfo, but power supply definitely turned the system off because of power draw it couldn't handle


Same here... Used the same BIOS that was posted by Canis-X on page 473. When I first ran the benchmark without any adjustment, my GTX 590's voltage was at 1.088v; I immediately closed out. But when I went back into the benchmark again it was at 0.988v, not sure why. After running the benchmark for less than a minute, my whole computer shut down. My motherboard now won't even start unless the GTX 590 is removed.

I think the card is fried; the 1.088v must have killed it in the beginning, because I have the EVGA Hydro Copper GTX 590 watercooled and the temperatures never got over 47 degrees.

Well, there goes $1000 down the drain....


----------



## Bezna

Im sorry to hear about your loss...

I was that close to messing with the BIOS as well, but was always hesitant to do it for fear of bricking or frying the card.


----------



## Bezna

btw whats the deal with the club, still no update for other owners? spread sheet is still not updated, and sig code isn't available.
BLAHHHh


----------



## Canis-X

WOW guys I'm sorry that you experienced issues with that!!! I followed the steps that I listed exactly and used the exact same BIOS that I uploaded so I am at a loss for what the issue is. I have removed the attachment and won't repost it for fear of others experiencing the same issue. I was pretty confident that it was 100% safe as I tested several runs of 3DM11 watching the voltage and the freqs closely and everything stayed exactly as I had set it.. I'm really sorry, I don't think that I'll post up anything like that again for fear of issues like this.


----------



## D8V1D2012

So sad... Tried putting the card back in 3 times today to see if it would work. Motherboard won't even boot with the GTX 590 in there.

I wonder what would happen if I RMA the card to EVGA? Do y'all think they would know that it was overclocked and has an ASUS Mars II BIOS on there? I think the card is totally fried though, so maybe there's no way they would find out.


----------



## Canis-X

Man, I can't tell you to do that. That is 100% your decision. We all take risks when we do mods like this and I should have removed the BIOS as soon as I saw Wogga's post. I really thought that it was just a fluke given my perfectly fine tests and results. In any event, at the risk of sounding cliché, you knew full well there was the potential to brick or blow up your card doing this. Even with how much it sucks.









Anyone that downloaded my BIOS please destroy it and do not re-post it.


----------



## D8V1D2012

Yah, I knew what all the risks were. I don't think I'm going to RMA the card though. I would just feel bad about it, because it's my fault I killed the card, not EVGA's.


----------



## Canis-X

Cool. I don't know what to say to you guys....I'm really sorry to have caused harm to anyone's equipment!! Man, I just feel like "four asterisk word".


----------



## MKHunt

Quote:


> Originally Posted by *D8V1D2012*
> 
> Yah, I knew what all the risks were. I don't think I'm going to RMA the card though. I would just feel bad about it, because it's my fault I killed the card, Not EVGA.


Sometimes they can repair them for a nominal fee. Might be worth a try if you explain what happened.


----------



## D8V1D2012

So I tell EVGA...

"...so I decided to put a totally different BIOS on my EVGA GTX 590 Hydro Copper, to be specific the ASUS Mars II BIOS. After the BIOS was installed, I ran EVGA OC Scanner X to run a stress test. But then I suddenly saw the voltages were way too high at 1.088v. I stopped the test immediately. Then I decided to run it again because I couldn't believe that it was running at 1.088v. Once I ran the stress test again it was only at 0.988v (not sure why the voltage changed) and temperatures never got over 47 C. But after less than 1 minute of running the stress test my computer totally shut down. Now my motherboard will not start if the GTX 590 is installed. I tried numerous times, installing the GTX 590 back on my motherboard, but it causes my system to not boot."

I wonder what their response will be.

Probably something along the lines of why the F*** did you flash an ASUS Mars II BIOS on your GTX 590!

LOL this will be interesting.


----------



## mironccr345

Quote:


> Originally Posted by *D8V1D2012*
> 
> So I tell EVGA...
> 
> "...so I decided to put a total different BIOS on my EVGA GTX 590 Hydro Copper. To be specific the ASUS Mars II BIOS. After the BIOS was install, I ran EVGA OC Scanner X to run a stress test. But then I suddenly saw the voltages were way too high at 1.088v. I stopped the test immediately. Then I decided to run it again because I didn't couldn't believe that it was running at 1.088v. Once a ran the stress test again it was only at 0.988v (not sure why the voltage changed) and temperatures never got over 47 C. But after less than 1 minute of running the stress test my computer totally shut down. Now my motherboard will not start if the GTX 590 is installed. I tried numerous times, installing the GTX 590 back on my motherboard, but it causes my system to not boot."
> 
> I wonder what their response will be.
> 
> Probably somewhere in the lines of why the F*** did you flash an ASUS Mars II BIOS on your GTX 590!
> 
> LOL this will be interesting.


Oh man, good luck with that.


----------



## MKHunt

Quote:


> Originally Posted by *D8V1D2012*
> 
> So I tell EVGA...
> 
> "...so I decided to put a total different BIOS on my EVGA GTX 590 Hydro Copper. To be specific the ASUS Mars II BIOS. After the BIOS was install, I ran EVGA OC Scanner X to run a stress test. But then I suddenly saw the voltages were way too high at 1.088v. I stopped the test immediately. Then I decided to run it again because I didn't couldn't believe that it was running at 1.088v. Once a ran the stress test again it was only at 0.988v (not sure why the voltage changed) and temperatures never got over 47 C. But after less than 1 minute of running the stress test my computer totally shut down. Now my motherboard will not start if the GTX 590 is installed. I tried numerous times, installing the GTX 590 back on my motherboard, but it causes my system to not boot."
> 
> I wonder what their response will be.
> 
> Probably somewhere in the lines of why the F*** did you flash an ASUS Mars II BIOS on your GTX 590!
> 
> LOL this will be interesting.


Worst case scenario: you end up exactly where you are now









I know the pain of a 590 dying.


----------



## Bloodbath

Quote:


> Originally Posted by *D8V1D2012*
> 
> So I tell EVGA...
> 
> "...so I decided to put a total different BIOS on my EVGA GTX 590 Hydro Copper. To be specific the ASUS Mars II BIOS. After the BIOS was install, I ran EVGA OC Scanner X to run a stress test. But then I suddenly saw the voltages were way too high at 1.088v. I stopped the test immediately. Then I decided to run it again because I didn't couldn't believe that it was running at 1.088v. Once a ran the stress test again it was only at 0.988v (not sure why the voltage changed) and temperatures never got over 47 C. But after less than 1 minute of running the stress test my computer totally shut down. Now my motherboard will not start if the GTX 590 is installed. I tried numerous times, installing the GTX 590 back on my motherboard, but it causes my system to not boot."
> 
> I wonder what their response will be.
> 
> Probably somewhere in the lines of why the F*** did you flash an ASUS Mars II BIOS on your GTX 590!
> 
> LOL this will be interesting.


The only thing you should tell EVGA is that the card is faulty and let them test it. If the card is not recoverable they may not be able to detect the modified BIOS, and you'd get a new card, but you've just blown that chance by telling them everything. Only tell them what they need to know, nothing more.


----------



## Canis-X

^ There is always that option.


----------



## Wogga

lol
I flashed the previous BIOS (which I'd used for 1.5 years), ran Sims 3, and then one of my cards fried. The old one, with the ripped-out capacitor, without which it had worked great for 2 years =)
But one card is still alive and can even run benchmarks and other hard stuff.
The positive thing is that the survivor is 34C under load and my electricity bills will be smaller


----------



## emett

Sound advice Bloodbath.


----------



## MKHunt

Quote:


> Originally Posted by *Wogga*
> 
> lol
> i flashed previous bios (wich i used for 1.5 years), run Sims 3 and then one of my cards fried. old one, with ripped out capacitor without wich it worked great for 2 years =)
> but one card is still alive and can even run benchmarks and other hard stuff.
> the positive thing is that survivor is 34C under load and my electricity bills will be smaller


Maybe the 590 uses tubes instead of transistors and now it um... has completed its burn-in stage!


----------



## Wogga

at least i tried =)
hope 1 590 will be enough for next few years


----------



## Bezna

It should be fine for at least 2 years for decent gaming. For everything else it should last you much longer!


----------



## Canis-X

For sure!!


----------



## D8V1D2012

Hey Wogga, are your GTX 590s okay still, after flashing the Mars II BIOS?


----------



## Bloodbath

Quote:


> Originally Posted by *emett*
> 
> Sound advice Bloodbath.


Seriously, why tell them? There's a 50/50 chance of getting a new card. If they ask, just say "BIOS? What's a BIOS?"


----------



## Pedropc

I've been on the Mars II BIOS for six months without problems.

I flashed the Mars II BIOS unmodified, then installed the Asus GPU Tweak program and set the voltage to 1.013 V (750 core / 1002 memory), so the voltage is fixed when Windows starts.

So far I have not had any problems with the VGA, but the maximum voltage I'd risk is 1.038 V. Never go higher.

http://imageshack.us/photo/my-images/826/3dmark11perfomance.png/


----------



## Wogga

Quote:


> Originally Posted by *D8V1D2012*
> 
> Hey Wogga, are your GTX 590s okay still, after flashing the Mars II BIOS?


One of them died after flashing the modified 590 BIOS and the other is still alive.
As I understand it, there was a short circuit where the ripped-out capacitor was. After that I immediately powered off the system and smelled burnt PCB. After powering back on, both cards still worked, but the first hard task killed that card.
So... the problem isn't only the BIOS.


----------



## D8V1D2012

Quote:


> Originally Posted by *Wogga*
> 
> one of them died after flashing modified 590 bios and other is still alive.
> as i understand there was a short circuit in the place where ripped capacitor was. after that i immediately powered off the system and felt a smell of burnt PCB. after powering on both cards still worked but first hard task killed that card.
> so...the problem isnt only in bios


Yeah, I don't think there's anything wrong with the Mars II BIOS on a GTX 590.
I think the initial voltage spike right after flashing the BIOS is probably what hurt the card the most; 1.088 V is a lot for it to handle. And I'm not sure why it reverted back to 0.988 V when trying to run the second test. I just wanted to run a stress test on the card to see how compatible the BIOS was.


----------



## D8V1D2012

Quote:


> Originally Posted by *Pedropc*
> 
> I took six months to Mars II BIOS without problems.
> 
> I put the bios of the Mars II unmodified, then install the Asus GPU Tweak program and set the voltage to 1.013 V (750 core/ 1002 memory). So when you start windows fixed voltage.
> 
> So far I have not had any problems with the vga, but the maximum voltage I'd rate is 1.038 V. Never put more.


Those are some good tips for people who want to do this mod.

Just download an unmodified Mars II BIOS.


----------



## SloaneRanger

Quote:


> Originally Posted by *D8V1D2012*
> 
> Those are some good tips for people who want to do this this mod.
> 
> Just download an unmodified Mars II BIOS.


Hello,

Damn... another 590 burning... But: did you have overclocking software running before flashing, set to auto-start with Windows? Or did you think to reset its settings before flashing?

Another well-known problem: as MKHunt said, cards with the .37 BIOS version have inductors that are working at their limit; only the .42 cards got stronger inductors. I say inductors, not VRMs.

So maybe keep warning people about modding a .37 card to the ASUS Mars II BIOS.


----------



## Komits

Hi, I need BIOS A and B for my Gigabyte GTX 590, if anyone could spend 5 minutes to help me!
Please, Skyrim is waiting for me badly.


----------



## D8V1D2012

Quote:


> Originally Posted by *SloaneRanger*
> 
> Hello,
> 
> damn......another 590 burning .....but : did you had a overclocking software running before flashing WITH auto starting at windows boot ? or did you thought to reset setting before flashing ?
> 
> Another wellknown problem : as MKHunt said, .37 bios version card has inductors that are working at their limit...........only .42 card got stronger inductors. i say inductor & not vrm.
> 
> Maybe so keep warning with modding .37 card to ASUS MARS II


No, I never overclocked my GTX 590, since I'd heard about how badly it went and how many people killed their cards.
I had it running at stock speeds before the flash, and yes, my card was a .37 BIOS version too.


----------



## SloaneRanger

Quote:


> Originally Posted by *D8V1D2012*
> 
> No, I never overclocked my GTX 590 since I heard about how bad it was, and so many people killing there cards.
> I had it running at stock speeds before the flash and yes my card was a .37 BIOS version too


OK,
so: the ASUS Mars II BIOS has no power-draw limiter, which is fine for a Mars II card or a .42 card. The limiter's job is to reduce frequencies when power draw gets too high. With my .37 Gigabyte card, I remember overclocking it just a little and it froze... until it burned.
After the RMA I got a .42 Gigabyte card, and what I can say is: overclocking is easy, even with stock air cooling.

RMA your card; you'll get a .42 and you'll enjoy it.


----------



## Canis-X

I tried the unmodified Mars II BIOS last night, both the one from TechPowerUp and the BIOS update that Pedro mentioned, and my GPU voltages were all over the place. No matter which program I used, I could not keep them down while sitting idle on the desktop. The voltages would range from 0.963 V to 1.050 V, 1.063 V and 1.075 V, but would rarely hit the voltage that I had set in GPU TweakIt, 1.013 V.







This freaked me out too much, so I put my GTX 590 .42 BIOS back on, and I am now back at 0.963 V. I don't really intend to bench 3DM11 anymore; I doubt I'd break my goal of 20,000 even if I ran at Pedro's recommended 1.038 V to gain the extra 30 or so MHz on the GPUs.

I'm definitely going to try to get the #1 spot in 3DM06 for quad GTX 590s though. I was only 213 points off with my last run on the .42 BIOS at 730 MHz/1.025 V, so I am going to try 1.038 V to see if I can squeeze a few more MHz out of the cards; maybe that will get me over the hump.









Edit: Bear in mind that I flashed the BIOS using NVFlash within Windows this time, so that could have caused some inconsistencies with the voltages, but the BIOS update via GPU TweakIt should have corrected that, I would think. Not sure though.


----------



## D8V1D2012

At this point it's probably not worth the risk to put the Mars II BIOS on a GTX 590.

Ruining your card is not worth the marginal boost.


----------



## Komits

Hi guys, long story short:

Almost from the beginning I've had problems with my Gigabyte GTX 590: overheating, the fan not speeding up, that kind of thing. So I read a little and had the idea of loading a different BIOS. I used GPU-Z to save my BIOS, downloaded a BIOS from TechPowerUp, and tried to flash it using nvflash in Win7. After a restart the card would not boot, and I was left with the integrated graphics.
So...

After reading a little more about BIOS flashing, I learned that this card has 2 BIOSes, one for each of its 2 GPUs.

I tried to recover the old BIOS using nvflash in DOS. I have the gf110.rom that I backed up from the card, and got an error.

Trying to flash BIOS nr 1, I got this:

current version 70.10.37.00.01 ID10DE 1088:10DE:0868
GF110 BOARD 102015PO (SWITCH PORT 0)
version from file 70.10.42.02
ERROR: MISMATCH AT OFFSET 0X00000036

For BIOS nr 2 the numbers are exactly the same (70.10.42.00.02):
file firmware image matches adapter firmware

The BIOS now installed on GPU 1 is the same one I downloaded from TechPowerUp, so I "successfully" flashed the card with that BIOS, and now my card is dead.

I need to flash BIOS 1 with the original BIOS, which I do not have.

Can you please upload a Gigabyte BIOS for the GTX 590?

Or are there different methods to repair my card?

Please help,

Michael


----------



## SloaneRanger

Quote:


> Originally Posted by *Komits*
> 
> Hi guys , short story long
> 
> I almost from the begining have problem with my Gigabyte GTX590, overheating , fan do not accelerate that kind of things, so i read a little and have an idea about load different BIOS i use GPU-Z to save bios and download bios from techpowerup and try to upload using nvflash in win7, after restart card do not boot up and i was left with integrated card.
> So...
> 
> After i read about bios flashing a little more i learn that this card have 2 BIOS for 2 integrated card .
> 
> I try to recover old bios in card using nvflash in DOS. I have gf110.rom which i backup from card and got this error.
> 
> I try to flash bios nr 1 and got this
> 
> current version 70.10.37.00.01 ID10DE 1088:10DE:0868
> GF110 BOARD 102015PO (SWITCH PORT 0)
> verison from file 70.10.42.02
> ERROR: MISMATCH AT OFFSET 0X00000036
> 
> bios nr 2
> numbers are exactly the same 70.10.42.00.02
> file firmware image matches adapter firmware
> 
> Bios which is instaled in card 1 is the same which i download from techpowerup so i succesfully flashed card with that bios and now my card is dead.
> 
> I have to flash my bios 1 on card with oryginal bios which i do not have.
> 
> Can you please upload Gigabyte bios for GTX 590.
> 
> Or thers are diffrent methods to repair my card
> 
> Please help
> 
> Michael


Hello,
my card is a .42 Gigabyte, so if you need an original or modded BIOS, just ask.


----------



## Komits

Thanks for the reply. I need the original BIOS files for GPUs 1 & 2; can you upload them somewhere?


----------



## rush2049

Quote:


> Originally Posted by *Komits*
> 
> Thanks for the reply, I need the original BIOS file for GPU 1&2 , can you can upload it somewhere?


http://www.overclock.net/t/1063263/gtx-590-flashing-overclocking-thread/0_20

That should have all the original BIOSes you need...


----------



## Komits

The files have been deleted from MediaFire, unfortunately.


----------



## rush2049

Let me PM RagingCain and see about getting those files to you.

edit

Ah, sorry Komits, but it seems RagingCain doesn't have them anymore. I would get you the ones I have saved on my desktop but I am traveling for the next few weeks and don't have them with me.


----------



## SloaneRanger

GYGABYTE 70.10.42.00.02 original.zip 93k .zip file


Komits: don't panic, I'm around.


----------



## alancsalt

Quote:


> Originally Posted by *Komits*
> 
> Hi guys , short story long
> 
> I almost from the begining have problem with my Gigabyte GTX590, overheating , fan do not accelerate that kind of things, so i read a little and have an idea about load different BIOS i use GPU-Z to save bios and download bios from techpowerup and try to upload using nvflash in win7, after restart card do not boot up and i was left with integrated card.
> So...
> 
> After i read about bios flashing a little more i learn that this card have 2 BIOS for 2 integrated card .
> 
> I try to recover old bios in card using nvflash in DOS. I have gf110.rom which i backup from card and got this error.
> 
> I try to flash bios nr 1 and got this
> 
> current version 70.10.37.00.01 ID10DE 1088:10DE:0868
> GF110 BOARD 102015PO (SWITCH PORT 0)
> verison from file 70.10.42.02
> ERROR: MISMATCH AT OFFSET 0X00000036
> 
> bios nr 2
> numbers are exactly the same 70.10.42.00.02
> file firmware image matches adapter firmware
> 
> Bios which is installed in card 1 is the same which i download from techpowerup so i successfully flashed card with that bios and now my card is dead.
> 
> I have to flash my bios 1 on card with original bios which i do not have.
> 
> Can you please upload Gigabyte bios for GTX 590.
> 
> Or there are different methods to repair my card
> 
> Please help
> 
> Michael


Isn't it just a matter of copying BIOS 2, reflashing BIOS 1 with that, and setting the flash to override the mismatch?
Quote:


> ..... if there is protection that stops you then try again with nvflash -4 -5 -6 (biosname).rom
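As a sketch of what that recovery might look like (hedged: this assumes the usual DOS/Windows nvflash build with its `--list`, `--index` and `--save` options, plus the `-4 -5 -6` override switches from the quote above; the filenames are placeholders, and force-flashing the wrong image onto real hardware can kill the card):

```shell
nvflash --list                             # a GTX 590 should report two GF110 adapters (index 0 and 1)
nvflash --index=0 --save gpu0_backup.rom   # back up GPU 0's current BIOS before touching anything
nvflash --index=1 --save gpu1_backup.rom   # back up GPU 1's BIOS as well
nvflash --index=0 -4 -5 -6 original.rom    # force-flash GPU 0, overriding the ID/board mismatch checks
```

If nvflash refuses because the adapter is still active under Windows, the DOS route is the safer one, as Komits found.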


----------



## Komits

Unfortunately, the card has 2 different BIOSes:

http://www.evga.com/forums/tm.aspx?m=1138813&mpage=1&print=true

I tried to flash the second BIOS onto the first GPU with the settings you suggested, and nvflash shows a mismatch error.

I'm afraid to use a modded BIOS, because before the BIOS crash the card had some weird issues, like 100% core load in the game menus of WoT, BF3 and Total War, which ended in a reset after hitting 90°C with the fan stuck at 55% speed (per MSI Afterburner).
That's why I started this whole BIOS operation in the first place.


----------



## SloaneRanger

Quote:


> Originally Posted by *Komits*
> 
> Unfortunately, the card has 2 different BIOSes
> 
> http://www.evga.com/forums/tm.aspx?m=1138813&mpage=1&print=true
> 
> i try to upload second bios with settings You suggest to first and nvflash shows error mismatch
> 
> i am afraid to use modded, because card before bios crash had some weird issues, like 100% core load in game menus WoT BF3 and TotalWar
> which came to reset after get 90C and fan stuck on 55% of speed at that. (MSI Afterburner display)
> I start whole bios operation because of that.


Komits: why are you dealing with EVGA, when I posted the original Gigabyte .42 BIOSes for you?
I also keep the .37 version, from my first Gigabyte card that burned; I got a .42 card as the RMA replacement.


----------



## Komits

Thanks for the upload, the card is alive!!!!

That link was only info showing the GTX 590 has 2 different BIOSes, since alancsalt was wondering about flashing the BIOS from GPU 2 onto GPU 1.

But the fan isn't working right: the card overheats and restarts the PC. After the restart I can clearly hear the fan speed up to cool the card, but in-game it sits at 50%. Is it possible to mod the BIOS to drive the fan differently?


----------



## SloaneRanger

Quote:


> Originally Posted by *Komits*
> 
> Thanks for upload, card is alive!!!!
> 
> this link was only info that gtx590 have 2 diffrent bioses, alancsalt wondering about using bios from 2 gpu to gpu1
> 
> but fan isnt working good , card gets overheated and restart the pc , after restart i clearly hear that fan is speed up to cool the card but in game it is working at 50% is it possible to mod bios to use fan in different manner?


With a locked BIOS, the fan range is 40% -> 95%.
If unlocked: 40% -> 100%.

How to do it: just use Afterburner! Install it, set the fan speed, apply... and go on. You'll stay at 95% for any game.
Want more [100%]? That will be the next lesson, the next level.

Cheers


----------



## alancsalt

My son's P8z68Deluxe/Gen3/2600K/EVGA GTX 590 has been doing 9F bluescreens lately.
Quote:


> This was probably caused by the following module: ntkrnlmp.exe (nt!KeBugCheckEx+0x0)
> Bugcheck code: 0x9F (0x4, 0x258, 0xFFFFFA8006D94040, 0xFFFFF80000B9C3D0)
> Error: DRIVER_POWER_STATE_FAILURE
> Bug check description: This bug check indicates that the driver is in an inconsistent or invalid power state.
> This appears to be a typical software driver bug and is not likely to be caused by a hardware problem.
> The crash took place in the Windows kernel. Possibly this problem is caused by another driver that cannot be identified at this time.


Quote:


> 2 crash dumps have been found and analyzed. A third party driver has been identified to be causing system crashes on your computer. It is strongly suggested that you check for updates for these drivers on their company websites. Click on the links below to search with Google for updates for these drivers:
> 
> nvlddmkm.sys (NVIDIA Windows Kernel Mode Driver, Version 310.90 , NVIDIA Corporation)


He has uninstalled and re-installed, but it repeats. Any clues?


----------



## rush2049

Quote:


> Originally Posted by *alancsalt*
> 
> My son's P8z68Deluxe/Gen3/2600K/EVGA GTX 590 has been doing 9F bluescreens lately.
> 
> He has uninstalled and re-installed, but it repeats. Any clues?


Uninstall + do a driver sweep with CCleaner or similar. Remove all leftover bits from the drivers. Then re-install.

If that's what you already did, then bluescreen 9F refers to a failure after returning from hibernate, sleep, or standby mode, meaning that when the card is put into a low-power state it causes a BSOD when trying to wake up. In the NVIDIA Control Panel, under "Adjust image settings with preview", set it to use the advanced 3D image settings. Then go to that page and try setting the power management mode to something other than what it's currently on; "Prefer maximum performance" keeps the card always in its 3D clock state. See if that fixes it.


----------



## alancsalt

Quote:


> Originally Posted by *rush2049*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alancsalt*
> 
> My son's P8z68Deluxe/Gen3/2600K/EVGA GTX 590 has been doing 9F bluescreens lately.
> 
> He has uninstalled and re-installed, but it repeats. Any clues?
> 
> 
> 
> Uninstall + do a driver sweep with CCleaner or similar. Remove all leftover bits from the drivers. Then re-install.
> 
> If thats what you already did, then bluescreen 9F refers to a failure after returning from hibernate, sleep, or standby mode. Which means when the card is put in a low power state it causes a BSOD when trying to wakeup. In the nvidia control panel under "adjust image settings with preview" set it to use the advanced 3D image settings. Then go to that page. Try setting the power management mode to something other than you have it on. Prefer maximum performance would keep the card always on/in 3D clock mode..... see if that fixes it.
Click to expand...

Good advice. I got my son to do a custom install so he could select a clean install. Not quite as thorough, but we've been happily playing BF3 for the last couple of hours. All fixed by phone, because he's at uni doing engineering 120 miles away....


----------



## CryptiK

Quote:


> Originally Posted by *alancsalt*
> 
> Good advice. I got my son to do custom install so he could select clean install. Not quite as thorough, but we've been happily playing BF3 for the last couple of hours. All fixed by phone, because he's at uni doing engineering 120 miles away....


That's an awesome dad right there. Here's to you, sir.


----------



## Repco

Can someone tell me how to get the other core to work?

My benchmark results are pretty bad... They shouldn't be, for a GTX 590, I reckon.


----------



## grunion

Run full screen..


----------



## Repco

Quote:


> Originally Posted by *grunion*
> 
> Run full screen..


I did, but that doesn't work either! My benchmark results in full screen still show little movement on the other core.


----------



## grunion

Never mind, that's Kombustor..

Run Heaven or one of the 3DMarks to test SLI.


----------



## SloaneRanger

If you want to use both cores with MSI Kombustor, you can!

How to do it: inject a new profile into the NVIDIA driver; I did it and it works fine.

I'm actually at work so I can't give you the process right now, *but* I will when I'm back home.

You need a special tool to extract the existing profiles. The output will be a text file (easy to mod); you then use the same tool to reinject the profiles into the driver database.
Don't search for this text file inside the driver, it won't appear there!

So wait until I'm back home........


----------



## Repco

That will be much appreciated! Because my GTX 590 can't even play Battlefield on high settings!!! I'm about to throw the card out and buy myself an ARES II.


----------



## SloaneRanger

Quote:


> Originally Posted by *Repco*
> 
> That will be much appreciated! Cause my GTX590 can't even play Battlefield on High Settings!!! About to throw the card out and buy myself ARESII


 Geforce_SLI_Profile_Tool.zip 217k .zip file


Hello,

here is the tool!
Run it and extract the SLI profiles from your own drivers, then save the resulting txt file somewhere.
Open the txt file and copy-paste exactly this:

Profile "MSI Kombustor"
ShowOn GeForce
ProfileType Application
Executable "msikombustordx9.exe"
Executable "msikombustordx10.exe"
Executable "kloaderwin32.exe"
Executable "msikombustordx11.exe"
Executable "msi kombustor/heaven.exe"
Setting ID_0x00a06946 = 0x084000f5
Setting ID_0x00ae785c = 0x00000004
Setting ID_0x1033cec1 = 0x00000003
Setting ID_0x1033cec2 = 0x00000002
Setting ID_0x1033dcd2 = 0x00000004
Setting ID_0x1033dcd3 = 0x00000004
Setting ID_0x1095def8 = 0x02c00005
Setting ID_0x20441369 = 0x00000001
Setting ID_0x209746c1 = 0x04280001
Setting ID_0x20ebd7b8 = 0x00000020
EndProfile

Then close and save the edited txt file,
return to the tool to reinject this *modded* profile, and
launch your favorite burn-in tool.
What will you see then? You're welcome!


----------



## vagenrider

First of all, sorry if I'm posting in the wrong section.

I just want to ask if anybody knows which is the best driver for Windows 8 x64... Many people believe NVIDIA and AMD decrease performance in newer drivers to push you toward the new models.

And second: exactly how many watts does the 590 need?


----------



## MKHunt

Ugh, you guys, what have you done to me? Suddenly my CPU is overclocked again and I'm thinking of BIOS-editing my 590 to 0.963 V, since my stuff tops out at 41°C with the CPU overclocked. Y'all are bad influences.


----------



## c900712

I'm selling mine if anyone wants an MSI 590 (£250). Pick-up only, UK (Birmingham).


----------



## MKHunt

Edit: Well, my desire turned into 0.988 V and 720 MHz.


----------



## Revolution996

Here's mine....

Point of View GTX 590 ---- stock settings.

Add me please..

Revo.


----------



## alancsalt

Quote:


> Originally Posted by *Revolution996*
> 
> Heres mine....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Point of View GTX590 ---- Stock settings.
> 
> Add me please..
> 
> Revo.


Wow! I'd have thought a POV card would be a bit higher clocked, or do they do more than one version?
Would love to see a shot of voltage-enabled Afterburner with GPU-Z, both open?

Don't get to see many POV cards...


----------



## Revolution996

They do an Ultra Charged edition... didn't get that..

Revo.


----------



## MKHunt

Quote:


> Originally Posted by *Revolution996*
> 
> They do an Ultra Charged edition... didn't get that..
> 
> Revo.


Have you tried clocking your 955 higher? When I OC my 590, even my 2600K at 3.4 GHz becomes a 3-4 fps bottleneck in GPU-heavy benches such as the Unigine ones.


----------



## Revolution996

My 955BE is at a solid 4 GHz now, and it's happy there 24/7.
It has been up to 4.2 GHz, but it's not too stable with the heat. I'm going to get the new Swiftech H220 anyway, so I might have to have a play....

Going to try Haswell and a new mobo when they come out later this year. I think AMD has had its day, to be truthful; this 955 will be my last AMD, I think. Steamroller `may` change my mind, but we will see.

Revo.


----------



## Revolution996

Quote:


> Originally Posted by *alancsalt*
> 
> Wow! I'd have thought a POV card would be a bit higher clocked, or do they do more than one version?
> Would love to see a shot of a voltage enabled Afterburner with GPUZ, both open?
> 
> Don't get to see many POV cards...


Here you go...

Revo.


----------



## alancsalt

Quote:


> Originally Posted by *Revolution996*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alancsalt*
> 
> Wow! I'd have thought a POV card would be a bit higher clocked, or do they do more than one version?
> Would love to see a shot of a voltage enabled Afterburner with GPUZ, both open?
> 
> Don't get to see many POV cards...
> 
> 
> 
> Here you go...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Revo.
Click to expand...

Thanks Revolution996. They were conservative with the clocks on that model... I assume the price wasn't as crazy as their full-on models.


----------



## Revolution996

Quote:


> Originally Posted by *alancsalt*
> 
> Thanks Revolution996. They were conservative with the clocks on that model... I assume the price wasn't as crazy as their full on models.


I had artifacting issues and had to RMA my GTX 560 Ti three times over the Christmas period just gone..... they felt my pain and sent this back... not a bad return, eh?

So.... it effectively cost the price of a GTX 560 Ti plus the RMA postage three times..

All my games are now running on the top tier and gaming has never been better... (but I may go Haswell when it hits, just for the better CPU and mobo to go with this card).









Revo.


----------



## MKHunt

Quote:


> Originally Posted by *Revolution996*
> 
> I had Artifacting issues and had to RMA my GTX560ti three times over the Christmas period just gone.....they felt my pain and sent this back...not a bad return eh..?
> 
> So.... effectively costing the price of a GTX560ti and the RMA postage 3 times..
> 
> All my games are now running on the top tier and gaming has never been better...(but may go Haswell when it hits, just for the better CPU and Mobo with this card).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Revo.


Wow. Maybe I should RMA my 590 for a third time and hope they send a surprise 690? JK. EVGA is not PoV.


----------



## Canis-X

I am in the beginning stages of considering selling off my 590s, maybe around June since it's my b-day month.... I might scrounge up some $$ and upgrade. Has anyone sold one of these recently? What do you think I'd be able to get for them with the waterblocks?


----------



## iARDAs

Quote:


> Originally Posted by *Canis-X*
> 
> I am in the beginning stages of considering to sell off my 590's. Maybe around June since it is my B-Day month....might scrounge up some $$ and upgrade. Has anyone sold one of these recently? What do you think that I would be able to get for them with the WB's?


I'm no genius at pricing, especially in the USA, but the 590 is only a bit more powerful than a 680, and a used 680 might go for $425 on the marketplace.

Since the 590 consumes more power, is a dual-GPU (SLI-on-a-card) part and has 1.5 GB of VRAM per GPU, I would say somewhere around the $350 region.

And even at that price you'll most likely have to compete with the GTX 670, since it can be OCed a lot and can perform close to a 590, again with more VRAM, less power consumption, etc.

I have no idea about the waterblocks though.

When I sold my 590 I lost some money; it was very hard to sell, and I sold it right when the first Kepler reviews came in.

So yeah, you are going to compete heavily with the 670s and 680s on the second-hand market.


----------



## Revolution996

Hi...

Can anyone recommend a good, healthy CPU upgrade that won't bottleneck my GTX 590... (what do you guys have???)
I'm running an AMD Phenom II 955BE on an Asus 990FX Sabertooth board as we speak. If I get an FX-8350, then so be it, and I'll overclock it to the moon.... or, like I have stated previously, wait for Haswell and do a mobo upgrade too.
My Phenom 955BE is happy sat at 4 GHz (old girl), but I feel my GTX 590 deserves better, and I know it bottlenecks in some games.

I am not biased to who I use (AMD/INTEL), just want a CPU to match my GPU.

Metro2033 / Crysis / Sniper Elite V2 / Borderlands 2...etc etc.

Thanks all..

Revo.


----------



## iARDAs

Quote:


> Originally Posted by *Revolution996*
> 
> Hi...
> 
> Can anyone recommend a good healthy CPU upgrade that wont bottleneck with my GTX590...(what do you guys have...???)
> Running an AMD Phenom II 955BE on an Asus 990FX Sabertooth board as we speak, if I get an FX8350, then so be it and overclock it to the moon....or like I have stated previously, wait for Haswell and do a mobo upgrade too.
> My Phenom 955BE is happy sat at 4GHz, (old girl) but feel that my GTX590 deserves better and I know it bottlenecks in some games.
> 
> I am not biased to who I use (AMD/INTEL), just want a CPU to match my GPU.
> 
> Metro2033 / Crysis / Sniper Elite V2 / Borderlands 2...etc etc.
> 
> Thanks all..
> 
> Revo.


If you are in no hurry, I would wait for the Haswell CPUs that are going to be released in a few months, hopefully early summer.

If you are in a hurry, from the Intel side the 2500K, 2700K, 3570K and 3770K are always good choices.

No idea about the AMD side.


----------



## vagenrider

Quote:


> Originally Posted by *Revolution996*
> 
> Hi...
> 
> Can anyone recommend a good healthy CPU upgrade that wont bottleneck with my GTX590...(what do you guys have...???)
> Running an AMD Phenom II 955BE on an Asus 990FX Sabertooth board as we speak, if I get an FX8350, then so be it and overclock it to the moon....or like I have stated previously, wait for Haswell and do a mobo upgrade too.
> My Phenom 955BE is happy sat at 4GHz, (old girl) but feel that my GTX590 deserves better and I know it bottlenecks in some games.
> 
> I am not biased to who I use (AMD/INTEL), just want a CPU to match my GPU.
> 
> Metro2033 / Crysis / Sniper Elite V2 / Borderlands 2...etc etc.
> 
> Thanks all..
> 
> Revo.


I have an FX-8120 and it works perfectly with my 590.


----------



## ProfeZZor X

I can barely get past the first two minutes of a game after installing it before my system completely crashes. Even on cable internet, my entire PC will crash (hard reset) on a picture-heavy site because it's not processing the images fast enough. J...F...C...

I'm about two seconds from ripping this card out and getting a 690. This is un-freaking-believable. And to think, EVGA replaced my previous 590 back in August. I'm at stock settings right now, but I wonder if overclocking it will solve the problem. I have ZERO experience with overclocking, so if anyone has any ideas, I'm all ears.


----------



## Canis-X

Does your event log contain any warnings or errors at the time of a crash? That might help narrow things down a bit.


----------



## ProfeZZor X

Quote:


> Originally Posted by *Canis-X*
> 
> Does your event log contain any warnings or errors at the time of a crash? That might help narrow things down a bit.


No, there are no warnings or anything indicating that it's overloaded. In fact, the entire PC freezes up altogether, so I end up doing a hard shutdown by holding the power button until it shuts off. Once I restart it's fine, but it'll gradually do the same thing again and again if I stay on the same pic heavy web page, or while I'm playing the game. It always seems to happen after I initially install a game, or as I progressively play that game. And it'll play just a little while longer each time, then crash. It's no particular game it does this with, it's just random. So far, the only games I have are Trine (which I've completed), Lost Planet 2, and X3 Terran Conflict. It's crashed on those last two games a few times to the point where I've given up. But like I said, each time I play either game, it'll let me play a little bit longer than the last time, but eventually crash.

All my settings on both the CPU and on the GPU are bone stock. So I can't imagine what could be causing this. I'm also running 16gig sticks of RAM.


----------



## Canis-X

Do you have ECC enabled in your BIOS for your RAM (I think I saw that setting in there somewhere)? For your web browser, have you tried disabling hardware acceleration for graphics?


----------



## ProfeZZor X

Quote:


> Originally Posted by *Canis-X*
> 
> Do you have ECC enabled in your BIOS for your RAM (I think I saw that setting in there somewhere). For you web browser, have you disabled hardware support for your graphic?


I literally haven't done anything at all to my rig since it's been up and running (as of October 2012). No tampering with the BIOS or internet settings... yet.


----------



## vagenrider

Quote:


> Originally Posted by *ProfeZZor X*
> 
> I literally haven't done anything at all to my rig since it's been up and running (as of October 2012). No tampering with the BIOS or internet settings... yet.


Very strange issue... I have put a lot of stress on my Gigabyte 590 with BIOS mods, overvoltage, etc., and never had a problem..


----------



## nMixxo

Hey All.

Long-time forum stalker and first-time poster. Thought I would finally throw my hat in the ring and join the 590 club, even though I've had them 12 months.





Cheers All


----------



## Canis-X

Welcome! Nice clean looking build and wallpaper as well....are you in the service?


----------



## nMixxo

thanks Canis-X.

I put a bit of effort and coin into it.

No sir, I'm not in the service









Cheers


----------



## vagenrider

Quote:


> Originally Posted by *nMixxo*
> 
> thanks Canis-X.
> 
> I Put a bit of effort and coin into it.
> 
> No Sir I'm not in the service
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers


very nice and detailed work..really like it..welcome bro..


----------



## nMixxo

thanks vagenrider


----------



## nMixxo

Not a bad first run with 3DMark 11. Shame the drivers are not approved.

You suck balls, Nvidia







Oh well, I'll have to give it another crack... Watch this space lol.


----------



## MKHunt

Sigh. QSLI scaling is so no bueno. Here's my single card.

At 0.988 V I top out at 47°C on a continuous run. I wonder if I could run more volts.


----------



## Canis-X

I'd forget about 3DM11 benches, since NVIDIA throttles the GPU core and memory frequencies. Our cards do much better in Vantage, though.


----------



## toX0rz

Well, I've moved on; no longer in the 590 club.

Missing the beast already


----------



## MKHunt

I'll move on when the 690's replacement hits. Until then, it's the 590 for this guy. It was always a personal goal of mine to hit 11k with the 590, and I've finally done it. Also, in Unigine Valley the 590 OC'd still pulls ahead of OC'd 680s by a few fps.


----------



## nMixxo

Yeah the biggest disappointment of the 590 besides the price is the performance.
Oh well live and learn.
This is my first high end rig build and a major lesson learned was that the most expensive does not always mean the best.
Sucks when you buy high-end hardware and six months down the track it's old school lol.


----------



## toX0rz

Quote:


> Originally Posted by *MKHunt*
> 
> I'll move on when the 690 replacement hits. Until then, 590 for this guy. It was always a personal goal for me to hit 11k with the 590 and I've finally done it. Also in Unigine Valley the 590 OC'ed still pulls on OC'ed 680s by a few fps.


Well, I never overclocked my 590 since I ran it on air and the card was already hitting 80°C+.

I compared Unigine Valley stock vs stock.
My 590 scored 2056 points (49.1 fps).
660 Ti SLI scored 2571 points (61.4 fps).

Not too bad considering I didn't pay anything extra, and since the 660 Tis don't run higher than around 65°C with the stock cooler, there was plenty of headroom left for overclocks.


----------



## MKHunt

Quote:


> Originally Posted by *nMixxo*
> 
> Yeah the biggest disappointment of the 590 besides the price is the performance.
> Oh well live and learn.
> This is my first high end rig build and a major lesson learned was that the most expensive does not always mean the best.
> Sucks when you buy high end hardware and six months down the track its old school lol.


Hahaha I know that feel (got a TiHD right before the ZxR launched) but that's why I got my card back in June 2011. Sometimes I wonder if I'm one of the last 'OG' in this club.


----------



## vagenrider

Quote:


> Originally Posted by *toX0rz*
> 
> My 590 scored 2056 points (49,1 fps).


Me too at Unigine.







2087 with air









I am very pleased with this card.

Which driver are you using?


----------



## YP5 Toronto

Call me crazy.... but I feel like selling my 2 590s and going with some Titans... someone save me.


----------



## MKHunt

Quote:


> Originally Posted by *YP5 Toronto*
> 
> call me crazy.... but I feel like selling my 2 590s and go with some titans... someone save me.


Wait. Get two 790's. Put those titans to shame.


----------



## vagenrider

Quote:


> Originally Posted by *YP5 Toronto*
> 
> call me crazy.... but I feel like selling my 2 590s and go with some titans... someone save me.


(Of course you're crazy.)

Btw, if you sell, I will buy one..


----------



## alancsalt

Quote:


> Originally Posted by *YP5 Toronto*
> 
> call me crazy.... but I feel like selling my 2 590s and go with some titans... someone save me.


Need 35 rep to sell it on OCN, and even then it has to be in the OCN Marketplace.


----------



## vagenrider

A question for the guys who overvolt the 590...

I modded my BIOS to 988mV. Today I installed the latest drivers (314.07) and the max voltage is still 988mV... how can this happen? Don't these drivers have the voltage locked?


----------



## Canis-X

The only way to change the voltage on these cards is to flash a modified BIOS with the voltage you want to run the card at. Doing so will only allow you to run the card at the voltage you set it to; if you changed your voltage to .988V, that is what you will see in programs like GPU-Z, MSI Afterburner, etc. It will not let you adjust the voltage on the fly; if you want the card to run at a different voltage, you will need to flash a new BIOS with that voltage in it.

Even with what I stated above, if you run a program that NVidia has added to their "list", like 3DMark11, the driver will throttle your frequency and sometimes your voltage as well.
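For anyone new to the process, the flashing workflow described above usually looks roughly like this. This is a hedged sketch only: the exact nvflash switches vary by version, the ROM filenames are placeholders, and the 590 carries one BIOS per GPU, so both adapter indices need handling. Always save backups first.

```sh
# Back up the stock BIOS from each GPU (the 590 has two; filenames are examples).
nvflash --index=0 --save gpu0_stock.rom
nvflash --index=1 --save gpu1_stock.rom

# Flash the voltage-modded ROM to each GPU.
# On many nvflash versions, -6 overrides the PCI subsystem ID mismatch check.
nvflash --index=0 -6 modded.rom
nvflash --index=1 -6 modded.rom
```

If a flash goes wrong, the saved stock ROMs are what you recover from, so keep them somewhere off the boot drive.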


----------



## Shiftstealth

Quote:


> Originally Posted by *Canis-X*
> 
> The only way to change the voltage on these cards is to flash a modified BIOS with the voltage you want to run the card at. Doing so will only allow you to run the card at the voltage you set it to, so you changed your voltage to .988v, that is what you will see in programs like GPU-Z, MSI Afterburner...etc. It will not allow you to then adjust the voltage on-the-fly, if you want it to run at a different voltage you will need to flash a new BIOS with the voltage on there.
> 
> Even with what I stated above, if you run a program that NVidia has added to their "list" like 3DMark11 the driver will throttle your freq and sometimes your voltage as well.


Then they go boom.
http://www.google.com/url?sa=t&rct=j&q=gtx%20590%20exploding&source=web&cd=1&cad=rja&ved=0CC4QtwIwAA&url=http%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DsRo-1VFMcbc&ei=C40vUZzmNJCq0AHd9oD4AQ&usg=AFQjCNE8Ohw3PQIlpzDHjmVpge6tpLidfg&bvm=bv.43148975,d.dmg


----------



## Canis-X

Only if you set the voltage too high. Stock voltage though....they left a lot of "safe zone" in there for sure!!


----------



## Revolution996

Yeah... mine's staying stock.

Plays perfectly as is... leave it alone!!!

















Revo


----------



## vagenrider

Quote:


> Originally Posted by *Canis-X*
> 
> The only way to change the voltage on these cards is to flash a modified BIOS with the voltage you want to run the card at. Doing so will only allow you to run the card at the voltage you set it to, so you changed your voltage to .988v, that is what you will see in programs like GPU-Z, MSI Afterburner...etc. It will not allow you to then adjust the voltage on-the-fly, if you want it to run at a different voltage you will need to flash a new BIOS with the voltage on there.
> 
> Even with what I stated above, if you run a program that NVidia has added to their "list" like 3DMark11 the driver will throttle your freq and sometimes your voltage as well.


thank you very much for your answer..help's me very much to understand some things..

it is safe the 988mv? (im not pushing it..medium use)


----------



## vagenrider

Quote:


> Originally Posted by *Shiftstealth*
> 
> Then they go boom.
> http://www.google.com/url?sa=t&rct=j&q=gtx%20590%20exploding&source=web&cd=1&cad=rja&ved=0CC4QtwIwAA&url=http%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DsRo-1VFMcbc&ei=C40vUZzmNJCq0AHd9oD4AQ&usg=AFQjCNE8Ohw3PQIlpzDHjmVpge6tpLidfg&bvm=bv.43148975,d.dmg


this guy work's the card to the limits in 1025mv with air..and of course he pay the price.............thats why happen this..

me,i am not pushing the card to these limits and also not using this voltage..


----------



## Canis-X

I'm currently using 1.013v for 24/7 use and I'll max the OC out. I've benched mine at 1.038v without issue based on Pedropc's recommendation. I am also running on water though.


----------



## vagenrider

Quote:


> Originally Posted by *Canis-X*
> 
> I'm currently using 1.013v for 24/7 use and I'll max the OC out. I've benched mine at 1.038v without issue based on Pedropc's recommendation. I am also running on water though.
































So I'm very safe at 988.


----------



## MKHunt

Quote:


> Originally Posted by *vagenrider*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So I'm very safe at 988.


I'm rocking the ole .988V as well. It lets me kiss 730MHz with the latest drivers, but 720 is the frequency it's stable at in all apps.


----------



## vagenrider

Quote:


> Originally Posted by *MKHunt*
> 
> I'm rocking the ole .988V as well. It lets me kiss 730MHz with the latest drivers, but 720 is the frequency it's stable at in all apps.


I see it too... I'm on 691/1784 for now and I'm completely stable... no crashes, no artifacts.









What clock do you have on the memory?


----------



## atrapados

It is stable at .988V with 700MHz core and 1900MHz memory on water: 25°C idle and 39°C under full load, with the latest drivers. (ASUS model with modded BIOS.)


----------



## vagenrider

Quote:


> Originally Posted by *atrapados*
> 
> It is stable in .988v at 700 MHz core and memory at 1900MHz in water 25º C idle and full run 39º C with the latest drivers.(ASUS model with MOD bios)


Can you please give a link to buy the waterblock? I can't find it anywhere here in Greece.


----------



## MKHunt

Quote:


> Originally Posted by *vagenrider*
> 
> i see it too....im on 691-1784 for now and im completely stable..no crashes,no artifacts..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> what clock you have in memory?


Memory is at stock (1728). The 590 has HUGE memory bandwidth, so stock speed is not a bottleneck at all.
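That bandwidth point checks out on paper. A back-of-envelope sketch (my own illustrative figures, not from the thread: 384-bit bus per GF110, and the 1728MHz that tools report at stock taken as half the effective GDDR5 data rate):

```python
# Back-of-envelope GDDR5 bandwidth for one GTX 590 GPU (illustrative figures).
bus_width_bits = 384
afterburner_mem_clock_mhz = 1728                    # what tools report at stock
effective_rate_mts = afterburner_mem_clock_mhz * 2  # GDDR5 data rate is double

bandwidth_gbs = bus_width_bits / 8 * effective_rate_mts * 1e6 / 1e9
print(round(bandwidth_gbs, 1))  # 165.9
```

Roughly 166GB/s per GPU, which is consistent with a memory overclock rarely moving scores on this card.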


----------



## mundivalur

Does anybody have a good BIOS for the EVGA 590 Classified?
Thanks


----------



## MKHunt

Quote:


> Originally Posted by *mundivalur*
> 
> Does any body have a good bios for Evga 590 Class. ?
> Thanks


Which BIOS version? There are the .37 and the .42.


----------



## mundivalur

Quote:


> Originally Posted by *MKHunt*
> 
> Which BIOS version? There is the .37 and .42


I am using the .42 now; was the .42 better than the .37?


----------



## alancsalt

Quote:


> Originally Posted by *mundivalur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MKHunt*
> 
> Which BIOS version? There is the .37 and .42
> 
> 
> 
> I am using the .42 now, was the .42 one better than .37 ?

They use different VRAM chips AFAIK, so flashing a .37 with a .42 BIOS fails, and vice versa. These things can change in graphics cards over their manufacturing life.
Are you looking for a stock BIOS or a modified one?


----------



## vagenrider

Quote:


> Originally Posted by *alancsalt*
> 
> They use different vram chips AFAIK, so flashing a 37 with a 42 bios fails, and vice versa. These things can change in graphics cards during the manufacturing life.
> Are you looking for a stock bios or a modified one?


Any BIOS for my Gigabyte 590? The link in the flash BIOS thread is not working.


----------



## alancsalt

Quote:


> Originally Posted by *vagenrider*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alancsalt*
> 
> They use different vram chips AFAIK, so flashing a 37 with a 42 bios fails, and vice versa. These things can change in graphics cards during the manufacturing life.
> Are you looking for a stock bios or a modified one?
> 
> 
> 
> any bios for my gigabyte 590? the link in the flash bios thread not working..

http://www.techpowerup.com/vgabios/index.php?page=1&architecture=NVIDIA&manufacturer=Gigabyte&model=GTX+590&interface=&memSize=0


----------



## vagenrider

Quote:


> Originally Posted by *alancsalt*
> 
> http://www.techpowerup.com/vgabios/index.php?page=1&architecture=NVIDIA&manufacturer=Gigabyte&model=GTX+590&interface=&memSize=0


I know this... but it has only one BIOS there. The 590 has two, as you know.


----------



## MKHunt

Quote:


> Originally Posted by *alancsalt*
> 
> They use different vram chips AFAIK, so flashing a 37 with a 42 bios fails, and vice versa. These things can change in graphics cards during the manufacturing life.
> Are you looking for a stock bios or a modified one?


The VRAM chips remained the same (Samsung GDDR5) but the power scheme was beefed up. I have a thread somewhere with the changes.


----------



## mundivalur

Quote:


> Originally Posted by *alancsalt*
> 
> They use different vram chips AFAIK, so flashing a 37 with a 42 bios fails, and vice versa. These things can change in graphics cards during the manufacturing life.
> Are you looking for a stock bios or a modified one?


I see that my BIOS is .42 and I am looking for a modified one, but I have finally got some results from overclocking: the volts now max at 1.038V, and I can now raise the core over 700MHz and the memory too, which was not possible before.








I would like to see if there is any more difference between my modded BIOS and some other one.








Everything is watercooled and the max on the GPU is 45°C when folding and 35°C in benchmarks!


----------



## MKHunt

Well 590 club I'm finally getting around to OCing my GPU to its actual limits at .988V.

So far:
Core 725MHz
Memory 1900MHz
CPU: i7 2600k @ 4.5GHz



Unigine Valley: 60.3fps


----------



## alancsalt

Quote:


> Originally Posted by *MKHunt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alancsalt*
> 
> They use different vram chips AFAIK, so flashing a 37 with a 42 bios fails, and vice versa. These things can change in graphics cards during the manufacturing life.
> Are you looking for a stock bios or a modified one?
> 
> 
> 
> vram chips remained the same Samsung LPGDDR5 but the power scheme was beefed up. I have a thread somewhere with the changes.

Thanks for clarifying. So the ones that use a .42 (I assume) BIOS have better VRMs?
Two BIOSes, yep, in which case I'm as stuck as you. Maybe someone who posts here has copies they could attach?


----------



## MKHunt

Quote:


> Originally Posted by *alancsalt*
> 
> Thanks for clarifying. So the ones that use a 42 (I assume) bios have better vrm?
> Two bios, yep, in which case I'm as stuck as you. Maybe someone who posts here has copies they could attach?


Nobody really knows if it's better, because by the time they hit the market, NVIDIA had completely locked down the 590. I can say with surety that the VRM components (particularly the inductors) are rated significantly higher though. They were rated for ~5A more continuous current and had better shielding. Whether this and the permanent bump from the original .913V to .925V was because they had run out of the sweetest 580 cores, or was a response to the dying cards, is beyond my knowledge.


----------



## Bezna

I'm pulling my hair out...

Over the last couple of days I've been having issues with my GTX 590 (only in BF3).
It used to cause me trouble when I played last year, but I found stable drivers, and after taking a break from BF3 for a few months the problem is back.

I've tried different drivers, browsers, settings; whatever I do, it seems to have the same issue.
It freezes my screen (in game), then blacks out, then hard-freezes. I have to hard-reboot the rig to get it going again.

Here is the latest error I've gotten.


I just don't know what to do. Is it BF3's fault, or my card / drivers?

Rig specs are in sig; Win 7 64-bit, running the latest drivers (314.07), Windows up to date.
Any help would be welcomed. At this point I feel like selling the beast. Cheers - Vio

Edit: Temps are good btw, and I've never OC'ed the card.


----------



## jeromeface

Quote:


> Originally Posted by *vio2700k*
> 
> I'm pulling my hair out...
> 
> Over the last couple days I've been having issues with my gtx 590 ( only in BF3 ).
> It used to cause me trouble in the when I played last year but I found stable drivers and after taking a break from bf3 for a few months the problem is back.
> 
> I've tried different drivers, browsers, settings, whatever I do it seems to have the same issue.
> It freezes my screen (in game), then black out, then hard freeze. I have to hard reboot the rig to get it going again.
> 
> Here is the latest error I've gotten.
> 
> 
> I just dont know what to do. Is it BF3's fault or my card / drivers?
> 
> Rig specs are in sig , win 7 64bit, running latest drivers ( 314.07 ), windows update to date.
> Any help would be welcomed. At this point I feel like selling the beast. Cheers - Vio
> 
> Edit * Temps are good btw, and I've never OC'ed the card.


I heard Xfire can cause this... worth a look.


----------



## OccamRazor

Well guys, after a long absence I come back with bad news:
the BIOS I was trying to adapt from the ASUS Mars .37 to the .42 didn't work!

The hardware differences between the old card and the new one, especially in the power delivery system, were too great. We tried a lot of solutions but to no avail;
there were artifacts present in every attempt!
I want to say thank you to all who showed interest in this project, and I'm sorry we got nothing out of it.
Thanks also go to Marco Rudisuli, a.k.a. svL7 on another forum, who was the brain behind it all!
So... .42 cards, what are the highest clocks and volts you guys are getting?

Ed


----------



## OccamRazor

@ vio2700k:

Don't flip out over the 590, it's not her fault!








It's a problem with DirectX and Battlefield 3; it's Origin-related, and EA is aware of it!
Go here: http://forum.ea.com/eaforum/posts/list/9307984.page
As a long shot, have you tried reinstalling DirectX? It's worth a shot...

Ed


----------



## Bezna

Quote:


> Originally Posted by *OccamRazor*
> 
> @ vio2700k:
> 
> Dont flip over the 590, its not her fault!
> 
> 
> 
> 
> 
> 
> 
> 
> Its a problem within directx and battlefield 3, it origin related, EA is aware of it!
> Go here: http://forum.ea.com/eaforum/posts/list/9307984.page
> In a long shot, have you tried to reinstall directx? its worth a shot...
> 
> Ed


I appreciate your response!!!
I will take your advice and try the dx reinstall.
Will keep you posted


----------



## mundivalur

Quote:


> Originally Posted by *OccamRazor*
> 
> Well guys, after a long absence i come back with bad news,
> the bios i was trying to adapt from asus mars .37 to .42 didnt work!
> 
> 
> 
> 
> 
> 
> 
> 
> the hardware differences between the old card and the new one, specially on the power delivery system were to great, we tried a lot of solutions but to no avail,
> there were artifacts always present in every attempt!
> I want to say thank you to all that showed interest in this project and im sorry that we got nothing out of it,
> thanks also goes to Marco Rudisuli a.k.a. svL7 in another forum that was the brain behind it all!
> so... .42 cards, whats the highest clocks and volts all of you guys are getting?
> 
> Ed


I can go 780 core / 2000 mem, but the best bench score is when the card is at 740-750 core / 2000 mem, 1.038 volts.


----------



## OccamRazor

Thanks for the reply mundivalur!








The best scores are around 730/750, as the code in the drivers that triggers the PDL only targets old programs like 3DMark, FurMark, etc. As things move on (time and interest on NVIDIA's side), less and less of that code will be implemented in the drivers, and no more PDL for our 590s!
Ok guys, IT'S OVERCLOCK TIME!








I changed my 3D voltages to 1.0005V low and 1.038V max, and tomorrow we'll see where it goes!
Keep you posted!

Ed


----------



## MKHunt

Quote:


> Originally Posted by *mundivalur*
> 
> I can go 780/mem 2000 but the best bench score is when the card is in core 740-50 / mem 2000 volts 1.038


Whoa, what drivers are you running 1.038 at?

EDIT: Ugh. I accidentally installed 314.07, and when I did a clean reinstall of 313.94 I have the .963V lock again. Any suggestions?


----------



## Canis-X

Odd, I'm currently running 314.07 and my voltage is 1.013. Do you think going back to 313.94 reset your voltage?


----------



## MKHunt

Quote:


> Originally Posted by *Canis-X*
> 
> Odd, I'm currently running 314.07 and my voltage is 1.013. Do you think going back to 313.94 reset your voltage?


I pulled my source BIOS copies, adjusted them to 1.005V and reflashed, and it's still stuck at .963. The second core will touch 1V for about 5 seconds after boot, but then it drops to 2D clocks, and when stressed it runs at .963V as well. I'm baffled.


----------



## Canis-X

Hmmm...maybe try Driver fusion to clean out all of the drivers and then try 313.94 again?


----------



## MKHunt

Did driver fusion then 313.96: max .963V
driver fusion 314.07: max .963V

What the heck?....


----------



## Canis-X

What are you reading the voltage with? Perhaps reinstalling that program?


----------



## mojobear

Hey all, anyone playing the Tomb Raider game in surround with 590s? I've read Tomb Raider eats up VRAM and wanted some input before I go purchase the game. Unfortunately I can't game in anything other than surround anymore... surround has totally spoiled me...


----------



## MKHunt

Quote:


> Originally Posted by *mojobear*
> 
> hey all, Anyone looking the tomb raider game in surround with 590s? Ive read tomb raider eats vram up and wanted some input before I go purchase the game. Unfortunately I cant game in anything other than surround anymore....surround has totally spoiled me...


Anything in surround on the 590 is going to be a bit hard to take (3x 1080p panels is a lot of pixels for 1.5GB of VRAM), though people have done surround BF3 on the 590 with few complaints. Mainly it's a VRAM hog at this time because Nvidia only got the final code a week before the game was released. Without TressFX on, it runs pretty smoothly.

Quote:


> Originally Posted by *Canis-X*
> 
> What are you reading the voltage with? Perhaps reinstalling that program?


That was my last thought last night. Installed the latest MSI Afterburner beta (after a full Afterburner uninstall, settings and all) and reinstalled GPU-Z, and both still read .963. Maybe I need to try reinstalling a driver as far back as 285?

I'm also getting a ton of Crysis 3 crashes near the endgame, but I have a sneaking suspicion that's just Crytek's glorious, ultra-efficient coding.

ETA: OK, 295.73 puts me at the target 1V. So what the heck is going on then....

ETA2: OK, color me confused as all-get-out, but any 310-series driver sets the voltage to .963V..... until I try to set the voltage to anything LOWER than .963V in Afterburner, at which point it pops up to the 1V mark where it should be. When I save a profile and set it to be the default for 3D, it saves the voltage as .875... which makes it go to the 1V mark where it's supposed to be.


----------



## mundivalur

I am using the 314 driver; it just needs a little volt help in the BIOS.


----------



## vagenrider

Quote:


> Originally Posted by *mundivalur*
> 
> I am using the 314 driver just needs a little volt help in the bios


You are OK with this voltage... (if you are on water).


----------



## OccamRazor

@ MKhunt:

Is your card a .37 or a .42? If it's a .37, the drivers might still be targeting your voltage...

Ed


----------



## MKHunt

It's a .42 card. AFAIK I was the first on this forum to find the different power scheme of the .42 cards, when I had a triple RMA with EVGA that resulted in me getting a .42 card in November of 2011. Not even Masked was aware of the revisions. The thread can be seen here.

The inductors on the .42 cards are just so _nice_. I don't know the extent of the modified circuitry; I expect you know that much better than I.


----------



## mundivalur

Quote:


> Originally Posted by *vagenrider*
> 
> you are ok with this voltage..(if you are on water)


It is a .42 card, water cooled: 45°C max folding, 35-40°C in benching. The max volt is 1.038V, though it shows up to 1.063.


----------



## vagenrider

Quote:


> Originally Posted by *mundivalur*
> 
> It is a .42 card water cooled 45°c max folding 35-40 in benching ,the max volt is 1.038v tho is shows up to 1.063


I would not propose running such volts... the risk is great.

It seems that the 590 is not enough for you


----------



## OccamRazor

Quote:


> Originally Posted by *MKHunt*
> 
> It's a .42 card. AFAIK I was the first on this forum to find the different power scheme of the .42 cards when I had a triple-RMA with EVGA that resulted in me getting a .42 card in November of 2011. Not even Masked was aware of the revisions. The thread can be seen here.
> 
> The inductors on the ,42 cards are just so _nice_. I don't know the extent of the modified circuitry, I expect you know that much better than I.


Actually the credit goes to you; it was your thread that alerted me to the extent of the 590 rev. 2 PCB alterations: VRM circuitry, inductors, FETs and capacitors!
After realizing that, it would take the guys who made the BIOS at ASUS to change it, as doing it ourselves would take extensive knowledge of the circuitry, which of course (unfortunately) we don't have!
How's your voltage problem? Have you fixed it yet?

Ed


----------



## MKHunt

I did fix it. Setting the voltage to .875 makes it set properly at 1V. I also added more rad, de-lidded my 3770K, and re-plumbed my loop, so I might be able to push things a bit harder. It's like I'm trying to see just how much rad I can cram in a C70.



But then a swiftech rotary decided to ruin my spring break plans.



Sigh.


----------



## OccamRazor

Quote:


> Originally Posted by *vagenrider*
> 
> you are ok with this voltage..(if you are on water)


You're stretching a little...
It's known that 590s at around 1V can reach 800MHz, so if I were you I wouldn't go beyond 1.038!
Water is good for keeping temperatures at bay, but it does nothing against the voltage itself.
That's my 2 cents...

Ed
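The warning about voltage can be put in rough numbers: dynamic power scales approximately with V²·f, so water controls temperature but not the electrical stress on the VRMs. A toy estimate (stock figures of 0.925V / 607MHz taken from earlier in the thread; the scaling model is a simplification, not a measurement):

```python
# Rough dynamic-power scaling: P ~ C * V^2 * f (same silicon, so C cancels out).
def relative_power(v, f, v0=0.925, f0=607.0):
    """Dynamic power at (v volts, f MHz) relative to stock (v0, f0)."""
    return (v / v0) ** 2 * (f / f0)

print(round(relative_power(1.038, 800), 2))  # ~1.66
```

So a 1.038V / 800MHz overclock asks roughly two-thirds more dynamic power from the same VRMs, no matter how cool the loop keeps them.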


----------



## OccamRazor

@MKhunt:
Sounds like a software glitch to me, something that got stuck in the registry; Afterburner-related or something similar!
Nice setup you got there, man!









Ed


----------



## vagenrider

MKHunt....nightmare..


----------



## MKHunt

Quote:


> Originally Posted by *vagenrider*
> 
> MKHunt....nightmare..


The leak? I post these pictures liberally to remind people to leak-test. The ONLY thing that had power was the pump, so nothing got damaged in the slightest.


----------



## OccamRazor

@MKhunt:

I was having the same problem last night; the volts I set in NiBiTor simply wouldn't stick
until I raised the minimum value of setting 2 in Fermi voltage to 0.988V.

Ed


----------



## Zweets

Hello, I know this might be old, but which is better for the GTX 590: an x16 or an x8 slot?


----------



## OccamRazor

Quote:


> Originally Posted by *Zweets*
> 
> Hello i know this might be old but whats the best to have the gtx 590 in x16 or x8 slot??


I've had it in both and never noticed a difference in anything.
My 2 cents...

Ed
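That matches the back-of-envelope numbers, assuming a PCIe 2.0 board (my assumption; the thread doesn't say): even an x8 link gives about 4GB/s each way, which games of this era rarely saturated, and SLI traffic between the two GPUs rides the on-card bridge anyway.

```python
# Rough PCIe 2.0 usable bandwidth per direction (illustrative).
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding -> 500 MB/s usable per lane.
def pcie2_bandwidth_gbs(lanes):
    return lanes * 0.5  # GB/s per direction

print(pcie2_bandwidth_gbs(8))   # 4.0
print(pcie2_bandwidth_gbs(16))  # 8.0
```

Per direction; the 8b/10b encoding overhead is why 5GT/s per lane nets only 500MB/s usable.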


----------



## Zweets

Quote:


> Originally Posted by *OccamRazor*
> 
> Had it in both and never sensed a difference in anything.
> My 2 cents...
> 
> Ed


Alright man thanks for the answer!!


----------



## OccamRazor

Ok guys, I've found out that the 300-series drivers (all of them) do not read the changes made by NiBiTor very well, probably due to NiBiTor's way of saving the ROM files. Up until the 290 series all is well!
I found a dip in the second core's volts from 1V down to 0.875V, crashing every game on the fly from 700MHz core up!
Then I increased all the voltage tables in NiBiTor to 1V.

Now it's up to 770MHz and still testing!









Ed


----------



## OccamRazor

Quote:


> Originally Posted by *Zweets*
> 
> Alright man thanks for the answer!!


Happy to help! That's what we're here for!

Cheers

Ed


----------



## MKHunt

Quote:


> Originally Posted by *OccamRazor*
> 
> Ok guys, I've found out that the 300-series drivers (all of them) do not read the alterations made by NiBiTor very well, probably due to NiBiTor's way of saving the ROM files.
> I found a dip in the second core's volts from 1V down to 0.875V, crashing every game on the fly from 700MHz core up!
> Then I increased all the voltage tables in NiBiTor to 1V.
> 
> Now it's up to 770MHz and still testing!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ed


I had the same experience. I had to set setting 2 to 1.005V in both drop-downs, setting 1 to .988V in both drop-downs, and setting 0 to .875 in both drop-downs.

Also, guess what's running again?


----------



## OccamRazor

800mhz! sweet!








Played Tomb Raider for half an hour and Crysis 3 for half an hour as well: no crashes, full load on both GPUs, temps 65°C, watercooled of course.
Now it truly is 580 SLI! Now it's the card it was always supposed to be!
Gonna do more tests tomorrow to check stability!









Ed


----------



## MKHunt

Quote:


> Originally Posted by *OccamRazor*
> 
> 800mhz! sweet!
> 
> 
> 
> 
> 
> 
> 
> 
> played tomb raider for half hour and crysis 3 for half hour as well, no crashes, full load both gpus, temps 65º watercooled of course
> Now trully is 580 sli! now its the card it was always suppose to be!
> gonna do more tests tomorrow to check stabillity!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ed


I am so jelly. My 590 hits a wall at 740 in Unigine Valley with 1V.


----------



## OccamRazor

Quote:


> Originally Posted by *MKHunt*
> 
> I am so jelly. My 590 hits a wall at 740 in Unigine Valley with 1V.


I was hitting a wall too, because the 300-series drivers were not reading the BIOS right!
In NiBiTor's Fermi voltage screen, fill all of P0, P1 and P2 with 1.0005V, right and left;
it did the job for me. The drivers were pulling their volt readings from the P0 and P1 states, so I measured the output: my second GPU, when entering the 3D state, was being fed 0.843V, and of course it crashed, although in Afterburner you couldn't see the drop!
Hope it helps

Ed
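A toy model of what's being described here (the table layout and values are hypothetical, not NiBiTor's actual ROM format): if a BIOS edit raises only the P0 entry while the driver pulls the 3D voltage from a lower P-state entry, the GPU is starved the moment it enters 3D, even though monitoring tools still show the P0 value.

```python
# Toy model of Fermi P-state voltage tables (hypothetical values and format).
# P0 = full 3D state, P1 = intermediate, P2 = low-power.
voltage_table = {"P0": 1.000, "P1": 0.875, "P2": 0.875}  # bad edit: only P0 raised

def volts_entering_3d(table):
    # Some 300-series drivers appear to read the intermediate state here,
    # which is how the second GPU can be fed the low voltage under load.
    return table["P1"]

print(volts_entering_3d(voltage_table))            # 0.875 -> crashes at high clocks
voltage_table = {k: 1.000 for k in voltage_table}  # the fix: fill every state
print(volts_entering_3d(voltage_table))            # 1.0
```

Filling every state with the target voltage removes the low entry the driver can fall back to.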


----------



## OccamRazor

Or perhaps it's something in the Unigine Valley code that triggers a low power state and crashes the card at 740 with low volts!
Does the same thing happen in games, or can you hit a higher core in games?

Ed


----------



## vagenrider

Finally... watercooled! (So happy.)





The XSPC block for my 590 arrives in the coming days to complete my project..


----------



## emett

800MHz! Well done dude. Good to see the overclocking on these beast cards has continued long after they were superseded.
People claimed the GTX 590 was a **** overclocker, but in reality you just had to really know your stuff to get the most out of these cards. Fond memories; I only got mine up to 700MHz rock stable. Didn't go any higher for fear of blowing it up. But as time goes on and new stuff comes out, that fear fades.


----------



## OccamRazor

Quote:


> Originally Posted by *vagenrider*
> 
> finaly....watercooled! (so happy)
> 
> these days comes and the xspc block for my 590 to complete my project..


Well done, nice setup!








Got to post some pics of mine too! homemade watercooling!









Ed


----------



## OccamRazor

Quote:


> Originally Posted by *emett*
> 
> 800mhz! Well done dude. Good to see the overclocking on these beast cards has continued long after they were superceded.
> People claimed the gtx 590 was a **** overclocker, but in reality you just had to really know your stuff to get the most out of these cards. Fond memories, I only got mine up to 700mhz rock stable. Didn't go any hire for fear of blowing it up. But as time goes on and new stuff comes out that fear fade.


Thanks Emmet!
It's still a very nice card; it's a pity that for marketing reasons NVIDIA never unlocked the card after the revision they made to the power delivery system.
By today's standards it still puts up a fight against newer cards, and with most games being console ports, this old dog still kicks!

Cheers

Ed


----------



## MKHunt

Aight, time to fill every power state with 1V. The card maxes at 47C anyway in my loop so whatevs.

I KNEW I put 6x120mm of rad in my case for a reason!









ETA: Valley did not like 750. Occam, would you mind trying Valley at your clocks? I have a feeling that you won the silicon lottery.

ETA2: Blargh, it also keeps listing my main display as display 2, but when I switch the DVI ports, the main display becomes my lesser monitor.


----------



## Pedropc

Hi OccamRazor; is your VGA a .37 or a .42? Could you send me the BIOS you have set for 800MHz? It's for me to try on my GTX 590; no PDL? Which program do you use to raise the voltage and clock? Thanks and greetings.


----------



## OccamRazor

Quote:


> Originally Posted by *MKHunt*
> 
> Aight, time to fill every power state with 1V. The card maxes at 47C anyway in my loop so whatevs.
> 
> I KNEW I put 6x120mm of rad in my case for a reason!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ETA: Valley did not like 750. Occam would you mind trying Valley at your clocks? I have a feeling that you won the silicon lottery.
> 
> ETA2: Blargh, it also keeps listing my main display as display 2, but when I switch the DVI ports, the main display becomes my lesser monitor.


Don't worry, it's driver related. It keeps bouncing from 800/771/750 to 554, so it's the PDL kicking in.
It only works at 740MHz because that's what NVIDIA hardwired in the drivers; it's no news to me that all
benchmarks top out at 740MHz for the 590... coincidence? I don't think so...

my 2 cents...

Ed


----------



## OccamRazor

Quote:


> Originally Posted by *Pedropc*
> 
> Good OccamRazor; Your vga is 037 or 042?, I could spend the bios you have set for 800 mhzs?. It's for me to try on my GTX590, not PDL?, Which program you use to raise voltage and clock?, Thanks and greetings.


Pedrito! Hey man, how's it going?








It's a .42, it doesn't work on your card. The program is NiBiTor: http://www.techspot.com/downloads/2731-nvidia-bios-editor.html
If you want, we can talk in Spanish by message!

Ed


----------



## vagenrider

Quote:


> Originally Posted by *OccamRazor*
> 
> Well done, nice setup!
> 
> 
> 
> 
> 
> 
> 
> 
> Got to post some pics of mine too! homemade watercooling!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ed


Thank you! Yes, post some pics of yours.


----------



## OccamRazor

Here you go! On setup day there was dust everywhere...















Ed


----------



## vagenrider

Quote:


> Originally Posted by *OccamRazor*
> 
> Here you go! On setup day there was dust everywhere...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ed


Very nice, I like it! I've thought about the outside mod too... maybe I'll do it before the summer.


----------



## MKHunt

Tomb Raider bench at 800MHz core, 1728 memory. Whoa, 590, you are full of surprises.


----------



## vagenrider

Quote:


> Originally Posted by *MKHunt*
> 
> 
> 
> Tomb raider bench at 800MHz core, 1728 memory. Whoa 590, you are full of surprises.


This is what I'm playing now! But I won't run the bench yet... waiting for my block, and I'll run it after the overvolt hehehe!!!!

Excellent score... the 590 is a bad dog.


----------



## MKHunt

Well, it completes the bench just fine at 850MHz but the scores are about the same (though avg fps is up to 52)

Maybe the memory needs some OC love now.

K, memory OC to 1800 resulted in a crash. Looks like 850 w/ stock memory is my cap.


----------



## vagenrider

Quote:


> Originally Posted by *MKHunt*
> 
> Well, it completes the bench just fine at 850MHz but the scores are about the same (though avg fps is up to 52)
> 
> Maybe the memory needs some OC love now.


Yes... maybe you'd manage it with more memory. Is there any danger of frying it with more than 1000mV? (with water and good temps)


----------



## MKHunt

Well that was embarrassing. That was at 1440 and vsync was on.


----------



## OccamRazor

Quote:


> Originally Posted by *MKHunt*
> 
> Well, it completes the bench just fine at 850MHz but the scores are about the same (though avg fps is up to 52)
> 
> Maybe the memory needs some OC love now.
> 
> K, memory OC to 1800 resulted in a crash. Looks like 850 w/ stock memory is my cap.


Well done!









What volts are you applying?
You'll have to lower the core to increase the memory; it's a balancing act, because the voltage is shared between the core and memory, so to increase one you have to lower the other!

Ed


----------



## MKHunt

Quote:


> Originally Posted by *OccamRazor*
> 
> Well done!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What volts are you applying?
> You'll have to lower the core to increase the memory; it's a balancing act, because the voltage is shared between the core and memory, so to increase one you have to lower the other!
> 
> Ed


1V across the board. Yeah, 5MHz core in fps seems to be equivalent to ~50-100MHz memory, so core comes first always.


----------



## OccamRazor

Quote:


> Originally Posted by *MKHunt*
> 
> 1V across the board. Yeah, 5MHz core in fps seems to be equivalent to ~50-100MHz memory, so core comes first always.


850MHz with 1V?
Man, that's excellent; to get 800 stable I have to increase to 1.025V, and 1.038V to get that speed!

Ed


----------



## Pedropc

With the Mars II BIOS I get this stable:

http://imageshack.us/photo/my-images/407/unigineb.png/


----------



## Canis-X

WHOA!!! Hey Pedropc, watch your voltage there... You have it set to 1.025v, but your max voltage is showing that it is jumping up to 1.075v!! Others in this thread said that their cards were damaged when they hit 1.088v; you are dangerously close. Unless you have a backup card and don't care what happens to your 590, I would put a GTX 590 BIOS back on there as soon as possible and leave the MARS II BIOS alone!


----------



## Pedropc

Hi Canis-X, the voltage is always at 1.025V. In this capture I was testing up to the point of failure, where it rose and the drivers put it at 1.075V. But that only happens while finding the limit; once the limit is found, it always stays at the voltage you select. Greetings.


----------



## Canis-X

I would still confirm the GPU voltage with GPU-Z. If that program also shows you hitting 1.075v, I would not risk it; even if it only hits that voltage briefly, it could burn your card up.


----------



## Gobiel

Hi,
While you guys have no issues overclocking this card, I have critical overheating problems with it... I have an ATCS 840 case, and GPU2, when launching any game at any settings, quickly reaches 100°C, at which point the voltages throttle down.

I have the Gigabyte version of this card and have flashed it to the manufacturer's latest BIOS version, to no effect. I don't overclock it, and I use two screens on it (I have tried all the possible configurations for the card's DVI ports without effect, except GPU1 being hotter with a screen on it...). I have used compressed air in all possible places, without effect. I use EVGA Precision X with the fan curve functionality, without effect.

I have: an i5 750 processor, P7P55D motherboard, Antec TruePower New 750W power supply, and an Asus Xonar D2 audio card (which should be handling the games' audio instead of the graphics card...).

My cooling configuration is:

The two Cooler Master stock fans at the top are left as-is in exhaust.
The one at the front, before the hard drives, is also left as-is, pulling air into the case.
The back fan is replaced with a Corsair H80 push-pull watercooler that pulls air into the case (as opposed to the Cooler Master default configuration that exhausts the air; I thought the front fan wasn't sufficient for the computer).
The power supply is at the bottom of the case, where Cooler Master recommends it to be.
I'm a little exasperated by this... please help me.


----------



## Canis-X

Can you do a couple of things for us....

Put your rig specs in your sig block
Post up a pic of the inside of your case so we can get a better idea of air flow concerns.

All that aside, when did this first start happening? Did anything change around that time?


----------



## alancsalt

Quote:


> Originally Posted by *Canis-X*
> 
> WHAO!!! Hey Pedropc, watch your voltage there....You have it set to 1.025v but your max voltage is showing that it is jumping up to 1.075v!! Others in this thread said that their cards were damaged when they hit 1.088v, you are dangerously close. Unless you have a backup card and don't care what happens to your 590, I would put a GTX590 bios back on there as soon as possible and leave the MARS II BIOS alone!


Pedro has been doing this for some time. Every now and then he posts his results of cross-flashing with the Mars II BIOS, possibly because people forget, and there are new owners too who might not know it can be done. Close to the bone maybe, but good work Pedro.

As always with overclocking, accept the risk or don't do it.


----------



## OccamRazor

Hmmm, looks like there's bad contact between the core and the heatsink.
You should disassemble the card, check for bad contact, and reapply some thermal paste!
Possibly some screws are loose? That would make the heatsink shift so that only one core is in contact
while the other isn't; that would explain the rapid temperature rise.

Ed


----------



## Gobiel

Thank you both very much for the fast replies !

Here is a picture of my current configuration if it helps.


I'll look into that, OccamRazor, thank you.


----------



## OccamRazor

Quote:


> Originally Posted by *Gobiel*
> 
> Thank you both very much for the fast replies !
> 
> Here is a picture of my current configuration if it helps.
> 
> 
> I'll look into that, OccamRazor, thank you.


Not at all, my friend!


----------



## Canis-X

Quote:


> Originally Posted by *alancsalt*
> 
> Pedro has been doing this for some time. Every now and then he posts his results of cross-flashing with the Mars II BIOS, possibly because people forget, and there are new owners too who might not know it can be done. Close to the bone maybe, but good work Pedro.
> 
> As always with overclocking, accept the risk or don't do it.


I was just worried because of the other folks' cards burning up at 1.088v, is all. Do you think it is possibly a false reading?


----------



## Pedropc

Hi Canis-X, GPU-Z measures the voltage wrong with the Mars II BIOS on the GTX 590. I've been on this BIOS since August 15, 2012, and have had no problems. It does put the voltage at 1.075V for 1 or 2 seconds before GPU Tweak loads, but I have not had any kind of problem. Anyway, I think the Mars II BIOS manages the voltage better than the GTX 590 BIOS. Greetings.


----------



## mundivalur

MKHunt, can you show me/us your settings in the NVIDIA Control Panel? Manage 3D Settings (global and per-program), and also Adjust Image Settings with Preview.
I see a lot of performance difference when I play with those, but I can't remember what was best.


----------



## MKHunt

I haven't touched those whatsoever. I do a clean install when I change drivers too, so it's 100% stock there.


----------



## vagenrider

Today I received my block, so.....................................PROJECT COMPLETED!!!!!!!!!!!!!!!!!!!


----------



## alancsalt

Wonder if Pedro has the later inductors MKHunt spoke of? http://www.overclock.net/t/1168598/the-590-has-a-new-vrm-scheme-with-pictures

And the difference in voltage tolerance....


----------



## MKHunt

Quote:


> Originally Posted by *alancsalt*
> 
> Wonder if Pedro has the later inductors MKHunt spoke of? http://www.overclock.net/t/1168598/the-590-has-a-new-vrm-scheme-with-pictures
> 
> And the difference in voltage tolerance....


Most definitely not. Sadly, the hacked MARS II BIOS will not flash to a card with the .42 BIOS and the better power scheme.


----------



## Pedropc

Hi, I have been testing a .42 BIOS on my GTX 590. With the .37 BIOS it does not regulate the voltage changed with NiBiTor, but with the .42 BIOS it does regulate the NiBiTor-modified voltage. Curious.

Something curious happened to me this morning: by mistake, I played Crysis 3 for about two hours with the voltage at 1.075V, and there was no problem, with a maximum temperature of 49°C. Maybe I'm wrong, but I still think the Mars II BIOS manages power better than the GTX 590 BIOS. Greetings.


----------



## vagenrider

BIOS Mars II??? what is that?


----------



## Pedropc

The GTX 590 with the .37 BIOS can be flashed to the ASUS Mars II BIOS. With this BIOS you can change the voltage, you can overclock, and most importantly there is no PDL. Greetings.


----------



## OccamRazor

Quote:


> Originally Posted by *vagenrider*
> 
> BIOS Mars II??? what is that?


There are 2 different 590s as far as the power system goes: .37 and .42 BIOS.
The .37 cards are the first ones, and the ASUS Mars II BIOS is a .37, so almost all of the early 590s
accept this BIOS; however, not all can handle the voltage, and some can die from it.
The .37 BIOS doesn't work on the .42 cards, but I've known guys who flashed the .42 BIOS onto their .37 590s and it worked.
I've got no evidence of it, so I'm not going to comment on that; Pedropc is working on something similar, but I don't know if it's working!

Cheers

Ed

P.S. There are some early 590s that were not NVIDIA reference design, and some PCBs had significant changes;
perhaps Pedropc's 590 is one of them, to endure such voltage without burning.


----------



## vagenrider

Quote:


> Originally Posted by *Pedropc*
> 
> The GTX 590 with the .37 BIOS can be flashed to the ASUS Mars II BIOS. With this BIOS you can change the voltage, you can overclock, and most importantly there is no PDL. Greetings.


Thank you for the info. I have a .42, so this is not for me.


----------



## Canis-X

Both of my cards are the original .37 BIOS versions. I have been running the .42 version BIOS on them for a while now with no issues. I also flashed them successfully to the ASUS MARS II BIOS, in which I manually turned down the voltage using NiBiTor to prevent them from burning up. This worked successfully for me, but for others who used the same BIOS that I uploaded, their cards would ramp the voltage up to 1.088v, which resulted in damage to their cards.

Pedropc had mentioned that he was running the ASUS MARS II BIOS on his card without modifying the voltage with NiBiTor and was not experiencing any issues. I flashed both of my cards to the unmodified stock ASUS MARS II BIOS, but immediately noticed that the voltages on my cards were all over the place and would peak at 1.075v, which worried me, so I immediately flashed my cards back to the GTX 590 .42 BIOS and have been running them with that ever since. Not sure if I will flash them back to the ASUS MARS II BIOS again, but I might, now that my CPU is connected to my phase change, to see if I can get my scores any better.


----------



## SloaneRanger

Hello MKHunt,

Do I need to set Settings 0 & 1 in NiBiTor to 1.0005V too, like Setting 2, or can I leave them at 0.8755V for Setting 0 and 0.963V for Setting 1?

Thanks


----------



## OccamRazor

Quote:


> Originally Posted by *SloaneRanger*
> 
> Hello MKHunt,
> 
> Do I need to set Settings 0 & 1 in NiBiTor to 1.0005V too, like Setting 2, or can I leave them at 0.8755V for Setting 0 and 0.963V for Setting 1?
> 
> Thanks


To prevent wild voltage fluctuations and crashes, it's best to set Settings 0, 1 and 2 to the same voltage. I got 800MHz stable with 1.025V and 850 stable with 1.038V,
but all the fields must have the same volts; that way there are no jumps in volts during 3D clocks over more than 1 hour of gaming.
I found out that the shift between Settings 0, 1 and 2 to and from the 3D state leads to crashes if the volts are set differently in them, because the 2 cores require different voltages, sometimes 0.025V apart.

Ed
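If you keep your planned NiBiTor values in a throwaway script before flashing, a mismatch is easy to catch automatically. This is a minimal sketch; the setting labels below are just illustrative names, not NiBiTor's actual field names:

```python
# Sanity check before flashing: all three "Fermi voltage" settings
# (0, 1 and 2) should carry the same value, or the driver can swing
# the cores to different voltages mid-game and crash.
# The labels below are illustrative, not NiBiTor's actual field names.
planned = {
    "setting 0": 1.025,
    "setting 1": 1.025,
    "setting 2": 1.025,
}

if len(set(planned.values())) > 1:
    print("WARNING: settings 0/1/2 differ -- expect voltage jumps and crashes")
else:
    print("OK: all three settings match at", next(iter(planned.values())), "V")
```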


----------



## Gobiel

For an update: while looking for the screws to tighten, I saw that there was dust *inside* the radiator (for the GPU near the DVI ports). It clearly was something that couldn't be removed just with compressed air... so I proceeded to remove the clumps of dust by hand from behind the blades of the fan.

It worked !







The temperatures are now "normal", with a 36°C idle on that "problematic" GPU2 and a max of 69°C in game. Ridiculous how this card didn't even crash with no air intake while I was trying to use it...









Have a good day.


----------



## SloaneRanger

Quote:


> Originally Posted by *OccamRazor*
> 
> To prevent wild voltage fluctuations and crashes, it's best to set Settings 0, 1 and 2 to the same voltage. I got 800MHz stable with 1.025V and 850 stable with 1.038V,
> but all the fields must have the same volts; that way there are no jumps in volts during 3D clocks over more than 1 hour of gaming.
> I found out that the shift between Settings 0, 1 and 2 to and from the 3D state leads to crashes if the volts are set differently in them, because the 2 cores require different voltages, sometimes 0.025V apart.
> 
> Ed


Thanks for your nice help OccamRazor


----------



## vagenrider

My pump arrived today (Swiftech MCP655)... that's the good news. The bad news is the accident I had last night while replacing the clamps (which turned out to be the wrong clamps): a very big nightmare of a leak from one of the GPU's connectors dropped liquid onto the back of the PCB and onto the motherboard during the test run. So I took a hairdryer to dry out the liquid (about 1 hour), and afterwards I used tie-wraps instead of clamps, which is a very simple and good solution with zero leaks anywhere.

Finally everything went well: test run again, and power on.


----------



## vagenrider

Quote:


> Originally Posted by *MKHunt*
> 
> I haven't touched those whatsoever. I do a clean install when I change drivers too so its 100% stock there.


I ran Unigine Valley today for the first time at 988mV, 700 core, 1770 mem, with this score...



Is it good or bad?


----------



## MKHunt

Your average fps is quite good as is your max, but the min fps is the lowest I've seen on a 590. Good job though. Try pushing it to 740.


----------



## vagenrider

Quote:


> Originally Posted by *MKHunt*
> 
> Your average fps is quite good as is your max, but the min fps is the lowest I've seen on a 590. Good job though. Try pushing it to 740.


In the next few hours I will upload some proof photos of the serious failure of the XSPC 590 waterblock...


----------



## Canis-X

??? Really!? What's wrong with it? I have this style block on both of my 590's.


----------



## vagenrider

From the beginning I had problems with this block (XSPC)... I'll say only this: almost 75°C at full load! I thought... ***, that's an air-cooling temp!!!! I checked the screws and found something I didn't like: the base was moving. So I made the hard decision to reopen it....

Here's what I found..............................

10 bases like this were dislodged!!!!!!!!!!!!!!!


Some of them had been left on the card.....


I took my time... to think about what I could do..... Eventually I decided to tap them back in with a hammer and glue the bases in place. Believe me, it was incredibly risky... even one wrong hit could have done great damage..

Finally everything went just fine.......

The start..




And the finish.... this is one of the 10 dislodged bases..


My temps now are great: 55°C after 1 hour of Unigine Valley in Extreme HD..


----------



## Canis-X

Unreal! Neither of mine have given me any problems.


----------



## MKHunt

Wow. XSPC has good CS in my experience; they even gave me clearance measurements when I was looking for a .42-compatible block.

That is very very surprising.


----------



## vagenrider

So... failures happen even to the good companies, as you can see. The bases not being screwed in or made as one piece is an extremely bad design in this block.


----------



## vagenrider

I want to give it 1013mV... is that very dangerous for 1-2 hours of gaming? Give me some answers, please.


----------



## MKHunt

OccamRazor runs his cards at 1025mV for gaming, so 1013mV might be fine. But each card is different, so there are no guarantees of safety.


----------



## vagenrider

I know that each card is different; I'm asking from your experience. What voltage do you use?


----------



## Canis-X

I am currently using 1.013v for both of my cards for 24/7 use and have not experienced any issues/problems.


----------



## vagenrider

Very good... I'm thinking of trying that too. Can you give me your Fermi voltage table? I mean the 0-1-2 settings.


----------



## OccamRazor

I've tried a lot of different voltages; it's best to keep it safe at the lowest volts possible for a stable core speed. Keep in mind that different values in the Fermi voltage fields allow a voltage imbalance between the cores, since they require different voltages to run stable. I had crashes every time due to improper voltage setting by the drivers: set to 1.038V, core #2 had 0.975V, and of course at 850MHz it would crash!

So, unlike this image, fill all fields with the same volts!

Cheers

Ed


----------



## vagenrider

Quote:


> Originally Posted by *OccamRazor*
> 
> I've tried a lot of different voltages; it's best to keep it safe at the lowest volts possible for a stable core speed. Keep in mind that different values in the Fermi voltage fields allow a voltage imbalance between the cores, since they require different voltages to run stable. I had crashes every time due to improper voltage setting by the drivers: set to 1.038V, core #2 had 0.975V, and of course at 850MHz it would crash!
> 
> So, unlike this image, fill all fields with the same volts!
> 
> Cheers
> 
> Ed


Thank you for the answer. I set it to 1013mV and it's stable at 750, but I don't see any difference (as I expected), so I'm going back to 988mV with 710.


----------



## OccamRazor

OK guys, tell me your experiences with volts and core speed.
I get:

0,988v tops at 710mhz
1,0005v tops at 730mhz
1,013v tops at 760mhz
1,0255v tops at 820mhz
1,038v tops at 850mhz
1,0505v tops at 880mhz
1,063v tops at 910mhz

Temps never go over 65°C with active watercooling (800 L/h [212 US gal/h] pump, 2 m² radiator, 400mm fan),
and the voltages never spike.

Ed
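If anyone wants a rough idea where their card might land between these points, here's a quick linear-interpolation sketch. It is illustrative only, built from my numbers above; every card scales differently:

```python
# Voltage/clock pairs measured on one GTX 590 (watercooled) --
# an illustration of the scaling, not a guide for other cards.
POINTS = [  # (volts, max stable core MHz)
    (0.988, 710),
    (1.0005, 730),
    (1.013, 760),
    (1.0255, 820),
    (1.038, 850),
    (1.0505, 880),
    (1.063, 910),
]

def max_clock(volts):
    """Linearly interpolate the max stable core clock for a given voltage."""
    if volts <= POINTS[0][0]:
        return POINTS[0][1]
    if volts >= POINTS[-1][0]:
        return POINTS[-1][1]
    for (v0, c0), (v1, c1) in zip(POINTS, POINTS[1:]):
        if v0 <= volts <= v1:
            return c0 + (c1 - c0) * (volts - v0) / (v1 - v0)

print(max_clock(1.032))  # somewhere between the 820 and 850 MHz points
```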


----------



## vagenrider

Quote:


> Originally Posted by *OccamRazor*
> 
> Ok guys, tell me your experiences with volts and core speed,
> i get:
> 
> 0,988v tops at 710mhz
> 1,0005v tops at 730mhz
> 1,013v tops at 760mhz
> 1,0255v tops at 820mhz
> 1,038v tops at 850mhz
> 1,0505v tops at 880mhz
> 1,063v tops at 910mhz
> 
> temps never go over 65º with active watercooling (800ltr/212usgal/hour pump/2m2 radiator/ 400mm fan)
> and voltages never spike
> 
> Ed


very interesting....What interests me more is.... it is safe the 1025+820 for some time?

There are guys here that have done damage to their cards with clock like this?


----------



## MKHunt

Quote:


> Originally Posted by *OccamRazor*
> 
> Ok guys, tell me your experiences with volts and core speed,
> i get:
> 
> 0,988v tops at 710mhz
> 1,0005v tops at 730mhz
> 1,013v tops at 760mhz
> 1,0255v tops at 820mhz
> 1,038v tops at 850mhz
> 1,0505v tops at 880mhz
> 1,063v tops at 910mhz
> 
> temps never go over 65º with active watercooling (800ltr/212usgal/hour pump/*2m2 radiator*/ 400mm fan)
> and voltages never spike
> 
> Ed


My card touches 50C at 1.0005V

2 square meters?!?!? That's a ton of rad. At 1.0005V my card kisses 50-51C at 820MHz. I wonder what the VRMs are running at, temperature-wise? I am using the EK updated block on a .42 card, so the inductors have contact with the block, the MOSFETs have contact with the block, and the cores/NF200 chip all have contact. Do you think this adds safety, or merely transfers extra thermal energy from the cores to the power circuitry? I am unfamiliar with standard operating temps for the VRMs. I am also a little wary that with my MCP-35X, 6 fans, peripherals, and everything overclocked, I might be pushing the limits of my 850W power supply.


----------



## vagenrider

Quote:


> Originally Posted by *MKHunt*
> 
> My card touches 50C at 1.0005V
> 
> 2 square meters?!?!? thats a ton of rad. At 1.0005V my card kisses 50-51C at 820MHz. I wonder what the VRMs are running at, temperature-wise? I am using the EK update block on a .42 card; so the inductors have contact with the block, the MOSFETs have contact with the block, and the cores/nf200 chip all have contact. Do you think this would add safety or merely transfer extra thermal energy from the cores to the power circuitry? I am unfamiliar with standard operating temps for the VRMs. I am also a little wary that with my MCP-35X, 6 fans, peripherals, and overclocked everything that I might be pushing the limits of my 850W power supply.


I think 1025 is too much... and there is not much difference in performance from 710 to 820.


----------



## OccamRazor

Quote:


> Originally Posted by *MKHunt*
> 
> My card touches 50C at 1.0005V
> 
> 2 square meters?!?!? thats a ton of rad. At 1.0005V my card kisses 50-51C at 820MHz. I wonder what the VRMs are running at, temperature-wise? I am using the EK update block on a .42 card; so the inductors have contact with the block, the MOSFETs have contact with the block, and the cores/nf200 chip all have contact. Do you think this would add safety or merely transfer extra thermal energy from the cores to the power circuitry? I am unfamiliar with standard operating temps for the VRMs. I am also a little wary that with my MCP-35X, 6 fans, peripherals, and overclocked everything that I might be pushing the limits of my 850W power supply.


Well, it's 2 square meters in a cylindrical shape, not as efficient as a normal rectangular shape, but this 400mm fan at 4500rpm is nothing short of amazing at removing heat from the radiator.
I'm using the same EK block; it touches every component of the card, it's that good! So the VRMs, capacitors and MOSFETs are all touching the block; don't worry about temps!
Try a power meter (the kind you plug into the wall and then plug your system into); don't trust the online power calculators. I had a 920 overclocked to 4.2GHz, 570 SLI with both cards overclocked to 900MHz, and 4 hard drives on a Corsair TX750; while gaming it touched 650W. The online calculator said that wasn't enough, that it took 800W! lol

Ed
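To put numbers on the wall-meter point: online calculators add up worst-case TDPs, while the meter shows real draw. A throwaway sketch; every wattage below is an assumed round figure for illustration, not a measurement, except the 650W wall reading mentioned above:

```python
# Why online PSU calculators overshoot: they sum worst-case TDPs,
# while a wall meter measures the system's real draw.
# All component wattages here are assumed round figures for illustration.
worst_case_tdp = {
    "i7-920 @ 4.2GHz": 200,
    "GTX 570 SLI @ 900MHz": 2 * 250,
    "4 hard drives": 4 * 10,
    "board, fans, misc": 60,
}
calculator_estimate = sum(worst_case_tdp.values())  # the calculator-style answer

measured_at_wall = 650     # actual gaming draw seen on the wall meter
psu_efficiency = 0.85      # assumed typical efficiency for a TX750 under load
real_dc_load = measured_at_wall * psu_efficiency

print(f"calculator-style worst case: {calculator_estimate} W")
print(f"real DC load behind the PSU: {real_dc_load:.0f} W")
```

The gap between the two numbers is the headroom the calculators hide from you.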


----------



## OccamRazor

Quote:


> Originally Posted by *vagenrider*
> 
> I think 1025 is too much... and there is not much difference in performance from 710 to 820.


In benchmarks I agree with you; the drivers cut performance, there's no doubt about that. But in gaming the overclock is enough to see everything fluid
in contrast to stock. I've got a 120Hz monitor, and believe me, I can SEE the difference!
Ed


----------



## vagenrider

Quote:


> Originally Posted by *OccamRazor*
> 
> In benchmarks I agree with you; the drivers cut performance, there's no doubt about that. But in gaming the overclock is enough to see everything fluid
> in contrast to stock. I've got a 120Hz monitor, and believe me, I can SEE the difference!
> Ed


I don't understand why you say that... for me, overclocking affects (for good or bad) both games and benches.


----------



## emett

The newer drivers for the GTX 590 limit voltage in programs like 3DMark, whereas in games they don't. Therefore you can run a higher overclock in games.
You can use early drivers to bypass this limit; however, the voltage you gain is outweighed by the performance of the much newer drivers.


----------



## vagenrider

Quote:


> Originally Posted by *emett*
> 
> The newer drivers for the GTX 590 limit voltage in programs like 3DMark, whereas in games they don't. Therefore you can run a higher overclock in games.
> You can use early drivers to bypass this limit; however, the voltage you gain is outweighed by the performance of the much newer drivers.


what? ....thank you very much


----------



## MKHunt

I submitted a request to EVGA for a 590-compatible UEFI VBIOS. I don't expect much to come of it, since they seem to be very picky about handing them out, and JacobF said on the forums not to expect anything for the 500 series...

But Gigabyte also said over and over again that the 6-series chipsets could not hold the amount of code a UEFI BIOS takes.

Right up to the point when they released UEFI for the 6 series.


----------



## vagenrider

Quote:


> Originally Posted by *MKHunt*
> 
> I submitted a request to EVGA for 590-compatible UEFI VBIOS. I don't expect much to come of it since they seem to be very picky about handing them out and JacobF said on the forums to not expect anything for the 500 series...
> 
> But Gigabyte also said over and over again that the 6 series chipsets could not hold the amount of code a UEFI BIOS takes.
> 
> Right up to the point when they released UEFI for the 6 series.


What do you mean?


----------



## OccamRazor

Quote:


> Originally Posted by *MKHunt*
> 
> I submitted a request to EVGA for 590-compatible UEFI VBIOS. I don't expect much to come of it since they seem to be very picky about handing them out and JacobF said on the forums to not expect anything for the 500 series...
> 
> But Gigabyte also said over and over again that the 6 series chipsets could not hold the amount of code a UEFI BIOS takes.
> 
> RIght up to the point when they released UEFI for the 6 series.


That would be excellent! But I don't think they will. It would be great for us, since we could remove PDL forever and set voltages and clocks at will for all P-states!
Let's see what happens...
Anyway, it was a good move!









Ed


----------



## MKHunt

Nope. They suggested I could get a 690 or Titan and have UEFI.


----------



## jeromeface

Worth a shot, thanks for trying MK


----------



## Krazeswift

Hey guys quick question if I may...

Had to RMA my ASUS 590 a few weeks ago. Long story short, ASUS can't repair or replace it, so the supplier is offering me a 4GB EVGA GTX 680 (LN50000) + backplate instead.

Worthy compromise?

Cheers


----------



## rush2049

Quote:


> Originally Posted by *Krazeswift*
> 
> Hey guys quick question if I may...
> 
> Had to RMA my ASUS 590 a few weeks ago. Long story short, ASUS can't repair or replace it, so the supplier is offering me a 4GB EVGA GTX 680 (LN50000) + backplate instead.
> 
> Worthy compromise?
> 
> Cheers


-the 590 performs slightly better
-the 680 can't raise its voltage (a worse issue than the 590 had)
-the 680 won't have SLI issues

-you could buy a second 680 and get double the performance!


----------



## MKHunt

Friends, I've been having the upgrade itch. My problem, though, is that my 590, overclocked to where it is, is a difficult card to make a substantial upgrade from for $1000. I'd hop on a 690, except they crippled it with a low 2GB of VRAM and limited OC headroom. The OC headroom can partially be fixed with BIOS mods, but then you still have NVIDIA's stupid boost clocks, which from everything I've read are absolutely maddening. It performs very well at 1080p, but I run 1440p. The 1440p is also what's giving me a little bit of heart pangs with the 590. I haven't seen any VRAM issues (things are still buttery smooth) but I know they're there.

SO, 1440p... what about Titan? Well, I want to keep a single PCB since I water cool, and a single Titan is incredibly underwhelming. Most of the Titan OCs in Karlitos' Heaven thread are around 72-76fps. With my 590 at 740MHz, I pulled 61.9fps. A 10-14fps gain at 1080p for $1000? That's hilarious. The gain is even less when you consider my OC is now 850MHz.

So that leaves me in the 7990 camp. Power draw would be about the same as, if not a bit less than, my 590, and dual 7970s consistently outperform GK104 offerings. It's got a lot of VRAM too, and a wider memory bus than GK104. If AMD releases their Malta 7990, it should outperform existing 7990 releases due to a higher base clock, and be more on par with CFX GHz 7970s when both are OC'ed. But then I'd be firmly in the red camp, and I don't know how I feel about that.

Of course, this is all assuming that any cards released later this year would be GK114 based, which would be a disappointment. Maxwell is the arch I'd be shooting for, but in reality that won't see consumer hands until April-July 2014.

The GPU market really kind of sucks right now.


----------



## Krazeswift

Quote:


> Originally Posted by *rush2049*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krazeswift*
> 
> Hey guys quick question if I may...
> 
> Had to RMA my ASUS 590 a few weeks ago. Long story short, ASUS can't repair or replace it, so the supplier is offering me a 4GB EVGA GTX 680 (LN50000) + backplate instead.
> 
> Worthy compromise?
> 
> Cheers
> 
> 
> 
> -the 590 performs slightly better
> -the 680 can't raise its voltage (worse issue than the 590 had)
> -the 680 won't have sli issues
> 
> -you could buy a second 680 and get double the performance!

Didn't realise the 680 voltages were locked... How well do they overclock?

Well, I've decided to go for it. I think a 4GB backplated 680 is a pretty good deal, and I can always go SLI as you mentioned.

Was tempted by a 690, but I'm guessing two 4GB 680s will be noticeably quicker.


----------



## rush2049

Quote:


> Originally Posted by *Krazeswift*
> 
> Didn't realise the 680 voltages were locked.. How well do they overclock?
> 
> Well I've decided to go for it. I think a 4gb backplated 680 is a pretty good deal I can always go sli as you mentioned.
> 
> Was tempted by a 690 but I'm guessing two 4GB 680s will be noticeably quicker..


I don't have one, but from what I have read.... not too great compared to how good overclocking was on the 580's.


----------



## Krazeswift

Quote:


> Originally Posted by *rush2049*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krazeswift*
> 
> Didn't realise the 680 voltages were locked.. How well do they overclock?
> 
> Well I've decided to go for it. I think a 4gb backplated 680 is a pretty good deal I can always go sli as you mentioned.
> 
> Was tempted by a 690 but I'm guessing two 4GB 680s will be noticeably quicker..
> 
> 
> 
> I don't have one, but from what I have read.... not too great compared to how good overclocking was on the 580's.

That's a bit disappointing. Well, thanks for the info.


----------



## alancsalt

They still beat 580's in benches though. 580's get similar results to 660's.


----------



## MKHunt

Well, I just impulse bought a Titan.


----------



## OccamRazor

Quote:


> Originally Posted by *MKHunt*
> 
> Well, I just impulse bought a Titan.


Well, I'm having the same impulse; please keep me in the loop on all the feedback you have, OC and all!








I'm in the same boat as you, but I'll probably sell my VG278H and buy 3x VG248QE.
I'll need more than the 590 for sure!

Ed


----------



## vagenrider

I made a video of my PC to show my subscribers and wanted to share it with you.


----------



## jeromeface

Can't wait to hear what kinds of fun you have with your titan MK!


----------



## MKHunt

Thanks chaps. The 590 might be staying in the family, as my younger brother has expressed interest in building a gaming rig and the card is still a beast at 1080p.

It won't be water cooled if he uses it though, so if you're looking for the full cover EK block and backplate, let me know.


----------



## OccamRazor

Quote:


> Originally Posted by *MKHunt*
> 
> Thanks chaps. The 590 might be staying in the family, as my younger brother has expressed interest in building a gaming rig and the card is still a beast at 1080p.
> 
> It won't be water cooled if he uses it though, so if you're looking for the full cover EK block and backplate, let me know.


So, I'm about to take the Titan plunge!








Any juice on the Titan? How are the benchmarks? Does it clock well?
I'm in pain here, man...









Ed


----------



## MKHunt

Quote:


> Originally Posted by *OccamRazor*
> 
> So, I'm about to take the Titan plunge!
> 
> 
> 
> 
> 
> 
> 
> 
> Any juice on the Titan? How are the benchmarks? Does it clock well?
> I'm in pain here, man...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ed


My order was surprise-cancelled because they ran out of stock and the inventory didn't update. I re-ordered this morning, so I won't have benches for a week to a week and a half.


----------



## OccamRazor

Quote:


> Originally Posted by *MKHunt*
> 
> My order was surprise-cancelled because they ran out of stock and the inventory didn't update. I re-ordered this morning, so I won't have benches for a week to a week and a half.


Why not go for one of these? http://www.overclock.net/t/1370486/3x-gtx-titans-sapphire-oc-7970-with-waterblock-sapphire-oc-7970-stock-2-zotac-gtx-680-2gbs-fs — it's legit, I believe!

Ed


----------



## MKHunt

Quote:


> Originally Posted by *OccamRazor*
> 
> Why not go for one of these? http://www.overclock.net/t/1370486/3x-gtx-titans-sapphire-oc-7970-with-waterblock-sapphire-oc-7970-stock-2-zotac-gtx-680-2gbs-fs — it's legit, I believe!
> 
> Ed


Amazon would take an electronic check from my investing account









ETA: aaaand it shipped. Price was the same as in that thread but it was through Amazon. Now I wait and see if I want to drain my loop and get a dual barb to bypass my current GPU water block and run Titan on air for a while or if I just want to wait until EK finally releases their true full cover block.... (they showed pics on Twitter today)


----------



## opasna228

Hello guys, can you help me? I have a Zotac GTX 590 with the .37 BIOS; when I try to go higher than 0.963V it resets to 0.963V again... What's the problem? How do you get the voltage higher than 0.963?


----------



## MKHunt

Quote:


> Originally Posted by *opasna228*
> 
> Hello guys, can you help me? I have a Zotac GTX 590 with the .37 BIOS; when I try to go higher than 0.963V it resets to 0.963V again... What's the problem? How do you get the voltage higher than 0.963?


In MSI Afterburner, drag the voltage bar all the way to the bottom and then hit Apply with your selected overclock. It should work then.


----------



## opasna228

It doesn't work; the maximum voltage I can get in MSI Afterburner is always 0.963 (even if I flash to higher volts).


----------



## alancsalt

Have you been into 'Settings' in AB and checked 'unlock voltage'?


----------



## opasna228

Quote:


> Originally Posted by *alancsalt*
> 
> Have you been into 'Settings' in AB and checked 'unlock voltage'?


Yep :c


----------



## opasna228

I'm on air cooling, so I don't want a crazy OC. I want to get 0.988V at least, but it always drops to 0.963. Can anyone give me a BIOS with these volts? I wonder, is it possible to flash a .42 BIOS from another manufacturer over my .37?


----------



## vagenrider

Quote:


> Originally Posted by *opasna228*
> 
> I'm on air cooling, so I don't want a crazy OC. I want to get 0.988V at least, but it always drops to 0.963. Can anyone give me a BIOS with these volts? I wonder, is it possible to flash a .42 BIOS from another manufacturer over my .37?


Don't do it.... 988mV is too much for air cooling.... you will see temps like 100°C+ at full load.

Stay at stock voltage and give the GPU an extra 40-50MHz, or go to water.


----------



## opasna228

Quote:


> Originally Posted by *vagenrider*
> 
> Don't do it.... 988mV is too much for air cooling.... you will see temps like 100°C+ at full load.
> 
> Stay at stock voltage and give the GPU an extra 40-50MHz, or go to water.


----------



## vagenrider

Quote:


> Originally Posted by *opasna228*


ok....


----------



## opasna228

Quote:


> Originally Posted by *vagenrider*
> 
> ok....


Quote:


> Originally Posted by *vagenrider*
> 
> ok....


Don't worry, I can't even set it higher than 963mV... xD... so can you help me?


----------



## IronMaiden1973

Thanks for the info on the air-cooled max OC speed. I'm about to get a second ASUS 590 and run them in SLI. I'd love to water cool both, but the funds don't stretch that far; I've got a water system running the CPU though, so it's a simple enough step to do... Guys, what sort of performance increase can I expect to see with 590s in SLI in games like Battlefield 3, Crysis 3 and Far Cry 3? And with those games run in 2D and 3D on one monitor, and in 2D and 3D on 3 monitors? (And feel free to add whatever games you know about if you'd like.)
Cheers


----------



## vagenrider

Quote:


> Originally Posted by *IronMaiden1973*
> 
> Thanks for the info on the air-cooled max OC speed. I'm about to get a second ASUS 590 and run them in SLI. I'd love to water cool both, but the funds don't stretch that far; I've got a water system running the CPU though, so it's a simple enough step to do... Guys, what sort of performance increase can I expect to see with 590s in SLI in games like Battlefield 3, Crysis 3 and Far Cry 3? And with those games run in 2D and 3D on one monitor, and in 2D and 3D on 3 monitors? (And feel free to add whatever games you know about if you'd like.)
> Cheers


check my system..


----------



## vagenrider

Quote:


> Originally Posted by *opasna228*
> 
> Don't worry, I can't even set it higher than 963mV... xD... so can you help me?


Have you set the voltage steps correctly in NiBiTor?


----------



## IronMaiden1973

Oh yeah, I forgot to add "nice video".. And that's just with one 590! I have an i7 3820 at stock 3.6 at the moment. I think I might play with it some more and get it stable. I just smashed some buttons in the BIOS and the PC ran at 4.7 for about 70% of the time.. But I had no idea about the new P9X79 Pro MB BIOS, coming from an ASUS Striker II board.. Wow, feels funny just typing that again. My FC3 game is not as smooth as yours with my stock 3820 and stock 590.


----------



## vagenrider

Quote:


> Originally Posted by *IronMaiden1973*
> 
> Oh yeah, I forgot to add "nice video".. And that's just with one 590! I have an i7 3820 at stock 3.6 at the moment. I think I might play with it some more and get it stable. I just smashed some buttons in the BIOS and the PC ran at 4.7 for about 70% of the time.. But I had no idea about the new P9X79 Pro MB BIOS, coming from an ASUS Striker II board.. Wow, feels funny just typing that again. My FC3 game is not as smooth as yours with my stock 3820 and stock 590.


First of all, thank you! Also, I don't know anything about that mobo's BIOS.. Are you thinking about putting your 590 under water?


----------



## opasna228

Yes, I think so... I have read that the .37 BIOS doesn't allow you to go higher than 0.963 with new drivers, so can I flash it to .42 with no problems, and where can I get it? There's no BIOS for my card on TechPowerUp :c


----------



## vagenrider

Quote:


> Originally Posted by *opasna228*
> 
> Yes, I think so... I have read that the .37 BIOS doesn't allow you to go higher than 0.963 with new drivers, so can I flash it to .42 with no problems, and where can I get it? There's no BIOS for my card on TechPowerUp :c


I don't know anything about newer drivers not allowing more voltage on .37... I have .42 at 1005mV with the latest drivers... you'll have to ask someone with .37 to help you.


----------



## opasna228

Quote:


> Originally Posted by *vagenrider*
> 
> I don't know anything about newer drivers not allowing more voltage on .37... I have .42 at 1005mV with the latest drivers... you'll have to ask someone with .37 to help you.


I just wanted to know: does anything bad happen if I flash a .42 BIOS from another manufacturer? From Gigabyte, for example.


----------



## vagenrider

Quote:


> Originally Posted by *opasna228*
> 
> I just wanted to know: does anything bad happen if I flash a .42 BIOS from another manufacturer? From Gigabyte, for example.


I've heard it has worked for some. But the risk remains, I believe..


----------



## IronMaiden1973

To be honest, I'm probably going to have to do water. It's pretty loud with just the one fan spinning at 95%; I can only imagine with two.. I'm just happy I have full "cans" for headphones. Do you know how to get MSI Afterburner to run the fan at 100%? For some reason the upper limit is only 95% for fan control..


----------



## vagenrider

Quote:


> Originally Posted by *IronMaiden1973*
> 
> To be honest, I'm probably going to have to do water. It's pretty loud with just the one fan spinning at 95%; I can only imagine with two.. I'm just happy I have full "cans" for headphones. Do you know how to get MSI Afterburner to run the fan at 100%? For some reason the upper limit is only 95% for fan control..


Yes, I know.. I have adjusted it to 100% too, but nothing.. it's some kind of limitation..


----------



## IronMaiden1973

Well, I guess if 95% is the MAX, then that's really 100%.


----------



## vagenrider

Quote:


> Originally Posted by *IronMaiden1973*
> 
> Well, I guess if 95% is the MAX, then that's really 100%.


exactly..


----------



## MKHunt

Friends, can anyone with STOCK EVGA *.42* BIOS hook a brother up with the files? I'm putting my card back on air and I don't trust 1.0005V on air lol.


----------



## OccamRazor

Quote:


> Originally Posted by *MKHunt*
> 
> Friends, can anyone with STOCK EVGA *.42* BIOS hook a brother up with the files? I'm putting my card back on air and I don't trust 1.0005V on air lol.




No need, just fill the fields with these values!

So, hows the titans?

Ed


----------



## vagenrider

Quote:


> Originally Posted by *OccamRazor*
> 
> No need, just fill the fields with these values!
> 
> So, hows the titans?
> 
> Ed


also think the same..


----------



## MKHunt

Quote:


> Originally Posted by *OccamRazor*
> 
> 
> 
> No need, just fill the fields with these values!
> 
> So, hows the titans?
> 
> Ed


I am bleeding my loop as I type this. I peeled the plastic off my 590 _for the first time_ just for this photo.


----------



## Canis-X

If only the wife didn't decide that we needed hardwood floors installed with this year's tax returns......I make the money, she spends it......last year it was a privacy fence for the back yard......I'm sure next year will be redoing bathrooms or replacing windows....ugg.


----------



## MKHunt

Quote:


> Originally Posted by *Canis-X*
> 
> If only the wife didn't decide that we needed hardwood floors installed with this year's tax returns......I make the money, she spends it......last year it was a privacy fence for the back yard......I'm sure next year will be redoing bathrooms or replacing windows....ugg.


Hardwood floors are worth it. I just entered into contract on a condo last week and I've yet to decide if I want golden acacia or Siberian tigerwood. If I had to choose between Titans and floors, it would 100% be floors.









Bone stock single Titan score. Literally, as stock as it gets, air cooling and all. No k boost.

http://www.3dmark.com/3dm11/6286589

My best single 590 score.

http://www.3dmark.com/3dm11/6161787


----------



## OccamRazor

Quote:


> Originally Posted by *MKHunt*
> 
> Hardwood floors are worth it. I just entered into contract on a condo last week and I've yet to decide if I want golden acacia or Siberian tigerwood. If I had to choose between Titans and floors, it would 100% be floors.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bone stock single Titan score. Literally, as stock as it gets, air cooling and all. No k boost.
> 
> http://www.3dmark.com/3dm11/6286589
> 
> My best single 590 score.
> 
> http://www.3dmark.com/3dm11/6161787


Do you have more "sauce" on the titans?








I'm going for one next week; do you have any input on single-Titan gaming in the new games?
Thanks!

Ed


----------



## MKHunt

Single Titan was buttery at 1440 in Tomb Raider, Crysis 3, and Bioshock Infinite. Was it 60fps? I don't actually know, since I do not use FRAPS. However, there were absolutely zero slowdowns or hiccups. It was pretty freakin' nice.


----------



## vagenrider

Quote:


> Originally Posted by *MKHunt*
> 
> Single titan was buttery at 1440 in Tomb Raider, Crysis 3, and Bioshock Infinite. Was it 60fps? I don't actually know since I do not use FRAPS. However, there were absolutely zero slowdowns nor hiccups. It was pretty freakin' nice.


You mean you get 60 frames in these games? My 590 gets 80-100 in Far Cry 3 and Bioshock Infinite on ultra settings at 1920x1080.


----------



## MKHunt

Quote:


> Originally Posted by *vagenrider*
> 
> You mean you get 60 frames in these games? My 590 gets 80-100 in Far Cry 3 and Bioshock Infinite on ultra settings at 1920x1080.


I do not know the framerates, I guess I could download fraps and see. Adaptive Vsync is enabled on Bioshock Infinite because the screen tearing was atrocious. Also keep in mind that 2560x1440 has exactly 1,612,800 more pixels than 1920x1080.
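That pixel count is right; a trivial check:

```python
# Pixel-count difference between 2560x1440 and 1920x1080
qhd = 2560 * 1440   # 3,686,400 pixels
fhd = 1920 * 1080   # 2,073,600 pixels
print(qhd - fhd)    # 1612800 -- roughly 78% more work per frame
```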

If I had stayed at 1080p I never would have bought Titans, because at that resolution a 590 is 110% capable.

ETA: Aight, I disabled SLI and moved monitor 2 to card 2 so that it wouldn't affect the load (hint: extra monitors definitely lower frames).





Then I made sure to use the NVIDIA blank reviewer bios with zero OC.





I benched 1440p over 60 seconds of gameplay in Crysis 3, where you're underground in a dark cavern and Prophet is having some internal dialogue and hearing Ceph voices and there's a lot of fighting and stuff. I benched TR where you're finding shelter from the rain after crossing the old bomber. And I benched Crysis 2 where you're in an APC exiting the freeway tunnel, where two APCs and an insta-death helicopter await you. All settings maxed (Crysis 2 in DX11 with high-res textures and motion blur at medium).


----------



## vagenrider

2560x1440??? That's why....







The 590 will of course be much lower at this resolution.. so, well done with your new beast!


----------



## MKHunt

Hahaha yeah, 1.5GB was not enough at this res. 590 has the power, but not the memory. 690 will face the same trouble in 6ish months when games use even more vram.


----------



## OccamRazor

Quote:


> Originally Posted by *MKHunt*
> 
> Hahaha yeah, 1.5GB was not enough at this res. 590 has the power, but not the memory. 690 will face the same trouble in 6ish months when games use even more vram.


Even at 1080p I'm getting over 1.5GB of memory usage in both Tomb Raider and Bioshock, that's why I'm going for a Titan!
And on top of that, my monitor is flawless at 120Hz, and with the zero motion blur trick: http://120hz.net/showthread.php?1186-ASUS-VG278H-owners!!!-Zero-motion-blur-setting!-(looks-like-CRT-or-like-480Hz-LCD)
I need a lot of GPU power!









Ed


----------



## vagenrider

Quote:


> Originally Posted by *OccamRazor*
> 
> Even at 1080p I'm getting over 1.5GB of memory usage in both Tomb Raider and Bioshock, that's why I'm going for a Titan!
> And on top of that, my monitor is flawless at 120Hz, and with the zero motion blur trick: http://120hz.net/showthread.php?1186-ASUS-VG278H-owners!!!-Zero-motion-blur-setting!-(looks-like-CRT-or-like-480Hz-LCD)
> I need a lot of GPU power!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ed


For me, there's no problem with GPU memory at 1080p with all settings maxed in Bioshock Infinite, Tomb Raider 2013, Far Cry 3 and Crysis 3......


----------



## OccamRazor

Quote:


> Originally Posted by *vagenrider*
> 
> For me, there's no problem with GPU memory at 1080p with all settings maxed in Bioshock Infinite, Tomb Raider 2013, Far Cry 3 and Crysis 3......


Really? What framerates are you getting in Tomb Raider, Bioshock and Crysis 3?


----------



## vagenrider

Quote:


> Originally Posted by *OccamRazor*
> 
> Really? What framerates are you getting in Tomb Raider, Bioshock and Crysis 3?


In these games I'm on 80-100 frames.. I have already made a video of Far Cry 3 at those framerates



.. but if you don't believe it, I will make one for these games too.


----------



## OccamRazor

Quote:


> Originally Posted by *vagenrider*
> 
> In these games I'm on 80-100 frames.. I have already made a video of Far Cry 3 at those framerates
> 
> 
> 
> .. but if you don't believe it, I will make one for these games too.


Don't get me wrong, I believe you!








The problem is, AFAIK, all those games at maximum settings consume between 1.5 and 2.9GB of graphics memory (per reviews at respectable sites like Guru3D and HardOCP). When that happens, swapping between graphics memory and system memory occurs, and then there's stuttering and a drop in framerate. Now, I've seen people with a GTX 560 with 1GB of memory running everything maxed out and still going over 60fps, and people with Titans going 65fps!
How can that be?!
Well, from my point of view, different computers have different driver implementations, and sometimes even when you set maximum settings in game the drivers don't allow it for some unknown reason, so you get lower settings in game despite what the game menu says! It's hard to tell the difference between ultra and high settings, but memory-wise it's a huge difference, sometimes 1GB! And then there are memory leaks in the games' code that don't affect everyone...
On my end, when graphics memory consumption goes over 1.5GB I get stuttering and have to lower settings. This card has a lot of horsepower but it lacks memory, so I'm moving on!
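A minimal sketch of how you could watch that memory ceiling yourself, assuming a driver new enough to ship `nvidia-smi`; the `parse_vram` helper here is illustrative, not part of any NVIDIA tool:

```python
import subprocess

def parse_vram(csv_text):
    """Parse 'used, total' MiB pairs from nvidia-smi CSV output
    (one line per GPU -- a dual-GPU GTX 590 shows up as two)."""
    readings = []
    for line in csv_text.strip().splitlines():
        used, total = (int(v) for v in line.split(","))
        readings.append((used, total))
    return readings

def query_vram():
    # nvidia-smi ships with the driver; ask for per-GPU memory use
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True)
    return parse_vram(out)
```

Poll `query_vram()` once a second while the game runs; if `used` sits pinned at `total`, the card has most likely started swapping to system RAM.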

Cheers

Ed


----------



## MKHunt

Vagenrider what cpu/mobo are you running? I noticed that when the gfx memory got high, upping the speed of my RAM helped loads. If you're on a 2011 socket (or even PileDriver) and have some hot hot ram, the vram limit doesn't stutter as much. Granted, system memory is nowhere near the bandwidth of graphics memory but on a quad setup it can get stupidly fast.


----------



## vagenrider

I have very simple specs but very thought-out settings (personal opinion): an 8120 at 4600MHz and a simple GA-990XA-UD3.... For RAM I have only 4GB, 2x OCZ Platinum PC3-16000 and 2x Kingston HyperX, all at 1.7V and 1T 8-8-8-22. Northbridge + HyperTransport at 2400MHz.

I know very well all that you say, OccamRazor and MKHunt, and you are both right, but it's also important and helpful to talk as we do now,




because we learn more!


----------



## OccamRazor

I was about to press the button on the Titan and stumbled on a 690 for $200 less. All in, with water-blocks and shipping added, I get a difference in excess of $150 (the 690 water-block is $40 more expensive than the Titan block). Now I'm in a tight spot: I want to max out my monitor at 120Hz (because gaming on my VG278H with FPS ranging from 100 to 120 is superb, like a CRT; you cannot explain it empirically, you have to try it to understand the fluidity), and the 690 IS faster than the Titan, averaging 20%, BUT the Titan is a single chip and later I could add a second one; with the 690, a second one is diminishing returns.
So, MKHunt, can you do me a BIG favor? As we have similar systems, can you check your average framerates @1080p in Crysis 3, Tomb Raider and Bioshock Infinite with one Titan? If they are in the 100+ area, then I will go for the Titan for sure!
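For anyone checking the arithmetic, a trivial sketch using only the dollar figures quoted above:

```python
# Net saving going 690 instead of Titan (figures from the post):
# the 690 card is $200 cheaper, but its water-block costs $40 more.
card_saving = 200     # USD, 690 vs Titan card price
block_penalty = 40    # USD, 690 block vs Titan block
net_saving = card_saving - block_penalty
print(net_saving)     # 160 -- "in excess of $150" as stated
```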









Thanks in advance

Ed


----------



## OccamRazor

As a point of interest, I fiddled with the latest drivers a bit and now I'm able to play Bioshock Infinite and Tomb Raider all maxed out, and Crysis 3 with everything at very high settings. Memory never went above 1.5GB in the Afterburner monitor, although the maths says otherwise (at this resolution with AA so high I should be near 2GB, not 1.5GB). BUT I did see something: even though AA was set in the game menu, IT WAS NOT APPLIED IN GAME, as big ripples could be seen on screen. So that's that, about our dear NVIDIA drivers that identify the graphics power and suppress whatever AA levels, ambient occlusion levels, shading or whatever is required to make the game play!
But of course this is speculation; I've got no way of proving it...
but it smells bad and it ain't burning...

My 2 cents

Ed


----------



## Canis-X

...what did you change in the drivers?


----------



## MKHunt

Quote:


> Originally Posted by *OccamRazor*
> 
> I was about to press the button on the Titan and stumbled on a 690 for $200 less, all in all added water-blocks and transport fees i get a difference in excess of $150 (the 690 water-block is $40 more expensive that the titan block), now im in a tight spot, i want to max out my monitor at 120hz, (because gaming on my VG278H when FPS range from 100hz to 120hz its superb, like a CRT, you cannot explain it empirically, you have to try it to understand the fluidity) and the 690 IT IS faster than the Titan averaging 20% BUT the Titan IS a single chip and later i could add a second one, with the 690 a second one is diminishing returns.
> So, MKhunt, can you do me a BIG favor? As we have similar systems, can you check your average framerates @1080p in crysis 3 ,tombraider and bioshock infinite with one titan? if they are in 100+ area, then i will go for the Titan for sure!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks in advance
> 
> Ed


A single 690 will not put you over 100fps in Crysis 3. The game is far too poorly optimized. All results were at 1080p with my Titan at *PCIe 3.0 x8*.



Couldn't do x16 because things are kind of locked in.



ETA: Actually, disregard those scores for a bit. My CPU lost its OC and is at 3.7GHz instead of 4.7. Memory also lost its OC and is at 667. That could explain why my Valley scores were so low....


----------



## vagenrider

..Crysis is a very big fail in the optimization department.. however, I don't play it because it's too colourless for me..


----------



## OccamRazor

Quote:


> Originally Posted by *Canis-X*
> 
> ...what did you change in the drivers?

Sort of a crossbreeding!

I mixed in some files from earlier drivers that, from my point of view, work better, but I believe sometimes it's better just to do a clean install, although in my case every time I do one I get a blank screen at startup! lol. Anyway, I ended up cleaning everything (the registry; a lot of strings get lost there and can cause problems) and installed the r313_00_263, AKA 314.22 WHQL.

Quote:


> Originally Posted by *MKHunt*
> 
> A single 690 will not put you over 100fps in Crysis 3. The game is far too poorly optimized. All results were at 1080p with my Titan at *PCIe 3.0 x8*.
> 
> Couldn't do x16 because things are kind of locked in.
> 
> ETA: Actually, disregard those scores for a bit. My CPU lost its OC and is at 3.7GHz instead of 4.7. Memory also lost its OC and is at 667. That could explain why my Valley scores were so low....


I don't mind forfeiting some settings in some games to gain fps, as some have minimal impact on overall quality. Then there's the game's coding itself: some games can look and feel wonderfully fluid at 60fps or even less (Dead Space 2 and 3 spring to mind), and some, like Crysis 3, have to be heavily tweaked to be enjoyed. I would like to know your fps and the "feel" of it in those games, with your OC on of course, to extract the full power of the Titan!

Ed


----------



## MKHunt

Aight, CPU OC'ed properly and RAM back to 2200MHz.

Single Titan, 1150MHz core, 3005MHz memory. Again, every single setting in each game is as high as it will go. No tweaking, nothing disabled, every box checked; just brute force. All games play smoothly (Bioshock requires mouse speed tweaks in the .ini to be smooth) and buttery. At these settings, it eclipses my 590 at lower settings. You can see that Bioshock and Crysis 3 are heavily CPU dependent. Tomb Raider saw about zero gain from 1GHz extra on the CPU, which is impressive.


----------



## OccamRazor

Quote:


> Originally Posted by *MKHunt*
> 
> Aight, CPU OC'ed properly and RAM back to 2200MHz.
> 
> Single Titan, 1150MHz core, 3005MHz memory. Again, every single setting in each game is as high as it will go. No tweaking, nothing disabled, every box checked; just brute force. All games play smoothly (*Bioshock requires mouse speed tweaks in the .ini to be smooth*) and buttery. At these settings, it eclipses my 590 at lower settings. You can see that Bioshock and Crysis 3 are heavily CPU dependent. Tomb Raider saw about zero gain from 1GHz extra on the CPU, which is impressive.


What values did you change in the .ini, mouse wise?


----------



## MKHunt

mouse min speed from 1 to .5


----------



## vagenrider

Quote:


> Originally Posted by *OccamRazor*
> 
> What values did you change in the .ini, mouse wise?


??? I don't have any problem with my Naga Epic..


----------



## vagenrider

Thinking of giving it 1013mV.... but I'm very much afraid.. has anyone else damaged his card at this voltage?


----------



## OccamRazor

I found no problem and tested all volts up to 1.063V, but all cards are different; only mess with volts if you accept the chance of the card getting burned! The real problem is that in some games the card throttles if you go over 750MHz (Crysis 3 for one; Tomb Raider let it get to 850MHz). So much horsepower and so much restraint in the drivers... so I have already put it up for sale and am going to get myself a Titan or 680 SLI.

Ed


----------



## vagenrider

Quote:


> Originally Posted by *OccamRazor*
> 
> I found no problem and tested all volts up to 1.063V, but all cards are different; only mess with volts if you accept the chance of the card getting burned! The real problem is that in some games the card throttles if you go over 750MHz (Crysis 3 for one; Tomb Raider let it get to 850MHz). So much horsepower and so much restraint in the drivers... so I have already put it up for sale and am going to get myself a Titan or 680 SLI.
> 
> Ed


The new games are not a problem for me (apart from a little Far Cry 3), because my main game is GTA IV, which I'm a developer for.. Right now I'm on 1005mV and 742MHz.... will try 1013mV and around 780MHz..

I'm thinking of going quad, because I'm extremely satisfied with this card.


----------



## vagenrider

Done.... 780MHz stable.. good temps.. going for 800.







god bless..


----------



## alancsalt

A reminder:

You may not sell outside OCN Marketplace
You may not sell with less than 35 reputation.

Please take the time to read our:
TOS
Marketplace Rules


----------



## vagenrider

Quote:


> Originally Posted by *alancsalt*
> 
> A reminder:
> 
> You may not sell outside OCN Marketplace
> You may not sell with less than 35 reputation.
> 
> Please take the time to read our:
> TOS
> Marketplace Rules


OK alancsalt, I will not buy anything.


----------



## IronMaiden1973

I have another 590 on the way as we speak, so I too am going to try quad GPU, 590s in SLI. Two ASUS GTX 590s, to be exact. I'd love to see all my games maxed out and playing at 100-120 FPS.


----------



## vagenrider

I'm on 1013mV and 800MHz with normal temps of 49-51°C, and while playing GTA IV, randomly and for an unknown reason, the screen freezes for 3-4 seconds and the card throttles back to factory voltage and clock. Why? Is it the drivers?


----------



## OccamRazor

Yes, it's the dreaded PDL (Power Draw Limitation). It's triggered by the drivers in some games and applications, and it activates in some games regardless of the volts you have on the card; up to 750MHz it's OK.
I thought NVIDIA had unlocked it in the drivers, but it's always the same, it's about money. This card is very powerful, just look at the new-game benchmarks, and the .42 versions can take the volts properly, but NVIDIA still maintains the lock! Why? Because IMHO, if you could overclock it to 850 and above, a good chunk of people wouldn't buy the 600 generation. A lot of people don't complain about the stuttering, but it's there for the 500 generation, and for the 600 as well. And wait: the stutter-control mechanism NVIDIA implemented in SLI, delaying the second frame generated by the 2nd GPU, has been around for ages, so why are the stutters much worse in the new drivers? 600-generation SLI is a breeze, but why did the 500s all of a sudden get stutters that were eliminated back at Far Cry 2 in the 200 generation? I wonder why...
Here's a good read about framerate and frametime: http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Dissected-Full-Details-Capture-based-Graphics-Performance-Testin
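A minimal sketch of spotting those throttle events in a core-clock log (for example one captured with Afterburner's monitoring); the 5% tolerance is my own assumption, not an NVIDIA number:

```python
def throttle_events(clock_log_mhz, target_mhz, tolerance=0.05):
    """Return the sample indices where the logged core clock fell more
    than `tolerance` below the clock you set -- the signature of the
    driver's power-draw limiter pulling the card back."""
    floor = target_mhz * (1.0 - tolerance)
    return [i for i, mhz in enumerate(clock_log_mhz) if mhz < floor]

# Example: card set to 800MHz, one dip back toward a stock-like clock
log = [800, 800, 800, 607, 800]
print(throttle_events(log, 800))  # [3]
```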

Just my 2 cents...

Ed


----------



## vagenrider

Quote:


> Originally Posted by *OccamRazor*
> 
> Yes, it's the dreaded PDL (Power Draw Limitation). It's triggered by the drivers in some games and applications, and it activates in some games regardless of the volts you have on the card; up to 750MHz it's OK.
> 
> Cheers
> 
> Ed


Is there any way to bypass it or something?


----------



## OccamRazor

No, it's locked in the drivers; only NVIDIA can unlock it. But 750MHz is still a good frequency, some games don't trigger the PDL, and some only to a degree: I can play Crysis [email protected], [email protected], and [email protected]

Cheers

Ed


----------



## vagenrider

Quote:


> Originally Posted by *OccamRazor*
> 
> No, it's locked in the drivers; only NVIDIA can unlock it. But 750MHz is still a good frequency, some games don't trigger the PDL, and some only to a degree: I can play Crysis [email protected], [email protected], and [email protected]
> 
> Cheers
> 
> Ed


Yes.. I have read many things about that in the last few hours.. I'm sure the reason for the locks (beyond the protection) is that NVIDIA wants to force you to buy the new models. Of course, 750 is a great clock; at that clock I'm on 80-100 fps in Far Cry 3 and Bioshock Infinite. Tomorrow I will get the second triple radiator for more exotic temps!


----------



## vagenrider

At 770MHz the throttling stops.... maybe I will leave it there.


----------



## vagenrider

After a lot of bad luck and many other bad things.. my motherboard died..




But it's OK, it died like a fighter.







I decided to go with the ASUS Sabertooth R2.0 because it has many more features than the same-priced Gigabyte, and it is also much better and more stable when overclocking.

These are the changes to my system, at least for now: the motherboard, the Crucial Ballistix, the acool dual water bay, and the second acool triple radiator.

http://s229.photobucket.com/user/vagenrider/media/P4110164_zps70f25476.jpg.html
http://s229.photobucket.com/user/vagenrider/media/P4110165_zps24dd14e6.jpg.html
http://s229.photobucket.com/user/vagenrider/media/P4110168_zps66c5d407.jpg.html
http://s229.photobucket.com/user/vagenrider/media/P4110169_zps894dd377.jpg.html
http://s229.photobucket.com/user/vagenrider/media/P4110171_zpse75b6a4d.jpg.html
http://s229.photobucket.com/user/vagenrider/media/P4110172_zps90e850f4.jpg.html
http://s229.photobucket.com/user/vagenrider/media/P4110173_zpse41812e5.jpg.html
http://s229.photobucket.com/user/vagenrider/media/P4110174_zpsaa835254.jpg.html


----------



## IronMaiden1973

I know why your last MB died, and this one will too... You don't have any airflow over the MB. There are chips on there that are heating up and popping. You NEED a fan over the MB to cool it...


----------



## vagenrider

Quote:


> Originally Posted by *IronMaiden1973*
> 
> I know why your last MB died, and this one will too... You don't have any airflow over the MB. There are chips on there that are heating up and popping. You NEED a fan over the MB to cool it...


You should ask before playing the smart guy. My motherboard died because of a big leak. I'm not stupid or a novice.


----------



## alancsalt

Quote:


> Originally Posted by *IronMaiden1973*
> 
> I know why your last MB died, and this one will too... You don't have any airflow over the MB. There are chips on there that are heating up and popping. You NEED a fan over the MB to cool it...


I would not expect so. Maybe if overclocking to the max on a hot day without air-con.. but even then..

I do run a 120 x 38 fan over my mobo, but that is just to try to get max overclock in benching...


----------



## vagenrider

Quote:


> Originally Posted by *alancsalt*
> 
> I would not expect so. Maybe if overclocking to the max on a hot day without air-con.. but even then..
> 
> I do run a 120 x 38 fan over my mobo, but that is just to try to get max overclock in benching...


agree to all.


----------



## IronMaiden1973

Playing the smart guy! Mate, I'm just trying to save you from paying for a third MB. You should do some research on your new ASUS MB. I have the P9X79 PRO. These boards have the RAM controller chips right next to the CPU; they are designed to be cooled by the stock CPU fan. If you take away that fan like you have (and you have taken away all the case fans), you have NO cooling across the MB at all. The SB chipset is another example. Those chips will pop just in normal operation and WILL pop when overclocking without airflow. By airflow I mean moving air, not air that's just sitting in your room. And the way you described your last MB dying... you sure you're not a novice?


----------



## Canis-X

You need to read some more articles man. You are starting to cross a line from helpful to being controlling and rude. I would love to see you post some links to reputable sources backing up your logic. If not, please feel free to not post in here again and let it go.

Cheers


----------



## IronMaiden1973

Stuff ya then... I'm sick of trying to help people. I always just get back an insult or a "you're a know-it-all" attitude. I wouldn't have posted at all if I hadn't seen cause for concern in the pics.


----------



## Canis-X

Hey bud, don't get mad. If you have links to back up your statements then post them up. If not, then just let it go. You're not coming across as being helpful though... you're coming across exactly as you are describing me.

Cheers!


----------



## mironccr345

I miss my 590.


----------



## PCModderMike

That block, with that pastel, oooooooooh my.


----------



## mironccr345

I know, I miss it.


----------



## PCModderMike

All good, just had to move on to something that met your needs.


----------



## mironccr345

Yeah, 685 sli


----------



## MKHunt

Quote:


> Originally Posted by *mironccr345*
> 
> Yeah, 685 sli


We took the same steps! Fangirl squee!

ETA:
My beloved 590 has changed hands at long last. Luckily for my over-attachment problem, it isn't far: my younger brother now has a decently beasty rig. So far he doesn't have a case (all my old components except the PSU), and as such, the computer's name on the home network is "BOXTRON." I'll have to part with it again fairly soon though, as I bought a condo and am moving late May after the remodel.

Sigh, fond memories with tweaking the 590.



He got all you see (2600k, 590, z68xp-ud4, 1866 vengeance 8gb) minus the PSU plus an ASUS 23" IPS monitor for a whopping $750. I should be less nice in the future.


----------



## PCModderMike

Quote:


> Originally Posted by *mironccr345*
> 
> Yeah, 685 sli


Oh really now








Quote:


> Originally Posted by *MKHunt*
> 
> We took the same steps! Fangirl squee!
> 
> ETA:
> My beloved 590 has changed hands at long last. Luckily for my over-attachment problem, it isn't far. My younger brother now has a decently beasty rig. So far he doesn't have a case (all my old components except the PSU) and as such, the computer's name in the home network is "BOXTRON." I'll have to part with it again fairly soon though as I bought a condo and am moving late may after the remodel.
> 
> Sigh, fond memories with tweaking the 590.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> He got all you see (2600k, 590, z68xp-ud4, 1866 vengeance 8gb) minus the PSU plus an ASUS 23" IPS monitor for a whopping $750. I should be less nice in the future.


Wow, how generous of you.


----------



## mironccr345

Quote:


> Originally Posted by *MKHunt*
> 
> We took the same steps! Fangirl squee!
> 
> ETA:
> My beloved 590 has changed hands at long last. Luckily for my over-attachment problem, it isn't far. My younger brother now has a decently beasty rig. So far he doesn't have a case (all my old components except the PSU) and as such, the computer's name in the home network is "BOXTRON." I'll have to part with it again fairly soon though as I bought a condo and am moving late may after the remodel.
> 
> Sigh, fond memories with tweaking the 590.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> He got all you see (2600k, 590, z68xp-ud4, 1866 vengeance 8gb) minus the PSU plus an ASUS 23" IPS monitor for a whopping $750. I should be less nice in the future.


I found my new step-brother!!!! LoL. Really nice of you though. I have two younger brothers and one older; none of them are into computers.


----------



## rush2049

Quote:


> Originally Posted by *mironccr345*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MKHunt*
> 
> We took the same steps! Fangirl squee!
> 
> ETA:
> My beloved 590 has changed hands at long last. Luckily for my over-attachment problem, it isn't far. My younger brother now has a decently beasty rig. So far he doesn't have a case (all my old components except the PSU) and as such, the computer's name in the home network is "BOXTRON." I'll have to part with it again fairly soon though as I bought a condo and am moving late may after the remodel.
> 
> Sigh, fond memories with tweaking the 590.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> He got all you see (2600k, 590, z68xp-ud4, 1866 vengeance 8gb) minus the PSU plus an ASUS 23" IPS monitor for a whopping $750. I should be less nice in the future.
> 
> 
> 
> I found my new Step-Brother!!!! LoL Really nice of you though. I have 2 younger brothers and one older, none of them into computers.

I give all my parts to my little brother as well. Eventually he will get my 590; right now he has to be happy with my 275... lol


----------



## MKHunt

Learned the hard way that the EVGA nex-750b is set up poorly.

For VGA power it has two 20A rails (four 12V rails total), which seems complex and unnecessary, and of course loading the 590 causes a near-instant power kill.


----------



## do0ki3 pwns

just wondering whats the going rate on one of these now?


----------



## nMixxo

Hey vagenrider,

I reached my PM limit so here it is

Gigabyte BIOS: 690 core, 1850 memory.

690-1850.zip 488k .zip file


----------



## YP5 Toronto

If all goes well, in the coming days I will need to remove myself from the club.


----------



## vagenrider

Quote:


> Originally Posted by *nMixxo*
> 
> Hey vagenrider,
> 
> I reached my PM limit so here it is
> 
> Gigabyte 690 core 1850 memory.
> 
> 690-1850.zip 488k .zip file


Nice BIOS my friend, I'll copy it to my ASUS too. Really, thank you!


----------



## alancsalt

Quote:


> Originally Posted by *nMixxo*
> 
> Hey vagenrider,
> 
> I reached my PM limit so here it is
> 
> Gigabyte 690 core 1850 memory.
> 
> 690-1850.zip 488k .zip file


Isn't that just a matter of deleting old PMs to make room?


----------



## nMixxo

Quote:


> Originally Posted by *alancsalt*
> 
> Isn't that just a matter of deleting old PMs to make room?


Didn't know this, thanks.


----------



## vagenrider

Quote:


> Originally Posted by *nMixxo*
> 
> Didn't know this, thanks.


The old dogs hit again! At 4.4GHz and only +60MHz GPU, +100MHz mem...

http://s229.photobucket.com/user/vagenrider/media/4086_zps3534a635.jpg.html


----------



## skitz9417

Hi guys, I'm thinking about getting the GTX 590, but I don't know how big the bottleneck will be. I'll be using a 1080p monitor and my CPU is a Phenom II X4 955 BE at 4GHz.


----------



## Canis-X

The CPU will be a huge bottleneck for that card! My 1090T running at 4.6GHz on my phase-change cooler bottlenecked mine really badly.


----------



## skitz9417

Oh really


----------



## Canis-X

Yep...My best scores between my 1090T and my 3930k....same cards, both water cooled.


----------



## skitz9417

Quote:


> Originally Posted by *Canis-X*
> 
> Yep...My best scores between my 1090T and my 3930k....same cards, both water cooled.


I was thinking of getting one because my has one for sale


----------



## ProfeZZor X

I am officially submitting my resignation from keeping the GTX 590 in my rig... After less than a year of service, I just can't handle all the freezing and forced shutdowns. Whether I'm on the web, playing a game, or watching YouTube content, there's not a day this thing hasn't seized up. I thought it might be the drivers, but even after updating those regularly it still crashes. Sometimes I end up rebooting several times in an hour just to watch one simple video.

...The good news is that I ordered a Titan to take its place.


----------



## AlExAlExAlEx

So, I've been googling for some time now. Can't seem to find an official answer apart from random guesses.

Was the GTX 590 discontinued? No online store has it on stock, most of them don't even list them as products anymore.


----------



## rush2049

Quote:


> Originally Posted by *AlExAlExAlEx*
> 
> So, I've been googling for some time now. Can't seem to find an official answer apart from random guesses.
> 
> Was the GTX 590 discontinued? No online store has it on stock, most of them don't even list them as products anymore.


The GTX 590 was a limited manufacturing run, somewhere between 1,000 and 3,000 cards.


----------



## alancsalt

Sometimes you can get new old stock on Ebay.....


----------



## ProfeZZor X

Quote:


> Originally Posted by *alancsalt*
> 
> Sometimes you can get new old stock on Ebay.....


Sadly, that's where mine will be headed once I get a new full cover EK water block for my new Titan.


----------



## mironccr345

^ sorry to hear all the trouble it's caused, but at least you're getting a beast of a card as an upgrade.


----------



## RagingCain

Quote:


> Originally Posted by *Canis-X*
> 
> Yep...My best scores between my 1090T and my 3930k....same cards, both water cooled.


Your scores show the GPU is not bottlenecked, your CPU physics scores were held back.

A 1090/1100T OCed to 4.2 GHz with 3GHz North Bridge should easily handle a GTX 590.


----------



## ProfeZZor X

Quote:


> Originally Posted by *mironccr345*
> 
> ^ sorry to hear all the trouble it's caused, but at least you're getting a beast of a card as an upgrade.


You and a couple of other members were the reason that I got the 590. Not that I'm blaming any of you at all for my misfortunes, but I guess I should have done more research before I jumped the gun on the 590. I mean, it's not like I had all of the other components ready to install when I bought the card... But, live and learn.

Now I'm playing the waiting game for the card to arrive, and another waiting game for P-PCs to get more acrylic/nickel blocks in. I certainly hope this card will do the trick and solve my problems, because I'm not upgrading my hardware anymore (aside from a secondary Titan, bigger case & rads).


----------



## mironccr345

Quote:


> Originally Posted by *ProfeZZor X*
> 
> You and a couple of other members were the reason that I got the 590. Not that I'm blaming any of you at all for my misfortunes, but I guess I should have done more research before I jumped the gun on the 590. I mean, it's not like I had all of the other components ready to install when I bought the card... But, live and learn.
> 
> Now I'm playing the waiting game for the card to arrive, and another waiting game for P-PCs to get more acrylic/nickel blocks in. I certainly hope this card will do the trick and solve my problems, because I'm not upgrading my hardware anymore (aside from a secondary Titan, bigger case & rads).


I hope the Titan works out for you. I'll be looking into the next-gen GPUs. My 680 is still going strong.


----------



## ProfeZZor X

Quote:


> Originally Posted by *mironccr345*
> 
> I hope the Titan works out for you. I'll be looking into the next gen GPU's. My 680 is still going strong.


I was lucky enough to snag the last two acrylic/nickel EK blocks on P-PCs today, so that puts me at ease knowing that my Titan won't sit on my shelf for long.

With that said, it probably won't be long before I list my 590. I'm guessing next month. I bought it for a grip of money back in January 2012, so unfortunately I'll be taking a big hit.


----------



## vagenrider

http://s229.photobucket.com/user/vagenrider/media/DSC_0030_zpsda210f71.jpg.html

http://s229.photobucket.com/user/vagenrider/media/DSC_0027_zps5c1dadc3.jpg.html

http://s229.photobucket.com/user/vagenrider/media/4850_zps8d2c6e93.jpg.html


----------



## mxthunder

you should post up those results in the official valley thread!


----------



## mironccr345

Quote:


> Originally Posted by *ProfeZZor X*
> 
> I was lucky enough to snag the last two acrylic/nickel EK blocks on P-PCs today, so that puts me at ease knowing that my Titan won't sit on my shelf for long.
> With that said, it probably won't be long before I list my 590. I'm guessing next month. I bought it for a grip of money back in January 2012, so unfortunately I'll be taking a big hit.


Two blocks. Thinking ahead and going to grab another titan?








Quote:


> Originally Posted by *vagenrider*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s229.photobucket.com/user/vagenrider/media/DSC_0030_zpsda210f71.jpg.html
> 
> http://s229.photobucket.com/user/vagenrider/media/DSC_0027_zps5c1dadc3.jpg.html
> http://s229.photobucket.com/user/vagenrider/media/4850_zps8d2c6e93.jpg.html


Nice score! How long have you had those 590's?


----------



## vagenrider

Quote:


> Originally Posted by *mxthunder*
> 
> you should post up those results in the official valley thread!


Already posted, two months ago.


----------



## ProfeZZor X

Quote:


> Originally Posted by *mironccr345*
> 
> Two blocks. Thinking ahead and going to grab another titan?


That's the plan. This way it won't be a headache (or a pain in the wallet) to get another block later on. My next round of purchases will be two or three 480 or 360 Monsta rads, depending on which Caselabs model I get. It's been less than a year that I've had my build operational, but after all the things I've learned on Overclock in the time that I've been here, it's time I got my head in the game and built it right.


----------



## vagenrider

Quote:


> Originally Posted by *mironccr345*
> 
> Nice score! How long have you had those 590's?


Thank you! About 9 months. Also, over the last few months I solved all the gaming nightmares with quad scaling using Inspector and some other mods and adjustments, and broke the silly myth that says quads never work properly...

But I sacrificed some months with tons of testing, reading and learning.

As for the score: it was achieved in high summer temps (40°C+). Only people who have two 590s know how hard it is to keep these cards cool. I've made an extreme watercooling mod with a huge radiator from a car air conditioner, placed outside my house with a big 40W fan, so I have very, very low temps.

In autumn and winter it will easily hit 5000+. I may also post some photos of my cooling mod and system.


----------



## mironccr345

Quote:


> Originally Posted by *ProfeZZor X*
> 
> That's the plan. This way, it wont be a headache (or pain in the wallet) to get another block later on. My next round of purchases will be two or three 480 or 360 Monsta rads, depending on which Caselabs model I get. It's been less than a year that I've had my build operational, but after all the things I've learned on Overclock in the time that I've been here, it's time I get my head in the game and build it right.


Well done man. I think the SM8 would be a perfect size for two Titans and your 3960x. One 480 on the top and a 360 in the front.









Quote:


> Originally Posted by *vagenrider*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Thank you! About 9 months. Also, over the last few months I solved all the gaming nightmares with quad scaling using Inspector and some other mods and adjustments, and broke the silly myth that says quads never work properly...
> But I sacrificed some months with tons of testing, reading and learning.
> As for the score: it was achieved in high summer temps (40°C+). Only people who have two 590s know how hard it is to keep these cards cool. I've made an extreme watercooling mod with a huge radiator from a car air conditioner, placed outside my house with a big 40W fan, so I have very, very low temps.
> In autumn and winter it will easily hit 5000+. I may also post some photos of my cooling mod and system.


Would def. like to see that custom loop.


----------



## vagenrider

Quote:


> Originally Posted by *mironccr345*
> 
> Well done man. I think the SM8 would be a perfect size for two Titans and your 3960x. One 480 on the top and a 360 in the front.
> 
> 
> 
> 
> 
> 
> 
> 
> Would def. like to see that custom loop.


http://s229.photobucket.com/user/vagenrider/media/DSC_0066_zpsdb8b5ee7.jpg.html

http://s229.photobucket.com/user/vagenrider/media/DSC_0025_zpsd0f9fa1f.jpg.html

http://s229.photobucket.com/user/vagenrider/media/DSC_0058_zpsd9ad369f.jpg.html

http://s229.photobucket.com/user/vagenrider/media/DSC_0063_zpsd6154b5e.jpg.html

soon will post the radiator place..


----------



## mironccr345

Quote:


> Originally Posted by *vagenrider*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s229.photobucket.com/user/vagenrider/media/DSC_0066_zpsdb8b5ee7.jpg.html
> http://s229.photobucket.com/user/vagenrider/media/DSC_0025_zpsd0f9fa1f.jpg.html
> http://s229.photobucket.com/user/vagenrider/media/DSC_0058_zpsd9ad369f.jpg.html
> http://s229.photobucket.com/user/vagenrider/media/DSC_0063_zpsd6154b5e.jpg.html
> 
> 
> soon will post the radiator place..


UV green, nice. What is your mobo sitting on? Looks like some kind of padding.


----------



## ProfeZZor X

Quote:


> Originally Posted by *mironccr345*
> 
> Well done man. I think the SM8 would be a perfect size for two Titans and your 3960x. One 480 on the top and a 360 in the front.


UPS just left moments ago, but knowing that my Titan is physically in the building, waiting to be brought over to my desk is pretty exciting. I expect the blocks to be delivered by early next week sometime. As for the case, I was thinking that I might end up going with the T10 or SMH10 model because I'd have that extra space on the other side to tuck the drives and PSU away, and plenty of space for a few Monsta rads.

...Sorry for the off topic discussion everyone.


----------



## vagenrider

Quote:


> Originally Posted by *mironccr345*
> 
> UV green, nice. What is your mobo sitting on? Looks like some kind of padding.


It's a full solid metal bench base: designed, cut and welded by me.









http://s229.photobucket.com/user/vagenrider/media/DSC_0006_zps1c872219.jpg.html

http://s229.photobucket.com/user/vagenrider/media/DSC_0009_zps3863b52f.jpg.html

http://s229.photobucket.com/user/vagenrider/media/DSC_0011_zpsa82576f2.jpg.html


----------



## nismoskyline

Hi, I'm interested in grabbing a pair of GTX 590s and was wondering where would be the best place to pick them up relatively cheap. I have searched eBay to no avail. I'm looking to spend 250-300 per card. I haven't seen any in this site's for-sale section. I was hoping maybe you guys could help me out?


----------



## mojobear

Quote:


> Originally Posted by *vagenrider*
> 
> Thank you! About 9 months. Also, over the last few months I solved all the gaming nightmares with quad scaling using Inspector and some other mods and adjustments, and broke the silly myth that says quads never work properly...
> 
> But I sacrificed some months with tons of testing, reading and learning.
> 
> As for the score: it was achieved in high summer temps (40°C+). Only people who have two 590s know how hard it is to keep these cards cool. I've made an extreme watercooling mod with a huge radiator from a car air conditioner, placed outside my house with a big 40W fan, so I have very, very low temps.
> 
> In autumn and winter it will easily hit 5000+. I may also post some photos of my cooling mod and system.


hey Vagenrider,

what problems did you have with quad sli and how did you solve them with inspector? Just curious if you could give me some tips!


----------



## vagenrider

Quote:


> Originally Posted by *mojobear*
> 
> hey Vagenrider,
> 
> what problems did you have with quad sli and how did you solve them with inspector? Just curious if you could give me some tips!


I'd need tons of hours to explain all that... All I can tell you is that the easy part is buying a quad; the hard part is making it work.


----------



## vagenrider

Some little changes:

http://s229.photobucket.com/user/vagenrider/media/DSC_0012_zps6d0d4c6d.jpg.html

http://s229.photobucket.com/user/vagenrider/media/DSC_0022_zps874fa7f5.jpg.html

http://s229.photobucket.com/user/vagenrider/media/DSC_0063_zpse96b160a.jpg.html

http://s229.photobucket.com/user/vagenrider/media/DSC_0053_zps6fdded45.jpg.html

http://s229.photobucket.com/user/vagenrider/media/DSC_0010_zps0d2d624d.jpg.html


----------



## soulwrath

Looks great. I plan to WC my quad 590s also - and same questions as bear.


----------



## Canis-X

Quote:


> Originally Posted by *RagingCain*
> 
> Your scores show the GPU is not bottlenecked, your CPU physics scores were held back.
> 
> A 1090/1100T OCed to 4.2 GHz with 3GHz North Bridge should easily handle a GTX 590.


I understand your point about 3DM11, which may not have been the best example since it is optimized to stress only the GPUs in the GPU tests, but most games and pretty much all other benchmarks clearly show that the 1090T/1100T limits the FPS of these cards in some capacity or another. My Vantage scores in that same post also show this. Also, my 1090T was OC'd to 4.6GHz on the CPU with a 3GHz NB... still doesn't help.


----------



## Canis-X

Looks like flashing the BIOS of these cards to change the voltages doesn't matter anymore when using the current NVIDIA drivers. I was wondering why I kept crashing all the time while playing BF4 (you must update your drivers for that game). Now, looking in MSI Afterburner and GPU-Z, only one of my GPUs runs at the flashed voltage of 1.013V; all the others are running at the stock voltage of 0.963V. Major bummer!! I really don't want to flash them to MARSII again and retest all that, but I don't want to flash them back to stock either. Ugh.

Pedro....do you still have your 590's? Are they still flashed to MARSII and running fine on the current NVidia drivers?

Edit....

Hmmm, so after a little more tweaking I found that if I use MSI Afterburner and manually retype the voltage in again along with a change to the core clock, I can get the voltages to bump up to the BIOS setting, 1.013V. Odd. I'll do some more testing and see how it behaves.

Edit x2...

So, they are not going to stock volts after all; the voltage is throttling down to 0.963V... sometimes even to 0.950V. Oddly enough, it seems that when GPU 1 is running at 1.013V and the other three GPUs are running at 0.963V is when the system is most stable; when all four are running at 1.013V is when I get the most crashes. Wish there were a way to bypass NVIDIA's throttling logic completely.
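For anyone else chasing the same behaviour: the drops described above can be spotted in an exported Afterburner voltage log by flagging samples that fall below the flashed value. A minimal sketch (the 1.013V and 0.963V figures come from this post; the log list is made up for illustration):

```python
FLASHED = 1.013  # volts the flashed BIOS requests, per the post above

def throttle_events(samples, flashed=FLASHED, tol=0.005):
    """Return indices of log samples where core voltage fell below the flashed value."""
    return [i for i, v in enumerate(samples) if v < flashed - tol]

log = [1.013, 1.013, 0.963, 0.950, 1.013]  # made-up voltage trace
print(throttle_events(log))  # -> [2, 3]
```

Feeding it a real per-GPU log column would show exactly which GPUs throttle and when.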


----------



## soulwrath

What to do... should I buy new waterblocks for my 590s to get 20-30% more out of them, or wait a bit? And at the same time convert my entire rig to a WC system o.O


----------



## Canis-X

Well, one of my 590's died on Tuesday....R.I.P.!!! She OC'd well!


----------



## jeromeface

Mine lasted until about a month ago, but I got lucky: my cards had a lifetime warranty due to them being EVGA cards. I recently got the replacement RMA cards back, 680 4GB FTWs. I was pretty happy about the upgrade. Thank you EVGA!


----------



## Canis-X

Nice! Congrats!


----------



## Revolution996

Help guys,
I am really tempted to sell my 590 for a 780 or similar when I upgrade to an i7 4770K. Do I do it, or is it not going to be justifiable for the cost involved? Or will the Intel setup boost my current GPU over my (OC'd) FX-8350?
Aggggggghhhhhh...!

Going crazeee..









Revo.


----------



## Canis-X

Hello! Hope that everyone is having a good holiday season this year!! Just in case anyone is interested, or maybe thinking about it, I have 2 waterblocks for the ASUS GTX 590 for sale --> *See my for sale link in my signature!!!*









Thanks, that is all.....


----------



## chaosneo

Greetings,
I wish to inquire about GTX 590 running and operating temps. I have been using my Gigabyte GTX 590 for three years already; it has been in for warranty twice due to a 2nd DVI port malfunction, but otherwise this GPU performs great.

Currently, the temp when I am playing games is around 80-90 Celsius. The cooling is air, stock design I think. Is this too high, or just slightly above normal load temp and acceptable?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *chaosneo*
> 
> Greetings,
> I wish to inquire about GTX 590 running and operating temps. I have been using my Gigabyte GTX 590 for three years already; it has been in for warranty twice due to a 2nd DVI port malfunction, but otherwise this GPU performs great.
> 
> Currently, the temp when I am playing games is around 80-90 Celsius. The cooling is air, stock design I think. Is this too high, or just slightly above normal load temp and acceptable?


Try your best to keep it under 80°C if you can; under 85°C is a must.

Improve your ambient temps and/or case air flow.
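That rule of thumb can be written down as a tiny helper for anyone scripting their temperature logs (a hypothetical sketch; the 80/85°C thresholds come from this post, not from any official NVIDIA spec):

```python
def classify_temp(celsius: float) -> str:
    """Bucket a GTX 590 load temperature using the 80/85C rule of thumb above."""
    if celsius < 80:
        return "ok"        # comfortable under load
    if celsius < 85:
        return "warm"      # acceptable ceiling; improve airflow
    return "too hot"       # back off clocks or fix the cooling

# Example readings around the thresholds
for t in (76, 83, 88):
    print(t, classify_temp(t))  # -> 76 ok / 83 warm / 88 too hot
```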


----------



## chaosneo

Thank you, sir.
By using MSI Afterburner (a great piece of software, by the way) and showing its in-game OSD (On-Screen Display), I get to monitor the temps and load:

GPU1 : 85°C , 97%
GPU2 : 88°C , 98%
CPU1 : 60°C , 9%
CPU2 : 60°C , 66%
RAM : 4146 MB

The above are the averages of the high points during gaming. Maybe I will try to add a PCI fan just below the GPU intake fan; I will update once I've done the modifications.


----------



## Canis-X

Another thing you could do that would help lower your temps a lot is to replace the TIM (a.k.a. thermal grease) on the GPU chips. Three-year-old TIM is kind of old. Also, they tend to put too much of the stuff on at the factory, and typically it is not that great a quality of TIM either. Replace it with some GELID Solutions GC-Extreme thermal compound; it is great stuff, IMHO.

Changing out the TIM is also a great excuse to thoroughly clean the stock heatsink and fan blades of any accumulated dust, which has a big impact on your temps.







Doing these two things should really help drop your temps down.


----------



## bernieyee

Hi,

My friend gave me his GTX 590 and I am curious if anyone can decipher the information from the stickers on the GPU. The main one in white has rubbed off.

699-11020-0005-500K
032 Made in China

GeForce GTX 590
S/N: 0320911098402

The sticker on the back also says BIOS Ver: 70.10.42.00.02. Looking at the Video BIOS Collection at TechPowerUp!, the only cards with this BIOS version were ASUS, Gainward, Inno3D, MSI, NVIDIA, and Palit.

There are absolutely no brand stickers on this card, so I don't know which manufacturer made it, aside from the PCB ones, which seem very generic (the ones I posted above).


----------



## SloaneRanger

Quote:


> Originally Posted by *bernieyee*
> 
> Hi,
> 
> My friend gave me his GTX 590 and I am curious if anyone can decipher the information from the stickers on the GPU. The main one in white has rubbed off.
> 
> 699-11020-0005-500K
> 032 Made in China
> 
> GeForce GTX 590
> S/N: 0320911098402
> 
> The sticker on the back also says BIOS Ver: 70.10.42.00.02. From looking at the Video BIOS Collection at TechPowerUp! The only cards that had this BIOS version were ASUS, Gainward, Inno3D, MSI, NVIDIA, and Palit.
> 
> There are absolutely no stickers on this card, so I don't know which manufacturer made this, aside from the PCB ones which seem very generic (the ones I posted above).


Hello,
My GTX 590 is a Gigabyte with BIOS Ver 70.10.42.00.02, so your card could be a Gigabyte too. All the PCBs are generic, so there's no difference between one model and another (except the ASUS MARS II).
But since I previously had a 70.10.37.00.01 Gigabyte card, I can confirm that the .42 cards got stronger VRMs than the .37 ones. I'm sure of that because my waterblock is a Koolance rev 1.1 for .42 cards and doesn't fit on a .37 card.
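If you are sorting a pile of dumps, the dotted BIOS strings compare cleanly once split into integers. A minimal sketch (the ".37 vs .42" VRM distinction is just the claim above, not something NVIDIA documents):

```python
def bios_tuple(ver: str):
    """Split a dotted GTX 590 BIOS string like '70.10.42.00.02' into ints."""
    return tuple(int(part) for part in ver.split("."))

old = bios_tuple("70.10.37.00.01")
new = bios_tuple("70.10.42.00.02")

print(new[2])      # third field is the revision discussed above -> 42
print(old < new)   # tuple comparison orders full versions -> True
```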


----------



## bernieyee

Quote:


> Originally Posted by *SloaneRanger*
> 
> Hello,
> My GTX 590 is a Gigabyte with BIOS Ver 70.10.42.00.02, so your card could be a Gigabyte too. All the PCBs are generic, so there's no difference between one model and another (except the ASUS MARS II).
> But since I previously had a 70.10.37.00.01 Gigabyte card, I can confirm that the .42 cards got stronger VRMs than the .37 ones. I'm sure of that because my waterblock is a Koolance rev 1.1 for .42 cards and doesn't fit on a .37 card.


OK, the reason I ask is that I wanted to see if there was a way to RMA it.

The card boots up fine in Safe Mode, but causes my system to black-screen right after the Windows loading flag. I can't tell if it's the GPU, a motherboard compatibility issue, or the PSU.

I tried with both a Seasonic 550W and a Xigmatek 600W; both should be able to power a GTX 590 since it's not really under load. I even tried logging in normally with the drivers wiped using Display Driver Uninstaller.

I will test it with a 750W PSU when I have time, but I'm kind of confused. I wanted to flash a different BIOS on it, but I'm not sure if it would make a difference.


----------



## SloaneRanger

I RMA'd my .37 Gigabyte and got a new .42 one; the .42 cards are really better for overclocking, but don't try that now, before solving your black-screen issue. If you got your card from a friend, ask him if the card was OK inside his PC.

My PSU is a 1200W Corsair, LOL, but 600W should be enough since it's not a no-name (enough if your PC isn't overclocked like mine).


----------



## rush2049

I am selling my GTX 590 on ebay,

so remove me from the club.

It was a good time, provided my second loaner rig with some good horsepower for a long time.


----------



## Robitussin

I'm starting to feel the burn in some of the newer games. I think it's almost time to upgrade to the next new shiny, though I'm not sure if I should get a Titan X or a 390X. I wish AMD would hurry up and give us some solid numbers. Star Citizen is running pretty rough, and if I do too much multitasking, some of the other games I've been playing recently start to slow down.









Anyone else feeling the itch yet?


----------



## Canis-X

Oh yeah, when the price dropped, I bought three MSI 290X Lightnings. I'm doing really well now. LOVE IT!! I put my 590 into my secondary rig and it's living the easy life now... ahhhh, retirement!!


----------



## Canis-X

If anyone is still running one of these and is interested in watercooling it, I have a full-coverage XSPC waterblock that I'm selling; see the link in my signature block. It is in excellent shape and worked great in my past build.


----------



## spyropt

I know this topic is almost dead, but here goes my 590, air cooled:

33 °C idle and 55 °C load,
overclocked to 700 MHz core / 1900 MHz RAM at 0.938 V.


I was doing tests to see if I could ditch the fans.









No quad SLI for me.









The computer's a mess, but I was doing tests.








No cable management.


----------



## Techyrod

Need help with a GTX 590 I had lying around my garage. I installed it in my computer and it boots into Windows, but I get an error from Precision X saying no hardware is detected. The card seems to reject the drivers I'm trying to install, and the system reboots when I try to install them. The screen resolution is not correct in Windows. The card looks fine: the LED lights up, and the card is nice and quiet with the fans spinning. GPU-Z recognizes the card. The BIOS is 70.10.37.00.02. Not sure what to do about it. Is the card trash?


----------



## shaolin95

Hello guys!
Unlikely to get a reply about an older card, but just in case...

So a local is selling a 590 for $60. That looks like a darn good price for what it is. The last time I had a dual card was the room-cooking 295, lol.
Anyway, I currently have a GTX 260 and a GTS 450 to power 3 monitors, but this is mostly for photo editing, as I'm not doing any gaming right now.
The main monitor is 1440p and the other two are just 1080p.
Then again, maybe getting the 590 will tempt me to try gaming again, with so many awesome games I'm missing... I always wanted to do Fallout 3, Mass Effect, etc.
I've been kind of just waiting for the Oculus Rift or similar (I was always using Nvidia 3D Vision before).
So I figured, for $60 I could get a more powerful card (it might help with some Photoshop and DxO processing too) that could replace my other two, drive all 3 monitors on its own, and let me play a game from time to time at decent quality.
Worst case, I guess I can sell it and make a few bucks.
What do you guys think?


----------



## mironccr345

Quote:


> Originally Posted by *shaolin95*
> 
> Hello guys!
> Unlikely to get a reply about an older card, but just in case...
> 
> So a local is selling a 590 for $60. That looks like a darn good price for what it is. The last time I had a dual card was the room-cooking 295, lol.
> Anyway, I currently have a GTX 260 and a GTS 450 to power 3 monitors, but this is mostly for photo editing, as I'm not doing any gaming right now.
> The main monitor is 1440p and the other two are just 1080p.
> Then again, maybe getting the 590 will tempt me to try gaming again, with so many awesome games I'm missing... I always wanted to do Fallout 3, Mass Effect, etc.
> I've been kind of just waiting for the Oculus Rift or similar (I was always using Nvidia 3D Vision before).
> So I figured, for $60 I could get a more powerful card (it might help with some Photoshop and DxO processing too) that could replace my other two, drive all 3 monitors on its own, and let me play a game from time to time at decent quality.
> Worst case, I guess I can sell it and make a few bucks.
> What do you guys think?


For $60, do it!!!!


----------

