# [Official] NVIDIA GTX 690 Owners Club



## jcde7ago

*NVIDIA GTX 690 Owners Club*

http://cdn.overclock.net/9/91/9181670c_logo_geforce.png *GTX 690 - Official Product Page*



Quote:


> *The GeForce GTX 690 may very well be the first dual-GPU graphics card that emerged in full form, free from compromise. It's the world's fastest graphics card by a wide margin. It's quieter and consumes less power than its predecessor. It's crammed with features that eliminate jaggies and sustain smooth framerates. And to top it off, it's made of the finest materials, expertly constructed, and with a look that speaks directly of the power that is housed within.
> 
> At an MSRP of $999, it's not a graphics card for everyone. But for enthusiasts who demand the very best and gamers who want a graphics card that will last for generations, the GeForce GTX 690 represents the ultimate in gaming from NVIDIA.*


*United States*:
- EVGA
- Amazon
- NewEgg
- Tiger Direct

*Canada*:
- Tiger Direct
- NCIX







*Features*:










Quote:


> *Kepler GPU Architecture:
> NVIDIA's Kepler GPU architecture has been designed from the ground up not just for maximum performance in the latest DirectX 11 games, but optimal performance per watt. The new SMX streaming multiprocessor is twice as efficient as the prior generation and the new geometry engine draws triangles twice as fast. The result is world class performance and the highest image quality in an elegant and power efficient graphics card*.



Quote:


> *Up until now, GPUs have operated at a fixed clock speed when playing 3D games, even if they have the potential to run faster. GPU Boost intelligently monitors graphics work load and increases the clock speed whenever possible. The result is that the GPU always performs at its peak and you get the highest framerate possible.*
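The boost logic itself lives in the GPU's firmware and isn't public, but the idea described above is easy to sketch. Here's a rough illustration (not NVIDIA's actual algorithm - the step size, thresholds, and power figures below are invented placeholders; only the 915/1019 MHz clocks match the card's spec):

```python
# Illustrative sketch of a GPU Boost-style controller: step the clock up
# while the GPU is busy and under its power budget, step down when the
# budget is exceeded. Numbers other than base/boost clocks are invented.

BASE_CLOCK = 915     # MHz (GTX 690 base clock)
MAX_BOOST = 1019     # MHz (typical advertised boost clock)
POWER_LIMIT = 263    # W, hypothetical board power target

def next_clock(current_mhz, gpu_util, board_power):
    """Pick the clock for the next sampling interval."""
    if board_power >= POWER_LIMIT:
        return max(current_mhz - 13, BASE_CLOCK)   # back off to stay in budget
    if gpu_util > 0.90:
        return min(current_mhz + 13, MAX_BOOST)    # step up one 13 MHz bin
    return current_mhz

clock = BASE_CLOCK
for util, power in [(0.95, 200), (0.97, 210), (0.99, 220)]:
    clock = next_clock(clock, util, power)
print(clock)  # 954 - climbed three 13 MHz bins while under the power limit
```

The key point the quote makes is the feedback loop: the clock is no longer fixed, it chases the headroom.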



Quote:


> *Nothing is more distracting than framerate stuttering and screen tearing. The first tends to occur when framerates are low, the second when framerates are high. Adaptive V-Sync is a smarter way to render frames. At high framerates, V-sync is enabled to eliminate tearing, at low frame rates, it's disabled to minimize stuttering. It gets rid of distractions so you can get on with gaming.*
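The decision rule the quote describes is simple enough to write down. A minimal sketch of the idea (not the driver's actual implementation):

```python
# Adaptive V-Sync in a nutshell: sync to the display when the GPU can keep
# up with the refresh rate (kills tearing), and unsync when it can't
# (avoids the stutter of v-sync snapping 60 FPS down to 30).

REFRESH_HZ = 60

def vsync_enabled(recent_fps: float) -> bool:
    return recent_fps >= REFRESH_HZ

print(vsync_enabled(75))  # True  - fast enough, sync to avoid tearing
print(vsync_enabled(45))  # False - too slow, unsync to avoid stutter
```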



Quote:


> *The GTX 690 uses hardware-based frame metering for smooth, consistent frame rates across both GPUs. (In other words - say goodbye to micro-stuttering, which is *sometimes* perceived by *some* gamers in ALL multi-GPU setups, whether it's dual-GPU cards, or two/three/four single GPUs in SLI.)*


*To be added to the members list, please post the following*:

A) Proof of card ownership (picture, etc.)
B) Brand of card

*Members List*:

*Member Name / Brand / Proof*

*- jcde7ago* / EVGA / Proof
*- Michalius* / EVGA / Proof
*- Arizonian* / EVGA / Proof
*- CapnCrunch10* / EVGA / Proof
*- Cheesemaster* / EVGA x2 / Proof
*- blumpking* / EVGA / Proof
*- rwchui* / EVGA / Proof
*- V3teran* / GIGABYTE x2 / Proof
*- shiloh* / EVGA / Proof
*- Basti* / EVGA / Proof
*- Sh1n1ng Forc3* / EVGA / Proof
*- Lazz767* / EVGA / Proof
*- Callandor* / EVGA / Proof
*- kemsoff* / EVGA / Proof
*- dred* / EVGA / Proof
*- TheRainMan* / EVGA / Proof
*- hitman2169* / EVGA / Proof
*- eviltinky* / EVGA / Proof
*- sundrou* / EVGA x2 / Proof
*- ceteris* / EVGA / Proof
*- Suit Up* / GIGABYTE / Proof
*- tahoward* / EVGA / Proof
*- Cobolt005* / EVGA / Proof
*- pat031* / EVGA / Proof
*- Kyouki* / EVGA / Proof
*- Stateless* / EVGA x2 / Proof
*- Malachi101* / EVGA / Proof
*- gatehous3* / ASUS / Proof
*- wRRM* / GIGABYTE / Proof
*- ReignsOfPower* / EVGA / Proof
*- WaXmAn* / EVGA / Proof
*- Vinnce* / EVGA / Proof
*- Marcsrx* / EVGA / Proof
*- Pingis* / EVGA / Proof
*- xoleras* / EVGA / Proof
*- TassM* / EVGA / Proof
*- icecpu* / EVGA / Proof
*- JeepsRcool* / EVGA / Proof
*- dboythagr8* / EVGA / Proof
*- zkalra* / EVGA / Proof
*- Divineshadowx* / EVGA x2 / Proof
*- s74r1* / EVGA / Proof
*- thestache* / EVGA x2 / Proof
*- egotrippin* / EVGA / Proof
*- MrTOOSHORT* / EVGA / Proof
*- juanP* / EVGA x2 / Proof
*- wahdahale* / EVGA / Proof
*- marcmartyn* / EVGA / Proof
*- Buzzkill* / EVGA x2 / Proof
*- AngelKnight* / EVGA / Proof
*- Yerph* / GAINWARD / Proof
*- FiShBuRn* / ASUS / Proof
*- kingchris* / EVGA / Proof
*- nVIDIASLiRig* / EVGA / Proof
*- jagz* / EVGA / Proof
*- thunder1* / EVGA / Proof
*- kxdu* / ASUS / Proof
*- FalcX* / EVGA / Proof
*- AVEPICS* / EVGA / Proof
*- Fallendreams* / EVGA / Proof
*- Stray_Bullet* / EVGA / Proof
*- clone38* / EVGA / Proof
*- andygoyap* / EVGA / Proof
*- RayTrace77* / EVGA / Proof
*- crazychink* / EVGA / Proof
*- bitMobber* / EVGA / Proof
*- sundrou* / MSI / Proof
*- bnj2* / ASUS / Proof
*- Qu1ckset* / EVGA / Proof
*- PhantomTaco* / ASUS / Proof
*- Jessekin32* / EVGA / Proof
*- max883* / ASUS / Proof
*- DinaAngel* / EVGA / Proof
*- Sterling84* / EVGA / Proof
*- Crip* / GIGABYTE / Proof
*- jassilamba* / EVGA / Proof
*- dmaffo* / EVGA / Proof
*- lukeman3000* / EVGA / Proof
*- JamesK* / EVGA / Post
*- grassy* / GIGABYTE x2 / Proof
*- Falcon3* / EVGA / Proof
*- Brocky-LJ* / GIGABYTE / Proof
*- Renairy* / EVGA / Proof
*- gizmo83* / GIGABYTE / Proof
*- D3skt0pG4m3r* / EVGA / Proof
*- Dynn* / ASUS / Proof
*- alienware* / ASUS / Proof
*- Kaapstad* / EVGA / Proof
*- saifbukhari* / ASUS / Proof
*- C0re0per4tive* / EVGA / Proof
*- PinzaC55* / EVGA / Proof
*- D4rkThanatoS* / EVGA / Proof
*- Billymac10* / EVGA / Proof
*- maximus56* / EVGA / Proof
*- GTX 690 SLI* / Gainward / Proof
*- Alex132* / EVGA / Proof
*- AllGamer* / EVGA x2 / Proof
*- rcfc89* / ASUS / Proof
*- noob.deagle* / ASUS / Proof
*- Seanimus* / EVGA / Proof
*- Tech09* / Gigabyte x2 / Proof
*- pilla99* / ASUS / Proof
*- JoshMck* / ASUS x2 / Proof
*- Scorpion49* / EVGA / Proof
*- Rei86* / EVGA / Proof
*- over-x16* / EVGA / Proof
*- DamnVicious* / Inno3D / Proof
*- jhager8783* / EVGA x2 / Proof
*- Sweeper101* / EVGA / Proof
*- RandomHer0* / EVGA / Proof
*- ozrek* / EVGA / Proof
*- TheMadHerbalist* / EVGA / Proof
*- Sugi* / EVGA / Proof
*- PCModderMike* / EVGA / Proof
*- Paz1911* / ASUS / Proof
*- KrynnTech* / EVGA x2 / Proof
*- Lukas026* / MSI / Proof
*- worms14* / Gigabyte / Proof
*- wermad* / EVGA x2 / Proof
*- TeamBlue* / EVGA / Proof
*- wholeeo* / EVGA / Proof
*- zer0sum* / Gigabyte / Proof
*- JimmyD* / EVGA / Proof
*- tinmann* / EVGA / Proof
*- Maximilium* / EVGA x2 / Proof

Quote:


> Hi Everyone, *Arizonian* here. I'll no longer be assisting jcde7ago with updating the club members list. New members wanting to join, please post your 'proof' per the club's simple requirements for jcde7ago. Thank you.


*Resource Links*:

Nvidia GeForce Drivers Page
EVGA - Precision X - Download Link
EVGA - Precision X - GTX 690 Skin
EVGA - LED Controller (EVGA 690s only)

*Reviews*:

Anandtech
Guru3D
TechPowerUp
Hardware Canucks
[H]ard | OCP
PC Perspective
Tom's Hardware
Bjorn3D
Legit Reviews
Hexus
Hardware Heaven
X-bit Labs
Benchmark Reviews
Hitech Legion
Hot Hardware
Overclockers Club
Tech Report

*My GTX690 Benchmarks*:



Spoiler: Warning: Spoiler!



*jcde7ago / EVGA / (1045/1852/1150)*

(*This 3DMark11 benchmark was actually run on stock clocks*).


*(3DMark11 benchmark at the above OC settings):*


*(Heaven 3.0 benchmark at above OC settings):*




*Review Benchmarks*:


Spoiler: Warning: Spoiler!



*HardwareCanucks (i7 3930K @ 4.5 Ghz used):*



*Tom's Hardware (i7 3960X @ 4.2 Ghz used):*



*Guru3D (i7 965 Extreme @ 3.75 Ghz Used):*



*Hexus (i5 2500K @ 3.3 Ghz used):*



*TechPowerUp (i7 3770K @ 4.7 Ghz used):*


----------



## Arizonian

*Informational Bits*

*Source:* *Performance Perfected: Introducing the GeForce GTX 690*

Quote:


> *Improved Frame rate Metering*
> Kepler introduces hardware based frame rate metering, a technology that helps to minimize stuttering. In SLI mode, two GPUs share the workload by operating on successive frames; one GPU works on the current frame while the other GPU works on the next frame. But because the workload of each frame is different, the two GPUs will complete their frames at different times. Sending the frames to the monitor at varying intervals can result in perceived stuttering.
> 
> The GeForce GTX 690 features a metering mechanism (similar to a traffic meter for a freeway entrance) to regulate the flow of frames. By monitoring and smoothing out any disparities in how frames are issued to the monitor, frame rates feel smoother and more consistent.
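The freeway-meter analogy above maps directly onto code. Here's a toy sketch of the idea (the real metering happens in hardware; the timestamps below are invented for illustration):

```python
# Toy frame-metering sketch: AFR GPUs finish frames in uneven bursts, so
# hold each finished frame until the target interval has elapsed before
# presenting it. Output intervals come out even; input intervals were not.

def meter(completion_ms, interval_ms):
    """Return evenly paced presentation times for bursty completion times."""
    presented = []
    next_slot = completion_ms[0]
    for t in completion_ms:
        slot = max(t, next_slot)      # wait for the frame AND for its slot
        presented.append(slot)
        next_slot = slot + interval_ms
    return presented

raw = [0.0, 5.0, 33.0, 38.0, 66.0]    # ms, bursty AFR completions (invented)
paced = meter(raw, 16.7)              # ~60 Hz pacing
print([round(b - a, 1) for a, b in zip(paced, paced[1:])])  # [16.7, 16.7, 16.7, 16.7]
```

Frames 2 and 4 arrive "early" behind their partner GPU's frame; the meter simply holds them so the monitor sees a steady cadence instead of 5 ms / 28 ms gaps.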


*PCIe 2.0 vs PCIe 3.0 Scaling GTX 690 Performance*

*EVGA GTX 690 Backplate*

*ARCTIC Accelero Twin Turbo 690 Cooler Review*

*GTX 690 BIOS A - B - C / by OCN member skyn3t*

---- _Use at your own RISK_ ----

skyn3t-vBios-GTX690-A-B-C.zip 571k .zip file











*Benchmarks 690 vs 7990 @ 2560x1600*



Spoiler: Warning: Benchmarks!



Source - *HardwareCanucks*










*My Personal GTX 690 Info*

*Arizonian --- i7 3770K 4.5 GHz - EVGA GTX 690 1044 Core 1594 Memory 1149 Boost* / *CPU-Z Validation*

*3DMark11* *P17011* *Validation Score*



Spoiler: Warning: Spoiler!







*Kepler Boost GPU #1 1175 MHz / GPU #2 1201 MHz - 301.42 Drivers*

*Arizonian --- i7 3770K 4.5 GHz - EVGA GTX 690 1044 Core 1594 Memory 1149 Boost /* *CPU-Z Validation*

*3DMark11 X6888 Validation Score*



Spoiler: Warning: Spoiler!







*Kepler Boost GPU#1 1175 MHz / GPU#2 1201 MHz - 301.42 Drivers*

*Arizonian --- i7 3770K 4.4 GHz - EVGA GTX 690 1044 Core 1594 Memory 1149 Boost / CPU-Z Validation*

*Heaven 3.0 Benchmark FPS* *110.0* *Score* *2772*



Spoiler: Warning: Screen Shot!



Render: *Direct X 11*
Mode: *1680x1050 fullscreen*
Shaders: *high*
Textures: *high*
Filter: *trilinear*
Anisotropy: *16x*
Occlusion: *enabled*
Refraction: *enabled*
Volumetric: *enabled*
Anti-Aliasing: *8x*
Tessellation: *extreme*







*Kepler Boost GPU#1 1175 MHz / GPU#2 1189 MHz - 301.42 Drivers*

*My EVGA GTX 690 Pictures*



Spoiler: Warning: Spoiler!








*1045 Core / 3121 Memory / 1150 Boost = Kepler Boost GPU#1 1176* *GPU #2* *1202*





Spoiler: EVGA Precision X Log


----------



## DADDYDC650

Looks beautiful! Do I dare?


----------



## Mariusmsj

at OCUK

ETA: 03/05/12

Stock Code: GX-181-OK

Manufacturers Code: NVIDIA_GTX690_KICKS_ASS


----------



## TheBenson

I will be very interested in how these perform. Might make me regret going sli if the scaling is good enough.


----------



## Arizonian

Quote:


> Originally Posted by *TheBenson*
> 
> I will be very interested in how these perform. Might make me regret going sli if the scaling is good enough.


I wouldn't regret it one bit. You're still going to be ahead in performance, as the 690 is expected to come within 91% - 95% of the performance of two GTX 680s in SLI.

Don't look back bud - you're already there!









The only difference is aesthetic - the metallic GPU casing with LED lights.


----------



## Tslm

Am I the only person on this planet who doesn't like LEDs? The solid casing is cool though. I feel like when you pay > $500 for a GPU, they should do a bit better than a plastic shroud.


----------



## kcuestag

Quote:


> Originally Posted by *Arizonian*
> 
> I wouldn't regret it one bit. You're still going to be ahead in performance, as the 690 is expected to come within 91% - 95% of the performance of two GTX 680s in SLI.
> Don't look back bud - you're already there!
> 
> 
> 
> 
> 
> 
> 
> 
> The only difference is aesthetic - the metallic GPU casing with LED lights.


^This.

Those who have GTX 680s in SLI don't need to regret their purchase; a GTX 680 SLI setup will still be more powerful than a GTX 690, both at the same price.









Great looking card for those who prefer having a single PCB dual card, I prefer 2x physical cards though.


----------



## Supreme888

Count me in soon! Finally found my card!








Just wondering what the May 7th release will look like and what the other manufacturers will have to offer.


----------



## Milamber

I'm gonna need two for 3D Vision. I wish it was 4GB per GPU, since games will only use 2GB per core.

Sent from my Galaxy Nexus using Tapatalk 2


----------



## jcde7ago

Quote:


> Originally Posted by *TheBenson*
> 
> I will be very interested in how these perform. Might make me regret going sli if the scaling is good enough.


The scaling will literally be the same. This is the first dual-GPU card Nvidia is releasing without having to compromise anything or cut corners, given how power-efficient and cool Kepler runs. Same number of cores, and I think one more power phase per GPU than a reference 680 (5 on the 690 vs. 4), so the performance and OC headroom should be identical.

That said, there's no need to go through the trouble of switching from 680s in SLI to a 690, unless quad-SLI is in your future.

Really, the only advantages of the GTX 690 over a pair of 680s are that it's going to be a tad more power-efficient and quieter (since it's a single PCB instead of two), it requires fewer slots, and you'll only need to purchase a single waterblock if you watercool. Other than that, no need to fret if you've already got a pair of 680s.


----------



## tonyjones

Should I sell my 6990 now?


----------



## jcde7ago

Quote:


> Originally Posted by *tonyjones*
> 
> Should I sell my 6990 now?


That's kind of a hard question to answer given that we know absolutely NOTHING about your setup and/or needs... lol. Filling out your sig rig specs is always helpful - it lets fellow members help you.


----------



## uncle00jesse

1000 bucks?? for one card?? looks like this is going to take quite a few trips to the sperm bank


----------



## jcde7ago

Quote:


> Originally Posted by *uncle00jesse*
> 
> 1000 bucks?? for one card?? looks like this is going to take quite a few trips to the sperm bank


LOL...I have now heard it all, OCN.


----------



## crazedsilence

Quote:


> Originally Posted by *Tslm*
> 
> Am I the only person on this planet who doesn't like LEDs? The solid casing is cool though. I feel like when you pay > $500 for a GPU, they should do a bit better than a plastic shroud.


I'm hoping they include a feature that allows you to turn off the LED


----------



## jcde7ago

Quote:


> Originally Posted by *crazedsilence*
> 
> I'm hoping they include a feature that allows you to turn off the LED


I'm assuming that even if they don't, you should be able to open up the card and unplug the LED's power pin. Although I agree, it would be nice if something like Afterburner could turn it off down the road.

For me though...as much as I like the design overall...I'll probably end up putting this bad boy under water anyway.


----------



## covert ash

I'm not one for aircooling much these days, but I have to say that is one *SHARP* looking cooler.







I would actually use that as a decoration piece if I bought one and slapped a waterblock on it.









I'm definitely interested in seeing how these perform, especially since it doesn't appear to have much of any corner cutting in order to realize the full power of two GK104 chips. My previous experience with an HD5970 left a sour taste in my mouth with dual-GPU, single PCB cards, but I welcome a new attempt to change my opinion.

The only question is two of these, or wait for GK110...


----------



## jcde7ago

Quote:


> Originally Posted by *covert ash*
> 
> I'm not one for aircooling much these days, but I have to say that is one *SHARP* looking cooler.
> 
> 
> 
> 
> 
> 
> 
> I would actually use that as a decoration piece if I bought one and slapped a waterblock on it.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm definitely interested in seeing how these perform, especially since it doesn't appear to have much of any corner cutting in order to realize the full power of two GK104 chips. My previous experience with an HD5970 left a sour taste in my mouth with dual-GPU, single PCB cards, but I welcome a new attempt to change my opinion.
> The only question is two of these, or wait for GK110...


Agreed...I'll slap a waterblock on mine for sure, and then end up displaying the shroud. It's funny, because I already have a buyer for my GTX 590 and the 590 block, and he didn't want the fan shroud, so I'll put that up for display as well, lol.

As for waiting for GK110...I would say pull the trigger on the 690 if it's between that or waiting. Nvidia's entire GTX 600 lineup is going to have GK104 chips in it, and they seem content to really flood the market this time around. People think that GTX 680 supply is being limited because they can't keep stock, but honestly, I believe that's completely false and they're using that tactic to drive up demand; you'll see availability of the GTX 680 and 690 ramp up a LOT, as will the 670's when it's released.

My personal opinion is that we won't see a high-end, flagship GK110 for AT LEAST 8-9 months. That's enough time for me to pull the trigger on the GTX 690 and sell it by the time GK110 rolls around.


----------



## covert ash

Quote:


> Originally Posted by *jcde7ago*
> 
> Agreed...I'll slap a waterblock on mine for sure, and then end up displaying the shroud. It's funny, because I already have a buyer for my GTX 590 and the 590 block, and he didn't want the fan shroud, so I'll put that up for display as well, lol.
> As for waiting for GK110...I would say pull the trigger on the 690 if it's between that or waiting. Nvidia's entire GTX 600 lineup is going to have GK104 chips in it, and they seem content to really flood the market this time around. People think that GTX 680 supply is being limited because they can't keep stock, but honestly, I believe that's completely false and they're using that tactic to drive up demand; you'll see availability of the GTX 680 and 690 ramp up a LOT, as will the 670's when it's released.
> My personal opinion is that we won't see a high-end, flagship GK110 for AT LEAST 8-9 months. That's enough time for me to pull the trigger on the GTX 690 and sell it by the time GK110 rolls around.


That is true. If the 690 does have good quantities available, and the GK110 is not going to be out till the end of the year, then I do see one or two along with accompanying waterblocks in my future.









Haha, and about the 590 cooler, you can start building your collection, or mosaic of dual GPU card coolers!







Of course, you'd have to go backwards and acquire one for the GTX 295 as well.


----------



## Arizonian

Another difference: I haven't heard about 'limited' supplies on this dual-GPU release the way the 590 had. I'm heavily leaning toward the 690 as a learning experience and my first dual-GPU card, and I've never felt as confident as I do now with the 690 that it's time.

No gimping it down. In fact, they went the more robust route of 5 power phases per GPU (10 total), and I see this being a good overclocking card.


----------



## emett

I hope to join this club soon.


----------



## Recipe7

There they go again, claiming 4gb of RAM on the 690.

Just like the "3gb" on the 590


----------



## Arizonian

Quote:


> Originally Posted by *Recipe7*
> 
> There they go again, claiming 4gb of RAM on the 690.
> 
> Just like the "3gb" on the 590


Some people just don't get that it's being mirrored. Just advise them and hope it eventually sticks.


----------



## Recipe7

Quote:


> Originally Posted by *Arizonian*
> 
> Some people just don't get that it's being mirrored. Just advise them and hope it eventually sticks.


Not sticky enough,


----------



## jcde7ago

Quote:


> Originally Posted by *Recipe7*
> 
> Not sticky enough,


Lol.

Every dual-GPU card has always advertised the total amount of RAM as opposed to the usable per-GPU amount. This is fairly normal.
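For anyone still confused by the box spec, the arithmetic is trivial. A quick sketch of why a "4GB" AFR card behaves like a 2GB one:

```python
# In AFR SLI each GPU renders complete frames, so textures and buffers are
# mirrored into each GPU's own memory. The advertised total counts both
# copies; the framebuffer a game can actually fill is the per-GPU amount.

def usable_vram_gb(advertised_gb: float, num_gpus: int) -> float:
    return advertised_gb / num_gpus

print(usable_vram_gb(4, 2))  # 2.0 - the "4GB" GTX 690 acts like a 2GB card
print(usable_vram_gb(3, 2))  # 1.5 - ditto the "3GB" GTX 590
```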


----------



## jcde7ago

Unboxing vid:


----------



## tonyjones

so these go on sale wed at midnight on newegg?


----------



## jcde7ago

Quote:


> Originally Posted by *tonyjones*
> 
> so these go on sale wed at midnight on newegg?


Guess we'll be finding out tonight..


----------



## fastpcman12

Where do I go at midnight on the Newegg site?


----------



## jcde7ago

Quote:


> Originally Posted by *fastpcman12*
> 
> where to go on midnight on newegg site?


No idea if it will even be available on NewEgg. We don't even know which AIBs (board partners) will have available models tonight/tomorrow.

There is "limited availability" on May 3, even though yes, tomorrow is indeed the official release date. They're supposed to increase availability next Monday...so you'll just have to keep an eye out in the next few hours and keep on F5'ing!


----------



## Cheesemaster

for the love of god i want two of these bad boys where do i order?


----------



## i7monkey

So when are reviews coming out 12AM EST?


----------



## CapnCrunch10

I don't even know what to search on the Egg.

Putting in gtx 690 in the search box results in this, which is not helpful at all. Eh, I wouldn't buy the card anyway unless it pops up on Amazon or the egg.


----------



## jcde7ago

I am assuming reviews will trickle in within the next 3-12 hours from various websites.

As for availability, we know they'll be selling them _somewhere_, in limited quantities scheduled for May 3 and more general availability next Monday, May 7, but no one was specific about it, so it's pretty much blind luck if people happen to run into one as they are listed on various e-tailers tonight/tomorrow...


----------



## fastpcman12

it's online at newgg!
HURRY!

J/K


----------



## CapnCrunch10

You're sooooooooo hilarious.


----------



## Cheesemaster

My hairy exploded


----------



## jcde7ago

Quote:


> Originally Posted by *fastpcman12*
> 
> it's online at newgg!
> HURRY!
> J/K


It's online at www.bathandbodyworks.com !!!

HURRY!!!!!


----------



## Michalius

Usually I don't fall for that kind of crap. This is the first time I'm trying to snipe a day 1 card though.

So screw you buddy.


----------



## Cheesemaster




----------



## Cheesemaster

I wants 2 for my rampage extreme with the 3960x... so now i can run my titanium hd with my killer nic card!


----------



## i7monkey

http://www.hardwarecanucks.com/news/video/reminder-gtx-690-review-live-tomorrow-9am-est-6am-pst/
Quote:


> Reminder: GTX 690 Review LIVE Tomorrow @ 9AM EST / 6AM PST
> 
> This is a friendly reminder to our readers that the launch time for the GTX 690 IS *NOT the typical midnight affair we're used to*.
> 
> As was discussed in our original preview, we will be publishing our review tomorrow morning @ 9AM EST / 6AM PST, to coincide with *NVIDIA's official GTX 690 launch timeframe*.
> 
> Join us here at that time for a FULL, in-depth review of the planet's fastest graphics card. We'll even be throwing in a few additional tests, outside of our traditional testing regime.


I'm guessing the NDA ends at 9AM??


----------



## jcde7ago

Quote:


> Originally Posted by *i7monkey*
> 
> http://www.hardwarecanucks.com/news/video/reminder-gtx-690-review-live-tomorrow-9am-est-6am-pst/
> Quote:
> 
> 
> 
> Reminder: GTX 690 Review LIVE Tomorrow @ 9AM EST / 6AM PST
> This is a friendly reminder to our readers that the launch time for the GTX 690 IS *NOT the typical midnight affair we're used to*.
> As was discussed in our original preview, we will be publishing our review tomorrow morning @ 9AM EST / 6AM PST, to coincide with *NVIDIA's official GTX 690 launch timeframe*.
> Join us here at that time for a FULL, in-depth review of the planet's fastest graphics card. We'll even be throwing in a few additional tests, outside of our traditional testing regime.
> 
> 
> 
> I'm guessing the NDA ends at 9AM??

Makes sense...thanks for posting!

Dammit...just when I thought I would get to sleep in after 5 AM on a weekday because I have to commute to a client's location tomorrow, I now have to wake up at 6 AM just to try to snipe a 690...FFS NVIDIA, *** KIND OF LAUNCH TIMEFRAME IS THIS?!?! 6 AM PST?!?!


----------



## Michalius

*sets alarm for 5:54* Just enough time to start coffee and pop an adderall. Hah.


----------



## ceteris

Oh! Lord have mercy! When will my wallet take a break!









The only thing I'd be worried about is those poly*** windows showing all the dust gathered up in the heatsinks. Otherwise, this is a true trophy card that is bound to be OOS at online stores till they stop production, much like the 590s were.


----------



## Cheesemaster

OOS?


----------



## ceteris

Quote:


> Originally Posted by *Cheesemaster*
> 
> OOS?


Out of Stock


----------



## darkphantom

This card is awesome, simply put. Looks stunning, and I'm sure it will perform that way too. I just can't wrap my head around paying $1k for a 2GB GK104 =/

If this were 4GB, it would make more sense. I'll stick to the 680 for now.







Do let me know how much you guys enjoy it when you get it!

A rep at microcenter was trying to woo me to buying one...I lol'd and walked away.


----------



## jcde7ago

Quote:


> Originally Posted by *darkphantom*
> 
> This card is awesome, simply put. Looks stunning, and I'm sure it will perform that way too. I just can't wrap my head around paying $1k for a 2GB GK104 =/
> If this were 4GB, it would make more sense. I'll stick to the 680 for now.
> 
> 
> 
> 
> 
> 
> 
> Do let me know how much you guys enjoy it when you get it!
> A rep at microcenter was trying to woo me to buying one...I lol'd and walked away.


The GTX 690 will excel at providing buttery-smooth, constant 60-80 FPS with v-sync on for pretty much every game out right now, at completely maxed-out settings including at least 4xAA + 16xAF at 1440p/1600p resolutions.

Where the card might hit the VRAM wall is with 3D and/or surround/tri-monitor setups at extremely high resolutions, where AA or other settings may have to be toned down to achieve that 60 FPS. Or 120 FPS constant in a couple of FPS games with everything cranked up (BF3, for example).

The main benchmark right now would be BF3, which is kind of an exception to the rule as far as VRAM usage is concerned: it will pre-load textures and utilize as much VRAM as is available, regardless of amount, so upping the VRAM here wouldn't necessarily help framerates. GTX 680 2GB SLI can max out BF3 at a constant 60-80 FPS on Ultra w/ 4xAA and 16xAF at 1440p/1600p with absolutely no problems, and the GTX 690 will be able to achieve the same. There really isn't another game that's as graphically intensive while also being as VRAM-intensive as BF3, so it's safe to say the 690 is going to handle 99.9% of what's out there at very high resolutions just fine.


----------



## DADDYDC650

Anyone know if Amazon will carry the GTX 690? I don't feel like paying tax on top of $1000....


----------



## ceteris

Quote:


> Originally Posted by *DADDYDC650*
> 
> Anyone know if Amazon will carry the GTX 690? I don't feel like paying tax on top of $1000....


They probably will. They did for the 680 on launch day, but I couldn't find it via search since, for some reason, it pushed a bunch of 580s to the top of the list.


----------



## jcde7ago

Quote:


> Originally Posted by *DADDYDC650*
> 
> Anyone know if Amazon will carry the GTX 690? I don't feel like paying tax on top of $1000....


They most likely will. Though again, launch day availability (May 3) is "limited quantities," with general availability on the 7th.

If Amazon doesn't release it in the next 12-16 hours and you don't want to pay tax, I'd hold out until Monday. Although there's still the chance that biting the bullet with an e-tailer that is selling the card but charging tax turns out to be cheaper than buying from Amazon, if supply is THAT limited and people inflate the prices on Amazon through re-selling....

I feel your pain, because Cali tax is going to be about $90.00...









Then again, I have a buyer for my GTX 590 + waterblock at $650, and I sold a spare Q9650 on fleabay for $230, so my upgrade cost when it's all said and done is only going to be about $350-400, even after having to get a 690 waterblock down the road...so I'm quite happy in that regard.


----------



## Azefore

Newegg's first look @ the 690


----------



## emett

Quote:


> Originally Posted by *jcde7ago*
> 
> Where the card might hit the VRAM wall is with 3D


3D doesn't require any more VRAM than 2D at the same settings.


----------



## jcde7ago

Available on EVGA.com now!!!! GO GO GO!!!

I got my order in!









http://www.evga.com/products/moreInfo.asp?pn=04G-P4-2690-KR&family=GeForce%20600%20Series%20Family&sw=


----------



## fastpcman12

pre-order?


----------



## CapnCrunch10

Apparently, you can order directly from EVGA.

http://www.evga.com/products/prodlist.asp


----------



## Lazz767

Only if you're in the US. I'm in Canada and can't find it yet.


----------



## CapnCrunch10

Quote:


> Originally Posted by *jcde7ago*
> 
> Available on EVGA.com now!!!! GO GO GO!!!
> I got my order in!
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.evga.com/products/moreInfo.asp?pn=04G-P4-2690-KR&family=GeForce%20600%20Series%20Family&sw=


Damn, lol... Didn't see your post.

What's the difference between the Signature and the regular version? Besides $50 and the fact you can't order it atm?


----------



## Michalius

I am in the club.


----------



## PatrickCrowely

Quote:


> Originally Posted by *CapnCrunch10*
> 
> Damn, lol... Didn't see your post.
> What's the difference between the Signature and the regular version? Besides $50 and the fact you can't order it atm?


Looks like a T-Shirt & a few accessories...


----------



## CapnCrunch10

I think that's it? "Notify Me" is up.


----------



## fastpcman12

I got a preorder in for the Signature.


----------



## jcde7ago

Quote:


> Originally Posted by *fastpcman12*
> 
> pre-order?


The "Signature Edition" is pre-order. Knowing EVGA, this is just a stock clock speed boost, with maybe an extra goody in the box. Otherwise, it's the same card, as *Nvidia is the ONLY one making the 690 right now - no AIBs. They will all be the same reference card.* EVGA does have the Signature and standard 690 at the same clocks at the moment, though I bet that will change later. The Signature Edition is $1,049.99; the standard is $999. I snagged a standard one...

Dat tax...but officially in the club:



EVGA is literally like 20 minutes from my office (having it delivered there for safety lol), so I'm hoping that even with ground shipping I'll get it tomorrow, which has usually been the case....









I'll be adding in links to the reviews in a moment.


----------



## CapnCrunch10

Surprised it was only on EVGA. I don't see anything on the egg or Amazon.


----------



## fastpcman12

Only EVGA and ASUS are making the cards, according to AnandTech, so watch Newegg for ASUS!


----------



## Wogga

Damn, this thing alone is as powerful as my two 590s, and it OCs...oh, how I want two of them.


----------



## DADDYDC650

Still waiting on Amazon. The more time passes the more I feel like buying a 4GB 680 instead.


----------



## Michalius

Should have done 2 day shipping.


----------



## jcde7ago

Quote:


> Originally Posted by *CapnCrunch10*
> 
> Surprised it was only on EVGA. I don't see anything on the egg or Amazon.


Yeah, EVGA happened to be the first site I checked at exactly 6 AM, as they released the 680 at exactly that time in March. Regardless, they were pretty much the only brand I was going to go with, so that worked out...

Speaking of EVGA...
Quote:


> EVGAJacob_F: Re:EVGA GTX 690! 13 mins. ago (permalink)
> BTW, those looking for backplates, brackets and waterblocks, these are coming soon!


Their article is also up:

http://www.evga.com/articles/00679/


----------



## CapnCrunch10

Quote:


> Originally Posted by *DADDYDC650*
> 
> Still waiting on Amazon. The more time passes the more I feel like buying a 4GB 680 instead.


That's what people said when the 680 released. The 690 managed to be released before 4GB hit the US market.


----------



## CapnCrunch10

Quote:


> Originally Posted by *jcde7ago*
> 
> Yeah, EVGA happened to be the first site i checked at exactly 6AM, as they released the 680 at exactly that time in March. Regardless, they were pretty much the only brand i was going to go with, so that worked out...
> Speaking of EVGA...
> Quote:
> 
> 
> 
> EVGAJacob_F: Re:EVGA GTX 690! 13 mins. ago (permalink)
> BTW, those looking for backplates, brackets and waterblocks, these are coming soon!
> 
> 
> 
> Their article is also up:
> http://www.evga.com/articles/00679/
Click to expand...

Thanks! Nice job on being able to snag one. That tax looks awful...


----------



## pat031

Anyone know the best (cheapest) way to order from Canada?
I feel like if I order it directly from EVGA I will pay a huge customs fee.


----------



## Lazz767

EVGA won't ship to Canada, already tried.


----------



## CSHawkeye

Darn missed out. When I was putting my CC # in it sold out.


----------



## pat031

Quote:


> Originally Posted by *Lazz767*
> 
> EVGA won't ship to Canada, already tried.


Damn, it's weird. When I preorder, I can select Canada and all the different provinces.


----------



## Lazz767

Quote:


> Originally Posted by *pat031*
> 
> Damn it's weird, When I preorder , I can select Canada and all different province


Yeah, I know. Right at the end it will say "Ontario is not a valid province," or wherever you live. I'm pretty disappointed in that.


----------



## Arizonian

Did you order or preorder? Only preorder is up.


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> Did you order or preorder? Only preorder is up.


Ordered the standard GTX 690, not the EVGA Signature Edition.

Pre-order was for the Signature Edition. Will probably come with some sort of EVGA goody (maybe even a backplate). Again all cards produced right now are by Nvidia, no AIBs yet.


----------



## jcde7ago

Review links are now up. Edited a couple of things as well on my main post.


----------



## Arizonian

Well, I pulled the trigger on the Signature for $1,054. I think it comes with a t-shirt and a mouse pad that I could do without. Hopefully there's a backplate too, but it didn't show one. Now the question is how long before they actually send it?









I'm in the club though. It's my first dual GPU.


----------



## Dwood

Got it on notify...hopefully will get it soon.

Just a thought - the 690 won't work in tri-SLI with my other 680, will it?


----------



## shiloh

I've put in a pre-order for the signature edition too!


----------



## KoSoVaR

Quote:


> Originally Posted by *shiloh*
> 
> I've put in a pre-order for the signature edition too!


Where's everyone ordering from? Rather, pre-ordering?


----------



## covert ash

Quote:


> Originally Posted by *Dwood*
> 
> Got it on notify...hopefully will get it soon.
> Just a thought, the 690 wont work in a tri sli with my other 680 will it?


It doesn't appear so at the moment.









Gah!!! The wait for more than 2GB VRAM per GPU is agonizing...









Congrats to all who have bought one thus far!


----------



## Arizonian

Quote:


> Originally Posted by *KoSoVaR*
> 
> Where's everyone ordering from? Rather, pre-ordering?


Directly from EVGA.


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> Well pulled the trigger for the Signature $1054. I think it comes with a t-shirt and a mouse pad that I could do without. Hopefully it's a back plate too but it didn't show it. Now question is how long before they actually send it?
> 
> 
> 
> 
> 
> 
> 
> 
> I'm in the club though. It's my first dual GPU.


List is updated, with current proof of ownership. I've included pre-orders as well.








Quote:


> Originally Posted by *KoSoVaR*
> 
> Where's everyone ordering from? Rather, pre-ordering?


Actual ordering from EVGA for the non-Signature GTX 690 only lasted about 9-10 minutes. That is now on notify. Pre-orders are for the Signature Edition.


----------



## crazedsilence

As much as I would love to try out a single 690 (or even dual) I think I'll stick with my 2 680's







it'll be interesting to see how a 690 performs against SLI 680's however.


----------



## jcde7ago

Quote:


> Originally Posted by *crazedsilence*
> 
> As much as I would love to try out a single 690 (or even dual) I think I'll stick with my 2 680's
> 
> 
> 
> 
> 
> 
> 
> it'll be interesting to see how a 690 performs against SLI 680's however.


Check the various review links. The performance difference between a single GTX 690 and GTX 680s in SLI is about 2%, with the 680s using 50+ watts more power for that 2%.









EDIT: For those of us who got actual orders in today...
Quote:


> EVGAJacob_F: Re:EVGA GTX 690! 15 mins. ago (permalink)
> Yes they will ship today provided the charge goes through, approved, etc.


----------



## CapnCrunch10

You can add me to the list jcd.

Messed up entering my card info the first time, but still managed to get it.

Proof

Even if it's shipping today, it won't get here until next week. Ground shipping is not going to be quick and the cost to send it sooner was not worth it at all.


----------



## Cheesemaster

I ordered two of them (signature edition)! Quad Sli here I come!

Order Date: 05/03/2012

Purchase Information
Product Unit Price Qty Subtotal
04G-P4-2692-KR $1,049.99 1 $1,049.99
Subtotal $1,049.99
Sales Tax $76.12
Shipping $20.73
Grand Total $1,146.84

Purchase Information
Product Unit Price Qty Subtotal
04G-P4-2692-KR $1,049.99 1 $1,049.99
Subtotal $1,049.99
Sales Tax $76.12
Shipping $20.73
Grand Total $1,146.84


----------



## jcde7ago

Quote:


> Originally Posted by *CapnCrunch10*
> 
> You can add me to the list jcd.
> Messed up entering my card info the first time, but still managed to get it.
> Proof
> Even if it's shipping today, it won't get here until next week. Ground shipping is not going to be quick and the cost to send it sooner was not worth it at all.


Done.









Ground shipping... yeah, it will be next Wed/Thurs. for you. EVGA is located literally like... 15-20 miles from me, and since they use UPS, UPS Ground from EVGA to where I'm located has almost always been the same as next day. We do the same thing at my workplace - we ship UPS Ground, and 99% of the time, clients/customers within 30-40 miles get their shipments the next day. I'm just hoping EVGA gets these shipped by the cutoff, lol...


----------



## CapnCrunch10

Damn cheese. You better post your system after you're done. What is the rest of your setup going to look like?


----------



## jcde7ago

Quote:


> Originally Posted by *Cheesemaster*
> 
> I ordered two of them (signature edition)! Quad Sli here I come!
> Order Date: 05/03/2012
> Purchase Information OrderID:
> Product Unit Price Qty Subtotal
> 04G-P4-2692-KR $1,049.99 1 $1,049.99
> Subtotal $1,049.99
> Sales Tax $76.12
> Shipping $20.73
> Grand Total $1,146.84
> Purchase Information OrderID:
> Product Unit Price Qty Subtotal
> 04G-P4-2692-KR $1,049.99 1 $1,049.99
> Subtotal $1,049.99
> Sales Tax $76.12
> Shipping $20.73
> Grand Total $1,146.84


I'll take your word on it and add you...









Also, for those who pre-ordered (as opposed to ordered outright) - *EVGAJacob_F stated that the Signature Edition will ship out "around the end of the month."* As far as we can tell, the only differences are a t-shirt, mousepad, etc. The cards are the same. They will also NOT CHARGE YOU UNTIL THEY SHIP. So feel free to pre-order and cancel anytime before they ship in a few weeks.

Hopefully i will be making you pre-order guys drool if my 690 gets in tomorrow...


----------



## CapnCrunch10

Quote:


> Originally Posted by *jcde7ago*
> 
> Done.
> 
> 
> 
> 
> 
> 
> 
> 
> Ground shipping...yeah, it will be next Wed/Thurs. for you. EVGA is located literally like...15-20 miles from me, and since they use UPS, UPS ground shipping from EVGA to where i'm located has almost always been the same as next day. We do the same thing at my workplace - we ship using UPS ground, and 99% of the time, clients/customers within 30-40 miles will usually get their shipments the next day. I'm just hoping EVGA gets these shipped by the cutoff, lol...


Lucky then. Maybe being forced to pay the tax was worth it if you get it by tomorrow!


----------



## Cheesemaster

Quote:


> Originally Posted by *CapnCrunch10*
> 
> Damn cheese. You better post your system after you're done. What is the rest of your setup going to look like?


Asus Rampage IV Extreme

[email protected] 4.8GHz

16GB Corsair Vengeance [email protected] 10-11-10-28-1T

Corsair Force 3 GT for the OS

Sound Blaster Titanium HD

Corsair SP2500 gaming speaker system

Corsair AX1200 (80 Plus Gold)

Acer 3D monitor

Killer 2100 NIC

(I am currently running GTX 580 3GB cards in 3-way SLI)


----------



## CapnCrunch10

Quote:


> Originally Posted by *Cheesemaster*
> 
> Asus Rampage extreme IV
> [email protected] 4.8ghz
> 16gig corsair vengence [email protected] 10-11-10-28-1T
> corsair force gt3 for OP system
> Titanium HD sound blaster
> Corsair sp2500 gaming speaker system
> corsair AX1200 pro gold
> Acer 3d monitor
> Killer 2100 NIC
> ( I am currently running 580gtx 3gb in 3-way sli)












That is one monster system.


----------



## StormX2

Can I just say - I hate you all ^.^

Never once have I tasted the awesomeness of a high-end GPU - only ever the $200-300 range, and I make them last me a long time.


----------



## Cheesemaster

By the way I am a complete noob... I am so noob in fact.... That I am the noobiest noob that ever noobed a noobie!


----------



## Lazz767

Anyone found one in Canada or know when they'll release it in Canada?


----------



## Michalius

High-end GPUs are the result of creating your own problems with other expensive parts.

Example: Bought a 120hz monitor about 4 months ago. Anything below 120fps now is unacceptable. 7970 just wasn't cutting it.


----------



## CapnCrunch10

Quote:


> Originally Posted by *StormX2*
> 
> can I just say - I hate you all ^.^
> never once ha ve I tasted the aweosmeness of high end GPU - only ever 200-300 $ range and I make them last me a long time


You're what I call the "smart buyer".









If you ever did want to splurge, you'd probably be able to make most of your money back if you sold your high end GPU before something new came out. I've seen used 680 reference cards being sold at retail or more all the time due to high demand and limited availability. Consider it if you ever get the urge.


----------



## Michalius

Anyone have an idea on waterblocks for this bad boy?

I won't be able to resist putting it in before then, but it's a core component of a planned build.


----------



## CapnCrunch10

Quote:


> Originally Posted by *Michalius*
> 
> Anyone have an idea on waterblocks for this bad boy?
> I won't be able to not put it in before then, but it's a core component of a planned build.


Most likely at least two weeks before we see news about it (EK is usually on top of it). Such a shame to remove the really nice looking cooler though.


----------



## Arizonian

I'm going to cancel my pre-order. It's not shipping until the end of the month! Going to hang tight and wait for Monday availability. You can take me off the club list for right now.


----------



## Oystein

Congrats guys! What are your thoughts on 690 vs SLI 680 2GB? Price is virtually identical.


----------



## Lazz767

Quote:


> Originally Posted by *Oystein*
> 
> Congrats guys! What are your thoughts on 690 vs SLI 680 2GB? Price is virtually identical.


Reviews have shown there is only a 1-2% difference between the 690 and SLI 680s.


----------



## dred

Was lurking the forum this morning and ended up getting a sig. edition.
Quote:


> Originally Posted by *Arizonian*
> 
> I'm going to cancel my pre-order. It's not shipping until the end of the month! Going to hang tight and wait for Monday availability. You can take me off the club list for right now.


I'm thinking about doing the same. I jumped the gun.


----------



## Michalius

Quote:


> Originally Posted by *Oystein*
> 
> Congrats guys! What are your thoughts on 690 vs SLI 680 2GB? Price is virtually identical.


If you're looking at water, it's actually less expensive. At least, that's my justification for it.









Also, looks like someone put one up on ebay for $2099. If that sells, I don't know if I'll be able to hold back and not do the same.


----------



## CapnCrunch10

Quote:


> Originally Posted by *Oystein*
> 
> Congrats guys! What are your thoughts on 690 vs SLI 680 2GB? Price is virtually identical.


Personally, I would prefer two non-reference 680s over a 690. Should run cooler and have more OC headroom. Plus, if you find that 680 SLI is overkill for your needs, you could just sell one off.


----------



## CapnCrunch10

Quote:


> Originally Posted by *Michalius*
> 
> If you're looking at water, it's actually less expensive. At least, that's my justification for it.
> 
> 
> 
> 
> 
> 
> 
> 
> Also, looks like someone put one up on ebay for $2099. If that sells, I don't know if I'll be able to hold back and not do the same.


Doubtful that it would sell. That guy always posts ridiculous prices. I think he was trying to sell the Hydro Copper 680s for $1,200 or something before... Who would be stupid/impatient enough not to wait until Monday? EVGA hasn't even called to verify orders yet, and somehow this person managed to score two? The only person I could see buying this at that price is possibly an international buyer.


----------



## blackend

If anyone wants to see 4-way SLI GTX 680s, here it is.

Here is the 3DMark 11 Performance score:

http://3dmark.com/3dm11/3324318;jsessionid=1jfrh0venz76bmma2jrmjikuv


----------



## dred

Quote:


> Originally Posted by *Michalius*
> 
> Anyone have an idea on waterblocks for this bad boy?


^^this^^
Quote:


> Originally Posted by *CapnCrunch10*
> 
> Most likely at least two weeks before we see news about it (EK is usually on top of it).


I'm planning the same thing. As beautiful as the shroud is, WC'ing is where it's at!


----------



## jcde7ago

Quote:


> Originally Posted by *Oystein*
> 
> Congrats guys! What are your thoughts on 690 vs SLI 680 2GB? Price is virtually identical.


Initial availability might come down as the main issue for some. Performance wise, they are on par. If you want less power + less heat, the 690 is the way to go. If you watercool (like I will be doing with my GTX 690), only needing to buy one block is also worth it.









I think AnandTech summed it up pretty well:
Quote:


> For all practical purposes *the GTX 690 is a single card GTX 680 SLI - a single card GTX 680 SLI that consumes noticeably less power under load and is at least marginally quieter too.*
> 
> With that said, this would typically be the part of the review where we would inject a well-placed recap of the potential downsides of multi-GPU technology; but in this case there's really no need. Unlike the GTX 590 and unlike the GTX 295 NVIDIA is not making a performance tradeoff here compared to their single-GPU flagship card. When SLI works the GTX 690 is the fastest card out there, and when SLI doesn't work the GTX 690 is still the fastest card out there. For the first time in a long time using a dual-GPU card doesn't mean sacrificing single-GPU performance, and that's a game changer.
> 
> At this point in time NVIDIA offers two different but compelling solutions for ultra-enthusiast performance; the GTX 690 and GTX 680 SLI, and they complement each other well. For most situations the GTX 690 is going to be the way to go thanks to its lower power consumption and lower noise levels, but for cases that need fully exhausting video cards the GTX 680 SLI can offer the same gaming performance at the same price. Unfortunately we're going to have to put AMD out of the running here; as we've seen in games like Crysis and Metro the 7970 in Crossfire has a great deal of potential, but as it stands Crossfire is simply too broken overall to recommend.
> 
> *The only real question I suppose is simply this: is the GTX 690 worthy of its $999 price tag? I don't believe there's any argument to be had with respect to whether the GTX 690 is worth getting over the GTX 680 SLI, as we've clearly answered that above.* As a $999 card it doesn't double the performance of the $499 GTX 680, but SLI has never offered quite that much of a performance boost. However at the same time SLI has almost always been good enough to justify the cost of another GPU if you must have performance better than what the fastest single GPU can provide, and this is one of those times.
> 
> *Is $999 expensive? Absolutely. Is it worth it? If you're gaming at 2560x1600 or 5760x1200, the GTX 690 is at least worth the consideration. You can certainly get by on less, but if you want 60fps or better and you want it with the same kind of ultra high quality single GPU cards can already deliver at 1920x1080, then you can't do any better than the GTX 690.*


----------



## jincuteguy

So the only place that had 690s up for sale was the EVGA website? 'Cause I didn't see them anywhere else, including Newegg and Amazon.


----------



## jcde7ago

Quote:


> Originally Posted by *jincuteguy*
> 
> So the only place that had 690s up on sale was EVGA website? Cause I didnt see anywhere else including Newegg and Amazon


EVGA and ASUS are the only partners that will be distributing the GTX 690 in North America. EVGA will probably have more in stock next Monday on the 7th, and ASUS will go through NewEgg I presume...though when exactly, no one knows.


----------



## Oystein

Quote:


> Originally Posted by *CapnCrunch10*
> 
> Personally, I would prefer two non-reference 680s over a 690. Should run cooler and have more OC headroom. Plus, if you find that 680 SLI is overkill for your needs, you could just sell one off.


I agree, the 690 hit around 85°C under load IIRC. A bit hot for me. It also didn't overclock as well as the 680s. Of course, if you're going watercooled this all changes.

I think a pair of 680 4GB cards in SLI would be best, though. At surround resolutions (where these cards belong) one can easily pass 2GB when some AA is added.

Big respect for the power consumption, though. I'm impressed.


----------



## CapnCrunch10

Quote:


> Originally Posted by *Oystein*
> 
> I agree, the 690 hit around 85°C under load IIRC. A bit hot for me. Also didn't overclock as well as the 680s. Of course if you're going watercooled this all changes.
> I think a pair of 680 4GB cards in SLI would be best, though. At surround-resolutions (where these cards belong) one can easily pass 2GB when some AA is added.
> Big respect to the power consumption, though. I'm impressed


Well, remember that reference 680s easily get to 80+ under heavy load (like Heaven) without setting the fan control manually to the max. Most people have gotten their cards to OC above 1200 for the boost (which matches the 690 reviews we've seen, which hit anywhere from 1200-1250 boost), and it's fairly rare to see a 1300 boost even with WC. Just another silicon lottery. The biggest thing holding back the 600 line, in my opinion, and something that will likely be the same for the 690, is the lack of voltage control.

I suggest non-reference since the reviews I've seen show temps of 60-65 under load (though no mention of SLIing them, which would mean the top one is going to get fairly warm). If your intention is WC, the 690 will save you a lot in the long run (like Michalius said), considering each full GPU block is $100, and less heat will be generated, which is always a plus.

Also, slot constraints are something to consider. If you plan on having quad SLI, then two 690s will fit in a lot more builds than four 680s.


----------



## Oystein

I even struggle to fit two non-ref 680s, because everyone seems to think it's fun to make coolers that take up more than two slots.









Then again I've just ordered a 4GB 680, so I'm set for now.

I think the 690 would be a no-brainer if it came with 8GB and a better cooler. Maybe in the future.


----------



## CapnCrunch10

Quote:


> Originally Posted by *Oystein*
> 
> I even struggle to fit two non-ref 680s, because everyone seem to think it's fun to make coolers that take up more than 2 slots.
> 
> 
> 
> 
> 
> 
> 
> 
> Then again I've just ordered a 4GB 680, so I'm set for now.
> I think the 690 would be a no-brainer if it came with 8GB and a better cooler. Maybe in the future.


Norway... didn't even see that. Never mind then. I don't think we officially have a 4GB card in the US yet. Are you getting the Phantoms then?


----------



## ceteris

I keep hitting refresh on Newegg and it keeps pushing a refurbished GTX 460


----------



## fastpcman12

email from newegg on status of gtx 690:

Unfortunately, we do not have an ETA for this item until after we receive the first shipment. Once we receive the initial order, we can more accurately estimate future deliveries. Generally it takes us about 6-8 weeks to receive new merchandise once it has been released to the public, due to the nature of supply and demand. In addition, please feel free to use our Auto Notify link for this item. Once the new shipment arrives an email would be sent to you. Please continue to check all prices and availability at www.newegg.com for our most up-to-date status. Any item that is removed from our website usually will be out of stock for longer than 2 weeks. Thank you in advance for your understanding.


----------



## CapnCrunch10

Quote:


> Originally Posted by *fastpcman12*
> 
> email from newegg on status of gtx 690:
> Unfortunately, we do not have an ETA for this item until after we receive the first shipment. Once we receive the initial order, we can more accurately estimate future deliveries. Generally it takes us about 6-8 weeks to receive new merchandise once it has been released to the public, due to the nature of supply and demand. In addition, please feel free to use our Auto Notify link for this item. Once the new shipment arrives an email would be sent to you. Please continue to check all prices and availability at www.newegg.com for our most up-to-date status. Any item that is removed from our website usually will be out of stock for longer than 2 weeks. Thank you in advance for your understanding.


Auto-notify would be helpful if they actually had a page for the 690. Thanks for emailing.


----------



## ceteris

I'd do auto notify if it was even up on their site









I guess we are looking at a Monday launch then.









Grats to all who are going to enjoy their new 690 on the weekend.


----------



## Oystein

Quote:


> Originally Posted by *CapnCrunch10*
> 
> Norway... Didn't even see that. Nevermind then. I don't think we officially have a 4GB in the US officially yet. Are you getting the Phantoms then?


Ordered one Phantom, yes. Can't fit two and still have x16/x16 with my mobo. The Phantom cooler takes up 2.5 slots. I can still SLI if I get a 2 slot card for the top card and put the Phantom below, but none are for sale right now. Thinking MSI Twin Frozr III when that comes out.


----------



## Cheesemaster

Quote:


> Originally Posted by *CapnCrunch10*
> 
> Personally, I would prefer two non-reference 680s over a 690. Should run cooler and have more OC headroom. Plus, if you find that 680 SLI is overkill for your needs, you could just sell one off.


One thing that's cool about running this in quad SLI is that I get to run my sound card and my Killer network card as well! That is one awesome benefit of this setup!


----------



## jincuteguy

Quote:


> Originally Posted by *jcde7ago*
> 
> EVGA and ASUS are the only partners that will be distributing the GTX 690 in North America. EVGA will probably have more in stock next Monday on the 7th, and ASUS will go through NewEgg I presume...though when exactly, no one knows.


What do you mean, EVGA and ASUS are the only partners in North America? You mean they will be the only ones that sell the 690s? What about retailers - are Fry's and Newegg getting them? I'm confused.


----------



## StormX2

Shoot, I'd be happy with a 560 Ti 448, to be honest.

I wonder what a 660 Ti would be like. Hrrmmmmmm.

Thank the gods that I don't care for new games!


----------



## Lazz767

Quote:


> Originally Posted by *jincuteguy*
> 
> What do you mean only EVGA and Asus are the only partners in North America? you mean they will be the only ones that sell the 690s? What about retailers? Like Fry's , Newegg are they getting them? I'm confused.


I think what he meant was that for the initial release day (today), they would be the only ones to sell it. On a side note, as of right now the US is the only place in North America where this card can even be found.


----------



## pat031

I just ordered mine!!!
I will receive it in Québec next Monday.


----------



## V3teran

Ordered mine; it will be here on May 11th (EVGA model).


----------



## tonyjones

Seriously, what do some of you guys do for a living?







Quad-SLI GTX 690s... crazy!


----------



## jcde7ago

A rep from EVGA just called me about 30 minutes ago on my cell phone to verify my order info/shipping address/etc. He said it would ship out today for sure.

I <3 EVGA!!!


----------



## Lazz767

Quote:


> Originally Posted by *pat031*
> 
> I just order mine !!!
> I will receive it in Québec next monday


Where did you order from?


----------



## ceteris

Quote:


> Originally Posted by *jcde7ago*
> 
> A rep from EVGA just called me about 30 minutes ago on my cell phone to verify my order info/shipping address/etc. He said it would ship out today for sure.
> I <3 EVGA!!!


Grats, bro~!









Still F5'ing here lol


----------



## Descadent

Quote:


> Originally Posted by *jcde7ago*
> 
> A rep from EVGA just called me about 30 minutes ago on my cell phone to verify my order info/shipping address/etc. He said it would ship out today for sure.
> I <3 EVGA!!!


nice!

wish I was single in the case of owning a 690... my wife would chop it off if I spent 1k on a graphics card.... lol

have fun!


----------



## Michalius

Quote:


> Originally Posted by *jcde7ago*
> 
> A rep from EVGA just called me about 30 minutes ago on my cell phone to verify my order info/shipping address/etc. He said it would ship out today for sure.
> I <3 EVGA!!!


Same! When I got the call, I was all worried. "Sorry, due to insufficient stock...."

Should have it next week.


----------



## jincuteguy

Quote:


> Originally Posted by *V3teran*
> 
> Ordered mine will be here on the 11th may(EVGA model).


How did you order one? Aren't they sold out on EVGA?


----------



## blumpking

New member here. Just wanted to give a HUGE thanks to jcde7ago for the tip on when the 690 went on sale this morning. I had just enough time to create an account on EVGA.com and get my order placed. Received my phone call from EVGA a few hours ago confirming my order and notifying me about shipping. Can't wait to crank some 2560x1600 resolution on a Dell U3011 Ultrasharp monitor. Gonna be sick!!


----------



## jcde7ago

Quote:


> Originally Posted by *blumpking*
> 
> New member here. Just wanted to give a HUGE thanks to jcde7ago for the tip on when the 690 went on sale this morning. I had just enough time to create an account on EVGA.com and get my order placed. Received my phone call from EVGA a few hours ago confirming my order and notifying me about shipping. Can't wait to crank some 2560x1600 resolution on a Dell U3011 Ultrasharp monitor. Gonna be sick!!


Welcome to OCN!

And glad I was able to help you out this morning, and grats on the GTX 690! I'll add you to the list for now, just be sure to post pics when you receive the card!


----------



## Cheesemaster

Quote:


> Originally Posted by *tonyjones*
> 
> seriously what do some of you guys do for a living
> 
> 
> 
> 
> 
> 
> 
> quad sli gtx 690 crazy!


Sell girl scout cookies


----------



## Cheesemaster

Can I still run my Killer NIC and my Titanium HD sound card when I quad-SLI my GTX 690s?


----------



## Pentium4 531 overclocker

The Epicness of this club....


----------



## jcde7ago

Quote:


> Originally Posted by *Cheesemaster*
> 
> Can i still run my killer nic, and my titanium hd soundcard when i quad my 690gtx's


Hmm, good question. It's really going to depend on your board...you may have mentioned it earlier...but regardless, FILL OUT YOUR SYSTEM SPECS!!


----------



## bitMobber

The GTX 690 is on Newegg for $1,199.99 !!

Why so much!?


----------



## Inglewood78

Quote:


> Originally Posted by *bitMobber*
> 
> The GTX 690 is on Newegg for $1,199.99 !!
> Why so much!?


Not seeing it on their site...


----------



## jcde7ago

Quote:


> Originally Posted by *bitMobber*
> 
> The GTX 690 is on Newegg for $1,199.99 !!
> Why so much!?


Hmm, I don't see it, but I wouldn't be surprised...as much as I love the Egg...they tend to inflate the prices of some of the more popular products to make some huge margins. If it's true that they just now put up the GTX 690 for $1,200, people that have to pay taxes are going to be price-gouged even further...

Anyway, care to post a link?


----------



## Cheesemaster

Quote:


> Originally Posted by *jcde7ago*
> 
> Hmm, good question. It's really going to depend on your board...you may have mentioned it earlier...but regardless, FILL OUT YOUR SYSTEM SPECS!!


Asus Rampage IV Extreme

[email protected] 4.8GHz

16GB Corsair Vengeance [email protected] 10-11-10-28-1T

Corsair Force 3 GT for the OS

Sound Blaster Titanium HD

Corsair SP2500 gaming speaker system

Corsair AX1200 (80 Plus Gold)

Acer 3D monitor

Killer 2100 NIC


----------



## drka0tic

Here you go....
http://www.newegg.com/Product/Product.aspx?Item=N82E16814130781&cm_mmc=SNC-YouTube-_-na-_-na-_-na



INSANE PRICE!


----------






## bitMobber

You can only get to it through the link they put in their YouTube video. If you search the site, it can't be found.

It's almost $1,300 shipped! I have it in my cart right now, but I can't pull the trigger. I don't understand why it costs so much more on Newegg. The $200 more isn't worth the ~10% lower power consumption and 4-5 dB less fan noise compared to 680s in SLI.


----------



## 86JR

This card is overkill for pretty much everything other than folding.

5 of them on an SR2 would be sweet for folding!


----------



## jcde7ago

Quote:


> Originally Posted by *Cheesemaster*
> 
> Asus Rampage extreme IV
> [email protected] 4.8ghz
> 16gig corsair vengence [email protected] 10-11-10-28-1T
> corsair force 3 GT for OP system
> Titanium HD sound blaster
> Corsair sp2500 gaming speaker system
> corsair AX1200 pro gold
> Acer 3d monitor
> Killer 2100 NIC


The RE IV has 40 PCIe 3.0 lanes. X79 is more than capable of supporting the bandwidth of a pair of 690s, and yes, you will be able to use your sound card and NIC just fine. I have the Fatal1ty Recon3D and Killer 2100 myself (got the Killer 2100 on sale for like $30... can't really say it's anything to write home about, though).
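For the curious, here's a quick back-of-the-envelope sketch of why those 40 lanes are plenty. The ~985 MB/s-per-lane figure for PCIe 3.0 is my assumption (8 GT/s signaling with 128b/130b encoding), and the little helper is purely illustrative:

```python
# Back-of-the-envelope PCIe bandwidth check (illustrative numbers only).
# Assumption: PCIe 3.0 delivers roughly 985 MB/s of usable bandwidth
# per lane (8 GT/s signaling with 128b/130b encoding).
PCIE3_MBPS_PER_LANE = 985

def slot_bandwidth_gbps(lanes: int) -> float:
    """Approximate usable bandwidth of a PCIe 3.0 slot, in GB/s."""
    return lanes * PCIE3_MBPS_PER_LANE / 1000

# Two GTX 690s at x16/x16 on a 40-lane X79 CPU:
per_slot = slot_bandwidth_gbps(16)
print(f"each x16 slot: ~{per_slot:.1f} GB/s")  # ~15.8 GB/s per card
print(f"lanes used: {2 * 16} of 40")           # 32 of 40, with 8 spare
```

So even with both cards at full x16, you're only using 32 of the 40 lanes, and ~15.8 GB/s per slot is more than games of this era push over the bus.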


----------



## bitMobber

Do you guys think the price will drop once it can be found on the site through a search?


----------



## Michalius

Quote:


> Originally Posted by *86JR*
> 
> This card is overkill for pretty much everything other than folding.
> 5 of them on an SR2 would be sweet for folding!


Not when you need 120fps in every game.


----------



## Cheesemaster

Quote:


> Originally Posted by *jcde7ago*
> 
> The RE IV has 40 PCI-E 3.0 lanes. X-79 is more than capable of supporting the bandwidth of a pair of 690s, and yes, you will be able to use your sound card and NIC just fine. I have the Fata1ity Recon3D and Killer 2100 myself (got the Killer 2100 on sale for like $30....can't really say if it's anything to write home about though).


Yeah, it does decrease ping, but nothing crazy... it was a gift (the Killer NIC).


----------



## jcde7ago

Quote:


> Originally Posted by *bitMobber*
> 
> You can only get to it through the link they put on their youtube video. If you try to search the site it can't be found.
> It's almost $1,300 shipped! I have it in my cart right now but I can't pull the trigger, I don't understand why it costs so much more on Newegg. The $200 more isn't worth the 10% less power consumption and 4-5 dB less fan noise compared to 680s in SLI.


Thanks! Kind of weird that Newegg would hide the link...check out the end of the link formatting, lol:

EVGA GTX 690 for sale on NewEgg for $1,200: http://www.newegg.com/Product/Product.aspx?Item=N82E16814130781&cm_mmc=SNC-YouTube-_-na-_-na-_-na

Again though, typical Newegg. Their customer service is top-notch, but they really have been ramping up the price inflation on popular, limited-stock items. Hence they're trying to make a killing with their EVGA GTX 690...I'm glad EVGA was honest about it and put it up for $999 this morning.









That said, I would NOT pull the trigger at $1,200, especially if you have to pay sales tax...that puts you $300+ over MSRP. That makes Newegg only slightly less bad than the fleabay and Amazon price-gougers...yuck.









EDIT: Just shared the link in the GTX 690 Reviews thread. While I hope no one actually buys it for $1,200, I shared it anyway for those desperate enough to be price-gouged by Newegg, if they want it that badly. Or for those for whom money is even LESS of an issue than for the people willing to fork out $1K.


----------



## bitMobber

I can't believe Newegg does this. They used to be my go-to site for purchasing new hardware, but now I'm going to consider other retailers for future builds.


----------



## bitMobber

It can be found on the site now. Still priced at $1,199.99

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709+600315498&QksAutoSuggestion=&ShowDeactivatedMark=False&Configurator=&IsNodeId=1&Subcategory=48&description=&hisInDesc=&Ntk=&CFG=&SpeTabStoreType=&AdvancedSearch=1&srchInDesc=


----------



## Shadowness

Quote:


> Originally Posted by *Michalius*
> 
> Not when you need 120fps in every game.


This.


----------



## bitMobber

Everyone should go write a review for the card and let them know their inflated price point is not acceptable.


----------



## bitMobber

By the way, it's out of stock now.


----------



## PatrickCrowely

Quote:


> Originally Posted by *bitMobber*
> 
> By the way, it's out of stock now.


It's back up......


----------



## bitMobber

Quote:


> Originally Posted by *PatrickCrowely*
> 
> It's back up......


Says Auto Notify here.


----------



## Droogie

Shows in stock for me. That price is ridiculous.


----------



## jcde7ago

Quote:


> Originally Posted by *Droogie*
> 
> Shows in stock for me. That price is ridiculous.


It's OOS. They're screwing with their stock. Pretty sure they ran out when it initially went OOS.


----------



## V3teran

Quote:


> Originally Posted by *jincuteguy*
> 
> How did you order one? Aren't they sold out on EVGA?


I got mine from scan.co.uk.
It will be in stock on the 10th of May, and I will have it on the 11th or 12th.


----------



## jcde7ago

I just read this tidbit below in the Tom's Hardware review, which I only now got a chance to read. I posted earlier about AIBs hopefully being able to change the LED status/color based on temperature, for example, and it looks like I was right!!!
Quote:


> An LED up top can actually be controlled through a new API Nvidia is making available to partners. So, it might respond to load, getting lighter and darker, as an example.


That sounds awesome...depending on how cool this card actually stays in the most demanding games with my setup, I might *have to* forgo watercooling and keep this sexy case/shroud on it...hmm...


----------



## ceteris

LOL, $1,199.99 isn't even touching the true value of the card at this very moment. Seeing how it sold out before most of us could even see it posted on Newegg shows that the price is still too low to keep these on the shelves.


----------



## bitMobber

Quote:


> Originally Posted by *ceteris*
> 
> LOL $1,199.99 isn't even touching the true value of the card at this very moment. Seeing how it sold out before most of us could even see it posted on Newegg shows that the price is still too low to keep these on the market.


I don't think so. There will always be buyers, no matter how inflated the price is.

I'm pretty sure Newegg knew they only had a handful in stock and decided to inflate the price 20%, knowing they would still sell out.


----------



## ceteris

Quote:


> Originally Posted by *bitMobber*
> 
> I don't think so. There will always be buyers no matter how much the price is inflated.
> I'm pretty sure Newegg knew they only had a handful in stock and decided to inflate the price 20% knowing they will still sell out.


If that were the case, then the GTX 690's on eBay going for $2,000+ would've been bought out too, lol. Newegg has always been kind with prices, but they are in business for profit. I'm not saying that I want to pay more money than I have to, but if you find a place that sells the card at MSRP or less, you will find that card OOS and/or resold at a higher price, like what's showing up on eBay.


----------



## PatrickCrowely

Newegg waited to put it up on their site so they could price it like they did. They knew gamers would be hungry for the card. I bet some even paid for Saturday shipping....


----------



## tonyjones

Yeah, I was too slow; I'm ready to buy two!


----------



## jcde7ago

Just saw this:



My reaction:


----------



## tonyjones

damn you jcde7ago, post some unboxing pictures in high res please!!


----------



## jcde7ago

Quote:


> Originally Posted by *tonyjones*
> 
> damn you jcde7ago, post some unboxing pictures in high res please!!


Will do, but unfortunately, the highest-res pics you're going to get are at the max res of my iPhone 4...lol. But I will post some as soon as I get it, for sure.


----------



## Cheesemaster

Any reviews of the GTX 690 in quad-SLI mode?


----------



## CapnCrunch10

That's pretty messed up of the egg. I can understand a $50 premium, but $200? And they don't even throw in the free t-shirt and mousepad. Should've expected it, considering they gouged for Ivy as well.

And BTW, the guy sold both his cards for $2,100 on eBay, but he has been the only one to actually sell (there are like 7 sellers now?), leading me to believe that buyer is probably international or isn't going to pay (some sellers forget to enable immediate payment for BIN). Anyone who pays two grand for this card is out of their mind.


----------



## CallsignVega

GTX 690's in stock, get them while they are hot:

http://www.ebay.com/itm/Latest-Graphics-Card-EVGA-GeForce-GTX-690-Best-Card-Ever-/150808827113?pt=PCC_Video_TV_Cards&hash=item231ce814e9#ht_500wt_1158

rofl


----------



## jcde7ago

Quote:


> Originally Posted by *CallsignVega*
> 
> GTX 690's in stock, get them while they are hot:
> http://www.ebay.com/itm/Latest-Graphics-Card-EVGA-GeForce-GTX-690-Best-Card-Ever-/150808827113?pt=PCC_Video_TV_Cards&hash=item231ce814e9#ht_500wt_1158
> rofl


Oh man.

If anyone on OCN is willing to buy one of those for $2,500....PM me first, please. I've got friends who are psychiatrists, and also, I'll sell you the GTX 690 that I'll be getting tomorrow for $500 less than what that dude on eBay is selling his for.


----------



## CallsignVega

Quote:


> Originally Posted by *jcde7ago*
> 
> Oh man.
> If anyone on OCN is willing to buy one of those for $2,500....PM me first please. I've got friends who are psychiatrists and also, i'll sell you the GTX 690 that i''ll be getting tomorrow for $500 less than what that dude on eBay is selling his for.


Haha, yeah, I saw the Newegg price gouging. I think I will stick to going through EVGA directly or Amazon instead of Newegg; they have been pulling some crap like that lately. Although I'm not shopping for a 690, I thought it was hilarious that some noob is trying to flip a 690 for $2,500.


----------



## bitMobber

I like how the seller has (0) feedback, lol. Seems like a legit auction!


----------



## ceteris

LOL, hey now! There are people out there with way more means than the average Joe to pick these up. Of course, they'd probably be some rich Saudi brat prince or trust-fund baby. Small pond to be fishing in, though.


----------



## rwchui

Quick question.

I got a EVGA GTX 690 coming tomorrow or Monday.

Will I have to enable SLI in NVCP?

or

Is it already automatically enabled?

From what I've learned, some say it was automatically enabled with the GTX 590.

This is my first dual-GPU card, and I fear the countless "cannot enable SLI" threads in the official Nvidia SLI forum.

Some others say it's to do with non-genuine/cracked versions of Win7 on P67, Z68, Z77, and X79 boards.

Any comments on this would be appreciated!

Thanks!


----------



## jcde7ago

Quote:


> Originally Posted by *rwchui*
> 
> Quick question.
> I got a EVGA GTX 690 coming tomorrow or Monday.
> Will I have to enable sli under NCP?
> or
> Is it already automatically enabled?
> 
> *With the newer release drivers, it should automatically enable it for you (was the case for me with the GTX 590). But regardless, it acts like a traditional SLI setup with 2 individual cards. There are prompts to enable or disable SLI in NVCP.*
> 
> From what I learned, some say it was automatically enabled with the GTX 590.
> This is my first dual-GPU card and I fear the countless threads about "cannot enable sli" in the Nvidia official sli forum.
> Some other say its to do with un-genuine/cracked versions of Win7 with P67,Z68,Z77,X79 boards
> 
> *I never had a problem with SLI on my P9X79 Pro, so I can't comment on that.*
> 
> Any comments on this would be appreciated!
> Thanks!


Answers in bold, and grats on the 690! Post a pic or send me a PM with your info and I'll get you added to the list ASAP!


----------



## rwchui

Thanks for the quick reply!

And will post pics here as soon as I get it!


----------



## jcde7ago

Quote:


> Originally Posted by *rwchui*
> 
> Thanks for the quick reply!
> And will post pics here as soon as I get it!


Sure thing!

If you ordered yours overnight, or if you live close enough to Anaheim, you might get it tomorrow.

It looks like EVGA is shipping these from SoCal and not their offices here in NorCal, as I just got updated tracking information via UPS, and the estimated delivery date is next Monday...oh well! It'll give me time this weekend to clean up my rig and disassemble my loop, as my GTX 590 will be making its exit and the 690 won't be going under water for a while (if ever, depending on how cool it stays).


----------



## rwchui

Nope, ordered at NCIX this morning, they told me I am second in line.

http://www.ncix.com/products/?sku=71219

ETA May 4th, 2012

So... technically it should be there at the store for pick up tomorrow.

Well... hopefully.


----------



## jcde7ago

Quote:


> Originally Posted by *rwchui*
> 
> Nope, ordered at NCIX this morning, they told me I am second in line.
> http://www.ncix.com/products/?sku=71219
> ETA May 4th, 2012
> So... technically it should be there at the store for pick up tomorrow.
> Well... hopefully.


Nice, you'll be one of the first in the club to receive yours, then.


----------



## rwchui

Note the keyword: *ESTIMATED* Time of Arrival... lol


----------



## brettjv

Quote:


> Originally Posted by *jincuteguy*
> 
> What do you mean only EVGA and Asus are the only partners in North America? you mean they will be the only ones that sell the 690s? What about retailers? Like Fry's , Newegg are they getting them? Im confused.


nV is making the cards, but they're only selling them to EVGA and Asus, not their other partners. Thus, all 690's will either say EVGA or Asus on the box.

Those two companies then do the distribution (i.e. those two brands send them to the retailers they normally use) so the cards could be sold anywhere that normally carries Asus or Evga products.

Basically, those two scored the exclusive rights to brand and distribute all the 690's in North America, probably because they are Nvidia's top two partners in terms of sales and the extent of their distribution networks in N. America, would be my guess.









BTW, I'm coveting one of these things ... can't afford it but bjv DOES WANT!!!


----------



## CapnCrunch10

Quote:


> Originally Posted by *rwchui*
> 
> Nope, ordered at NCIX this morning, they told me I am second in line.
> http://www.ncix.com/products/?sku=71219
> ETA May 4th, 2012
> So... technically it should be there at the store for pick up tomorrow.
> Well... hopefully.


Glad to see NCIX isn't behaving as dastardly as Newegg for pricing on this. Grats on being able to score one.


----------



## Cheesemaster

I had to pull ninja tactics to get two.. I am so excited.. if only I could get my 3960X past 4.8GHz


----------



## Alyan

Quote:


> Originally Posted by *brettjv*
> 
> nV is making the cards, but they're only selling them to EVGA and Asus, not their other partners. Thus, all 690's will either say EVGA or Asus on the box.
> Those two companies then do the distribution (i.e. those two brands send them to the retailers they normally use) so the cards could be sold anywhere that normally carries Asus or Evga products.
> Basically those two scored the exclusive rights to sell all the 690's, probably because they are Nvidia's top two partners in terms of sales and distribution networks in N. America would be my guess


I'm surprised MSI is not on that list.


----------



## ceteris

Quote:


> Originally Posted by *Alyan*
> 
> I'm surprised MSI is not on that list.


MSI has had an oddly low profile compared to the others. You see vids and reps from EVGA and especially ASUS all the time. Most of the MSI news I get is from here, TH, HardOCP, or Hardware Canucks.


----------



## Arizonian

Brettjv you so deserve one of these. I know how much you love the Golden 465's unlocked, but it's time buddy. It's time.


----------



## SirWaWa

http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-690/specifications

Why only one dual-link DVI-D?
Is it normal for dual-GPU cards to have only one dual-link DVI-D?
Wouldn't this matter at higher resolutions?

So are all the 690's, for now, gonna look like... well, reference?
I like everything except for that nasty green Nvidia decided to go with this gen.


----------



## brettjv

Quote:


> Originally Posted by *Arizonian*
> 
> Brettjv you so deserve one of these. I know how much you love the Golden 465's unlocked, but it's time buddy. It's time.


Yeah ... I DO, don't I?

But ... I also *SOOO* want to go see The Wall on the 15th and the tix I want are $300/pair. No way I can do both, not this month anyways.

Maybe I'll just show up at your door to drool over your 690 when you get it


----------



## jcde7ago

Quote:


> Originally Posted by *brettjv*
> 
> nV is making the cards, but they're only selling them to EVGA and Asus, not their other partners. Thus, all 690's will either say EVGA or Asus on the box.
> Those two companies then do the distribution (i.e. those two brands send them to the retailers they normally use) so the cards could be sold anywhere that normally carries Asus or Evga products.
> Basically those two scored the exclusive rights to brand and distribute all the 690's in North America, probably because they are the top two Evga partners in terms of sales and extent of distribution networks in N. America would be my guess
> 
> 
> 
> 
> 
> 
> 
> 
> BTW, I'm coveting one of these things ... can't afford it but bjv DOES WANT!!!


You know you're going to get one..









Also, the other thing I read is that Nvidia is not allowing partners to change the reference design. That means all GTX 690s for the foreseeable future (maybe ever) will have the distinctive housing and unique overall design. Makes sense, considering Nvidia is manufacturing all of these right now, and it's actually not a bad move on their part...personally, I think I'm going to rock mine on air for a while, and if it ends up staying cool, I may never spring for a waterblock.


----------



## mcg75

I happened to be on newegg last night and the 690 was in stock. I was all set to buy and then I noticed they wanted $1199 instead of $999. No thanks Newegg.


----------



## Supper

Will have my 690 this Sunday (let's just say courtesy of a friend pulling some strings), but the polycarbonate glass got me thinking...
It's eventually going to be dusty glass, since the fan blows without a filter, so it might not look good after a few months, no matter how clean the case and room are. Dust is inevitable.
So I might just keep using my reference 680 and seize the extra performance with SLI in the future.
Am I right about the glass?


----------



## blumpking

Tracking numbers show the 690 is going to be delivered on Tuesday (8th). I don't think I can wait that long!!!!


----------



## jcde7ago

Quote:


> Originally Posted by *Supper*
> 
> Will have my 690 this sunday (lets just say courtesy of a friend pulling some string), but the polycarbonate glass got me thinking,...
> Its gonna be eventually a dusty glass though since the fan is blowing without a filter so it might not look good after few months, no matter how clean the case and room is. Dust is inevitable.
> So I might keep using my reference 680 (SLI in the future) and seized the extra performance in SLI.
> Am i right about the glass?


You're absolutely correct about the glass...but that's kind of irrelevant, because in most instances (except for cases designed for top-exhaust GPUs) you're going to:

A) not see the dust, and

B) find that whether the windows are glass or just a plastic shroud, that area WILL accumulate dust regardless, and you'd want to regularly use compressed air to clean it out anyway.

So no, dust really shouldn't be a big deal. Like you said, it's bound to get dusty regardless.


----------



## Supper

Quote:


> Originally Posted by *jcde7ago*
> 
> You're absolutely correct about the glass....but that's kind of irrelevant because in most instances (except for cases that are designed for top-exhaust with GPUS) you're going to
> A) not see the dust
> B) whether or not the windows are glass or was just a plastic shroud, that area WILL accumulate dust regardless, and you'd want to regularly use compressed air to clean it out anyways.
> So no, dust really shouldn't be a big deal. Like you said, it's bound to get dusty regardless.


Yeah, but it kind of ruins the aesthetic, and that matters to me. Furthermore, I don't see how you can blow away the dust using compressed air unless you're talking about taking the card out completely.


----------



## jcde7ago

Quote:


> Originally Posted by *Supper*
> 
> yeah, but its kinda ruin the aesthetic and it matters for me, furthermore i dont see how you can blow away the dust using compressed air unless you are talking about taking off the card completely.


But the same thing goes for EVERY CARD with a fan. Do you just ignore the dust in all the other GPUs you've ever owned just because you couldn't see it?

Bottom line: any card with a fan will need to be cleaned every couple of months and will need to be taken out of the case to be cleaned, period. The 690's see-through windows don't exacerbate this problem any more than normal, especially because the card will be facing DOWN in 90% of the cases it's going to be put in. So you won't even be able to look straight through the windows to see the dust on the glass.


----------



## ceteris

Quote:


> Originally Posted by *brettjv*
> 
> Yeah ... I DO, don't I?


Yeah! GIT WUN~!


----------



## 86JR

Quote:


> Originally Posted by *CapnCrunch10*
> 
> You're what I call the "smart buyer".
> 
> 
> 
> 
> 
> 
> 
> 
> If you ever did want to splurge, you'd probably be able to make most of your money back if you sold your high end GPU before something new came out. I've seen used 680 reference cards being sold at retail or more all the time due to high demand and limited availability. Consider it if you ever get the urge.


Whilst true, I spent £350 on an 8800GTX at release and did not upgrade until 5 years later, which works out to only £70 per year, and I played every game maxed out...


----------



## bolagnaise

Uhhh I am in the wrong job.


----------



## LukaTCE

Finally, some PCI-E 3.0 power







and it wasn't even tested with IB


----------



## blumpking

Here is my order confirmation on the card for proof...

EVGA Order Confirmation.JPG 59k .JPG file


----------



## jcde7ago

Quote:


> Originally Posted by *blumpking*
> 
> Here is my order confirmation on the card for proof...
> 
> EVGA Order Confirmation.JPG 59k .JPG file


Yep, I've added you.


----------



## jcde7ago

Just saw this over at EVGA:
Quote:


> EVGAJacob_F:
> 
> Here is a sneak peek of a new utility for EVGA GTX 690...
> 
> Firstly this utility will only work on EVGA GTX 690 cards, basically you can manually set the brightness for the LED that is on the side of the EVGA GTX 690:
> 
> 
> 
> You can even customize it so that the brightness can increase when GPU utilization increases, FPS increases or even when GPU clock changes if EVGA Precision X is also running.
> 
> 
> 
> Utility should be available next week.


Another reason why I love EVGA.
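The utility isn't out yet, so here's just a hypothetical sketch of the behavior Jacob describes (LED brightness tracking GPU load); the function name and the 10% dim floor are made up for the example, and this is not EVGA's actual code:

```python
def led_brightness(gpu_util, floor=10.0, ceiling=100.0):
    """Linearly map GPU utilization (0-100%) to an LED brightness %.

    Hypothetical illustration of the described EVGA utility behavior:
    idle sits at a dim floor, full load hits full brightness.
    """
    gpu_util = max(0.0, min(100.0, gpu_util))  # clamp bad readings
    return floor + (ceiling - floor) * gpu_util / 100.0

print(led_brightness(0))    # idle -> dim floor
print(led_brightness(50))   # half load -> midway ramp
print(led_brightness(100))  # full load -> full brightness
```

The same ramp could just as easily be driven by FPS or clock speed, which is presumably all the Precision X hook does differently.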


----------



## ceteris

Quote:


> Originally Posted by *jcde7ago*
> 
> Just saw this over at EVGA:
> Quote:
> 
> 
> 
> EVGAJacob_F:
> Here is a sneak peek of a new utility for EVGA GTX 690...
> 
> Firstly this utility will only work on EVGA GTX 690 cards, basically you can manually set the brightness for the LED that is on the side of the EVGA GTX 690:
> 
> You can even customize it so that the brightness can increase when GPU utilization increases, FPS increases or even when GPU clock changes if EVGA Precision X is also running.
> 
> Utility should be available next week.
> 
> 
> 
> Another reason why I love EVGA.

Hehe that won't matter once you slap a waterblock on it.


----------



## shiloh

Finally, I was able to order one from NCIX, and it's shipping today! I cancelled my Signature pre-order on evga.com.

Add me in the club! My first dual GPU (or even SLI) setup!


----------



## jcde7ago

Quote:


> Originally Posted by *ceteris*
> 
> Hehe that won't matter once you slap a waterblock on it.


I'm not going to slap a block on it for a while (especially since none will be available for a couple of weeks, and my 690 is being delivered Monday), and actually, if it ends up staying cool enough, I may not throw a waterblock on it at all.


----------



## Arizonian

Newegg is a rip-off right now with that markup. For $1,200 you'd be able to get two GTX 680 FTW 4GB cards in SLI. The fact that people are buying them anyway is not going to help. Let them keep it.


----------



## phantomphenom

Newegg selling it for $1,200.00....wth? I don't see a waterblock on the card. I don't see it saying it has a straight-up 4GB per GPU instead of 2GB x2.....so there's no reason it should be $200 more if it doesn't have anything different from the reference design. Damnit, Newegg....


----------



## bitMobber

It's going to be interesting come this Monday. If Newegg gets any more 690s in stock from Asus or EVGA, I wonder if they will keep their price at $1,199 or drop it to something more reasonable.


----------



## LuminatX

So this is the forever jealous thread.


----------



## V3teran

Here is my proof kinda


----------



## DADDYDC650

Quote:


> Originally Posted by *LukaTCE*
> 
> Finally some power of PCI-E 3.0
> 
> 
> 
> 
> 
> 
> 
> and even not tested with IB


Why did they test without AA?


----------



## jcde7ago

Quote:


> Originally Posted by *V3teran*
> 
> Here is my proof kinda


I'm going to add you to the list now.

But...did I read that correctly?! Is that 848 GBP...total cost for ONE GTX 690 in the UK?! Dude, that's like $1,400...


----------



## 3930K

Quote:


> Originally Posted by *jcde7ago*
> 
> I'm going to add you to the list now.
> But...did I read that correctly?! Is that 848 GBP...total cost for ONE GTX 690 in the UK?! Dude, that's like $1,400...


The normal price is 899 GBP + shipping... we get badly ripped off here.


----------



## jcde7ago

Quote:


> Originally Posted by *3930K*
> 
> The normal price is 899 GBP + shipping... we get badly ripped here.


The $999 MSRP I paid for mine definitely pales in comparison to how badly shredded you guys are getting across the pond....ouch.
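For anyone wondering how much of that gap is tax versus a genuine markup, here's a back-of-the-envelope sketch. The ~1.61 USD/GBP rate and the 20% UK VAT are my assumptions (spring-2012 ballpark figures), not numbers from the thread:

```python
# Compare the quoted 899 GBP UK sticker price against the $999 US
# MSRP. UK sticker prices include VAT; US MSRP excludes sales tax,
# so the ex-VAT figure is the fairer comparison.
USD_PER_GBP = 1.61  # assumed spring-2012 exchange rate
UK_VAT = 0.20       # assumed UK VAT rate

uk_sticker_gbp = 899.0
uk_usd = uk_sticker_gbp * USD_PER_GBP                       # incl. VAT
uk_usd_ex_vat = uk_sticker_gbp / (1 + UK_VAT) * USD_PER_GBP # excl. VAT

print(f"UK price in USD (incl. VAT): ${uk_usd:,.0f}")
print(f"UK price in USD (excl. VAT): ${uk_usd_ex_vat:,.0f}")
print(f"Premium over $999 MSRP, ex-VAT: ${uk_usd_ex_vat - 999:,.0f}")
```

Even stripping out VAT, the UK price still carries roughly a $200 premium over the US MSRP under these assumptions.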


----------



## 3930K

Quote:


> Originally Posted by *jcde7ago*
> 
> The $999 MSRP I paid for mine definitely pales in comparison to how badly shredded you guys are getting across the pond....ouch.


Here, that's the price of a normal 680 HC.


----------



## jcde7ago

Amazon now has this listed as well:

http://www.amazon.com/EVGA-GeForce-mDisplayPort-Graphics-04G-P4-2690-KR/dp/B007ZRO3U4/ref=amb_link_362951422_3?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=top-1&pf_rd_r=6C025FA1314B4824B93E&pf_rd_t=301&pf_rd_p=1367340022&pf_rd_i=gtx%20690

It's not available or in stock any more, but people can at least sign up for notifications...so it'll give you another possible location to snag one from. I also updated the OP with availability/e-tailer information.


----------



## rwchui

You can now add me to the club









I might be the first owner in my local area. *Toronto, Ontario*









One thing, though: when I picked up my card at the vendor, the associates there told me Nvidia is not making a lot of these cards because of the high-quality materials used (no plastic at all). He told me to expect VERY low quantities of this card in the near future, making it even rarer than the GTX 590 was when first released.

I expect this card to sell out within 10 minutes every time, or at least 2 times faster than the GTX 680, the flagship card Nvidia has been mass-producing to keep up with demand.



Now I'm wondering whether to break the seal and test out the card, or just resell it sealed locally for cash.


----------



## bitMobber

Quote:


> Originally Posted by *jcde7ago*
> 
> Amazon now has this listed as well:
> http://www.amazon.com/EVGA-GeForce-mDisplayPort-Graphics-04G-P4-2690-KR/dp/B007ZRO3U4/ref=amb_link_362951422_3?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=top-1&pf_rd_r=6C025FA1314B4824B93E&pf_rd_t=301&pf_rd_p=1367340022&pf_rd_i=gtx%20690
> It's not available or in stock any more, but people can at least sign up for notifications...so it'll give you another possible location to snag one from. I also updated the OP with availability/e-tailer information.


Awesome. If anyone notices one of those sites has them in stock, please post a response here so the rest of us can have a chance, lol...


----------



## 3930K

Quote:


> Originally Posted by *rwchui*
> 
> You can now add me to the club
> 
> 
> 
> 
> 
> 
> 
> 
> I might be the first owner in my local area. *Toronto, Ontario*
> 
> 
> 
> 
> 
> 
> 
> 
> One thing though, when I picked up at the vendor I got my card from, the associates there told me Nvidia is not making a lot of these cards because of the high quality material they used (no plastic at all), he told me to expect VERY low quantities of this card in the near future, even more rare than the GTX 590 when they were first released.
> I expect this card to sell out much within 10 minutes every time or at least 2 times faster than the GTX 680 which is the flagship card Nvidia has been mass producing to keep up with the demand.
> 
> Now, I am wondering whether to break the seal and test out the card or just resell the card sealed locally for cash.


Break the seal WITH A VID.


----------



## bitMobber

If anyone is interested, I just got off the phone with someone at Fry's; they have no idea when, or even if, they will be receiving the card.


----------



## jcde7ago

Quote:


> Originally Posted by *rwchui*
> 
> You can now add me to the club
> 
> 
> 
> 
> 
> 
> 
> 
> I might be the first owner in my local area. *Toronto, Ontario*
> 
> 
> 
> 
> 
> 
> 
> 
> One thing though, when I picked up at the vendor I got my card from, the associates there told me Nvidia is not making a lot of these cards because of the high quality material they used (no plastic at all), he told me to expect VERY low quantities of this card in the near future, even more rare than the GTX 590 when they were first released.
> I expect this card to sell out much within 10 minutes every time or at least 2 times faster than the GTX 680 which is the flagship card Nvidia has been mass producing to keep up with the demand.
> 
> Now, I am wondering whether to break the seal and test out the card or just resell the card sealed locally for cash.


If you can make AT LEAST $500+ in profit and not feel bad about gouging, sell it...

If not, VIDS AND PICS NAO OR FOREVER BE BANNED FROM THIS CLUB!!!









Ugh, Monday can't get here fast enough...never thought i'd say that about Monday, either...


----------



## 3930K

I'm faced with a dilemma: 690 + block, or 680 HC 4GB. I'm thinking 4GB will be better (as I'm going 3x U3011), but currently the 690 gives me moar performance.


----------



## rwchui

Quote:


> Originally Posted by *jcde7ago*
> 
> If you can make AT LEAST $500+ in profit and not feel bad about gouging, sell it...
> If not, VIDS AND PICS NAO OR FOREVER BE BANNED FROM THIS CLUB!!!
> 
> 
> 
> 
> 
> 
> 
> 
> Ugh, Monday can't get here fast enough...never thought i'd say that about Monday, either...


Hehe! I think I will put it up for sale for the weekend, at least until Monday; after that I might decide to crack the seal and use it.


----------



## jcde7ago

Quote:


> Originally Posted by *3930K*
> 
> I'm faced with a dillema: 690 + block or 680 HC 4GB. I'm thinking 4GB will be better (as I'm going 3x U3011) but currently 690 gives me moar performance.


Or you can head over to the Monitors and Displays forum and pick up 3x Korean 2560x1440 IPS monitors, using the same LG panels as the Dells, for the price of ONE U3011. Believe me when I say they are quality, and nothing to scoff at.









And if you're running Surround at that high a resolution...I have doubts about performance, even with 4GB of VRAM...hmm.


----------



## 3930K

Quote:


> Originally Posted by *jcde7ago*
> 
> Or you can head over to the Monitors and Displays forum and pick up 3x Korean 2560x1440 IPS monitors, using the same LG displays as the Dells, for the price of ONE U3011. Believe me when i say that they are quality, and nothing to scoff at.
> 
> 
> 
> 
> 
> 
> 
> 
> And if you're running Surround with that high of a resolution...i have doubts about performance, even with 4GB of VRAM...hmm.


I would do it... but I don't want to risk dead pixels.


----------



## jcde7ago

Quote:


> Originally Posted by *3930K*
> 
> I would do it... but I dont want to risk dead pixels.


Some sellers offer guaranteed ZERO dead pixels for like $20-30 extra per monitor that you order.


----------



## 3930K

Quote:


> Originally Posted by *jcde7ago*
> 
> Some sellers offer guaranteed ZERO dead pixels for like $20-30 extra per monitor that you order.


Please link to some of those on UK eBay. It's prob a done deal.


----------



## fuzzybass

jcde7ago: How did you find the link to the Amazon page? I tried searching for "gtx 690" since yesterday morning, and it would only come up with GTX 680's. Did you search for the manufacturer (EVGA, in this case)?

Also, does anyone know at exactly what time the 690's will be available on the May 7th date? Will it be like the May 3rd launch, where it was available at exactly 6 a.m., or will it just be like any other day at retail (i.e. when the store opens)?


----------



## jcde7ago

Quote:


> Originally Posted by *3930K*
> 
> Please link some of those on UK eBay. It's prob a done deal


Here you go...and OMG, they are like $316-330 shipped now...WOW. That is a ridiculous steal!!!! Again, these use the same LG-manufactured IPS panels found in the Dells and Apple Cinema Displays. They also have NO ANTI-GLARE COATING. This makes the picture quality THAT MUCH BETTER.

Only difference is these use A- grade panels that may have dead pixels...but if you pay the $20-30 extra to have them checked...there goes that advantage. I have to say, my Achieva Shimian 2560x1440 IPS from eBay is arguably the best hardware investment I've made in the last 2-3 years. It is THAT impressive. These Koreans aren't messing around with their high-res IPS displays, man.









Here is what I have, the Achieva Shimian:

Achieva Shimian, 27" IPS, 2560x1440p @ $316

Here is the other variant, the Yamakasi Catleap...same thing, just higher stand:

Yamakasi Catleap, 27" IPS, 2560x1440p @ $329

It says they ship worldwide. Be sure to message them about zero dead pixel policy....they are both reputable sellers here on OCN, lots of people have bought from them. Thank me later.









EDIT:
Quote:


> Originally Posted by *fuzzybass*
> 
> jcde7ago: How did you find the link to the Amazon page? I tried searching for "gtx 690" since yesterday morning, and it would only come up with GTX 680's. Did you search for the manufacturer (EVGA, in this case)?
> Also, does anyone know at what time exactly the 690's will be available on the May 7th date? Will it be like the May 3rd launch, where it's available at exactly 6 a.m., or will it just be like any other day at retail (ie. when the store opens)?


I'm on a bunch of other forums, and someone else posted the link a few hours ago.


----------



## jcde7ago

Quote:


> Originally Posted by *rwchui*
> 
> Hehe! I think I will put it up for sale for the weekend at least until Monday, after that I might decide to crack the seal and use it


You are a stronger man than I am. I don't think I could resist cracking open that beast for 3 days. I'd give it MAYBE one day to sell locally for cash, hoping for at least ~$500 in profit, and if it didn't sell...that thing would be in my rig within minutes. Good luck with whatever you decide to do, but for now, you're definitely still in the club.


----------



## rwchui

Quote:


> Originally Posted by *jcde7ago*
> 
> You are a stronger man than I am. I don't think I could resist cracking open that beast for 3 days. I'd give it MAYBE one day to sell locally for cash, hoping for at least ~$500 in profit, and if it didn't sell...that thing would be in my rig within minutes. Good luck with whatever you decide to do, but for now, you're definitely still in the club.


Thanks!

I'll keep you informed of what happens!


----------



## V3teran

Quote:


> Originally Posted by *jcde7ago*
> 
> I'm going to add you to the list now.
> But...did I read that correctly?! Is that 848 GBP...total cost for ONE GTX 690 in the UK?! Dude, that's like $1,400...


Yeah that is correct.


----------



## CapnCrunch10

Up on EVGA! How fast can you pull out your credit cards? Go go go.

http://www.evga.com/products/moreInfo.asp?pn=04G-P4-2690-KR&family=GeForce%20600%20Series%20Family&sw=


----------



## jcde7ago

In stock now!!!!

GO GO GO IF YOU HAVEN'T ORDERED ONE ALREADY!!!!

http://www.evga.com/products/moreInfo.asp?pn=04G-P4-2690-KR&family=GeForce%20600%20Series%20Family&sw=


----------



## CapnCrunch10

OOS. Only took 3 min this time.


----------



## jcde7ago

Quote:


> Originally Posted by *CapnCrunch10*
> 
> OOS. Only took 3 min this time.


Yeah...that's insane. Hopefully we'll have a few new members who managed to snag one this time around...


----------



## Lazz767

Quote:


> Originally Posted by *rwchui*
> 
> You can now add me to the club
> 
> 
> 
> 
> 
> 
> 
> 
> I might be the first owner in my local area. *Toronto, Ontario*
> 
> 
> 
> 
> 
> 
> 
> 
> One thing though: when I picked up my card from the vendor, the associates there told me Nvidia is not making a lot of these cards because of the high-quality materials used (no plastic at all). He told me to expect VERY low quantities of this card in the near future, even rarer than the GTX 590 when it was first released.
> I expect this card to sell out within 10 minutes every time, or at least twice as fast as the GTX 680, which is the flagship card Nvidia has been mass-producing to keep up with demand.
> 
> Now, I am wondering whether to break the seal and test out the card or just resell the card sealed locally for cash.


Where in T.O. did you find it? I can't find it anywhere.


----------



## CapnCrunch10

*PSA*: Be sure to use the "Notify Me" button on EVGA's website since they actually email everyone unlike the Egg.


----------



## jcde7ago

Quote:


> Originally Posted by *Lazz767*
> 
> Where in T.O. did you find it? I can't find it anywhere.


3 places to get it....I posted links on the first page. Check the "Availability" section. All three sites (EVGA, Amazon, NewEgg) offer email notifications to alert you when they have 690s in stock. I'd start there.


----------



## Lazz767

Quote:


> Originally Posted by *jcde7ago*
> 
> 3 places to get it....i posted links on the first page. Check the "Availability" section. All three sites (EVGA, Amazon, NewEgg) offer email notifications to alert you when they have 690s in stock. I'd start there.


I know those, but he said he picked it up from a local vendor; I'd like to know where in Toronto, since I live there too.


----------



## bitMobber

Quote:


> Originally Posted by *jcde7ago*
> 
> In stock now!!!!
> GO GO GO IF YOU HAVEN'T ORDERED ONE ALREADY!!!!
> http://www.evga.com/products/moreInfo.asp?pn=04G-P4-2690-KR&family=GeForce%20600%20Series%20Family&sw=


I got the in stock notification email at 5:59pm PST. Added to cart > Registered > Entered Credit Card > Clicked Place Order and..........OUT OF STOCK!









It took me about 1 minute to register and click through purchase pages, and within that time I lost out, lol. Oh well, I'm sure they'll get more in stock Monday, and this time I'm ready! Haha.









I'm wondering, when did everyone else get their notification email?


----------



## rwchui

Local stock is going to be very hard to find now; the shop I went to said only 5 came in, and they were already all sold to customers.

They also told me they don't know when the manufacturer will send more. Could be days or even weeks.

I would recommend finding one online.


----------



## jcde7ago

Quote:


> Originally Posted by *shiloh*
> 
> Finally I was able to order one from NCIX and it's shipping today! I cancelled my signature pre order on evga.com.
> Add me in the club! My first dual GPU (or even SLI) setup!


Added.


----------



## fuzzybass

As much as I appreciate the info provided by this thread, I can't help but notice preferential treatment towards those in the "690 club", versus those who are not in the club.


----------



## jcde7ago

Quote:


> Originally Posted by *fuzzybass*
> 
> As much as I appreciate the info provided by this thread, I can't help but notice preferential treatment towards those in the "690 club", versus those who are not in the club.


Erm, how so? I've done my absolute best to respond to everyone in this thread who had questions or asked to be added. I'm not exactly sure what you mean by your statement. What would you like me to do better? I pretty much try to keep all the relevant information one could possibly want on the front page and then answer questions as they come in. I would really like to know why you think there is "preferential treatment" for club members only...









EDIT: Yeah, I'm really confused. I just went through all 250+ posts in this thread, and I'm really not sure where you're coming from. I pretty much tried to answer everyone and provide info as it came in, and to update the first page with new members and relevant links and such. I'm sorry you feel that way, but either you're overreacting, or some people's posts here offended you. The majority of the posts here talk about stock and exchange nice words about the card. So again, if you want to clarify what you mean by "preferential treatment," I'd love to be able to accommodate you like I do everyone else. I'll also apologize if it's weird that someone who joined today and has a total of two posts would be so randomly put off by this thread...


----------



## Lazz767

Quote:


> Originally Posted by *fuzzybass*
> 
> As much as I appreciate the info provided by this thread, I can't help but notice preferential treatment towards those in the "690 club", versus those who are not in the club.


I'm not in the club, and I think these guys are helping as much as they can. I know it sucks for those who are having issues getting the card, but I don't think anyone is getting preferred treatment. There is only so much that can be done to help, and again, I think everyone in this forum is doing their best to help out.


----------



## jcde7ago

Quote:


> Originally Posted by *Lazz767*
> 
> I'm not in the club and I think these guys are helping as much as they can. I know it sucks for those who are having issues getting the video card but I don't think anyone is getting preferred treatment, there is only so much that can be done to help and again I think everyone in this forum is doing their best to help out.


Yeah, thanks for the kind words.









I can't for the life of me understand where he's coming from...all I'm doing is trying to provide info as it comes in and answer questions that I see may not have been addressed yet. I want this thread to be a resource first and foremost, and a nice place to discuss all things 690-related second, including helping people find stock and whatnot. Not sure there is much more I can do.


----------



## bnj2

I just preordered one, and it seems the madness has started to set in, as I'm thinking of actually getting another one for my surround setup








So, will an AX850 PSU be enough for 2 of them + a 2600K, or should I get another PSU as well?


----------



## CapnCrunch10

Quote:


> Originally Posted by *bitMobber*
> 
> I got the in stock notification email at 5:59pm PST. Added to cart > Registered > Entered Credit Card > Clicked Place Order and..........OUT OF STOCK!
> 
> 
> 
> 
> 
> 
> 
> 
> It took me about 1 minute to register and click through purchase pages, and within that time I lost out, lol. Oh well, I'm sure they'll get more in stock Monday, and this time I'm ready! Haha.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm wondering, when did everyone else get their notification email at?


?:58 for me. They literally try to send it en masse to everyone at the same time.


----------



## jcde7ago

Quote:


> Originally Posted by *bnj2*
> 
> I just preordered one and seems madness started to set in as I'm thinking of actually getting another one for my surround setup
> 
> 
> 
> 
> 
> 
> 
> 
> So, will a AX850 PSU be enough for 2 of them + a 2600k or should I get another PSU as well?


I would not risk 2x GTX 690s + an i7 2600K on an 850W PSU, especially if you're OC'ing that 2600K (of course you are, this is OCN lol)... 2x GTX 690s will have a combined TDP of 600W, plus your OC'd i7 2600K, plus everything else... even at 80+% efficiency, that doesn't leave enough headroom for me to be comfortable, personally. If you're going to plunk down the cash for a pair of these bad boys... plunk down a bit extra for a quality 1000W PSU.
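Purely for illustration, the headroom math above can be sketched out in a few lines; every wattage figure below is a rough assumption for the sake of the example, not a measurement:

```python
# Back-of-the-envelope PSU sizing for a 2x GTX 690 + overclocked i7 2600K build.
# All wattages are rough assumptions for illustration only.
components_w = {
    "2x GTX 690 (300 W TDP each)": 600,
    "i7 2600K, overclocked": 150,
    "motherboard, RAM, drives, fans": 80,
}

total_dc_w = sum(components_w.values())  # DC load the PSU must deliver

# Keep the PSU at or below ~80% load for stability and efficiency headroom.
headroom = 0.20
recommended_psu_w = total_dc_w / (1 - headroom)

print(f"Estimated DC load: {total_dc_w} W")
print(f"Recommended PSU:   {recommended_psu_w:.0f} W or more")
```

By this rough estimate the build lands right around the 1000W mark, which matches the advice above to skip the 850W unit.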


----------



## CapnCrunch10

Quote:


> Originally Posted by *bnj2*
> 
> I just preordered one and seems madness started to set in as I'm thinking of actually getting another one for my surround setup
> 
> 
> 
> 
> 
> 
> 
> 
> So, will a AX850 PSU be enough for 2 of them + a 2600k or should I get another PSU as well?


Geez... Some of you guys are ridiculous/awesome/ridiculously awesome. You should try and get the matching SLI connector too!



Ummm... the 850W might be able to handle it, but why take the risk? Plus, a higher-wattage PSU would run more efficiently, since you'd be pushing an 850W unit near its limits.

If you're OC'ing too, I think you might as well upgrade, especially considering that you're shelling out 2 grand for graphics cards.


----------



## ReignsOfPower

I'll be picking one of these bad boys up with an EK waterblock and backplate when they become available. My pre-order with my local vendor is placed.


----------



## CapnCrunch10

Quote:


> Originally Posted by *jcde7ago*
> 
> Yeah, thanks for the kind words.
> 
> 
> 
> 
> 
> 
> 
> 
> I can't for the life of me understand where he's coming from...all i'm doing is trying to provide info as it comes in, and answering questions that I see that may not have been addressed at all. I want this thread to be a resource first and foremost, and a nice place to discuss all things 690-related secondly, including helping people find stock and whatnot. Not sure if there is much more I can do.


Ignore him. I think it was some kind of misguided and lame joke.

If he really was offended, then we can wait to see what it is that irked him or any examples he can provide.


----------



## fuzzybass

Listen, it's not a big deal.







Like I said before, I appreciate all the info.


----------



## jcde7ago

Quote:


> Originally Posted by *ReignsOfPower*
> 
> I'll be picking one of these bad boys up with an EK waterblock and backplate when they become available. My pre-order with my local vendor is placed.


Nice!









I really, really want to put a waterblock on mine...like, really. But DAT SHROUD!!!









Like I've been saying over and over...if the 690 can stay cool enough in games like BF3 completely maxed out, even in the summer (summer is coming here in the States...), then I may end up not getting a block and just keeping the shroud on for a prolonged period of time.

Then again, once I see people start to push their boost OCs to the limit under water and the gains are awesome while keeping the temps amazingly cool...I will not even hesitate to throw this bad boy in my loop.








Quote:


> Originally Posted by *fuzzybass*
> 
> Listen, it's not a big deal.
> 
> 
> 
> 
> 
> 
> 
> Like I said before, I appreciate all the info.


I'll take your word for it...I don't even mind criticism as long as it's justified.


----------



## ReignsOfPower

^ I know what you mean. It's a shame I gotta pull off that beauty of a heatsink and put it back in the box. I might consider picking up an EVGA watercooled version from the get-go, but I do usually prefer doing it myself


----------



## jcde7ago

Quote:


> Originally Posted by *ReignsOfPower*
> 
> ^ I know what you mean. It's a shame I gotta pull off that beauty of a heatsink and put it back in the box. I might consider picking up an EVGA watercooled version from the get-go, but I do usually prefer doing it myself


Yeah, i hear ya...it is a shame indeed. We can use it as a centerpiece display instead, lol...









As for a watercooled version...it will take months for that to happen. For the time being, Nvidia is not allowing any of its partners to change the reference design - which doesn't really make a difference at the moment, since Nvidia is the only one making the 690s anyway. So yeah, I think Nvidia really wants to ingrain the GTX 690's unique design into everyone's heads, and I don't blame them...I think it's a good idea for now, actually.


----------



## CapnCrunch10

Quote:


> Originally Posted by *fuzzybass*
> 
> ^ I rest my case.


Lol what case? Were you waiting for someone to post that your original comment was frivolous?

If you need help, ask. You're being the jerk at the moment for not telling us what you need and then chastising us for not being helpful.

If I offended you in some way, let me be the first to apologize.


----------



## jcde7ago

No one's mentioned this yet, but...even though I may never have a need or desire to move to quad-SLI in the future, I'd still like to know where/if we can even get our hands on the sweet-looking SLI bridge in the press release pic below, as it matches the 690's design scheme...


----------



## Lazz767

Quote:


> Originally Posted by *jcde7ago*
> 
> No one's mentioned this yet, but...even though I may never have a need or desire to move to quad-SLI in the future, i'd still like to know where/if we can even get our hands on the sweet-looking SLI-bridge in the press release pic below,as it matches the 690's design scheme...


You'd figure they'd include it in the box, it being $1k and all.


----------



## Basti

Can I join you guys?







I can't wait to get a hold of this badboy!



Oh yea, Canada Tax = ***.


----------



## SimpleTech

Quote:


> Originally Posted by *jcde7ago*
> 
> No one's mentioned this yet, but...even though I may never have a need or desire to move to quad-SLI in the future, i'd still like to know where/if we can even get our hands on the sweet-looking SLI-bridge in the press release pic below,as it matches the 690's design scheme...


Mentioned about 3½ hours ago.

http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/260#post_17156514


----------



## jcde7ago

Quote:


> Originally Posted by *SimpleTech*
> 
> Mentioned about 3½ hours ago.
> http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/260#post_17156514


Oh...thanks for that....not sure how I missed it.









You guys are keeping better track of my thread than I am at times, haha...


----------



## jcde7ago

Quote:


> Originally Posted by *Basti*
> 
> Can I join you guys?
> 
> 
> 
> 
> 
> 
> 
> I can't wait to get a hold of this badboy!
> 
> Oh yea, Canada Tax = ***.


Grats on being able to snag one! And of course....added!








Quote:


> Originally Posted by *Lazz767*
> 
> You figure they'd include it in the box, being $1k and all.


Yeah, but I can also understand why they don't, as SLI bridges are usually included with Motherboards, and not GPUs. Still though...DAT SLI-BRIDGE! If they're not giving it out with our 690s, they at least need to let us buy that darn thing...


----------



## Cheesemaster

Quote:


> Originally Posted by *jcde7ago*
> 
> No one's mentioned this yet, but...even though I may never have a need or desire to move to quad-SLI in the future, i'd still like to know where/if we can even get our hands on the sweet-looking SLI-bridge in the press release pic below,as it matches the 690's design scheme...


Yeah tell me about it! I am quading these; and I wantz!


----------



## ceteris

OMG... I was watching The Avengers at the movie theatre when I got e-mailed by EVGA that they were in stock. ***!


----------



## OmegaRED.

Quote:


> Originally Posted by *ceteris*
> 
> OMG... I was watching The Avengers at the movie theatre when I got e-mailed by EVGA that they were in stock. ***!


That's literally the only acceptable reason for using a cellphone in the theater.


----------



## Lazz767

Quote:


> Originally Posted by *Basti*
> 
> Can I join you guys?
> 
> 
> 
> 
> 
> 
> 
> I can't wait to get a hold of this badboy!
> 
> Oh yea, Canada Tax = ***.


Did you actually snag one, or did you just order one while it's back-ordered?


----------



## OmegaRED.

Quote:


> Originally Posted by *Basti*
> 
> Can I join you guys?
> 
> 
> 
> 
> 
> 
> 
> I can't wait to get a hold of this badboy!
> 
> Oh yea, Canada Tax = ***.


Not bad, I paid $100 more for my SLI 680s from NCIX.


----------



## Basti

@Lazz767
I ordered it on May 3rd around 11pm PST, while it still said "available in 5-10 business days".
My order was processed this afternoon, but I haven't received an email that it has shipped yet.
I hope they shipped it already, or will on Monday.


----------



## 3930K

Quote:


> Originally Posted by *jcde7ago*
> 
> Here you go...and OMG, they are like $316-330 shipped now...WOW. That is a ridiculous steal!!!! Again, these use the same LG-manufactured IPS panels found in the Dells and Apple Cinema Displays. They also have NO ANTI-GLARE COATING. This makes the picture quality THAT MUCH BETTER.
> Only difference is these use A- grade panels that may have dead pixels...but if you pay the $20-30 extra to have them check...there goes that advantage. I have to say, my Achieva Shimian 2560x1440 IPS from eBay is arguably the best hardware investment i've made in the last 2-3 years. It is THAT impressive. These Koreans aren't messing around with their high-res IPS displays, man.
> 
> 
> 
> 
> 
> 
> 
> 
> Here is what I have, the Achieva Shimian:
> Achieva Shimian, 27" IPS, 2560x1440p @ $316
> Here is the other variant, the Yamakasi Catleap...same thing, just higher stand:
> Yamakasi Catleap, 27" IPS, 2560x1440p @ $329
> It says they ship worldwide. Be sure to message them about zero dead pixel policy....they are both reputable sellers here on OCN, lots of people have bought from them. Thank me later.
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT:
> I'm on a bunch of other forums, and someone else posted the link a few hours ago.


OK, I can't really buy all of them at once, as that'd be $3K ($2K for the 690s and $1K for the monitors), and it would leave me with only a tiny bit of money for the 3930K + RIVE + Dom GTs + WC.
Anyone know of a Korean 30"?


----------



## barkinos98

People might think I'm ******ed, but since I was planning on SLI'd EVGA 680s, should I go with one 690 instead? The difference is 2-3 fps at most, according to the tables the OP posted, but this does look cooler. And for the same price, should I go with this and wait for a water block, or go with two 680s, which already have blocks out?


----------



## 3930K

Quote:


> Originally Posted by *barkinos98*
> 
> people might think im ******ed but since i was planning on evga 680 sli'ed, should i go with one 690? the difference is max 2-3fps according to the tables that the op posted, but this does look cooler. and for the same price, should i go with this and wait a water block or go with 2 680 which already has the blocks out.


Please tell me your mobo.


----------



## V3teran

Quote:


> Originally Posted by *Lazz767*
> 
> You figure they'd include it in the box, being $1k and all.


Totally agree, they should include them in the box.
I will also be putting mine in a custom loop once the novelty of its looks wears off; it will have an RX360 rad all to itself.


----------



## bnj2

Quote:


> Originally Posted by *jcde7ago*
> 
> I would not risk 2x GTX 690s + an i7 2600K on an 850w PSU, especially if you're OC'ing that 2600K (of course you are this is OCN lol
> 
> 
> 
> 
> 
> 
> 
> )... 2x GTX 690s will have a TDP of 600w, + your OC'd i7 2600K, + everything else...even at 80+% efficiency, that doesn't leave enough headroom that i'd be comfortable with, personally. If you're going to plunk down the cash for a pair of these bad boys...plunk down a bit more extra for a quality 1K watt PSU.


Quote:


> Originally Posted by *CapnCrunch10*
> 
> Geez... Some of you guys are ridiculous/awesome/ridiculously awesome. You should try and get the matching SLI connector too!
> Ummm... It might be able to handle 850W, but why take the risk? Plus, it would be better efficiency for a higher wattage PSU since you'd be pushing the limits.
> I think you should upgrade if you're OCing too, I think you might as well upgrade. Especially considering that you're shelling out 2 grand for graphics cards.


Yeah, the problem is that I'm stretching too much getting a 2nd one already (in Europe, prices for 690s are higher), and a good 1000W+ PSU is about another $300-400 if not more, but yeah... madness costs, and I will regret it in a few months








From the tests I've seen, an Ivy Bridge system, mildly overclocked, draws about 450W of power at the wall; that's why I had hopes.
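One nuance worth keeping in mind with wall-socket readings: they measure AC draw, which overstates the DC load the PSU actually delivers. A quick conversion sketch (the efficiency figure here is an assumed, typical value for illustration, not a spec for any particular unit):

```python
# Convert a measured AC wall draw into the approximate DC load on the PSU.
wall_draw_w = 450    # AC draw at the outlet, as reported above
efficiency = 0.88    # assumed PSU efficiency at this load (illustrative only)

dc_load_w = wall_draw_w * efficiency
print(f"Approximate DC load on the PSU: {dc_load_w:.0f} W")
```

So a 450W wall reading corresponds to roughly 400W of actual DC load, though a second 690 would add far more on top of that.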

Anyway, does anyone know where to get that awesome-looking SLI bridge?


----------



## CapnCrunch10

Quote:


> Originally Posted by *jcde7ago*
> 
> Oh...thanks for that....not sure how I missed it.
> 
> 
> 
> 
> 
> 
> 
> 
> You guys are keeping better track of my thread than i am at times, haha...


That was my bad. I originally quoted the wrong person, thinking they had bought two of the 690s, so I shifted the quote to the proper reply.

Quote:


> Originally Posted by *barkinos98*
> 
> people might think im ******ed but since i was planning on evga 680 sli'ed, should i go with one 690? the difference is max 2-3fps according to the tables that the op posted, but this does look cooler. and for the same price, should i go with this and wait a water block or go with 2 680 which already has the blocks out.


I think we hashed this one out quite well. Go ahead and start here and work your way up.


----------



## Sh1n1ng Forc3

Can I please join the club now? I ordered Thursday night at 4:40pm PST from Newegg (I know I took it up the pooper on this one). My GTX 690 will be here on Monday according to UPS tracking info.


----------



## jcde7ago

Quote:


> Originally Posted by *Sh1n1ng Forc3*
> 
> Can I please join the club now. I ordered Thursday night at 4:40pm PST from newegg (I know I took it up the pooper on this one). My GTX 690 will be here on Monday according to UPS tracking info.


Grats, and added.


----------



## Arizonian

Well, I've kept the GTX 690 Signature on pre-order and did not cancel it. Waiting until the end of the month is going to kill me, but at this rate I've not been able to land one anyway.

In the meantime, the kids want the 680 from me in the X58 rig I handed down, and I will be trying to move to the integrated HD 4000 GPU on my new Ivy until the 690 arrives.

I just can't seem to move to the onboard video settings with the new UEFI BIOS. Will post back when it arrives.

$1054.28 is much better than the rip-offs on other sites. I don't need a shirt or mouse pad, but at least it's something for the extra $55 rather than just a price hike.


----------



## Majin SSJ Eric

Dang, I'm having a hard time imagining how awesome it would be to have two of these in my rig! Of course at the inflated prices of right now you'd have to be mental to buy two of them....


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> Well I've kept the GTX 690 Signature on Pre-order and did not cancel it. Waiting until the end of the month is going to kill me but this rate I've not been able to land one anyway.
> In the meantime the kids want the 680 from me in their X58 I handed down and I will be trying to move to the integrated 4000 GPU on my new Ivy until the 690 arrives.
> Just can't seem to move to the onboard video settings with the new UEFI Bios. Will post back when it arrives.
> $1054.28 is much better than the rip offs on other sites. I don't need a shirt or mouse pad but at least it's something for the extra $55 rather than just a price hike.


You're back on the list.


----------



## 3930K

God I want a 690, but I'm not sure I'll manage with the 2GB/GPU. Opinions?


----------



## sundrou

Dear Mr

This email is to acknowledge placement of order number OC1416139.
Order date and time: 3 May, 12, 9:34 pm.

Your order consisted of the following items:

Item: MSI GeForce GTX 690 4096MB PCI-Express Graphics Card - Qty: 1 - £733.29
Sub Total: £733.29
Shipping (DHL Worldwide Express): £46.45
Total VAT: £0
Total inc. VAT: £779.74

Can I join?


----------



## ceteris

Quote:


> Originally Posted by *3930K*
> 
> God I want a 690, but I'm not sure I'll manage with the 2GB/GPU. Opinions?


I'm sort of in the same boat as you, bro. One of the main things driving me is that I want to stay on a 1-PSU solution rather than break my new build into 2 of them. My new build's case is a Caselabs STH-10, and I want to put 3 560 rads and 1 280 rad on the side that the PSU is on. If I had to use 2 PSUs, I would have to ditch 1 560 rad and the 280 and go with 2 360's, which is what I don't want, since I already have all 3 560's and the 280.

Although I'm not sure my current PSU is going to be able to power all that, I at least know that if I do upgrade my PSU, I probably won't need to go beyond the good ol' reliable Corsair AX1200. Plus I can get the pre-braided cable set along with it if I want.


----------



## sundrou

As this is my first time dealing with SLI: is it possible to add my GTX 680 together with the GTX 690?


----------



## yfz350rider

I'm jealous. I plan on buying a 690 as soon as I find a buyer for my 590.


----------



## yfz350rider

As far as I know, it is not possible to do that. With AMD cards it is possible to take a 6990 and pair it with a 6970, but with NVIDIA cards (500 series for sure) you can only pair dual-GPU cards with dual-GPU cards.


----------



## sundrou

Quote:


> Originally Posted by *yfz350rider*
> 
> As far as I know It is not possible to do that. With AMD cards it is possible to take a 6990 and pair it with a 6970 but with the NVIDIA cards (500 series for sure) you can only pair dual gpu cards with dual gpu cards.


Thanks for the fast response


----------



## TheRainMan

Can you do GTX 690 + GTX 680 SLI? Really curious


----------



## sundrou

Quote:


> Originally Posted by *TheRainMan*
> 
> Can you do GTX690 +GTX680 SLI? Really curious


Not sure, I could try lol, but I'm afraid to as I don't have good info on this


----------



## 3930K

Quote:


> Originally Posted by *ceteris*
> 
> I'm sort of in the same boat as you, bro. One of the main things driving me is that I want to stay on a 1 PSU solution rather than break my new build into 2 of them. My new build's case is a Caselabs STH-10 and I want to put 3 GTX 560's and 1 GTX 280 on the side that the PSU is on. If I had to use 2 PSUs, I will have to ditch 1 560 rad and the 280 and go with 2 360's, which is what I don't want since I already have all 3 560's and the 280.
> Although I'm not sure my current PSU is going to be able to power all that, I at least know that if I do upgrade my PSU, I probably won't need to go beyond the good ol' reliable Corsair AX1200. Plus I can get the pre-braided cable set along with it if I want.


Yeah, so would I. But then again, I've got 4 PSU slots in my case.

Then again, I also want as many mini-ITX folders in that case as well.


----------



## Lazz767

Add me to the club!



On a side note, anyone in Canada looking for it go here to order it: http://www.tigerdirect.ca/applications/SearchTools/item-details.asp?EdpNo=2580893&srkey=E145-0690


----------



## TheRainMan

^ Unfortunate that the US TigerDirect website doesn't have it in stock


----------



## Lazz767

Quote:


> Originally Posted by *TheRainMan*
> 
> ^unfortunate that the US tigerdirect website doesn't have it in stock


I know. It was a pain for me to even find. They're not actually listing it on the website; I had to do a Google search for it, and that found the link.


----------



## Cheesemaster

Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Dang, I'm having a hard time imagining how awesome it would be to have two of these in my rig! Of course at the inflated prices of right now you'd have to be mental to buy two of them....


I must be absolutely ******ED!


----------



## jcde7ago

Quote:


> Originally Posted by *Lazz767*
> 
> Add me to the club!
> 
> On a side note, anyone in Canada looking for it go here to order it: http://www.tigerdirect.ca/applications/SearchTools/item-details.asp?EdpNo=2580893&srkey=E145-0690


Grats, and added.









I've also added the link to Tiger Direct Canada for our Canadian members.
Quote:


> Originally Posted by *TheRainMan*
> 
> Can you do GTX690 +GTX680 SLI? Really curious


No. Nvidia only supports SLI between cards of the same model.

If people are going to SLI the GTX 600 series past two GPUs, I would recommend sticking with GTX 680s (especially if they release 4GB variants that are...actually possible to get a hold of), because scaling past three GPUs is not worth it most of the time. If you have the money to burn, sure, 2x GTX 690s WILL be better than 3x 680s, but not at the cost of an extra $500.00. You run into diminishing returns past Tri-SLI in a lot of ways, unless Nvidia is somehow able to improve quad-SLI scaling in later GTX 600 driver releases.
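To put the diminishing-returns argument in rough numbers, here is a back-of-the-envelope price/performance sketch. The per-card scaling factors below are assumed, illustrative values (not benchmarks), and the prices assume $499 per GTX 680 and $999 per GTX 690:

```python
# Illustrative SLI price/performance comparison. The scaling factors are
# ASSUMED numbers chosen to reflect typical diminishing returns past two
# GPUs; they are not measured benchmarks.
def effective_gpus(n_gpus, marginal_scaling):
    """Sum the marginal contribution of each additional GPU."""
    return sum(marginal_scaling[:n_gpus])

# Hypothetical marginal scaling for the 1st..4th GPU in SLI.
scaling = [1.00, 0.90, 0.70, 0.40]

configs = {
    "2x GTX 680 (SLI)":      (2, 2 * 499),
    "3x GTX 680 (Tri-SLI)":  (3, 3 * 499),
    "2x GTX 690 (quad-SLI)": (4, 2 * 999),
}

for name, (gpus, price) in configs.items():
    perf = effective_gpus(gpus, scaling)
    print(f"{name}: ~{perf:.2f}x single-GPU perf at ${price} "
          f"({perf / price * 1000:.2f}x per $1000)")
```

Under these assumed numbers, 2x 690s do edge out 3x 680s in absolute performance (~3.0x vs ~2.6x), but at noticeably worse performance per dollar, which is the trade-off described above.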


----------



## Cheesemaster

Quote:


> Originally Posted by *jcde7ago*
> 
> Grats, and added.
> 
> 
> 
> 
> 
> 
> 
> 
> I've also added the link to Tiger Direct Canada for our Canadian members.
> No. You can only SLI two of the same model cards for Nvidia.
> If people are going to SLI the GTX 600 series past two cards, I would recommend sticking with GTX 680s (especially if they release 4GB variants that are...actually possible to get a hold of). This is because scaling past three cards at MOST is not worth it MOST of the time. If you have the money to burn, sure, 2x GTX 690s WILL be better than 3x 680s, but not at the cost of an extra $500.00. You will run into diminishing returns past Tri-SLI in a lot of ways, unless Nvidia somehow is able to up quad-SLI scaling performance in later GTX 600 driver releases.


That's why they made special drivers for the GTX 690 (this is an e-pinion and an assumption!)


----------



## jcde7ago

Quote:


> Originally Posted by *Cheesemaster*
> 
> Thats why they made special drivers for 690gtx ( this is an e-pinion and an assumption!)


They always release card-specific drivers for dual-GPU cards on the initial release, because those are the drivers they had been testing for a while. After that, the GTX 690 will fall in line with the same driver releases that the rest of the GTX 600 Series cards will get.


----------



## drufause

So I'm seeing that this card does support PCIe 3.0. Has Nvidia released a driver that supports PCIe 3.0?


----------



## 3930K

Quote:


> Originally Posted by *drufause*
> 
> So i'm seeing that this card does support PCIe 3.0. Has Nvidia released a supported driver for PCIe 3.0?


The 680 supports PCIe 3.0, but only on IVB. The 690 runs PCIe 3.0 on both SNB-E and IVB.


----------



## PTCB

In two months' time, I'll grab one (hopefully, the stock will have improved by then).

On a side note, how much does EVGA charge for shipping (e.g. to FL)? Thanks.


----------



## CapnCrunch10

Quote:


> Originally Posted by *PTCB*
> 
> In two months' time, I'll grab one (hopefully, the stock will have improved by then).
> On a side note, *how much does EVGA charge for shipping (e.g. to FL)*? Thanks.


Ground shipping has always been $4.99 for me. The quicker options go anywhere from $10-$35.


----------



## jcde7ago

Quote:


> Originally Posted by *drufause*
> 
> So i'm seeing that this card does support PCIe 3.0. Has Nvidia released a supported driver for PCIe 3.0?


The 690 has a PLX PCI-E 3.0 bridge chip inside that replaces the NF200, so yes.


----------



## Callandor

Long time lurker, first time poster









Tigerdirect US link:

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=2580893&srkey=E145-0690

Just ordered! On iPad though so no capture of confirmation email


----------



## kemsoff

Quote:


> Originally Posted by *Callandor*
> 
> Long time lurker, first time poster
> 
> 
> 
> 
> 
> 
> 
> 
> Tigerdirect US link:
> http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=2580893&srkey=E145-0690
> Just ordered! On iPad though so no capture of confirmation email


+rep just ordered 1 as well


----------



## hitman2169

I can't order from tigerdirect.com because I'm living in Canada...? And tigerdirect.ca didn't have them in stock....


----------



## Callandor

Did you try the tigerdirect.ca link on page 31 of this thread:

http://www.tigerdirect.ca/applications/SearchTools/item-details.asp?EdpNo=2580893&srkey=E145-0690

It has the add to cart button available


----------



## cnh789

Quote:


> Originally Posted by *Callandor*
> 
> Long time lurker, first time poster
> 
> 
> 
> 
> 
> 
> 
> 
> Tigerdirect US link:
> http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=2580893&srkey=E145-0690
> Just ordered! On iPad though so no capture of confirmation email


awesome just ordered one too!


----------



## jcde7ago

Quote:


> Originally Posted by *Callandor*
> 
> Long time lurker, first time poster
> 
> 
> 
> 
> 
> 
> 
> 
> Tigerdirect US link:
> http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=2580893&srkey=E145-0690
> Just ordered! On iPad though so no capture of confirmation email


Thanks for the link! I'll go ahead and add you to the list as I wait for a confirmation pic.








Quote:


> Originally Posted by *kemsoff*
> 
> 
> +rep just ordered 1 as well


Grats, and added!








Quote:


> Originally Posted by *cnh789*
> 
> awesome just ordered one too!


Added as well, grats!









I've gone ahead and officially added the Tiger Direct US product link to the front page. *As a reminder, I have all the current e-tailers' product pages linked on the very first page of this thread, under "Availability."*


----------



## dred

Quote:


> Originally Posted by *Callandor*
> 
> Long time lurker, first time poster


I'm guilty of the exact same thing:
Quote:


> Originally Posted by *dred*
> 
> Was lurking the forum this morning and ended up getting a sig. edition.
> Quote:
> 
> 
> 
> Originally Posted by *Arizonian*
> 
> I'm going to cancel my pre-order. It's not shipping until the end of the month!
> 
> 
> 
> I'm thinking about doing the same. I jumped the gun.
Click to expand...

Except I didn't cancel my pre-order...



and, thanks to your link, will actually be able to hold one in my hands in a few days!



They stuck me with tax, but I'm not disappointed. Upgraded shipping as well!

Now I just need to find out where to get my hands on that sexy SLI bridge...


----------



## jcde7ago

Quote:


> Originally Posted by *dred*
> 
> I'm guilty of the exact same thing:
> Except I didn't cancel my pre-order...
> 
> and, thanks to your link, will actually be able to hold one in my hands in a few days!
> 
> They stuck me with tax, but I'm not disappointed. Upgraded shipping as well!
> Now I just need to find out where to get my hands on *that sexy SLI bridge*...


Grats, added.









And yeah, the SLI bridge has been asked about, but I don't think it is being sold; I know for sure that it is NOT being distributed with the cards by any of the board partners. SLI bridges normally come with motherboards, so that isn't really too surprising. Though I do think Nvidia and its partners could make a decent chunk of change by selling that awesome-looking SLI bridge, or others like it with a more modern design scheme...after all, SLI bridges have looked the same for pretty much the last ~7 years, and it might be time for a change...

EDIT: It's been about 45 minutes since TigerDirect listed the GTX 690 for sale, so either they have more than enough inventory to accommodate all the orders so far, or their internal stock tracking is terrible and they actually sold out within 2-3 minutes of listing it, in which case there may be a lot of disappointed buyers receiving cancellation emails. The remaining explanation, that no one wanted to buy a 690 while it was listed, is pretty much impossible.

Regardless, I've got my fingers crossed for you guys that ordered from TD.


----------



## Kyouki

Yeah, I'm iffy about TigerDirect, so I'm going to wait for the email from EVGA. I just don't see how they can have it in stock all this time when all the other websites sold out in minutes. And if it's $100 more to wait and get a backorder email, I'm OK! GL to everyone making the order on TD.


----------



## jcde7ago

Quote:


> Originally Posted by *Kyouki*
> 
> Yea I am iffy about tiger direct so i am going to wait for the email from EVGA. I just dont see how they can have it instock all this time but all the other websits sold out in mins. and it 100$ more to wait and get a backorder email im ok!. GL to everyone that is making the order on TD.


I would actually give them a call as well (for those that ordered from TD) just to be absolutely sure; it couldn't hurt at this point to verify.

I'll still keep the members who went through with TD purchases and posted pics on the Members list, unless otherwise requested...


----------



## TheRainMan

I also ordered from TD but I'm not gonna get my hopes too high...


----------



## dred

Just chatted with TD:
Quote:


> dred: I've ordered the GTX 690 today, and need to confirm that this is indeed in stock, and NOT a pre-order or OOS on backorder.
> Please wait while we find an agent to assist you...
> All agents are currently busy. Please stand by.
> You have been connected to Nathaniel Quizon.
> Nathaniel Quizon: Hello, and thanks for contacting us today, please hold on while I get that information for you.
> dred: Thanks
> Nathaniel Quizon: May we have your order number?
> dred: ********
> Nathaniel Quizon: We have checked your order.
> dred: OK?
> Nathaniel Quizon: The product will ship out within the next 2-3 business days.
> dred: So it's not a pre-order, or OOS on backorder?
> Nathaniel Quizon: *It is not a pre-order.*
> dred: OK, thanks a lot! You've been very helpful!


Looks promising


----------



## jcde7ago

Quote:


> Originally Posted by *TheRainMan*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I also ordered from TD but I'm not gonna get my hopes too high...


Grats, added.








Quote:


> Originally Posted by *dred*
> 
> Just chatted with TD:
> Quote:
> 
> 
> 
> dred: I've ordered the GTX 690 today, and need to confirm that this is indeed in stock, and NOT a pre-order or OOS on backorder.
> Please wait while we find an agent to assist you...
> All agents are currently busy. Please stand by.
> You have been connected to Nathaniel Quizon.
> Nathaniel Quizon: Hello, and thanks for contacting us today, please hold on while I get that information for you.
> dred: Thanks
> Nathaniel Quizon: May we have your order number?
> dred: ********
> Nathaniel Quizon: We have checked your order.
> dred: OK?
> Nathaniel Quizon: The product will ship out within the next 2-3 business days.
> dred: So it's not a pre-order, or OOS on backorder?
> Nathaniel Quizon: *It is not a pre-order.*
> dred: OK, thanks a lot! You've been very helpful!
> 
> 
> 
> Looks promising
Click to expand...

Okay, well, since he's saying it will ship within the next 2-3 days...it leads me to believe that TD might not necessarily have enough stock to fulfill the orders right now, but that they are expecting a decent shipment in the next day or two, coinciding with the May 7 "general availability" date...so that is indeed good news.


----------



## TheRainMan

Isn't the "real release" (AKA general availability date) tomorrow, and therefore won't there be 690s in stock in a few places at stock price?

Is the $100 "luxury tax" really worth it?

/buyers remorse :>


----------



## kemsoff

Quote:


> Originally Posted by *jcde7ago*
> 
> I would actually give them a call as well (for those that ordered on TD) to be absolutely sure. It couldn't hurt at this point just to verify.
> I'll still be keeping the members who did at least go through with TD purchases and posted pics on the members list, unless otherwise requested...


I called them right after I ordered, and they said mine will be here by Friday.


----------



## Arizonian

Quote:


> Originally Posted by *TheRainMan*
> 
> Isn't the "real release" tomorrow and therefore there will be 690's in stock in a few places at stock price?
> Is the $100 "luxury tax" really worth it?
> /buyers remorse


Newegg is selling them for $1,200, others higher.

The Signature comes with some extras for the $50 more: a T-shirt, a mouse pad, and hopefully a backplate (which I doubt, but I'm still hoping). At least that's a premium from EVGA with something to show for it, as opposed to paying more elsewhere with nothing.

No indication prices are going down.


----------



## Callandor

I just had an online chat with TD after seeing the responses, this is what I was told:

TD: Thank you for contacting us today, how may I assist you?
Me: Hello, I just wanted to confirm the order I placed earlier today for the new EVGA GTX 690 video card is not a pre-order and will ship this week. Order Number: ******
TD: Thanks. Please, give me a moment.
Me: thanks
TD: It is a pre-order. That is why, if you look at the shipping information, it shows: Usually ships within 2-3 days.
TD: I apologize for any inconvenience.
Me: ok, thanks. I presume I will get an email update regarding availability and shipping?
TD: When the item ships out, you will get an e-mail confirmation with the tracking number. Once the item is scanned by UPS you will be able to track it on their website and you will be able to see when it should be delivering.

Not sure if she did much more than look at the item page, though.

My order status shows the credit card information was verified and the order is being processed by the warehouse, so I guess I will see what happens tomorrow.

Don't think I will be up at midnight or 6am to see if EVGA puts more online.


----------



## eviltinky

...so as a TD pre-order, your credit card was just verified and will not be charged until it ships, correct?


----------



## hitman2169

My Visa has been charged... not sure if it's a pre-order or if they will send my card tomorrow?


----------



## Callandor

My card looks to have been charged as well, so I'll have to see how the TD status looks tomorrow


----------



## hitman2169

I have a pre-order with EVGA... I'm gonna take a look tomorrow at 6:00. $170 tax at TigerDirect... EVGA charges no tax, so I will go with whichever ships first


----------



## jcde7ago

Post your order confirmations so I can add those of you who ordered from TD to the list (if I haven't added you already).


----------



## hitman2169




----------



## jcde7ago

Quote:


> Originally Posted by *hitman2169*


Added.


----------



## eviltinky

here's mine!


----------



## jcde7ago

Quote:


> Originally Posted by *eviltinky*
> 
> here's mine!


Added, and welcome to OCN!


----------



## SimpleTech

For those looking to save a little, you can get 4% cashback through Fatwallet or Mr. Rebates at TigerDirect, which brings the total down to $1,055.99. TD also takes $10 off your order if you use a Visa card.
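For the curious, the arithmetic checks out; the listing price below is assumed from the $100-over-MSRP TigerDirect pricing discussed earlier in the thread:

```python
# Back-of-the-envelope check of the 4% cashback figure quoted above.
# Assumes TigerDirect's listing price of $1,099.99 (MSRP $999 + $100 markup).
list_price = 1099.99
cashback = round(list_price * 0.04, 2)   # 4% via Fatwallet / Mr. Rebates
effective = round(list_price - cashback, 2)
print(effective)                          # matches the $1,055.99 in the post

visa_promo = 10.00                        # the $10-off-with-Visa promotion
print(round(effective - visa_promo, 2))   # stacking both discounts
```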


----------



## Callandor

Think you added me earlier, but here is mine:


----------



## jcde7ago

Quote:


> Originally Posted by *SimpleTech*
> 
> For those looking to save a little, you can get 4% cashback on Fatwallet or Mr. Rebates with Tigerdirect. Brings the total down to $1055.99. TD also has a $10 off your order if you use a Visa card.


Thanks for sharing.









Also, I've heard various reports on other forums about the Tiger Direct orders - some people who chatted with TD support claim they were told that these 690 orders are PRE-ORDERS, with cards not shipping until May 31 (ouch), some have been told theirs won't arrive for a week or two...and obviously some were told they would be getting theirs in 2-3 days' time.

I would keep an eye on your order if you went with TD, especially since TD is charging people directly upon order placement rather than waiting until the order actually ships - don't let them hold $1k+ for weeks without shipping anything. If you don't see movement on your order within 2-3 days, I would call again, or cancel and look elsewhere.

That said, until we get a better idea of how TD will resolve all of this (they still have the 690 listed as available, which just can't be true at this point), I will not be removing any of the members who purchased from TD from the Members list.


----------



## hitman2169

May 31? Do you think EVGA will send my GTX 690 faster than that if I have a pre-order?


----------



## jcde7ago

Quote:


> Originally Posted by *hitman2169*
> 
> may 31 ? do you think evga will send my gtx 690 faster than that if i have a pre order ?


Well, EVGA's pre-orders for the Signature Edition are supposed to go out "towards the end of the month," according to EVGA themselves. So I would say...yes, they will probably get it before the 31st

Remember, though, the Signature Edition pre-order from EVGA differs from the standard GTX 690 they're selling only in that it costs $50 more for a mousepad, shirt, and custom box, and you have to wait three weeks longer for it. Other than that, the cards themselves are exactly the same, so I don't know if it's worth the wait...


----------



## Arizonian

Quote:


> Originally Posted by *jcde7ago*
> 
> Well, EVGA's Pre-Orders for the Signature Edition is supposed to go out "towards the end of the month," according to EVGA themselves. So I would say...yes, they will probably get it in before the 31st
> Remember though, Signature Edition pre-order from EVGA is only different than the standard GTX 690 they're selling in that it costs $50 more for a mousepad, shirt, and a custom box, and that you have to wait 3 weeks longer for it. Other than that, the cards themselves are exactly the same. So I don't know if it's worth the wait or not...


When the cards first launched, the pre-order came out first, and a bunch of people jumped the gun thinking that was the official launch. Moments later the official limited release showed up briefly, and it sold out within 15 minutes. Then for another 10 minutes or so the pre-order was still up, so people bought what was showing as available.

No one ordered the Signature thinking it was better or different; they are the same. Some bought it before EVGA showed the $999 card, thinking it was pre-order only. Others bought what was left shortly after.

I asked an EVGA rep, off the record, whether I should try on Monday or keep my pre-order, and was advised to keep my pre-order, as Monday's launch will be very, very limited again. Sounds like I'd rather wait than take my chances canceling and not getting lucky enough when it launches again. It sounded as if, should I not score on Monday's launch, my pre-order would come quicker than stock availability. For those that are waiting, good luck. It may not be until June if you don't score on Monday.


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> When the cards first launhed the preorder came out first. Had a bunch of people who jumped the gun thinking that was the official launch. Moments later the official limited release showed up briefly and it was sold out within 15 mins. Then for another 10 mins or do the Pre-order was still out so people bought what was showing available.
> No one orderd the Signature thinking it was better or different. They are the same. Some bought it before EVGA showed the $999 card thinking it was preorder only. Others. Boise what was left shortly after.
> I asked an EVGA rep off the record should I try on Monday or keep my preorder and was advised to keep my preorder as if Mondys lauch will be very very limited again. Sounds like I'd rather go for waiting over taking my chance canceling and not getting lucky enough when it launches again. Sounded as if I didnt score on Mondays launch my preorder would come quicker than stock availability if I wasn't lucky enough on Monday. For those that are waiting good luck. May not be until June if you don't score on Monday.


I'd be willing to bet there are some listed at 6AM tomorrow morning on EVGA's site.









Also, when the stock first showed up for the May 3rd limited release on EVGA's website, the Signature Edition didn't come out first; it was just listed ABOVE the standard GTX 690. I was F5'ing from 5:57 onwards, so I saw the links pop up right at 6AM exactly. I think people jumped on it because it was listed at the top, without even scrolling or looking down out of excitement. So it's easy to see how some people went for the pre-order one first, and by the time they went back and saw the standard listing, it probably sold out right as they were adding it to their cart.

I got lucky since the $999.99 price tag of the standard edition caught my eye first since it was what I was expecting to see, and I naturally went for it instead. Otherwise, I probably would have gone for the pre-order one first as well...


----------



## Arizonian

Yeah, I'm one of those that jumped the gun on launch day, so naturally I thought it was added after. Thanks for clearing up that it was my own excitement. I have a feeling they have my 690 but are waiting on the shirts and/or mouse pads. Darn it. I'd take the 690 and they can skip the shirt & pad.


----------



## hitman2169

So TD took my money... and I have no more $$ on my credit card for tomorrow at 6:00 AM on evga.com... and their customer service doesn't open until 7:00... :$ Seriously???


----------



## Kyouki

What time zone? 6 AM Central?


----------



## rwchui

This is why I always avoid dealing with TD; if I do, I give them a call first just to make sure they have the item in stock.


----------



## SimpleTech

Are you sure it's not a pending charge? Even Newegg extracts money from my PayPal Debit Mastercard but it gets put aside on hold until they ship it.


----------



## jcde7ago

Quote:


> Originally Posted by *SimpleTech*
> 
> Are you sure it's not a pending charge? Even Newegg extracts money from my PayPal Debit Mastercard but it gets put aside on hold until they ship it.


Yeah, sometimes this can be the case. Retailers may put a 24-hour hold on the item's total cost, just to be sure your payment won't bounce when it comes time for them to deduct it upon shipment.

Still, this won't help the people that ordered from TD if that's what happened, EVGA puts stock out tomorrow morning, and that money hasn't been released yet.


----------



## hitman2169

All TD customers with pending orders are gonna have to wait until May 31... I just got the email from them: ( Dear Valued Customer,

Thank you for your e-mail.
We do apologize for the inconvenience.
Unfortunately our records indicate that this item is on backorder and should be available on May 31 but nothing is confirmed yet.
You can chose to wait for the item or you may also contact one of our sales representatives at 1.800.800.8300 for further assistance finding a comparable item.)


----------



## jcde7ago

Quote:


> Originally Posted by *hitman2169*
> 
> All TD customer that have pending order gonna have to wait until may 31 .. i just got the email from them ... ( Dear Valued Customer,
> Thank you for your e-mail.
> We do apologize for the inconvenience.
> Unfortunately our records indicate that this item is on backorder and should be available on May 31 but nothing is confirmed yet.
> You can chose to wait for the item or you may also contact one of our sales representatives at 1.800.800.8300 for further assistance finding a comparable item.)


Yikes! Bad news indeed... :/

Hopefully the people that bought from TD can get their money back ASAP. Would be a shame for some of you to not be able to grab one tomorrow because you needed the money that TD held/withdrew already....

Let me know how it goes and what you TD buyers decide to do! If you want to be removed from the list, let me know; if you want to stay because you know you'll get a 690 from somewhere else anyway, no worries, I will keep you on. Best of luck with the hunt!


----------



## hitman2169

I told them to give me a refund ASAP... no reply back yet. I'll check tomorrow whether I have my money back. What time are they gonna be up on EVGA, Eastern time?


----------



## kemsoff

Quote:


> Originally Posted by *hitman2169*
> 
> All TD customer that have pending order gonna have to wait until may 31 .. i just got the email from them ... ( Dear Valued Customer,
> Thank you for your e-mail.
> We do apologize for the inconvenience.
> Unfortunately our records indicate that this item is on backorder and should be available on May 31 but nothing is confirmed yet.
> You can chose to wait for the item or you may also contact one of our sales representatives at 1.800.800.8300 for further assistance finding a comparable item.)


Eek, haven't received that email yet


----------



## dred

Quote:


> Originally Posted by *kemsoff*
> 
> EEk, havent recieved that email yet


Neither have I. I ordered around 11:45AM CDT.


----------



## eviltinky

I hope I'm one of the lucky ones for my TD order as shown below and haven't yet received an email for a delayed order. Anyway, I'll call TD tomorrow to confirm.


----------



## Kyouki

Yeah, I saw that coming. Wish you guys luck; I'm happy I decided to hold off. I posted earlier with no response: is it 6 AM Central time? I need to set my alarm!!!!


----------



## jcde7ago

Quote:


> Originally Posted by *hitman2169*
> 
> I told them to give me a refund ASAP... no reply back at this time .. ill try tomorrow if i have my money back.. what time they are gonna be up on EVGa . East time ?


EVGA had them up on their site at exactly 6AM Pacific Time (9 Eastern) on May 3rd during launch day. So I would start checking the site a few minutes before then, and just keep mashing F5. Be ready with your CC info, and make sure to sign in to your EVGA account (make one if you don't already have one) so that all you need to do is enter in that CC info.


----------



## hitman2169

Thanks for the info. Now I just need my $ back from TD :$... Before 9 AM I will be ready to F5!!!


----------



## TheRainMan

Ugh, what crappy news. My order is still "pending review," so I'll be up at 6 AM EST tomorrow morning to put an order in at EVGA and cancel my TD order


----------



## hitman2169

If I have a pre-order with EVGA for the Signature... can I buy one tomorrow if they have some in stock? Or am I stuck with the 1-per-household limit, and will I have to wait till the end of the month?


----------



## dred

Quote:


> Originally Posted by *hitman2169*
> 
> If i have a pre-order with Evga for the signature .. can i buy 1 tomorrow if they have some in stock .??. Or im stuck with the 1 Limits per Household .. then iwill have to wait till the end of the month ??


I believe you'll have to cancel the pre-order to purchase the regular version.


----------



## jcde7ago

Quote:


> Originally Posted by *hitman2169*
> 
> If i have a pre-order with Evga for the signature .. can i buy 1 tomorrow if they have some in stock .??. Or im stuck with the 1 Limits per Household .. then iwill have to wait till the end of the month ??


That's actually a good question, and the answer is...yes, they will count the pre-order as part of the limit. So you will not be able to order a standard edition one tomorrow unless you have canceled your Signature Edition order. This was confirmed by EVGA on release day.


----------



## hitman2169

Ahhh :$ I already sold my laptop to a friend... so I need my card to complete my build before May 15 for Diablo 3







Thanks for all your answers


----------



## Cheesemaster

So, I grabbed two more Acer 3D monitors; they will be here Wednesday. I should have two Sig 690s coming soon. I am running 3-way GTX 580 3GB cards till I get my 690s. 3D Surround! GTX 690s! OMG, I am juiced! I would advise sticking with EVGA on this one and just getting it straight from them. Quad 690s... wow, I sure have come a long way. Congrats to you all, can't wait to start posting up scores!


----------



## AppetiteNZ

Check out the prices for these babies in NZ

http://www.computerlounge.co.nz/components/componentview.asp?partid=16702










Almost $2000 US

$2499NZD
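As a rough sanity check on that conversion (the exchange rate below is an assumed ballpark of ~0.80 USD per NZD for the period, not an official figure):

```python
# Quick sanity check on the NZ price quoted above. The exchange rate is an
# ASSUMED ballpark for mid-2012, not an official rate.
nzd_price = 2499.00
usd_per_nzd = 0.80
usd_equiv = round(nzd_price * usd_per_nzd, 2)
print(usd_equiv)   # roughly the "almost $2000 US" mentioned in the post
```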


----------



## TheRainMan

EDIT: derp


----------



## V3teran

Quote:


> Originally Posted by *Cheesemaster*
> 
> So, I grabbed two more Acer 3d monitors , they will be here Wednesday. I should Have two sig 690's coming soon.. I am running 3-way 580gtx 3gb versions till i get my 690's. 3d surround! 690gtx's! OMG! I am juiced!.. I would advise stick with Evga on this one and just get it straight from them.. quad 690's wow I sure have come along way.. congrats to you all, cant wait t to start posting up scores!


Yeah, I'm thinking of ordering another one. Trouble is, they have gone up in price at the place I bought them from, so I've asked if I can buy a second one at the price I paid for the first. Should find out tomorrow, as it's a bank holiday over here in the UK.


----------



## Lazz767

Quote:


> Originally Posted by *eviltinky*
> 
> I hope I'm one of the lucky ones for my TD order as shown below and haven't yet received an email for a delayed order. Anyway, I'll call TD tomorrow to confirm.


Nah, mine was in that status yesterday. Today it's "All Backordered." Kinda surprising since I had to hunt for the link. I'm thinking maybe they're getting a lot of stock today. I can hope, right? lol


----------



## sofakng

So... is this the best place to check for updates on GTX 690 availability?


----------



## dred

Yup. Got bit with the backorder, though.


----------



## dred

48 hours to get your cash back from TD


----------



## Callandor

When is EVGA supposed to list their stock today? Been hitting F5 for 15 minutes and nothing.


----------



## sofakng

I've been refreshing eVGA for over an hour, heh.

Oh well... what can you do!


----------



## Callandor

Bummer, I am at work and can't keep refreshing all day. Guess I will just check every 15 minutes or so and see if I get lucky.

I always have my TD backorder to fall back on....


----------



## jcde7ago

Looks like EVGA may not have stocked any at 6 AM this morning like they did on launch day :/

Regardless, general availability is today, so I would at least expect a few e-tailers to list them at some point...and you guys may just have to get lucky....

There are also a bunch of browser add-ons/extensions that will watch for changes on websites and notify you (I forget their names), so that may be worth a look for some of you. I'll be at work and will check in on this thread from time to time, but good luck to those hoping to snag one today!

I will have unboxing pics of my own 690 up later this evening.
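The page-watcher extensions mentioned above boil down to fetching the page periodically and comparing a hash of the body. A minimal sketch (the URL is a placeholder, and the polling loop is left commented out since it needs live network access):

```python
# Minimal sketch of a "page changed?" watcher, like the browser
# extensions mentioned above.
import hashlib

def fingerprint(html: bytes) -> str:
    """Hash the page body so successive fetches can be compared cheaply."""
    return hashlib.sha256(html).hexdigest()

def changed(old_hash: str, html: bytes) -> bool:
    """True if the page content no longer matches the stored hash."""
    return fingerprint(html) != old_hash

# Example polling loop (hypothetical URL, commented out):
#
# import time, urllib.request
# url = "http://www.evga.com/products/"   # placeholder, not the real listing
# last = None
# while True:
#     body = urllib.request.urlopen(url).read()
#     if last is not None and changed(last, body):
#         print("Page changed - stock may be up!")
#     last = fingerprint(body)
#     time.sleep(60)
```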


----------



## Lazz767

Just took a look at the TD site (Canadian) and they're finally showing it as currently out of stock. Here's hoping that it means they're going to get a fairly large shipment.


----------



## dred

Quote:


> Originally Posted by *Lazz767*
> 
> Just took a look at the TD site (Canadian) and they're finally showing it as currently out of stock. Here's hoping that it means they're going to get a fairly large shipment.


Same with the US site.


----------



## Callandor

Just called TD to check on my order, since the status is backorder but I have not received an email like others have.

I did receive an email from them on additional items I might be interested in with this purchase









The agent tells me his system shows they are supposed to get a replacement order some time next week, so I should check back on the status then.

Take it with a grain of salt, but it may not be as bad as waiting till 5/31

Also, I still haven't seen any posted on EVGA


----------



## Cheesemaster

I looked on Nvidia's website, and they say you have to use two dual-link DVI cables and one DisplayPort cable to enable 3D Surround, all from card #1. I am going to run quad-SLI with the two GTX 690s, and I have Acer 3D monitors on the way; how am I going to be able to run 3D Surround?


----------



## Inglewood78

Damn, what happened to "broad availability?"


----------



## jcde7ago

News from EVGA Jacob:
Quote:


> This week
> 
> 
> 
> 
> 
> 
> 
> RT @sofakng: @EVGA_JacobF any eta on more 690 stock?




__ https://twitter.com/i/web/status/199537388983300096
Nothing more specific than "this week," unfortunately. Looks like sofakng from here tweeted him, lol.


----------



## sofakng

Quote:


> Originally Posted by *Inglewood78*
> 
> Damn, what happened to "broad availability?"


I wouldn't get my hopes up for availability today (at least from eVGA).

Jacob just said on twitter that the 690's will be back in stock "this week".... sigh.


----------



## Lazz767

I wonder how many they actually sold on Thursday.


----------



## Arizonian

Quote:


> Originally Posted by *Inglewood78*
> 
> Damn, what happened to "broad availability?"


Makes me a bit glad I didn't cancel the Signature preorder mistake I made on launch day.

The hint from customer care regarding my preorder paid off: they suggested I hang onto it rather than cancel and try for the $999 version. I think he knew something but couldn't come out and say it.

*whew*

Hopefully they fulfill the preorders that went through on launch day first, before accepting new orders.

This is one of the best darn-looking cards EVER released. It has the best scaling performance compared to its SLI brethren and perfect power consumption for my AX850 PSU. It will simply perform and look great in my rig, which was carefully chosen to perform and look as good as it gets.









Motherboard, CPU and GPU were all launch-day purchases. Feels like I'm living on the edge.


----------



## jcde7ago

Quote:


> Originally Posted by *Lazz767*
> 
> I wonder how many they actually sold on Thursday.


EVGA? Or 690s in general? In general, I am going to go out on a limb and say that fewer than 50 were sold on May 3rd. rwchui got one and has posted pics (though his was from a local shop, I believe), and I will be posting pics in about 7-8 hours from now.

Other than that...I have not seen too many people actually posting pics on any other forum at all. I think someone confirmed that on Thursday NewEgg only had around 12-15 to sell nationwide as well.

It doesn't make sense, though: May 3 was just a "limited release" day, meant only to make the cards purchasable on the actual launch day. General availability was supposed to be today, and a LOT of sources echoed this.

I would stay persistent, at least for today, as these have to pop up SOMEWHERE at some point...


----------



## rwchui

Yes, I have the card in my possession at the moment; I got it from my local retail store on Friday, when they received their shipment of only 5 cards. However, they said they do not expect another shipment until May 31st.

A case in point for my read on NVIDIA's *wide availability*: the shipment that arrived at TD yesterday was only about 10 cards, and they are OOS already.

In short, these cards are ultra rare.

Here are all the local Canadian online/retail stores:

http://www.ncix.com/products/?sku=71219 *$1049.99*
http://www.tigerdirect.ca/applications/SearchTools/item-details.asp?EdpNo=2580893&srkey=E145-0690 *$1099.99*
http://www.newegg.ca/Product/Product.aspx?Item=N82E16814130781 *$1199.99*
http://www.directcanada.com/products/?sku=11610BD6413&vpn=04G-P4-2690-KR&manufacture=EVGA *$997.50*
http://www.infonec.com/site/main.php?module=detail&id=459331 *$1250.00*

All OOS.


----------



## sofakng

Quote:


> Originally Posted by *jcde7ago*
> 
> I would stay persistent, for at least today, as these have to pop up SOMEWHERE at somepoint...


I hate to be negative, but it looks like eVGA is the only company currently producing the GTX 690 (although ASUS has shown up on one site, it's listed as backordered), and if their own store is out of stock and only has estimates of "this week," I'd be skeptical that they would give other stores more inventory rather than sell it themselves.

Hopefully I'm wrong and we see cards today, but I too am a bit disappointed with the "May 7th wider availability" (aka. zero availability).

Oh well - it's just a video card that we can all wait a bit longer for. Disappointing, but not life-altering.


----------



## jcde7ago

Quote:


> Originally Posted by *sofakng*
> 
> I hate to be negative, but it looks like eVGA is the only company currently producing the GTX 690 (although, ASUS has showed up on one site but is listed as backordered), and if they're own store is out-of-stock and only has estimates of "this week", I'd skeptical that they would give other stores more inventory rather than sell it themselves.
> Hopefully I'm wrong and we see cards today, but I too am a bit disappointed with the "May 7th wider availability" (aka. zero availability).
> Oh well - it's just a video card that we can all wait a bit longer for. Disappointing, but not life altering


While EVGA direct is saying they will have more "this week," it doesn't necessarily mean that they didn't already deliver a good amount to various retailers to sell today.

I don't see how pretty much every review site and forum could list May 7 as "general availability" and yet not a single store actually stocks any on that day. Mark my words...there will be a window or two today where these will go on sale...


----------



## Callandor

Quote:


> Originally Posted by *sofakng*
> 
> Oh well - it's just a video card that we can all wait a bit longer for. Disappointing, but not life altering


True, but I haven't upgraded in 4 years and am running a Q6700 with a single GTX 280.

I got the building fever and the only thing that can cure it, is more GTX 690!

Got the 3770k ordered and leaning heavily toward the Sabertooth Z77.

Of course, as you say, I can put together the rest and live with my GTX 280 for a while; it ran the Diablo 3 beta without issue, and that game will hold me for a while.


----------



## Arizonian

Quote:


> Originally Posted by *Callandor*
> 
> True but I haven't upgraded in 4 years and am running a Q6700 with a single GTX 280
> I got the building fever and the only thing that can cure it, is more GTX 690!
> Got the 3770k ordered and leaning heavily toward the Sabertooth Z77.
> Course, as you say I can put together the rest and live with my GTX 280 for awhile, it ran the Diablo 3 beta without issue and that game will hold me for awhile.


I somewhat feel your pain. I'm running on the integrated graphics of my i7 3770K chip right now. No gaming for me until the GPU arrives. Luckily I can still stream and surf. Heh.

Good combo choices for your new build. It's working great for me. Can't wait to run the 690's dual GPUs at PCIe x16 in the first slot.


----------



## Callandor

Quote:


> Originally Posted by *Arizonian*
> 
> I some what feel your pain. Im running on the intergrated graphics off my i7 3770K chip right now. No gaming for me until the GPU arrives. Luckily I can stream and surf still. Heh.
> Good combo choices for your new build. Working great for me. Can't wait to run the 690 dual GPU in PCIe x16 in the first single slot.


Yeah, I am not too put out about the 690 wait, since Diablo 3 performed so well on the 280; plus, I am trying to take some time with my parts decisions this go-round instead of rushing in

(except with the 690 anyway)









I've heard good things about the Sabertooth, so I'm about to pull the trigger on that.

Still considering Case, power supply, and RAM

Thinking NZXT for the case, I like the Phantom and the Switch 810

Also thinking about watercooling for the first time; I'm leaning toward the idiot-proof H80 or H100, but may jump into custom after I read up on it.

The 690 will stay on the stock cooler though; I can't get rid of that beauty!


----------



## tonyjones

Someone is selling one for $1400 shipped on [H]; too much for my blood, I'm paying MSRP only! Ha.


----------



## Lazz767

Quote:


> Originally Posted by *tonyjones*
> 
> Someone selling it for $1400 shipped, on [H], too much for my blood, I'm paying MSRP only! ha


$1400 and they opened the box? No way.


----------



## sundrou

uk.PNG 21k .PNG file

GRRRRRR lol, I ordered the second one also and you didn't add me, shame on you









us.PNG 67k .PNG file


----------



## jcde7ago

Quote:


> Originally Posted by *sundrou*
> 
> uk.PNG 21k .PNG file
> 
> GRRRRRR lol i have order the second one also and you dont add me in shame on you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> us.PNG 67k .PNG file


Grats and added! You probably were not added before because you hadn't posted anything before, lol...unless I didn't see it.


----------



## Arizonian

@lazz - congrats, first member to show a pic.









On a side note, here is a recent email response I received just a little bit ago.
Quote:


> Hi,
> 
> The Nvidia graphics card you requested is the newest on the market by Nvidia at this time. We will not have these cards available until late June/early July and the price on them is $1190.00 plus shipping. If you wish to purchase the card I would be more than happy to help you out with that and get the order in. If you have any further questions please feel free to email me back or call me directly at the number listed below.
> 
> Chris Fisher
> Sales Associate, IT, Web Support
> STAR TECH INC.
> 1490 N. Hermitage RD ~ Hermitage, PA 16148
> Phone: 724-962-0566 Ext. 107/108


At first I thought it was from FrozenCPU, but theirs is on sale for $999. Newegg was the only other vendor I put an Auto-Notify on, and the only other vendor, at $1199. I'm not sure who I received this from after all.







Sorry, had to clarify the info; I don't want to post the wrong info.

Can't recall who this was after all.


----------



## TheRainMan

This day has been the biggest bonerkill ever


----------



## CapnCrunch10

Very odd. I was surprised that EVGA posted a second wave of 690s on Friday as well. Sold out in about 5 minutes compared to the 10 from the 3rd.

Even more interesting is that Zotac 690s started showing up on eBay as well. I thought it was just ASUS and EVGA selling them (though I have yet to see an ASUS one).

Very odd indeed. Keep a lookout, you guys! Just stay close to your email and you'll be notified as soon as they're available.


----------



## jcde7ago

Quote:


> Originally Posted by *CapnCrunch10*
> 
> Very odd. I was surprised that EVGA posted a second wave of 690s on Friday as well. Sold out in about 5 minutes compared to the 10 from the 3rd.
> Even more interesting is that Zotac 690s started showing up on eBay as well. I thought it was just ASUS and EVGA selling them (though I have yet to see an ASUS one).
> Very odd in deed. Keep a look out you guys! Just be close to your email and you'll be notified as soon as they're available.


EVGA and Asus in North America. There are other board partners for Europe/Asia.


----------



## TheRainMan

I've got C4C running every 5 seconds on this thread (for purchase updates) and on the EVGA site. I really want to snag one of these (I cancelled my TD order because it's OOS, so take me off the list until I secure one).


----------



## 3930K

Yes, there are.
According to my favourite e-tailer, there are:

Asus, EVGA, MSI, Gainward and Gigabyte.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *jcde7ago*
> 
> Not sure if serious. You do realize that 680s aren't any easier to find, right? Please leave this thread if you're going to just threadcrap, before I just ask everyone to report you.


Agreed.

Man do I want this card, what a beast. Make sure you guys do some 3dmark benchies when you get yours in!


----------



## Arizonian

OP jcde7ago is doing a great job! He's been supportive and very fast to update and accommodate our requests. +1 rep, bud.

When we start getting our cards, I've no doubt you'll do great keeping up with the benchmarks we will be introducing. Keep up the good work.


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> OP Jcde7ago is doing a great job! Been supportive and very fast to update and accomedate our request. +1 rep bud.
> When we start getting our cards, I've no doubt you'll do great keeping up with the benchmarks we will be introducing. Keep up the good work.


Thanks Arizonian, I've been getting nothing but support and kind words from everyone else in this thread...I definitely hope this becomes the one-stop place for all things 690 on OCN...that front page will soon be filled with benchmarks and it will constantly be updated with new info, that's for sure!!!


----------



## ceteris

Yeah, JC! You're a cool cat with me, yo. Being willing to let people into the club with just an order confirmation shows that you are fair and far from elitist, despite how exclusive the 690 is. Just need a moderator to join to flush the crap out.


----------



## CapnCrunch10

Quote:


> Originally Posted by *jcde7ago*
> 
> Thanks man, much appreciated!


Don't worry about it man. Like I said, just let it go. You're doing a fantastic job at updating and maintaining this thread.

Now this, I cannot believe. The original pair of 690s that sold on eBay for $2000 each was a mistake; the guy relisted the cards for the same price. But how did this guy sell three 690s for $1900 each (at least two of them)? People are either really dumb or there are a lot of people spoofing these auctions. I'd drop my card in a second for a $1900 offer. I'd even wrap a pretty little bow on top too.


----------



## jcde7ago

Quote:


> Originally Posted by *CapnCrunch10*
> 
> Don't worry about it man. Like I said, just let it go. You're doing a fantastic job at updating and maintaining this thread.
> Now this, I cannot believe. The original pair of 690s that sold on eBay for $2000 each was a mistake. The guy relisted the cards for the same price. But, how did this guy sell three 690s for $1900 each (at least two of them). People are either really dumb or there is a lot of people spoofing these auctions. I'd drop my card in a second for a $1900 offer. I'll even wrap a pretty little bow on top too.


Man, for $1,900...I would have no problems whatsoever selling mine for that amount, but ONLY if someone offered that much...I mean, yeah, I'd feel kinda bad, of course...but I also don't think I could walk away from that much money if it was offered to me.


----------



## tonyjones

jcd, amen, you get a +1 rep lol


----------



## V3teran

There is no way I'm selling mine when I get it; I'm going to enjoy it to the max.


----------



## kemsoff

EVGA has them right now! Just ordered one; hopefully, unlike the TigerDirect one, this one will work out.


----------



## ceteris

YEAHH, BABY~! I'M IN THE CLUB~!!!!











One down, one more to go!


----------



## TheRainMan

Crap, I was playing a game and didn't notice my phone go off. God damn it


----------



## Arizonian

Quote:


> Originally Posted by *ceteris*
> 
> YEAHH, BABY~! I'M IN THE CLUB~!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One down, one more to go!


Was that order for today? They had some in stock?


----------



## Kyouki

OMG!!! I am so mad at myself! I was glued to my phone and email all day! I had it no more than 12 inches from me, but I decided to play some LoL and didn't hear the e-mail. I saw my phone light up 3 min after the email arrived, rushed to the site, and they were sold out again. GRRRR!! What was I thinking? Now I bet it will be forever till I can order. Please get more in stock this week!


----------



## kemsoff

Quote:


> Originally Posted by *Arizonian*
> 
> Was that order for today? They had some in stock?


They did, about 20 min ago, for about 2 min.


----------



## TheRainMan

Quote:


> Originally Posted by *Kyouki*
> 
> OMG!!! I am so mad at my self! I was glued to my phone and email all day! I have it no more then 12 inches from me but decided to play some LOL, and i didn't hear the E-mail then i seen it light up on phone 3 min after i got it and i rushed to site and they are sold out again. GRRRR!! What was I thinking. Now I bet it will be forever till I can order please get more in-stock this week!


ahaha, this is actually exactly what happened to me play by play

You know what would've been crazy: if we had been in the same game.


----------



## ceteris

Quote:


> Originally Posted by *Arizonian*
> 
> Was that order for today? They had some in stock?


Yep! I was cyberloafing on some sites looking at watercooling parts when I heard the *ding* on my iPhone. I tabbed over in the browser to check my e-mail, read the subject line, immediately F5'd the EVGA store page, and voila~! I think I hit a new personal WPM record when I filled out my information.


----------



## jcde7ago

Grats to those who snagged one from EVGA's restock today!

I told you guys, if there was one day you should stay persistent with that F5'ing, today was the day...









If there are new members that need to be added just let me know.


----------



## TheRainMan

*sigh* I just can't get over how unlucky I am D:

Wake up at 5 am, at the computer F5-ing the EVGA page all day and the instant I decide to play a game of League, boom, back in stock.


----------



## Kyouki

Quote:


> Originally Posted by *TheRainMan*
> 
> ahaha, this is actually exactly what happened to me play by play
> you know what would've been crazy, if we were in the same game


Haha, yeah, that would be nuts, but you would have known because I was yelling in game that I missed out! Hahaha, good luck next time. I won't give up!


----------



## kemsoff

Quote:


> Originally Posted by *TheRainMan*
> 
> *sigh* I just can't get over how unlucky I am D:
> Wake up at 5 am, at the computer F5-ing the EVGA page all day and the instant I decide to play a game of League, boom, back in stock.


Yeah, that definitely stinks. I almost didn't even check my email when it came through; I didn't think it would be for this.


----------



## ceteris

Quote:


> Originally Posted by *TheRainMan*
> 
> *sigh* I just can't get over how unlucky I am D:
> Wake up at 5 am, at the computer F5-ing the EVGA page all day and the instant I decide to play a game of League, boom, back in stock.


If you are ordering directly from EVGA, their e-mail notification is instant. This is the 2nd time I've purchased from them this way, and I was successful both times. Newegg, although I doubt you are watching their site, sometimes won't notify you and will randomly update their stock. I have yet to receive a single notification from Tiger Direct or Amazon (when you can even sign up for notification).


----------



## jcde7ago

Alright so....

It's about 6:30 PDT here or so, UPS still has not shown up. My card is on vehicle for delivery though, so it should be here within the next hour.

I have spent the last 2+ hours rebuilding *half* of my water loop and leak testing it, as my GTX 590 was part of it and I will now be running just the 3930K under H2O. While the CPU will run a bit cooler than before, it also means more work once 690 water blocks are released, since I will have to rebuild my loop yet again. That's about the only downside to watercooling that I can think of, though...a lot of work at times, but once set up...well worth it.









Here is my goodbye to the 590 after it was taken out of my loop, and my rig, ready and waiting for the 690 to arrive (sorry for the semi-crappy pics; iPhone 4 FTL):





I will take pics of the package and update the thread as soon as it gets here.


----------



## tahoward

Will be receiving mine tomorrow afternoon. Time to dust out the hardware.


----------



## jcde7ago

Murder, she wrote:











Go time:



I will have benchmarks late tonight or sometime tomorrow....but folks, this card is pure quality. I've owned many a GPU...but I've never had the feeling I got when I held this beast in my hands. For now though, I'm going to disappear for a couple of hours...and do a bit of what I bought this card for...GAMING!


----------



## Kyouki

Looks awesome! Congrats and enjoy!

Oh yeah, if you could tell us how you feel about the 2nd GPU exhausting into the case and how hot it makes the case? Enjoy it first, but that is one of my questions for you guys who have them! I have the SilverStone TJ10 ESA edition, which has a graphics-card intake fan to cool the card, but I wonder if I should flip the fan to exhaust the 2nd GPU's air out the front.


----------



## Arizonian

I know you're not online atm - but congrats, JCD. Crank all your settings as high as they go and enjoy the eye candy.







Fluid sweetness.

Can't wait for NVIDIA to implement TXAA, which will take the place of MSAA: the image quality of 16x MSAA at only a 4x MSAA performance hit.









Real-time destructible environments will be simply amazing once coded by developers as well. Exciting times to own an NVIDIA card, to say the least.


----------



## itzhoovEr

Quote:


> Originally Posted by *TheRainMan*
> 
> *sigh* I just can't get over how unlucky I am D:
> Wake up at 5 am, at the computer F5-ing the EVGA page all day and the instant I decide to play a game of League, boom, back in stock.


Former TSM TheRainMan?


----------



## Sh1n1ng Forc3

Got the card today from Newegg, and after a clean install of Windows (migrating from AMD Eyefinity to NVIDIA), I can confirm this card is a beast. It overclocks nicely with very little effort and runs cool and stable. Surround works great with SLI enabled (someone on Newegg posted a review saying it doesn't). I am running off the 3 DVI ports and everything was painless. Great job, NVIDIA!!!


----------



## ceteris

Quote:


> Originally Posted by *jcde7ago*
> 
> I will have benchmarks late tonight or sometime tomorrow....but folks, this card is pure quality. I've owned many a GPU...but i've never had the feeling I got when I held this beast in my hands. For now though, i'm going to disappear for a couple of hours...and do a bit of what I bought this card for...GAMING!


Grats, dude! If you have old screenshots of your 590's benches, please post those too!









....and please add me to the club!


----------



## TheRainMan

Quote:


> Originally Posted by *itzhoovEr*
> 
> Former TSM TheRainMan?


No, to be honest I just jacked his name because I like the way it flows (that, and the fact that I'm unoriginal as hell).

It's the same with all my other internet aliases, except for the original, which I shall keep hidden away in my brain, for I am ashamed of it.


----------



## error-id10t

Sorry, I haven't read all of the pages, but are there people with full-cover water blocks on these already, or planning to get one? I just did a quick search and couldn't find any, but I'm guessing it's too early.


----------



## phantomphenom

I'm so jealous of you guys who got them.....lol. I'll wait a month to see what happens....


----------



## sundrou

Quote:


> Originally Posted by *jcde7ago*
> 
> Grats and added! And you probably were not added before because you didn't post anything before, lol..unless I didn't see it.


It's ok, I think you missed my previous post.


----------



## jcde7ago

Quote:


> Originally Posted by *ceteris*
> 
> Grats, dude! If you have old screenshots of your 590's benches, please post those too!
> 
> 
> 
> 
> 
> 
> 
> 
> ....and please add me to the club!


Thanks man, and apologies for not adding you earlier, but I will do so now.








Quote:


> Originally Posted by *error-id10t*
> 
> Sorry, haven't read all of the pages but are there people with full-cover water block on these already or planning to get it? Just did a quick search and couldn't find any but I'm guessing it's too early.


There are no water blocks available for the 690 yet. It will probably be a couple of weeks, end of May, I would guess.









Also, I did not get around to benching much tonight, since I was enjoying how this card just ate every game as if you were asking it to run Minesweeper. I mean, Battlefield 3 @ 2560x1440, 4xAA, 16xAF, everything on Ultra...smooth as butter. This card didn't break a sweat...like I texted my friend...it literally just ATE Battlefield 3.









That said, I did get to run two (2) benchmarks with 3DMark 11; both runs are at ABSOLUTE STOCK settings, meaning no clock speed increases at all. I closed all background tasks/apps and ran the 3DMark 11 Performance (P) benchmark twice, once with PhysX on and once with PhysX off.

Keep in mind that I am using the Basic (Free) Edition of 3DMark 11, and also that it doesn't yet seem to recognize the GTX 690 or the 301.34 GeForce drivers as official when the scores were uploaded at the end of the runs.

PhysX On:



PhysX Off (CPU Physics):



Again, these are my first two runs ever with this card, no tweaks at all, so take them for what they are. I will run more 3DMark 11 benches during the course of the week, including at max OC clocks (and I'll run them on my full version key, lol; I just never reinstalled 3DMark 11 after reformatting not too long ago). More benches to come later, including Heaven runs and whatnot.

That said, 'twas a busy night rebuilding my loop/cleaning my rig and playing around with the 690 for a couple of hours, but now I'm off to bed because...I've got a full-time day job like the rest of you.


----------



## Arizonian

Sweet stock bench. Your *graphics score 16477* alone.









It's going to be interesting to see how she overclocks and what core clocks we can achieve, along with the GPU Boost clocks possible. Seeing how well she does vs. two 680s or two 7970s will be intriguing.









Though I will add, regardless, this is one bad-arse-looking card. Even with slightly less performance than two cards in SLI or Crossfire, it will suit my needs very nicely. On a single 120Hz 1920x1200 monitor, in 2D gaming or 3D Vision gaming, it's going to be great. 2GB for me will be more than plenty; a misconception on single monitors is that 3D Vision gaming eats more VRAM, when it doesn't.

Keep us posted bud if you can tear yourself away from gaming.


----------



## brettjv

Did a little thread cleanup of some off-topic stuff earlier. Just a heads up if you see some posts missing









Grats on the new card JC ... keep them benchies coming.

BTW, looking forward to seeing what top clock that baby hits at full OC/boost.









Also, PhysX on/off shouldn't affect 3DMark 11 at all ... it's pure CPU physics in this suite, unlike Vantage. My guess is your score difference is just margin of error.


----------



## fuzzybass

Oh... some of my posts got deleted. Heh. Oh well.


----------



## Basti

This wait is killing me. I am getting jealous seeing you guys already have yours.
Anyways, I got an email from NCIX that I'm 6th in the queue of the 30 orders they have, but they don't know when they're going to deliver.


----------



## Suit Up

Ordered a Gigabyte GTX690 today.
I wrote a small bash script a couple of days ago to check the PCCG website every couple of minutes to see if they had added a GTX 690 to their inventory. Worked a treat.
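For anyone who'd rather roll their own checker than use a browser extension, here's a rough sketch of what a script like that might look like. The URL, the "Out of Stock" marker, and the interval are just placeholder assumptions; every store's page is different, so adapt them to whatever site you're watching:

```shell
#!/bin/sh
# Stock-check sketch. URL and MARKER are hypothetical placeholders;
# point them at the actual product page and its out-of-stock text.
URL="https://www.example.com/gtx-690-product-page"
MARKER="Out of Stock"
INTERVAL=120  # seconds between checks; be polite to the store's servers

# Succeeds (returns 0) when the out-of-stock marker is absent
# from the page text passed in as $1, i.e. the card may be in stock.
check_page() {
    ! printf '%s' "$1" | grep -q "$MARKER"
}

# The polling loop itself; uncomment to run for real:
# while :; do
#     page=$(curl -s "$URL")
#     if check_page "$page"; then
#         echo "GTX 690 may be in stock - go look!"
#         break
#     fi
#     sleep "$INTERVAL"
# done
```

Nothing fancy, and no substitute for an instant e-mail notification like EVGA's, but it beats hammering F5 by hand all day.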


----------



## krsboss

....GTX 690 is on sale for $1600 in Australia....it would be more cost efficient to buy one in England and have it shipped....or buy one in the USA and have it shipped...

...also, you can buy a GTX 680 in Australia for $600 (if you can find one in stock)....you could nearly buy 3 GTX 680s for the price of one GTX 690...I think the price translation has gone awry somewhere...I seriously dislike the disparity between exchange rates and prices for computer parts (& games) in Australia....


----------



## PTCB

Quote:


> Originally Posted by *krsboss*
> 
> ....GTX 690 is on sale for $1600 in Australia....it would be more cost efficient to buy one in England and have it shipped....or buy one in the USA and have it shipped...
> ...also, you can buy a GTX 680 in Autralia for $600 (if you can find one in stock)....you could nearly buy 3 GTX 680s for the price of one GTX 690...I think the price translation has gone awry somewhere...I seriously dislike the disparity between exchange rates and prices for computer parts (& games) in Australia....


How about 1,950 AUD? Across the ditch it's much worse, my friend. That's one of the reasons why I will either join my family in the US or go to AUS.


----------



## pat031

Has anyone tested with the overclocked Catleap Korean monitor to see if you can pass the 400 MHz pixel clock?


----------



## krsboss

Quote:


> Originally Posted by *PTCB*
> 
> How about 1,950 AUD? Across the ditch is much worse, my friend. That's one of the reasons why I will either join my family in the US or go to AUS.


wow! and I thought we were getting rooted







...at least at that price you could fly over here, buy one and take it back


----------



## dred

My pending charge from TD isn't displaying anymore (from my bank). My backorder is still valid. Guess they'll charge me when they ship.


----------



## Crack_Fox

Wait till the UK prices are up, then you'll sheet brix.


----------



## Lazz767

Quote:


> Originally Posted by *dred*
> 
> My pending charge from TD isn't displaying anymore (from my bank). My backorder is still valid. Guess they'll charge me when they ship.


I just emailed Tiger Direct to see what the status is on the next shipment. This is what I got back:

"Dear Valued Customer,
We do apologize for the inconvenience.
Unfortunately our records indicate that this item is on backorder and should be available on 05/31/12.
You can choose to wait for the item, or you may also contact one of our sales representatives at 1.800.800.8300 for further assistance finding a comparable item."


----------



## kemsoff

Quote:


> Originally Posted by *Lazz767*
> 
> I just emailed Tiger Direct to see what the status is on the next shipment. This is what I got back
> "Dear Valued Customer,
> We do apologize for the inconvenience.
> Unfortunately our records indicate that this item is on backorder and should be available on 05/31/12.
> You can chose to wait for the item or you may also contact one of our sales representatives at 1.800.800.8300 for further assistance finding a comparable item."


This happened to me as well. I called TD, as when I ordered it said mine would be delivered this Friday.
After they pre-authorized my card, they told me the same. I canceled my order with them; I find not telling the customer in advance that they don't have the item in stock completely unacceptable.


----------



## Lazz767

Quote:


> Originally Posted by *kemsoff*
> 
> , this happened to me as well. I called td as when I ordered it said mine would be delivered this Friday.
> After they pre authorized my card then told me the same. I canceled my order with them I find not telling the customer in advance before we order that they don't have them in stock completely unacceptable.


I agree; the only reason I'm not cancelling is that it's been almost impossible to find the card.


----------



## dred

Quote:


> Originally Posted by *Lazz767*
> 
> I agree, only reason I'm not cancelling is due to the fact that is been almost impossible to find the card.


I'm trying to find a comfort zone with this. On one hand, I've got my foot in the door (twice). On the other hand, the wait is a killer, and the horrible business practices by TD are unacceptable.

However, in the meantime, I can take my time with my rig building. Also, I'm hoping a water block for these is released soon, so I can get my hands on one (prolly before the card arrives).


----------



## Michalius

Today is a good day.


----------



## Lazz767

Quote:


> Originally Posted by *dred*
> 
> I'm trying to find a comfort zone with this. One one hand, I've got my foot in the door (twice). On the other hand, the wait is killer, and the horrible business practices by TD are unacceptable.
> However, in the meantime, I can take my time with my rig building. Also, I'm hoping a water block for these are released soon, so I can get my hands on one (prolly before the card arrives).


It's probably better to just keep your order with TD unless you can find somewhere else with it in stock. That's how I am justifying it, anyways.


----------



## kemsoff

Quote:


> Originally Posted by *Lazz767*
> 
> I agree; the only reason I'm not cancelling is that it's been almost impossible to find the card.


True. I got lucky yesterday and snagged one from EVGA, however.


----------



## OliverGw

Can anyone here comment on the 690's surround performance in BF3 in particular? I'm interested in getting one but not sure if 2x680's would be better for surround gaming.


----------



## Cheesemaster

Quote:


> Originally Posted by *Arizonian*
> 
> Sweet stock bench. Your *graphics score 16477* alone.
> 
> 
> 
> 
> 
> 
> 
> 
> Going to be interesting to see how she overclocks and what core clocks we can achieve, along with the GPU boosts possible. Seeing how well she does vs. two 680's or two 7970's will be intriguing.
> 
> 
> 
> 
> 
> 
> 
> 
> Though I will add, regardless, this is one bad-arse-looking card. Even with slightly less performance than the two in SLI or Crossfire, it will suit my needs very nicely. On a single 120Hz 1920x1200 monitor, in 2D gaming or 3D Vision gaming, it's going to be great. 2GB for me will be more than plenty. A common misconception on single monitors is that 3D Vision gaming eats more VRAM, when it doesn't.
> Keep us posted bud, if you can tear yourself away from gaming.


That's impressive. I'm running 3-way GTX 580 SLI overclocked to 900MHz and you just beat my graphics score by 400 points... WOW!


----------



## jcde7ago

Quote:


> Originally Posted by *Suit Up*
> 
> Ordered a Gigabyte GTX690 today.
> I wrote a small bash script a couple of days ago to check the PCCG website every couple of minutes to see if they added a GTX690 to their inventory. Worked a treat


Grats man, added you to this list!









That is also the first non-EVGA card on there, lol.
Quote:


> Originally Posted by *OliverGw*
> 
> Can anyone here comment on the 690's surround performance in BF3 in particular? I'm interested in getting one but not sure if 2x680's would be better for surround gaming.


The performance would be near identical. Right now, the 690 is, at the absolute WORST, 4% slower than a pair of 680s... while using 20-23% less power and putting out less heat. You can't go wrong either way, but again, people need to realize that comparing a pair of 680s to a 690 is comparing the exact same thing. There are no compromises on this bad boy, and there are actually quite a few benefits if you go with a 690 as opposed to a pair of individual 680s.
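Suit Up's trick of scripting a stock check, quoted a few posts up, is easy to replicate in a few lines. Here's a minimal Python sketch; the marker string, polling interval, and of course the URL are assumptions, since every shop's page markup differs:

```python
import time
import urllib.request

def page_has_stock(html, marker="Add to Cart"):
    """True if the product page HTML contains the in-stock marker.
    NOTE: "Add to Cart" is a placeholder; check the real page source."""
    return marker in html

def fetch(url):
    """Grab the page body as text."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

def watch(url, interval=120):
    """Poll the page every couple of minutes until stock appears."""
    while not page_has_stock(fetch(url)):
        time.sleep(interval)
    print("In stock! Go:", url)
```

Keep the interval civil (a couple of minutes, like Suit Up did) so you're not hammering the shop's server.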


----------



## CapnCrunch10

Are we going to see some OC benchmarks jcde7ago?


----------



## jcde7ago

Quote:


> Originally Posted by *CapnCrunch10*
> 
> Are we going to see some OC benchmarks jcde7ago?


I'm at work right now, but yes, I expect to be doing some OC benchmarks tonight when I'm home for sure... which won't be for another 6-ish hours, unfortunately. But yes, more benchmarks will indeed be coming.


----------



## tahoward

Proof of purchase:



Can't wait to open it up today.


----------



## Lazz767

I'm so jealous right now LOL. What happened to this "wider availability"?


----------



## burningrave101

Quote:


> Originally Posted by *Lazz767*
> 
> I'm so jealous right now LOL. What happened to this "wider availability"?


Wider availability, in simpler terms, means it will show up in stock again like it did at EVGA's site yesterday for a whole 30 seconds; there may be 100+ units, or there may be 5. Honestly, no one should expect very many of these to show up in stock anywhere for at least another month, given where GTX 680 availability currently is.


----------



## ceteris

Quote:


> Originally Posted by *Lazz767*
> 
> I'm so jealous right now LOL. What happened to this "wider availability"?


Just keep watching. ASUS version has yet to be posted anywhere and is bound to pop up sooner or later.


----------



## Lazz767

I like how on NCIX's site they're already trying to sell more by saying "WEB DEAL" and taking $140 off their inflated price, lol.


----------



## Lazz767

Quote:


> Originally Posted by *ceteris*
> 
> Just keep watching. ASUS version has yet to be posted anywhere and is bound to pop up sooner or later.


They've already been posted on NCIX for a while. Sold out as well.

http://ncix.com/products/?sku=71419


----------



## jcde7ago

Quote:


> Originally Posted by *tahoward*
> 
> Proof of purchase:
> 
> Can't wait to open it up today.


Grats man, added! Post pics when you get it.


----------



## TheRainMan

So, what are the odds of EVGA putting a few up tonight >.>

After all, they did say "this week" on Twitter so it could be vague-marketing-speak for "We're gonna sell ~ 5 a day"


----------



## Lazz767

Quote:


> Originally Posted by *TheRainMan*
> 
> So, what are the odds of EVGA putting a few up tonight >.>
> After all, they did say "this week" on Twitter so it could be vague-marketing-speak for "We're gonna sell ~ 5 a day"


It is quite possible. Again, it's all a mystery really. It could mean that was all for this week. I really wish NVIDIA had prepared the stock prior to announcing. I guess they're just doing this to make people want it more.


----------



## TheRainMan

Welp, as much as I hate to admit it, it is working. This reminds me a lot of the Wii craze in 2006


----------



## jcde7ago

Quote:


> Originally Posted by *Lazz767*
> 
> It is quite possible. Again, it's all a mystery really. It could mean that was all for this week. I really wish NVIDIA had prepared the stock prior to announcing. I guess they're just doing this to make people want it more.


They said "this week" yesterday, then promptly put out stock 4 hours later that only lasted 2 minutes...lol.

For what it's worth, I would start expecting them to put out stock towards the end of the day Pacific time; I'd say 3:30-6:30 PDT would be a good time to camp out.


----------



## TheRainMan

I shall not let video games distract me, I will secure a 690 by the end of this week!


----------



## ceteris

Quote:


> Originally Posted by *Lazz767*
> 
> They've already been posted on NCIX for a while. Sold out as well.
> http://ncix.com/products/?sku=71419


Bleh, NCIX Canada gets all the goodies. The US side doesn't even give half the effort, and makes you use PayPal for AMEX.


----------



## sundrou

Update: the one I ordered from OCUK has already shipped and will arrive in a few days, but the other one from the US is gonna take a long time, so it will be a while before I set up my quad SLI.


----------



## sundrou

USA buyers can pre-order from here; good price but no stock: http://www.shopblt.com/cgi-bin/shop/shop.cgi?action=enter&thispage=011004001505_BPF8258P.shtml&order_id=!ORDERID!


----------



## V3teran

Quote:


> Originally Posted by *sundrou*
> 
> Update: the one I ordered from OCUK has already shipped and will arrive in a few days, but the other one from the US is gonna take a long time, so it will be a while before I set up my quad SLI.


OCUK sold a few MSI models today between 10:30 and 13:00 hrs; they were all sold out by then. I should be getting mine from Scan, and will have it in my hands on Friday.


----------



## cnh789

Please take me off the list. I canceled my order with TD for being deceptive and greedy, and I just bought 2x Gigabyte GTX 680 WindForces off Newegg instead of playing this twitchy waiting game.


----------



## emett

Gee, you're lucky; Newegg is now out of stock.


----------



## ceteris

EVGA has them up! Get one now!

*edit* HAD them up. I F5'd and counted up to 9 seconds before it went back to Notify. I wish I could've bought another one, but they limit it to 1 per household.


----------



## TheRainMan

ON SALE

I GOT ONE


----------



## ceteris

Quote:


> Originally Posted by *TheRainMan*
> 
> ON SALE
> I GOT ONE


Grats, bro!


----------



## bitMobber

Quote:


> Originally Posted by *TheRainMan*
> 
> ON SALE
> I GOT ONE


Me too!

But I'm not keeping it; it's going straight on Amazon and eBay! Muhaha!!

It sold out within minutes this time; guess they didn't have a large amount in stock.


----------



## TheRainMan

Literally 2 minutes



So happy ^^


----------



## Cobolt005

Just got my order in with Evga just waiting on the email now.







Man, talk about selling out fast; by the time I got my card info in and purchased, it was back to Notify Me.


----------



## ceteris

Quote:


> Originally Posted by *bitMobber*
> 
> Me too!
> But I'm not keeping it; it's going straight on Amazon and eBay! Muhaha!!
> It sold out within minutes this time; guess they didn't have a large amount in stock.


LOL, I think some people would be more annoyed if you did keep it. That is, if the system specs in your sig are still up to date... Bottleneck City!









Although I would be interested to see what kind of scores you would get on 3DMark 11 and Unigine


----------



## bitMobber

Quote:


> Originally Posted by *ceteris*
> 
> LOL, I think some people would be more annoyed if you did keep it. That is, if the system specs in your sig are still up to date... Bottleneck City!
> 
> 
> 
> 
> 
> 
> 
> 
> Although I would be interested to see what kind of scores you would get on 3DMark 11 and Unigine


LOL, I wouldn't put the card in the system I have now; I'm building a new Ivy Bridge system, but I'm waiting till more motherboard and RAM options become available. The GTX 690 is my card of choice, but I can't resist the premium the card is selling for online.


----------



## jcde7ago

Quote:


> Originally Posted by *jcde7ago*
> 
> They said "this week" yesterday, then promptly put out stock 4 hours later that only lasted 2 minutes...lol.
> For what it's worth, I would start expecting them to put out stock towards the end of the day pacific time - i'd say, 3:30-6:30 PDT would be a good time to camp out.


Didn't I tell you guys!? Between 3:30 PDT and 6:30 PDT, haha.

On another note, I will be testing Max OCs and benchmarking some more tonight, expect results in hopefully not too late tonight.


----------



## Kyouki

Quote:


> Originally Posted by *TheRainMan*
> 
> ON SALE
> I GOT ONE


Congrats! I missed out again; I suck at this waiting-and-sniping game. Gosh, I hope they come up again tomorrow; I should have less competition! Lol


----------



## hitman2169

Why do we get the pre-order ones at the end of the month if they put cards up for sale every day?... Limited quantities, but still.....


----------



## ceteris

Quote:


> Originally Posted by *hitman2169*
> 
> Why do we get the pre-order ones at the end of the month if they put cards up for sale every day?... Limited quantities, but still.....


They probably have to make the T-shirts and posters and then pack the boxes lol. TBH, I think the "signature edition" for the 690s is cheesy, since they can't do much with the aesthetics.... They should pack those things with backplates at least.


----------



## dph314

Anyone that already got theirs -> don't forget to hit the Heaven contest thread: http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-3-0-scores

I wanna see some 690s up there!







(try not to bump me outta 16th place though







)


----------



## jcde7ago

Quote:


> Originally Posted by *dph314*
> 
> Anyone that already got theirs -> don't forget to hit the Heaven contest thread: http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-3-0-scores
> I wanna see some 690s up there!
> 
> 
> 
> 
> 
> 
> 
> (try not to bump me outta 16th place though
> 
> 
> 
> 
> 
> 
> 
> )


I will be running Heaven shortly; I'm still testing some OCs incrementally... once I find a nice and solid OC, I'll run Heaven a few more times and post my score here and in that thread.


----------



## TheRainMan

How long does it usually take for them to ship out?


----------



## jcde7ago

Quote:


> Originally Posted by *TheRainMan*
> 
> How long does it usually take for them to ship out?


EVGA? When I ordered at 6 AM on launch day, they called me at noon (6 hours later) to verify my order, and shipped my card out the same day.

I would assume they'll call you to confirm your order around lunch time tomorrow (pacific time), and ship it out a few hours later.


----------



## ceteris

EVGA usually ships out by 5pm after calling to confirm your order. Shortly after 5pm, you will get e-mail notification of the tracking number to let you know that it is shipped.


----------



## TheRainMan

Quote:


> Originally Posted by *jcde7ago*
> 
> EVGA? When I ordered at 6 AM on launch day, they called me at noon (6 hours later) to verify my order, and shipped my card out the same day.
> I would assume they'll call you to confirm your order around lunch time tomorrow (pacific time), and ship it out a few hours later.


Ah, good, I should get the phone call at 3 pm my time, after school. It would be awkward if they called while I was in class. Do you reckon they'll let me upgrade to 1-day shipping over the phone?


----------



## Stateless

Can any owners comment on how much heat is being dumped into the case at full load? This will be my first GPU that dissipates heat into the case. I think I will be fine, as I just put together a Xigmatek Elysium Super Tower rig with a lot of fans, but I'm genuinely curious how much heat they will dump at full load and/or overclocked.

I also need to post my invoice so I can join the club... I was able to score 2x 690's and should have them here by Friday!!!


----------



## PTCB

Quote:


> Originally Posted by *jcde7ago*
> 
> I will be running Heaven shortly; I'm still testing some OCs incrementally... once I find a nice and solid OC, I'll run Heaven a few more times and post my score here and in that thread.


Please also post your scores in the GTX 670 OC thread for comparison. Thank you.

Rep waiting to be given.


----------



## jcde7ago

Quote:


> Originally Posted by *PTCB*
> 
> Please also post your scores in the GTX 670 OC thread for comparison. Thank you.
> Rep waiting to be given.


I'm currently benching, posting this from my laptop.









Just as a short preview (I didn't take a screenshot):

Using 301.34 GTX 690 Drivers, EVGA Precision X with the following OC:

Power Target: 135%
Core Offset: +125 Mhz
Memory Offset: +200 Mhz

FPS: 107.3
Score: 2702

It would have been enough to place me 15th in the OCN Heaven top 30! Again, I didn't screencap because I was so excited I figured I'd gun straight for a higher OC, so I'm running +140 and +150 MHz offset OCs and tweaking the memory some more.... this card is a BEAST!!!!









I will post screencaps of the highest Heaven score I get in a little while, using the criteria set forth in this thread:

http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-3-0-scores/0_100


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> I'm currently benching, posting this from my laptop.
> 
> 
> 
> 
> 
> 
> 
> 
> Just as a short preview (I didn't take a screenshot):
> Using 301.34 GTX 690 Drivers, EVGA Precision X with the following OC:
> Power Target: 135%
> Core Offset: +125 Mhz
> Memory Offset: +200 Mhz
> FPS: 107.3
> Score: 2702
> It would have been enough to place me 15th in the OCN Heaven top 30! Again, I didn't screencap because I was so excited I figured I'd gun straight for a higher OC, so I'm running +140 and +150 MHz offset OCs and tweaking the memory some more.... this card is a BEAST!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> I will post screencaps of the highest Heaven score I get in a little while, using the criteria set forth in this thread:
> http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-3-0-scores/0_100


Interesting. On my EVGA GTX 680's in SLI, with Power Target at 132%, +100 offset and +400 on the memory, the best Unigine FPS was 98.5. I can't remember the score, but I remember the FPS not being over 100!

How has heat been so far at these OC settings?


----------



## Brulf

Would love to buy one but seriously....... http://www.pccasegear.com/index.php?main_page=index&cPath=193_1385

$1600 how is that price justified?


----------



## Plagasx

Quote:


> Originally Posted by *Brulf*
> 
> Would love to buy one but seriously....... http://www.pccasegear.com/index.php?main_page=index&cPath=193_1385
> $1600 how is that price justified?


Lol, I would have to sell my PC to get that.


----------



## Projector

I don't think your pc is worth that


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> Interesting. On my EVGA GTX 680's in SLI, with Power Target at 132%, +100 offset and +400 on the memory, the best Unigine FPS was 98.5. I can't remember the score, but I remember the FPS not being over 100!
> How has heat been so far at these OC settings?


I'm currently running at 85% fan while I'm benching with high OCs, and thus far the highest it got was 76c.

Here are the results of my OC run with:

Power Target: 135%
Core Offset: +140 MHz
Memory Offset: +200 MHz

Drivers: 301.34



Not bad, eh?


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> I'm currently running at 85% fan while i'm benching with high OC's, and thus far the highest it got was 76c.
> Here are the results of my OC run with:
> Power Target: 135%
> Core Offset: +140Mhz
> Memory Offset: +200Mhz
> Drivers: 301.34
> 
> Not bad, eh?


Noticed your resolution.... I did my tests at 1080p, so your lower resolution is the main reason the scores are a lot higher than my 680's with the offsets I am running. What boost clocks are you hitting? I am interested to know how high they are boosting. Also, do you notice the cards throttling back a bit? The 680's had a 70c threshold where the card "can" throttle back a bit because of the heat... I have not been able to get confirmation from anyone on whether the 690's do this as well.


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> Noticed your resolution.... I did my tests at 1080p, so your lower resolution is the main reason the scores are a lot higher than my 680's with the offsets I am running. What boost clocks are you hitting? I am interested to know how high they are boosting. Also, do you notice the cards throttling back a bit? The 680's had a 70c threshold where the card "can" throttle back a bit because of the heat... I have not been able to get confirmation from anyone on whether the 690's do this as well.


If you read my above post, I ran 1680x1050 because that is the more "standard" preset for Heaven 3.0, and that was the criteria set forth by the OCN Top 30 Heaven scores thread.

As for the heat issue, I've owned the card for less than 24 hours and haven't pushed it all that much yet, so I have yet to notice any throttling... but I do know that there is some sort of "soft" threshold where GK104 may tighten up a little to prevent instability in accordance with temps, so this is certainly possible. It'd be a lot easier to tell once water blocks are out and we can gauge performance when these cards are kept at sub-60/65c temps at all times.


----------



## tahoward

The card was waiting for me by the time I got home. Here are some pics with it installed. This thing screams to be liquid cooled:


----------



## TheRainMan

Hey jcde7ago, could you update my proof in the OP?

Thanks


----------



## jcde7ago

Quote:


> Originally Posted by *tahoward*
> 
> The card was waiting for me by the time I got home. Here are some pics with it installed. This thing screams to be liquid cooled:


Grats man!

And while it does scream for liquid cooling, I think you will be pleasantly surprised by how amazingly cool this card runs. I was eyeing my Kill-A-Watt during Heaven, and even with the decent OC above, temps were just slightly above 70c (and it's warm in my room), and I never saw power consumption past 650w - and this is with my monitor, a lamp, and speakers plugged into the same outlet.
Quote:


> Originally Posted by *TheRainMan*
> 
> Hey jcde7ago, could you update my proof in the OP?
> Thanks


Sure thing!


----------



## Michalius

Hmm... what could be in this innocent brown box?



Oh, why hello there. (yeah, cellphone flash. waiting to install it before I give it the good photo treatment)



It's huge! Bigger than my 7970 even.



The rest of my parts don't arrive until Friday. Its friends will be:

Gigabyte G1 Assassin 2
3820 i7
4x4GB Corsair 1.35v Vengeance Low Profile (white)
Corsair AX850 w/ white braided cables
Crucial M4 128GB

Going in a 600T for now until the PC-V750 releases. Once that releases, modding it and powder coating the internals white to fit:
Alphacool Single Bay Repack w/ 2xDC-LT pumps (arrived, sweet piece of kit)
Alphacool XT45 240mm radiator (front)
Alphacool XT45 360mm radiator (top)
EK's eventual 690 block
EK Supreme Rev 3 Plexi or XSPC Raystorm (still haven't decided)
Mayhem's Mint Green Concentrate

If the V750 takes more than a month to come out, I'll probably get antsy and mod the 600T for a 360 up top and a 200mm in the front. Hoping I can stay strong.


----------



## tahoward

I think I may have a strong overclocker candidate. I haven't pushed the voltages far yet, but here is the initial result when attempting 1.2GHz. Ran the Heaven bench maxed out @ 1920x1080 3 times. No artifacting or anything. When the voltage is set to 100% it will crash randomly without warning; a 1% voltage increase did the trick, though. Hoping for 1.3GHz.


----------



## Inglewood78

How are your temps? I have the same case.


----------



## tahoward

Stock with no changes it hits 80C while running the benchmark but the fan is spinning @ about 50%. Very quiet.


----------



## jcde7ago

Quote:


> Originally Posted by *tahoward*
> 
> I think I may have a strong overclocker candidate. I haven't pushed the voltages far yet, but here is the initial result when attempting 1.2GHz. Ran the Heaven bench maxed out @ 1920x1080 3 times. No artifacting or anything. When the voltage is set to 100% it will crash randomly without warning; a 1% voltage increase did the trick, though. Hoping for 1.3GHz.


Not bad at all! I haven't tried OC'ing past +140 MHz core/+200 MHz offsets yet, as I've been gaming on them to test stability.

As for the Power Target increase: raising it does not mean you're actually raising the voltage. In fact, regardless of your Power Target value, the GPU will not draw any more power than needed to achieve your offsets. People just default it to the max (especially during benches) because it gives the most headroom when benchmarking, and it's necessary to have in case your GPU does need that extra juice to keep it stable and keep it from crashing.

Majin SSJ Eric posted his CF 7970s with a 3960X clocked 100MHz higher than my 3930K, and he only scored 0.1 FPS better than me in Heaven (108.0 him, 107.9 me) at the same settings... and I haven't even hit my max OC yet, let alone tried OC'ing this beast under H2O.
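The Power Target point above can be illustrated with a toy model. To be clear, this is NOT NVIDIA's actual GPU Boost algorithm; the power curve is made up and only the ~13 MHz clock bin is a real Kepler detail. The idea is just that the target is a ceiling, not a voltage knob: raising it lets the requested clock through when the draw fits under it.

```python
def boosted_clock(requested_mhz, power_draw_at, power_target_w):
    """GPU Boost, grossly simplified: hold the requested clock only if
    its power draw fits under the power target; otherwise step down in
    ~13 MHz bins until it does."""
    clock = requested_mhz
    while power_draw_at(clock) > power_target_w and clock > 0:
        clock -= 13
    return clock

# Made-up quadratic power curve, for illustration only.
draw = lambda mhz: 150 + (mhz / 100) ** 2

stock_target = boosted_clock(1150, draw, 263)   # 100% of a 263 W budget
maxed_target = boosted_clock(1150, draw, 355)   # 135% power target
```

With the target at 100% the toy model backs the clock down; at 135% the same offset runs unthrottled, yet a card that doesn't need the extra budget never draws it.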


----------



## Arizonian

Quote:


> Originally Posted by *tahoward*
> 
> Stock with no changes it hits 80C while running the benchmark but the fan is spinning @ about 50%. Very quiet.


Ramp up your fan settings manually. At 80C I'd be at 90% fan speed; at the very least, I do 1C to 1% up the entire scale. At 80C, if the fan were unlocked, I'd be at 100% personally. The fan-unlocker BIOS is still upcoming from EVGA; at the moment we're locked at an 85% fan speed max.

Keep that baby cooler and raise those clocks and take her for a spin around the block, open her up and see what she can do.


----------



## jcde7ago

Wanted to do a quick 3DMark11 run before I went off to bed, using my modest OC, here's the result as compared to my run last night:

Last night, stock clocks:



Tonight, modest OC:



I'll take a 360 point increase any day!

Now that I know this card can do a +140 MHz core/+200 MHz memory offset completely stable... I'll push it even further tomorrow night.


----------



## error-id10t

There are 680 SLI guys cracking 21000 for graphics (with 670 SLI guys going around 20000). Are there no owners with a 690 that can show what this can do when pushed, or does it need water?


----------



## jcde7ago

Quote:


> Originally Posted by *error-id10t*
> 
> There are 680 SLI guys cracking 21000 for graphics (with 670 SLI guys going around 20000). Are there no owners with a 690 that can show what this can do when pushed, or does it need water?


Those 680 SLI guys getting that high of a graphics score are most likely on water/phase/liquid nitrogen/whatever (670s are still an unknown, except for a model or two known to clock high since they're non-reference), and some of them are able to overvolt/voltmod their 680s. We're not going to see those kinds of scores with a 690 until we at least get these under water. Temperatures will throttle Kepler as it starts to breach the low/mid 70s Celsius, depending on clock speeds and voltage, and the fan isn't going to be able to keep temps cool enough when pushing a 690 to the absolute brink.

Not to mention, I don't know of any 690 owners who have been brave enough to go on suicide runs without liquid cooling yet, either.


----------



## tahoward

This is my best 100% stable overclock so far:

1185MHz core ( 1191/1200 have a small chance to just crash out of the blue in 3DMark11.)

Memory was way more interesting; it just kept taking every MHz I threw at it, to the point where I shot it to +1000 MHz in the offset bar to see if I could force a crash (it did crash, btw). In the end I settled on a +725 MHz offset, which put out an effective 7.45GHz memory clock...

With those two offsets and target power at 135% Heaven and 3DMark run great without artifacts or random crashing.

So here are my results:

P15701 Graphics score: 20078
X6812 Graphics score: 6625

These results were obtained using the centered option. I'll update with Stretched as well.
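tahoward's 7.45GHz figure lines up if the Precision X offset is applied to GDDR5's double-data-rate clock, so each offset MHz counts twice toward the quad-pumped effective rate. A sketch of the arithmetic; the doubling assumption is inferred from his numbers, not from NVIDIA documentation:

```python
def effective_mem_clock(base_effective_mhz, offset_mhz):
    """GDDR5 effective clock after a Precision X memory offset.
    Assumption: the offset applies to the double-data-rate clock,
    so it counts double toward the effective rate."""
    return base_effective_mhz + 2 * offset_mhz

# GTX 690 ships at 6008 MHz effective memory.
print(effective_mem_clock(6008, 725))  # 7458 MHz, i.e. tahoward's ~7.45GHz
```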


----------



## error-id10t

Quote:


> Originally Posted by *tahoward*
> 
> So here are my results:
> P15701 Graphics score: 20078
> These results were obtained using the centered option. I'll update with Stretched as well.


Thanks for this; they do match up fairly nicely, and I think you're still the first I've seen pushing them... congrats









I read in one of the reviews that they ran into a heat wall with air cooling, so under water this may still have something to give (I don't know your temps, obviously), but good to see it's up there as-is.


----------



## tahoward

For these frequencies I locked my fan at 95%. Temps have not breached 77C. I do not think a waterblock will do much for the maximum obtainable frequency without the ability to push core voltage beyond 1.175v. The way the card crashes when I try to approach 1.2GHz is reminiscent of other cards that would chug along to higher frequencies with some more juice. In my personal experience, the symptom of this is an immediate crash without any form of artifacting beforehand. The only way I can induce artifacts with the GTX 690 is if the memory frequency is set too high.

Water-cooling this card would only be beneficial if you wanted a more consistent boost speed without having to listen to a fan that is being ramped to 95%.

And here are the results using the "stretched" option:

P15978 Graphics score: 20701
X6806 Graphics score: 6613


----------



## ceteris

Looks like Newegg caved and lowered their price to MSRP! I have a feeling they are going to update their stock soon.


----------



## Cheesemaster

Thanks for the base guide on what I'll be shooting for with my OCs. I'm gonna have a quad setup... I will post ASAP!


----------



## Stateless

Quote:


> Originally Posted by *tahoward*
> 
> This is my best 100% stable overclock so far:
> 1185MHz core ( 1191/1200 have a small chance to just crash out of the blue in 3DMark11.)
> Memory was way more interesting; it just kept taking every MHz I threw at it, to the point where I shot it to +1000 MHz in the offset bar to see if I could force a crash (it did crash, btw). In the end I settled on a +725 MHz offset, which put out an effective 7.45GHz memory clock...
> With those two offsets and target power at 135% Heaven and 3DMark run great without artifacts or random crashing.
> So here are my results:
> P15701 Graphics score: 20078
> X6812 Graphics score: 6625
> These results were obtained using the centered option. I'll update with Stretched as well.


So is that 1185MHz on the core the actual core speed without the boost, or is that what it boosts to? If that's the actual core, then you got a good +270 offset... so your boost should be getting around 1250 or so, if that is the case.

I am wondering what kind of offset I can get without breaking the 70c barrier, but also without going 85% on the fan speed. On my 680's in SLI, my highest temp ever was around 65c on the top card with a +100 offset on the core, +400 on the memory, and 132% power. I had a custom fan profile that matched temp to fan speed 1:1 as well. If I can get something close with the 690, I would be really happy.


----------



## emett

I was gonna get a 690 to replace my 590, but they are too hard to come by. EDIT- Lol **** thought I was in the 680 thread... Anyway, I'll leave this for reference... -EDIT
P16,888
http://3dmark.com/3dm11/3377525

GPU score over 21,000. Bear in mind I had no idea how to overclock these cards; I got them today.


----------



## tahoward

The 1185MHz is the result after boost. Without liquid cooling the card will definitely go past 70c with a good offset. Stock, it will hit 80c without modifying the fan profile. The fan is very impressive: at 85-95% you can definitely hear it, but it's a soft whirring air sound; not annoying, and not very noticeable when your speakers are going.


----------



## Cheesemaster

Quote:


> Originally Posted by *tahoward*
> 
> The 1185MHz is the result after boost. Without liquid cooling the card will definitely go past 70c with a good offset. Stock, it will hit 80c without modifying the fan profile. The fan is very impressive: at 85-95% you can definitely hear it, but it's a soft whirring air sound; not annoying, and not very noticeable when your speakers are going.


I won't be scared to max my fans and hit some solid numbers... I really want to max this out. I've never had a quad setup before, just 2-way and 3-way SLI: first I had 5850s in Crossfire, then I went to 560 Tis in SLI, then GTX 580 3GB 3-way SLI... now I'm onto GTX 690 quad SLI. I've got a 3960X @ 4.8 and RAM @ 2133 10-11-10-28-1T. That should push those 690's really good!


----------



## kemsoff

Woot, Evga shipped mine yesterday. Got my UPS tracking #. I'll have my 690 tomorrow


----------



## CapnCrunch10

Quote:


> Originally Posted by *jcde7ago*
> 
> Those 680 SLI guys getting that high of a graphics score are most likely on water/phase/liquid nitrogen/whatever (670s are still an unknown, except for a model or two known to clock high since they're non-reference), and some of them are able to overvolt/voltmod their 680s. We're not going to see those kinds of scores with a 690 until we at least get these under water. Temperatures will throttle Kepler as it starts to breach the low/mid 70s Celsius, depending on clock speeds and voltage, and the fan isn't going to be able to keep temps cool enough when pushing a 690 to the absolute brink.
> Not to mention, I don't know of any 690 owners who have been brave enough to go on suicide runs without liquid cooling yet, either.


Not entirely true. I managed to get a graphics score of 22140 with my 680 SLI (EVGA reference model, no SC or SC+), and this was all on air with max fan speed.

Though, your Heaven score is really close to mine from the top 30.

We shouldn't be expecting a 690 to trounce 680 SLI anyway. It will be just shy of a 680 SLI's max score. It will be close, but you'd be lucky to have a card that passes it.


----------



## jcde7ago

Quote:


> Originally Posted by *CapnCrunch10*
> 
> Not entirely true. I managed to get a graphics score of 22140 with my 680 SLI (EVGA - Reference model (no SC or SC+)) and this was all on air with max fan speed.
> Though, your Heaven score is really close to mine from the top 30.
> We shouldn't be expecting a 690 to trounce 680 SLI anyway. It will be just shy of a 680 SLI's max score. It will be close, but you'd be lucky to have a card that passes it.


Was this with a max OC on your 680s, though? While I don't expect a 690 to surpass 680s in SLI (as close as they will be, there should still be a 2-4% difference overall in favor of 680s in SLI), I have yet to reach my max OC potential, nor have I had to use max fan speed yet...so I'm sure there's still a bit of headroom for my 690.









Also, I only ran 3DMark in stretched mode; I have yet to run it in centered (or maybe it was the other way around - I completely forgot about this setting). And I'm almost certain I can squeeze a better score out of Heaven as well, once I find that max OC.









EDIT:

Looks like Martyr82, of Gigabyte 670 Windforce 3 fame, submitted his top-30 score for Heaven as well with 670s in SLI, and it was still decently below my 690, even though his score was with a MAX OC on the 670s:

My 690, mild OC:

jcde7ago --- i7 3930k / 4.7 Ghz ---- GTX 690 / 1055 / 1602 / 1160 Mhz ---- 107.9 ---- 2719



His 670 SLI, MAX OC:

Martyr82 --- 2500k / 4.6GHz ---- SLI GTX 670, 1129 / stock / 1775---- 102 ---- 2569



Keep in mind he does have a 2500K clocked 100MHz lower than my 3930K, though I do not know how big of a difference that makes. The 670s are still showing some beastly power as well, though.


----------



## ceteris

Quote:


> Originally Posted by *jcde7ago*
> 
> Was this with a max OC on your 680s, though? While I don't expect a 690 to surpass 680s in SLI (as close as they will be, there should still be a 2-4% difference overall in favor of 680s in SLI), I have yet to reach my max OC potential, nor have I had to use max fan speed yet...so I'm sure there's still a bit of headroom for my 690.
> 
> 
> 
> 
> 
> 
> 
> 
> Also, I only ran 3DMark on stretched mode, I have yet to run it on centered (Or maybe it was the other way around - I completely forgot about this setting). And i'm almost certain I can squeeze a better score on Heaven as well, once I find that max OC.
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT:
> Looks like Martyr82 of that Gigabyte 670 Windforce 3 fame submitted his top 30 score for Heaven as well with 670s in SLI, and it was still decently below my 690, even though his score was with a MAX OC for the 670s:
> My 690, mild OC:
> jcde7ago --- i7 3930k / 4.7 Ghz ---- GTX 690 / 1055 / 1602 / 1160 Mhz ---- 107.9 ---- 2719
> 
> His 670 SLI, MAX OC:
> Martyr82 --- 2500k / 4.6GHz ---- SLI GTX 670, 1129 / stock / 1775---- 102 ---- 2569
> 
> Keep in mind he does have a 2500K clocked 100Mhz lower than my 3930K, though I do not know how big of a difference that would make. 670s still showing some beastly power as well though.


I read that you can't compare clock speeds between an SB and an SB-E due to slight differences in architecture... # of cores, maybe? I'd have to dig through some old reviews for proper references, but the response came from a couple of gripes that SB-E couldn't OC to 5GHz as easily as a regular SB. There were benches that showed an SB-E clocked around 4.8GHz beating a 5GHz 2700K with most of the same components (with the exception of the mobo and CPU, of course).


----------



## jcde7ago

Quote:


> Originally Posted by *ceteris*
> 
> I read that you can't compare the clock speeds between an SB and an SB-E due to slight differences in architectures... # of cores maybe? I'd have to dig through some old reviews for proper references, but the response came from a couple gripes that the SB-E couldn't OC to 5ghz as easily as a regular SB. There were benches that showed an SB-E clocked around 4.8 ghz was beating a 5ghz 2700k with most of the same components (with the exception of the mobo and cpu of course).


Fairly certain that SB and SB-E use the exact same architecture, the exception being that SB-E just has two additional cores. And again, I am not sure exactly how the two additional cores are utilized by, or scale with, GPU benchmarks.


----------



## Arizonian

JCD - A base 1124MHz core clock with a +132 power target should give you approximately a 1228MHz core when GPU Boost kicks in. If your power target is set too low, it has no way of reaching 1.175v and will crash at higher GPU boosts.
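As a rough mental model of how an offset turns into an effective clock (a sketch only - the stock boost clock and 13 MHz bin size below are illustrative assumptions, not figures from this thread):

```python
# Rough Kepler GPU Boost estimate. STOCK_BOOST_MHZ and BIN_MHZ are
# illustrative assumptions, not measured values from this thread.

STOCK_BOOST_MHZ = 1019   # GTX 690 advertised boost clock (assumed)
BIN_MHZ = 13             # GPU Boost steps clocks in ~13 MHz bins (assumed)

def effective_clock(offset_mhz: int, thermal_bins_lost: int = 0) -> int:
    """Boost clock plus your offset, minus bins shed to heat/power limits."""
    return STOCK_BOOST_MHZ + offset_mhz - thermal_bins_lost * BIN_MHZ

print(effective_clock(132))     # 1151 MHz with no throttling
print(effective_clock(132, 2))  # 1125 MHz after dropping two bins when warm
```

The real clock also moves with ASIC quality and power target, which is why two cards on the same offset can land on different bins.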

Don't worry at all about the OC. With voltage locked, there is no way we can fry these cards, unlike the early attempts with the 590 last year, where too much voltage was applied.

I do wish I had my card. Ordering on launch day with a pre-order and watching others who ordered after me get their cards is a raw deal from EVGA. Before selling cards to others who ordered after me, EVGA should be fulfilling their pre-order sales, not 30 days later. First and last time ordering directly from EVGA, and that comes from a huge EVGA fanboy. If a Zotac or ASUS comes onto the market while my pre-order is stuck in limbo, I'm switching.

If the EVGA FTW version comes out, I'm going for that instead. I'll be more than happy with a single FTW if it hits 1300MHz, which shouldn't be a stretch with 8 PWMs.


----------



## ceteris

Quote:


> Originally Posted by *jcde7ago*
> 
> Fairly certain that SB and SB-E use the exact same architecture, the exception being that SB-E just has two additional cores. And again, I am not sure exactly how the two additional cores are utilized by, or scale with, GPU benchmarks.


Here are OC3D's Benches *CLICK HERE for i7-3930K Benchmark Reviews*

I'm not a hardware engineer by trade, so perhaps my use of the word "architecture" was wrong, but the graphs definitely show differences in performance.

On another note, picked up my 690 from the UPS office after dropping the wife off at work











After I take care of some business I'm going to bench my 680's and pop this baby in for some benching too!









Here is a screenshot of my last 680 SLI benches, everything stock clocks, for you to compare with yours:


----------



## Lazz767

Anyone who ordered off TD, mine just shipped. Looks like they received some today. Check your statuses!


----------



## blumpking

The card showed up exactly when I got off work yesterday. I was so pumped that it arrived (plus I had a BF3 session waiting) that I went ahead and threw it in the case without doing any cleanup work. LOL, the two six-pin power connectors from my old Radeon HD 5850 are still just hanging there. I desperately need to spend some time this weekend cleaning up my SATA/fan/power cabling rat's nest, as well as cleaning out the case filters and destroying colonies of dust bunnies.









Anyway, here are some pics of the card in my rig:


----------



## Lazz767

Do you guys think my CPU will bottleneck the 690?


----------



## TheRainMan

My confirmation call came in the middle of class >.>

But it should be shipping out today and getting here on Friday which I have off


----------



## 3930K

Quote:


> Originally Posted by *Lazz767*
> 
> Do you guys think my CPU will bottleneck the 690?


Don't think so.


----------



## bitMobber

People that have their tracking numbers, where did it ship from?


----------



## Lazz767

Quote:


> Originally Posted by *bitMobber*
> 
> People that have their tracking numbers, where did it ship from?


All I can tell right now is that it's coming by UPS; it hasn't said where from yet, as UPS hasn't updated the tracking info for the package.


----------



## TheRainMan

I think they ship from Cali


----------



## Lazz767

Quote:


> Originally Posted by *3930K*
> 
> Don't think so.


I hope not; it would really suck if it did. I mean, the 875K is fairly close to the 2600K, plus I have a nice OC on it. Hopefully the 690 pairs well with it.


----------



## jcde7ago

Quote:


> Originally Posted by *TheRainMan*
> 
> I think they ship from Cali


They ship from Placentia, CA, just south of Los Angeles.
Quote:


> Originally Posted by *Lazz767*
> 
> I hope not, would really suck if I did. I mean the 875k is fairly close to the 2600k, plus I have a nice OC on it. Hopefully the 690 pairs well with it.


I don't think it will bottleneck much if at all, really. You can also try higher CPU clocks and see if performance is scaling linearly with CPU clock speed. I could see it *maybe* affecting benchmark scores, but for gaming...I think you're fine.


----------



## Lazz767

Quote:


> Originally Posted by *jcde7ago*
> 
> They ship from Placentia, CA , just south of Los Angeles.
> I don't think it will bottleneck much if at all, really. You can also try higher CPU clocks and see if performance is scaling linearly with CPU clock speed. I could see it *maybe* affecting benchmark scores, but for gaming...I think you're fine.


You rock jcde7ago! Thanks for the info







. Glad to be getting my 690 soon







.


----------



## jcde7ago

Quote:


> Originally Posted by *Lazz767*
> 
> You rock jcde7ago! Thanks for the info
> 
> 
> 
> 
> 
> 
> 
> . Glad to be getting my 690 soon
> 
> 
> 
> 
> 
> 
> 
> .


Anytime man! Be sure to post pics and benchmarks when you get it!


----------



## bitMobber

Quote:


> Originally Posted by *jcde7ago*
> 
> They ship from Placentia, CA , just south of Los Angeles.


Awesome, I'm in Sacramento. I got my confirmation call this morning around 10am and they said it would ship this afternoon. I wonder if I'll receive it this week even though I paid for UPS ground...


----------



## Arizonian

Quote:


> Originally Posted by *bitMobber*
> 
> Awesome, I'm in Sacramento. I got my confirmation call this morning around 10am and they said it would ship this afternoon. I wonder if I'll receive it this week even though I paid for UPS ground...


Was your an EVGA Pre-order?


----------



## Stateless

Quote:


> Originally Posted by *Arizonian*
> 
> Was your an EVGA Pre-order?


Pre-orders were only for the Signature Edition which does not ship out until the end of the month. If he is getting confirmation now it is for a regular 690, not the signature.

UPS Ground is usually a 3 day affair, so I would say if it ships out today, you will get your card on Monday/Tuesday.

I just received my confirmation and I will have my cards on Friday!!!


----------



## bitMobber

Yes, the regular 690.

Well seeing how it ships from the LA area and I'm in Sac I might have a good chance of receiving it Friday. If not then definitely Monday.

Another question for the people that ordered two 690s. Since EVGA's site says 1 per household, how did you get around this? I'm assuming you made another EVGA account with a different delivery address but did you also change the billing address?

I have another address I can use for delivery but the billing address will have to be the address I used for my first 690 purchase, will this work?


----------



## Stateless

Quote:


> Originally Posted by *bitMobber*
> 
> Yes, the regular 690.
> Well seeing how it ships from the LA area and I'm in Sac I might have a good chance of receiving it Friday. If not then definitely Monday.
> Another question for the people that ordered two 690s. Since EVGA's site says 1 per household, how did you get around this? I'm assuming you made another EVGA account with a different delivery address but did you also change the billing address?
> I have another address I can use for delivery but the billing address will have to be the address I used for my first 690 purchase, will this work?


I did not order via EVGA. I used a different e-tailer that did not put a limit on the number of cards I could order. EVGA just extended their limit of 1 to all 600-series GPUs now...so it is not just the 690s that are limited to 1 per household.

I am not sure what measures EVGA uses to prevent someone from creating another account to get more than one order through. It could be enforced at the credit-card level, where you would get screwed, but I'm not sure how it works.


----------



## Arizonian

@bitmobber.

Thanks. Was curious if EVGA was starting to ship Pre orders. Got me excited. Ah well.

In the meantime - congrats on its confirmed delivery. Good times for you. Post back with OC stats.


----------



## bitMobber

Quote:


> Originally Posted by *Stateless*
> 
> I did not order via Evga. I used a different e-tailer that did not put a limit on the amount of cards I could order. Evga just changed their 1 Limit to all 600 Series GPU's now...so it is just not the 690's that are a limit of 1 per household.
> I am not sure what measures Evga uses to prevent someone from creating another account and allowing more than 1 order to go through. It can be at the Credit Card level where you would get screwed, but not sure how it works.


A few people on this thread were able to buy two from EVGA. It says "One per household" so I would assume they track the delivery address.


----------



## tahoward

Dug up my best Extreme setting 3DMARK11 score when I still had 2 GTX 580 LE's @ 910MHz.

X4606 Graphics score: 4280

So that's a 55% graphics-score and 48% combined-score increase going to the GTX 690 (OCed). Not a bad upgrade in the least. It cut my GPU power consumption in half as well.
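For anyone who wants to sanity-check score jumps like this, the math is just relative change. Note the 6634 below is simply the Extreme graphics score implied by the stated 55% gain - the post doesn't quote the actual GTX 690 number:

```python
# Percent-gain helper for comparing 3DMark-style scores. The "new" score here
# is back-calculated from the stated 55% gain, not a benchmark result.

def pct_gain(old: float, new: float) -> float:
    """Percent increase going from score `old` to score `new`."""
    return (new - old) / old * 100.0

old_graphics = 4280                       # GTX 580 SLI @ 910MHz (from the post)
implied_new = round(old_graphics * 1.55)  # score a 55% gain would imply
print(implied_new)                            # 6634
print(round(pct_gain(old_graphics, implied_new), 1))  # 55.0
```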


----------



## TheRainMan

What would you guys say is better, the 690 or Tri-SLI 580's? Just out of curiosity


----------



## y2kcamaross

Quote:


> Originally Posted by *TheRainMan*
> 
> What would you guys say is better, the 690 or Tri-SLI 580's? Just out of curiosity


Definitely the 690, if there was perfect scaling they'd be about equal, but since there isn't...definitely the 690


----------



## tahoward

To put numbers to it here is a 3DMARK11 Performance preset result with 3 overclocked GTX 580's in SLI:

P17138 Graphics score: 20164

Here are my results from an overclocked GTX 690:

P15978 Graphics score: 20701

Pretty close.

In the end, new GTX 580s still average $400 a pop. Three of them would cost more than a single GTX 690, which performs just as well without all the added heat and noise. Pretty damn good deal.


----------



## jcde7ago

FYI, someone on Twitter asked about email notifications for when cards are in stock, and if they should be F5'ing all day, and this was EVGA Jacob's response:
Quote:


> Check today around 3-3:30PM PST


So, if you're still hunting for a GTX 690 (I know many of you are)...you'd better be signed into your EVGA account, ready to enter your CC info without making a mistake, because they sold out of their 690 stock yesterday in under 60 seconds (some say it lasted only 20-25 seconds before they were sold out or got an error finalizing their payment and shipping info).


----------



## jcde7ago

By the way, for anyone interested in how much 4GB of VRAM would benefit a surround/triple-monitor setup, or a setup at high resolutions in general, compared to just grabbing a pair of 2GB 680s or a 690...the difference may shock you.

*Hint: in almost ALL cases at 5760x1080 with high amounts of AA/AF, the difference was 1, maybe 2 FPS between 2GB and 4GB 680s (this applies to the 690 as well):*

http://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/22235-test-palit-geforce-gtx-680-4-gb-jetstream.html?start=15









Bottom line is...2GB vs 4GB is irrelevant right now. The GTX 690 or a pair of 680s will destroy everything, even when driving a triple-monitor or ultra-high-res display with all of the eye candy turned up. Hopefully this helps out someone who might be waiting on 4GB 680s and should just jump on a pair of 2GB 680s or a 690


----------



## Cobolt005

Please add me to the list



I also got my phone call today at lunch, which turned out sweet, as at my work we're not allowed to have cell phones on the shop floor. BS rule


----------



## jcde7ago

Quote:


> Originally Posted by *Cobolt005*
> 
> Please add me to the list
> 
> I also got my phone call today at lunch which turned out sweet as at my work not allowed to have cell phones on the shop floor. BS rule


Grats man, added!


----------



## bitMobber

Quote:


> Originally Posted by *jcde7ago*
> 
> FYI, someone on Twitter asked about email notifications for when cards are in stock, and if they should be F5'ing all day, and this was EVGA Jacob's response:
> Quote:
> 
> 
> 
> Check today around 3-3:30PM PST
> 
> 
> 
> So, if you're still hunting for a GTX 690 (I know many of you are)... you better be signed into your EVGA account, ready to enter in your CC info without making a mistake, because they sold out of their 690 stock yesterday in under 60 seconds....(some say it lasted only 20-25 seconds before they were sold out or got an error finalizing their payment and shipping info).

3:48 and still no update on EVGA.


----------



## Xinoxide

Quote:


> Originally Posted by *jcde7ago*
> 
> By the way, for anyone interested in how 4GB of VRAM would benefit even a surround/triple-monitor set up, or a set up at high resolutions in general compared to just grabbing a pair of 2GB 680s or a 690...the difference may shock you.
> 
> *Hint: in almost ALL cases at 5760x1080 with high amounts of AA/AF, the difference was 1, maybe 2 FPS between 2GB and 4GB 680s (this applies to the 690 as well):*
> 
> http://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/22235-test-palit-geforce-gtx-680-4-gb-jetstream.html?start=15
> 
> ( Insert pictures here )
> 
> Bottom line is...2GB vs 4GB is irrelevant right now. The GTX 690 or a pair of 680s will destroy everything, even when using a triple-monitor or ultra high res. display with all of the eye candy turned up. Hopefully this helps someone out who might be waiting on 4GB 680s and should just jump on a pair of 2GB 680s or a 690


I wouldn't really call 5760x1080 a memory-limiting resolution anymore; 1440p costs are dropping immensely. I would like to see 2GB vs 4GB at 7680x1440/1600. As soon as my probationary period is over and I get my first bonus, I am splooging 27" all over my desk.


----------



## kemsoff

Quote:


> Originally Posted by *bitMobber*
> 
> 3:48 and still no update on EVGA.


They are up right now for those of you looking


----------



## CapnCrunch10

It's up!


----------



## CapnCrunch10

And it's gone. Wow.


----------



## kemsoff

Quote:


> Originally Posted by *CapnCrunch10*
> 
> And it's gone. Wow.


Less than 2 minutes, lol


----------



## bitMobber

Lol yeah...

FYI: I tried to buy another under the same account as my first purchase, and it said I've already reached my limit per household.


----------



## dred

In case you're trying to add a second card via EVGA:
Quote:


> You already reached the product limit of 1 for the family of product 04G-P4-2690-KR.


This was confirmed on launch day, but this just further proves it.


----------



## ceteris

Quote:


> Originally Posted by *CapnCrunch10*
> 
> It's up!


Quote:


> Originally Posted by *CapnCrunch10*
> 
> And it's gone. Wow.


Makes me imagine EVGA has a whole bunch of crates sitting in the warehouse with only 1 employee handling the opening of those crates and logging them into inventory, while going on frequent smokebreaks to deal with the stress of doing it all by him/herself...


----------



## jcde7ago

Quote:


> Originally Posted by *kemsoff*
> 
> Less than 2 minutes, lol


That was less than 1 minute....lol.
Quote:


> Originally Posted by *Xinoxide*
> 
> I wouldn't really call 5760x1080 a memory-limiting resolution anymore; 1440p costs are dropping immensely. I would like to see 2GB vs 4GB at 7680x1440/1600. As soon as my probationary period is over and I get my first bonus, I am splooging 27" all over my desk.


Yeah, but it's not about testing memory-limiting resolutions though.

The fact is, if I had to make an educated guess about enthusiast-level gaming setups, I'd break it down to something like this:

75% : 1920x1080-1200p
15% : 2560x1440-1600p
10% : 5760x1080 - 7680x1440/1600 and 120Hz

The market just isn't there for tri-monitor setups yet, let alone at resolutions greater than 5760x1080. And while, yes, 1440/1600p monitors are coming down in price, unless you're getting the Korean IPS displays that come with 0-1 years of warranty, that is still a $1,000-2,000 investment.

Essentially, even for Surround setups, as rare as they are, 5760x1080 is going to be the norm. And with benchmarks like those above...it's a good argument that there really is no benefit to having 4GB over 2GB right now, and probably for the next year or so, period.
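A quick way to see why resolution alone doesn't blow the 2GB budget: the color buffers themselves are tiny, and it's textures and AA samples that actually eat VRAM. A back-of-the-envelope sketch (triple-buffered 32-bit color assumed; everything else comes on top):

```python
# Framebuffer-only VRAM estimate: counts color buffers alone (32-bit RGBA),
# so treat these as floors - textures and AA samples dominate real usage.

def framebuffer_mb(width: int, height: int, buffers: int = 3) -> float:
    """MB used by `buffers` 32-bit color surfaces at the given resolution."""
    return width * height * 4 * buffers / 2**20

for w, h in [(1920, 1080), (2560, 1440), (5760, 1080), (7680, 1600)]:
    print(f"{w}x{h}: {framebuffer_mb(w, h):.0f} MB")
```

Even triple-wide 1600p stays well under 150 MB of pure framebuffer, which is why the 2GB-vs-4GB gap only shows up once texture settings and AA push total usage past 2GB.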


----------



## Kyouki

Well, this time I had my credit card info in and hit submit, and the site just looped, so I hit it again and it looped again, so I refreshed and it was back on notify. Give me a break - I had my credit card info in in less than a minute. I can't handle this stress, haha!


----------



## bitMobber

I wonder when Newegg will get theirs in stock?

It seems EVGA is filling their inventory before any other online retailers.


----------



## ceteris

Quote:


> Originally Posted by *jcde7ago*
> 
> Essentially, even for Surround setups, as rare as they are, 5760x1080 is going to be the norm. And with benchmarks like those above...it's a good argument that there really is no benefit to having 4GB vs 2GB right now and for probably the next year or so, period.


Crossing my fingers on that. Although I know for sure that when GK110 is released, it will definitely make some funky waves among GPU consumers - especially the people who dumped a bunch of cash on GK104s









I just hope that if there is an ASUS Mars III in the works, they will base it off GK110 and not GK104.


----------



## CapnCrunch10

Quote:


> Originally Posted by *jcde7ago*
> 
> That was less than 1 minute....lol.


Btw, I was running a 2500K and the offsets were +181 clock, +535 memory, and 115% power for the beastly 22,000+ GPU score. Those cards were awesome, but they ran too warm.

I was getting the top to hover at 1280-1290MHz and the bottom stood at 1310MHz.


----------



## Arizonian

Just landed a Newegg GTX 690 with next-day delivery for $1025. Confirmation email received. Payment charged and it's in packaging.









Cancelled the EVGA order.

Should be here by Saturday.


----------



## PCModderMike

Not like I can afford one of these beasts right now, but I was just checking out the different prices versus EVGA's $999....TigerDirect suggests I buy a 7770 since the 690 is out of stock??


----------



## kemsoff

Quote:


> Originally Posted by *PCModderMike*
> 
> Not like I can afford one of these beast right now, but just checking out the different prices versus EVGA's $999....TigerDirect suggests I buy a 7770 since the 690 is out of stock??


Lol do it! Surely it'll be comparable to the 690


----------



## bitMobber

Quote:


> Originally Posted by *Arizonian*
> 
> Just landed a Newegg GTX 690 next day delivery for $1025. Confirmation email recieved . Payment charged and in packaging.
> 
> 
> 
> 
> 
> 
> 
> 
> Cancelled the EVGA order.
> Should be here by Saturday.


They were in stock today on Newegg?


----------



## ceteris

Quote:


> Originally Posted by *Arizonian*
> 
> Just landed a Newegg GTX 690 next day delivery for $1025. Confirmation email recieved . Payment charged and in packaging.
> 
> 
> 
> 
> 
> 
> 
> 
> Cancelled the EVGA order.
> Should be here by Saturday.


Grats, Bro!

Going to open mine right now and start benchin'!


----------



## eviltinky

My TD order for the evga GTX 690 has shipped!!! Although, I'm still thinking of keeping my ASUS GTX 690 pre-order from ExcaliberPC.


----------



## CapnCrunch10

Quote:


> Originally Posted by *Arizonian*
> 
> Just landed a Newegg GTX 690 next day delivery for $1025. Confirmation email recieved . Payment charged and in packaging.
> 
> 
> 
> 
> 
> 
> 
> 
> Cancelled the EVGA order.
> Should be here by Saturday.


Awesome! Looks like Newegg finally lowered their price to MSRP. Wonder if the people who bought it for $200 more earlier can call and complain or something. Sometimes, they give you a portion of the difference via gift card if you have a good order history with them. However, due to the demand of this one, I doubt it.


----------



## TheRainMan

My 690 has shipped and is scheduled to arrive on Friday


----------



## Callandor

Quote:


> Originally Posted by *Lazz767*
> 
> Anyone who ordered off TD, mine just shipped. Looks like they received some today. Check your statuses!


Whoo-hoo! Mine too! Should be here tomorrow. Now I have the processor and video card for my new build, just need the case, power, motherboard, and ram.


----------



## ceteris

Here are my benches at stock. My ambient is 30.5C ATM, so I'm not going to bother pushing it at this time. Since this is my first dual-GPU card, I'm going to let my baby run easy at stock for a week before I start messing with OC'ing it.


*EVGA GTX 690 vs. EVGA GTX 680 x 2 SLI (stock settings)*
- 3DMark 11 has yet to be updated to recognize the 690.


*EVGA GTX 680 x 2 SLI (stock settings)*


*EVGA GTX 690 (stock settings)*

Here are some pics of a side-by-side visual comparison of two GTX 680s and a GTX 690:












The 680's are going back into their boxes and into the closet until my Heatkiller blocks and accessories come in


----------



## dred

Quote:


> Originally Posted by *eviltinky*
> 
> *My TD order for the evga GTX 690 has shipped!!!* Although, I'm still thinking of keeping my ASUS GTX 690 pre-order from ExcaliberPC.


Mine as well!!! I'm SOOOOOOOOOO HAPPPPY!!!!!

I'll prolly keep my pre-order sig. ed. with EVGA.

Do you think the "Limit 1" will affect registering the card(s)?


----------



## ceteris

Quote:


> Originally Posted by *dred*
> 
> Mine as well!!! I'm SOOOOOOOOOO HAPPPPY!!!!!
> I'll prolly keep my pre-order sig. ed. with EVGA.
> Do you think the "Limit 1" will affect registering the card(s)?


Nah. It's based on ordering from EVGA direct only.


----------



## dred

Quote:


> Originally Posted by *ceteris*
> 
> Nah. It's based on ordering from EVGA direct only.


Sweet! Thanks!


----------



## jcde7ago

Glad to see more people getting theirs in finally, and good to see a few more people managing to snag one as the e-tailers list them.









For those of you who have your cards....GET TO OC'ing!!!! We need to record some absolute MAX OCs, and whatever offsets you're using, to help out future owners (as well as ourselves, to get an idea of what we might or might not be able to hit).


----------



## pat031

Add me! Woot, just received mine


----------



## jcde7ago

Quote:


> Originally Posted by *pat031*
> 
> Add me ! woot just receive mine


Grats, and added!


----------



## tonyjones

i want one


----------



## iARDAs

Hey guys

Where can I purchase an Asus 690 GTX GPU?

I am thinking of picking up one in mid June so no rush.


----------



## Arizonian

Quote:


> Originally Posted by *iARDAs*
> 
> Hey guys
> 
> Where can I purchase an Asus 690 GTX GPU?
> 
> I am thinking of picking up one in mid June so no rush.


Hey Jardas!

Well, that's good that you can wait until June. At this time the ASUS and Zotac versions have not surfaced; by June we'll easily see them both. Zotac actually added the pic to their website recently, and ASUS announced in their news section that they will be manufacturing them as well.

All the normal e-tailers. Limited quantities, but not a limited series this time.

This year I won't be a sideline post reader like I was with the 590, as I finally scored / had the cash for the 690. Landed one yesterday afternoon on Newegg. They initially launched at $1199, but yesterday it came down to the suggested retail of $999.

Well, good luck as well. It will be nice if you do join us; as a previous dual-GPU owner, your experience will be an asset to this club.


----------



## iARDAs

Quote:


> Originally Posted by *Arizonian*
> 
> Hey Jardas!
> Well, that's good that you can wait until June. At this time the ASUS and Zotac versions have not surfaced; by June we'll easily see them both. Zotac actually added the pic to their website recently, and ASUS announced in their news section that they will be manufacturing them as well.
> All the normal e-tailers. Limited quantities, but not a limited series this time.
> This year I won't be a sideline post reader like I was with the 590, as I finally scored / had the cash for the 690. Landed one yesterday afternoon on Newegg. They initially launched at $1199, but yesterday it came down to the suggested retail of $999.
> Well, good luck as well. It will be nice if you do join us; as a previous dual-GPU owner, your experience will be an asset to this club.


Thanks for the response. Yes, I will wait until June - mid June or even late June, to be honest. I will be going on vacation to the USA and am thinking of grabbing one there. We don't have EVGA support here in Turkey, so I will be going for the ASUS version, since ASUS products purchased in the USA are usually covered under warranty here in Turkey - but first I need to call the Turkish department and ask.

I can sell my 590 for around $610, and I don't mind adding another $400 to get a 690.

I actually do NOT need the 690, as I am not really gaming in 3D anymore (at least for the moment), but the urge to purchase one is strong.

$1000 is probably the max I would pay for this GPU, so let's see. Here in Turkey these suckers will be no less than $1500-1600, and that's way too expensive.


----------



## bnj2

I hear rumors that there will be no non-reference design for 690, is that true?


----------



## iARDAs

Quote:


> Originally Posted by *bnj2*
> 
> I hear rumors that there will be no non-reference design for 690, is that true?


Probably true, as the 590 was the same way.

Every single 590 out there was a reference card.

I wonder if this was the case for the 295 back in the day?


----------



## Suit Up

My Gigabyte 690 came today. Took some photos.

























Now to bench/play


----------



## mxthunder

I am surprised that NVIDIA let Gigabyte put that sticker on the card!


----------



## dred

Pushed UP!!!!


----------



## MessiDonna

Had mine for just over 24 hours now and have literally been playing games and running benchmarks since I got it. Love it, love it, love it.


----------



## PCModderMike

Quote:


> Originally Posted by *iARDAs*
> 
> Probably true as 590 was the same way
> Every single 590 out there was a reference card.
> I wonder if this was the case for 295 back in the day?


I think it was the same for the 295 as well....I had a dual PCB version myself. The only change they ever made was later releasing a single PCB version of the 295.


----------



## jcde7ago

Can't wait to see some 670 SLI vs 690 benches now that the 670 is released...I just bought two, but they may be going to a couple of friends who aren't able to F5 all day in front of a computer....we'll see.


----------



## iARDAs

Quote:


> Originally Posted by *jcde7ago*
> 
> Can't wait to see some 670SLI vs 690 benches now that 670 is released...I just bought two, but these may be going to a couple of friends who aren't able to F5 all day in front of a computer....we'll see.


http://www.guru3d.com/article/geforce-gtx-670-2-and-3way-sli-review/

here you go

So far the benches suggest that 670 SLI = 690, even a bit better in some tests at 1080p gaming.

They cost $800, and a 690 costs $1000.

Tough decision, to be honest. I might lean towards two 670s, but owning a 690 is cooler, so it's a tough one.


----------



## jcde7ago

Quote:


> Originally Posted by *iARDAs*
> 
> http://www.guru3d.com/article/geforce-gtx-670-2-and-3way-sli-review/
> 
> here you go
> 
> So far the benchs suggest that 670 SLI = 690 even a bit better in some tests under 1080p gaming
> They cost 800$ and a 690 costs 1000$
> 
> tough decision to be honest. I might lean towards 2 670s but owning a 690 is cooler so its a tough one


Impressive benches indeed, thanks for sharing!

However, I'd be more interested in higher-resolution gaming....I'd like to think that people aren't dumping $800-1,000 on a 670 SLI/690 setup for a single 1080/1200p monitor.









But you're right, a 5-10% overall difference may not be justifiable for some people looking at a 690, and that is completely understandable. Personally, I love having only one card; aside from the engineering marvel that the 690 is, I really, really enjoy drawing much less power and, in turn, putting out less heat...and having ONE card with the power of two certainly helps, especially since the 670s use more traditional fans that are MUCH louder than the 690's fan at high speeds.

I don't think you can lose either way, but the subtle advantages of the 690 are very much worth it to me - and no matter how you slice it, it still currently has a slight performance edge, even if it's nothing too mind-blowing.


----------



## bitMobber

It's interesting to see 670 SLI and the 690 so close in benchmarks. Kinda makes you wonder whether the extra $200 is worth it for a single-card solution, granted you get lower power consumption, heat, and noise.

I'm hoping future drivers will squeeze more performance out of the 690; if not, I bet demand for it will drop a little. Why spend $1,000 when you can get the same performance for $800? I think in general people are more concerned with frame rates than power/heat/noise.


----------



## Arizonian

I'll have to concur that for some of us the 690 is the better option. Even with the added $200 premium, I preferred the single-card solution: less power consumption, less heat, only one PCIe slot taken up in my mid-tower (leaving an option for a sound card or TV tuner), and much quieter. Not to mention the build quality, with a thick PCB and 5 power phases per side, and hands-down amazing aesthetics for the modder. Makes for one elite rig.

I am on a single 1920x1200 monitor, but I'm also taking advantage of 120 Hz and love to 3D game, and 3D Vision cuts your 2D framerate by 40-45%. It will be awesome to stay above or close to 60 FPS in 3D.
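Rough math on that hit, just as a sketch; the 40-45% figure is my own ballpark for 3D Vision's overhead, not an official number:

```python
# Ballpark: 3D Vision renders a separate view per eye, costing roughly
# 40-45% of your 2D framerate. What does that leave?
def fps_in_3d(fps_2d, hit=0.425):
    """Estimate 3D framerate given a fractional performance hit."""
    return fps_2d * (1.0 - hit)

for fps_2d in (120, 100, 80):
    print(f"{fps_2d:>3} FPS in 2D -> ~{fps_in_3d(fps_2d):.0f} FPS in 3D")
```

So to hold close to 60 FPS in 3D, the card has to be pushing 100+ FPS in 2D first, which is exactly where the 690 comes in.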


----------



## iARDAs

Yeah Fudge it.

I will get a 690. Yes, two 670s are cheaper (though in my case I would need to buy an SLI mobo too, so the cost would work out the same as a 690).

Here is me praying that these cards will be available in June so I can grab the Asus brand.

But nonetheless, it is impressive that the 670 can perform so well too.

The 690 is aimed at a different audience, that's for sure: enthusiasts and people who want to do quad-SLI. So the $200 difference is understandable if you ask me.


----------



## Basti

Finally got mine! Hurrah!!







Called up a local retailer early this morning and luckily they had it in stock, and it was the only one. hehehehe
Time to cancel my preorder from NCIX.


----------



## ceteris

If GTX 670 SLI is as good as the 690, I would still go for the 690. Heck, I practically chose it over GTX 680 SLI. It leaves some slots open for possible premium sound cards and/or PCIe SSDs. Also, for 690 Quad-SLI there are plenty of decent high-wattage PSUs that can run your whole system, as opposed to needing two PSUs for some 4-way (and in some cases 3-way) setups. Even where a single PSU can handle it, the units with enough power cost almost as much as a new GPU.


----------



## bitMobber

Quote:


> Originally Posted by *Arizonian*
> 
> I'll have to concur that for some of us the 690 has better options and even added $200 premium I preferred the single slot card based on the fact of less power consumption, less heat, only taking up one single PCIe slot in my mid-tower leaving an option for a soundcard or tv tuner, and much quieter. Not to mention quality build on mother board thick PCB, 5 power phases per side, and hands down amazing aesthetics for the modder. Makes for one elite rig.
> I am on a single 1920x1200 resolution monitor but I'm also taking advantage of 120 Hz and love to 3D game and 3D Vision cuts what you get in 2D gaming down by 40-45% in FPS. Will be awesome gaming to stay above or close to 60 FPS in 3D.


Which 1920x1200 monitor do you have that also does 120hz?


----------



## Arizonian

Quote:


> Originally Posted by *bitMobber*
> 
> Which 1920x1200 monitor do you have that also does 120hz?


Correction on my part......I meant 1920x1080p.









I have two, one I gave to my kids was the Alienware AW2310 23" OptX and my main monitor now is the ASUS 27" VG278H. Both are 120 Hz and I meant at 1080p.









P.S. GTX 690 is now showing as shipped by Newegg.


----------



## V3teran

My card has shipped and will be here in the morning. I had to change my order to a Gigabyte, as EVGA cards are hard to get over here; nobody has them, and you have to wait a long time until they are in stock, which I can't do, especially as I placed my order within 20 minutes of them going live on the 3rd of May...**** EVGA

I will post some pics tomorrow when I get it. Please update my card status to Gigabyte.
Cheers


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> Correction on my part......I meant 1920x1080p.
> 
> 
> 
> 
> 
> 
> 
> 
> I have two, one I gave to my kids was the Alienware AW2310 23" OptX and my main monitor now is the ASUS 27" VG278H. Both are 120 Hz and I meant at 1080p.
> 
> 
> 
> 
> 
> 
> 
> 
> P.S. GTX 690 is now showing as shipped by Newegg.


Good to hear your monitor is 120hz...I was going to say, a GTX 690 for a SINGLE 1080p monitor....you're out of your mind!!!!


----------



## TheRainMan

Quote:


> Originally Posted by *jcde7ago*
> 
> Good to hear your monitor is 120hz...I was going to say, a GTX 690 for a SINGLE 1080p monitor....you're out of your mind!!!!


Then I guess I'm crazy too


----------



## kemsoff

Mine was delivered and I'm stuck here at work for 2 more hours


----------



## Shadowness

Quote:


> Originally Posted by *TheRainMan*
> 
> Then I guess I'm crazy too


Then I guess I am crazy as well.

Speaking of which, did either the 590 or the 295 get waterblocks? I am not really concerned about the different PCBs, but I kind of think it would be a shame to watercool my upcoming build and only have it cooling the CPU. So hopefully we see some waterblocks soon. I am thinking about a Hydro Copper full-cover block specifically; that would be an insta-buy for me.

I should be doing a log here on OCN about my upcoming build, as it happens to be a custom build. It's all plans and thoughts right now, but I might have my first part this weekend, as I happened to snag an i7-3960X for half the price, completely new, so....we will see how this turns out. Hopefully our family won't run out of money; I don't have much as a student, so I am being funded (obviously).


----------



## jcde7ago

Quote:


> Originally Posted by *Shadowness*
> 
> Then i guess i am crazy as well.
> Speaking of which, did either 590 or 295 get waterblocks ? I am not really concerned about the different PCBs, but i kind of think it would be a shame to watercool my upcoming build, and only have it cooling the CPU. So, hopefully, we see some waterblocks soon. I am thinking about Hydro Copper full cover block specifically, that would be an insta buy for me.
> I should be doing a Log here on OCN about my upcoming build, as it happens to be custom build. Its all plans and thoughts right now, but i might have my first part this weekend, as i happened to snag a i7-3960X for half the price, completely new, so....We will see how this turns out, hopefullly our family wont run out of money, dont have much as a student so i am being funded ( obviously ).


Yes, the 295 and 590 both got waterblocks. It always takes 3-6 weeks for various manufacturers to put out blocks for the dual-GPU cards.

*Also, I don't want anyone getting a 690 to feel offended if they're using a single 1080p monitor (one that isn't 120 Hz)*, I'm just saying that using a 690 for such a low resolution is kind of cheating yourself in a way....you can't really appreciate what a beast of a card it is until you're actually challenging it. There are plenty of setups right now that can give you the same gaming experience for a third to a quarter of the cost if you're on a single 1080p monitor.

I know this is OCN, so overkill is fine, but in this case, spending the cost of an entire rig ($1K) for performance on a non-120hz, single 1080p monitor is beyond insane, and that's just my two cents.









Think of it this way: my GTX 590 destroyed a single 1920x1200 monitor at the time, and then when I stepped up to a 2560x1440 monitor (an entirely different ballgame from 1080/1200p), my 590 struggled a LOT with the majority of demanding titles. After experiencing 2560x1440 gaming with my 690, I can honestly say how impressed and amazed I am at the ease with which it handles any game out there, completely maxed out with all the eye candy at the highest settings, holding 60 FPS with vsync on, or upward of 80-100 with it off, at 1440/1600p without breaking a sweat. That's the kind of feeling you only get when you're unleashing a high-end setup on higher-end resolution/Surround/120 Hz displays, where you know that spending $800-1,000 is the only way you're going to get that kind of performance.









Again though, if people are stoked about a 690 for 1080/1200p gaming, then hey, I gots no problems with that, and you guys are just as much a part of this club as anyone else.


----------



## TheRainMan

Ah, misread. My monitors are 120 hz. Additionally, if I so chose I could swap out my monitor for a U3011


----------



## jcde7ago

Quote:


> Originally Posted by *TheRainMan*
> 
> Ah, misread. My monitors are 120 hz. Additionally, if I so chose I could swap out my monitor for a U3011


Personally, unless you can maintain 120 Hz in every demanding game out right now (which can't be done on current GPU hardware), I'd go with a U3011; the visual fidelity upgrade is WELL worth it....it is a night-and-day difference going from 1080/1200p to 1440/1600p.


----------



## ceteris

I admit I'm on my old dual 23" college LCDs I got during Black Friday at Fry's. I only use one of them now because the other screen gets all funky when I play games fullscreen.

With all the heavy purchasing I'm doing on other parts, I haven't exactly put aside money for new ones, or done much research and review scouring. Do you guys have any general recommendations for monitors that work well in 3-monitor setups? I've had a couple of Asus 24" LEDs on my Amazon wishlist for a while, but it seems like IPS is all the rage on these forums.


----------



## Cheesemaster

Quote:


> Originally Posted by *jcde7ago*
> 
> Yes, the 295 and 590 both got waterblocks. It always takes 3-6 weeks for various manufacturers to put out blocks for the dual-GPU cards.
> *Also, I don't want anyone getting a 690 to feel offended if they're using a single 1080p monitor (one that isn't 120hz)*, i'm just saying that using a 690 for such a low resolution is kind of cheating yourself in a way....you can't really appreciate how beast of a card it is until you're actually challenging it. There are so many setups right now that can give you the same gaming experience for about 3-4x less cost if you're on a single 1080p monitor.
> I know this is OCN, so overkill is fine, but in this case, spending the cost of an entire rig ($1K) for performance on a non-120hz, single 1080p monitor is beyond insane, and that's just my two cents.
> 
> 
> 
> 
> 
> 
> 
> 
> Think of it this way - my GTX 590 destroyed a single 1920x1200p monitor at the time, and then when i stepped up to a 2560x1440 monitor (which is an entirely different ballgame than 1080/1200p), my 590 struggled a LOT with the majority of demanding titles. After having experienced gaming at 2560x1440p now with my 690, I can honestly say how impressed and amazed i am at the ease with which it handles any game out there, completely maxed out with all eye candy turned to the highest, maintaining 60FPS w/ vsync on or upward of 80-100 with it off, at 1440/1600p, without breaking a sweat. That's the kind of feeling you can only get when you're unleashing high-end setups on higher-end resolutions/Surround/120hz setups, where you know that spending $800-1000 is the only way you're going to get that kind of performance.
> 
> 
> 
> 
> 
> 
> 
> 
> Again though, if people are stoked about a 690 for 1080/1200p gaming, then hey, I gots no problems with that and you guys are just a part of this club as anyone else.


That's why I grabbed two 690s. I have 3-way 3GB GTX 580s right now running 5760x1080 with 3D Surround, and they struggle a little bit on ultra in BF3: I get an average of 55 FPS (3D off)... on a single monitor I was getting 100+ FPS, and 55 FPS doesn't cut it for me. Waiting to see what quad GTX 690s will do for NVIDIA Surround!


----------



## jcde7ago

Quote:


> Originally Posted by *ceteris*
> 
> I admit I'm on my old dual 23" college LCD's I got during Black Friday at Frys. I only use one of them now because the other screen gets all funky when I play games fullscreen.
> With all the heavy purchasing I'm doing on other parts, I haven't exactly put aside money for some new ones, or done some researching and review scouring. Do you guys have any general recomendations that will go well with 3 monitor setups? I've had a couple Asus 24" LED's on my Amazon wishlist for awhile, but it seems like IPS is all the rage on these forums.


I wish I could help, but I'm not a fan of multi-monitor gaming, personally. One, I hate bezels, and two, I just prefer a single, large, nice IPS display.









That said, as far as overall color balance and picture quality go, IPS displays win, and it's not even close. The visual fidelity is simply greater on an IPS display.

If I could make a recommendation, it's to sell your current monitors and go buy one of the 27" Korean IPS displays at 2560x1440 that use the same LG panels as the Apple Cinema Displays and the Dell Uxx11s. The only risks are the 1-year warranty, and that you may get a dead pixel or two unless you shell out $20-30 extra to have the seller check and guarantee 0 dead pixels, but honestly...for $317...well worth it. Mine arrived in perfect condition, and I've been using it for about 3 months now: no dead pixels, no problems, and it is a night-and-day difference in color balance and resolution from TN panels at only 1080/1200p. Easily the best hardware investment I've made, even more so than my 690.









If you don't like a single high-res IPS display, you'll have no problems returning or selling it, and then you can try something different out instead. But I have to say....2560x1440/1600p IPS displays are just leaps and bounds better than 1920x1080/1200p displays in every way possible, especially now that there's hardware that can EASILY drive them (like the 690).


----------



## Shadowness

Quote:


> Originally Posted by *TheRainMan*
> 
> Ah, misread. My monitors are 120 hz. Additionally, if I so chose I could swap out my monitor for a U3011


And once again, same. I actually have a top-dog Acer monitor here that's waiting for the new build, and it's 120 Hz. I pretty much plan to run every single freaking game on it at 120 Hz (120 FPS).


----------



## jcde7ago

Quote:


> Originally Posted by *Shadowness*
> 
> And once again, same. I actually have a top dog Acer Monitor here that waits for the new build, and is 120hz. I pretty much plan to run every single freaking game on it at 120hz ( 120fps )


120 Hz gaming is MUCH different from standard 1080p gaming at 60 Hz, lol. So no, the 690 will absolutely NOT be overkill for your needs.


----------



## Kyouki

It's up NOW!! I F5'd before getting the email! NEW OWNER FINALLY


----------



## kemsoff

It's here!!

Will get pics of it in my rig and such later tonight. This card is just WOW, so beautiful in person; the pics do it no justice.


----------



## hitman2169

where ?


----------



## Hokies83

Guess I'll grab one of these when the prices settle a bit, like the 590s did.

Till then, grats guys, the GPU is kick-arse!


----------



## jcde7ago

Quote:


> Originally Posted by *hitman2169*
> 
> where ?


Guessing it was up on EVGA, as they tend to put out stock daily between 3:30 - 4:30 Pacific time...and it literally lasts all of about 45 seconds before they're gone.


----------



## hitman2169

I will wait for my EVGA pre-order... no Diablo on launch day :$


----------



## hitman2169

So no more cards on sale at the Egg today?


----------



## Kyouki

Please add me to the club! It took me a while, but I was able to score one today. I F5'ed, it showed up at 5:58, I ordered, and then I got the e-mail at 6:02 saying they were in stock hahaha. This is why I was missing out the last few days: I was waiting around for that SLOW e-mail.


----------



## tahoward

Congrats, it's a really awesome card.


----------



## TheRainMan

Quote:


> Originally Posted by *Kyouki*
> 
> It up NOW!! i F5 before getting email! NEW OWNER FINALLY


Yay









Congrats on being Mr. Quickhands today ^^

Quote:


> Originally Posted by *jcde7ago*
> 
> Personally, unless you can maintain 120hz on every demanding game out right now (which can't be done on current GPU hardware), then i'd go with a U3011 - the visual fidelity upgrade is WELL worth it....it is a night and day difference going from 1080/1200p to 1440/1600p.


The U3011 is really, really tempting (I loved the colours), but I just can't imagine NOT working with multiple monitors now that I've tried it. Additionally, when I play fast-paced RTS games on the U3011 (Starcraft 2, League, Dota 2 - not as much the latter two), I occasionally lose track of the cursor >.>

The GTX 690 is scheduled to come tomorrow; gonna take a day off and overclock everything. My i7 980X is still running at stock


----------



## jcde7ago

Quote:


> Originally Posted by *TheRainMan*
> 
> Yay
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats on being Mr. Quickhands today ^^
> The U3011 is really really tempting (I loved the colours) but I just can't imagine NOT working with multiple monitors now that I've tried it. Additionally, occasionally when I play fast paced RTS games on the U3011 (Starcraft 2, League, Dota 2 (not as much the latter two)] I occasionally lose track of the cursor >.>


Well, if you're used to working with multiple monitors, then yeah, you should probably stick with them. I'm the exact opposite - I completely despise multiple monitors, period. If they made one long 7680x1440 monitor without bezels, I might reconsider.


----------



## jcde7ago

Quote:


> Originally Posted by *Kyouki*
> 
> Please Add me to the Club it took me a while but I was able to score one today! I F5ed it showed up at 5:58 I ordered then I got E-mail at 6:02 saying they were in-stock hahaha this is why I was missing out last few days waiting around for that SLOW E-mail.


Kyouki, you've been added! Grats, and also grats to everyone who received theirs or got an order in for one today.


----------



## Hokies83

I'm going to go with Asus for resale reasons; EVGA is a headache


----------



## jcde7ago

Quote:


> Originally Posted by *Hokies83*
> 
> Im going to go with Asus for re sale reasons Evga is a headache


Sorry to hear that, man....personally, I love EVGA and have never had any problems with them...I've had nothing but great support from them in the past, actually.

That said, your frustration is understandable...the whole "whoever clicks and types in their CC info the fastest wins" system is terrible, but there's really not much else they can do at the moment, given the demand for the 690. :/


----------



## Hokies83

Quote:


> Originally Posted by *jcde7ago*
> 
> Sorry to hear that man....personally, I love EVGA, and never had any problems with them...i've had nothing but great support from them in the past, actually.
> That said, your frustration is understandable...the whole "whoever clicks and types in their CC info the fastest, wins" system is terrible, but there's really not much else they can at the moment, given the demand for a 690. :/


Oh, I mean when you go to resell it... the EVGA warranty stays with the first owner, while Asus's stays with the card. That makes it much easier to resell, and much less of a headache if the person you sell it to has it die on them lol.


----------



## jcde7ago

Quote:


> Originally Posted by *Hokies83*
> 
> Oh i mean when you go to re sale it.. Evga Warranty Stays with the first owner.. Asus stays with the card makes it much more easy to resale and much less of a headache if the person you sell it to has it die on them lol.


Yeah, the warranty stays with the first owner, so you just have whoever bought it from you contact you, and you help them out with the RMA.


----------



## staryoshi

Consider this thread officially official. Fill it with useful information! Dual GK104 monsters...


----------



## jcde7ago

Quote:


> Originally Posted by *staryoshi*
> 
> Consider this thread officially official. Fill it with useful information! Dual GK104 monsters...


You rock staryoshi! Thanks!


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> Good to hear your monitor is 120hz...I was going to say, a GTX 690 for a SINGLE 1080p monitor....you're out of your mind!!!!


Well then you are going to think I am a f'ing lunatic. Not only do I have a 1080p 55" Sony XBR929 3D that I game on, the rig connected to it will have a 690....IN SLI! Overkill...you betcha! I know it sounds f'n insane, but there is a reason for my lunacy: The Witcher 2 with every possible thing set to Ultra plus ubersampling, and a few things tweaked in the .ini file to improve LOD, foliage, etc., runs from 40 to 60 FPS on two overclocked 680s paired with a 3930K OC'd to 4.6. While in some parts I get 60 FPS, there are a lot of areas where it jumps around between 40 and 60 FPS....and I want rock-solid 60 FPS. I know that 3x 680s in SLI will do it, as a few people I know have such a setup, but when I thought of heat, power consumption, etc., I figured SLI'ing 690s would do it easily, even if the fourth GPU isn't really used much; with 4x 680 SLI there is not much scaling when adding the fourth GPU. With newer drivers coming, I am sure SLI scaling will improve...but that is the reason for my lunacy, and I know some here will think I am crazy...but hey, it is my hobby and I work hard for my money, so I figured what the hell!!!

Also got my FedEx tracking number today!!! My cards should arrive sometime tomorrow afternoon. Already got my rig ready to roll and eagerly awaiting the new cards.


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> Well then you are going to think I am a f'ing lunatic then. Not only do I have a 1080p 55" Sony XBR929 3d that I game on, the rig connected to it will have a 690....IN SLI! Overkill...you betcha! I know it sounds f'n insane, but there is a reason for my lunacy....Witcher 2 with every possible thing set to Ultra Plus Uber Sampling and few things in the .ini file to improve LOD and foilage etc. runs from 40fps to 60fps on 2x680's overclocked and paired with a 3930k OC to 4.6. While in some parts I get 60fps, there are alot of areas where it jumps around from 40-60fps....I want rock solid 60fps. I know that 3x680's in SLI will do it as a few people I know have such a setup..but when I thought of heat, power consumption etc...I figured SLI'ing 690's would do it easily even if the 4th GPU was not really used too much as in a 4x680 SLI there is not much scalling when adding the 4th GPU. With newer drivers coming I am sure SLI scaling will improve...but that is the reason for my lunancy and I know some here will think I am crazy...but hey it is my hobby and I work hard for my money so I figured what the hell!!!
> Also got my Fed Ex tracking number today!!! My cards should arrive sometime tomorrow afternoon. Already got my rig ready to roll and eagerly awaiting the new cards.


Lol, I can see why; yes, you're still insane though.









That said, The Witcher 2 with ubersampling is just not fair at all...you're literally rendering each frame 3-4 times into a single, smoother, enhanced "uber" frame, and that's why it eats GPUs for dinner. That tech was also designed for FUTURE hardware when CD Projekt implemented it, so don't be surprised if we're still a ways off from actually experiencing it fluidly.

Although it has to be said, games that use DX11 tech like BF3, Metro, etc., can use different methods and achieve just as high, if not higher, visual fidelity.
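To put rough numbers on why it's so brutal (just a sketch, assuming ubersampling behaves like ordinary supersampling that shades each pixel several times; the exact multiplier is CD Projekt's business):

```python
# Rough cost model: supersampling shades every output pixel N times,
# so shader/fill work scales roughly linearly with the sample count.
def shaded_pixels(width, height, samples=1):
    return width * height * samples

normal = shaded_pixels(1920, 1080)      # standard 1080p rendering
uber = shaded_pixels(1920, 1080, 4)     # assume ~4 samples per pixel
print(f"ubersampling ~= {uber // normal}x the shading work")
print(f"same pixel count as rendering at {1920 * 2}x{1080 * 2}")
```

At a 4x multiplier, "1080p with ubersampling" is already roughly double the shading cost of a native 2560x1600 panel, which is why even a 690 has to work for it.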


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> Yeah, the warranty stays with the first owner, so you just have whoever bought it for you contact you and you help them out with the RMA. :thumb
> :


Actually, that is no longer the case. EVGA has changed their warranty program, and the coverage is now tied to the card, not the owner. This went into effect, I believe, last month. It is a really good system, as it makes it much easier to resell a card since it's covered by the serial number of the card, not the person who registered it.


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> Actually that is no longer the case. Evga has changed their Warranty program and the coverage is tied to the card, not the owner. This went into effect I beleive last month. It is a really good system as it makes it much easier to resell a card since it covered by the serial number of the card not the person who registered the card.


Hard to keep up with the changes associated with EVGA cards these days (I know they went to a 3-year by-serial system; not sure when it took effect or if there was a cutoff), so this is good to know! +rep.


----------



## Arizonian

Quote:


> Originally Posted by *jcde7ago*
> 
> Yeah, the warranty stays with the first owner, so you just have whoever bought it for you contact you and you help them out with the RMA.


New *EVGA Global Warranty*.
Quote:


> Introducing the New and Improved EVGA Global Warranty
> 
> *This process is valid only for products shipped from EVGA on or after July 1st, 2011.*
> 
> *Want to know if your serial number qualifies?* Check out our Guest RMA page to check your serial number's warranty!
> 
> EVGA has introduced a Global Warranty to ensure that no matter which region you live in you have the support that you look for when purchasing an EVGA product. The warranty will no longer belong to the purchaser but instead to the product as we believe in the workmanship and quality of our products and we are here to stand behind them. Please read more below to see exactly what this means.
> 
> Check out the newly updated official EVGA Warranty terms!
> 
> Key Features:
> 
> *Product warranty covers the product, not the user.*
> Registration is no longer required for RMAs with our Guest RMA process.
> Step-Up and Extended Warranties will be available for all original owners registered with the new global RMA system within 30 days of the purchase.
> If you move, you can send your product back to your local warranty center no matter what region you purchased it in.
> A new Standard Cross-Shipping RMA service is available.
> What does transferable warranty mean for me?
> 
> The warranty follows the product and not the person.
> The warranty is available for a time period of no more than 3* years from the EVGA shipping date.
> Long term warranties, purchased extended warranties and EAR plans do not transfer.
> Registration of the product under another user's account is possible.
> *This excludes -RX, -BR and -TR products.
> 
> What is the Guest RMA process and why does it help?
> 
> No registration is required to submit a request.
> Warranty submission is done through a simple online form without contacting tech support through the phone.
> You know if you're under warranty right away, if you're under warranty you can submit an RMA.
> You can work with EVGA's legendary tech support for troubleshooting if you wish, or you can skip and send your product in.
> Multiple registrations? This is madness!
> 
> A product can now be registered by the person in possession of the product.
> EVGA Support Tickets can now be submitted by the current owner.
> EVGA Software can now be downloaded by the current owner.
> RMAs can now be submitted by any owner for a period up to 3 years from the EVGA shipping date.
> What is the length of the warranty for a recertified "-RX" product?
> 
> A 1 year warranty is available on all recertified product from the date that the product shipped from EVGA or from the date of purchase if a receipt is available from a authorized EVGA reseller.
> Recertified products sent in for warranty will be repaired and if necessary replaced.
> Choose the Cross Shipment Method that best fits your needs.
> 
> Free Cross-Shipment Process
> Collateral for the full retail purchase price of the product is required.
> Return Shipping is paid by the customer.
> Outbound shipping is available free via UPS Ground shipping, you can pay for upgraded shipping at the time of the RMA request.
> A refund for the collateral will be processed within 7-10 business days once the product is received and checked into our warehouse.
> EVGA Advanced RMA (EAR)
> A plan can be purchased based on your shipping needs within the first 30 days of purchase.
> A prepaid ground shipping label is included with all shipments to ensure that all of the shipping is covered.
> Expedited shipping options are available including 2nd day, overnight, and Saturday Delivery options.
> Credit card information is needed for this process, but we do not charge any collateral for the replacement.*
> The plan is available for one use and can be purchased again on a replacement product within 10 days of receiving it.
> * A small test charge is placed on the credit card, but this is simply a hold and will not be charged unless the product is not returned in the original factory condition.
> 
> Explanation of warranty periods
> 
> Limited Lifetime, Limited 10 Year and Limited 5 Year Warranties
> The product must be registered within 30 days of the purchase date as shown on the receipt.
> It is required that a receipt is provided from an EVGA authorized reseller for verification prior to warranty approval.
> This warranty is available for customers located in North America and Canada.
> This warranty is non-transferable, but if you do decide to sell this product then you will give that purchaser a 3 year warranty from the date of shipment from EVGA.
> This Warranty does include the option for a Step-Up within 90 days of the original purchase date by the original owner.
> Limited 3 Year Warranty
> This warranty has no registration requirements.
> This warranty is valid for 3 years from the shipment date from EVGA unless a receipt is available from an authorized reseller.
> This warranty is transferable as the warranty belongs to the product.
> This warranty has the option of an additional 2 years or 7 years extension within 30 days of purchase date by the original owner from an authorized reseller with proof of purchase. This extended warranty opens the options for a Step-Up within 90 days of the purchase date as shown on the receipt provided by the EVGA Auth


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> Lol, I can see why, yes, you're still insane though.
> 
> 
> 
> 
> 
> 
> 
> 
> That said, The Witcher 2 with ubersampling is just not fair at all...you're literally rendering each frame 3-4 times into a single, smoother/enhanced, "uber" frame, and that's why it eats GPUs for dinner. That tech was also designed for FUTURE hardware when CD Projekt implemented it, so don't be surprised if we're still a ways off from actually experiencing it fluidly.
> Although it has to be said, games that use DX11 tech like BF3, Metro, etc., can use different methods and achieve just as as high, if not higher, visual fidelity in games.


I am actually also looking at the higher-res monitors that are available. Since I was more of a console gamer for most of my life, I do enjoy "big screen" gaming from the sofa. For me, using my rig on my entertainment setup is great. For games that support it I use a wireless 360 controller...for all other games I have a wireless mouse/keyboard that do a great job as well. Having a high-end rig is like owning a next, next-gen console today! I have never seen or tried a high-end 2560x1600 display, so I need to find someone that has one so I can check it out...who knows, I may use one of the spare rooms in the house to make a nice computer gaming center.


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> I am actually also looking at the higher res monitors that are available. Since I was more a console gamer for most of my life, I do enjoy the "big screen" from the sofa gaming. For me, using my rig on my Entertainment setup is great. For games that support it I use a wireless 360 controller...for all other games i have a wireless mouse/keyboard that do a great job as well. Having a high end rig is like owning a next, next gen console today! I have never seen or tried a high-end 2560x1600 display, so I need to find someone that has one so I can check it out...who knows, I may use one of the spare rooms in the house to make a nice computer gaming center.


Jumping into 2560x1440/1600p-land is choosing the red pill over the blue pill - you stay in Gaming Wonderland, and you will start to see just how deep the rabbit-hole goes.









And Arizonian, thanks for the clarification!


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> Jumping into 2560x1440/1600p-land is choosing the red pill over the blue pill - you stay in Gaming Wonderland, and you will start to see just how deep the rabbit-hole goes.
> 
> 
> 
> 
> 
> 
> 
> 
> And Arizonian, thanks for the clarification!


Lol...Since I have not done any research, what is the largest display size they make that supports 2560x1600 resolution? Do they make ones that do that resolution at 120Hz?


----------



## Hokies83

Yes, that system is better than before, but Asus has a lifetime warranty by the serial number.

Not bashing EVGA, their cards are great, I just like Asus's lifetime, no-hassle warranty lol.


----------



## blumpking

So I just got my Dell U3011 IPS display this afternoon. Wow. It really is a huge improvement over the old Samsung T260HD that I was using. So I fired up some BF3, set the res to 2560x1600, ultra everything with V-sync on....... IT IS GORGEOUS. Words cannot express the level of clarity. It's like being totally immersed in the game (my wife just rolled her eyes and went back to watching American Idol).

One question though: What kind of temps should I expect to be seeing? I was hitting 82 C during some of my later matches, and that worries me. My Antec 1200 has good airflow, but I game in a smaller room with the door shut and temperatures do rise over time. Any thoughts on a ballpark range? At 1920x1200 I was seeing around 76 C. I just don't want to shorten the card's life by subjecting it to such extreme temps. I am running bone-stock clocks/freqs btw for now.

Any help is appreciated. Thx:thumb:


----------



## blumpking

It goes without saying I have the EVGA GTX 690 card.


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> Lol...Since I have not done any research, what is the largest display size they make that supports 2560x1600 resolutions? Do they have ones that do that resolution and 120fps?


Thus far, there are no displays that can do 120Hz AND higher than 1080p.

As for 2560x1600 res. displays, the norm is usually 30" - I think they do make slightly larger ones, but you really don't need to go higher than 30 inches - when you're sitting 2-3 feet from a 30" monitor at an ultra-sharp resolution, you don't WANT the screen to be bigger, lol.








Quote:


> Originally Posted by *blumpking*
> 
> So I just got my Dell U3011 IPS display this afternoon. Wow. It really is a huge improvement over the old Samsung T260HD that I was using. So I fired up some BF3, set res. to 2560x1600, ultra everything with V-synch on....... IT IS GORGEOUS. Words cannot express the level of clarity. It's like being totally immersed in the game (my wife just rolled her eyes and went back to watching American Idol).
> One question though: What kind of temps should I expect to be seeing? I was hitting 82 Celsius during some of my later matches, and that worries me. My Antec 1200 has good airflow, but I game in a smaller room with the door shut and temperatures do rise over time. Any thoughts on a ballpark range? On 1920X1200 I was seeing around 76 C. I just don't want to shorten the cards life by subjecting it to such extreme temps. I am running bone stock clocks/freqs btw for now.
> Any help is appreciated. Thx:thumb:


Grats! And yes...1440/1600p is a sight that has to be seen and experienced in order to be fully appreciated.









As for temps, 82C is normal, especially if you're OC'ing at all. Make sure to use EVGA Precision X to set a custom fan profile, so it keeps the card cooler as temps increase - the default fan profile is fairly conservative, to be honest. You can raise the fan-speed curve by a lot and probably keep temps below 75C with ease, and STILL not hear the fan much at all. Again, the default auto fan speed is very conservative.

Links to Precision X are on the front page by the way.
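To make the fan-curve idea concrete, here's a toy sketch of what a custom profile does: map GPU temperature to fan speed by linear interpolation between user-set breakpoints. The breakpoint values below are made up for illustration, not EVGA defaults:

```python
# Hypothetical fan curve: (temperature C, fan %) breakpoints, made-up values.
CURVE = [(30, 30), (50, 45), (65, 65), (75, 85), (85, 100)]

def fan_speed(temp_c):
    """Fan % for a given temperature, linearly interpolated over CURVE."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]          # below the curve: idle speed
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]         # above the curve: full blast
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:      # interpolate within this segment
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

Raising the middle breakpoints is what "making the curve more aggressive" means: the fan spins up earlier, trading a little noise for lower load temps.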


----------



## Callandor

Got mine today, now just have to get the rest of the parts for the build

Appreciate the discussion on displays as I am running two 1680x1050 22in monitors at present

Which would be better, 120 Hz or 2560x1440?

JC did you get your monitor from eBay?


----------



## jcde7ago

Quote:


> Originally Posted by *Callandor*
> 
> Got mine today, now just have to get the rest of the parts for the build
> Appreciate the discussion on displays as I am running two 1680x1050 22in monitors at present
> Which would be better, 120 Hz or 2560x1440?
> JC did you get your monitor from eBay?


In my opinion, unless you're a super-ultra-competitive pro gamer, 120Hz is not necessary. Is it smoother? Yes, of course. But for me...since the most demanding games can't be run at a constant 120FPS with V-sync on, I prefer a larger display.

Also, IPS displays in general are far superior to TN panels. The color balance and quality isn't even close - IPS displays will put TN panels to shame.

That said, I'd definitely go with a 2560x1440/1600 IPS display - you get much higher resolution/visual fidelity, better color balance and overall picture quality, and you have the hardware to run games at that resolution with all of the eye candy turned up at a constant 60FPS, and even higher if you play with V-sync off. That (to me) is a far better overall gaming experience than what gaming at 120Hz can provide.

And yes, I did get my monitor on eBay. It arrived within 3 days in perfect condition. No dead pixels, no issues at all in the 3 months that I have owned it. And I only paid $380 for it - this same monitor can be had for like $320 on eBay right now, as well as other variants. Same LG panels found in Apple Cinema Displays and Dells, except there can be dead pixels unless you pay $20-30 extra to have the seller check for them and guarantee you zero dead pixels. Most of them also offer a few weeks to return for full refund or replacement, and they have a 1-year warranty.

I've said it many times in this thread, but I consider my monitor to be the absolute best hardware investment I've made in years. It really has made gaming THAT much more enjoyable.









Diablo 3 maxed out at 2560x1440p in 5 days? OH YEAH.









Again, this is all just my personal opinion though.


----------



## Callandor

Thanks JC. I think resolution trumps frequency for me as well.









Will definitely have to check out that monitor on eBay.


----------



## blumpking

Quote:


> Originally Posted by *jcde7ago*
> 
> Thus far, there are no displays that can do 120hz AND higher than 1080p.
> As for 2560x1600 res. displays, the norm is usually 30" - I think they do make slightly larger ones, but you really don't need to go higher than 30 inches - when you're sitting 2-3 feet from a 30" monitor at an ultra-sharp resolution, you don't WANT the screen to be bigger, lol.
> 
> 
> 
> 
> 
> 
> 
> 
> Grats! And yes...1440/1600p is a sight that has to be seen and experienced in order to be fully appreciated.
> 
> 
> 
> 
> 
> 
> 
> 
> As for temps, 82c is normal, especially if you're OC'ing at all. Make sure to use EVGA Precision X to set a custom fan profile, so it keeps the card cooler when temps are increasing - the normal fan profile is fairly conservative, to be honest. You can raise the fan speed curve by a lot and probably limit temps to below 75c with ease, and STILL not hear the fan much at all. Again, the default auto fan speed is very conservative.
> Links to Precision X are on the front page by the way.


As always, thank you for the quick response. Love checkin' for updates on this forum! That's good to hear that 82 C isn't abnormal under load. I will work on creating a more aggressive fan profile to keep the temps down. I did install Precision X and OC Scanner X when I installed the card, but being new to Nvidia cards I need to learn the 'ins and outs' of each. Looks like there are a crapload of options between the two.

You said you are going to get Diablo III, yes? I'm on the fence about getting it. Looks pretty sweet but I don't know if I have the time to devote to something like that.


----------



## Arizonian

Is anyone else as excited as I am to try Crysis 3 at launch with maxed-out settings on the GTX 690? I'm a huge Crysis fan and will warm up by going through Crysis and Crysis 2 in 3D Vision at maxed settings before the Crysis 3 campaign. I love the Crysis campaigns.









As for multiplayer, nothing comes close to 64-player Battlefield 3, except for Battlefield 4......WUT?


----------



## jcde7ago

Quote:


> Originally Posted by *blumpking*
> 
> As always, thank you for the quick response. Love checkin' for updates on this forum! That's good to hear that 82 C isn't abnormal under load. I will work on creating a more aggressive fan profile to keep the temps down. I did install Precision X and OC Scanner X when I installed the card, but being new to Nvidia cards I need to learn the 'ins and outs' of each. Looks like there are a crapload of options between the two.
> You said you are going to get Diablo III, yes? I'm on the fence about getting it. Looks pretty sweet but I don't know if I have the time to devote to something like that.


No problem! And yes, I was a big D2 player, so D3 is one of my most anticipated games in a long, long time. I definitely agree...if you do not have time to devote to a game like D3, I'd probably steer clear, or you're guaranteed to get addicted.








Quote:


> Originally Posted by *Arizonian*
> 
> Is anyone excited to try Crysis 3 on launch maxed out settings with the GTX 690 as me? I'm a huge Crysis fan and will warm up by going through Crysis and Crysis 2 in 3D Vision maxed settings to warm up for Crysis 3 campaign. I love the Crysis campaigns.
> 
> 
> 
> 
> 
> 
> 
> 
> As for multiplayer nothing comes close to 64 multiplayer of Battlefield 3 except for Battlefield 4......WUT?


I'm a big Crysis fan myself, and I enjoyed Crysis 2, but ONLY after the DX11 + High res patches...Crytek needs to step it up and make sure this game is built with DX11 from the ground up, and that it surpasses Crysis 1 and Crysis 2 in every way possible.

As for Battlefield 4...I've played every single Battlefield game ever made, and I would have to say that the prospect of a *Battlefield 2143* is far, far more intriguing than another modern-day FPS.


----------



## CapnCrunch10

I actually managed to snag another 690 so I sold the one I got recently because they were just way too profitable on eBay (I'm a terrible person, I know).

But you know what the weirdest part is? The person who bought it sent their payment through an @nvidia.com email address... I don't even know what to think, beyond being extremely confused that an Nvidia employee would purchase a card on eBay of all places.


----------



## tahoward

Quote:


> Originally Posted by *CapnCrunch10*
> 
> I actually managed to snag another 690 so I sold the one I got recently because they were just way too profitable on eBay (I'm a terrible person, I know).
> But you know what the weirdest part is? The person who bought it sent their payment through a @nvidia.com email address... I don't even know what to think beyond being extremely confused that a Nvidia employee would purchase a card on eBay of all places.


Damn merchant... j/k

Anyhow, that is interesting. Maybe the dude knew the serials of the highest binned GTX 690s and you just sold the golden GTX 690?


----------



## jcde7ago

Quote:


> Originally Posted by *tahoward*
> 
> Damn merchant... j/k
> Anyhow, that is interesting. Maybe the dude knew the serials of the highest binned GTX 690s and you just sold the golden GTX 690?


Or maybe he's an Nvidia employee who's doing a case study: buying off eBay for higher than MSRP and then reselling a second time at an even higher price for even more profit.


----------



## bitMobber

Quote:


> Originally Posted by *jcde7ago*
> 
> Good to hear your monitor is 120hz...I was going to say, a GTX 690 for a SINGLE 1080p monitor....you're out of your mind!!!!


Quote:


> Originally Posted by *jcde7ago*
> 
> I know this is OCN, so overkill is fine, but in this case, spending the cost of an entire rig ($1K) for performance on a non-120hz, single 1080p monitor is beyond insane, and that's just my two cents.
> 
> 
> 
> 
> 
> 
> 
> 
> Think of it this way - my GTX 590 destroyed a single 1920x1200p monitor at the time, and then when i stepped up to a 2560x1440 monitor (which is an entirely different ballgame than 1080/1200p), my 590 struggled a LOT with the majority of demanding titles. After having experienced gaming at 2560x1440p now with my 690, I can honestly say how impressed and amazed i am at the ease with which it handles any game out there, completely maxed out with all eye candy turned to the highest, maintaining 60FPS w/ vsync on or upward of 80-100 with it off, at 1440/1600p, without breaking a sweat. That's the kind of feeling you can only get when you're unleashing high-end setups on higher-end resolutions/Surround/120hz setups, where you know that spending $800-1000 is the only way you're going to get that kind of performance.
> 
> 
> 
> 
> 
> 
> 
> 
> Again though, if people are stoked about a 690 for 1080/1200p gaming, then hey, I gots no problems with that and you guys are just a part of this club as anyone else.


I don't think a 690 at 1080p is overkill. I personally would like to play future games at high frame rates without having to upgrade my hardware. After a few years, when games become more demanding, that same GPU will struggle to keep up at a higher resolution. You'll be forced either to upgrade the GPU or to downgrade to a lower-resolution monitor (costing you more in the long run). A 690 running at 1080p will last much longer; you will still get good frame rates 2-3 years out.

I do, however, plan to upgrade to a 24" 1920x1200 monitor soon; anything higher is too much for me. I've had a 30" 2560x1600 in the past and I can't game on it at all, mainly because after an hour my head begins to hurt and I get eye strain.


----------



## deusofhearts

Sorry for the noob question, but can someone briefly explain why the 690's 2x2GB of VRAM is not the same as the GTX 680's 4GB?


----------



## jcde7ago

Quote:


> Originally Posted by *bitMobber*
> 
> I don't think a 690 at 1080p is overkill*. I personally would like play future games at high frame rates without having to upgrade my hardware.* After a few years when games become more demanding that same GPU will struggle to keep up at a higher resolution. You'll be forced either to upgrade the GPU or downgrade to a small resolution monitor (costing you more in the long run). A 690 running at 1080p will last much longer, you will still get good frame rates 2-3 years out.
> I do however plan to upgrade to a 24" 1920x1200 monitor soon, anything higher is too much for me. I've had a 30" 2560x1600 in the past and I can't game at all on it. Mainly because after a hour my head begins to hurt and I have eye strain.


This logic isn't quite as black and white as it seems though - "future games" and "current hardware without upgrading" don't mix too well in the enthusiast world, unless you're talking about console ports.









The reason I think it's overkill is that those future games are either going to be harder on TODAY'S current-gen hardware - so the 690 might not flex as much muscle then as it does now - or the next generation of GPUs will have a mid-range card at half to 60% of the cost of a 690 that provides better performance and a better overall experience. There aren't any particularly demanding titles coming out anytime soon that look more demanding than what's already out, either. So in essence, you can spend $1K for a single 1080p monitor now, or you can spend $600 less for the same experience on something not nearly as overkill, and put that $600 toward next-gen hardware that will eclipse the 690 anyway when more demanding titles are out 8-12 months from now.

Again, I have absolutely no issues with a 690 being used for a single 1080p monitor, especially if one has the cash to burn and no need to upgrade any other part of their rig. I just think it's insane in that it's hard to experience the kind of power a card like the 690 has when it's limited to running games at a resolution that far lesser cards could destroy.








Quote:


> Originally Posted by *deusofhearts*
> 
> Sorry for the noob question but can someone briefly explain why the vram of the 690s 2x2gb is not the same as GTX 680 4GB?


It's 2GB per GPU, not shared. The 680's 4GB of VRAM is 4GB usable by the GPU. With the 690, it's 2GB per GPU, and each can only use a max of 2GB at a time. It's marketed as 4GB because other software and applications can actually take advantage of all 4GB - but games simply can't.


----------



## CapnCrunch10

Quote:


> Originally Posted by *tahoward*
> 
> Damn merchant... j/k
> Anyhow, that is interesting. Maybe the dude knew the serials of the highest binned GTX 690s and you just sold the golden GTX 690?


That would be tragic, but I don't post the serial numbers so that's not it. I messaged the person since I was curious and they replied that they bought it for "internal testing". I don't think that clarified anything. I tried looking up the email address and name, but nothing popped up. Was hoping for a title or something. Maybe I'll drop the HQ a phone call and do a little snooping. And then I'll demand they tell me the release date of GK110 in exchange for shipping the card, haha.


----------



## jcde7ago

Quote:


> Originally Posted by *CapnCrunch10*
> 
> That would be tragic, but I don't post the serial numbers so that's not it. I messaged the person since I was curious and they replied that *they bought it for "internal testing"*. I don't think that clarified anything. I tried looking up the email address and name, but nothing popped up. Was hoping for a title or something. Maybe I'll drop the HQ a phone call and do a little snooping. And then I'll demand they tell me the release date of GK110 in exchange for shipping the card, haha.


They actually do this, it's no lie. They will buy products back that are out in the wild just to see how they perform. And also, employees do not get some sort of secret stockpile that's separate from consumers - they gotta buy them just like everyone else (now discounts, that's a different story).


----------



## deusofhearts

Quote:


> Originally Posted by *jcde7ago*
> 
> It's 2GB per GPU, not shared. The 680's 4GB VRAM is 4GB useable by the GPU. With the 690, it's 2GB per GPU, and they can only use a max of 2GB at a time. It's marketed as 4GB because other software and applications can actually take advantage of all 4GB - but games simply can't.


Thanks!


----------



## Basti

Here are some of the pics of my card.




3dmark11 stock benchmark.
http://3dmark.com/3dm11/3393907;jsessionid=ywlccgeddpiu11nwzjknbyn7m

Heaven stock benchmark.


Heaven benchmark seems pretty low compared to others but I'm still happy with the card. *CHEERS*


----------



## V3teran

Finally....


----------



## Basti

I am having an issue with the 301.34 driver. When I shut down my computer, my monitor will shut off but the PC keeps running. I have to press the power button for 5 seconds to actually turn it off.
I know it's the driver because when I uninstalled it, everything went back to normal. When I reinstalled it, the problem came back.
Anyone here having the same issue?

Here are my PC specs:
i5-2500K
P8Z68-V/GEN3
Corsair Vengeance 8GB RAM
Corsair AX1200 PSU
GTX 690

Not sure if I can post this here but this is where my gtx690 brethren are!








Thanks!


----------



## Arizonian

Quote:


> Originally Posted by *deusofhearts*
> 
> Sorry for the noob question but can someone briefly explain why the vram of the 690s 2x2gb is not the same as GTX 680 4GB?


Well, think of it as when you have two separate cards in SLI.

The very first frame gets handled by the first GPU with its 2GB. The second frame gets handled by the second GPU with its 2GB. The third frame gets handed back to the first GPU. The fourth frame goes to the second GPU. And so on.

So in a single dual-GPU card the VRAM is mirrored, as each frame gets rendered by one GPU at a time with 2GB on each side. The RAM isn't combined just because it's on a single PCB. Each side of the card works separately to render one frame at a time.

Does that make it a little clearer?
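The round-robin scheme described above (alternate frame rendering) can be sketched in a few lines. This is just a toy model of the frame-to-GPU assignment, not real driver code:

```python
# Toy model of alternate frame rendering (AFR): two GPUs, each with its own
# private 2GB copy of the working set; frames alternate between them.
GPUS = [{"id": 0, "vram_gb": 2}, {"id": 1, "vram_gb": 2}]

def assign_frames(n_frames):
    """Map frame numbers to GPU ids round-robin, as AFR does."""
    return [(frame, GPUS[frame % len(GPUS)]["id"]) for frame in range(n_frames)]

# Even frames land on GPU 0 and odd frames on GPU 1. Each GPU still only ever
# addresses its own 2GB, which is why 2x2GB is not 4GB usable in games.
```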


----------



## Arizonian

Quote:


> Originally Posted by *Basti*
> 
> I am having an issue with the 301.34 driver. When I shutdown my computer, my monitor will shutoff but the pc still runs. I have to press the power button for 5s to actually turn it off.
> I know it's the driver because when I uninstalled it, everything went back to normal. When I reinstalled, the problem went back.
> Anyone here having the same issue?
> Here's my pc specs:
> I5-2500k
> p8z68v-gen3
> corsair vengeance ram 8g
> corsair ax1200 psu
> gtx 690
> Not sure if I can post this here but this is where my gtx690 brethren are!
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks!


Hi Basti, First - welcome to the 690 club and OCN.









Onto your dilemma. I don't have my card yet. Should be here today. As of yet we've had a few members starting to get the cards. No one has reported this yet. Very strange indeed.

Is your system overclocked? When you're forced to power down with a hard turn-off and restart, are you being asked to restart into default settings when you power back up? Is your RAM overclocked? I wonder if you have an unstable overclock.

If you're not overclocked: when you remove the driver, are you rolling back to the previous one? Have you gone to Nvidia directly and downloaded the 690 driver? When you did, did you choose the correct Windows platform? Double-check you didn't choose 32-bit if your system is 64-bit.

I'll let you know what I encounter when I'm up and running. Anyone with a card already please let us know if this situation is happening to you? I'm stumped.


----------



## Basti

Quote:


> Originally Posted by *Arizonian*
> 
> Hi Basti, First - welcome to the 690 club and OCN.
> 
> 
> 
> 
> 
> 
> 
> 
> Onto your dilemma. I don't have my card yet. Should be here today. As of yet we've had a few members starting to get the cards. No one has reported this yet. Very strange indeed.
> Is your system over clocked? When your forced to power down with a hard turn off and restart are you being asked to restart into default settings when you power back up from the hard turn off? Is your RAM over clocked? Wonder if you have an unstable over clock?
> If your not over clocked, when you remove the driver are you rolling back to the previous? Have you gone to Nvidia directly and down load the 690 driver? When you did, did you choose the correct Windows platform? Double check you didn't choose 32 bit if your system is 64 bit.
> I'll let you know what I encounter when I'm up and running. Anyone with a card already please let us know if this situation is happening to you? I'm stumped.


Thanks Arizonian. Everything's at stock. There's only one driver for the 690 as far as I know, so there's no way to roll back. And yes, the driver I downloaded was 64-bit.
Help anyone?


----------



## Supper

howdy 690 owners...

Today I tried OC'ing my 690 using my 680 settings (125 power target, +180 core, +425 memory) and it's unstable, despite the 690 being basically a 680 with a lower boost clock. I wonder why - any thoughts? Or is it maybe just a temperamental chip?


----------



## jcde7ago

Quote:


> Originally Posted by *Supper*
> 
> howdy 690 owners...
> today i try oc my 690 using my 680 setup (125 power, 180 core, 425 memory) and its unstable despite it being exactly like 680 with lower boost clock, i wonder why, any thoughts? or perhaps its chip problem, temperamental?


Not all 680s are created equal - so while your single 680 could pull off those clocks, it doesn't mean that both of the chips inside the 690 can individually. Since the OC is pushed across both cores, if one of them cannot maintain the clocks, the OC will still be unstable.

+180 core is also fairly high for a 690 if you're only using a 125% power target. I'd go with the max 135% power target (remember, it won't necessarily use all of that power, it just raises the headroom), then decrease your core offset in increments of 10 and memory in increments of 5 until you get something that is stable across both cores.
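That back-off procedure is easy to see as a loop. Here's a hypothetical sketch where `is_stable` is a stand-in for an actual stress test (a Heaven or 3DMark run) - there's obviously no way to probe real stability from a few lines of code:

```python
def find_stable_offsets(is_stable, core=180, mem=425,
                        core_step=10, mem_step=5):
    """Lower the core offset (then memory) until the stress test passes.

    is_stable(core, mem) stands in for a real stability check.
    """
    while not is_stable(core, mem):
        if core > 0:
            core -= core_step   # back off the core clock first
        elif mem > 0:
            mem -= mem_step     # then back off memory
        else:
            break               # nothing left to reduce
    return core, mem
```

With a stand-in test that "passes" at +150 core or less, the search walks 180 -> 170 -> 160 -> 150 and stops there.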

Also, post pics so I can add you to the list of club members.










*Basti and V3teran* - grats on getting your cards, guys!









And Basti, I will look into your issue, but at this time it's hard to tell since each rig is different...have you tried Googling around for similar instances of a failed shutdown? It almost seems like the PC is entering a sleep/hibernate state. Have you checked your Windows power settings to make sure you don't have PCIe states in some sort of weird mode? It is odd that only the 301.34 driver causes this, but it seems like something that would have been caught by now if it were really a driver issue.


----------



## Supper

Quote:


> Originally Posted by *jcde7ago*
> 
> Not all 680s are created equal - so while your 1x 680 could pull off those clocks, it doesn't mean that both of the chips inside the 690 can individually. Since the OC is pushed across both cores, if one of them cannot maintain the clocks, the OC still be unstable.
> 180core is also fairly high for a 690 if you're only using 125% power target. I'd go with the max 135% power target (remember, it won't necessarily use all of that power, it will just raise the headroom), decrease your core by increments of 10 and memory by increments of 5, until you get something that is stable across both cores.
> Also, post pics so I can add you to the list of club members.


Got it stable with 115 power target, +160 core, +485 memory, but I guess I will go back to using 680 SLI - better performance.








Thanks for the club invitation, but I'm not joining any club. I will just come and go, providing tips, info, and personal experience, if you don't mind.


----------



## jcde7ago

Quote:


> Originally Posted by *Supper*
> 
> got it stable with 115 power target, 160 core, 485 memory and i guess i will go back using 680 sli, better performance.
> 
> 
> 
> 
> 
> 
> 
> 
> thanks for club invitation but im not joining any club, i will just come and go and provide tips and info and sharing personal experience, if you dont mind.


Don't mind at all man, good luck with whatever you decide to stick with/do.


----------



## burningrave101

Amazon is taking orders for the ASUS GTX 690:

http://www.amazon.com/dp/B0080JWLFU/?m=ATVPDKIKX0DER&tag=hardfocom-20


----------



## Shadowness

Quote:


> Originally Posted by *Arizonian*
> 
> Hi Basti, First - welcome to the 690 club and OCN.
> 
> 
> 
> 
> 
> 
> 
> 
> Onto your dilemma. I don't have my card yet. Should be here today. As of yet we've had a few members starting to get the cards. No one has reported this yet. Very strange indeed.
> Is your system over clocked? When your forced to power down with a hard turn off and restart are you being asked to restart into default settings when you power back up from the hard turn off? Is your RAM over clocked? Wonder if you have an unstable over clock?
> If your not over clocked, when you remove the driver are you rolling back to the previous? Have you gone to Nvidia directly and down load the 690 driver? When you did, did you choose the correct Windows platform? Double check you didn't choose 32 bit if your system is 64 bit.
> I'll let you know what I encounter when I'm up and running. Anyone with a card already please let us know if this situation is happening to you? I'm stumped.


I think he is talking about the OpenGL error code 3 that suddenly appears while either playing or doing stuff in After Effects/Photoshop, which has been happening to me a lot lately. I've been experimenting on a project of my own and the damn thing keeps crashing every now and then.

Also, I know this isn't the thread for it, but thought I'd mention it - I got the CPU; the guy delivered it today, so the heart of the build is now sitting next to me. The O'Mighty 3960X


----------



## Supper

Quote:


> Originally Posted by *burningrave101*
> 
> Amazon is taking orders for the ASUS GTX 690:
> http://www.amazon.com/dp/B0080JWLFU/?m=ATVPDKIKX0DER&tag=hardfocom-20


temporarily OOS









BTW, I was just thinking: quad 680s (maximum OC) vs. dual 690s (maximum OC), which would win?


----------



## tahoward

Quote:


> Originally Posted by *Basti*
> 
> I am having an issue with the 301.34 driver. When I shutdown my computer, my monitor will shutoff but the pc still runs. I have to press the power button for 5s to actually turn it off.
> I know it's the driver because when I uninstalled it, everything went back to normal. When I reinstalled, the problem went back.
> Anyone here having the same issue?
> Here's my pc specs:
> I5-2500k
> p8z68v-gen3
> corsair vengeance ram 8g
> corsair ax1200 psu
> gtx 690
> Not sure if I can post this here but this is where my gtx690 brethren are!
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks!


I have the same issue. The OS will shut down but the fans and lights stay on. Sleep mode and restarting still work fine, though. This is a driver-related issue; I'm pretty sure NVIDIA is aware, just waiting on a WHQL release or a newer beta. Other than that, everything runs perfectly.


----------



## V3teran

Quote:


> Originally Posted by *Basti*
> 
> I am having an issue with the 301.34 driver. When I shutdown my computer, my monitor will shutoff but the pc still runs. I have to press the power button for 5s to actually turn it off.
> I know it's the driver because when I uninstalled it, everything went back to normal. When I reinstalled, the problem went back.
> Anyone here having the same issue?
> Here's my pc specs:
> I5-2500k
> p8z68v-gen3
> corsair vengeance ram 8g
> corsair ax1200 psu
> gtx 690
> Not sure if I can post this here but this is where my gtx690 brethren are!
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks!


I had that problem too. 301.34 is a horrible driver tbh. The only way I could fix the issue was to totally uninstall the driver, reboot the machine, then reinstall..... Installing the driver over the old one again and again didn't work. These drivers are OK, but they aren't great.


----------



## Arizonian

Quote:


> Originally Posted by *Supper*
> 
> temporarily OOS
> 
> 
> 
> 
> 
> 
> 
> 
> btw, i was thinking just now, quad 680 (maximum oc) vs dual 690 (maximum oc), who would win?


You don't have to wonder. The review links on the home page here show many tech-site reviews that put the 690 within 2%-6% of 680 SLI performance. Single 680s running in SLI will always win, just as single cards in SLI have beaten dual-GPU cards ever since the first dual GPU.

The only difference is that this series, unlike any prior, comes closer than any dual GPU EVER has to its single-GPU brethren. Nvidia has done very well with this dual GPU.

The obvious benefits of a dual GPU over two singles running together: much cooler temps, quieter fans, and it only takes up one PCIe slot, leaving room for a sound card or TV tuner, or even for a quad setup on motherboards that only have two PCIe slots.

It's also built on a higher-quality, motherboard-thick PCB with 5 power phases per GPU (versus 4 phases on the reference 680), vapor-chamber cooling rather than a plain heatsink, and a magnesium casing. At full throttle the 690 doesn't see over 50C. And as a cherry on top, the LED 'GeForce GTX' lighting on the side.

Personally I'm stoked; even with a little less performance than an SLI setup, this card fits my needs like a glove. It's my first dual GPU, so I'm a bit nervous, but only because of the stories about past dual GPUs, and from everything I'm reading the 690 has broken that mold.

Good luck with the SLI setup you're switching back to. BTW, I thought the 680 and 690 cards had a replacement-only policy; are you selling yours?


----------



## emett

Supper, the quad 680s will destroy the dual 690s.
From what I've seen, stock SLI 680s beat a 690 by around P1500.


----------



## Basti

There's nothing out of the ordinary on the settings under power management. Even played with the registry. Like Tahoward said, it's a driver issue, Nvidia will be releasing a driver in a week or two according to the other site.


----------



## bitMobber

Quote:


> Originally Posted by *jcde7ago*
> 
> This logic isn't quite as black and white as it seems though - "future games" and "current hardware without upgrading" don't mix too well in the enthusiast world, unless you're talking about console ports.
> 
> 
> 
> 
> 
> 
> 
> 
> The reason why I think it's overkill is because those future games are either going to be harder on TODAY'S current-gen hardware, and thus the 690 might not flex as much muscle now as it will 12 months from now, or the next generation of GPUs will have a mid-range GPU that's half/60% less than the cost of a 690 now, but will provide better performance and a better overall experience. There aren't any particularly demanding titles coming out anytime soon that look like they're more demanding than what's already out now, either. So in essence, you choose to spend $1K for a single 1080p monitor now, or you can spend $600 less for the same experience on something not nearly as overkill, and use that $600 on next-gen hardware that will eclipse the 690 anyway when more demanding titles are out 8-12 months from now.
> Again, i have absolutely no issues with a 690 being used for a single 1080p monitor, especially if one has the cash to burn, and has no need to upgrade any other part of their rig, either. I just think it's insane in that it's hard to experience the kind of power a card like the 690 has when it's being limited to running games on a resolution that far lesser cards could destroy.
> 
> 
> 
> 
> 
> 
> 
> 
> It's 2GB per GPU, not shared. The 680's 4GB VRAM is 4GB useable by the GPU. With the 690, it's 2GB per GPU, and they can only use a max of 2GB at a time. It's marketed as 4GB because other software and applications can actually take advantage of all 4GB - but games simply can't.


Your response was a little confusing to read. What I think you are saying is that the 690 is better used with a high-resolution monitor; otherwise you are wasting money, since you could get a cheaper graphics card now and still have the same performance/experience on a lower-resolution monitor.

The thing is that a cheaper GPU, two or three years down the road, will perform rather terribly in future titles. It depends on the person and how often they want to upgrade, but I'd rather spend the money now on the flagship model that will have a longer useable lifespan. A dual GPU, two or three years down the road, will still be decent enough that it won't warrant an upgrade, unless you are one of those enthusiasts who needs to upgrade every time something new comes out.


----------



## jcde7ago

Interesting...what's the common denominator here for you guys, then? I am currently not experiencing this issue at all, and it doesn't look like it's the PSU. Could it be the driver plus specific mobos?

Also, emett, four-way SLI with 680s will not "destroy" 690s. Quad scaling is already CRAP to begin with, so I'm not sure how much scaling you hope to achieve just because you have OC'd 680s. Diminishing returns, my friend.

And for what it's worth, the P1500 scores in favor of the 680s come down to higher overclocks on the 680s, and we don't even have a large enough sample size yet to judge the performance of people who already have 690s. I still haven't found my max OC on my 690, that's for sure.


----------



## V3teran

Quote:


> Originally Posted by *emett*
> 
> Supper the quad 680s will destroy the duel 690s.
> From what I've seen stock sli 680's beat a 690 by around p1500.


Who cares? If you want performance, go buy 680s; if you want the best-looking GPU money can buy, with massive exclusive ownership (e-peen), go with the 690. I know which I'd prefer.


----------



## jcde7ago









Quote:


> bitMobber: The thing is that cheaper GPU performance, two or three years down the road, will perform rather terrible with future titles. It depends on the person, how often they want to upgrade, but I'd rather spend the money now on the flagship model that will have a longer useable lifespan. The dual GPU performance, two or threes years down the road, will still be decent that it won't warrant a upgrade unless you are one of those enthusiasts that need to upgrade every time sometime new comes out.


You clearly misunderstood. I'm not talking about cheaper-GPU performance down the road. What I'm saying is that the mid-range cards of the next gen, at $500-600, are likely to surpass the performance of a 690, much as is the case now. So you can spend less and get a completely IDENTICAL experience with cheaper hardware now, then spend money on the next gen, which would STILL total less than a 690 costs now, and have BETTER performance than a 690 would provide down the road.

Clearly, the 690 will hold its own down the road. But by then even the mid-range cards will outperform it in current and future titles, and you could have saved the money you spent now for the next gen. My two cents. Nowhere did I say ANYTHING about the 690 being a waste of money on a single 1080p monitor, because frankly, I don't care to tell people what to do with their hard-earned money.


----------



## ceteris

I personally benched 2-way GTX 680 SLI vs. a GTX 690, and it went in favor of the GTX 680s. I think it is pretty safe to assume that 4-way SLI with GTX 680s is going to outperform dual GTX 690s in Quad-SLI. Given the MSRPs of both the GTX 680 and the GTX 690 (taking availability out of the equation, lol), people who are buying GTX 690s are not necessarily thinking that the GTX 690s are "going to destroy" the GTX 680s in performance. It could be aesthetics, power efficiency, PCIe slot availability, etc. I personally chose the GTX 690 because I want the power efficiency, want to keep some PCIe slots open, and pretty much expect that vendors will not stack 6-7 performance variants on top of the stock reference design. If anything, that last part is what pissed me off the most, as I already have a low threshold for playing the waiting + F5 game on e-tailer sites.

(NOTE: Yes I know quad-sli in either setup isn't going to run all games and programs flawlessly)


----------



## jcde7ago

Quote:


> Originally Posted by *ceteris*
> 
> I personally benched 2-way GTX 680 SLI vs. a GTX 690 and it went in favor of GTX 680's. I think it is pretty safe to assume that 4-way SLI with GTX 680's is going to outperform dual GTX 690's for Quad-SLI. Given the MSRP of both the GTX 680 and the GTX 690, with the assumption of availability out of the equation lol, people who are buying GTX 690's are not necessarily thinking that the GTX 690's are "going to destroy" the GTX 680's in performance. It could be aesthetics, power efficiency, PCI-E slot availability, etc. I personally chose going with the GTX 690 because I want the power efficiency, to keep some PCI-E slots open and the pretty much an expectation that they will not stack 6-7 performance variations on top of the stock reference. If anything, that is what pissed me off the most as I already have a low threshold for playing the waiting + F5 game on e-tailer sites.
> (NOTE: Yes I know quad-sli in either setup isn't going to run all games and programs flawlessly)


I completely agree - I think we all know by now that 680s in SLI will still win out by about 3-5%, or 2-3 FPS, in the end. But with all of the subtle benefits that come with the 690, it is absolutely worth the trade-off. I have no issue with my card being 20% more power-efficient than a pair of 680s in SLI at the cost of 2-4% performance, not to mention quieter and cooler.


----------



## Supper

Quote:


> Originally Posted by *jcde7ago*
> 
> Interesting...what's the common denominator here for you guys then? I am currently not experiencing this issue at all...and it doesn't look like it's the PSU. Could it be the driver and specific mobos?
> Also, emett, Four-Way-SLI with 680s will not "destroy" 690s. Quad-scaling is already CRAP to begin with, so i'm not sure how much scaling you hope to achieve just because you have OC'd 680s. Diminishing returns, my friend.
> And for what it's worth, P1500 scores in favor of 680s have to do with higher overclocks on 680s, and we don't even have enough of a sample size yet to determine the performance of people who already have 690s. I still haven't found my max OC on my 690, that's for sure.


SLI scaling for Kepler has improved compared to Fermi; e.g., 3-way 670 scales better than 3-way 570.
So 680 quad-SLI, given an up-to-date driver, should not perform like crap, since every 600-series card's driver is different.
Quote:


> Originally Posted by *Arizonian*
> 
> You don't have to wonder. Review links on home page here show many tech site reviews that can get the 690 within 2%-6% of 680 performance. The 680 single cards running in SLI will always win just like all single SLI cards have over dual GPU's since the first dual GPU.
> Only difference is this series unlike any other prior comes closest any dual GPU has EVER come within range of it's single GPU brethren. Nvidia has done very well with this dual GPU.
> The obvious benifits for dual GPU over the singles running together, much cooler temps as well as quieter fans, only take up one PCIe slot leaving room for a sound card / tv tuner / or even for a quad set up with mother boards that only have two PCIe slots.
> Not to mention the It's also made on higher quality mother board thick PCB, 5 power PWM's per side (over 4 phase on reference). Vapor chamber cooling rather than heat sink cooling. Full throttle the 690 dosent see over 50C. Magnesium casing. To top it off with a cherry LED 'GTX GeForce' lighting on the side.
> Personally I'm stoked, even with a little less performance than an SLI set up this card fits my needs like a glove. First dual GPU so I'm a bit nervous but only due to the stories of past dual GPU's which this 690 broke the molds from everything I'm reading.
> Good luck with your SLI set up your switching back to. BTW I thought the 680 & 690 cards have a replacement only policy, are you selling yours?


Sorry my friend, I'm a keeper









Side note: the GTX 680 will be fully available at the end of this month, as my favorite retailer told me, and that goes for the 690 as well.


----------



## jcde7ago

Quote:


> Originally Posted by *Supper*
> 
> SLI scaling for Kepler has improved compare to Fermi, e.g 670 3 way scale better than 3 way 570.
> So 680 quad SLI, given the driver is up to date, should not perform like crap hence every 600 card driver is different


This is where I disagree. Every review I've seen of multi-GPU 600-series scaling pretty much shows that the benefits are in 2-card setups and drop off significantly in 3-way setups, and even more so with 4-way/quad SLI. Yes, Kepler scales better than Fermi, no doubt about it. However, the actual GAINS from 2-way SLI to 3-way, and especially to 4-way, are extremely marginal compared to the overall cost.

And only new-release cards get special/specific drivers; the GTX 690 will eventually fall in line with the rest of the GK104 variants in the same driver set in the next release.


----------



## Supper

Quote:


> Originally Posted by *jcde7ago*
> 
> This is where I disagree. Every review i've seen of multi-GPU 600 series scaling pretty much shows that the benefits are in 2-card setups and drop off significantly in 3-way setups and even moreso with 4-way/quad SLI. Yes, Kepler scales better than Fermi, no doubt about it. However, the actual GAINS from 2-way SLI to 3-way, and especially to 4-way, are extremely marginal compared to the overall cost.
> And only new-release cards get special/specific drivers; the GTX 690 will eventually fall in line with the rest of the GK104 variants in the same driver set in the next release.


From what I have tested so far, 3-way SLI scaling varies by game.
BF3 with three 680s shows about a 5-7 FPS increase, and Crysis 2 about 2-3 FPS, due to driver constraints.
3-way 670, on the other hand, scales better than the 680 thanks to a better driver:
BF3 and Crysis 2 show a 20 FPS increase.

Running at 2560x1600


----------



## V3teran

jcde7ago, can you change my card to a Gigabyte please? Thanks.


----------



## V3teran

Anyway, some benchies. All benches were done at stock GPU settings!

3DMark Vantage, PhysX on









PhysX off









3DMark 11, PhysX on









PhysX off









Heaven Bench DX9









DX10









DX11









I will do overclocking benchies once I find the max threshold of my card. Stay tuned!


----------



## jcde7ago

V3teran, I've gone ahead and changed the brand per your request (as well as your proof pic). Also, thanks for the benchmarks!


----------



## TheRainMan

What an annoyance: I installed it okay, but now .mkv files play as a black screen in MPC-HC.

No idea why


----------



## ceteris

Quote:


> Originally Posted by *TheRainMan*
> 
> What an annoyance, I installed it okay but now .mkv files play as black screen in MPC-HC
> No idea why


No hentai pr0n tonight?


----------



## jcde7ago

Quote:


> Originally Posted by *TheRainMan*
> 
> What an annoyance, I installed it okay but now .mkv files play as black screen in MPC-HC
> No idea why


Weird; I watched Underworld: Awakening last night just fine, and it was an .mkv file (granted, I'm using VLC 2.x). Could there be some sort of GPU-acceleration setting that needs to be re-enabled?


----------



## TheRainMan

Quote:


> Originally Posted by *ceteris*
> 
> No hentai pr0n tonight?


>.> All I wanna do is watch Steins;Gate and Game of Thrones but even this simple pleasure has been taken from me!

Quote:


> Originally Posted by *jcde7ago*
> 
> Weird, I watched Underworld: Awakening last night just fine, and it was an .mkv file (granted, i'm using VLC 2.x). Could there be some sort of GPU acceleration setting that needs to be re-enabled?


Perhaps. VLC 2.x + MKV files work fine for me; it's just MPC-HC + madVR (I used this guide: http://haruhichan.com/wpblog/index.php/205/hi10p-info-guide.html)


----------



## bitMobber

Quote:


> Originally Posted by *jcde7ago*
> 
> You clearly misunderstood. I'm not talking about cheaper GPU performance down the road. What i'm saying is, the mid-range cards of the next gen, at $500-600, is more likely to surpass the performance of a 690, much like the case now. So you can spend less and get a completely IDENTICAL experience with cheaper hardware now, and spend money in the next gen that would STILL be less than a 690 would cost now, and have BETTER performance than what a 690 would provide down the road.
> Clearly, the 690 will hold its own down the road. But by then, even the mid range cards will outperform it in future and current titles, and you could have saved the money you spent now on the next gen. My two cents. Nowhere did I say ANYTHING about the 690 being a waste of money on a single 1080p monitor cause frankly, i don't care to tell people what to do with their hard-earned money.


But the newest mid-range card (the 680) still does not surpass the high-end card of the last generation (the 590).

Plus, looking at Nvidia's launch prices, it should cost about the same in the long run: pay $500 now for a mid-range card and another $500-600 in the future, and it will cost the same as a 690 now.


----------



## Stateless

Hey everyone,

The EVGA utility that lets you change the brightness of the GTX 690 logo is out. Please note that this utility will only work with 690s from EVGA. Here is a link to the post on EVGA.com, which contains the download link for the utility as well. It's a pretty nifty little program from what I've read.

http://www.evga.com/forums/tm.aspx?m=1590564


----------



## TheRainMan

I've found the problem (it's madVR), but I haven't a clue how to fix it. When I change the renderer to something like Haali's, videos play fine (but look worse than with madVR). One of the requirements for madVR to work correctly is "a graphics card with full Direct3D9 hardware support," so...


----------



## jcde7ago

Quote:


> Originally Posted by *bitMobber*
> 
> But the newest current mid-range card (680) still does not surpass a high-end card of last generation (590).
> Plus looking at Nvidia's launch prices then it should cost about the same in the long run. Pay $500 now for a mid-range card and another $500-600 in the future, it will cost the same as a 690 now.


But you can see that the discrepancy is what...3-4 FPS tops? That's nothing a 680 can't make up (and then some) with even a slight OC.







The 590 was also $700-750 when it launched, compared to the 680 now at just $500. $200 is still a great deal of money saved. Even more so if you get a 670, which only lags behind the 680 by about 2-3 FPS tops, which it EASILY makes up with very high OCs. So that could be $300 saved right there, and the 670 is a mid-range card.

Again, not a waste if you use the 690 on a single 1080p monitor, but you are definitely holding it back from realizing its true potential.








Quote:


> Originally Posted by *Stateless*
> 
> Hey everyone,
> The Evga Utility that allows you to change the brightness of the GTX 690 logo is out. Please note that this utility will only work with 690's from Evga. Here is a link to the post on Evga.com and it contains the download link for the Utility as well. It is pretty nifty little program from reading up on it.
> http://www.evga.com/forums/tm.aspx?m=1590564


Thanks for posting...I will add the link to the actual app to the front page.


----------



## TheRainMan

Okay, I fixed the issue by disabling the "use a separate device for presentation (Vista/Windows 7 only)" option under "rendering" -> "general settings".


----------



## Arizonian

Okay, I got my GTX 690. I took a 15-minute break to install it. I plugged it in, and the message I'm getting is to make sure the PCIe power supply is plugged into the graphics card. I checked that the two 8-pin connectors were connected properly. I also checked that the card was fully seated in the PCIe slot. I tried both dual-link DVI ports and double-checked the DVI cable's connection to the monitor.

Am I missing something here? It's a pretty straightforward installation that I've done many, many times. Any suggestions? Did I receive a DOA card?
The LED lighting kicks on; if the power cables weren't plugged in properly, it would not have lit up. I checked the PSU behind the tower, and my extensions were tightly secured.

I won't have time to try anything else until my next break at about 3pm.


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> Okay I got my GTX 690. I took a 15 minute break. I plug it in and the response I'm getting is to make sure the PCIe power supply plugged in my PC graphics card. I check to make sure the two 8 pin connectors were connected properly. I also checked that the PCI E slot connection was fully in plug-in. I tried both dual DVI ports. doubled checked the connection to the monitor dual DVI cord.
> Am I missing something here? It's s pretty straight foward installation. Done many many times. Any suggestions? Did I receive a DOA card?
> The LED lighting kicks on. If the power cable wasn't plugged in properly it would have not lit up. I checked the PSU behind the tower and my extensions were tightly secured.
> I won't have time to try anything else till my next break at about 3pm.


Interesting...have you tried a different PSU? It could be the first reported DOA GTX 690, but I would exhaust all other options first, and the PSU is the first thing you want to rule out beyond a shadow of a doubt.

Try another PSU first and see what happens, or if you don't have one, try different power cables?


----------



## Michalius

Got it test fitted yesterday as I had to see how it looked with the motherboard and RAM.



Don't like the H100 there at all. Ordered an Alphacool ST-30 360mm radiator, Mayhem's Pastel Green concentrate, XSPC Raystorm, and some primochill clear tubing. Will be much happier with that.

Now I just need the fricken Shinobi XL white to release. Come on Bitfenix, pleeeeeeeeeeeease.


----------



## Arizonian

Quote:


> Originally Posted by *jcde7ago*
> 
> Interesting...have you tried a different PSU? It could be the first reported DOA GTX 690, but i would exhaust all other options first....and the PSU is the first thing you want to rule out without a shadow of a doubt.
> Try another PSU first and see what happens, or if you don't have one...try different power cables?


Figured it out. One of the BitFenix 8-pin cables was bad. I swapped in a black one and now it's working. Loading new drivers as I type.


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> Figured it out. One of the Bitfenix 8 pin cables were bad. Had a black one and now it's working. Loading new drivers as I type.


Hehe, told you man!!!









And Michalius, that mobo + 690 color combo is indeed pretty awesome looking.


----------



## Landon Heat

Got a call from the Egg. Those of you who bought this card from Newegg on the 3rd will get a $200 gift card. I guess too many people complained, so they decided to price-match everyone who paid $1,200 for this card.


----------



## V3teran

Quote:


> Originally Posted by *jcde7ago*
> 
> V3teran, i've gone ahead and changed the brand per your request (as well as your proof pic), and also, thanks for the benchmarks!


Thank you, mate. Also, may I suggest that it would look better as "Proof" instead of "Proof?"? Take away the question mark for people who have confirmed proof pictures, unboxings, etc.; it just makes the presentation better :thumb:


----------



## Arizonian

Okay, I figured out one first-day problem, and a new one has arisen.

When I shut down Windows through the Start menu, Windows shuts off but the tower will not power off. I read that someone else was having this issue; I can't remember if it was this thread or the 680 thread.

What could be causing me to not be able to power off?

It has no problems going to sleep and waking up from sleep.


----------



## Stateless

I received a big package today...I wonder what it could be???



Let's open up and see what could possibly be in such a large box....


Hmmm...a blue LED light bar and a 20pin USB Converter...but something seems to be beneath it....


Now that is some packaging...air bubbles..damn, now I need to spend a few hours popping every one of them because it is so much fun....2 hours later these are revealed...


And, to be added to the group of owners, the required little proof pic...


Now I am off to see what these look like on the inside..(sounds a little perverted doesn't it????)...lol


----------



## V3teran

Quote:


> Originally Posted by *Arizonian*
> 
> Ok figured out one first day problem and the new one has risen.
> When I shut off Wndows through window start up menu, windows shuts off but not the tower will not shut off. I read someone else was having this issue and I can't remember if it was this thread or the 680 thread.
> What could be causing me to not be able to power off?
> It has no problems going to sleep and waking up from sleep.


It's the 301.34 driver, mate; it's rubbish. I had that issue also.
I'm now using this driver; it's so much better, and the PC shuts down properly.
http://download.gigabyte.ru/driver/vga_driver_nvidia_(win7)vista64_301.33.exe


----------



## Arizonian

Quote:


> Originally Posted by *V3teran*
> 
> Its the 301.34 driver mate its rubbish,i had that issue also.
> Im now using this driver its so much better and the pc shuts down properly.
> http://download.gigabyte.ru/driver/vga_driver_nvidia_(win7)vista64_301.33.exe


A Gigabyte driver will work on an EVGA card? Did Gigabyte modify it?

I found this thread on EVGA website.

http://www.evga.com/forums/tm.aspx?high=&m=1589748&mpage=1#1590921

It's not a big problem, because I mostly don't shut off my PC; I usually just put it to sleep and wake it when I need it. I may try that driver. Thank you for your help, +1 rep.


----------



## V3teran

All these cards are the same, mate; they just come in different boxes, and the companies offer different warranties. The driver will work on all 690 cards.
Thanks for the rep.


----------



## jcde7ago

I have added everyone who needed to be added and have edited the "Proof?" section....grats, Stateless, those two in Quad-SLI are going to rock!









Also, as for the system-not-shutting-down-all-the-way bug: that seems like it might be a combination of drivers and certain mobos. 680 and 680 SLI owners are encountering the same problem, so it's not just the 690, and it's not just the 301.34 drivers.


----------



## V3teran

This driver works much better than 301.34.
http://download.gigabyte.ru/driver/vga_driver_nvidia_(win7)vista64_301.33.exe
PC shutdown works, and I just did a quick Heaven bench; all is good!

Also, why can't I add my signature details like you guys have? Do I have to have a certain number of posts before I can do that?
Cheers


----------



## TheRainMan

I'm gonna give those drivers a shot, because I'm getting significantly lower FPS than I see on benchmark sites despite having similar specs (e.g., 50 FPS in Metro 2033 at 1920x1080, which is just stupid).

Additionally, to put your rig in your signature, simply navigate to your profile, click "add a rig," and after you fill out your specs, go to the signature section and choose to add your rig to your signature.


----------



## V3teran

thanks


----------



## Arizonian

OK - JCD - officially in the club.

Ran my first run 3DMark11 with Validation links. Posted on Page 1 - *LINK*

Started with 1000 MHz core, stock 1502 MHz memory, GPU Boost to 1130 MHz. *P15240*

I'm not even close to finished will be updating as I have time. I did this on my lunch break.










My mobo is far from configured properly; I've got to delve into it. I'm not even sure if I'm on PCIe 3.0 ATM; I think it's set to 2.0, and a dual GPU benefits from 3.0. Also, I have a mild OC on the CPU, but I don't think that's done properly either.

Give me a week and I'll have this purring.


----------



## ceteris

Quote:


> Originally Posted by *Arizonian*
> 
> OK - JCD - officially in the club.
> Ran my first run 3DMark11 with Validation links. Posted on Page 1 - *LINK*
> Started with 1000 MHz Core Stock 1502 MHz Memory - GPU Boost 1130 Mhz. *P15240*
> I'm not even close to finished will be updating as I have time. I did this on my lunch break.


That's a pretty nice score. Did you OC your IB too?


----------



## Arizonian

Quote:


> Originally Posted by *ceteris*
> 
> That's a pretty nice score. Did you OC your IB too?


A mild 4.2 on the CPU, and I know the GPU can be boosted further; it should hit a 1200 MHz core OC with GPU Boost. See the second post on the first page for the view.


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> OK - JCD - officially in the club.
> Ran my first run 3DMark11 with Validation links. Posted on Page 1 - *LINK*
> Started with 1000 MHz Core Stock 1502 MHz Memory - GPU Boost 1130 Mhz. *P15240*
> I'm not even close to finished will be updating as I have time. I did this on my lunch break.
> 
> 
> 
> 
> 
> 
> 
> 
> My mobo is far from configured properly. I've got to delve into it. I'm not even sure if I'm at PCIe 3.0 ATM. I think it's set to 2.0 and a dual GPU benifits from it. Also I have a mild OC on the CPU but don't think it's done properly either.
> Give me a week and I'll have this purring.


Grats man, you're just a couple hundred points behind me!!! I was just over P15430~


----------



## emett

jcde7ago, I saw one of your members pull about 13,800 with a 2500K at 4.8 and a 690 at stock, so I ran 3DMark 11 for a comparison.
I scored about 15,300 with a 2600K at 4.5 and SLI 680s at stock. Anyway, believe what you want; I'm not trying to big-note myself. I was after a 690 but couldn't track one down. I'm sure the new drivers will pull out a bit more performance.


----------



## V3teran

What settings are you guys using for your overclocking?


----------



## kemsoff

Just wanted to throw up my first benchmark

Both the 3930k and 690 are stock, no oc, nothing what so ever.


----------



## jcde7ago

Quote:


> Originally Posted by *emett*
> 
> jcde7ago saw one of your members pull about 13,800 with 2500k 4.8 and a 690 at stock, so I ran 3d mark 11 to see a comparison.
> I scored about 15,300 with settings of 2600k 4.5 and sli 680s at stock. Anyway believe what you want , I'm not trying to big note.. I was after a 690 but couldn't track one down.. I'm sure the new drivers will pull out a bit more performance..


I don't have a problem believing that at all...my score with a stock 690 was at P15,125 (it's some pages back, I posted it). With a modest core and mem OC (not max, haven't found that yet), I raised it to P15438. Again, it's natural that the 680s in SLI will pull ahead, there's no doubt about that - but we're talking literally 3-4% or a few hundred points in 3DMark11. That's really not a big discrepancy at all. And again, I will take a 3-4% performance loss with a 690 and draw 20% less power, output less heat, and have it be much quieter all in a single card over a pair of 680s any day. That said, there's nothing more to prove. 680s in SLI will always come out on top, ever so slightly, and I don't think anyone is challenging that at this point.
Quote:


> Originally Posted by *kemsoff*
> 
> Just wanted to throw up my first benchmark
> Both the 3930k and 690 are stock, no oc, nothing what so ever.


Definitely a sweet score, man. Again, mine was P15,125 at stock 690 clocks with a 4.7GHz OC on the 3930K. We've still got a lot of room left...in fact, I think only one person here has found a max OC for their 690, and I can't even confirm it right now. We'll get a better idea in the weeks ahead, I'm sure.


----------



## Kyouki

Excited! Got my tracking number now just gotta sit and wait for it!


----------



## V3teran

Quote:


> Originally Posted by *V3teran*
> 
> What settings are you guys using for your overclocking?


Bump


----------



## Arizonian

I'm not sure why 680 SLI owners feel like they've got something to prove in the 690 club. Not one of us said the 690 is on par. All we've said is that, from all the reviews out, the 690 has come closer to its SLI cousin than any other dual-GPU card before it. Nvidia is doing as well as AMD did with their 6990 coming very close to two 6970's last year.

Personally I like this dual-GPU card over an SLI setup for reasons other than performance. It's really a high-quality gaming card: motherboard-thick PCB, 10 power PWMs (5 per side), a single PCIe slot space, far lower power draw, lower temps, an amazing magnesium shroud, and vapor-chamber cooling instead of a heatsink and fan.

By the way my idle temp has been 31C today. A couple degrees cooler than my single 680 in the kids rig.


----------



## jcde7ago

Quote:


> Originally Posted by *V3teran*
> 
> Bump


I just keep Power Target at 135%, and then play around with the core and mem - these should all be able to hit +140MHz core and +250MHz mem - so those are pretty good starting points.

Once you start crashing, lower the memory by 25-50MHz, maintain the core, and try again. If it still fails, lower the memory by another 25-50MHz, drop the core by 10MHz, and rinse and repeat.
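For what it's worth, that trial-and-error loop can be sketched as code - a rough illustration only, where `is_stable` is a hypothetical stand-in for actually running Heaven/3DMark11 at the given offsets, and the starting points and step sizes are the ones suggested above:

```python
def find_stable_offsets(is_stable, core=140, mem=250, mem_step=25, core_step=10):
    """Back-off search mirroring the procedure above: on the first crash,
    lower the memory offset and maintain the core; on later crashes, lower
    the memory again and also give back a little core. `is_stable` stands
    in for a real benchmark run at the given offsets (in MHz)."""
    first_crash = True
    while core > 0 or mem > 0:
        if is_stable(core, mem):
            return core, mem          # found a stable combo
        mem = max(0, mem - mem_step)  # always back the memory off
        if first_crash:
            first_crash = False       # keep the core on the first crash
        else:
            core = max(0, core - core_step)
    return 0, 0                       # fall back to stock as a last resort

# Example: pretend this particular card is stable up to +120 core / +200 mem
print(find_stable_offsets(lambda c, m: c <= 120 and m <= 200))  # (120, 175)
```

The point is just that you converge on a stable pair in a handful of runs instead of guessing blindly.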









Also, I just read this on EK's website, for those of you (like me) interested in putting your 690 under water at some point in the future:

Quote:


> *EK is currently developing new water blocks for the newest and most popular graphics cards on the market,* namely ASUS GeForce GTX 680 DirectCU II, MSI R7970 Lightning, reference design AMD Radeon HD 7870 *as well as reference design nVidia GeForce GTX 690.*
> 
> All Full-Cover water blocks will feature excellent cooling and hydraulic performance, and pre-installed standoffs for safe and easy installation. More details will be available closer to release. The products are *expected to be widely available for purchase in early June.*
> 
> - Your EK Team











Quote:


> Originally Posted by *Arizonian*
> 
> I'm not sure why 680 SLI owners feel like they've got something to prove in the 690 club. Not one of us said the 690 is on par. All we've said is that, from all the reviews out, the 690 has come closer to its SLI cousin than any other dual-GPU card before it. Nvidia is doing as well as AMD did with their 6990 coming very close to two 6970's last year.
> Personally I like this dual-GPU card over an SLI setup for reasons other than performance. It's really a high-quality gaming card: motherboard-thick PCB, 10 power PWMs (5 per side), a single slot space, far lower power draw, lower temps, an amazing magnesium shroud, and vapor-chamber cooling instead of a heatsink and fan.
> By the way my idle temp has been 31C today. A couple degrees cooler than my single 680 in the kids' rig.


Agreed on all counts. Nvidia did a fantastic job and has now set the bar for how dual-GPU cards should perform in the future - meaning they should deliver at the very least about 95% of the performance of their SLI cousins, something that had never been achieved on a dual-GPU card before.









And mine also idles at 31-32c!


----------



## bitMobber

I couldn't bring myself to sell it. Once it arrived I just had to have it!!

I know my current system is going to bottleneck the card, but it's just going to be in there temporarily. I'm planning to build an Ivy Bridge system in the near future.


----------



## Arizonian

i7 3770 @ 4.2 GHz, RAM at 1600 MHz (it's underclocked, I've noticed) - GTX 690 base 1016, stock 1502 memory - hit 1145 GPU Boost with +132 Power Target, +100 Core and stock memory. Tonight I'm going to dive into this more seriously; this was done on a 15 min break.

3DMark11 P15425
CPU-Z Validation



Yeah - I know this baby has a lot more to give. Going to fine tune this tonight.


----------



## Malachi101

I also ordered my GTX 690, but I am getting a lot of haters flaming my thread about being stupid and wasting money










By the way, add me to the owners club as well, cheers


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> i7 3770 @ 4.2 GHz, RAM at 1600 MHz (it's underclocked, I've noticed) - GTX 690 base 1016, stock 1502 memory - hit 1145 GPU Boost with +132 Power Target, +100 Core and stock memory. Tonight I'm going to dive into this more seriously; this was done on a 15 min break.
> 3DMark11 P15425
> CPU-Z Validation
> 
> Yeah - I know this baby has a lot more to give. Going to fine tune this tonight.


Very nice clocks so far! For what it's worth, I've seen Core 1 hit 1196 at the highest so far, with Core 2 hitting 1202 after boost in my OC tests...so again, we've got lots of room here.








Quote:


> Originally Posted by *Malachi101*
> 
> I also ordered my GTX 690, but I am getting a lot of haters flaming my thread about being stupid and wasting money
> 
> 
> 
> 
> 
> 
> 
> By the way, add me to the owners club too


Man, haters gonna hate, don't worry about them - what you do with your money is no one else's concern. They're probably just hating because you paid over MSRP for it (though at New Zealand prices, that's not surprising at all). And yes, I will go ahead and add you - congrats man!


----------



## Arizonian

Heh JCDE - you just pointed out something I never gave thought to before. I have two GPUs' core boosts I should be looking at. I assumed they ran in sync.


----------



## Stateless

Quote:


> Originally Posted by *emett*
> 
> jcde7ago, I saw one of your members pull about 13,800 with a 2500K @ 4.8 and a 690 at stock, so I ran 3DMark 11 to see a comparison.
> I scored about 15,300 with a 2600K @ 4.5 and SLI 680s at stock. Anyway, believe what you want, I'm not trying to big-note. I was after a 690 but couldn't track one down. I'm sure the new drivers will pull out a bit more performance.


Emett, you also have to take the stock speed of each card into account. The 680's stock is 1006MHz without boost; the 690's is 915MHz without boost. I also had 2x 680's in SLI and they ran great, but when you compare 2x 680's in SLI vs. a 690, if both were set to the same clock speed, I am sure the results would be equal. If I get time, I will test it out myself, but right now I am testing my 690's in SLI.

When 2x 680's are running at 1006MHz and the 690 is running at 915MHz, with CPUs at the same speed, it is logical that the 680's would beat out a 690.

Just from my small amount of testing, however, I can tell you this...I can already OC my 690's higher than my 680's and am already hitting 1200+ boost on the 690's...the best boost clock I got out of my 2x 680's was around 1150 or so.
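Putting that clock comparison into rough numbers (using the base clocks quoted above and the approximate stock 3DMark11 scores mentioned earlier in this thread):

```python
# Base clocks quoted above: GTX 680 at 1006 MHz, GTX 690 at 915 MHz.
gtx680_base, gtx690_base = 1006, 915
clock_deficit = (1 - gtx690_base / gtx680_base) * 100

# Approximate 3DMark11 scores from this thread: ~15,300 for stock 680 SLI,
# ~15,125 for a stock 690 (round figures, not exact runs).
sli_score, score_690 = 15300, 15125
score_gap = (sli_score - score_690) / sli_score * 100

print(f"base-clock deficit: {clock_deficit:.1f}%")   # ~9.0%
print(f"observed 3DMark11 gap: {score_gap:.1f}%")    # ~1.1%
```

In other words, the 690 gives up roughly 9% of base clock but only about 1% of benchmark score, which is consistent with GPU Boost narrowing the gap in practice.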


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> Heh JCDE - you just pointed out something I never gave thought to before. I have two GPU's core boost I should be looking at. I assumed they ran in sync.


Yep. I look at the EVGA Precision X readout on my G19 LCD, and the clock speeds will definitely vary between the two, though remain reasonably close to each other.


----------



## emett

Ah cool, that is where the difference will be. I was thinking the 690 ran at the same speed. Anyway, the only reason I mentioned it was jcde7ago said I was comparing an overclocked SLI setup to a stock 690 setup, when in fact I was not. Anyway, I'll shut up now.


----------



## gatehous3

Had to RMA my 680 .. So I got a refund and bought an ASUS 690


----------



## jcde7ago

Quote:


> Originally Posted by *emett*
> 
> Ah cool, that is where the difference will be. I was thinking the 690 ran at the same speed. Anyway, the only reason I mentioned it was jcde7ago said I was comparing an overclocked SLI setup to a stock 690 setup, when in fact I was not. Anyway, I'll shut up now.


Lol, no worries man.


----------



## Stateless

In some testing, +150 offset is crash city for me...right now I am rock solid at +120 offset, 135% Power and +200 memory. I cranked the core offset up to +150 and crashed within minutes...dialed back to +140 and it also crashed. When I get back from dinner I will try +130 to see what it does...if that also crashes, it looks like +120 it is for me, and then I will work on raising the memory some more. At +120 core, 135% Power and +200 on memory I ran 4-5 runs of Unigine at full tilt and did 4 runs of 3DMark11 with no issues. Also scored 25k in 3DMark11 "P" score!


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> In some testing, +150 offset is crash city for me...right now I am rock solid at +120 offset, 135% Power and +200 memory. I cranked the core offset up to +150 and crashed within minutes...dialed back to +140 and it also crashed. When I get back from dinner I will try +130 to see what it does...if that also crashes, it looks like +120 it is for me, and then I will work on raising the memory some more. At +120 core, 135% Power and +200 on memory I ran 4-5 runs of Unigine at full tilt and did 4 runs of 3DMark11 with no issues. Also scored 25k in 3DMark11 "P" score!


Hmm...I could do +140MHz core, +200 mem just fine; I ran 3x Heaven and 2x 3DMark without issues.

Before you give up though, I would suggest trying +140-145 Core, drop Mem to +150 and see what happens. Otherwise you're still getting a really nice OC and great scores.


----------



## Arizonian

Quote:


> Originally Posted by *gatehous3*
> 
> Had to RMA my 680 .. So I got a refund and bought an ASUS 690


Very nice!







Post back and join us - we'll roll up our sleeves and learn about the 690 together. Welcome to OCN.


----------



## tahoward

Mine seems to crash right away above 1250MHz, and will run for a few minutes before crashing at 1225. 1202 might be stable, but I'll need to check later tonight... same story with the +725 offset I got on the memory. I will most likely have to lower it a bit to up the core frequency.

I'm curious if liquid cooling the card will lower the voltage needed to maintain 1225-1250.


----------



## bitMobber

Quote:


> Originally Posted by *ceteris*
> 
> LOL I think some people would be more annoyed if you did keep it. That is, if the system specs in your sig are still up to date... Bottleneck City!
> 
> 
> 
> 
> 
> 
> 
> 
> Although I would be interested to see what kind of scores you would get on 3DMark 11 and Unigine


Here are my scores with the computer I have listed in my profile (AMD Phenom II X4 965: http://3dmark.com/3dm11/3401557 ). I'm about P5,000 short compared to an Intel 3770K-based system; otherwise the rest of the graphics tests are just 2-10 FPS less. Where I really get crushed is the Physics and Combined tests - the 3770K system is twice as fast.

AMD 965 vs Intel 3770K
http://3dmark.com/compare/3dm11/3401557/3dm11/3400083

3DMark Score - P10245
Graphics Score - 17007
Physics Score - 4748
Combined Score - 4563
Graphics Test 1 - 77.96 FPS
Graphics Test 2 - 80.45 FPS
Graphics Test 3 - 106.34 FPS
Graphics Test 4 - 51.46 FPS
Physics Test - 15.07 FPS
Combined Test - 21.22 FPS


----------



## jcde7ago

This may sound crazy, but I'm going to run benchmarks using EVGA's LED Controller to shut off the LED, and see if I can attain a higher OC...LOL.

The LED is just plugged into a 2-pin 12V fan connection on the PCB. All the software really does is control the amount of voltage fed out of that 2-pin connection; the drivers see the LED as a "fan," which allows software control of the voltage (the LED is brightened/dimmed depending on how much voltage you feed it with the LED controller).

The voltage required by the LED is most likely completely insignificant, but hey, you never know - that tiny extra bit of voltage the LED is drawing might just be what I need to get to the next level, LOL.


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> Hmm...I could do +140Mhz core, +200mem just fine, I ran 3x Heaven and 2x 3DMark without issues.
> Before you give up though, I would suggest trying +140-145 Core, drop Mem to +150 and see what happens. Otherwise you're still getting a really nice OC and great scores.


Thanks for the advice. I am done with 1 run of Unigine at +140 on the offset, with memory lowered to +150, and no crash yet. Temps are also very good at 65c-68c...boost clocks range from 1199 to 1202 right now on all 4 GPUs. It is starting run 2 right now...so far so good.


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> Thanks for the advice. I am done with 1 run of Unigine at +140 on the offset, with memory lowered to +150, and no crash yet. Temps are also very good at 65c-68c...boost clocks range from 1199 to 1202 right now on all 4 GPUs. It is starting run 2 right now...so far so good.


Nice, and keep us posted with your results!!


----------



## TheRainMan

I'm getting like 50 fps on Metro 2033 with everything maxed. Is this normal?

I'm running stock settings on everything on the computer and my specs are in my sig.


----------



## Stateless

Quote:


> Originally Posted by *tahoward*
> 
> Mine seems to crash above 1250Mhz right away and will run for a few minutes before crashing at 1225. 1202 might be stable but I'll need to check later tonight... sorta like the 725 offset I got on the memory. Will most likely have to lower it a bit to up the core frequency.
> I'm curious if liquid cooling the card will lower the voltage needed to maintain 1225-1250.


Water won't help it. It is not crashing due to heat, it is due to not being able to give it a little more voltage...unless of course you are running too hot. For me, heat is not the issue; it's the fact that we cannot add a little bit more voltage.


----------



## Arizonian

Good to see the club starting to move forward as we're all starting to get our cards and learning together what they can do. It's been very informative and we've only just begun.

So here's a pic of my final build. Only two things are left: getting the 690 back plate, and removing the RAM fins and painting them white to complete the inside coordination.

I must say, carefully choosing the components and having a nice variety to choose from made this a very easy build with some personal definition to my rig. This card really tops off the rig with that elitist feel, and I can't wait to play games. I'm going to run through my favorite games like Crysis, Crysis 2, and the BF3 campaign at absolute ultimate settings, not holding back. I'm really enjoying the performance of my build as I await Crysis 3, out soon.



The bottom fan pointing up into the GTX 690 brought my idle temp down to a constant 30C.


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> Nice, and keep us posted with your results!!


Thanks..but it crashed halfway into the 2nd run!!! lol. Being a long-time SLI user, I already expect not to hit as high as those running single cards, though. This was the case with my 460's in SLI, 480's in SLI, 580 SLI, 580 SLI MSI Lightnings, 580 SLI EVGA Classifieds and 680 SLI.

I am going to do one more run at +140 on the offset, but lower the memory to +100 to see what happens. If I get another crash, I will try +130 to see how that does...+120 for me is rock solid, so if +130 also crashes, I will leave it at +120 and work on the memory.

With my 680's the best offset was +100 - even +105 would crash - but at +100, 132% Power and +400 on mem I could run Unigine and 3DMark11 all day and be fine.


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> Thanks..but it crashed halfway into the 2nd run!!! lol. Being a long-time SLI user, I already expect not to hit as high as those running single cards, though. This was the case with my 460's in SLI, 480's in SLI, 580 SLI, 580 SLI MSI Lightnings, 580 SLI EVGA Classifieds and 680 SLI.
> I am going to do one more run at +140 on the offset, but lower the memory to +100 to see what happens. If I get another crash, I will try +130 to see how that does...+120 for me is rock solid, so if +130 also crashes, I will leave it at +120 and work on the memory.
> With my 680's the best offset was +100 - even +105 would crash - but at +100, 132% Power and +400 on mem I could run Unigine and 3DMark11 all day and be fine.


I'm going to do a couple of Heaven runs at +140 core and +300 mem with the LED off (not that it should make a difference, lol)...I'll let you guys know what the results are in a bit.


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> I'm going to do a couple of Heaven runs at +140 core and + 300 mem with the LED off (not that that should make a difference, lol)...i'll let you guys know what the results are in a bit.


Cool..keep us updated. I just started a new run at +140 and took the memory back down to default just to see. When I OC'd my 680's I kept going on the core offset until I was no longer stable, then worked on memory. On the 680's anything over +100 on the core would crash even with memory at stock...so I'm going to try it the same way I did the 680's and do the core itself first, then the memory.


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> Cool..keep us updated. I just started a new run at +140 and took the memory back down to default just to see. When I OC'd my 680's I kept going on the core offset until I was no longer stable, then worked on memory. On the 680's anything over +100 on the core would crash even with memory at stock...so I'm going to try it the same way I did the 680's and do the core itself first, then the memory.


*Success!!!!*









My previous best according to the OCN Top 30 list (*I'm currently in 16th place* here: http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-3-0-scores/0_100) was:

*- FPS: 107.9
- Score: 2719*

OC settings via EVGA Precision X:

- Power Target: 135%
- Core Offset: +140 MHz
- Memory Offset: +200 MHz

Results:



Tonight's run, changing nothing but memory offset:

*- FPS: 109.1
- Score: 2749*

OC settings via EVGA Precision X:

- Power Target: 135%
- Core Offset: +140 MHz
- Memory Offset: +300 MHz

Results:



*That is good for 2 spots up, to 14th place!*









*Now, there are only 2 GTX 680 SLIs (2-card SLIs) in front of me* - Capn'Crunches' at *109.3*, and EvTron's at *113.9*.

I have to say, I am VERY impressed thus far. Another thing: *I can see the core speeds dropping 15-20MHz once the temps go a couple of degrees above 70* (the highest I reached was 76, though I turned the fan up to 85% during benchmarking) - I have no doubt putting these under water can squeeze a *tiny* bit more performance out of them.

I am going to try a +400MHz memory OC, then +500MHz if +400 is stable, and then try to squeeze some more out of the core.
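That clocks-dropping-above-70C behavior is GPU Boost pulling bins back as the card warms up. Here's a toy model of it - the ~13 MHz bin size, 70C knee, and degrees-per-bin figure are illustrative guesses based on what's observed in this thread, not published specs:

```python
def boosted_clock(max_boost_mhz, temp_c, knee_c=70, bin_mhz=13, deg_per_bin=2):
    """Toy model of the observation above: below the knee the card holds
    its max boost; past it, the clock steps down roughly one ~13 MHz bin
    every couple of degrees. All constants here are assumptions."""
    if temp_c <= knee_c:
        return max_boost_mhz
    bins = -(-(temp_c - knee_c) // deg_per_bin)  # ceiling division
    return max_boost_mhz - bins * bin_mhz

print(boosted_clock(1202, 68))  # 1202 - below the knee, full boost
print(boosted_clock(1202, 72))  # 1189 - one bin down, ~13 MHz lost
print(boosted_clock(1202, 76))  # 1163 - three bins, ~39 MHz lost
```

If the model is roughly right, water cooling helps not by enabling more voltage but simply by keeping the card below the knee so it never sheds those bins.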


----------



## Kyouki

So guys, with your cards, how are the inside case temps being affected? I understand that with one GPU blowing into the case you might see a small rise in temps, but I wanted to see how this is affecting you and whether it is a big deal. My CPU is on an H100 so I assume inside temps won't hurt it too much. While I've been waiting for my GTX 690 to get here I've been buying a few extras to mod my case, and I had thought up an idea to make an exhaust for the GPU inside, but I don't know if it's even worth worrying about.


----------



## jcde7ago

Quote:


> Originally Posted by *Kyouki*
> 
> So guys, with your cards, how are the inside case temps being affected? I understand that with one GPU blowing into the case you might see a small rise in temps, but I wanted to see how this is affecting you and whether it is a big deal. My CPU is on an H100 so I assume inside temps won't hurt it too much. While I've been waiting for my GTX 690 to get here I've been buying a few extras to mod my case, and I had thought up an idea to make an exhaust for the GPU inside, but I don't know if it's even worth worrying about.


The 690's fan design will dump hot air right out the front of the card and into your case. Best thing to alleviate that is to have an intake fan from the front, blowing cool air in and pushing that hot air towards the back, where you hopefully have another fan or push/pull setup with your H100 sucking that hot air through the rad (might raise your CPU temps a degree or two) and out the back.

Or, you can mount a fan on the bottom as an intake if your case allows it, blowing the hot air right up and hopefully into some fans exhausting it out of the top. That is how mine works, since the 800D allows me to have a bottom-mounted intake, sucking cold air in and using that to push the hot air from the 690 up and towards 3x AP-15GT's that exhaust it out pronto.


----------



## Stateless

Quote:


> Originally Posted by *Kyouki*
> 
> So guys, with your cards, how are the inside case temps being affected? I understand that with one GPU blowing into the case you might see a small rise in temps, but I wanted to see how this is affecting you and whether it is a big deal. My CPU is on an H100 so I assume inside temps won't hurt it too much. While I've been waiting for my GTX 690 to get here I've been buying a few extras to mod my case, and I had thought up an idea to make an exhaust for the GPU inside, but I don't know if it's even worth worrying about.


Well, I gave it a lot of thought and also asked around for advice. Someone suggested turning one of the intake fans around so that it lines up with the exhaust of the 690 and blows the hot air straight out. For me this would be my middle fan. However, I have a top-front and bottom-front fan as well...so this would have pushed the hot air straight out, but then the top/bottom fans would suck some of it back in. Being that I have 2 cards, it was a concern, but so far in my testing it has not been an issue. I also have very good airflow in my case. Here is a pic that shows it.

I basically have 3 front intakes...behind the top and middle fans I removed the HDD cage so there is nothing blocking airflow. On the back side of the bars that border the HDD cage I placed 2 internal fans that basically work in a push/pull configuration. These fans take the cool air from the 2 outer fans and push it into the case. As you can see in the shot, the upper internal fan is angled because I wanted some air blowing over the memory and parts of the mobo. What I found is that the internal middle fan is mixing cool air with the hot air...there is a 200mm fan on the bottom pushing cool air from outside the case up, and the angled fan is capturing some of this and also mixing its cool air with the air from the GPUs, which by this point has cooled down.



I am also running an H100 in a push/pull config, and temps are not affected at all. I've been running my RealTemp monitor as I do these Unigine runs to be sure, and the highest CPU temp is one core at 63c.

The bottom intake fan is used primarily to blow cool air over the SSD and my 2 HDDs. Since those don't warm up too much, what also happens is that the 200mm bottom fan grabs that cool air as well and pushes it up. In short, the two 690's output hot air, which is then met by the cool air from the intake fan, while the bottom fan pushes cool air up. Ironically, the 2 GPUs that run the coolest are the 2 that are exhausting air into the case. In the runs I am doing right now, GPUs 2 & 4 are running on average 3-5c cooler than GPUs 1 & 3.


----------



## Kyouki

Quote:


> Originally Posted by *jcde7ago*
> 
> The 690's fan design will dump hot air right out the front of the card and into your case. Best thing to alleviate that is to have an intake fan from the front, blowing cool air in and pushing that hot air towards the back, where you hopefully have another fan or push/pull setup with your H100 sucking that hot air through the rad (might raise your CPU temps a degree or two) and out the back.
> Or, you can mount a fan on the bottom as an intake if your case allows it, blowing the hot air right up and hopefully into some fans exhausting it out of the top. That is how mine works, since the 800D allows me to have a bottom-mounted intake, sucking cold air in and using that to push the hot air from the 690 up and towards 3x AP-15GT's that exhaust it out pronto.


I have the SilverStone TJ10 ESA edition - love the case, and I've made a few mods to it! I'll have to get pics up soon. But if you have ever seen the case, it has a mid/front fan that brings cold air in right over the graphics cards. I was thinking I could just flip that fan and it would pull hot air out the front if it gets too hot in the case. My current setup is: the rad for the H100 is in the top in push/pull, but bringing cold air in through the top and blowing it over the RAM and MB; a single fan exhausting out the back like normal; and a front/mid intake for the GPU.


----------



## jcde7ago

Quote:


> Originally Posted by *Kyouki*
> 
> I have the SilverStone TJ10 ESA edition - love the case, and I've made a few mods to it! I'll have to get pics up soon. But if you have ever seen the case, it has a mid/front fan that brings cold air in right over the graphics cards. I was thinking I could just flip that fan and it would pull hot air out the front if it gets too hot in the case. My current setup is: the rad for the H100 is in the top in push/pull, but bringing cold air in through the top and blowing it over the RAM and MB; a single fan exhausting out the back like normal; and a front/mid intake for the GPU.


Yep, if you have a mid-mounted fan that you can flip around, this would be PERFECT for the 690 - it ejects air STRAIGHT out of the card, so a middle-mounted fan would suck that air right out. Nvidia did an awesome job engineering this card's interior and fan design, not to mention the paths through which air exits the card. It literally will not keep any hot air inside itself. So, as long as the rest of the airflow in your case isn't creating any sort of wind tunnels/vacuums, you should be fine with a mid-front exhaust fan.


----------



## Stateless

Quote:


> Originally Posted by *Arizonian*
> 
> Good to see the club starting to move forward as we're all starting to get our cards and learning together what they can do. It's been very informative and we've only just begun.
> So here's a pic of my final build. Only two things are left: getting the 690 back plate, and removing the RAM fins and painting them white to complete the inside coordination.
> I must say, carefully choosing the components and having a nice variety to choose from made this a very easy build with some personal definition to my rig. This card really tops off the rig with that elitist feel, and I can't wait to play games. I'm going to run through my favorite games like Crysis, Crysis 2, and the BF3 campaign at absolute ultimate settings, not holding back. I'm really enjoying the performance of my build as I await Crysis 3, out soon.
> 
> The bottom fan pointing up into the GTX 690 brought my idle temp down to a constant 30C.


Really nice, man...very clean, and once you paint the memory it will look fantastic...well, more fantastic, because it looks really good right now.


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> The 690's fan design will dump hot air right out the front of the card and into your case. Best thing to alleviate that is to have an intake fan from the front, blowing cool air in and pushing that hot air towards the back, where you hopefully have another fan or push/pull setup with your H100 sucking that hot air through the rad (might raise your CPU temps a degree or two) and out the back.
> Or, you can mount a fan on the bottom as an intake if your case allows it, blowing the hot air right up and hopefully into some fans exhausting it out of the top. That is how mine works, since the 800D allows me to have a bottom-mounted intake, sucking cold air in and using that to push the hot air from the 690 up and towards 3x AP-15GT's that exhaust it out pronto.


lmao....I replied to him with my setup, which is exactly what you suggested in your response...I guess I was typing when you replied, but it was funny to see you had the same thought process I did when I came up with the fan setup for my case.


----------



## jcde7ago

Also, I just have to say it...*but I'm really glad to have beaten Majin SSJ Eric's Heaven score...lol*. Just 'cause I know how much he loves his Heaven scores and his 7970s (and those 7970's have a pretty high OC). Not to mention, his 3960X was clocked 100MHz higher than my 3930K. *Eric, if you're reading this...all in good fun, right?* Challenge accepted and won! Payback for you beating my first score by .1, lol.









Quote:


> Originally Posted by *Majin SSJ Eric*
> 
> Here's my overclocked result:
> *Majin SSJ Eric --- Intel Core i7 3960X / 4800 Mhz ---- MSI R7970 Lightning CF / 1225MHz / 1400MHz ---- 108.0 ---- 2720*


Quote:


> Originally Posted by *jcde7ago*
> 
> *jcde7ago --- Intel i7 3930k / 4700 Mhz ---- nVidia GeForce GTX 690 / 1055 Mhz / 1652 Mhz / 1160 Mhz ---- 109.1 ---- 2749*


Quote:


> Originally Posted by *Stateless*
> 
> lmao....I replied to him with my setup, which is exactly what you suggested in your response...I guess I was typing when you replied, but it was funny to see you had the same thought process I did when I came up with the fan setup for my case.


Yeah, I just realized that too, lol.


----------



## Stateless

Well, Unigine is now on run number 6, which I am using for the benchmark, and with the core offset at +130 it is rock solid. So 2-3 more runs and I will say this is pretty damn stable. I will then begin to OC the memory to see how far I get. The nice thing is that my temps never broke 66c in the 6 runs of Unigine!! Now that, folks, for a dual GPU, is really impressive!


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> Also, I just have to say it...*but I'm really glad to have beaten Majin SSJ Eric's Heaven score...lol*. Just 'cause I know how much he loves his Heaven scores and his 7970s (and those 7970's have a pretty high OC). Not to mention, his 3960X was clocked 100MHz higher than my 3930K. *Eric, if you're reading this...all in good fun, right?* Challenge accepted and won! Payback for you beating my first score by .1, lol.
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, I just realized that too, lol.


Good job man. I need to do a run that matches the settings you guys use for your Unigine runs. Will post the results shortly. Now that my +130 offset on the core is stable, I'll crank the memory up to +200 for the first run and see what happens.


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> Good job man. I need to do a run that matches the settings you guys use for your Unigine run. Will post the results of that shortly. Now that my +130 offset to the core is stable, will crank memory up to +200 for the first run to see what happens.


Thanks man! Again, I'm really impressed with the power of these cards, as well as the OC potential...and this is OC potential on AIR, and WITHOUT any sort of voltage increase. That's impressive. As a former 590 owner...man. I don't even want to speak "590" and "690" in the same sentence after this.

When Nvidia said they made NO COMPROMISES slapping 2x GK104's on a single PCB and that it would perform on par with 680s in SLI, they weren't kidding. This card has proven itself more than worthy of its price tag, in my opinion. Packing tremendous performance into a single-card design, with lower power consumption, less heat and less noise than a pair of 680s? Man, the only downside now is that if future dual-GPU cards aren't up to par, they should be considered failures from here on out.









Now, hurry up and get those Heaven runs with the Top 30 settings done and post the results! I'm actually off to run some 3DMark 11 runs, before calling it a night with some BF3 and ME3...


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> Thanks man! Again, really impressed with the power of these cards, as well as the OC potential....and this is OC potential on AIR, and WITHOUT any sort of voltage increase. That's impressive. As a former 590 owner...man. I don't even want to speak "590" and "690" in the same sentence after this.
> When Nvidia said they made NO COMPROMISES slapping 2x GK104s on a single PCB and that it would perform on par with 680s in SLI, they weren't kidding. This card has proven itself more than worthy of its price tag, in my opinion. Packing tremendous performance into a single-card design, with lower power consumption, less heat and less noise than a pair of 680s? Man, the only bad thing now is that if future dual-GPU cards aren't up to par, they should be considered failures from here on out.
> 
> 
> 
> 
> 
> 
> 
> 
> Now, hurry up and get those Heaven runs with the Top 30 settings done and post the results! I'm actually off to run some 3DMark 11 runs, before calling it a night with some BF3 and ME3...


Will do. Just doing a test with the +130 offset and +200 on memory. So far so good...1 run done already...max temp is 64C...it will go up in subsequent runs, but I am betting I don't go over 68C after 5 runs.

Will my score count toward the ladder standings given that I am using, in essence, 4 GPUs? Also, where is the link or location of this ladder so I can make sure I set up Unigine to qualify?


----------



## jcde7ago

Here is the thread:

http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-3-0-scores/

And yes, even with Quad-SLI, you will still qualify. Just make sure you read and follow all the requirements for score submissions. Good luck man, can't wait to see what Quad-SLI with a pair of 690s can do!


----------



## jcde7ago

Okay, DEFINITELY done with benchmarks for tonight...*another successful run with OC'd clocks, this time on 3DMark11*!









Note: I'm using the 301.34 drivers for the GTX 690.

First run on stock clocks:

P15124
Graphics Score: 16,477



Tonight's run, OC settings at: 135% Power Target, +140 Mhz Core Offset, + 300 Mhz Mem. Offset, 1055/1652/1160 (Core/Mem/Boost):

P17532
Graphics Score: 19,906



And I still haven't found my max OC yet!









By the way, my 3DMark11 OC score would put me in 6th place on EVGA's 680-SLI 3DMark scores leaderboard...just below someone who has 680s in SLI under water, and a couple thousand points behind the rest who have CPU frequencies 300-700Mhz greater than mine.


----------



## kemsoff

Quote:


> Originally Posted by *jcde7ago*
> 
> Okay, DEFINITELY done with benchmarks for tonight...*another successful run with OC'd clocks, this time on 3DMark11*!
> 
> 
> 
> 
> 
> 
> 
> 
> Note: I'm using the 301.34 drivers for the GTX 690.
> First run on stock clocks:
> P15124
> Graphics Score: 16,477
> 
> Tonight's run, OC settings at: 135% Power Target, +140 Mhz Core Offset, + 300 Mhz Mem. Offset, 1055/1652/1160 (Core/Mem/Boost):
> P17532
> Graphics Score: 19,906
> 
> And I still haven't found my max OC yet!


Very nice!!


----------



## TheRainMan

What voltages are you guys running?


----------



## jcde7ago

Quote:


> Originally Posted by *TheRainMan*
> 
> What voltages are you guys running?


The GTX 690 is voltage-capped at 1.175V.

The max power target = the limit of the voltage cap - so, if you ever use the full 35% TDP overhead, it means the card is at its 1.175V limit.

Just know that upping the Power Target to the max of 135% doesn't mean you'll actually use 35% more power than TDP in all cases - it just gives you that headroom to do so. The card will only use as much power as it needs, so, if you set a core and memory offset that it will try to attain and it needs less than 135% power target, it will dynamically use voltage only as needed.

Personally, I leave it at just 135% PT. The way boost works for the 600 series is that you may not even need to set core/mem offsets for an OC - simply raising the power target will make the card boost on its own and give you a performance boost within your PT range.

*By the way, the maximum 135% Power Target is absolutely, completely safe.* This is actually a rather conservative limit, so setting it to the maximum isn't even beginning to tap into the potential of GK104 (at the cost of increased temps and power consumption, of course).

In theory, Nvidia could probably have gotten away with using our current max voltage cap of 1.175V at stock, but again, they wanted to be conservative with voltages and clock speeds to stay well within the TDP and keep temps low (and also avoid any possible 590-esque scenarios). The amazing part about all of this is that the stock performance of these cards could have been the OCs we're hitting now, and even choked a little, they still manage to pull completely beastly numbers and ridiculous performance.
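As a rough illustration of the "headroom, not forced draw" behavior described above, here's a toy model of GPU Boost stepping the clock up against a power target cap. All constants are illustrative guesses (the 13MHz bin is the commonly reported Kepler boost step), not NVIDIA's actual boost tables:

```python
# Toy model: GPU Boost treats the power target as headroom rather than a
# forced draw - the clock climbs in bins only while estimated power stays
# under the cap. Every constant here is illustrative, not NVIDIA's spec.

BASE_CLOCK = 915    # MHz, GTX 690 stock base
BOOST_BIN = 13      # MHz per boost step (commonly reported for Kepler)
MAX_BOOST = 1200    # MHz, illustrative ceiling (owners report ~1200 spikes)
TDP = 300           # watts, illustrative

def boosted_clock(load_watts, power_target_pct, core_offset=0):
    """Step the clock up one bin at a time while the (crudely modeled)
    power draw stays under the power target cap."""
    cap = TDP * power_target_pct / 100.0
    clock = BASE_CLOCK + core_offset
    power = load_watts
    step_cost = 0.01 * TDP  # assume each boost bin costs ~1% of TDP
    while power + step_cost <= cap and clock + BOOST_BIN <= MAX_BOOST:
        clock += BOOST_BIN
        power += step_cost
    return clock

# Raising only the power target lets the card boost higher on its own:
print(boosted_clock(280, 100))  # 993  - little headroom at stock target
print(boosted_clock(280, 135))  # 1188 - the 135% target opens it up
```

The point the model captures: at the same load, a higher power target only changes how far boost is *allowed* to go, so the card draws extra power only when the workload actually pushes it there.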


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> Here is the thread:
> http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-3-0-scores/
> And yes, even with Quad-SLI, you will still qualify. Just make sure you read and follow all the requirements for score submissions. Good luck man, can't wait to see what Quad-SLI with a pair of 690s can do!


My Unigine score at 1080p alone is good enough for 2nd place on that chart...lowering the resolution to match the requirements might get me to number 1, not sure just yet. I am still trying to find a good OC for my cards. It looks like 130 offset and default on the memory is a no go for me. I tend to run Unigine at least 8-10 runs before thinking it is good and on run #8 it crashed. So I dialed back to +120 which previously was solid and it currently is on run #8 with a +200 on the memory. Temps yet again are excellent with the highest temp being 67c.

I am going to give it 2-3 more runs and then do one for the leaderboard to see how I fare. If the 690s are like my 680s, then once a solid core offset is found the memory can usually go a bit further. On my 680s, once I found that +100 offset in SLI was the magic number, I was able to get to +400 on the memory. I think I can achieve the same here.

As it stands right now, my max offset and final OC for my 680s in SLI was +100, which gave me a base core clock of 1106. Right now, on the GTX 690s in SLI I am at +120 for a base clock of 1035, so -71 on the base core...HOWEVER...and this is a good HOWEVER...my best boost clock on my 680s in SLI was 1178. Right now on the 690s in SLI I am hitting 1189, and at times it spikes to 1200.


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> The GTX 690 is voltage-capped at 1.175V.
> The max power target = the limit of the voltage cap - so, if you ever use the full 35% TDP overhead, it means the card is at its 1.175V limit.
> Just know that upping the Power Target to the max of 135% doesn't mean you'll actually use 35% more power than TDP in all cases - it just gives you that headroom to do so. The card will only use as much power as it needs, so, if you set a core and memory offset that it will try to attain and it needs less than 135% power target, it will dynamically use voltage only as needed.
> Personally, I leave it at just 135% PT. The way boost works for the 600 series is that you may not even need to set core/mem offsets for an OC - simply raising the power target will make the card boost on its own and give you a performance boost within your PT range.
> *By the way, the maximum 135% Power Target is absolutely, completely safe.* This is actually a rather conservative limit, so setting it to the maximum isn't even beginning to tap into the potential of GK104 (at the cost of increased temps and power consumption, of course).
> In theory, Nvidia could probably have gotten away with using our current max voltage cap of 1.175V at stock, but again, they wanted to be conservative with voltages and clock speeds to stay well within the TDP and keep temps low (and also avoid any possible 590-esque scenarios). The amazing part about all of this is that the stock performance of these cards could have been the OCs we're hitting now, and even choked a little, they still manage to pull completely beastly numbers and ridiculous performance.


Well said. In addition, a few reviews of the 690 found that setting the power target to 135% would level the two GPUs. With my 680s I noticed that one GPU would clock higher than the other and rarely would they be in sync. When I maxed the power to 132% the 2 cards synced up pretty well...there would still be some fluctuation, but for the most part they were in perfect harmony. Testing the 690 I found the same thing. This is not like the 590, where people were blowing them up by being a bit too aggressive with voltages...since they cap it at 1.175V, hitting max power is no issue at all.

Water cooling the cards won't help too much in achieving a higher clock unless you are already going above 70C with your overclock. The 600 series can throttle back a bit once temps exceed 70C...I do notice a bit less throttling on the 690s than the 680s...but with a good fan profile you can keep it under 70C and still get a hell of an OC. Water will definitely help with the temps, and if and when they allow a bit more voltage is where water will really make these cards sing, because it will help keep temps in check.


----------



## jincuteguy

Quote:


> Originally Posted by *Supper*
> 
> Side note, gtx 680 will be fully available end of this month as my favorite retailer told me so that goes for 690 as well.


Are you really sure about this? It's almost mid-May and there's still no sign of 690 availability, except EVGA.com putting up like a couple every day.


----------



## phantomphenom

Just catching up on the last 11 pages - does anyone believe there will be a non-reference model? I am waiting patiently for it to come back into stock, though I am planning to slap a water block on there when they're out...it would be a shame to take off the metal casing because it looks gorgeous!!!!!!


----------



## MrTOOSHORT

Canada NCIX has the Asus GTX 690 in stock for $1099.99 as of 12:25am:

http://www.ncix.ca/products/?sku=71419&vpn=GTX690%2D4GD5&manufacture=ASUS


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> My Unigine score at 1080p alone is good enough for 2nd place on that chart...lowering the resolution to match the requirements might get me to number 1, not sure just yet. I am still trying to find a good OC for my cards. It looks like 130 offset and default on the memory is a no go for me. I tend to run Unigine at least 8-10 runs before thinking it is good and on run #8 it crashed. So I dialed back to +120 which previously was solid and it currently is on run #8 with a +200 on the memory. Temps yet again are excellent with the highest temp being 67c.
> I am going to give it 2-3 more runs and then do one for the leaderboard to see how I fare. If the 690s are like my 680s, then once a solid core offset is found the memory can usually go a bit further. On my 680s, once I found that +100 offset in SLI was the magic number, I was able to get to +400 on the memory. I think I can achieve the same here.
> As it stands right now, my max offset and final OC for my 680s in SLI was +100, which gave me a base core clock of 1106. Right now, on the GTX 690s in SLI I am at +120 for a base clock of 1035, so -71 on the base core...HOWEVER...and this is a good HOWEVER...my best boost clock on my 680s in SLI was 1178. Right now on the 690s in SLI I am hitting 1189, and at times it spikes to 1200.


That's awesome man...there was never a doubt that quad-sli with 690s would churn out some pretty jaw-dropping numbers.









Also, as much as people were talking smack about the 690 not being able to overclock as well as the 680...I've been reading a LOT of 680 OC threads on various forums, and it seems to me like an average OC for the 680 is only around the +110-120 core offset range...and yet our 690s can easily hit the same. And then some. So again...the 690 is nothing short of a beast of a card.


----------



## CapnCrunch10

Wow jcde7ago, those are some stellar numbers.

You surpassed my 680 SLI Heaven score by 2 points, but I'm still 2000+ over your graphics score in 3DMark11. Maybe the drivers made a difference in Heaven (I was using 301.10, I believe). Wish I could test those two right now to see if there is an improvement.

Keep upping the clock and find out what the max is before Heaven crashes.


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> That's awesome man...there was never a doubt that quad-SLI with 690s would churn out some pretty jaw-dropping numbers.
> 
> 
> 
> 
> 
> 
> 
> 
> Also, as much as people were talking smack about the 690 not being able to overclock as well as the 680...I've been reading a LOT of 680 OC threads on various forums, and it seems to me like an average OC for the 680 is only around the +110-120 core offset range...and yet our 690s can easily hit the same. And then some. So again...the 690 is nothing short of a beast of a card.


True, but you have to take into account that a stock 680 starts at a base of 1006MHz, while our 690s are at a base of 915. So when a 680 user is able to go +150, their base clock is 1156. On a 690, a +150 would net 1065, which is a -91 core clock deficit. However, I have owned my 680s since launch and I can say that even though I was able to reach a higher core clock on them than I can on the 690, my boost clock on the 690 is (slightly) higher than the 680's.

The 680's core clock is 1006MHz with a boost of 1058MHz, which equals +52MHz of boost at the standard power level.
The 690's core clock is 915MHz with a boost of 1019MHz, which equals +104MHz of boost at the standard power level.

There is a 52MHz difference between the default 680 and 690 boost...which leads me back to my own tests, where my core OC is lower than on my 680s, yet my boost is higher.
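The clock arithmetic in this comparison checks out; here's a quick sanity check using the stock numbers from the post:

```python
# Stock clocks as quoted in the post (MHz).
GTX680 = {"base": 1006, "boost": 1058}
GTX690 = {"base": 915, "boost": 1019}

# Stock boost headroom at the standard power level:
delta_680 = GTX680["boost"] - GTX680["base"]
delta_690 = GTX690["boost"] - GTX690["base"]
print(delta_680, delta_690, delta_690 - delta_680)  # 52 104 52

# The same +150 offset lands at very different base clocks:
print(GTX680["base"] + 150)  # 1156
print(GTX690["base"] + 150)  # 1065, i.e. a 91 MHz base-clock deficit
print((GTX680["base"] + 150) - (GTX690["base"] + 150))  # 91
```

So the 690's lower base is partly offset by its larger stock boost headroom, which matches the observation that a 690 can end up with a higher boost clock despite a lower core OC.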


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> True, but you have to take into account that a stock 680 starts at a base of 1006MHz, while our 690s are at a base of 915. So when a 680 user is able to go +150, their base clock is 1156. On a 690, a +150 would net 1065, which is a -91 core clock deficit. However, I have owned my 680s since launch and I can say that even though I was able to reach a higher core clock on them than I can on the 690, my boost clock on the 690 is (slightly) higher than the 680's.
> The 680's core clock is 1006MHz with a boost of 1058MHz, which equals +52MHz of boost at the standard power level.
> The 690's core clock is 915MHz with a boost of 1019MHz, which equals +104MHz of boost at the standard power level.
> There is a 52MHz difference between the default 680 and 690 boost...which leads me back to my own tests, where my core OC is lower than on my 680s, yet my boost is higher.


Makes sense.


----------



## Arizonian

i7 3770K 4.5 GHz - GTX 690 Base Core 1050 Mhz Memory 1502 MHz - *CPU-Z Validation*

*3DMark11 Score P16137 Validation*



*3DMark11 E6489 Validation*


----------



## TheRainMan

best i could do tonight (140/220)

ill do more tomorrow!


----------



## Stateless

Need some help. No matter what I do to capture the screen in Unigine, when I go to paste into Paint or another program I get nothing but a black screen. I am able to hit Print Screen on the desktop, while surfing the web, or while playing a game, but no matter how many times I have tried, whenever I hit Print Screen in Unigine and paste, it is just a blank black image. Any ideas? I need to take a screenshot of my results for the leaderboard (about 10fps short of the #1 spot).

Thanks in advance for any help.


----------



## TheRainMan

Try alt + prt scrn


----------



## jcde7ago

Quote:


> Originally Posted by *Stateless*
> 
> Need some help. No matter what I do to capture the screen in Unigine, when I go to paste into Paint or another program I get nothing but a black screen. I am able to hit Print Screen on the desktop, while surfing the web, or while playing a game, but no matter how many times I have tried, whenever I hit Print Screen in Unigine and paste, it is just a blank black image. Any ideas? I need to take a screenshot of my results for the leaderboard (about 10fps short of the #1 spot).
> Thanks in advance for any help.


Unigine's hotkey for screenshots is F12.

It'll save the screenshots in your user profile's Heaven folder (e.g. C:/Users/<username>/Heaven).


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> Unigine's hotkey for screenshots is F12.
> It'll save the screenshots in your user profile's Heaven folder (e.g. C:/Users/<username>/Heaven).


Thanks...need to go to bed, but will do a run for the leaderboard tomorrow. Like I said, I think I am about 10fps behind the #1 guy...he does have his CPU clocked much higher than mine and his 3 cards are OC'd higher as well...so not too big of a deal. I know I can do a "suicide" run to beat him, though. Usually I can run 2-3 laps of Unigine at a higher OC before it crashes, and loading in and starting immediately just to get one run in shouldn't be a problem...but I prefer to post a stable score with a stable OC. We'll see tomorrow when I start fresh.

One thing is for certain...Witcher 2 is still NOT a 100% rock-solid 60fps, even with 2 690s!!! GPU utilization is all over the place, as it was with the 680s. I am getting much better performance - for example, in the intro where Geralt is running hurt through the forest, it was around 40 or so with Uber on with my 2x680s; with the 690s it is a solid 60fps. It is in some gameplay areas where it tanks due to low GPU usage...for example, when you emerge out of the tent with Triss and look towards the catapults, it drops to around 36fps or so in that view...of course, GPU utilization on all 4 is down to about 35-40% or so.


----------



## Basti

I am nearing to the end of my OC. The only thing left to play with is memory clock. This is 301.33 driver from Gigabyte.


----------



## Stateless

Quote:


> Originally Posted by *Basti*
> 
> I am nearing to the end of my OC. The only thing left to play with is memory clock. This is 301.33 driver from Gigabyte.


Cool man. My max overclock is +120, 135% power and +200 on memory. The last crash was the same setup but with memory at +300, so I can still test at 20MHz intervals, but I'm too damn tired tonight to keep pushing. Just going to set some Steam games to download while I get some sleep.


----------



## V3teran

Quote:


> Originally Posted by *jcde7ago*
> 
> I just keep Power Target at 135%, and then play around with the core and mem - these should all be able to hit +140Mhz Core and +250Mhz Mem - so those should be pretty good starting points.
> Once you start crashing or whatnot, lower the memory by 25-50Mhz and maintain the core, and try again. If it fails still, lower memory by 25-50Mhz, drop the core by 10Mhz, rinse and repeat.
> 
> 
> 
> 
> 
> 
> 
> 
> Also, I just read this on EK's website, for those of you (like me) interested in putting your 690 under water at some point in the future:
> Quote:
> 
> 
> 
> *EK is currently developing new water blocks for the newest and most popular graphics cards on the market,* namely ASUS GeForce GTX 680 DirectCU II, MSI R7970 Lightning, reference design AMD Radeon HD 7870 *as well as reference design nVidia GeForce GTX 690.*
> All Full-Cover water blocks will feature excellent cooling and hydraulic performance, with pre-installed standoffs for safe and easy installation. More details will be available closer to release. The products are *expected to be widely available for purchase in early June.*
> - Your EK Team
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Agreed on all accounts. Nvidia did a fantastic job and has now set the bar for how dual-GPU cards should perform in the future - meaning they should be about 95% the performance at the very least of their individual-SLI cousins, which has never before been achieved on a dual-GPU card.
> 
> 
> 
> 
> 
> 
> 
> 
> And mine also idles at 31-32c!
Click to expand...

Thanks, yeah I'm interested in water also. I already have 1x RX360 just for this card, which will be going in my CaseLabs M8. What do you think of volt modding? Will these cards need to be volt modded to get extra out of them, or is the 135% power target enough?


----------



## V3teran

Whenever I run at 135% power, +110 on the core and +200 on the memory clock and bench Heaven DX11, it runs great until halfway through and then the fps crawls to around 5-10. It's either the driver, or the card doesn't OC too well. I think I'm gonna send the card back.


----------



## gatehous3

Will do man, I can't wait to get it!

It's gonna look beautiful in my case haha, I'm a huge fan of the LED-lit side.


----------



## gatehous3

Nice build man!

Pretty similar to mine actually!

What kind of case is that? I was considering modding my 500R to have a clear side window instead of the mesh, but I'm afraid the 690 will keep the temps up in my case, since half its heat will be dissipated inside.

Also, if the ambient temp inside the case rises, it means less effective cooling with the H100 I got for my 3770K.







and with those, you need the best cooling arrangement possible haha.

I'm going to have SO many goodies when I get home. New P8Z77 mobo, 3770K, G.Skill 2400 RAM, H100, GTX 690. It's gonna be like Christmas!! Wooo


----------



## Arizonian

Ok - played some Crysis - got one last question before I hit the sack and a busy day tomorrow at work.

I noticed I was not in SLI - only one GPU was being utilized in Crysis. How do I enable the SLI option in the game? I thought it was automatic. I have only owned a CrossFire rig for about 30 days. Is there an Nvidia software driver that enables them in games? Does each individual game have its own driver profile to enable two cards?

I have Multi-GPU enabled in Nvidia Control Panel.

Thanks guys in advance.

On a side note: some nice numbers and overclocks coming in from the club. Just think, we're only starting to get our feet wet. Haven't fine-tuned it just yet. I've got a steep learning curve with my new CPU as a first-time UEFI user. It's only going to get better. This has been the best week in a long time, finishing this completely new build. The GTX 690 was the icing on the cake.


----------



## gatehous3

untitled.JPG 143k .JPG file


This good enough for validation?!

ASUS card for me!


----------



## gatehous3

Quote:


> Originally Posted by *Arizonian*
> 
> Ok - played some Crysis - got one last question before I hit the sack and a busy day tomorrow at work.
> I noticed I was not in SLI - only one GPU was being utilized in Crysis. How do I enable the SLI option in the game? I thought it was automatic. I have only owned a CrossFire rig for about 30 days. Is there an Nvidia software driver that enables them in games? Does each individual game have its own driver profile to enable two cards?
> I have Multi-GPU enabled in Nvidia Control Panel.
> Thanks guys in advance.
> On a side note: some nice numbers and overclocks coming in from the club. Just think, we're only starting to get our feet wet. Haven't fine-tuned it just yet. I've got a steep learning curve with my new CPU as a first-time UEFI user. It's only going to get better. This has been the best week in a long time, finishing this completely new build. The GTX 690 was the icing on the cake.


I was totally supposed to quote you on my message about 'your build'
whoopsie ...

lawlz


----------



## Arizonian

Quote:


> Originally Posted by *gatehous3*
> 
> I was totally supposed to quote you on my message about 'your build'
> whoopsie ...
> lawlz


Thanks for the compliment and welcome to both OCN and our exclusive club Gatehous3.









You're going to love that build. The Ivy heat issues are a myth. Mine has been running below 70C at a 24/7 4.2GHz today.

The GTX 690 has been idling at 30C and hottest even after Crysis and benchmarking was 72C.


----------



## Supper

Quote:


> Originally Posted by *jincuteguy*
> 
> Are you really sure about this? It's almost mid-May and there's still no sign of 690 availability, except EVGA.com putting up like a couple every day.


Perhaps you can look harder - EVGA is not the only brand out there.
I have seen plenty, like Gigabyte, Asus, MSI, Palit, and Galaxy.
In Singapore, people are selling them for around S$1,550 (US$1,237) a piece.

Guys, I'm curious how you set up your case airflow with a 690 in it?
For me:
1x rear intake
1x bottom intake
2x top intake
2x front exhaust


----------



## V3teran

Some more Benchies with an Overclock.
Monitoring software: EVGA Precision (seems more stable than Afterburner for me)
Driver: 301.33
Rating on driver: good, but it could and will get better - certainly better than the horrendous 301.34
Power target: 135%
GPU clock offset: +120MHz
Memory clock offset: +200MHz

GPU1: 1176
GPU2: 1166/1176
Max temp: 70C
Fan speed: 70%
Max volts: 1.175V

This is a great OC, and I feel there is more to squeeze out of these on air with this driver!

The drivers will get better, guaranteed, which will improve OCing.
I will be pushing this card further, as I have not hit the OC threshold yet.

Here are a few benchies.
3DMark Vantage...DX10, PhysX on








3D Mark 11








Heaven Bench DX11 cpu at 4.4









Heaven Bench DX11 cpu at 4.4 increase in fps by 1








More benchies soon, stay tuned


----------



## kemsoff

CPU still at stock (waiting to finish buying my loop; WC for a TH10 is expensive







)

145/300 OC on the card


----------



## jincuteguy

Quote:


> Originally Posted by *Supper*
> 
> Perhaps you can look harder - EVGA is not the only brand out there.
> I have seen plenty, like Gigabyte, Asus, MSI, Palit, and Galaxy.
> In Singapore, people are selling them for around S$1,550 (US$1,237) a piece.
> Guys, im curious how you guys setup your case airflow with 690 in it?
> For me,
> 1x rear intake
> 1x bottom intake
> 2x top intake
> 2x front exhaust


I know EVGA is not the only brand out there. But there are no 690s in stock anywhere, man, except eBay, where people are selling them for $1,500.


----------



## Basti

Quote:


> Originally Posted by *jincuteguy*
> 
> I know EVGA is not the only brand out there. But there are no 690s in stock anywhere, man, except eBay, where people are selling them for $1,500.


Ncix has stock>10!
http://ncix.ca/products/?sku=71419&vpn=GTX690-4GD5&manufacture=ASUS


----------



## jincuteguy

That's Canadian money; I'm in America. And I checked their US store - none.

Unless you know a way to order with Canadian money?

I tried to add it to the cart, then selected US customer, but it said there's no stock for the US. So unless you live in Canada, I don't think you can order.


----------



## bob808

I was thinking this card would be better than 2x680s since the 690 has 4GB of GDDR5 on board, but it performs the same (well, slightly lower than SLI 680s). I guess it really is like having 2x680s, and the same rule applies where you basically only get to use 2GB? Otherwise there would be a performance increase at 2560x1600, no?


----------



## jcde7ago

Too lazy to quote everyone, so i'm just going to address people individually here, lol:

*@ V3teran*: I would not recommend volt modding a $1K card, lol...the performance gain will not be worth it. Wait to see if Nvidia unlocks the voltage slightly down the road, especially since a high thermal envelope downclocks your boost OC automatically.

*@ gatehous3*: Added you to the list!

*@ Arizonian*: SLI-usage on both cores will be automatic if multi-gpu use is enabled in Nvidia Control Panel

*@ bob808*: Correct.

*Reminder guys; the same rule that applies to the 680 applies to the 690 here.* *As soon as you hit 70C, the card WILL downclock 13MHz off of your boost, and will downclock again near/at 80C*, to maintain a good thermal threshold. This is not conjecture; this is fact. This is why I've been saying that these cards need to go under water if you want to hit their absolute limit (without overvolting, because unlocked voltage would be a different story) - if we can keep these under 60C or around there, our boost clocks will undoubtedly be higher. The question is: is buying a waterblock (and taking off that sweet housing) and going through the work of adding it to a loop worth a couple of FPS in Heaven or breaking 18K in 3DMark?
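The throttle rule above can be sketched as a tiny model. The 13MHz bin and the 70C/80C thresholds are the figures reported in this thread, not an official NVIDIA spec:

```python
# Sketch of the Kepler temperature throttle rule described above:
# one 13 MHz bin comes off the boost clock at 70C, and another near 80C.
# Thresholds and bin size are owner-reported numbers, not NVIDIA's spec.

BIN = 13  # MHz per throttle step

def effective_boost(boost_mhz, temp_c):
    """Return the boost clock after temperature throttling."""
    if temp_c >= 80:
        return boost_mhz - 2 * BIN
    if temp_c >= 70:
        return boost_mhz - BIN
    return boost_mhz

print(effective_boost(1189, 65))  # 1189 - full boost, e.g. under water
print(effective_boost(1189, 72))  # 1176 - one bin lost past 70C
print(effective_boost(1189, 81))  # 1163 - two bins lost near 80C
```

This is why keeping the card under 70C with a fan profile (or a waterblock) directly translates into a higher sustained boost clock.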


----------



## tonyjones

Where are the QUAD SLI benches!


----------



## ceteris

Wow! Lots of posts coming back on today! Those are some kick arse scores JCD! I actually broke my rule and tried to OC a little, but my ambient is killing me. I can barely do half of your settings. Anything over will kill Heaven and lock my system up. I can't wait to move out of this apartment and into a place with central air cooling. But til then, I need to wait for the waterblocks to come out so I can hook my waterchiller up and push these babies to the limit!


----------



## Insomnium

More pictures please of your new shiny cards








The 690 is so gorgeous


----------



## gatehous3

Quote:


> Originally Posted by *Insomnium*
> 
> More pictures please of your new shiny cards
> 
> 
> 
> 
> 
> 
> 
> 
> The 690 is so gorgeous


Agreed haha, too bad I'm the only one that has an ASUS







I'm super curious as to what's going to come in the box hahaha


----------



## jcde7ago

Quote:


> Originally Posted by *Insomnium*
> 
> More pictures please of your new shiny cards
> 
> 
> 
> 
> 
> 
> 
> 
> The 690 is so gorgeous


I think we're all too busy gaming and benchmarking at this point, haha. I won't lie though...the 690 is probably the best-looking reference card I've ever owned/seen. It really is in a class of its own. That said, I'd probably post more pics if I didn't only have an iPhone 4, lol. They don't turn out that great, and Googling images is probably a lot better, lol.

Also, every GTX 690 being sold is currently a reference model directly from Nvidia. Partners are not (yet) allowed to make changes to the reference design, so they are all going to look the same.


----------



## bnj2

Quote:


> Originally Posted by *jcde7ago*
> 
> Also, every GTX 690 being sold is currently a reference model directly from Nvidia. Partners are not (yet) allowed to make changes to the reference design, so they are all going to look the same.


There will be no non-reference design for the 690, Nvidia said that.
I am also expecting my two Asus cards on Monday, and I'm really curious what the box is going to look like. If it's like the ones they put their normal cards in (not that those are ugly), I'm going to be disappointed.


----------



## gatehous3

Quote:


> Originally Posted by *bnj2*
> 
> There will be no non-reference design for the 690, Nvidia said that.
> I am also expecting my two Asus cards on Monday, and I'm really curious what the box is going to look like. If it's like the ones they put their normal cards in (not that those are ugly), I'm going to be disappointed.


Please do me a favor and post up pics/unboxing


----------



## jcde7ago

Quote:


> Originally Posted by *bnj2*
> 
> There will be no non-reference design for 690, Nvidia said that.
> I am also expecting my two Asus cards on monday and I'm really curious what's the box gonna look like. If it's gonna be like the ones they put their normal cards in (not that they are ugly), I'm going to be disappointed.


Not entirely true. Partners can't change the reference housing on the reference design, but they will be allowed to release variations like watercooled versions. For example, EVGA is already working on a Hydro Copper version, although it's still probably about ~6 weeks away or so. EVGA Jacob already confirmed this on his Twitter.


----------



## bnj2

Quote:


> Originally Posted by *gatehous3*
> 
> Please do me a favor and post up pics/unboxing


Will do








Quote:


> Originally Posted by *jcde7ago*
> 
> Not entirely true. Partners can't change the reference housing on the reference design, but they will be allowed to release variations like watercooled versions. For example, EVGA is already working on a Hydro Copper version, although it's still probably about ~6 weeks away or so. EVGA Jacob already confirmed this on his Twitter.


Yup, like with the 590, but there won't be any changes to the PCB or the card itself.


----------



## jcde7ago

Can you guys start adding your OC results in this format, with the OC settings that you've found to be stable through at least 2-3 runs of Heaven with the OCN Top 30 settings, as well as with 3DMark11 (P) settings? And of course, stable through gaming (something demanding like BF3, Crysis, Metro). I'm thinking of something like this:

*jcde7ago*
Power Target: 135% (should always be left at 135% PT for maximum OC'ing headroom)
Core Offset: *+140 Mhz*
Memory Offset: *+300 Mhz*
Driver: *301.34*
Stable through Heaven 3.0 (OCN Top 30 Settings)/3DMark11 (P)/Gaming (BF3/Metro/Crysis)?: *Yes*
Is this your Max OC?: *No*

And of course, these are EVGA Precision X OC settings, so go with that if you can. Thanks!
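To save everyone retyping the template, here's a rough Python sketch that prints an entry in the format above. Totally unofficial — just a hypothetical helper, with field names copied from the post's own template:

```python
# Toy helper to print an OC submission in this thread's template.
# The field names and formatting follow the post above, nothing official.

def format_oc_entry(user, power_target, core_offset, mem_offset,
                    driver, stable, max_oc):
    """Return the forum-submission text for one overclock result."""
    return "\n".join([
        f"*{user}*",
        f"Power Target: *{power_target}%*",
        f"Core Offset: *+{core_offset} Mhz*",
        f"Memory Offset: *+{mem_offset} Mhz*",
        f"Driver: *{driver}*",
        "Stable through Heaven 3.0 (OCN Top 30 Settings)/3DMark11 (P)/"
        f"Gaming (BF3/Metro/Crysis)?: *{'Yes' if stable else 'No'}*",
        f"Is this your Max OC?: *{'Yes' if max_oc else 'No'}*",
    ])

print(format_oc_entry("jcde7ago", 135, 140, 300, "301.34", True, False))
```

Copy-paste the printed block straight into your post.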


----------



## Cheesemaster

I'm still waiting... I think a quad bench on this system would be the bomb... I got a 3960X @ 4.8... wish I could go higher, but that's all my H100 will let me...


----------



## V3teran

Quote:


> Originally Posted by *Cheesemaster*
> 
> Im still waiting... I think a quad bench on system should be the bomb... I got a 3960x @ 4.8... wish I can go higher but that is all my h100 will let me..


You need custom water cooling, mate; it works much better, especially on an Extreme CPU.


----------



## V3teran

Quote:


> Originally Posted by *jcde7ago*
> 
> Too lazy to quote everyone, so i'm just going to address people individually here, lol:
> *@ V3teran*: I would not recommend volt modding a $1K card, lol...the performance gain will not be worth it. Wait to see if Nvidia unlocks the voltage slightly down the road, especially since a high thermal envelope downclocks your boost OC automatically.


Why not? The card is a quality build from the best components, not like the flaky 590. I reckon it could handle a little volt mod, and one would let you get even more out of it provided it's cooled well (custom water).
I already have an RX360 ready for this card all to itself, and I also have another RX360 rad ready for another 690 :thumb:
http://www.overclock.net/t/1111648/build-log-caselabs-project-overkill#post_16863168


----------



## SimpleTech

I was able to order one earlier today. Unfortunately it's going to be resold once I get it... but for a good cause. The guy I'm selling it to lives in Brazil, where most parts cost almost double what they sell for in the USA. I'm only making a $50 profit on it.


----------



## jcde7ago

Quote:


> Originally Posted by *V3teran*
> 
> Why not?The card is a quality build from the best components,not like the flaky 590,i reckon it could handle a little volt mod,a little volt mod to it would allow you to get even more out of it providing its cooled well(custom water)
> i already have an RX 360 ready for this card all to its self,i also have another RX360 rad ready for another 690:thumb:
> http://www.overclock.net/t/1111648/build-log-caselabs-project-overkill#post_16863168


Because the drivers will still downclock it unless you keep temps in check. More voltage = higher temps = automatic downclocking, unless you watercool and keep temps down. Volt modding would not do much for us at the moment.


----------



## tahoward

I tried dropping the memory but my GPU won't have it. 1202 is stable for about 20/26 of the Heaven bench before it crashes without warning. Dropped the target by 13Mhz and they are perfectly stable at 1190Mhz with memory offset at 715Mhz.

Here are some benches with the new settings:





Edit: I got it to run stable with an offset of 137. Both GPUs @ 1196 but the heaven score was lower by a couple hundred points without the increased memory offset. It crashed when I set the memory offset to 715 again.

Guessing my highest scoring overclock without better cooling is: 135% power / +131 Core offset / +715 memory offset.


----------



## jcde7ago

Quote:


> Originally Posted by *tahoward*
> 
> I tried dropping the memory but my GPU won't have it. 1202 is stable for about 20/26 of the Heaven bench before it crashes without warning. Dropped the target by 13Mhz and they are perfectly stable at 1190Mhz with memory offset at 715Mhz.
> Here are some benches with the new settings:
> Edit: I got it to run stable with an offset of 137. Both GPUs @ 1196 but the heaven score was lower by a couple hundred points without the increased memory offset. It crashed when I set the memory offset to 715 again.
> Guessing my highest scoring overclock without better cooling is: 135% power / +131 Core offset / +715 memory offset.


Great scores man!!!









I tried out that ridiculous memory offset you're using, and here is what I came up with for complete stability:

*jcde7ago*
Power Target: *135%*
Core Offset: *+130 Mhz*
Memory Offset: *+700 Mhz*
Driver: *301.34*
Stable through Heaven 3.0 (OCN Top 30 Settings)/3DMark11 (P)/Gaming (BF3/Metro/Crysis)?: *Yes*
Is this your Max OC?: *No*

*The results are my highest 3DMark11 score yet, as well as my highest Heaven score by over 3FPS!!!*









Heaven definitely benefits from a higher memory OC, just like I suspected. 3DMark, not so much, but lowering the Core by a +10 offset and raising the memory did bring it up about 34 points:





I may end up running 135% Power Target/120 Core Offset/500 Memory Offset for a 24/7 OC.









EDIT: Looks like my latest Heaven score is good for 10th overall on the OCN Top 30. EvTron's 680s in SLI are the only Kepler cards ahead of me (in two-card config.) and then a pair of 7970s in 2-card CF as well. Everything else above me are either Tri/Quad-card configs.


----------



## tahoward

Awesome, looks pretty consistent. It seems our worst enemy in achieving the highest overclock is temps. I'd expect a GTX 690 equipped with a waterblock could easily achieve 1200 core/3400 shader/7000-7400 memory.

I can see the benefit with your CPU @ 4.7Ghz, it really pushes the max frame rate a good bit.


----------



## kemsoff

Woot - cpu still @ stock speeds

I think I can probably go a bit further; I'll try more later.

Kemsoff
Power Target: 135%
Core Offset: +142 Mhz
Memory Offset: +700 Mhz
Driver: 301.34
Stable through Heaven 3.0 (OCN Top 30 Settings)/3DMark11 (P)/Gaming (BF3/Metro/Crysis)?: Yes
Is this your Max OC?: No


----------



## V3teran

Question for you guys....
When using EVGA Precision and making your OC adjustments with the 3 bars, do you also need to max out the voltage, or does it happen automatically with the power target? I noticed that the voltage at stock is always 988mv.
Thanks


----------



## jcde7ago

Quote:


> Originally Posted by *V3teran*
> 
> Question for you guys....
> When using EVGA precision and you make your oc adjustments using the 3 bars do you need to also max out the voltage or does it do it automatically with the power target,i noticed that the voltage at stock is always 988mv.
> Thanks


It does it automatically.


----------



## Shadowness

Quote:


> Originally Posted by *jcde7ago*
> 
> Not entirely true. Partners can't change the reference housing on the reference design, but they will be allowed to release variations like watercooled versions. For example, EVGA is already working on a Hydro Copper version, although it's still probably about ~6 weeks away or so. *EVGA Jacob already confirmed this on his Twitter.*


Perfect


----------



## gatehous3

Correct me if I'm mistaken, but from my 680 OC, I thought that if you increased your power target, it would actually reduce the numbers you would get while benching.

I would max mine, find a stable clock, then use the EVGA software to see what my maximum power usage was (i.e. 105), and then set the power target to something like 110 or a little more.


----------



## jcde7ago

Quote:


> Originally Posted by *gatehous3*
> 
> Correct me if im mistaken, but from my 680 OC, i thought that if you increased your power target, it would actually reduce the numbers you would get while benching.
> I would max mine, find a stable clock, then use the EVGA software to see what my maximum power usage was (ie: 105) and then change the power offset to be something like 110 or a little more.


Think of Power Target as a voltage limiter, since that is all it does. Nothing more, nothing less. If you set your power target to the max, you have the maximum headroom to allow your card to consume more power IF it needs it, depending on your clock offsets.
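If it helps, here's a toy model of that headroom idea in Python. This is not NVIDIA's actual boost algorithm — the bin size and power-per-MHz figure are made up for illustration; it just shows why a higher power target lets the card hold more boost bins at the same clock offset:

```python
# Toy model of Power Target as headroom: boost keeps stepping the clock
# up one bin at a time until estimated board power would exceed the
# target. All numbers are illustrative, not real GTX 690 telemetry.

BASE_CLOCK_MHZ = 915          # GTX 690 base clock
STEP_MHZ = 13                 # approximate Kepler boost bin size

def boosted_clock(power_target_pct, max_offset_mhz, power_per_mhz=0.11):
    """Step the clock up while the (made-up) power estimate, as % of
    TDP, stays at or below the power target."""
    clock = BASE_CLOCK_MHZ
    ceiling = BASE_CLOCK_MHZ + max_offset_mhz

    def est_power(mhz):
        # assume 100% TDP at base clock, rising linearly with clock
        return 100 + (mhz - BASE_CLOCK_MHZ) * power_per_mhz

    while clock + STEP_MHZ <= ceiling and est_power(clock + STEP_MHZ) <= power_target_pct:
        clock += STEP_MHZ
    return clock

# Same +300 offset, two power targets: the 135% PT allows far more bins.
print(boosted_clock(power_target_pct=100, max_offset_mhz=300))
print(boosted_clock(power_target_pct=135, max_offset_mhz=300))
```

The point is exactly what the post says: maxing the PT doesn't force anything, it only raises the ceiling the card is allowed to boost into if the load demands it.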


----------



## Arizonian

Quote:


> Originally Posted by *gatehous3*
> 
> Correct me if im mistaken, but from my 680 OC, i thought that if you increased your power target, it would actually reduce the numbers you would get while benching.
> I would max mine, find a stable clock, then use the EVGA software to see what my maximum power usage was (ie: 105) and then change the power offset to be something like 110 or a little more.


Actually, you pretty much got it right. The power target may let the core GPU Boost overreach, which can crash your driver and cloud the highest overclock your GPU can reach.

How to: 680 overclocking fundamentals

Not much difference in the methodology for overclocking the GTX 690 versus the 680 either. I've yet to master overclocking to its true absolute potential.

Celeras wrote a good guide and explained the overclocking very well.


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> Actually your pretty much got it right. The power target may over exceed core GPU boost which would crash your driver. Which would cloud highest over clock your GPU may reach.
> How to: 680 overclocking fundamentals
> Not much difference in the methodology in over clocking the GTX 690 from the 680 either. I've yet to master the over clocking to true absolute over clock potential.
> Celeras gave a good guide and explained the over clocking very well.


Great link! Yeah, it would appear that Kepler can over-draw more power than it needs to hit an offset, though that shouldn't make sense: the whole point of the PT is to increase headroom if needed, not to consume the extra power anyway when the card can achieve the same clocks at a lower PT... it doesn't add up, unless I'm not understanding it correctly.

Also, it seems like leaving the PT at max doesn't even matter if you can keep your card below 70c, where it begins to throttle.

In any case, good stuff.


----------



## Basti

Anyone see what's wrong with these pictures?


----------



## bitMobber

Quote:


> Originally Posted by *Michalius*
> 
> Got it test fitted yesterday as I had to see how it looked with the motherboard and RAM.
> 
> Don't like the H100 there at all. Ordered an Alphacool ST-30 360mm radiator, Mayhem's Pastel Green concentrate, XSPC Raystorm, and some primochill clear tubing. Will be much happier with that.
> Now I just need the fricken Shinobi XL white to release. Come on Bitfenix, pleeeeeeeeeeeease.


Very nice! What paint did you use for the ram sticks and slots, and how did you paint the slots?


----------



## SimpleTech

Quote:


> Originally Posted by *bitMobber*
> 
> Very nice! What paint did you use for the ram sticks and slots, and how did you paint the slots?


He's using a Gigabyte G1.Assassin 2 and Corsair Vengeance Low Profile White sticks.


----------



## rwchui

I am using the same RAM, they are not painted.

Here they are:

http://www.ncix.com/products/?sku=62841&vpn=CML8GX3M2A1600C9W&manufacture=Corsair

http://www.corsair.com/memory-by-product-family/vengeance/vengeance-low-profile-8gb-dual-channel-ddr3-memory-kit-cml8gx3m2a1600c9w.html


----------



## CapnCrunch10

Quote:


> Originally Posted by *Cheesemaster*
> 
> Im still waiting... I think a quad bench on system should be the bomb... I got a 3960x @ 4.8... wish I can go higher but that is all my h100 will let me..


4.8GHz with a H100 on the 3960X is pretty damn good. What are your max temps and ambient temps?


----------



## gatehous3

Quote:


> Originally Posted by *Arizonian*
> 
> Actually your pretty much got it right. The power target may over exceed core GPU boost which would crash your driver. Which would cloud highest over clock your GPU may reach.
> How to: 680 overclocking fundamentals
> Not much difference in the methodology in over clocking the GTX 690 from the 680 either. I've yet to master the over clocking to true absolute over clock potential.
> Celeras gave a good guide and explained the over clocking very well.


Thanks for the link!!

Quote:


> Originally Posted by *jcde7ago*
> 
> Great link! Yeah, it would appear that Kepler can over-draw more power than allowed to hit an offset, though that shouldn't make sense, because the whole point of the PT is to increase the headroom if needed, not consume the power anyway even if it can achieve the same clocks at a lower PT...doesn't make much sense, unless i'm not understanding it correctly.
> Also, it seems like leaving the PT at max doesn't even matter if you can keep your card below 70c, where it begins to throttle.
> In any case, good stuff.


I found that as I was OCing, I was losing FPS while maintaining a higher PT. I also found that my clock speeds had to be reduced a bit to compensate for the higher PT.

I was getting better results by reducing the PT to a comfortable level while still staying above max power draw, and then being able to increase my clock speeds!










i love OCN


----------



## V3teran

Quick Heaven bench. I increased the core clock by 10mhz and the memory clock up to a massive 500mhz! Fully stable.
Things are starting to get hot now as the card is hitting mid-70s temps; max on the core is now 1186mhz at full load.

Monitoring software: EVGA Precision (seems more stable than Afterburner for me)
Driver: 301.33
Rating on driver: Good driver, but it could and will get better; certainly better than the horrendous 301.34

Power target: 135
Gpu clock offset: 130mhz
Memory clock offset: 500mhz

Gpu1: 1186
Gpu2: 1176/1186
Max temp: 75 degrees
Fan speed: 70
Max volts: 1.175

This is a great OC and I feel there is more to squeeze out of these on air with this driver!

The drivers will get better, guaranteed, which will improve the OC.
I will be pushing this card further as I have not hit the OC threshold yet.


----------



## bnj2

Have any of you guys gotten to play anything on your cards, or just benchmarks?
It seems the 670/680 is plagued by stuttering, and I wonder if this affects the 690 as well. Hopefully it's just a driver issue and not a hardware bug - that would really suck for us.









More info here: http://forums.nvidia.com/index.php?showtopic=226227


----------



## V3teran

Another bench using Heaven.









Power target:135
Gpu clock offset:140mhz
Memory clock offset:500mhz

Gpu1:1196
Gpu2:1186/1196
Max temp:68 Degrees
Fan speed:80
Max volts:1.175

For Hilbert (Guru3D) to get 1250mhz on the core stable is outstanding. Obviously the sample cards are much better than the ones that are for sale; I've noticed this with many reviewers, their sample cards generally seem to be better.

I'm still happy with the OC I've achieved, and I'm confident I'll be able to go even higher once a proper driver comes out, as the current drivers are hit and miss I feel.


----------



## kemsoff

Quote:


> Originally Posted by *bnj2*
> 
> Any of you guys got to play anything on your cards or just benchmarks?
> Seems that the 670/680 is plagued by stuttering and I wonder if this affects 690 as well. Hopefully it's just a driver issue and not a hardware bug - that would really suck for us
> 
> 
> 
> 
> 
> 
> 
> 
> More info here: http://forums.nvidia.com/index.php?showtopic=226227


I've had 0 issues on mine, no stuttering or anything.


----------



## V3teran

If I try to set my memory offset to 700mhz I get artifacting on my desktop, and then the card drops back to stock levels and I have to reset the 3 bars in EVGA Precision. How come you guys can get 700 and I cannot? It won't even let me attempt it....


----------



## kemsoff

Quote:


> Originally Posted by *V3teran*
> 
> If i try to set my memory offset to 700mhz i get artifacting on my desktop and then the card drops back to stock levels and i have to reset the 3 bars in evga precision,who comes you guys can get 700 and i cannot?It wont even let me attempt to even try....


What's the core offset? I was able to do 145 and 700 but 150 would crash


----------



## V3teran

core offset is 135
memory is 500


----------



## Arizonian

Do you think we should be downloading the 'EVGA SLI Enhancement' driver? Or is it already incorporated into the current GTX 690 drivers?


----------



## kemsoff

Quote:


> Originally Posted by *V3teran*
> 
> core offset is 135
> memory is 500


Hmmmm, can you do anything? Or is it immediately giving you issues the second you apply the OC?

I can run Heaven with 700 but couldn't make it through 3DMark set that high.

Also, maybe you posted it somewhere else, but what are your system specs? Maybe fill in your specs so we can see what you're running.


----------



## V3teran

Quote:


> Originally Posted by *kemsoff*
> 
> Hmmmm can you do anything? Or is it immediately giving you issues the second you apply the oc?
> I can run heaven with 700 but couldn't make it through 3d mark set that high.
> Also maybe you posted somewhere else but what are your system specs? Maybe fill in your specs so we can see what your running


I'm fine at 135 power
130 core
500 memory
With these settings I'm stable in everything. Whenever I try 700 on the memory, the screen flickers like a rainbow of colours as soon as I hit apply in EVGA Precision, and I then have to reset my 3 bars as the artifacting knocks everything down to stock levels. The flickering lasts about 2 seconds, so 700 is a no-go area for me; my card won't even accept it.

I'll add my rig soon, but I'm running a Rampage II Extreme with a 930 CPU at 4.4, HT on, fully stable in Prime, LinX, OCCT and IBT.
6GB of Corsair XMS.
My CPU and motherboard are fully watercooled with the best custom water cooling components. It's a little old, but it's a good performer at 4.4 and is rock solid stability-wise.

I was thinking this Gigabyte 690 card is OK; what do you guys think of my max OC below?
1176 pcb1
1176 pcb2
I can run at this 24/7.

Or shall I send my card back and get an MSI card, as my retailer has them in stock.... opinions welcome


----------



## Arizonian

I'm almost positive, though not confirmed as of yet, that my limit is: +135 / +130 / +200. The memory seems low, but any higher and the driver fails. Stable in Crysis maxed out.

Base Core 1045
Memory 1602
Core GPU Boost 1175 (1st GPU) / 1201 (2nd GPU)





Now to get working on Heaven Benchmark.


----------



## Basti

Quote:


> Originally Posted by *V3teran*
> 
> im fine at 135 power
> 130 core
> 500 memory
> With these settings im stable in everything,whenever i try 700 on the memory the screen flickers like a rainbow of colours as soon as i hit apply on evga precision,i then have to reset my 3 bars as the artifacting knocks everthing down to stock levels,it happens for about 2 seconds the flickering so 700 is ano go area for me,my card wont even accept it.
> Ill add my rig soon but im running a rampage 2 extreme,930 cpu at 4.4 ht on fully stable in prime,linx,occt and ibt.
> 6gb of corsair xms.
> My cpu and motherboard is fully watercooled with the best custom water cooling components,its a little old but its a good performer at 4.4 and is rock solid stability wise.
> I was thinking this gigabyte 690 card is ok,what do you guys think of my max oc below
> 1176 pcb1
> 1176 pcb2
> I can run at this for 24/7 use
> Or shall i send my card back and get an msi card as my retailer has them instock....opinions welcome


Speaking of low OCs, I could only set mine to 111mhz core and 260mhz mem; if I go above that, the driver crashes. I also noticed that my card draws max voltage (1.175v) regardless of the PT/core offset settings, be they positive or negative, during benchmarks.
Defective card maybe? What do you guys think?


----------



## Arizonian

Quote:


> Originally Posted by *Basti*
> 
> Speaking of low OC, I could only set mine to 111mhz core and 260mhz mem, if I go above it, driver crashes. I noticed also that my card draws max power(1.175mv) regardless of how much pt/co settings, be it positive or negative during benchmarks.
> Defective card maybe? What do you guys think?


Not defective at all. If the card runs at stock speeds, it's working. Overclocking past stock shipping speeds is not guaranteed, and you cannot return a card because it doesn't overclock well. Now, if you're crashing while playing games or running a benchmark at stock speeds, then you do have an issue.

Overclocking success varies from card to card: from one that barely overclocks, to average, better than average, great, and golden-chip incredible overclocking. Affectionately called the 'silicon lottery'.









As for your overclock - go down on the memory to +200 and raise your Core to +130. As of yet that's my top speed. Memory affects the overall overclock less than core. If anything, you want your Core at its max before you start overclocking the memory, to get the absolute best overclock you can squeeze out.

Those clocks, though low for me, get the 1st GPU to 1175 Mhz and the 2nd GPU to 1201 Mhz core. I wish I could get both to 1200 Mhz core, but in real-world usage 25 Mhz translates to a few FPS in a game and is nothing to stress over one bit.

Anyone hitting 1200 Mhz or more should be pretty happy, because that's a 285 Mhz overclock. Anything past a 200 Mhz core overclock should be considered good regardless of which card you own. That's the reality of it. Trying to compare with those lucky people with great VRAM is impractical.


----------



## V3teran

Quote:


> Originally Posted by *Arizonian*
> 
> I'm almost positive but not confirmed as of thus far my limit: +135 / +130 / +200. Memory seems low but any higher driver fails. Stable in Crysis maxed out.
> Base Core 1045
> Memory 1602
> Core GPU Boost 1175 (1st GPU) / 1201 (2nd GPU)
> 
> 
> Now to get working on Heaven Benchmark.


That's weird.
These are my settings:
135 power
130 core
500 memory

pcb1 - 1176
pcb2 - 1176

Yet you say that GPU 2 hits 1201, even though my settings are higher than yours and I don't get near 1200mhz.
Also your 3DMark is higher than mine, yet I have higher settings....
Quote:


> Originally Posted by *Basti*
> 
> Speaking of low OC, I could only set mine to 111mhz core and 260mhz mem, if I go above it, driver crashes. I noticed also that my card draws max power(1.175mv) regardless of how much pt/co settings, be it positive or negative during benchmarks.
> Defective card maybe? What do you guys think?


Hmm, I don't think your card is faulty, I just think it's the luck of the draw. However, I do think the EVGA cards are generally better, even though the cards are supposedly all the same. You can always send your card back and exchange it if you're not happy, for another Gigabyte 690, or an MSI or EVGA one (if you can find an EVGA one).


----------



## Arizonian

@ V3teran - Well, I can't see your system specs in your bio. You should probably add those for other OCN members to view, especially if you're asking for help, because it helps us know what you're dealing with.

The scores can vary depending on the CPU and CPU overclocks. 3DMark11 benches have sections that combine the CPU with the GPU, which in turn affects everyone's score. So unless the identical CPU is being used, the scores may differ.

As for the overclocks - sometimes too high a memory clock will either cause your GPU to crash or actually lower your overall score. Another thing to keep in mind is that the VRAM on my card's other side may very well be doing better in the 'silicon lottery'. The two sides don't have to run the same core clock, just like two 680's in SLI will vary. Remember, we essentially have two independently working GPUs on one PCB.

Just as the 2GB of VRAM per side and the 5 power PWMs per side don't combine but are mirrored, each side's VRAM capability is independent too.

You'll notice there is a 3DMark11 Graphics score - that would be a better gauge, because that score doesn't take the CPU into consideration.


----------



## Basti

But isn't my card supposed not to reach max voltage when I set the core/power target below stock?


----------



## V3teran

Quote:


> Originally Posted by *Basti*
> 
> But isn't my card suppose to not reach max voltage when I set the core/power target below stock?


Your GPU will use max voltage if you're benching or playing games and the GPU needs extra power; it will do this automatically provided your power target is high enough. I just leave my power target at 135 permanently.

I see you have a Gigabyte card the same as me. Can you do me a favour and increase your memory clock to 700mhz and apply it? Tell me what happens, as my GPU shows a rainbow of colours on the screen for a couple of seconds and then resets itself back to stock within EVGA Precision. I don't think the EVGA cards have this problem, but we will soon find out.


----------



## Arizonian

Quote:


> Originally Posted by *Basti*
> 
> But isn't my card suppose to not reach max voltage when I set the core/power target below stock?


One thing I noticed between my 680 and 690 is that the power target will max out; I've seen it at 136.4% TDP using GPU-Z to monitor it.

My 680's power target was lower and would stay within a few % of where I set it. The 690 exceeds it. I'm not sure what 680 SLI owners are seeing, but I imagine it's similar.

Dynamic overclocking takes over during the more intense areas, demanding more voltage in order to OC itself. It's hard-coded in to give the best performance obtainable without crashing or holding you back.

One thing for certain is that it's NEVER using more than 1.175v at any given time, even when I force 1.175v constant.

On a side note, I seem to get a lower score when my voltage is set to a constant 1.175v anyway. It seems the constant voltage raises temps by not being able to fluctuate with what's being rendered.

I've even tried raising my frame target to see if I could force more FPS, but in situations where it's not able to maintain that higher FPS it still dips lower, and in areas where it could go over my target FPS it limits me from going higher, lowering my overall OC.


----------



## jcde7ago

Quote:


> Originally Posted by *bnj2*
> 
> Any of you guys got to play anything on your cards or just benchmarks?
> Seems that the 670/680 is plagued by stuttering and I wonder if this affects 690 as well. Hopefully it's just a driver issue and not a hardware bug - that would really suck for us
> 
> 
> 
> 
> 
> 
> 
> 
> More info here: http://forums.nvidia.com/index.php?showtopic=226227


I've played tons of games, all maxed out, all running at a constant 60fps with vsync on, easily over 60 with vsync off, completely buttery smooth @ 2560x1440.

The GTX 690 also uses frame metering, and actually handles micro stutter better than individual 680s in SLI do. I haven't had any stuttering issues at all - this card just eats every game I throw at it, completely maxed out.


----------



## V3teran

Quote:


> Originally Posted by *Arizonian*
> 
> One thing I noticed between my 680 and 690 is the power target will max out and I seen it at 136.4 in TDP spec using GPU-Z to monitor it.
> My 680 power target was lower and would stay within a few % of where I set it. The 690 is exceeding it. I'm not sure what 680 SLI owners are showing but I imagine similar.
> Dynamic over clocking is taking over during the more intense areas while demanding more voltage in order to OC itself. It's hard coded in to give the best perfomance obtainable and not crash or hold you back.
> One thing for certain is its NEVER using more than 1.175v at any given time, even when I force 1.175v at a constant.
> On a side note I seem to get lower score when my votlage is set to 1.175v constant anyway. Seems the constant voltage higher temps by not being able to fluctuate with what's being rendered.
> I've even tried hiring my frame target to see if I could force more FPS but in situations whet it's not able to maintain that higher FPS it still dips lower and in area it could go over my target FPS it's limiting me from going higher and lowering my over all OC.


I agree with you about the volts; it's definitely more stable leaving them set at stock instead of at a permanent 1.175v.
So what's your maximum stable OC now then?


----------



## Michalius

Max OC so far.


----------



## jcde7ago

Quote:


> Originally Posted by *Michalius*
> 
> 
> Max OC so far.


I wouldn't consider OC scanner a dependable test for OC stability.

If you can get through 3 straight runs of Heaven (at least with OCN Top 30 settings, max AA/AF, Extreme Tessellation, etc) AND 3DMark11 at those clocks, then yes. Otherwise...they can't be considered stable, so it wouldn't be your max.


----------



## V3teran

Don't you bother using the dedicated volt meter?


----------



## jcde7ago

Quote:


> Originally Posted by *V3teran*
> 
> Dont you bother using the dedicated volt meter?


He had his power target at max. That will already allow him to reach his highest clocks. Setting max voltage the whole time will just increase temps, because the voltage will be at its highest even when it doesn't need to be. Remember, downclocking kicks in as soon as you hit 70c; don't forget that.


----------



## Kyouki

Just a random idea but I was thinking about buying this http://www.newegg.com/Product/Product.aspx?Item=N82E16811996017

And flipping the fan so it acts as a vacuum: I could line it up with the hot air the card exhausts into the case. It would basically sit in my empty drive bay and, from what I can see, line up perfectly, sending the hot air out the front. I dunno if you can see what I see; if not, I could draw a picture, lol. But what do you guys think? Or am I wasting my time, haha.

Here's my quick Paint image, hahahaha, in GREEN being MY IDEA and showing my setup.


----------



## kemsoff

Quote:


> Originally Posted by *Kyouki*
> 
> Just a random idea but I was thinking about buying this http://www.newegg.com/Product/Product.aspx?Item=N82E16811996017
> And flipping fan to act as a vacuum so I could then line it up with the hot air exhausting into case this basically would sit in my empty drive bay and line up perfect from what I can see. And send the hot air out the front. I dunno if you can see what I see if not I could draw a picture lol. But what do you guys think? Or am I wasting my time haha.


This card runs so cool as it is. I would think it would be a waste


----------



## V3teran

Quote:


> Originally Posted by *jcde7ago*
> 
> He had his power target at max. That will already allow him to reach his highest clocks. Setting it at max voltage the whole time will just increase temps, cause the voltage will be at its highest even if doesn't need to be. Remember, downclocking is going to occur the as soon as you hit 70c, don't forget that.


A few questions, mate.....
Do you leave the dedicated volt meter alone at stock and just raise the PT?
Why does downclocking occur at 70 degrees? Isn't that a bit low for it to start downclocking?
Is this driver related?
Is there any way to prevent downclocking other than adding waterblocks?
Thanks mate


----------



## jcde7ago

Quote:


> Originally Posted by *V3teran*
> 
> Few questions, mate...
> Do you leave the dedicated voltage slider alone at stock and just raise the PT?
> Why does downclocking occur at 70 degrees? Isn't that a bit low for it to start downclocking?
> Is this driver related?
> Is there any way to prevent downclocking other than adding waterblocks?
> Thanks mate


Yes, you can leave the dedicated voltage slider alone. Raising the Power Target to max is essentially telling the card that it can use the full 1.175V if it needs it. Downclocking at 70C happens at the driver level. There is currently no way to prevent it unless you can keep temps in check, or unless Nvidia changes it in a future driver.

Also, I believe all Kepler cards will downclock at 70C, and then drop again at every +10C after that, so at ~80C, etc.
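If it helps to picture the stepping, here's a quick toy sketch in Python. To be clear, the 13 MHz step size and the 10C intervals are just what we've observed on these cards, not anything official from Nvidia:

```python
# Toy model of the observed Kepler thermal downclock: roughly one
# 13 MHz "bin" comes off the boost clock at 70C, and another at each
# +10C after that. Observed behavior only -- not Nvidia's actual algorithm.
def boosted_clock(boost_mhz, temp_c, step_mhz=13, first_limit_c=70, interval_c=10):
    if temp_c < first_limit_c:
        return boost_mhz          # below 70C: full boost clock
    bins = 1 + (temp_c - first_limit_c) // interval_c
    return boost_mhz - bins * step_mhz

print(boosted_clock(1058, 65))  # 1058 -- below 70C, no bins dropped
print(boosted_clock(1058, 75))  # 1045 -- one bin dropped at 70C
print(boosted_clock(1058, 82))  # 1032 -- second bin dropped at 80C
```

That's why keeping the card under 70C matters more than forcing max voltage.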


----------



## V3teran

Quote:


> Originally Posted by *jcde7ago*
> 
> Yes, you can leave the dedicated voltage slider alone. Raising the Power Target to max is essentially telling the card that it can use the full 1.175V if it needs it. Downclocking at 70C happens at the driver level. There is currently no way to prevent it unless you can keep temps in check, or unless Nvidia changes it in a future driver.
> Also, I believe all Kepler cards will downclock at 70C, and then drop again at every +10C after that, so at ~80C, etc.


Oh ok thanks.
My temps go up to 75 when I use Heaven at stock with a fan speed of 65, so although downclocking occurs, I don't see it, because my on-screen display shows both of my PCBs at 1046...


----------



## jcde7ago

Quote:


> Originally Posted by *V3teran*
> 
> Oh ok thanks.
> My temps go up to 75 when I use Heaven at stock with a fan speed of 65, so although downclocking occurs, I don't see it, because my on-screen display shows both of my PCBs at 1046...


You may not actively see it, but if both cores are running at 1046 at 75C, that automatically means they COULD be running higher, and probably would be, but are being limited by the temps.


----------



## error-id10t

Given that there is an Asus TOP version of both the 670 and 680, does this imply (or give us a chance) that we'll see a 690 TOP also? I don't care about the cooler, since whatever I get is going under water, but the TOP seems to have the highest clocks.


----------



## jcde7ago

Quote:


> Originally Posted by *error-id10t*
> 
> Given that there is an Asus TOP version of both the 670 and 680, does this imply (or give us a chance) that we'll see a 690 TOP also? I don't care about the cooler, since whatever I get is going under water, but the TOP seems to have the highest clocks.


As of right now, no non-reference 690s are going to be released. Nvidia is requiring board partners to stick with the reference card and reference shroud/housing, though there will be some watercooled versions down the road. A TOP version would need a non-reference PCB, which Nvidia is not allowing at the moment.


----------



## Arizonian

Quote:


> Originally Posted by *error-id10t*
> 
> Given that there is an Asus TOP version of both the 670 and 680, does this imply (or give us a chance) that we'll see a 690 TOP also? I don't care about the cooler, since whatever I get is going under water, but the TOP seems to have the highest clocks.


Quote:


> Originally Posted by *jcde7ago*
> 
> As of right now, no non-reference 690s are going to be released. Nvidia is requiring board partners to stick with the reference card and reference shroud/housing, though there will be some watercooled versions down the road. A TOP version would need a non-reference PCB, which Nvidia is not allowing at the moment.


JCD is correct. No dual-GPU card prior to this was ever allowed a non-reference build, and that will most likely continue, reference only, moving forward.

The GTX 690 is very different in design from the ground up; that's what took this reference model to a new level. I'm pretty confident Nvidia will not permit any type of modification to their unique build.

Other than putting the GTX 690 under water, I don't foresee variants of the GTX 690 from any other vendor either.


----------



## Basti

So, having a low-OC'ing card: if I buy another GTX 690, even if the new card OCs well, the low-OC'ing card will hold it back when used in SLI. Is that correct?


----------



## jcde7ago

Quote:


> Originally Posted by *Basti*
> 
> So having a low OC card, if I buy another gtx 690 and even if the new card OC's well, the low OC card will hold it back when used in sli. Is that correct?


Correct.

However, I don't think you need to worry about it, since Quad-SLI scaling isn't as efficient as 2- or 3-GPU SLI configs; you wouldn't really see the marginal gains from the clock-speed difference between the two cards anyway.


----------



## phantomphenom

Any news on a true 4gb version?


----------



## jcde7ago

Quote:


> Originally Posted by *phantomphenom*
> 
> Any news on a true 4gb version?


Again, no non-reference 690s for now. Nvidia is not allowing it. The partners chosen to resell the 690 have agreed to that. So, no 4GB 690s for the foreseeable future.


----------



## Arizonian

Quote:


> Originally Posted by *jcde7ago*
> 
> Again, no non-reference 690s for now. Nvidia is not allowing it. The partners chosen to resell the 690 have agreed to that. So, no 4GB 690s for the foreseeable future.


Correct.

Also, to dispel another question that comes up from time to time: I highly doubt we will ever see a voltage unlock.

The lesson learned from the GTX 590 release was that end users would apply too much voltage to their cards and end up damaging them. So a driver update locked the voltage on the 590, and that never changed moving forward.

This generation, the entire Kepler series is voltage locked. The GTX 690 will never see a voltage unlock either, nor will we ever see another fried card due to unwise end-user overclocking.

These cards still do well: most of us are getting an increase of 130 MHz on the base core clock, from 915 MHz to 1045 MHz with a +130 offset. When dynamic overclocking kicks in, we are seeing a GPU Boost increase of 260 MHz on the core. That's a healthy overclock without any voltage increase needed.

Not too shabby, because regardless of which card you own, Nvidia or AMD, any +200 core overclock is actually quite nice.
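If anyone wants to sanity-check the math on their own offsets, here's a quick sketch. The ~130 MHz of extra GPU Boost on top of the offset is just the typical headroom we've been seeing; it varies card to card and is not a spec:

```python
# Back-of-envelope clock math for a GTX 690 with a core offset applied.
# The boost headroom figure is observed behavior, not an official number.
BASE_MHZ = 915                 # stock GTX 690 base core clock
TYPICAL_BOOST_HEADROOM = 130   # extra MHz GPU Boost usually adds on top

def clocks(offset_mhz):
    base = BASE_MHZ + offset_mhz
    boost = base + TYPICAL_BOOST_HEADROOM
    return base, boost

base, boost = clocks(130)
print(base)   # 1045 -- the +130 offset most of us run
print(boost)  # 1175 -- i.e. +260 over the 915 MHz stock base
```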


----------



## V3teran

Quote:


> Originally Posted by *Basti*
> 
> So having a low OC card, if I buy another gtx 690 and even if the new card OC's well, the low OC card will hold it back when used in sli. Is that correct?


Did you try setting your card to +700 on the memory and applying it? EVGA Precision will not let me do it, as I get artifacting. You're the only person I know of who has a Gigabyte the same as me, so it would be good if you could test this out to see if it's just me or Gigabyte cards in general.


----------



## phantomphenom

Hi guys, is this a legit pre-order....

http://www.amazon.com/dp/B0080JWLFU/?tag=nisa-20&m=ATVPDKIKX0DER

It also shows a clock speed above the advertised 915 MHz... Should I pull the trigger on this? It seems vague, with no picture of the product, and the information is questionable... though I have pre-ordered some rock albums a month in advance and the product description was also lacking somewhat... should I wait until it's in stock?


----------



## error-id10t

Isn't that just the stock Boost clock? I'm guessing that's what they're advertising, considering Asus themselves only show one card.


----------



## Arizonian

Quote:


> Originally Posted by *phantomphenom*
> 
> Hi guys, is this a legit pre-order....
> http://www.amazon.com/dp/B0080JWLFU/?tag=nisa-20&m=ATVPDKIKX0DER
> It also shows the clock speed above the advertised 915mhz.... Should i pull the trigger on this? It seems vague with no picture of the product and the information is questionable....though i have per-ordered some rock albums a month in advance and the products description was also lacking some what.... should I wait until its in stock?


1019 MHz is the Boost speed from 915 MHz Base Core clock specs. So it's advertising the Boost Core clock rather than the Base.

As for it being legit, I'm not sure. If it doesn't say when more will come into stock, you might be waiting until the end of the month to mid-June, when supplies widen. Can't give you advice on the legitimacy of the Amazon link.


----------



## Basti

Quote:


> Originally Posted by *V3teran*
> 
> Did you try setting your card to 700 memory core and applying it?EVGA precision will not let me do it as i get artifacting,your the only person that i know how has a gigiabyte the same as me so it would be good if you could test this out to see if its just me or the gigabytes cards in general.


Eh, you must be mistaken. I have an EVGA card.









Do you guys chain bench or wait a few minutes to cool down the card?


----------



## V3teran

Quote:


> Originally Posted by *Basti*
> 
> Eh, you must be mistaken. I have an EVGA card.
> 
> 
> 
> 
> 
> 
> 
> 
> Do you guys chain bench or wait a few minutes to cool down the card?


Oh, I see, fair enough, my bad. So you can only go +113 on the core with your EVGA card?


----------



## Basti

Quote:


> Originally Posted by *V3teran*
> 
> Oh i see fair enough my bad,so you can only go 113 on the core with your evga card?


Yes, but I am now at +115 core and it's not crashing. The only thing I changed was keeping my temps below 70 degrees. I'm beginning to think the premature crashes are due to the card throttling down the core when it reaches 70 degrees. I'll do some more tests.


----------



## Basti

Looks like I'm wrong. Crashed at 120core.


----------



## V3teran

Quote:


> Originally Posted by *Basti*
> 
> Looks like I'm wrong. Crashed at 120core.


Are you using the 301.33 driver? What about your memory clock? Up your fan speed to around 80.


----------



## sundrou

Next up: installation and benches.


----------



## V3teran

Looking forward to seeing benchmarks from the MSI card. Please get them up ASAP; try this for starters:

Power target +135
core clock+130
Memory +200
Fan speed 70 minimum

See if you can get 700 on the memory clock.


----------



## Basti

This is with +110 core / +700 memory. I don't really know what to think anymore. On my past benches, I couldn't get past +260 memory. One thing's certain though: my driver (301.34) crashes past +113.



Voltage: 1.175
PT: 110%
Max Temps: 67-68 degrees


----------



## V3teran

Try driver 301.33,the driver your using does not work for everyone.


----------



## Basti

I did try 301.33 driver and it was the same as 301.34 except the shutdown bug.


----------



## jcde7ago

Quote:


> Originally Posted by *Basti*
> 
> This is with +110 core / +700 memory. I don't really know what to think anymore. On my past benches, I couldn't get past +260 memory. One thing's certain though: my driver (301.34) crashes past +113.
> 
> Voltage: 1.175
> PT: 110%
> Max Temps: 67-68 degrees


Leave the voltage at default (hit the 'reset' button in Precision X), then up your power target and try the same clocks again. A decent OC would be +100 MHz core / +200 MHz mem, so give that a try as well.

Also, i myself am not experiencing the shut down bug on 301.34.


----------



## Basti

Quote:


> Originally Posted by *jcde7ago*
> 
> Leave the voltage at default (hit the 'reset' button in Precision X), then up your power target and try the same clocks again. An decent OC would be +100Mhz core/+200Mhz mem., so give that a try as well.
> Also, i myself am not experiencing the shut down bug on 301.34.


The voltage was at default, card's using 1.175v during benchmark.


----------



## jcde7ago

Quote:


> Originally Posted by *Basti*
> 
> The voltage was at default, card's using 1.175v during benchmark.


I don't know, man. It's possible that you just got a bad OC'ing card...what's your PSU? Any reason to doubt its ability to supply ample power? It's probably not the issue, but worth asking anyways.


----------



## DarkTemplar

Ordered a GTX 690 on Saturday! Very excited. Not a very avid poster, but I've been following this thread for some time now. Missed a lot of notifications, but finally snagged an order.









It still shows as awaiting shipment, though. Anyone have an idea how long EVGA normally takes to ship cards after ordering?

Also, just wanted to say a quick thanks to everyone posting their OC's. Can't wait to try out some of these when mine arrives!


----------



## jincuteguy

Quote:


> Originally Posted by *DarkTemplar*
> 
> Ordered a GTX 690 on Saturday! Very excited. Not a very avid poster, but I've been following this thread for some time now. Missed a lot of notifications, but finally snagged an order.
> 
> 
> 
> 
> 
> 
> 
> 
> Shows as still being shipped though. Anyone have an idea on how long it takes before EVGA normally ships cards after order?
> Also, just wanted to say a quick thanks to everyone posting their OC's. Can't wait to try out some of these when mine arrives!


How did you get one on Saturday? Like, where did you get it from?


----------



## DarkTemplar

Quote:


> Originally Posted by *jincuteguy*
> 
> how did you get one on Saturday? Like where did u get it from?


I had EVGA set to notify me of any new stock. Honestly, this was the fifth or so email I've received, and the previous times I was not fast enough. There's about a 2-minute window, it seems, before they sell out again. I would just set it to notify you each time and cross your fingers. I left a tab open for EVGA on Saturday, ran to the computer, refreshed the screen, and mashed Buy Now. Still can't believe I got it. LOL.


----------



## Cheesemaster

Quote:


> Originally Posted by *CapnCrunch10*
> 
> 4.8GHz with a H100 on the 3960X is pretty damn good. What are your max temps and ambient temps?


As of right now my ambient is 35 average (that's at idle)... gaming hits an average of 72. Prime is just nuts... I don't do folding, but I can run any realistic bench (for what I use my computer for): 3DMark 11, Unigine. It's not a dyno queen; I've really used this daily for the last 5 months with no issues.


----------



## wRRM

Sup!

Here's my proof!

20120514_182848.jpg 2954k .jpg file


And here's a GPU-Z Validation.

http://www.techpowerup.com/gpuz/ncphx/

The brand is Gigabyte!


----------



## Basti

Quote:


> Originally Posted by *jcde7ago*
> 
> I don't know, man. It's possible that you just got a bad OC'ing card...what's your PSU? Any reason to doubt its ability to supply ample power? It's probably not the issue, but worth asking anyways.


I have a Corsair AX1200, so power won't be an issue. Maybe it's just a bad card. I'm tempted to sell it and get another.


----------



## jcde7ago

Quote:


> Originally Posted by *wRRM*
> 
> Sup!
> Here's my proof!
> 
> 20120514_182848.jpg 2954k .jpg file
> 
> And here's a GPU-Z Validation.
> http://www.techpowerup.com/gpuz/ncphx/
> The brand is Gigabyte!


Congrats, added!








Quote:


> Originally Posted by *Basti*
> 
> I have ax1200 corsair so power won't be an issue. Maybe it's just a bad card. I'm tempted to sell it and get another.


Yeah, it could just be a bad card man...if you do sell it, good luck getting another one!


----------



## V3teran

Yay somebody who has the same card as me,will be interested in the benchies.


----------



## V3teran

Quote:


> Originally Posted by *Basti*
> 
> I have ax1200 corsair so power won't be an issue. Maybe it's just a bad card. I'm tempted to sell it and get another.


Send it back and tell them it's giving you problems like stuttering no matter what driver you use, and that you want a different card. Even if they find nothing wrong with it (which they won't), you still want a new card, as it does not work well in your PC.


----------



## mxthunder

Cool stuff in here, guys! Inspired me to pull out my 9800GX2 for a few days and have some dual-GPU fun. The 690 is definitely a card I will have my eye on for a while.

Cool video here:







----------



## Fightar

Disregard


----------



## kemsoff

Quote:


> Originally Posted by *Fightar*
> 
> Disregard


This is for the 690 man not the 670









Looks like you beat me to it


----------



## Arizonian

Quote:


> Originally Posted by *mxthunder*
> 
> Cool stuff in here guys! Inspired me to pull out my 9800gx2 for a few days and have some duel GPU fun. The 690 is definitely a card I will have my eye on for a while.
> Cool video here:
> 
> 
> 
> 
> !


Hey MX









Definitely a sweet card on many different levels. In short: two fully functional GK104 GPUs, no compromise, with 5 power phases per GPU (10 total) rather than the 4 on reference 680s.

The list of benefits is pretty long this time around, more than the usual dual-GPU reasons. Best dual-GPU card to come out to date.

Looking forward to having you in the club should it work out for you. First-time dual-GPU owner myself. Haven't done more than an hour of gaming in Crysis at enthusiast settings. Quite impressive.

Once I get my entire new build dialed in perfectly, I'll be back to gaming. Working on the CPU side of overclocking ATM. See you around.


----------



## V3teran

Have you guys noticed how GPU1 always lags behind GPU2 by about 10-15 MHz? I wonder if you could do something about this...
The only way would be to un-sync the cards and raise the one that is lagging a bit, but I don't know how stable that would be.

New Heaven bench, 3 runs:
PT = +135
Core = +140
Memory = +500
Fan speed = 95 max
PCB1 = 1199
PCB2 = 1186


----------



## Arizonian

Quote:


> Originally Posted by *V3teran*
> 
> Have you guys noticed how gpu1 always lags behind gpu2 by about 10-15mhz i wonder if you could do something about this....


I have the same thing going on. My first GPU's Boost shows 1175 MHz core and my second 1201 MHz.

They are sync'd up. I wonder, if we uncheck the 'sync' option, whether we can add a touch more OC to the first GPU and not the second? Going to have to try this tonight.

I would be more than happy with both cores' GPU Boost at 1201 MHz.


----------



## tahoward

Yeah, I notice it pretty often; I think it mostly has to do with thermals, though. After some time both of my GPUs will run at the same speed with my +131 core offset. At +144, GPU2 likes to run at 1215 (+13 above offset) while GPU1 cruises at 1202.

I found my GTX 690 runs cooler/more stable with the FT02 filters removed from the AP fans, so I'll be doing another round of 3DMark/Heaven benching later this evening.


----------



## Arizonian

Quote:


> Originally Posted by *tahoward*
> 
> Ya notice it pretty often, I think it mostly has to do with the thermals though. After some time both of my GPUs will run at the same speed with my +131 core offset. +144 GPU2 likes to run at 1215 (+13 above offset) while core 1 cruises at +1202.
> I found my GTX 690 runs more cool/stable with the FT02 filters removed from the AP fans so I'll be doing another round of 3dmark/heaven benching later this evening.


Okay so the dual GPUs can run separate clocks from each other? Just like an SLI configuration?

That's awesome because if I can push one of the GPUs further than the other, in theory I should be able to get better FPS.

Going to give this a try later tonight thank you.


----------



## Shadowness

You can get a lot of answers from that podcast video one page back; it was very informative. Watched it, and I am even more amazed.


----------



## Dorgrene

Do I have to have a PCIe 3.0 motherboard to benefit from the full potential of the 690? I am currently using a Phenom II 1055T. My 690 will arrive tomorrow; afraid it will bottleneck my new GPU.


----------



## Dorgrene

Quote:


> Originally Posted by *Dorgrene*
> 
> Do I have to have PCIe 3.0 motherboard to benefit from the full potential of 690? I am currently using a Phenom 1055t now. My 690 will arrive tomorrow. Afraid it will bottleneck my new gpu.


Never mind, this post answered my question
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/53901-nvidia-geforce-gtx-690-review-25.html


----------



## jcde7ago

Quote:


> Originally Posted by *Dorgrene*
> 
> Do I have to have PCIe 3.0 motherboard to benefit from the full potential of 690? I am currently using a Phenom 1055t now. My 690 will arrive tomorrow. Afraid it will bottleneck my new gpu.


Virtually no performance loss at all....the GTX 690 is more than fine with a PCI-E 2.0 mobo.











EDIT: You ninja'd me 1 second quicker, already had this typed out and you had responded to your own question before I could see it, lol.


----------



## bnj2

Got my Asus GTX 690 today. A bit disappointed by the packaging; it looks like any ordinary Asus card, and inside the box, besides the card, there is only a generic leaflet, a CD, and a DVI-to-HDMI adapter.



But the card looks stunning!


----------



## Arizonian

@ BNJ2 - Congrats on the ASUS GTX 690!









On a side note - regarding Power Target during games.

I've been running benchmarks at +135 Power Target / +130 Core / +200 Memory, which translates to a synced 1145 MHz core and 1502 MHz memory. GPU #1 boosts to 1175 MHz core, GPU #2 to 1201 MHz.

Playing Crysis, I crashed. Hard freeze; had to Ctrl+Alt+Del into Task Manager and end the unresponsive process. I couldn't believe it was happening, because I've benched successfully through 3DMark11 and Heaven for a couple of days at this GPU overclock, and though I know gaming is the ultimate test of stability, I noticed something interesting.

In GPU-Z my TDP maxed at only 115%. So I decided to try something and lowered my Power Target to +125. I was able to game through Crysis at Enthusiast graphics settings with 8x AA (I find 16x doesn't provide any greater visuals, IMO) for over an hour, stable. So by lowering my Power Target during an intensive game that didn't require it at +135, I was able to maintain stability without having to downclock.

My max temps: GPU #1 71C at 1175 MHz max core, GPU #2 70C at 1201 MHz max core. I noticed my GPU usage was 96% and 97%, which makes me wonder if my i7 3770K at 4.30 GHz is still not enough and might be bottlenecking me. Going to check a couple of things before I come to a conclusion.

I will say that before we can make assumptions and really test our cards, we have to master not just GPU overclocking but CPU overclocking, and have everything down to the PCIe slots configured properly, before we can start to see true potential.

I've got my PCIe slot set to x4 mode in the BIOS and don't have an overvolt on my CPU, so I'm not reaching the 4.5 GHz I know others have been getting.

I will say though, checking Fraps every now and again, my FPS ranged from over 100 at the high end down to 61 at the lowest. It was smooth as butter; I was exploding things and getting in close for maximum graphical detail to really put this card to the test, and had no micro stutter. I'm very happy with my purchase. What a freaking card.


----------



## gatehous3

Quote:


> Originally Posted by *bnj2*
> 
> Got my Asus GTX690 today. A bit disappointed by the package, it looks like any ordinary Asus card and inside the box, beside the card, there is only a generic leaflet, a CD and a DVI->HDMI adapter.
> 
> But the card looks stunning!


Oh god... it's... so... sexyyyyyy.

Not happy about what comes in the box, though. IMO it should come with everything possible: a few adapters, power cables, stickers, etc.
Would be even better to see it come in a wooden crate with a crowbar, haha. You are paying for a $1200 card; it should at least look like that's what's inside.

I just found out that I'm #16 on the back-order list... fffffuuuu

Anyway, congrats man!


----------



## bnj2

Thanks!

EVGA's Precision X software is simply awesome; Asus's GPU Tweak is horrible. I hope they'll come up with a better version soon, to be on par with their AI Suite motherboard software.

Here is the Asus box content, if anyone is interested:


----------



## error-id10t

Quote:


> Originally Posted by *Arizonian*
> 
> My max temps GPU #1 71C 1175 MHz Max Core and GPU #2 70C 1201 MHz Max Core.
> I've got my PCIe slot set to x4 mode in BIOS and don't have an over volt on my CPU.


Not sure I understand this part... are you running it at x4 instead of x16?

Are the temps limiting your boost? I read someone say it downclocks for the first time at 70 degrees.


----------



## Arizonian

Quote:


> Originally Posted by *error-id10t*
> 
> Not sure I understand this part.. are you running it @ x4 instead of x16?
> Are the temps limiting your boost, I read someone say it downclocks at 70 degrees first time?


First question: in the UEFI BIOS there is a PCIe_16 x1 or x4 option. It's confusing me; it's like a multiplier, and I've never seen it before in the old BIOS. Not sure which I should be running it at?

Secondly, to answer your overclock question, which I do know: when we hit 70C the GPU dynamically downclocks whatever core clock you've reached by 13 MHz automatically. So the magic number for these cards is 70C.

By limiting your power target, two things happen, which sort of contradict each other. First, by limiting your voltage you essentially keep your temps a bit cooler and hold off longer from hitting 70C. I found that when I force 1.175v constantly, it raises temps faster rather than keeping the core clock forced higher more consistently, as I had thought.

Trying to maintain a constantly high OC'd core is what keeps the 600 series from doing as well as it could in benches where the graphics rendering isn't quite demanding enough for the card, so it downclocks itself. Hence the lower scores, and what AMD fans use to say their cards are better. In games where the rendering load is more constant we actually do better, and are either on par with or better than the AMD counterpart. That's why we do well across many tech-site reviews, even when both cards are overclocked. It's truly a gamer's card, built from the ground up.

Secondly, what inadvertently happens when lowering the power target: if the card requires more voltage than we are allowing it, it will not achieve its highest potential. The power target required seems to vary between games and benchmarks.

So it's a bit of a quandary, working with the dynamic overclocking that's happening behind the scenes and what we have to work with. I know we don't play our benchmarks, but it lowers the perception of our cards' true performance on OCN when competing.

It truly is a better gaming card, IMO, but not a benching card. We have seen, volt-mod vs. volt-mod against AMD cards, that we've hit the record highest core, but that's impractical in everyday usage. It also tells me Nvidia has conservatively capped the volt limit, and our cards are better than we are able to achieve, because as they stand Nvidia knew they would compete against the competition.

The 600 series has some innovative thinking behind it but took away some of the control from the end user. Notice how this year there hasn't been one fried reference card? It's a thing of the past; the 590 and 570 reference fiasco of last year is behind us. In the end: RMAs down, profits up.

Hope that clears up the second question. If you already knew, then I apologize up front.
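For what it's worth, here's roughly how I picture the two limits interacting, as a toy sketch. The real arbitration lives in the driver/firmware; the 13 MHz bin and all the numbers here are purely illustrative, based on what we've observed:

```python
# Toy sketch of how GPU Boost seems to arbitrate: the card backs the clock
# off a ~13 MHz bin when board power exceeds the power target, and another
# bin once it crosses 70C. Illustrative only -- not Nvidia's algorithm.
def limited_clock(requested_mhz, temp_c, draw_pct, power_target_pct):
    clock = requested_mhz
    if draw_pct > power_target_pct:  # power-limited: drop a bin
        clock -= 13
    if temp_c >= 70:                 # thermally-limited: drop a bin
        clock -= 13
    return clock

print(limited_clock(1175, 68, 110, 125))  # 1175 -- neither limit hit
print(limited_clock(1175, 71, 110, 125))  # 1162 -- thermal bin dropped
print(limited_clock(1175, 68, 130, 125))  # 1162 -- power-limited
```

That's the quandary in a nutshell: raising the power target buys clock headroom but pushes you toward the thermal limit sooner.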


----------



## error-id10t

Ah, I think that BIOS entry is just for the PCIe 2.0 slot, which I doubt you'd be using with this card.

Thanks for the summary on the 2nd point. The temp limit is a little concerning for people in warmer climates, but nothing a water block wouldn't fix.


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> First question: in the UEFI BIOS there is a PCIe_16 x1 or x4 option. It's confusing me; it's like a multiplier, and I've never seen it before in the old BIOS. Not sure which I should be running it at?
> Secondly, to answer your overclock question, which I do know: when we hit 70C the GPU dynamically downclocks whatever core clock you've reached by 13 MHz automatically. So the magic number for these cards is 70C.
> By limiting your power target, two things happen, which sort of contradict each other. First, by limiting your voltage you essentially keep your temps a bit cooler and hold off longer from hitting 70C. I found that when I force 1.175v constantly, it raises temps faster rather than keeping the core clock forced higher more consistently, as I had thought.
> Trying to maintain a constantly high OC'd core is what keeps the 600 series from doing as well as it could in benches where the graphics rendering isn't quite demanding enough for the card, so it downclocks itself. Hence the lower scores, and what AMD fans use to say their cards are better. In games where the rendering load is more constant we actually do better, and are either on par with or better than the AMD counterpart. That's why we do well across many tech-site reviews, even when both cards are overclocked. It's truly a gamer's card, built from the ground up.
> Secondly, what inadvertently happens when lowering the power target: if the card requires more voltage than we are allowing it, it will not achieve its highest potential. The power target required seems to vary between games and benchmarks.
> So it's a bit of a quandary, working with the dynamic overclocking that's happening behind the scenes and what we have to work with. I know we don't play our benchmarks, but it lowers the perception of our cards' true performance on OCN when competing.
> It truly is a better gaming card, IMO, but not a benching card. We have seen, volt-mod vs. volt-mod against AMD cards, that we've hit the record highest core, but that's impractical in everyday usage. It also tells me Nvidia has conservatively capped the volt limit, and our cards are better than we are able to achieve, because as they stand Nvidia knew they would compete against the competition.
> The 600 series has some innovative thinking behind it but took away some of the control from the end user. Notice how this year there hasn't been one fried reference card? It's a thing of the past; the 590 and 570 reference fiasco of last year is behind us. In the end: RMAs down, profits up.
> Hope that clears up the second question. If you already knew, then I apologize up front.


Excellent post!

I'll sum it up as best I can, from what I've been saying all along: the 690s *need* to be watercooled (at the very least) to achieve their higher potential, and if the voltage gets unlocked as well, then these things will FLY.









Regardless, the fact that some of us are matching and/or EXCEEDING GTX 680 SLI scores and performance (even when those 680s themselves are OC'd) is still pretty amazing.


----------



## V3teran

Nvidia Geforce 301.40 Beta driver
http://downloads.guru3d.com/Nvidia-GeForce-301.40-WHQL-64-bit-download-2910.html


----------



## jcde7ago

Quote:


> Originally Posted by *V3teran*
> 
> Nvidia Geforce 301.40 Beta driver
> http://downloads.guru3d.com/Nvidia-GeForce-301.40-WHQL-64-bit-download-2910.html


At work at the moment, but I've got the next couple of days off... if I get a quick minute to take a break from playing Diablo III tonight, I'll have to run some benches with these.

In the meantime, do you think you can run some benchmarks with these, V3teran, and compare scores? Curious to see if they're much better. Thanks!


----------



## V3teran

No different for me, TBH; just the same.
This is my max OC. I can't go any higher at all without instability in Heaven and 3DMark. What do you guys think... decent OC?

PT = +135
Core = +135
Memory = +500
GPU1 = 1194
GPU2 = 1182
Fan speed = 80


----------



## V3teran

Hey mate, can't you move your memory OC past +200 without it failing in Heaven and 3DMark?


----------



## Arizonian

Quote:


> Originally Posted by *V3teran*
> 
> Hey mate, can't you move your memory OC past +200 without it failing in Heaven and 3DMark?


I'm not sure if you're talking to me, but that's my limit on memory as well. I can OC with a +200 offset on memory, which gives me a 100 MHz OC to 1602 MHz. Anything past that will crash the driver.


----------



## jcde7ago

Quote:


> Originally Posted by *V3teran*
> 
> Hey mate, can't you move your memory OC past +200 without it failing in Heaven and 3DMark?


I have run my memory up to +730 MHz (with a +130 MHz core offset) perfectly stable through Heaven and 3DMark. It will fail at a +135-140 MHz core offset though, with that high of a mem OC. Perfectly stable if I stick with +130 MHz.


----------



## bnj2

And here is a benchmark of my Asus:


CPU was at my daily 4600 settings.


----------



## V3teran

Quote:


> Originally Posted by *Arizonian*
> 
> I'm not sure if you're talking to me, but that's my limit on memory as well. I can OC +200 offset on memory, which gives me a 100 MHz OC to 1602 MHz memory. Anything past that and the driver crashes.


Quote:


> Originally Posted by *jcde7ago*
> 
> I have run my memory up to +730 MHz (with a +130 MHz core offset) perfectly stable through Heaven and 3DMark. It will fail at a +135-140 MHz core offset though, with that high of a mem OC. Perfectly stable if I stick with +130 MHz.


Thank you to you both.
I can run at:
135 core
500 memory

If I run any more than that, even just 136 on the core or 510 on the memory, then I crash.
Even though I'm running at this OC, I still can't hit 1200 on the core; I'm just under by about 10 MHz.


----------



## Cobolt005

So, what's the best driver to download for the GTX 690 atm? I just got my card in from UPS and want to get this bad boy in my case.


----------



## Arizonian

Quote:


> Originally Posted by *Cobolt005*
> 
> So, what's the best driver to download for the GTX 690 atm? I just got my card in from UPS and want to get this bad boy in my case.


Currently I'm using 301.34, but it has a shutdown bug that won't let you power down your computer when you shut down Windows. Use the previous driver and that problem does not exist.

They are working on a fix for the next driver. I don't shut down my system often, so for me it's not a big issue. Waking from sleep still works perfectly.

P.S. Congratulations on your new GTX 690.


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> Currently I'm using 301.34, but it has a shutdown bug that won't let you power down your computer when you shut down Windows. Use the previous driver and that problem does not exist.
> They are working on a fix for the next driver. I don't shut down my system often, so for me it's not a big issue. Waking from sleep still works perfectly.


I just want to note that *the shut-down bug in 301.34 drivers DOES NOT AFFECT EVERYONE*. I have not run into this, and I shut my computer off every night before going to bed.

That said, it sounds like the 301.34 and the 301.40 are essentially the same performance-wise, so feel free to go with either one.


----------



## Arizonian

Quote:


> Originally Posted by *jcde7ago*
> 
> I just want to note that *the shut-down bug in 301.34 drivers DOES NOT AFFECT EVERYONE*. I have not run into this, and I shut my computer off every night before going to bed.
> That said, it sounds like the 301.34 and the 301.40 are essentially the same performance-wise, so feel free to go with either one.


Yes, that's true, and I'm sorry I didn't mention that. I even tried disabling hibernation through the command prompt and it did not fix my issue.

What I don't like doing is rolling back on drivers; I only move forward. I never download betas. For the last 17 months I've had not one driver issue with my 580 or my 680 GPUs. So it's been an awesome track record for me, not having to reformat Windows every time a driver comes out as I did when I was using AMD.

Again, like I said, for me it's not an issue, because I can still put my computer to sleep and wake it with no problem, and I don't usually ever shut my PC down anyway.



----------



## Suit Up

Hey peeps. I've overclocked my card and run some benchmarks, but only took note of my scores in Heaven.
I am using the 301.34 drivers.

Are my scores about where they should be based on my overclock?
My overclock is stable. Is it about what everyone is getting on air?

Cheers.



In case you can't see the above image:


Spoiler: Click to show



Overclock:

116% power target
+145 MHz GPU clock offset
+650 MHz mem clock offset

GPU-Z:

1060 MHz GPU clock
1165 MHz boost
1827 MHz memory

Heaven score:

FPS 113.3
Score 2853
Min 45.3
Max 273.6


----------



## Arizonian

Quote:


> Originally Posted by *bnj2*
> 
> And here is a benchmark of my Asus:
> 
> CPU was at my daily 4600 settings.


Quote:


> Originally Posted by *Suit Up*
> 
> Hey peeps. I've overclocked my card and ran some benchmarks but only took note of my scores in heaven.
> I am using the 301.34 drivers.
> Are my scores about where they should be based on my overclock?
> My overclock is stable. Is it about what everyone is getting on air?
> Cheers.
> 
> In case you can't see the above image:
> 
> 
> Spoiler: Click to show
> 
> 
> 
> Overclock:
> 116% power target
> +145 MHz GPU clock offset
> +650 MHz mem clock offset
> GPU-Z:
> 1060 MHz GPU clock
> 1165 MHz boost
> 1827 MHz memory
> Heaven score:
> FPS 113.3
> Score 2853
> Min 45.3
> Max 273.6


You both need to enter your scores in the Heaven 3.0 Benchmark thread. Amazing what these cards are doing without surpassing 1.175v in these benchmarks. Represent the 690s! My score was a lowly 97.1 FPS.

Gaming-wise I know our cards are superior, but in synthetic benches they don't fare as well.

http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-3-0-scores


----------



## Stateless

Hey all...been really busy and have not had time to post or mess with my cards much. But I am running into some odd stuff. For example, in some benchmarks like the Batman: Arkham City one, the cards are staying at stock settings for the most part...every once in a while GPU 3 will OC to about 1034, but the other 3 will stay at 915 stock. What is even more puzzling: in some areas where there is more stuff going on, the GPUs are not scaling up at all to compensate. My score in the Batman benchmark is actually lower than my 2x680s.

Another one is Witcher 2 in Ubersampling mode. Again, even OC'd, the performance is pretty much the same as it was with the 2x680s. In the same areas where the framerate would drop to about 30 FPS or so on 2x680s, it does the same with 4 GPUs. I do notice that the cards are definitely overclocking and hitting the boost limits, but GPU utilization is pretty much in the 30-40% range...so the game is really not taking advantage of the cards at all. For me, this is a major bummer, as when I had the 2x680s I was close to hitting my goal of Uber Witcher 2; a lot of the parts would be fine, but with a lot of drops down to the 30s depending on the area. I figured 2x690s would destroy the game, and basically am getting the same performance. My CPU is the 3930K overclocked to 4.6, so I know it is not the CPU.

In real benchmarks like Unigine/3DMark you can definitely see my cards are kicking major booty, so it seems to be more of a driver/GPU-usage issue in games causing the cards not to be fully utilized.


----------



## Cheesemaster

Quote:


> Originally Posted by *Stateless*
> 
> Hey all...been really busy and have not had time to post or mess with my cards much. But I am running into some odd stuff. For example, in some benchmarks like the Batman: Arkham City one, the cards are staying at stock settings for the most part...every once in a while GPU 3 will OC to about 1034, but the other 3 will stay at 915 stock. What is even more puzzling: in some areas where there is more stuff going on, the GPUs are not scaling up at all to compensate. My score in the Batman benchmark is actually lower than my 2x680s.
> Another one is Witcher 2 in Ubersampling mode. Again, even OC'd, the performance is pretty much the same as it was with the 2x680s. In the same areas where the framerate would drop to about 30 FPS or so on 2x680s, it does the same with 4 GPUs. I do notice that the cards are definitely overclocking and hitting the boost limits, but GPU utilization is pretty much in the 30-40% range...so the game is really not taking advantage of the cards at all. For me, this is a major bummer, as when I had the 2x680s I was close to hitting my goal of Uber Witcher 2; a lot of the parts would be fine, but with a lot of drops down to the 30s depending on the area. I figured 2x690s would destroy the game, and basically am getting the same performance. My CPU is the 3930K overclocked to 4.6, so I know it is not the CPU.
> In real benchmarks like Unigine/3DMark you can definitely see my cards are kicking major booty, so it seems to be more of a driver/GPU-usage issue in games causing the cards not to be fully utilized.


Well, my cards will be here Thursday or Friday; I too am running a quad setup. I'll see what's going on (we should stay in touch, because you're the only other one I know of that has a quad setup on the forums right now)... I was running Uber on Witcher 2 with my 3-way 580 GTXs (they are the 3 GB version) and always got more than 50 FPS, except the opening scene with Triss Merigold. Hmmmm... I have a 3960X @ 4.8 GHz. We shall see! Keep it up; the drivers will mature!


----------



## Arizonian

Well, I've discovered I don't have such a good memory OC going. My offset during benching can go as high as +200 for a 100 MHz OC to 1602 MHz memory. However, in Crysis tonight, after an hour of playing I crashed. Backed off on the memory and at least my core stayed stable with it lowered.

So far my stable overclock is *base 1145 MHz core* and *1552 MHz memory* gaming. Max GPU Boost on core: GPU #1 *1175 MHz* and GPU #2 *1201 MHz*. So the dynamic overclocking on the two GPUs differs.

I've been trying to nail down a nice stable overclock on my CPU without much luck other than benching. I have to revert to 4.2 GHz on my i7 3770K for stability.

When I push my GPU a tad too much, my system crashes if I'm at 4.5 GHz gaming and sometimes benching. I've not applied any voltage or configured my CPU OC properly as of yet. I'm positive I can hit 4.5 GHz, but I'm too much of a n00b on CPU OC and motherboard specs, let alone the new UEFI BIOS.

I also have to test my GPU in games more, which I started doing tonight with the CPU stable at 4.2 GHz. It's been a long process with an entirely new build, trying to figure it all out and be productive with results on OCN so we can see where these cards' performance really lies.

At the end of the day I try not to compare too much against others, because what matters most (I tend to forget) is the joy of the gaming performance my system brings me. I'm excited for Crysis 3.

Been testing stability tonight on Crysis (and having fun), which by far to this date is the best game there is for stability testing, IMO.







Figured out I had to back down on my memory from a 100 MHz OC to a 50 MHz OC to be truly stable, from offset +200 to +100 atm.

At any rate, I may have found my stable 24/7 overclock:

Power Target +135
Core +130 = 1145 MHz
Memory +100 = 1552 MHz

GPU #1 Boost 1175 MHz & GPU #2 Boost 1201 MHz


----------



## Stateless

Quote:


> Originally Posted by *Cheesemaster*
> 
> Well, my cards will be here Thursday or Friday; I too am running a quad setup. I'll see what's going on (we should stay in touch, because you're the only other one I know of that has a quad setup on the forums right now)... I was running Uber on Witcher 2 with my 3-way 580 GTXs (they are the 3 GB version) and always got more than 50 FPS, except the opening scene with Triss Merigold. Hmmmm... I have a 3960X @ 4.8 GHz. We shall see! Keep it up; the drivers will mature!


That is funny, because that is the scene I am talking about. After you leave the tent and you look towards the left where the catapults are launching boulders..I mean, there is a lot of complex scenery, lighting, shadows etc., and that is where it dips...I can position the camera to where it dips to 30 FPS. There are other areas there in the camp that do the same thing.

Appreciate you looking into this..seems like we have identical systems, so our results should be pretty close to each other. Keep me updated when you get your cards, your overclocking, etc.


----------



## Cheesemaster

Quote:


> Originally Posted by *Stateless*
> 
> That is funny, because that is the scene I am talking about. After you leave the tent and you look towards the left where the catapults are launching boulders..I mean, there is a lot of complex scenery, lighting, shadows etc., and that is where it dips...I can position the camera to where it dips to 30 FPS. There are other areas there in the camp that do the same thing.
> Appreciate you looking into this..seems like we have identical systems, so our results should be pretty close to each other. Keep me updated when you get your cards, your overclocking, etc.


I am also running three Acer 3D monitors in surround (5760x1080). My frame rates as stated were at 1920x1080. I just can't get surround to work with Witcher 2.. : (


----------



## m3t4lh34d

Quote:


> Originally Posted by *Cheesemaster*
> 
> Well, my cards will be here Thursday or Friday; I too am running a quad setup. I'll see what's going on (we should stay in touch, because you're the only other one I know of that has a quad setup on the forums right now)... I was running Uber on Witcher 2 with my 3-way 580 GTXs (they are the 3 GB version) and always got more than 50 FPS, except the opening scene with Triss Merigold. Hmmmm... I have a 3960X @ 4.8 GHz. We shall see! Keep it up; the drivers will mature!


There are plenty of people with quad setups on the forums, like myself.


----------



## phantomphenom

Pulled the trigger on the Asus 690...it's back-ordered until the end of the month, but when it ships, I'll post the proof!!!!!


----------



## Tslm

Quote:


> Originally Posted by *Arizonian*
> 
> At any rate, I may have found my stable 24/7 overclock:
> 
> Power Target +135
> Core +130 = 1145 MHz
> Memory +100 = 1552 MHz
> 
> GPU #1 Boost 1175 MHz & GPU #2 Boost 1201 MHz


That is such a sweet overclock for a dual-GPU card. That'd be faster than some 680 SLI setups.


----------



## tahoward

Posted these results in the top 30:

TaHoward ---- 2600k GTX 690 P16214 Score

3dmark.com/3dm11/3444042



TaHoward --- 2600k / 5.0Ghz ---- GTX 690 , 1190/ 2380 / 3715 ---- 111.9 ---- 2819


----------



## Cheesemaster

Quote:


> Originally Posted by *m3t4lh34d*
> 
> There are plenty of people with quad setups on the forums, like myself


Awesome sauce! So my cards are out for delivery... I got one for my little bro, but my mom flipped out.. no more games for my brother for a while. Looks like I'll have an extra laying around. So, anywayz, my cards will be here very soon today; I got the EVGA Signature card.. I'll take pics of me in the shirt and dry humping the 690 GTX mouse pad... tear down the system for a deep clean... giving my three 580 GTXs to my friend and pulling the Corsair AX1200 out.. that is going to a friend as well, and putting in the XFX 1250 Black Edition... now I can finally see what my Dell Core Duo desktop can do...


----------



## Arizonian

Quote:


> Originally Posted by *Cheesemaster*
> 
> *snip*
> Awesome sauce! So my cards are out for delivery. So, anywayz, my cards will be here very soon today; I got the EVGA Signature card.... now I can finally see what my Dell Core Duo desktop can do...


Quote:


> Originally Posted by *tahoward*
> 
> *snip* Posted these results in the top 30:
> 
> TaHoward ---- 2600k GTX 690 P16214 Score
> 
> TaHoward --- 2600k / 5.0Ghz ---- GTX 690 , 1190/ 2380 / 3715 ---- 111.9 ---- 2819


Congrats on your cards gentlemen.


----------



## iARDAs

I have been reading more and more reviews about the 680 vs. the 690. I would like to say that the 690 is finally a great dual-GPU card from Nvidia. It performs like 2x reference 680s. Great job, Nvidia.

If I was living in the States, I would have preferred a 690 over 680 SLI for the lower heat and power consumption.

However

1-) Due to the long time it takes to sell a dual-GPU card second hand,

2-) Due to the fact that 2x 680 OC versions will perform a bit better than a 690,

3-) Due to the fact that if something happens to a card, I can send one in for repair and keep the other one to play games until it comes back,

I will be picking up 2 680 OC versions.

But honestly, the 690 is the best card on the planet at the moment.

It also outperforms a Mars II, right? Or am I mistaken?


----------



## crazychink

I just got mine two days ago; haven't done anything yet.



I got them with my friend, so I had the chance of taking this rare picture of three GTX 690s together.


----------



## brettlaf

-


----------



## bnj2

Quote:


> Originally Posted by *crazychink*
> 
> I just got mine two days ago; haven't done anything yet.


You need one of those awesome SLI bridges Nvidia used in the 690 release pics.


----------



## Arizonian

Quote:


> Originally Posted by *bnj2*
> 
> You need one of those awesome SLI bridges Nvidia used in the 690 release pics.


No kidding. The SLI bridge they used looked space-age. Definitely fitting for the GTX 690 in all its beauty. Best darn-looking card ever.


----------



## shiloh

Finally received mine! Official proof:




























Can't wait to install it tonight!!


----------



## Arizonian

I got some news regarding the 'power down system' bug that is affecting some of us. It's not just an Nvidia bug; it also hits AMD cards, so the issue is more of an Asus- or Ivy Bridge-related problem.

http://www.overclock.net/t/1246595/official-asus-sabertooth-z77-owners-thread-club/480#post_17264344


----------



## bnj2

Rule out IB because I get it on my SB rig.


----------



## Cheesemaster

Oh yeah!


----------



## Cheesemaster

Fail!


----------



## Arizonian

Congrats Cheesemaster. Must feel epeen.







Enjoy those babies.


----------



## shiloh

I did some benchmarks tonight; however, it was on my backup/test rig that has an old Q9650 running at 4.2. I could not wait to try it, so I didn't feel like messing with the loop of my main rig tonight... so on the old rig it goes!

The screenshots below are the best OC I was able to get so far... The Heaven bench result is probably 10 to 15% lower than those who have Sandy/Ivy Bridge.

I really like this card!

Cheers!

















Heaven bench Results:










GPU Graph:


----------



## Kyouki

Finally got mine today =) Just now getting to have a little fun with it. Sorry for the bad pics from my cell phone, but I figure everyone here has seen some hi-res pictures by now.






You can see in the pictures of the case that I also modded an area at the bottom to hide my power supply and cables, because the SilverStone TJ10 has bad cable management and little room behind the MB tray. I really like how it turned out! I just used aluminum mesh from frozencpu.com and shaped it to fit. I really want to finish this with a black and green theme since I have the Nvidia ESA edition case; the Asus Rampage throws it off a little with the red, but I think it looks good!

Also, I haven't benched the card yet or overclocked it, but I ran Precision X while playing a game of LoL at max settings, and it didn't go over 40C and stayed at 324 MHz; the game didn't even faze the card, haha.

(Edit: I take that back; GPU 1 was doing nothing while GPU 2 did it all, but I wasn't viewing both at the time.)


----------



## shiloh

Quote:


> Originally Posted by *bnj2*
> 
> Rule out IB because I get it on my SB rig.


+1 Just had the problem on an old Yorkfield!


----------



## Kyouki

So why is only one of my GPUs doing all the work (GPU 2) while GPU 1 does nothing? I have 2 Samsungs plugged into the top 2 DVIs, and I am running them both as multiple displays, but I would assume I should still be using SLI. I was playing a game of LoL and GPU 1 sat at 324 MHz while GPU 2 went to 1071 MHz, without any changes to Precision X. Am I missing something? I am coming from 2 old 8800 GTSs in SLI to a GTX 690. Loving the card, but I want to get the most out of it. I also ran 3DMark11, got a low score, and it said SLI disabled. Any help would be nice!

Sorry for any bad grammar on this one; it's late and I typed this up in a few seconds, haha.


----------



## bnj2

Quote:


> Originally Posted by *Kyouki*
> 
> So why is only one of my GPUs doing all the work (GPU 2) while GPU 1 does nothing?


Go to NVIDIA control panel and set it up like this:


----------



## Arizonian

Is anyone else having issues with games constantly crashing? I'm on the 301.34 driver. I've got my memory turned down to default and am only working on the core atm. +134 is crashing in Crysis. Power Target +125, since I'm not reaching over +118 anyway. I'm pushing an intense game to see true stability where it counts.

I guess I can try stock next. BRB. See how it turns out.


----------



## Kyouki

Quote:


> Originally Posted by *bnj2*
> 
> Go to NVIDIA control panel and set it up like this:


Thank you. I couldn't do that at first because of where I had my second monitor plugged in. I moved it to the DVI-D and now it allows me to select that as an option, and it is now in SLI. I also re-did my 3DMark and my score went up an extra 6k points, so I am happy; now I'll really start playing with it. Just a stupid user error on my part! Need some sleep, haha.


----------



## Cheesemaster

Got a P25000 score in 3DMark11. Is that good?


----------



## blackend

Quote:


> Originally Posted by *Cheesemaster*
> 
> Got a P25000 score in 3DMark11. Is that good?


Yes, it is good. I have 4 GTX 680s, and the score I got was P27000.


----------



## Arizonian

Yes, P25000 is a good score, Cheesemaster. What base core & memory clocks are you running? What core clock do you max out at?

On a side note: just re-tried Crysis at +130 core offset @ 1045 MHz base core, no memory OC atm, and it's staying stable thus far. Core boost hitting GPU #1 *1175 MHz* & GPU #2 *1201 MHz.*

If I could stay stable with this core OC I'd be happy, coming from a 915 MHz base / 1050 boost. I'd settle for no memory OC to keep the core heavily OC'd 24/7.

Will keep testing. Nothing really tests a GPU like an intensive game. Benchmarks like Heaven and 3DMark11 don't really tell the stability picture. You see a lot of people getting these high clocks, which means diddly in real-world usage of our GPUs. What's stable is all that matters.

Anyway, I think I know what my problem is: I don't have my CPU overclocked properly, and so I'm unstable with my CPU OC.

Going to a completely new system really makes things a little more difficult to pinpoint when you have problems and you're not sure where they're coming from, because there are too many variables that can be in play.


----------



## Cheesemaster

Quote:


> Originally Posted by *Arizonian*
> 
> Yes, P25000 is a good score, Cheesemaster. What base core & memory clocks are you running? What core clock do you max out at?
> On a side note: just re-tried Crysis at +130 core offset @ 1045 MHz base core, no memory OC atm, and it's staying stable thus far. Core boost hitting GPU #1 *1175 MHz* & GPU #2 *1201 MHz.*
> If I could stay stable with this core OC I'd be happy, coming from a 915 MHz base / 1050 boost. I'd settle for no memory OC to keep the core heavily OC'd 24/7.
> Will keep testing. Nothing really tests a GPU like an intensive game. Benchmarks like Heaven and 3DMark11 don't really tell the stability picture. You see a lot of people getting these high clocks, which means diddly in real-world usage of our GPUs. What's stable is all that matters.
> Anyway, I think I know what my problem is: I don't have my CPU overclocked properly, and so I'm unstable with my CPU OC.
> Going to a completely new system really makes things a little more difficult to pinpoint when you have problems and you're not sure where they're coming from, because there are too many variables that can be in play.


----------



## V3teran

Had an MSI card and it was absolutely terrible; it only went 110 on the core and 100 on the memory, and it draws 110% on PCB 1 and 2. It really struggles to run even at stock. I've sent it back... MSI = AVOID AT ALL COSTS!!!


----------



## jcde7ago

Hey guys,

Apologies for not keeping up to date; I had a bunch of real-life stuff come up this week, and then there was juggling work and Diablo III (an almost insurmountable task, lol)....*if I need to add you to the club, please let me know, or if you want it done faster, PM me with a link to your proof.*
Other than that, good to be back after a few days.


----------



## ceteris

Yeah, D3 has been keeping me pretty tied up too, lol. I'm still having trouble hunting down a 2nd 690.


----------



## kemsoff

Quote:


> Originally Posted by *V3teran*
> 
> Had an MSI card and it was absolutely terrible; it only went 110 on the core and 100 on the memory, and it draws 110% on PCB 1 and 2. It really struggles to run even at stock. I've sent it back... MSI = AVOID AT ALL COSTS!!!


That stinks. My EVGA one has hit 145 GPU and 725 memory; hoping to push it more once the water blocks start coming out.


----------



## V3teran

Quote:


> Originally Posted by *kemsoff*
> 
> That stinks. My EVGA one has hit 145 GPU and 725 memory; hoping to push it more once the water blocks start coming out.


Yeah, that's a good card you got there; probably 1 in 100.


----------



## kemsoff

Quote:


> Originally Posted by *V3teran*
> 
> Yeah, that's a good card you got there; probably 1 in 100.


Guess I just got lucky. I'm sure it'll go higher, but the temps climb really fast with it pushed that far.


----------



## Stateless

Quote:


> Originally Posted by *kemsoff*
> 
> Guess I just got lucky. I'm sure it'll go higher but the temps climb really fast with it pushed that far


Yeah, my cards pretty much max out at +120 offset and +250 on the memory. Anything beyond that and they crash...it's after about 5-6 runs of Unigine that they crash when set higher. It is, however, a rock-solid overclock, as I have run Unigine, 3DMark11, Metro 2033, Witcher 2, Crysis 2 and Diablo 3 on it with no issues at all...temps stay around the high 60s depending on the game, with the occasional GPU going into the 70s C for a brief moment.


----------



## kemsoff

Quote:


> Originally Posted by *Stateless*
> 
> Yeah, my cards pretty much max out at +120 offset and +250 on the memory. Anything beyond that and they crash...it's after about 5-6 runs of Unigine that they crash when set higher. It is, however, a rock-solid overclock, as I have run Unigine, 3DMark11, Metro 2033, Witcher 2, Crysis 2 and Diablo 3 on it with no issues at all...temps stay around the high 60s depending on the game, with the occasional GPU going into the 70s C for a brief moment.


Mine doesn't hit 70 unless I'm at 140 and 500+. The last run I did at 145 and 725, I was hitting 74 or so. Can't wait to get this thing under water.


----------



## bjrebo

I am thinking that I have one of the worst-overclocking GTX 690s available. It will only do 112 and 500, and the card gets up into the 80-degree range. I have an i7 at 5.2 on water and would like to push the card to its limits. Should I return this card? I purchased it at Microcenter last weekend.


----------



## Arizonian

Quote:


> Originally Posted by *bjrebo*
> 
> I am thinking that I have one of the worst-overclocking GTX 690s available. It will only do 112 and 500, and the card gets up into the 80-degree range. I have an i7 at 5.2 on water and would like to push the card to its limits. Should I return this card? I purchased it at Microcenter last weekend.


I'm pushing +130 / +200: 1145 MHz core, 1602 MHz memory. I've been crashing in Crysis unless I put my memory at stock.

Might be my CPU OC. Most likely I need to add a tiny bit of voltage. I'm going to try my GPU OC tonight with the CPU at stock to confirm which might be the culprit. It's not a driver failure but a hard freeze, so it might be an unstable CPU.

At +130, 1145 MHz, my boost ends up at GPU #1 1175 MHz & GPU #2 1202 MHz.

I bet if you lowered your memory you'd get a higher core. Now keep in mind we start at 915 MHz core and 1502 MHz memory, so if you're anywhere near 1140 MHz base core, you've got a healthy OC when boost kicks in.

Instead of telling us your offsets, you should be stating your base core and memory, then the max boost you see your GPU hitting in GPU-Z. Then you'll see what overclocks you're actually achieving. Are you testing it in games or benches atm?
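A small sketch of that reporting convention; the stock clocks here are the reference GTX 690 figures assumed from this thread (915 MHz base core, 1502 MHz reported memory), and `RATED_BOOST_MHZ` is only the spec-sheet boost figure, since GPU Boost lands higher on individual cards:

```python
# Convert raw Precision-style offsets into the clocks worth reporting.
# Stock reference-GTX 690 clocks are assumed; actual boost seen in GPU-Z
# (e.g. 1175/1201 MHz above) varies per GPU and is measured, not computed.

BASE_CORE_MHZ, RATED_BOOST_MHZ, STOCK_MEM_MHZ = 915, 1019, 1502

def describe_oc(core_offset: int, mem_offset: int) -> dict:
    return {
        "base_core_mhz": BASE_CORE_MHZ + core_offset,
        "rated_boost_mhz": RATED_BOOST_MHZ + core_offset,
        "mem_mhz": STOCK_MEM_MHZ + mem_offset / 2,  # GDDR5 offset halved
    }

print(describe_oc(130, 200))
# {'base_core_mhz': 1045, 'rated_boost_mhz': 1149, 'mem_mhz': 1602.0}
```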

BTW - Welcome to OCN!







Noticed that was your first post.


----------



## bjrebo

In Unigine Heaven 3.0, about the best I get is 1163 before a crash. If both GPUs scale up to 1176 and they go over 70 degrees, it crashes shortly after. I just did a run on core one at 141 and 500; the clock was at 1186 and it ran without a crash (72 degrees). Could it be that I have a very weak core 2? I ran overclocks with no memory OC and the same thing happened. Please tell me what you think I should do.


----------



## Shadowness

I feel like this is a perfect opportunity to get back into the discussion (lonely sucker).... Anyway, on the topic of triple monitors vs. the big screen mentioned earlier, I think I might have a point or two.

So, first of all, I was thinking about 3 screens, 120 Hz, 1080p. And you know, I then connected it with the power needed, and I got some scary results. In DiRT 3 the GTX 690 pretty much destroys it, at around 200 FPS. But go into Battlefield 3 and you end up with 115 FPS +/-.

Now, if this scaled proportionally (I'm not really an expert on how FPS decreases with multiple monitors, or whether there is some formula), you would need 2 690s for two monitors to maintain 120 FPS. Taking scaling into account, we would most likely turn some AA or shadows down. Adding a third monitor, we pretty much get the scary result: this generation can't run the most demanding games at 120 FPS on 3x 120 Hz monitors, no matter what GPU setup you use.

It doesn't really matter if I made a mistake here and there, for the final result will be the same. So, you know, I was thinking further. I'd take triple monitors any second, but the GPU power is not there. Also, if I bought them, it would have to be them + a quad-GPU setup. Kinda expensive if you ask me. So my conclusion is a big screen as the next step.

Maybe we save some money, sell the 690, and maybe the 790 will be close and I might pull the trigger, but so far the big screen is my next best option. And a big screen for me would be my favorite 40" Samsung 3DTV.

I can already imagine all these awesome movies or TV series like, say, The Walking Dead playing via the Nvidia 3D Vision player on that big TV. Or flying a jet in BF on that big screen? Simply gorgeous.

So that's what I wanted to post; been thinking about it for quite some time, you know. As for probably the only concern, multitasking, I am going to have the TV, one monitor, and my laptop monitor. Seems good enough to me. Thoughts?
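The back-of-the-envelope scaling above can be sketched like this; it naively assumes framerate scales inversely with pixel count and that multi-GPU scaling is a flat efficiency factor, both of which real games only approximate:

```python
# Naive surround-gaming estimate: assume FPS scales inversely with pixel
# count (three 1080p panels = triple the pixels) and that SLI scaling is
# a flat efficiency factor. Both are rough assumptions, not measurements.

def estimated_fps(single_screen_fps: float, n_screens: int,
                  n_gpus: int = 1, sli_efficiency: float = 1.0) -> float:
    return single_screen_fps / n_screens * n_gpus * sli_efficiency

bf3_one_690 = estimated_fps(115, 3)                                # ~38 FPS
bf3_two_690 = estimated_fps(115, 3, n_gpus=2, sli_efficiency=0.9)  # ~69 FPS
print(round(bf3_one_690), round(bf3_two_690))
```

Even with the optimistic 90% scaling factor, two 690s land well short of 120 FPS on three screens, which is the "scary result" in question.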


----------



## Raf Leung

How did u guys afford this stuff? Do most of u guys have a job or work?


----------



## Kyouki

Got a good-paying job and good financial habits! So I can afford my toys!


----------



## Arizonian

Quote:


> Originally Posted by *Raf Leung*
> 
> How did u guys afford this stuff? Do most of u guys have a job or work?


I work hard and have been setting money aside in an envelope since last January, after buying a 580. If you start now, by next April you'll be ready for a 790. I'm going to start saving for a large SSD to replace my HDD as my main storage drive. A large-capacity SSD is about a grand as well.

This 690 will have to last me two years this time.

You'll also notice on OCN many SLI or Crossfire setups, which cost the same. For a lot of members here it's more of a hobby of interest than a need.


----------



## Cheesemaster

^^^ I'm with stupid! ^^^


----------



## ajslay

Saw the Motherboards.org review for these. Elric has a band???? OMG, that's pretty freaking awesome!

Just my type of music also; it sounds so badass/sick.


----------



## V3teran

I now have 2 of these; will run benchies later.


----------



## Arizonian

Sweet V3teran







Congrats. Let us know how she performs.

On a side note: not sure why I'm crashing with everything stock, from CPU to GPU, on Crysis with maxed-out settings except AA at x8 instead of x16. I'm above 38 FPS in the last two scenes of the game. It's the most demanding part, and I've crashed a few times now. It doesn't say the drivers stopped responding; it just freezes.

I may be calling Newegg for a replacement. If I can't play games at stock on this, it's not worth the $1000 I spent. I'm pretty bummed, because I'm sure they won't have a replacement. Hopefully they will fulfill it.

Will be trying this again tomorrow, but I'm pretty positive now that everything is at stock. It's not too surprising that I did OK in 3DMark11 and Heaven but a real intense game tests the GPU properly. From now on, instead of jumping to benchmarks, I'm testing on games first.


----------



## V3teran

Try this:
go into a Windows command prompt and type in
sfc /scannow (press Enter)

Windows will now check the integrity of your system files. Let me know what it says at the end of the scan, as errors here can also cause hard freezes when heavy overclocking occurs.


----------



## V3teran

OK people, I tried these in quad SLI using the 301.40 drivers.
I tried x16/x8 PCIe lanes, which gave me a tremendous amount of stutter.
I also tried x16/x16 lanes, which improved things a great deal, but whenever I came to bench them only 1 GPU out of the 4 was at full load and the rest were just idling. I tried 3DMark Vantage and Heaven.

At one point, according to GPU-Z and the Nvidia control panel, the cards were running in tri-SLI and not quad SLI, and no matter what I did regarding power features or reinstallation of drivers using *Your Uninstaller* and *Driver Sweeper*, I couldn't change this.

My conclusion is that I couldn't get them to work properly. The stutter was so bad, even around the desktop, that I found it difficult to click on anything; it's probably an immature driver problem.

My 295s (SPCB) ran perfect and I had no stutter whatsoever with anything, but after testing these 690s today in quad SLI I really, truthfully, honestly would steer well clear of quad SLI, at least at the moment. I found it an appalling experience... you don't need it anyway, even if it did work.

After my experience today I don't think I will ever be going quad SLI with these, tbh, as I'm sending one back... I don't recommend it.


----------



## Arizonian

V3teran, regarding your experience with quad SLI 690s, sorry to hear about this. Definitely an early driver, as it's pretty much the first one out. That might be the root of our problems atm. I agree that if you're going to put down $2000 for these cards, you'd want them fully functional, or why bother?

Thanks for the heads up. How is a single card doing in your experience?

On a previous note: I did what you said in CMD and the prompt said it did not find any integrity violations. I tried it right after it happened again just now. I did have a small OC of 4.2 on the CPU and only a +130 offset on the GPU core. The same thing happened, though, when everything was at stock.

It's happening in the 'To Hell and Back' area toward the end of Crysis, but my screen locks in the game have also been in various demanding places. What's your take on this? I never had this sort of issue with my single 680 (in this rig for a brief time) when I had an OC on it.

I have Crysis set to the highest 'enthusiast' settings and only turned AA x16 down to AA x8. Two cards in SLI, or even a single GPU, should be able to handle this, right?

Perhaps I should try Crysis 2 at maxed settings before I jump to sending back the 690. If it's not the 690, I'd hate to send it back and have them unable to replace it. I love the card for many reasons that suit me well.

Downloading the HiRes DX11 update now. Since doing a clean install of Win7 when putting this rig together, I've had to reload all my games.


----------



## Cheesemaster

Quad SLI does tax my system. I am running a 3960X @ 4.8 GHz and I feel my CPU is holding these things back. However, I did get a consistent 300-point increase in 3DMark 11 with a slightly lower GPU clock: Precision-X 135% PT, 135 core offset and 200 mem offset.


----------



## V3teran

If you're running everything at stock settings, as in CPU, GPU, RAM etc., and it's still freezing, you may want to check your RAM.
Use Memtest with only a single stick in each slot, and try each single stick in all slots on your motherboard.
Failing that, send the card back and get an exchange; I would, in fact I have!
The PSU is also worth keeping in mind. The recommended standard is a 750 W unit, as at full load the card will use that amount, not to mention your overclocked CPU and RAM. Personally I would want 1000 W minimum with a 690; I'm using a Corsair 1200 W. Take a look here, read this page:
http://www.guru3d.com/article/geforce-gtx-690-review/8

Here is Guru3D's power supply recommendation:
GeForce GTX 690 - On your average system the card requires you to have a *750 Watt* power supply unit.
GeForce GTX 690 SLI - On your average system the cards require you to have a 950 Watt power supply unit at minimum.

*If you are going to overclock the GPU or processor*, then we do recommend you purchase something with some more stamina.

There are many good PSUs out there, please do have a look at our many PSU reviews as we have loads of recommended PSUs for you to check out in there. What would happen if your PSU can't cope with the load:
- bad 3D performance
- *crashing games*
- spontaneous reset or imminent shutdown of the PC
- *freezing during gameplay*
- PSU overload can cause it to break down


----------



## bnj2

Quote:


> Originally Posted by *V3teran*
> 
> Here is Guru3D's power supply recommendation:


Why is everyone exaggerating the power requirements?
I remember last year when I got my 580 SLI cards everyone was saying that 850W might not be enough. It turned out to be just fine.
Even though in that Guru3D article they get... wait for it... *400W* at the wall with the GPU under full stress, they still recommend a 750W PSU... really? Hell, add another 150W if the CPU were at full load and you still only get 550W *at the wall socket*!
My guess is that an 850W PSU would be enough for a 690 SLI setup, but I'll test it when I get my 2nd 690 in the next few weeks.
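For rough numbers, here's that arithmetic sketched in Python. The 400 W and 150 W figures come from the Guru3D measurements quoted above; the ~87% efficiency is an assumption on my part (typical of a decent 80 Plus unit at this kind of load), not a measured value:

```python
# Back-of-envelope PSU sizing. Wall-socket draw includes the PSU's own
# conversion losses, so the DC load the PSU actually delivers is lower.

def dc_load_watts(wall_watts, efficiency=0.87):
    """Convert wall-socket draw to the DC load on the PSU (assumed efficiency)."""
    return wall_watts * efficiency

gpu_wall = 400  # W at the wall, GPU stress test (Guru3D)
cpu_wall = 150  # W extra if the CPU were also fully loaded

total_wall = gpu_wall + cpu_wall      # 550 W at the socket
total_dc = dc_load_watts(total_wall)  # actual DC load the PSU delivers

print(f"{total_wall} W at the wall -> ~{total_dc:.0f} W DC load")
```

On those assumptions, even an 850 W unit leaves well over 300 W of headroom for a single 690, which is why the 750 W recommendation already looks conservative.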


----------



## Cheesemaster

Quick opinion on these cards: I am running a quad setup... these cards are beasts! You don't really need to overclock much; you can set a 135% power offset and a 100 MHz clock offset and you're done. These things automatically overclock on boost to about 1150 MHz average across the GPUs. Other than benching, who needs four GPUs running at 1150 MHz? What I should say is, you don't need any more. I am running BF3 on Ultra with 2x MSAA on three 120 Hz Acer monitors, that's 5760x1080, and am getting 80-90 FPS! Do I need to say more... still can't run Minecraft though. Cheers to all who have them; these cards are freaking wicked! To those who are going to water block these things and take them even further, you guys are the real beasts. That is like putting twin turbos on a Lambo... but hey, you guys will be the ones to push the technological envelope to the max, which in turn finds its way down to the rest!

P.S. The cards are limited by thermals: the colder they are, the faster they boost. I can easily see these things hitting 1400 MHz under water.
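A purely illustrative toy model of that thermal behavior. The 1019 MHz base is the 690's rated boost clock; the bin size, temperature thresholds, and stepping rule below are invented for the sketch and are not NVIDIA's actual GPU Boost algorithm:

```python
# Toy model: cooler GPU -> more boost bins applied. All constants except
# BASE_MHZ are hypothetical; real GPU Boost also weighs power draw, etc.

BASE_MHZ = 1019      # GTX 690 rated boost clock
BIN_MHZ = 13         # hypothetical size of one boost bin
TEMP_TARGET = 70     # hypothetical temperature where extra bins stop

def boost_clock(temp_c, power_headroom_bins=10):
    """Illustrative boost clock as a function of GPU temperature."""
    if temp_c >= 98:  # thermal limit reached: throttle below rated boost
        return BASE_MHZ - 4 * BIN_MHZ
    # one extra bin for every ~3 C of headroom below the target temperature
    thermal_bins = max(0, (TEMP_TARGET - temp_c) // 3)
    return BASE_MHZ + min(power_headroom_bins, thermal_bins) * BIN_MHZ

for t in (40, 60, 80):
    print(t, "C ->", boost_clock(t), "MHz")  # colder card, higher clock
```

The point of the sketch is just the monotonic relationship: drop the temperature (say, with a waterblock) and the card sustains more bins, which is why water-cooled cards boost well past their rated clock.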


----------



## phantomphenom

I pre-ordered the Asus 690 off Amazon 9 days ago... I wonder how much time I'll have to wait. Did anyone else pre-order, and how long did you have to wait for it to ship? I'm waiting on this card for my new build -_-'


----------



## V3teran

I wouldn't count on Amazon, tbh. You may get it in a week, you may get it in a month, you may get it in 3 months.


----------



## Cheesemaster

Just had my best run! I got a 37772 Mark P graphics score! Here is a contrast with my previous setup... my new score:


----------



## V3teran

Great score man!


----------



## emett

Nm, he's talking about the graphics score.. I got excited..


----------



## Stateless

Quote:


> Originally Posted by *V3teran*
> 
> OK people, I tried these in quad SLI using the 301.40 drivers.
> I tried x16/x8 PCIe lanes, which gave me a tremendous amount of stutter.
> I also tried x16/x16 lanes, which improved things a great deal, but whenever I came to bench them only 1 GPU out of the 4 was at full load and the rest were just idling. I tried 3DMark Vantage and Heaven.
> At one point, according to GPU-Z and the Nvidia control panel, the cards were running in tri-SLI and not quad SLI, and no matter what I did regarding power features or reinstallation of drivers using *Your Uninstaller* and *Driver Sweeper*, I couldn't change this.
> My conclusion is that I couldn't get them to work properly. The stutter was so bad, even around the desktop, that I found it difficult to click on anything; it's probably an immature driver problem.
> My 295s (SPCB) ran perfect and I had no stutter whatsoever with anything, but after testing these 690s today in quad SLI I really, truthfully, honestly would steer well clear of quad SLI, at least at the moment. I found it an appalling experience... you don't need it anyway, even if it did work.
> After my experience today I don't think I will ever be going quad SLI with these, tbh, as I'm sending one back... I don't recommend it.


Something is up with your setup, because my quad is running fine. Through Unigine and 3DMark11 all 4 GPUs are running very close, and in some cases at the exact same speed. I have no issues with the desktop, surfing the web, or anything else remotely like you are experiencing. It sounds like something is definitely wrong somewhere in your setup that is causing this. I am sure driver maturity can and will improve things, but the issues you have can't really be driver related, as I am running those same drivers.


----------



## Arizonian

OK, quick update with Crysis, and I even tested Crysis 2, where it freezes or crashes to desktop.

I've swapped my AX850 with another new AX850 as a test. I'm experiencing the same issue of the game freezing up.

I've brought everything to stock CPU & GPU, and the RAM is never overclocked.

So my next step is to test each RAM stick individually with Memtest. I hope it's a bad RAM stick rather than a bad GTX 690.

I know that 850 watts powered two overclocked GTX 580s with overclocked CPUs on other people's rigs all last year without any problems. My current AX850 is brand new as well, so I know it's not that I don't have enough wattage. I thought it may have been a bad PSU but that's not the case; the second new one I purchased is going back. In fact, my entire rig is a new build from the ground up.

If anyone has any advice, I'll accept it.


----------



## MrTOOSHORT

Give your RAM some more voltage, Arizonian. Add a tad of voltage to the CPU and IMC (VCCSA).

See if that helps.


----------



## Arizonian

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Give your ram some more voltage Arizonian. Add a tad of voltage to the cpu and IMC(VCCSA).
> See if that helps.


Thanks for the quick reply.

Are we talking a small 0.05v bump?


----------



## MrTOOSHORT

Yep, what's your ram's voltage set at atm?


----------



## Arizonian

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Yep, what's your ram's voltage set at atm?


CPU voltage is = 1.130v
DDR = 1.650v

Found VCCSA = 0.092500


----------



## MrTOOSHORT

RAM is at 2400 MHz?

Try 1.67v and see what happens with Crysis. VCCSA is the IMC voltage for SB-E; not sure if it's the same with Ivy. But if you have it on auto, it might be fine where it is.


----------



## Arizonian

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> RAM is at 2400 MHz?
> Try 1.67v and see what happens with Crysis. VCCSA is the IMC voltage for SB-E; not sure if it's the same with Ivy. But if you have it on auto, it might be fine where it is.


I'll give it a shot. Something is amiss. It's probably pretty minor. Will be doing further testing. Thanks for the input. +1 rep.


----------



## MrTOOSHORT

I believe it's a bad set of settings for your CPU and memory. Sometimes auto doesn't apply the correct values needed, even for stock.

Run HyperPi at the 32M setting and see if it passes. It will only take about 11 minutes and tests your CPU and memory for stability. I use it to see if my memory OC is stable.

http://files.extremeoverclocking.com/file.php?f=211


----------



## Arizonian

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> I believe it's a bad set of settings for your CPU and memory. Sometimes auto doesn't apply the correct values needed, even for stock.
> Run HyperPi at the 32M setting and see if it passes. It will only take about 11 minutes and tests your CPU and memory for stability. I use it to see if my memory OC is stable.
> http://files.extremeoverclocking.com/file.php?f=211


I've never used this program but gave it a try. I didn't see any error messages after it was done.


----------



## MrTOOSHORT

^^^^

Looks pretty good. I think your ram is fine.

I am wondering if it's just a driver issue. Even a BIOS update for the mobo can fix problems. Since everything is new tech, we might need to wait for updates. Who knows.

Hope you get it sorted out.


----------



## Arizonian

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> ^^^^
> Looks pretty good. I think your ram is fine.
> I am wondering if it's just a driver issue. Even a bios update for the mobo can fix problems. Since everything is new tech, might need to wait for updates. Who knows.
> Hope you get it sorted out.


Bumped the VCCSA & DRAM voltage just a touch. Disabled C1 & stepping. Running a constant 3.9 GHz CPU OC rather than a fluctuating one.

With my 1175 MHz & 1201 MHz GPU Boost OC I seem to be stable. That might have been the trick. It's only a 12% OC on the CPU, but I'm taking it slowly, a step at a time.

The GPU being stable, stock or with an OC, is more important to me. Fingers crossed. Well-earned rep.

Going to test the hell out of this before I move forward.

Edited to add: didn't work. Moved on to BF3 Ultra mode with a slight OC, except I turned off motion blur. Started playing and got a black-screen hard lock-up. Went back to stock and have artifacts. BF3 turned all the way up is more demanding than Crysis.

So I'm calling Newegg and hopefully doing an exchange. I sure hope they can replace it. I didn't pay $1000 not to be able to game; I can stream Netflix with the integrated Intel HD 4000 iGPU.


----------



## V3teran

Quote:


> Originally Posted by *Stateless*
> 
> Something is up with your setup because my quad is running fine. Through Unigine and 3dmark11 all 4 gpu's are running very close and in some cases at the same exact speed. I have no issues with the desktop, surfing the web or anything else remotely like you are experiencing. It sounds like something is definatly wrong somewhere in your set up that is causing this issue. I am sure that driver maturity can and will improve things, but the issues you have can in no way be driver related as I am running those same drivers.


Yeah, maybe, mate. The card is still going back, as it's not needed for any game whatsoever at my resolution. I'm waiting for them to pick it up today.
Quote:


> Originally Posted by *Arizonian*
> 
> Bumped the VCCSA & DRAM voltage just a touch. Disabled C1 & stepping. Running a constant 3.9 GHz CPU OC rather than a fluctuating one.
> With my 1175 MHz & 1201 MHz GPU Boost OC I seem to be stable. That might have been the trick. It's only a 12% OC on the CPU, but I'm taking it slowly, a step at a time.
> The GPU being stable, stock or with an OC, is more important to me. Fingers crossed. Well-earned rep.
> 
> Going to test the hell out of this before I move forward.
> Edited to add: didn't work. Moved on to BF3 Ultra mode with a slight OC, except I turned off motion blur. Started playing and got a black-screen hard lock-up. Went back to stock and have artifacts. BF3 turned all the way up is more demanding than Crysis.
> So I'm calling Newegg and hopefully doing an exchange. I sure hope they can replace it. I didn't pay $1000 not to be able to game; I can stream Netflix with the integrated Intel HD 4000 iGPU.


Are you running at stock settings with no OC? You need to test using Memtest, mate, not that program you're using; it won't test for stability like Memtest does. Also, your RAM running at auto should be good enough. It's possible it could also be your motherboard... one step at a time. Maybe reinstalling Windows can also help.


----------



## Arizonian

@V3teran - I haven't tried running the RAM on auto, because when I did it showed 1300 MHz when it's actually 2400 MHz. Will try that first.

Yes, everything was at stock when I had the artifacts last night. I was getting a wash of purple colors and some lines in various areas. I could even put my 680 in to test whether it's the card; I should try that too. I'll wait on the return until I've exhausted all options.

I didn't think RAM would cause purple colors to wash into the graphics being rendered, so I guess I'm assuming it must be the card. Thanks for your input.


----------



## V3teran

Quote:


> Originally Posted by *Arizonian*
> 
> @V3teran - I haven't tried running the RAM on auto, because when I did it showed 1300 MHz when it's actually 2400 MHz. Will try that first.
> Yes, everything was at stock when I had the artifacts last night. I was getting a wash of purple colors and some lines in various areas. I could even put my 680 in to test whether it's the card; I should try that too. I'll wait on the return until I've exhausted all options.
> I didn't think RAM would cause purple colors to wash into the graphics being rendered, so I guess I'm assuming it must be the card. Thanks for your input.


Have you tried using the 301.40 driver?


----------



## Cheesemaster

Best run yet! I ordered Team Extreme [email protected] 9 RAM; it should improve things a lot!


----------



## Arizonian

Quote:


> Originally Posted by *V3teran*
> 
> Have you tried using the 301.40 driver?


That's what I'm on.

@Cheesemaster - impressive run.


----------



## Cheesemaster

Quote:


> Originally Posted by *Arizonian*
> 
> That's what I'm on.
> @Cheesemaster - impressive run.


Still playing around a little bit, just waiting for my faster RAM.. I should be closer to a 27000P score.


----------



## Arizonian

Well, things went from bad to worse. So I'm at fully stock CPU / GPU, with RAM on Auto.

Getting artifacts on the Windows desktop now. I haven't been able to try anything in games until tonight, but this really points to the GPU at this point.

It looks like the card is failing on me. Will do some further testing tonight.

I also had a driver failure with a 100 MHz core/memory overclock while watching a Netflix stream.


----------



## V3teran

Just send the card back. The sooner you do it, the sooner you can get back to normal. Why waste any more of your time? All options have been tried; send it back and be done with it.


----------



## bnj2

Koolance finally has a WB for 690.
Too bad it looks like crap. I would have expected an LED-lit logo or something; after all, one of the reasons we paid that much for our cards was the awesome looks.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Arizonian*
> 
> Well things went from bad to worse. So I'm at fully stock CPU / GPU & RAM on Auto.
> Getting artifacts on windows desktop now. I've not been able to try anything in games until tonight but this really points to GPU at this point.
> Card is failing on me it looks like. Will try again tonight some further testing.
> Had a driver failure with 100 MHz core / memory over clock watching a Netflix stream.


That sucks man.









I hope you get a replacement asap.


----------



## ceteris

Quote:


> Originally Posted by *bnj2*
> 
> Koolance finally has a WB for 690.
> Too bad it looks like crap. I would have expected an LED-lit logo or something; after all, one of the reasons we paid that much for our cards was the awesome looks.


The design looks like all the other recently released blocks. At least it's not made of cheaper plastic with some big sticker on it. I'm going to try holding out for a Watercool Heatkiller unless the EK one happens to look spectacular.


----------



## sakekitsune

I'm on a fixed income; I just want one for the $999 price........ not $1500 BS.


----------



## Kyouki

Quote:


> Originally Posted by *bnj2*
> 
> Koolance finally has a WB for 690.
> Too bad it looks like crap. I would have expected an LED-lit logo or something; after all, one of the reasons we paid that much for our cards was the awesome looks.


Not bad, but it would be nice to see a company design one that somehow keeps the look of the stock shroud and LED logo. But sometimes you have to lose aesthetics for performance.


----------



## Arizonian

Hmmm, I may have figured out my problem: a BitFenix cable that looks like it could have been faulty.



Not sure, but the wires seemed connected; it was only the white sleeving that was torn. Perhaps it was being pushed away from the connectors when I put my back side panel on, where I'm hiding the cables. It would explain why I'm crashing.

Changed the wire to a spare black one I have.

So I loaded up BF3 at Ultra settings with motion blur turned off, and it hit *1174 MHz core* on GPU #1 & *1200 MHz core* on GPU #2 with a 1602 MHz memory overclock. The CPU is currently stock with RAM on AUTO instead of a forced 2400 MHz. On AUTO the RAM shows 1300 MHz.

45 minutes of continuous play with no issues or artifacting, and I was rushing around heading toward explosions to really push the graphics.









Then, while browsing the internet, I got a 'drivers stopped responding' message, so I removed the OC on the GPU memory.

_I'm going to do further testing. Though I did get an RMA # from Newegg, I'm now thinking that was my issue. Thanks to the club members for throwing ideas at me. It's always nice to be helped, and it's the simple things that get overlooked that another pair of eyes can spot._









Off to Crysis 2 to push some more gaming tests.

Edited to add, since I'm still the last post:

I've overclocked the *CPU to 4.0 GHz*, with C1 & stepping disabled. RAM is on AUTO instead of XMP and I did not force 2400 MHz on it. I've got a constant 14% overclock on the CPU and a 14% overclock on the GPU, stable. No GPU memory overclock; I decided to leave it at the stock 1502 MHz. Everything seems to be stable.

I've now tested about 2 hours 30 minutes of the BF3 campaign with maxed-out Ultra graphics, and about 40 minutes of the Crysis 2 campaign at Ultra settings as well, both with motion blur turned off.

I think this whole time it was the multi-sleeved PCIe power cable extension from BitFenix that had a bad connection and was causing power loss. It wasn't the BitFenix cable itself that was bad, but possibly the pressure inside the tower case that rubbed up against it and dislodged or loosened the one wire from a solid connection.

My card has been working perfectly, and I couldn't be happier now that I'm almost positive I've got the problem fixed. Newegg had said that if they couldn't replace the card they would give me a refund, and I do not want a refund; I want this beautiful GTX 690 sitting in my rig.









Getting great graphics and FPS, as expected from the GTX 690.

A 260 MHz & 285 MHz overclock on the Core Boost.

Here is what the GPU-Z screenshot of both GPUs looked like when I shut down BF3. The card stayed at 72C & 71C highest temps. Sweet for a dual GPU.

*On a side note*: *the new 301.42 drivers FIXED the power-down bug!* I love Nvidia's drivers. It didn't take them long.


----------



## Shadowness

Quote:


> Originally Posted by *Kyouki*
> 
> Not bad but it would be nice to see a company design it to keep the look of the stock shroud and LED logo somehow but sometime you have to lose aesthetics for performance.


Can only hope the EVGA waterblock comes soon: black, sleek, full cover. I hope I'm not going to be disappointed; I so want to put this card under water if I get it.


----------



## zkalra

So I have an Alienware Aurora ALX chassis (because Area 51s aren't available where I live) and have two GTX 690s installed in quad SLI with a 990X extreme overclocked (but watercooled) to 4.5 GHz. Thing is, the darn chassis is too small, so I have to have them in adjacent slots, which means little gap (if any) between the cards. The GPU1 and GPU2 temps are awesome (maxing out even Unigine for over 15 mins at 75-78), but the GPU3 and GPU4 temps are insane, maxing out at 98!!!! All this with the darn side panel removed (with it in place they were reaching 100 and probably more, as MSI Afterburner doesn't record beyond 100). Point is, there is no space for watercooling these beasts. And there are plenty of people with quad SLI 680s running their rigs on air cooling with little to no space between their cards, so how come my cards reach such insane temps?

THEY ARE NOT OVERCLOCKED. Is this normal???? I would love to be able to shut the chassis and reduce sound levels, but at this rate even BF3 after around 30 minutes makes 3 and 4 reach 100 degrees!!!! Is there something wrong with my cards???


----------



## bnj2

Switch the cards and see what temperatures you're getting.


----------



## ceteris

I would try each card individually to see if it's the same. Also, reinstall your drivers after wiping all traces of the previous ones, if you haven't done a clean install already.


----------



## V3teran

You're gonna need a bigger boat!


----------



## zkalra

Quote:


> Originally Posted by *ceteris*
> 
> I would try each card individually to see if its the same. Also reinstall your drivers after wiping all traces of previous ones if you hadn't done a clean install already.


Thanks. Will try.


----------



## Stein357

There's not enough space to get sufficient airflow. As many told you in the thread you created, get a better case so you can space the cards apart.


----------



## Cheesemaster

So here is the final bench post for a while. I am at the wall, but here are the final numbers my rig will permit at this time.


----------



## Arizonian

Quote:


> Originally Posted by *Cheesemaster*
> 
> So here is the final bench post for a while. I am at the wall, but here are the final numbers my rig will permit at this time.


Nice score Cheesemaster







Save those babies. You never know, a dual-GPU bench-off may not be far off.


----------



## Cheesemaster

Quote:


> Originally Posted by *Arizonian*
> 
> Nice score Cheesemaster
> 
> 
> 
> 
> 
> 
> 
> Save those babies. Never know if a dual GPU benchoff is far off.


This was done with quality settings enabled along with FXAA turned on. I could maybe have gotten a couple hundred more by tweaking to performance settings in the Nvidia control panel, but I feel these cards represent not only raw power but quality, so I used HQ settings.


----------



## Cheesemaster

!Random post! Does anyone agree with me that these cards are FREAKING AWESOME! OMG!!!


----------



## Arizonian

Quote:


> Originally Posted by *Cheesemaster*
> 
> !Random post! Does anyone agree with me that these cards are FREAKING AWESOME! OMG!!!


I agree I love the GTX 690 too.









First off, the build quality is amazing; the best GPU of any kind to date. Built from the ground up with solid engineering and components.

Plated aluminum frame & magnesium fan housing. Top it off with polycarbonate windows to view inside and a laser-etched, LED-lit name on the side.

Built on a thick, quality PCB. Five-phase PWM per GPU (10 total) instead of the four-phase of a reference card. Dual vapor-chamber cooling, which is very efficient, and quiet ducted airflow channels with an axial fan.

Frame metering to reduce micro-stutter. The best-performing dual GPU ever, with improved scaling almost as good as two single cards in SLI.

Everything you guys in the club already knew, but it was worth repeating following Cheesemaster's post.









It fits my needs perfectly for my single 120 Hz monitor for 2D & 3D Vision, only taking up one PCIe 3.0 x16 slot at optimal bandwidth. Temperatures are phenomenal at 30C-31C idle and 71C-74C at full throttle while highly overclocked. Lower power consumption than two in SLI.

The cherry on top for me is aesthetics, which is very important in my rig (see pics of Ivy Cruncher); with an EVGA backplate being released real soon, I can't wait to finish my system.

I'm relieved to know it's backed by EVGA's outstanding 3-year RMA customer service on top, which makes me sleep much better at night knowing my investment is covered.

P.S. - I know I could have gotten two reference GPUs for a bit more performance, but for the reasons stated above, which you can't get with two in SLI, it was more than worth it to me.


----------



## ShodanMarcus

Quote:


> Originally Posted by *Cheesemaster*
> 
> So here is the final bench post for a awhile.. I am at the wall but here are the final numbers that my rig will permit at this time.


Oh
My
God


----------



## Kyouki

So right now I have a really mild overclock =)!!! Just playing around. But today I was playing a few games with and without V-sync, and also tried the adaptive mode. I currently only have a 60 Hz Samsung. What do you guys recommend for the V-sync setting: on, off, adaptive, or let the program choose? This card easily maxes out my 60 FPS, so just wondering.
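For what it's worth, the adaptive option's decision rule is simple to sketch. This is illustrative Python showing the behavior as NVIDIA describes it, not actual driver code:

```python
# Per-frame V-sync decision. "on" trades stutter below the refresh rate for
# no tearing; "off" trades tearing for no stutter; "adaptive" switches.

REFRESH_HZ = 60  # e.g. a 60 Hz Samsung panel

def vsync_enabled(current_fps, mode="adaptive"):
    """Return whether this frame should be synced to the display refresh."""
    if mode == "on":
        return True           # always sync: tear-free, but can halve low fps
    if mode == "off":
        return False          # never sync: no stutter, tearing at high fps
    # adaptive: sync only while the GPU keeps up with the refresh rate
    return current_fps >= REFRESH_HZ

print(vsync_enabled(90))   # syncing: card is above 60 fps, kill the tearing
print(vsync_enabled(45))   # not syncing: avoid the forced drop to 30 fps
```

Since a 690 pins 60 FPS almost everywhere, adaptive behaves like plain V-sync most of the time and only kicks off in the rare dips, so it's a low-risk default on a 60 Hz panel.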


----------



## tonyjones

Hopefully soon, when my damn tax refund comes, I will get TWO GTX 690s! Are there any GTX 690 quad SLI benchmarks yet?


----------



## Arizonian

Quote:


> Originally Posted by *tonyjones*
> 
> Hopefully soon, when my damn tax refund comes, I will get TWO GTX 690s! Are there any GTX 690 quad SLI benchmarks yet?


Cheesemaster - one page back here.


----------



## Kyouki

Quote:


> Originally Posted by *tonyjones*
> 
> Hopefully soon, when my damn tax refund comes, I will get TWO GTX 690s! Are there any GTX 690 quad SLI benchmarks yet?


Yeah, Cheesemaster's post just a few posts above ours shows his 3DMark11 score with dual GTX 690s in quad SLI! I also have plans to pick up another one. But first I'm trying to decide between going with three 24in 120 Hz in Surround or one 27in 120 Hz. Any ideas, guys? Also, for V-sync, what setting do you prefer?


----------



## bnj2

Quote:


> Originally Posted by *Kyouki*
> 
> But first I'm trying to decide between going with 3 24in 120hz in surround or one 27in 120hz. Any ideas guys?


2x BenQ XL2420T + 1x BenQ XL2420TX + a nice triple monitor stand; that's what I just got.


----------



## Cheesemaster

Quote:


> Originally Posted by *Kyouki*
> 
> Yeah, Cheesemaster's post just a few posts above ours shows his 3DMark11 score with dual GTX 690s in quad SLI! I also have plans to pick up another one. But first I'm trying to decide between going with three 24in 120 Hz in Surround or one 27in 120 Hz. Any ideas, guys? Also, for V-sync, what setting do you prefer?


3 Acer 235HZs. They are awesome, and they have really addressed any issues with them. I have Nvidia 3D Vision 2, and at full depth you have to really strain and go looking for ghosting. They are a steal of a bargain and really nice quality. I have bought five of them in the last year. The new models are so much better (the older ones pushed a lot of green). Really the best all-around; honestly I don't know how you can do much better, IMO...


----------



## Cheesemaster

By the way, none of my runs are suicide runs. I can run Prime95 and the Unigine benchmark simultaneously, which I just did; actually, I'm running Prime and Unigine as I am typing this right now, to rest assured she is rock stable. God, I love all of you guys; you are such an inspiration to a noob like myself.


----------



## zkalra

My benchies for GTX 690 quad SLI on the Extreme preset of 3DMark 11!!!!


----------



## Cheesemaster

Quote:


> Originally Posted by *zkalra*
> 
> My Benchies on GTX 690 Quad Sli on the Extreme Preset of 3DMark 11!!!!






AWESOME! I feel we all need stronger CPUs. I either need a faster clock or an 8-core @ 6.0 GHz to let these suckers run free!


----------



## Arizonian

Impressive Extreme 3DMark11 bench, Zkalra.









I see you're somewhat new. Welcome to OCN and welcome to the GTX 690 Owners Club.


----------



## Sir_Gawain

Quote:


> Originally Posted by *zkalra*
> 
> 
> So I have an Alienware Aurora ALX chassis (cuz Area 51s aren't available where I live) and have two GTX 690s installed in quad SLI with a 990X Extreme overclocked (but watercooled) to 4.5GHz. Thing is, the darn chassis is too small, so I have to have them in adjacent slots, which means little gap (if any) between the cards. So the GPU1 and GPU2 temps are awesome (maxing out even Unigine for over 15 mins at 75-78) but GPU3 and GPU4 temps are insane, maxing out at 98!!!! All this with the darn side panel removed (with it in place they were reaching 100 and probably more, as MSI Afterburner doesn't record beyond 100). Point is, there is no space for watercooling these beasts. And there are plenty of people with quad SLI 680s running their rigs with air cooling and little to no space between their cards, so how come my cards reach such insane temps?
> THEY ARE NOT OVERCLOCKED. Is this normal???? I would love to be able to shut the chassis and hence reduce sound levels, but at this rate even BF3 after around 30 minutes makes 3 and 4 reach 100 degrees!!!! Is there something wrong with my card???


Good luck with that man. The Aurora case is just too small. I have had 1 GTX 590, 2x 7970s, and 2x GTX 680s in my case.....The 7970s and the 680s worked fine, however I had to keep the card fans running high, as well as the PCIe fan, making my system very loud. I now have just one 680 lol.

You have two things going against you in that case: first is the obvious closeness between the cards... second, the fan design of the 690 is dumping hot air back into your case, and it's being sucked back into the GPUs.
I have made a lot of changes to my case for better airflow as well (one of the most beneficial being a cable reroute so the bottom GPU can breathe easier), but nothing helped the temps on the GTX 590 when I had it.

I was set to watercool my GPUs and do some fancy work to the case, but at the end of the day it just wasn't worth it for me. The cost/work it would require is silly compared to just building a new rig that is far better suited to the thermal requirements.


----------



## Kyouki

Quote:


> Originally Posted by *bnj2*
> 
> 2x Benq XL2420T + 1x Benq XL2420TX + a nice triple monitor stand, that's what I just got


Quote:


> Originally Posted by *Cheesemaster*
> 
> 3 Acer GD235HZs — they are awesome, and Acer has really addressed any issues with them. I have NVIDIA 3D Vision 2, and at full depth you have to really strain and go looking for ghosting.. they are a steal of a bargain and really nice quality. I have bought five of them in the last year. The new models are so much better (the older ones pushed a lot of green). Really the best all around; I don't know how you could do much better, IMO...


Thank you, I have looked at both these options. Also, reading up on the BenQs, I heard to just get 3 of the XL2420T and buy the NVIDIA 3D kit separately, because the stock sensor was having issues in the BenQ. But 3D isn't really the selling point for me; it will be a nice addition though. So as it stands, it is 3x 24in at 2 votes and 1x 27in at 0! hahaha


----------



## Sir_Gawain

Quote:


> Originally Posted by *Kyouki*
> 
> Thank you, I have looked at both these options. Also, reading up on the BenQs, I heard to just get 3 of the XL2420T and buy the NVIDIA 3D kit separately, because the stock sensor was having issues in the BenQ. But 3D isn't really the selling point for me; it will be a nice addition though. So as it stands, it is 3x 24in at 2 votes and 1x 27in at 0! hahaha


My vote would be to stay away from the 2420T; that thing burned my retinas at any setting, and the colors looked so drab IMHO. I sold it three weeks after buying it.

I am picky, I suppose; I was spoiled by the RGB LED screen on my M17x DTR... that thing was amazing. My current monitor is the closest I have had to that RGB LED.


----------



## Cheesemaster

Well, I am back. For those that might want to know, I'll let some numbers speak for themselves.. This is with the new drivers from NVIDIA: they allowed a 160MHz core offset as opposed to 140MHz (the old drivers would crash) in 3DMark11, with a score increase of about 250 points.. There ya go, for those who might have been wondering about the new drivers...



P.S. Not all results will be the same.


----------



## zkalra

Quote:


> Originally Posted by *Arizonian*
> 
> Impressive Xtreme 3DMark11 bench Zkalra.
> 
> 
> 
> 
> 
> 
> 
> 
> I see you're somewhat new. Welcome to OCN and welcome to the GTX 690 Owners Club.


Thanks and good to be here.


----------



## zkalra

Quote:


> Originally Posted by *Sir_Gawain*
> 
> Good luck with that man. The Aurora case is just too small. I have had 1 GTX 590, 2x 7970s, and 2x GTX 680s in my case.....The 7970s and the 680s worked fine, however I had to keep the card fans running high, as well as the PCIe fan, making my system very loud. I now have just one 680 lol.
> You have two things going against you in that case: first is the obvious closeness between the cards... second, the fan design of the 690 is dumping hot air back into your case, and it's being sucked back into the GPUs.
> I have made a lot of changes to my case for better airflow as well (one of the most beneficial being a cable reroute so the bottom GPU can breathe easier), but nothing helped the temps on the GTX 590 when I had it.
> I was set to watercool my GPUs and do some fancy work to the case, but at the end of the day it just wasn't worth it for me. The cost/work it would require is silly compared to just building a new rig that is far better suited to the thermal requirements.


Thanks. I am changing the case tomorrow. Nothing beats a bit of space between the GPUs.


----------



## zkalra

Quote:


> Originally Posted by *Cheesemaster*
> 
> Well, I am back. For those that might want to know, I'll let some numbers speak for themselves.. This is with the new drivers from NVIDIA: they allowed a 160MHz core offset as opposed to 140MHz (the old drivers would crash) in 3DMark11, with a score increase of about 250 points.. There ya go, for those who might have been wondering about the new drivers...
> 
> P.S. Not all results will be the same.


Phenomenal


----------



## Stateless

Quote:


> Originally Posted by *Cheesemaster*
> 
> Well, I am back. For those that might want to know, I'll let some numbers speak for themselves.. This is with the new drivers from NVIDIA: they allowed a 160MHz core offset as opposed to 140MHz (the old drivers would crash) in 3DMark11, with a score increase of about 250 points.. There ya go, for those who might have been wondering about the new drivers...
> 
> P.S. Not all results will be the same.


Very nice man. I have been so busy lately, but I have some vacation time coming up where I can do some work on my rig. Last time I did a 3DMark11 P score I was around the 25k mark, but I recently did some work on my CPU OC. Need to DL the new drivers and see what that does for me.


----------



## Callandor

Quote:


> Originally Posted by *Stateless*
> 
> Yeah, my cards pretty much max at +120 offset and +250 on the memory. Anything beyond that and it crashes... it's after about 5-6 runs of Unigine that it crashes when set higher. It is however a rock solid overclock, as I have done Unigine, 3DMark11, Metro 2033, The Witcher 2, Crysis 2 and Diablo 3 on it with no issues at all... temps stay around the high 60s depending on the game, with the occasional GPU going into the 70s C for a brief moment.


So I just finished my build with the GTX 690 and jumped right into some Diablo 3, everything stock.

I was initially seeing temps on both GPUs of 71-72C, which seems high to me, but the fan seemed to only be set to around 45-50%. I am running the game with all high settings at 2560x1440, but this is still not a graphically intensive game.

I tried manually setting the fan higher but it didn't seem to take. I did turn on software control of the fan, and when I went back into the game the temps got into the mid 60s, but the fan was also running at ~60%.

Does this seem normal? I just feel that if I am hitting 70 degrees at stock in a game that is not graphically intensive, then I am facing paltry results when I start trying to overclock.


----------



## V3teran

These temps sound about right for the resolution you're playing at and the fan speed you're using.
I don't use the dynamic fan; I use 2 settings, 30% for idling around the internet and 65% for gaming, bound to the End and Home keys on the keyboard. Try that and see what you think.
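The two-setpoint approach above (a fixed low fan for desktop use, a fixed high fan for gaming) generalizes to a fan curve. Here's a minimal sketch, purely illustrative: the temperature/duty points are hypothetical, and nothing here touches a real fan-control API (on this card you'd set fan speed through a tool like EVGA Precision or MSI Afterburner).

```python
# Illustrative fan curve with linear interpolation between setpoints.
# The points are hypothetical, loosely based on the 30%/65% profile
# described above; no real fan-control API is invoked.

CURVE = [(30, 30), (60, 45), (70, 65), (85, 100)]  # (temp C, fan %)

def fan_for_temp(temp_c: float) -> float:
    """Linearly interpolate a fan duty cycle for a GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]                    # floor below the first point
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            frac = (temp_c - t0) / (t1 - t0)  # position between setpoints
            return f0 + frac * (f1 - f0)
    return CURVE[-1][1]                       # ceiling above the last point

print(fan_for_temp(25))  # 30 (floor)
print(fan_for_temp(65))  # 55.0 (halfway between the 45% and 65% points)
print(fan_for_temp(90))  # 100 (ceiling)
```

A curve like this is basically what the "dynamic fan" in the driver does; the two-hotkey profile trades that smoothness for predictable noise levels.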


----------



## Callandor

Quote:


> Originally Posted by *V3teran*
> 
> These temps sound about right for the resolution you're playing at and the fan speed you're using.
> I don't use the dynamic fan; I use 2 settings, 30% for idling around the internet and 65% for gaming, bound to the End and Home keys on the keyboard. Try that and see what you think.


Thanks for the tip, I will try this when I get home.

Even at that resolution, it just seems surprising the temps get that high; I wouldn't think that game would be that demanding.


----------



## Arizonian

@Callandor - your temps are normal, like Vet said, at your resolution.

Mine, heavily overclocked at 1920x1080 120Hz, hits the same temps, 71-72C with a heavy OC, and idles at 30-31C.

Welcome aboard OCN and the 690 club.









Your setup is very close to mine.


----------



## bnj2

Are you guys having higher temperatures with 301.42?


----------



## V3teran

Quote:


> Originally Posted by *bnj2*
> 
> Are you guys having higher temperatures with 301.42?


I found with 301.42 that GPU 1 was downclocking to around 300MHz while GPU 2 was doing all the work; I've gone back to 301.40.


----------



## burningrave101

ASUS GTX 690 currently available from TigerDirect:

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=2652128&CatId=7387


----------



## Stateless

Has anyone gone to water cooling on these yet? I know Koolance has waterblocks available, and over at EVGA in the main 690 thread, Jacob from EVGA posted a picture of the Hydro Copper 690 and stated blocks will be available separately.

I have never done water anything, other than using an H100 on my CPU, and I am taking a few weeks off of work to burn some vacation time and am really having an itch to go to water, but I've never done it before. Is Koolance a good brand? It seems they are the only ones that have blocks right now, and I just need some input. I will say upfront I am a total noob at water, so please have patience with me and my questions.

My thought is to go full water for the CPU and GPUs. I am looking at a bunch of stuff right now and think I have the things I would need. If you are running water, would one 360mm radiator (3x 120mm fans) be enough to water cool both the CPU and 2x 690s?

The only thing that has me scared is tearing apart the lovely 690s and adding the blocks to them, because if I f'up, I am out a lot of money... so any advice on how hard/easy it is to add a waterblock to a GPU would also be appreciated.


----------



## error-id10t

Considering the first throttling point is only at 70 degrees, I would personally put one in. After doing that to my 580 (and the CPU), I would recommend it every time; of course, with the 500 series you can raise the volts, while you're limited in what you can do with this card..


----------



## bnj2

Ok, I was getting insane temperatures (80-85C) in Diablo 3 after upgrading to 301.42, so I decided to play with the NVIDIA Control Panel settings and ended up doing this

Now I'm getting around 45C on both cores with 20-25% usage.
I have no idea what I have done, as I haven't slept for 48 hours, but I thought I would share this, and maybe someone will figure it out before I wake up









LE: neeeevermind that, I am stupid and tired... I found what I did: I enabled V-sync in the NVIDIA Control Panel.


----------



## Cheesemaster

Yeah, I had to play with my CPU; I really had to do some tweaking, but I got it to 4.9GHz.. (much respect to those that can get it to 5.0GHz) I squeezed about 150 more points out....


----------



## Cheesemaster

I'm really trying to tweak a higher combined score; I feel that it is the hardest score to dial in. I feel it represents that all of the components are synchronized.. Please correct my logic if this is unsound thinking..


----------



## Cheesemaster

Quote:


> Originally Posted by *Cheesemaster*
> 
> I'm really trying to tweak a higher combined score; I feel that it is the hardest score to dial in. I feel it represents that all of the components are synchronized.. Please correct my logic if this is unsound thinking..


Just for the cheese sticks in all of us, here is an X score


----------



## Stateless

Hey guys...just throwing this out there because I am not sure what I want to do.

I was able to borrow a 30" monitor from a friend to try out my 2x 690s on, and while incredible looking, I just can't go from my living room/sofa and 55" 3D HDTV to a single 30" monitor sitting at a desk.

With that, I am thinking of selling one of my 690s to someone here, as I know many of you have not got one yet or are looking for a 2nd card. While I am not 100% decided on doing this, I am seriously leaning that way, and I would rather sell it to someone here versus eBay... I am not looking to make any money beyond what it cost me to buy the card.

Just seeing if there is interest in someone purchasing.

Thanks!


----------



## djriful

Quote:


> Originally Posted by *Stateless*
> 
> Hey guys...just throwing this out there because I am not sure what I want to do.
> 
> I was able to borrow a 30" monitor from a friend to try out my 2x 690s on, and while incredible looking, I just can't go from my living room/sofa and 55" 3D HDTV to a single 30" monitor sitting at a desk.
> 
> With that, I am thinking of selling one of my 690s to someone here, as I know many of you have not got one yet or are looking for a 2nd card. While I am not 100% decided on doing this, I am seriously leaning that way, and I would rather sell it to someone here versus eBay... I am not looking to make any money beyond what it cost me to buy the card.
> 
> Just seeing if there is interest in someone purchasing.
> 
> Thanks!


Monitor or TV size does not matter if both resolutions are 1080p. Those 2x 690s are overkill at 1080p anyway, even at 120Hz.

What you really need to look at is a surround monitor setup or a resolution beyond 1600p.


----------



## Stateless

Quote:


> Originally Posted by *djriful*
> 
> Monitor or TV size does not matter if both resolutions are 1080p. Those 2x 690s are overkill at 1080p anyway, even at 120Hz.
> 
> What you really need to look at is a surround monitor setup or a resolution beyond 1600p.


Perhaps I should have been a bit clearer... the monitor I borrowed was 2560x1600, and it did look amazing... but my main point is that I just could not get used to gaming at a desk versus from my big 7.1 audio system, 55" HDTV and comfy sofa. If it were not for that, I would be all over a multi-monitor setup or a 2560x1600 monitor, but it is not in the cards for me, as I prefer my current gaming setup.


----------



## phantomphenom

I would like to ask everyone who is going to water cool their 690s to post their max benchmark score on stock cooling and their max benchmark score on water. I would like to see everyone's scores and see generally how different everyone's scores are between stock and water cooling!


----------



## V3teran

My GTX 690 is going in this; it will have 2x RX360 triple rads all to itself. I had 2 of these GTX 690s, but they really are not needed atm, so I sent one back for a full refund. My remaining GTX 690 will get 2x triple rads instead of one, which is complete overkill.
http://www.overclock.net/t/1111648/build-log-caselabs-project-overkill


----------



## Callandor

Quote:


> Originally Posted by *Arizonian*
> 
> @Callandor - your temps are normal, like Vet said, at your resolution.
> Mine, heavily overclocked at 1920x1080 120Hz, hits the same temps, 71-72C with a heavy OC, and idles at 30-31C.
> Welcome aboard OCN and the 690 club.
> 
> 
> 
> 
> 
> 
> 
> 
> Your setup is very close to mine.


Thanks, this is my first build in 4 years, coming off a Q6700 and GTX 280 that I never did much with.

Plan on overclocking both the 3770k and 690 as soon as I get some gaming out of my system









The info in this thread is great and I look forward to seeing what everyone is able to wring out of these cards, especially once the waterblocks are available


----------



## Cheesemaster

Quote:


> Originally Posted by *Callandor*
> 
> Thanks, this is my first build in 4 years, coming off a Q6700 and GTX 280 that I never did much with.
> Plan on overclocking both the 3770k and 690 as soon as I get some gaming out of my system
> 
> 
> 
> 
> 
> 
> 
> 
> The info in this thread is great and I look forward to seeing what everyone is able to wring out of these cards, especially once the waterblocks are available


Well, this is my final run. I spent last night trying to squeeze everything out of this thing with no artifacts or such; time to enjoy the setup.. Cheesemaster's final P score.. (for now; lol face!)

P.S. Expect this score or better; I am such a noob.


----------



## Arizonian

Quote:


> Originally Posted by *Cheesemaster*
> 
> Well, this is my final run. I spent last night trying to squeeze everything out of this thing with no artifacts or such; time to enjoy the setup.. Cheesemaster's final P score.. (for now; lol face!)
> 
> P.S. Expect this score or better; I am such a noob.


Very nice score. You topped yourself.









Have you done any gaming yet? Impressions?

I'm simply amazed at the crazy FPS fully maxed graphic settings. I can see myself holding this card for two years. Simply gratifying experience.


----------



## zkalra

So I did some benchmarks on the GTX 690 quad SLI with my new 3930K on the RIVE with Vengeance 1600 RAM. The results are quite outstanding -


----------



## zkalra

Quote:


> Originally Posted by *Cheesemaster*
> 
> Well, this is my final run. I spent last night trying to squeeze everything out of this thing with no artifacts or such; time to enjoy the setup.. Cheesemaster's final P score.. (for now; lol face!)
> 
> P.S. Expect this score or better; I am such a noob.


Fabulous score bro. Truly admirable. Mine is below; suffice to say I have much to hone in my overclocking skills. Despite nearly identical hardware, my score is way lower than yours (3930K @ 4.7, RIVE, quad SLI 690s and Corsair Vengeance 1600MHz). I'm thinking it's the memory. Any suggestions?


----------



## Cheesemaster

Quote:


> Originally Posted by *zkalra*
> 
> Fabulous score bro. Truly admirable. Mine is below; suffice to say I have much to hone in my overclocking skills. Despite nearly identical hardware, my score is way lower than yours (3930K @ 4.7, RIVE, quad SLI 690s and Corsair Vengeance 1600MHz). I'm thinking it's the memory. Any suggestions?


Well, my run was done with 4.9GHz on the CPU, and I am running Extreme Team RAM @ 2400MHz 9-11-11-28-1T. Also, I got my GTXs with a 140MHz offset on the core and 200MHz on the memory... also, lol.. I am running a Corsair Force 3 GT SSD... so right there is the difference. Also, double lol... the 3960X has 15MB cache.. that has an impact on benches.. Suffice to say, you have a wicked system!


----------



## Cheesemaster

Quote:


> Originally Posted by *Arizonian*
> 
> Very nice score. You topped yourself.
> 
> 
> 
> 
> 
> 
> 
> 
> Have you done any gaming yet? Impressions?
> I'm simply amazed at the crazy FPS fully maxed graphic settings. I can see myself holding this card for two years. Simply gratifying experience.


OMG!!!! I am playing BF3 on Ultra, MSAA turned off, FXAA turned on (same if not better looking) @ 5760x1080 on 120Hz monitors, and in multiplayer I am averaging about 90fps.. In the single-player campaign, I rarely see a drop below a hundred.. 3D Surround on Ultra, no prob! Mafia in surround 3D, OMG! All the games, even older ones; it's like new life has been breathed back into them!

P.S. No micro stutter; it's gone now, only high frames; it's smooth as cream.. and this is coming from GTX 580s in 3-way SLI, and they were the 3GB versions.....!


----------



## Arizonian

Quote:


> Originally Posted by *Cheesemaster*
> 
> OMG!!!! I am playing BF3 on Ultra, MSAA turned off, FXAA turned on (same if not better looking) @ 5760x1080 on 120Hz monitors, and in multiplayer I am averaging about 90fps.. In the single-player campaign, I rarely see a drop below a hundred.. 3D Surround on Ultra, no prob! Mafia in surround 3D, OMG! All the games, even older ones; it's like new life has been breathed back into them!
> P.S. No micro stutter; it's gone now, only high frames; it's smooth as cream.. and this is coming from GTX 580s in 3-way SLI, and they were the 3GB versions.....!


That's awesome







I've been having fun running through my favorite game campaigns again. Can't wait for Crysis 3







until BF4









The frame metering seems to be helping this dual GPU with less micro-stutter by syncing them efficiently.


----------



## zkalra

Quote:


> Originally Posted by *Arizonian*
> 
> That's awesome
> 
> 
> 
> 
> 
> 
> 
> I've been having fun running through my favorite game campaigns again. Can't wait for Crysis 3
> 
> 
> 
> 
> 
> 
> 
> until BF4
> 
> 
> 
> 
> 
> 
> 
> 
> The frame metering seems to be helping this dual GPU with less micro-stutter by syncing them efficiently.


True. Though with quad SLI the micro stutter rears its ugly head again.


----------



## m3t4lh34d

Quote:


> Originally Posted by *Cheesemaster*
> 
> Got 25000P score mark11 is that good?


Quote:


> Originally Posted by *blackend*
> 
> yes it is good,i have 4 gtx 680 and the score i got 27000p


I also use 4x 680s, and running 3DMark at such a low resolution in Performance mode will not thoroughly stress all of the GPUs. Run a few benches in Extreme mode. I achieved a score of around 13k in Extreme with my 4 GPUs.


----------



## Cheesemaster

Wow, wasn't expecting these flavors of cheese...

More cheese please, because the cheesiest is me.

P.S. Cheese is my undercover name; you guys can still call me Cheesemaster!


----------



## ceteris

Quote:


> Originally Posted by *Cheesemaster*
> 
> Wow, wasn't expecting these flavors of cheese...
> 
> More cheese please, because the cheesiest is me.
> P.S. Cheese is my undercover name; you guys can still call me Cheesemaster!


Wow grats bro! You going to push with H2O later? Gotta keep your spot


----------



## Cheesemaster

I am looking into water, but I can't find what I need.


----------






## ceteris

Delta fans on H100? You must be deaf, boi!


----------



## Cheesemaster

Quote:


> Originally Posted by *ceteris*
> 
> Delta fans on H100? You must be deaf, boi!


That's where the Corsair SP2500 comes in.


----------



## Psyrical

The price is just horrible, same with all the other 600 series cards.


----------



## Cheesemaster

Price.. Are you kidding? The surround setup is something I only dreamed about as a child; the performance is unparalleled. It is a rarity... Yeah, most people could not afford two GTX 690s, that's two months' rent for most people, but I was so poor growing up that I used to go to the arcades and just watch people play... So this is a real treat, and building it and overclocking it myself makes me feel good. Well, I got to join and learn from this community right here.. Thanks everyone!


----------



## Sujeto 1

Quote:


> Originally Posted by *Cheesemaster*
> 
> Price.. Are you kidding? The surround setup is something I only dreamed about as a child; the performance is unparalleled. It is a rarity... Yeah, most people could not afford two GTX 690s, that's two months' rent for most people, but I was so poor growing up that I used to go to the arcades and just watch people play... So this is a real treat, and building it and overclocking it myself makes me feel good. Well, I got to join and learn from this community right here.. Thanks everyone!


Guess how you're gonna feel when NVIDIA releases their GK110 GeForce: back to the childhood.


----------



## Cheesemaster

I think I am good until Maxwell comes out .... But yeah It just keeps getting better!


----------



## ceteris

Quote:


> Originally Posted by *Sujeto 1*
> 
> Guess how you're gonna feel when NVIDIA releases their GK110 GeForce: back to the childhood.


I already have money in the bank waiting for those babies as well


----------



## Arizonian

Any SLI 670s or 680s, CrossFire 7950s or 7970s, or a dual-GPU 690 or 7990 will have no problem keeping up with games for the next two years, and can easily sit out the next series of releases, speculated to be out in December.

I hardly think anyone will feel bad owning any of these cards. Especially Cheese here, with a killer quad setup.









If you're waiting for the next big thing to come out, you'll be waiting indefinitely.


----------



## Sujeto 1

Quote:


> Originally Posted by *Arizonian*
> 
> Any SLI 670s or 680s, CrossFire 7950s or 7970s, or a dual-GPU 690 or 7990 will have no problem keeping up with games for the next two years, and can easily sit out the next series of releases, speculated to be out in December.
> I hardly think anyone will feel bad owning any of these cards. Especially Cheese here, with a killer quad setup.
> 
> 
> 
> 
> 
> 
> 
> 
> If you're waiting for the next big thing to come out, you'll be waiting indefinitely.


That's correct; however, IMO, there are truly steps in technology that are worth upgrading for, and some which are not. For example, all the Fermis, including the GTX 580 and 590, were really bad cards: noisy, expensive, hot, and with poor performance in a lot of games. That's why I skipped the whole Fermi era. Kepler is a lot better, but expensive; next year when they drop in price they will become a really good option, not for now. And the GTX 690 will be so hard to find in the future, just like today's GTX 590, which can barely be seen at $1500 or so from resellers. These are cards not meant to be produced in mass, just to increase somebody's ego.


----------



## Arizonian

First of all, we're not talking Fermi in the 690 Owners Club thread; Fermi has nothing to do with it. We are talking Kepler to the next series, as your post suggested.

That's truly your opinion, to wait; wait as long as you like.

Owning two single GPUs or one dual GPU right now on OCN has nothing to do with ego. So advising 690 owners in a club thread that they were unwise to purchase a 690 is pointless. They are enjoying them already.


----------



## ceteris

Quote:


> Originally Posted by *Sujeto 1*
> 
> That's correct; however, IMO, there are truly steps in technology that are worth upgrading for, and some which are not. For example, all the Fermis, including the GTX 580 and 590, were really bad cards: noisy, expensive, hot, and with poor performance in a lot of games. That's why I skipped the whole Fermi era. Kepler is a lot better, but expensive; next year when they drop in price they will become a really good option, not for now. And the GTX 690 will be so hard to find in the future, just like today's GTX 590, which can barely be seen at $1500 or so from resellers. These are cards not meant to be produced in mass, just to increase somebody's ego.


Why is there always some guy with a 9-year-old system posting in threads like these, trying to rain on the parade or lecture us about what we should be doing with our money?


----------



## Sujeto 1

Quote:


> Originally Posted by *Arizonian*
> 
> First of all, we're not talking Fermi in the 690 Owners Club thread; Fermi has nothing to do with it. We are talking Kepler to the next series, as your post suggested.
> That's truly your opinion, to wait; wait as long as you like.
> Owning two single GPUs or one dual GPU right now on OCN has nothing to do with ego. So advising 690 owners in a club thread that they were unwise to purchase a 690 is pointless. They are enjoying them already.


What? I'm not saying it's unwise to get a GTX 690; I'm saying it's a great card. Why do you do this? Misunderstanding my words on purpose... well, what a pointless forum you have here; you're not allowed to say anything except stuff that increases purchases of a particular item. I'm not advising people to do anything; that's your job. People can do what they want. And OF COURSE I can wait as long as I want; you CAN'T tell me what I must do, Mr. Arizonian.

PS: how much do "they" pay you?


----------



## Qu1ckset

$1500 for a GTX 590? LOL, I can grab that card for $500, less once the 600-series cards become more available.
I sold my HD 6990 once I saw the benchmarks of the GTX 690; this thing's a beast, the scaling is almost perfect. I should have my funds together to pick this card up once eBay releases my funds from my 6990...
Not sure if I'm going to SLI two of these or wait till Maxwell; I'm pretty sure one of these beasts can max out 2560x1600 res no problem!


----------



## Arizonian

Quote:


> Originally Posted by *Sujeto 1*
> 
> What? I'm not saying it's unwise to get a GTX 690; I'm saying it's a great card. Why do you do this? Misunderstanding my words on purpose... well, what a pointless forum you have here; you're not allowed to say anything except stuff that increases purchases of a particular item. I'm not advising people to do anything; that's your job. People can do what they want. And OF COURSE I can wait as long as I want; you CAN'T tell me what I must do, Mr. Arizonian.
> PS: how much do "they" pay you?


I don't get paid anything, btw. I'm enjoying the 690. However, you did come into a club thread to post the below.
Quote:


> Originally Posted by *Sujeto 1*
> 
> Guess how you're gonna feel when NVIDIA releases their GK110 GeForce: back to the childhood.


That post reads as telling members we're going 'back to the childhood' (whatever that means) by buying our 690 cards, which doesn't make sense to me.

Then to say we did it to increase our egos is a little inappropriate. We bought the 690 in pursuit of current performance. Some because it was time to upgrade right now and it's a top performance card; waiting wasn't an option.

You're entitled to your opinion and no one is saying otherwise. You posted your opinion, I posted mine. Simply that.

However, I do request we try to keep the club thread on topic. It seems to be getting derailed from being a club thread.


----------



## V3teran

The 690 is a great card,you have to own one to really appreciate how good it is,it just handles anything you throw at it effortlessly.


----------



## RobotDevil666

All those magnificent 690's wasted on 1080p ....








I was going to grab one until the 670 came out and I changed my mind; I'm getting 2x 670s instead, and already bought the first one ......
Quad SLI with those looks absolutely crazy, but it just begs for a higher res.


----------



## emett

Quote:


> Originally Posted by *ceteris*
> 
> Why is there always some guy with a 9 year old system posting on threads like these trying to make it rain on the parade or lecture us about what we should be doing about our money.


You read my mind, what a clown.
I sold my 590 a couple weeks back for $500; I don't know what planet this clown's on, but it's not Earth.
Are we starting to see more availability of the 680s and the 690? I personally doubt there will be a GK110 gaming GPU this year.


----------



## Arizonian

Quote:


> Originally Posted by *RobotDevil666*
> 
> All those magnificent 690's wasted on 1080p ....
> 
> 
> 
> 
> 
> 
> 
> 
> I was going to grab one until the 670 came out and I changed my mind; I'm getting 2x 670s instead, and already bought the first one ......
> Quad SLI with those looks absolutely crazy, but it just begs for a higher res.


3D Vision at 1080p cuts FPS by about 45%, so a lot of juice is required if one wants to maintain 60 FPS in 3D Vision gaming.


----------



## RobotDevil666

Quote:


> Originally Posted by *Arizonian*
> 
> 3D Vision at 1080p cuts FPS by about 45%, so a lot of juice is required if one wants to maintain 60 FPS in 3D Vision gaming.


Wow, didn't know that. I thought it might require a bit more power, but not that much.


----------



## Arizonian

Quote:


> Originally Posted by *RobotDevil666*
> 
> wow didn't know that , thought it might require bit more power but not that much.


Yeah, so let's say you're playing max graphics and getting 100 FPS in 2D. When you turn on 3D Vision at the same settings you'll get 55 FPS. A lot of 3D Vision owners will turn down settings one or two notches to get better FPS, depending on their GPU performance.

Technically you can get by in 3D Vision with a minimum of 30 FPS, but not one FPS lower. Naturally, an optimal 60 FPS would be the perfect spot.
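To put that rule of thumb in plain numbers, here's a minimal sketch. The ~45% figure is taken from the post above; treat it as a rough approximation, not an official NVIDIA spec:

```python
# Quick sketch of the ~45% 3D Vision performance hit described above.
# The hit fraction is an approximation from the discussion, not a spec.
def fps_in_3d(fps_2d, hit=0.45):
    """Estimate 3D Vision FPS from 2D FPS at the same settings."""
    return fps_2d * (1.0 - hit)

print(fps_in_3d(100))  # 100 FPS in 2D -> 55 FPS in 3D Vision
```

By this estimate you'd need roughly 110 FPS in 2D to hold the optimal 60 FPS once 3D Vision is on, which is why so much GPU headroom matters here.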


----------



## dboythagr8

Quote:


> Originally Posted by *Arizonian*
> 
> Yeah, so let's say you're playing max graphics and getting 100 FPS in 2D. When you turn on 3D Vision at the same settings you'll get 55 FPS. A lot of 3D Vision owners will turn down settings one or two notches to get better FPS, depending on their GPU performance.
> Technically you can get by in 3D Vision with a minimum of 30 FPS, but not one FPS lower. Naturally, an optimal 60 FPS would be the perfect spot.


Yep, this is the reason I went SLI on my 580s even though I'm at 1080p.

When is the 690 going to be available again :/


----------



## Arizonian

Quote:


> Originally Posted by *dboythagr8*
> 
> Yep, this is the reason I went SLI on my 580s even though I'm at 1080p.
> When is the 690 going to be available again :/


Early June, I've heard on the EVGA forum, in more abundance now that the 680 shortage is over. Today there are a lot of 680s on sale on Newegg. There are still 690s trickling in each day, but they go out of stock within minutes of posting.


----------



## pchow05

I saw the asus 690 come into stock this morning.


----------



## dboythagr8

What's the difference between the EVGA 690 Signature and the regular 690?


----------



## EVGA-JacobF

Just the box and contents. It has a T-shirt, mousepad and collector's box.


----------



## Kimo

What will better suit me: a GTX 680 FTW 4GB or a 690? I have a Catleap monitor (1440p).


----------



## Qu1ckset

Quote:


> Originally Posted by *Kimo*
> 
> What will better suit me. a GTX 680 ftw 4gb or 690? I have a catleap monitor (1440p)


The added VRAM on the GTX 680 4GB will give you headroom for turning up AA settings and such, but the 690 would be better because it will be able to achieve 60+ FPS, and 2GB of VRAM is more than enough as long as AA is turned down, which in my opinion is not needed at that resolution.

The single 680 will struggle to get 60 FPS in some games.

When I had 2-way SLI 580s on my 30" (2560x1600) I never ran out of VRAM as long as AA was turned down, and that was with only 1.5GB of VRAM on the 580s.


----------



## Kimo

Thanks for the reply. Is an AX750 enough to power the 690?


----------



## PeteJM

So far the 680 and 690 have been very hard to find for me. Many sites are only allowing the purchase of one 680 or 690 when they are in stock. Hopefully next month availability will be better.


----------



## SimpleTech

Quote:


> Originally Posted by *Kimo*
> 
> Thanks for the reply. Is a AX750 enough to power the 690?


It will be plenty.


----------



## Cheesemaster

Played around with some different clocks and this is what I got....


----------



## Cheesemaster

Finally got to 5.0 GHz on the CPU (49 multiplier, 102.0 BCLK). It sped up the system, resulting in a better graphics score. Here is my final run (for now) lol!


----------



## zkalra

Cheesemaster that's quite incredible. Well done.


----------



## Cheesemaster

Anybody know why my first card's fan is @ 3000 RPM and my second is @ 1400 RPM?


----------



## Arizonian

*BF3 - Caspian Border - 64 Multi player - Ultra Settings with 4xMSAA & Motion Blur OFF @ 120 Hz.*






"Frame Metering" on the GTX 690 does wonders in syncing the dual GPU eliminating micro-stutter.


----------



## emett

What res is that video filmed at? My 680's pull quite a few more frames than that.


----------



## Arizonian

Quote:


> Originally Posted by *emett*
> 
> What res is that video filmed at? My 680's pull quite a few more frames than that.


Not sure I understand what "what res is the video filmed at" means? The video itself was from my iPhone and then uploaded to YouTube at its recommended resolution, which I forget; nor do I know what res the iPhone records at.

The monitor is 1920x1080p at 120 Hz, BF3 on Ultra except 4xMSAA and motion blur turned off. Found a fully loaded 64-player multiplayer game in Caspian Border, being a heavily demanding area. I'm at 1044 MHz base core and no memory OC.

Was showing the fluidity in motion on dual GPUs. When you say quite a few frames more, are the circumstances the same? Because I pull more FPS in the campaign or other areas in BF3 multiplayer that aren't as demanding.


----------



## emett

Sorry, meant what res are you playing at.
Yeah, same situation: Caspian Border 64p, all maxed, I'd say my frames are quite a bit higher.
Both my cards boost to 1124 MHz out of the box and I have the CPU overclocked; that's probably why.
Anyway, off topic..


----------



## iARDAs

I wouldn't be surprised if 680 SLI gives a bit more FPS than a 690 though, since the benchmarks show that 670 SLI is very close to the 690 in performance.

Although I planned my route in this generation of cards by getting two 670s, I will always cherish this 690.

Great card. It really is.


----------



## Shadowness

@Arizonian, did I miss something, or shouldn't you be pulling well over 100-110 FPS? At least from the benchmarks I have seen, the 690 gets around 115 FPS in BF3 on Ultra.









I don't know; at least on my current laptop rig I play on High and I get 40-50+ FPS. The deal breaker for me to get more performance (minimum FPS, as in upping the lowest number) was to get rid of that one AA setting, don't remember the name; set it to low and it worked.

Anyway, maybe I really missed something, or maybe with a good overclock, 110+ stable will be possible. The video kinda broke my heart a bit


----------



## Arizonian

Quote:


> Originally Posted by *Shadowness*
> 
> @Arizonian, did I miss something, or shouldn't you be pulling well over 100-110 FPS? At least from the benchmarks I have seen, the 690 gets around 115 FPS in BF3 on Ultra.
> 
> 
> 
> 
> 
> 
> 
> 
> I don't know; at least on my current laptop rig I play on High and I get 40-50+ FPS. The deal breaker for me to get more performance (minimum FPS, as in upping the lowest number) was to get rid of that one AA setting, don't remember the name; set it to low and it worked.
> Anyway, maybe I really missed something, or maybe with a good overclock, 110+ stable will be possible. The video kinda broke my heart a bit


Hmmm, I'll have to look into what I did then that might be cutting myself short. I am at stock CPU, testing the card out. Have had issues that turned out not to be the card after all, so I've been focused on that first; making sure it's working before the 30-day replacement window is up.

Maybe something is still wrong with the card? I don't have time today, so I will have to look into this.


----------



## iARDAs

Quote:


> Originally Posted by *Arizonian*
> 
> Hmmm, I'll have to look into what I did then that might be cutting myself short. I am at stock CPU, testing the card out. Have had issues that turned out not to be the card after all, so I've been focused on that first; making sure it's working before the 30-day replacement window is up.
> Maybe something is still wrong with the card? I don't have time today, so I will have to look into this.


Hey bro

Don't you think a stock CPU is holding back your 690?

When I had my 590 and left my old i5 2500K at stock speed and then OCed it to 4.5, there was a noticeable difference.


----------



## PCModderMike

Quote:


> Originally Posted by *iARDAs*
> 
> Hey bro
> 
> Don't you think a stock CPU is holding back your 690?
> 
> When I had my 590 and left my old i5 2500K at stock speed and then OCed it to 4.5, there was a noticeable difference.


Maybe a noticeable difference in benchmarks such as 3DMark11...but a stock i7 3770K should not be holding back a 690 when gaming.


----------



## iARDAs

Quote:


> Originally Posted by *PCModderMike*
> 
> Maybe a noticeable difference in benchmarks such as 3DMark11...but a stock i7 3770K should not be holding back a 690 when gaming.


Oh yeah, the benchmark difference was about 1000 points in 3DMark11, but I also believe I had a difference in FPS as well. Too bad I don't have that rig to test again.


----------



## Arizonian

The i7 3770K shouldn't hold back a 690 in gaming. Will be doing a mild OC anyway, but not until after I confirm my card is working properly. Testing atm.

Turns out, going through UEFI, I had the iGPU turned off and had forced Gen3 on the PCIe slot rather than 'Auto'. Going to test out BF3 again, same scenario, Ultra all the way.


----------



## Kyouki

Looking at doing NVIDIA Surround for my GTX 690, and I have got a lot of good feedback from you guys on options. I was looking at a 24" LED 120Hz, but I got an email from Newegg and they have a great deal today on a 27" LED 60Hz for only $279 with promo code. Is this one worth getting? Because I could get three for the price of two 120Hz 24" ones. Also it says it's a TFT LCD panel; is that good? All I have been reading about is TN and IPS. I was kinda leaning towards IPS because I do graphic design on the side, but it's hard because I want a good gaming monitor. I play a lot of FPS games, but as of lately I play LoL and I'm liking it.
Here's a link: http://www.newegg.com/Product/Product.aspx?Item=N82E16824236103&cm_sp=DailyDeal-_-24-236-103-_-Product

Thank you for the advice!


----------



## y2kcamaross

Quote:


> Originally Posted by *Kyouki*
> 
> Looking at doing NVIDIA Surround for my GTX 690, and I have got a lot of good feedback from you guys on options. I was looking at a 24" LED 120Hz, but I got an email from Newegg and they have a great deal today on a 27" LED 60Hz for only $279 with promo code. Is this one worth getting? Because I could get three for the price of two 120Hz 24" ones. Also it says it's a TFT LCD panel; is that good? All I have been reading about is TN and IPS. I was kinda leaning towards IPS because I do graphic design on the side, but it's hard because I want a good gaming monitor. I play a lot of FPS games, but as of lately I play LoL and I'm liking it.
> Here's a link: http://www.newegg.com/Product/Product.aspx?Item=N82E16824236103&cm_sp=DailyDeal-_-24-236-103-_-Product
> 
> Thank you for the advice!


It's a TN panel, and still a pretty good deal, though for only a few more dollars you could order some 1440p IPS monitors off eBay.


----------



## Cheesemaster

I run it with ultra on three monitors (msaa off fxaa on) and average 100fps.


----------



## Vinnce

Here are pictures of my EVGA 690.
Got this little baby on May 8th; been ******* enjoying the thing ever since!!!!


----------



## Kimo

Sexy card! I can't wait to order mine!


----------



## btdvox

Got mine all overclocked and ready to go.
Honestly I don't even need to overclock the card, because performance is through the roof.
2560x1600 Battlefield 3, Ultra settings + 4x SSAA never goes below 70 FPS!! Usually around 80-95 FPS.

Anyway, my modest 24/7 OC is +100 Core, +200 Memory, +135 Power Target.

Gets me 1145-1167 on both cores.


----------



## Kimo

Quote:


> Originally Posted by *btdvox*
> 
> Got mine all overclocked and ready to go.
> Honestly I don't even need to overclock the card, because performance is through the roof.
> 2560x1600 Battlefield 3, Ultra settings + 4x SSAA never goes below 70 FPS!! Usually around 80-95 FPS.
> Anyway, my modest 24/7 OC is +100 Core, +200 Memory, +135 Power Target.
> Gets me 1145-1167 on both cores.


Good to know. I plan to use this card on my 2560x1440 Catleap.


----------



## Arizonian

I've retested my card after some tweaking for optimal performance. Forced PCIe x16 Gen3. Overclocked the CPU (i7 3770K) to 4.0 GHz. Went to NVCP - Manage 3D Settings for the BF3 program and set 'Texture Filtering - Quality' to High Performance.

*BF3 Caspian Border / Default Ultra Settings / i7 3770K 4.0 / GTX 690 Base 1039 Memory 1531 Boost 1144 - after dynamic overclock GPU #1 Core 1162, GPU #2 Core 1195 / 1920x1080p 120 Hz / 7:57.*








Again, impressed by the 'Frame Metering' happening in this card, syncing the GPUs to reduce micro-stutter.

Interesting what the right BIOS and settings will do for performance compared to my last run: +14% OC on CPU and GPU core. Didn't go lower than 76 FPS and went as high as 200 FPS depending on the environment.









PS Edited to add: ASIC Score 81.5% - just in case someone likes to know this info.


----------



## iARDAs

@ Arizonian

Glad that you are getting better performance. When I build my rig I will also turn on that PCIe 3.0 setting from the motherboard.

Did you try setting the Texture Filtering back to Quality? I wonder if setting it to Quality or High Performance gives a noticeable FPS difference. When I had my 590 I always set everything to High Quality. For me, Quality and High Quality didn't really have a visual difference, but Quality vs High Performance had some difference.


----------



## Arizonian

Quote:


> Originally Posted by *iARDAs*
> 
> @ Arizonian
> 
> Glad that you are getting better performance. When I build my rig I will also turn on that PCIe 3.0 setting from the motherboard.
> 
> Did you try setting the Texture Filtering back to Quality? I wonder if setting it to Quality or High Performance gives a noticeable FPS difference. When I had my 590 I always set everything to High Quality. For me, Quality and High Quality didn't really have a visual difference, but Quality vs High Performance had some difference.


I guess I'd have to retry on 'optimal' settings and observe whether that made a difference or not.

I'm not sure which of my changes helped most. I tweaked the 'Manage 3D Settings' default profile for BF3 from Quality to High Performance. I overclocked to a mild, stable +14% on GPU and CPU this time. Went default 'Ultra' settings rather than turning off Motion Blur and doing custom. Also, though I don't think it made much difference, I forced PCIe x16 to Gen3 rather than 'Auto'.

Through all my testing for the last week and a half, my card has passed my own personal inspection. What I thought was a card issue is now confirmed to have been a PCIe power cable with a poor connection, in short.

I'm very pleased.


----------



## Cheesemaster

I really need help here, people. I need some clarification on how to optimally set up my Surround 3D with quad-SLI GTX 690s. Here is the article from NVIDIA:


----------



## Kyouki

I don't have quad SLI on my 690 setup yet! But based on that image you gave us it looks simple. Just make sure you use the green selected ports. You didn't label the issue you're having, but I assume you are wondering which setup is better, the right or the left? In that case I am not much help. I actually would like to know if it is a benefit to have one GPU run PhysX, rather than the setup on the LEFT with PhysX set to auto-select GPU. Not taking away from Cheesemaster's question, but while one of you informed members answers it, can you go into more detail on why or why not to use a single GPU to run PhysX? And for me, if I was to keep one GTX 690 and use a second card, like an 8800 GTS lying around or a new GTX 680, as a single add-in to run PhysX, is it worth it, I guess, is the question. Thank you. I hope that's what you're also wondering, Cheesemaster, so it can be answered in more detail.


----------



## nadfink

Hi guys,

Been following this thread for weeks now - seen some awesome setups & overclocks!! This thread pretty much convinced me to pick up my Gainward 690 - so thanks! I'm now a lot poorer...

Haven't pushed the card that much as yet, as I'm not really liking the air cooling that my 800D is providing right now (installed custom fans but still pretty poor). So I wondered if anyone could suggest a replacement case which would offer some improved air cooling goodness? I'm seriously open to any suggestions here...

Cheers


----------



## Cheesemaster

Quote:


> Originally Posted by *Kyouki*
> 
> I don't have quad SLI on my 690 setup yet! But based on that image you gave us it looks simple. Just make sure you use the green selected ports. You didn't label the issue you're having, but I assume you are wondering which setup is better, the right or the left? In that case I am not much help. I actually would like to know if it is a benefit to have one GPU run PhysX, rather than the setup on the LEFT with PhysX set to auto-select GPU. Not taking away from Cheesemaster's question, but while one of you informed members answers it, can you go into more detail on why or why not to use a single GPU to run PhysX? And for me, if I was to keep one GTX 690 and use a second card, like an 8800 GTS lying around or a new GTX 680, as a single add-in to run PhysX, is it worth it, I guess, is the question. Thank you. I hope that's what you're also wondering, Cheesemaster, so it can be answered in more detail.


The illustrations are in conflict with the written special instructions. I do get a message from time to time that my config isn't optimal in regard to the monitor setup.


----------



## Vinnce

Hey guys, since I figured that the GTX 690 will push hot air inside my computer case, I kinda started believing that having my H100 rad mounted in the front of the case in push/pull as intake was bad for the card, as it would always blow air very fast at the end of the card.

If I mount my rad to the top, would it be logical to set up the fans as shown in this picture, or would it be better to keep up with the old-school way: front intake, bottom intake, top out, back out?


----------



## Arizonian

Quote:


> Originally Posted by *nadfink*
> 
> Hi guys,
> Been following this thread for weeks now - seen some awesome set up & overclocks!! This thread pretty much convinced me to pick up my Gainward 690 - so thanks! I'm now a lot poorer...
> Haven't pushed the card that much as yet as i'm not really liking the air cooling that my 800d is providing right now (installed custom fans but still pretty poor). So wondered if anyone could suggest a replacement case which would offer some improved air cooling goodness???? I'm seriously open to any suggestions here...
> Cheers


Hi Nadfink,

See it's your first post. Welcome to OCN.









The GTX 690 is a very good card, I agree. What are your card's temps when it sits idle & max temps at full throttle?

A lot of cases to choose from which have good airflow. My current favorites come from NZXT in the Phantom line.

NZXT Phantom (full tower) or Phantom 410 (mid-tower). I have the mid-tower with seven fans, and with the middle HDD rack removed could have eight. The full tower offers more fan options. I preferred my middle HDD rack out of the way of the front fans for more airflow, personally. There's a fan you can mount right on the case facing your card sideways and a bottom fan you can add facing upward, blowing cool air toward your GPU fan. Newegg shows off the 



 nicely.

My idle temps are 31C-32C and full throttle 73C-74C using NZXT Phantom 410.

Another great place on OCN as you do your homework on cases would be - Computer Cases.

If you should find the time, we would also like to see your system specs in Your Profile / Edit System. It helps others help you with any questions you might have; knowing what you're dealing with lets them give the best advice.

Again, Welcome.


----------



## nadfink

Quote:


> Originally Posted by *Arizonian*
> 
> Hi Nadfink,
> See it's your first post. Welcome to OCN.
> 
> 
> 
> 
> 
> 
> 
> 
> The GTX 690 is a very good card, I agree. What's your cards temp when it sits idle & max temps when full throttle?
> A lot of cases to choose from which have good air flow. My current favorites come from NZXT in the Phantom line.
> NZXT Phantom (Full tower) or Phantom 410 (Mid-Tower). I have the mid-tower and have seven fans and with middle HDD rack could have eight. The full tower offers more fan options. I preferred my middle HDD rack out of the way of front fans for more air flow personally. There's a fan you can mount right on the case facing your card sideways and a bottom fan you can add facing upward blowing cool air toward your GPU fan. Newegg shows off the
> 
> 
> 
> nicely.
> My idle temps are 31C-32C and full throttle 73C-74C using NZXT Phantom 410.
> Another great place on OCN as you do your homework on cases would be - Computer Cases.
> If you should find the time, we would also like to see your system specs in Your Profile / Edit System. It helps others help you with any questions you might have; knowing what you're dealing with lets them give the best advice.
> Again, Welcome.


Hi Arizonian

Thanks for the welcome







Hopefully my sig should be showing now.

Right now at stock clocks I'm idling at 37/38. I've seen load temps hit 79/80 in Battlefield 3, so a good 6C higher than your 410. It is quite warm here in the UK at the minute, but I want to get those temps down so I can find a decent 24/7 OC.

Best run I've had through Heaven so far has been 135% PT, 120 GPU & 200 MEM - the card didn't like going any higher than that! Don't think I've got the best overclocker, but reckon I can get some more out of it once the temps are down. I'll scoot on over to the cases section & see what I can find....


----------



## Arizonian

Quote:


> Originally Posted by *nadfink*
> 
> Hi Arizonian
> Thanks for the welcome
> 
> 
> 
> 
> 
> 
> 
> Hopefully my sig should be showing now.
> Right now at stock clocks i'm idling at 37/38. I've seen load temps hit 79/80 in Battlefield 3 so a good 6c higher than your 410. It is quite warm here in the UK at the minute but want to get those temps down so I can find a decent 24/7 oc.
> Best run i've had through Heaven so far has been 135% PT, 120 GPU & 200 MEM - the card didn't like going any higher than that! Don't think i've got the best over-clocker but reckon I can get some more out of it once the temps are down. I'll scoot on over to the cases section & see what I can find....


Interesting, because I've had similar overclock offsets and seem to be hitting that same ceiling.

I'm at +125 PT / +124 Core / +58 Memory. Haven't messed much with memory; memory seems to bring the least FPS gains, if any, and raises temps more than it helps.

The ambient temp of your room might be your biggest factor, but we're both seeing approximately the same overclock ceiling, and even though I live in a desert I've got my home temp at 76F-77F regardless of the outside heat.


----------



## Kyouki

Quote:


> Originally Posted by *Cheesemaster*
> 
> The illustrations are in conflict with the written special instructions. I do get a message time to time that my config isn't optimal in regards to the monitor setup.


I didn't read the text in the picture, my bad! That's a good point! I still want to know whether using a single GPU for PhysX is good or not! Haha, sorry I was no help; I am sure someone in here will know!


----------



## Cheesemaster

So NVIDIA says that I don't need the mini-DisplayPort adapter for it to work. Well, I had ordered a cable... sad face, that's one hun down the tubes. Thank god it's from Newegg, so I'll get my money back...

Update #2: With all the max settings in NVIDIA Control Panel, FXAA on... Battlefield 3 on Ultra, MSAA off... FXAA does the same thing to my humble eye. I am getting an average 100 FPS on 120Hz monitors at 5760x1080p. It is quite awesome; my three-way 580 GTX 3GB setup could not do that. So more VRAM doesn't mean anything, in my opinion, at this res.


----------



## Kyouki

Very nice! I am still stuck on what monitors I am going with. I think I really still wanna do 3+1, but I am worried I might want a second GTX 690 after I do that, which I have no problem with, but my bank might when I'm trying to buy a house. lol


----------



## PowerK

Hi folks.

A long time lurker registered today.









I've recently built a GTX 690 Quad-SLI rig.

3770K @ 4.5GHz
AsRock Fatality Z77 Professional
16GB G.SKILL ARES 2133MHz
2x ASUS GTX 690 @ stock
Seasonic SS-1000XP
DELL U3011

3DMark11 Extreme mode.
http://3dmark.com/3dm11/3520849

3DMark11 Performance mode.
http://3dmark.com/3dm11/3520627

Here're some pictures.


----------



## Imprezala

Very, very awesome setup. I am very jealous. It's funny, I have the same wallpaper but a 680.


----------



## error-id10t

Beast, nice and clean.. I like.


----------



## dboythagr8

I would like to buy this card.


----------



## PCModderMike

Quote:


> Originally Posted by *PowerK*
> 
> Hi folks.
> A long time lurker registered today.
> 
> 
> 
> 
> 
> 
> 
> 
> I've recently built a GTX 690 Quad-SLI rig.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 3770K @ 4.5GHz
> AsRock Fatality Z77 Professional
> 16GB G.SKILL ARES 2133MHz
> 2x ASUS GTX 690 @ stock
> Seasonic SS-1000XP
> DELL U3011
> 3DMark11 Extreme mode.
> http://3dmark.com/3dm11/3520849
> 3DMark11 Performance mode.
> http://3dmark.com/3dm11/3520627
> Here're some pictures.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 


Wow that's an awesome setup, got me jealous


----------



## mironccr345

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *PowerK*
> 
> Hi folks.
> A long time lurker registered today.
> 
> 
> 
> 
> 
> 
> 
> 
> I've recently built a GTX 690 Quad-SLI rig.
> 3770K @ 4.5GHz
> AsRock Fatality Z77 Professional
> 16GB G.SKILL ARES 2133MHz
> 2x ASUS GTX 690 @ stock
> Seasonic SS-1000XP
> DELL U3011
> 3DMark11 Extreme mode.
> http://3dmark.com/3dm11/3520849
> 3DMark11 Performance mode.
> http://3dmark.com/3dm11/3520627
> Here're some pictures.





I'm jelly! Nice rig and nice 3DMark11 scores.


----------



## btdvox

Are many of you getting around 1145-1190 core with Power Target @ 135%?

My card is overclocked to +100 on core, which is 1045 base, but in games it seems to always run from 1145-1190 on both cores. Never goes below it...

Dynamic adjust is pretty awesome.


----------



## Arizonian

Quote:


> Originally Posted by *btdvox*
> 
> Are many of you getting around 1145-1190 core with Power Target @ 135%?
> My card is overclocked to +100 on core which is 1045 base but in games it seems to always run from 1145-1190 on the both cores. Never goes below it...
> Dynamic adjust is pretty awesome.


I'm getting Max Boost *1170* & *1200* on Core with +125 PT, +129 Core offsets. My dynamic overclock comes from a base core of 1044 MHz.

That translates into a 129 MHz base core overclock, and ends up a 255 MHz & 285 MHz overclock at Max Boost on the core.

Your overclock compares with my findings as well.
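For anyone following along, here's a rough sketch of that boost math. I'm assuming the GTX 690's reference base clock of 915 MHz (not quoted in the post itself; only the overclocked numbers are):

```python
# Sketch of the boost-offset arithmetic, assuming the reference GTX 690
# base clock of 915 MHz (an assumption; the post quotes only OC numbers).
STOCK_BASE = 915  # MHz, reference GTX 690 base clock

core_offset = 129                     # MHz offset set in the OC tool
oc_base = STOCK_BASE + core_offset    # overclocked base core clock
max_boost = (1170, 1200)              # observed max boost per GPU

# Total overclock relative to the stock base clock, per GPU
gains = [boost - STOCK_BASE for boost in max_boost]

print(oc_base)  # 1044
print(gains)    # [255, 285]
```

That is, the +129 offset lifts the base to 1044 MHz, and the dynamic boost then lands 255 MHz and 285 MHz above the stock base on the two GPUs, matching the numbers above.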


----------



## btdvox

Quote:


> Originally Posted by *Arizonian*
> 
> I'm getting Max Boost *1170* & *1200* on Core with +125 PT +129 Core offsets. My dynamic over clock comes from a Base Core 1044 MHz.
> Translates into a 129 MHz Base Core over clock and ends up 255 MHz & 285 MHz over clock Max Boost on the Core.
> Your over clock compare with my findings as well.


Thanks for letting me know.

Yeah, it's interesting. My core is actually at 1015, not 1045. I kind of cherry-picked those numbers from all the reviews I saw, plugged them in, and then ran 3DMark 11 for 2 hours with no issues.

Usually I overclock the opposite way (going up 25 MHz and testing). I'm sure my card has more room left, but at 2560x1600 I don't think I need the extra performance at all, except for benchies.


----------



## WaXmAn

Just got mine installed and played a few rounds of BF3. Awesome!!!


----------



## Arizonian

Quote:


> Originally Posted by *WaXmAn*
> 
> Just got mine installed and played a few rounds of BF3 Awesome!!!


Congrats! Great card for great gaming.









Welcome to OCN.


----------



## dboythagr8

Quote:


> Originally Posted by *WaXmAn*
> 
> Just got mine installed and played a few rounds of BF3 Awesome!!!


What are your temps like (CPU and GPU)? I also have an FT02 and will be moving back to it once I get my new cards.


----------



## PowerK

Does anyone know where to get a good-looking SLI bridge (70mm) like the one in the picture below?










I'm currently using the SLI bridge that came with my ASRock mobo, as seen in the picture below. But I would like to get a better-looking SLI bridge.







I've searched eBay with no luck.


----------



## WaXmAn

Quote:


> Originally Posted by *dboythagr8*
> 
> What are your temps like (cpu and GPU)? i also have a ft02 and will be moving back to it once i get my new cards


After 3 hrs playing BF3 tonight, my GTX 690 was hovering around 78 to 80C the whole time!!


----------



## Arizonian

Quote:


> Originally Posted by *WaXmAn*
> 
> After 3 hrs playing BF3 tonight my gtx 690 was hovering around 78 to 80C the whole time!!


Seems a bit on the warm side. A +14% core & +2% memory OC runs 71C-74C for me.

What is your overclock on Core & Memory?

_Too high a memory overclock can raise temps and make a difference in maximizing your core overclock._

It does sound like you've got a stable overclock atm, though.


----------



## SpartanVXL

Hey guys, what can you spot that's wrong on this page? (Apart from the price; it's in NZD, but still.)

http://www.trademe.co.nz/computers/components/video-cards/pciexpress/auction-479274980.htm

Lol... I'm of half a mind to report him for a scam; the guy sounds like a right troll.


----------



## PTCB

Quote:


> Originally Posted by *SpartanVXL*
> 
> Hey guys what can you spot thats wrong on this page? (Apart from the price, it's in NZD but still)
> http://www.trademe.co.nz/computers/components/video-cards/pciexpress/auction-479274980.htm
> Lol... I'm in half a mind to report him for being a scam, the guy sounds like a right troll


I just came from there and was thinking the same thing. No person in their right mind will buy it. Just wasting TM's server space.


----------



## nadfink

Quote:


> Originally Posted by *PowerK*
> 
> Hi folks.
> A long time lurker registered today.
> 
> 
> 
> 
> 
> 
> 
> 
> I've recently built a GTX 690 Quad-SLI rig.
> 3770K @ 4.5GHz
> AsRock Fatality Z77 Professional
> 16GB G.SKILL ARES 2133MHz
> 2x ASUS GTX 690 @ stock
> Seasonic SS-1000XP
> DELL U3011
> 3DMark11 Extreme mode.
> http://3dmark.com/3dm11/3520849
> 3DMark11 Performance mode.
> http://3dmark.com/3dm11/3520627
> Here're some pictures.


Hi Powerk,

Loving that setup. Is that the X900 case you have there? I've been looking into picking up a new case recently - that might be my next case!! How are the temps with the 690s?


----------



## zkalra

My Latest Heaven benchmarks on:
1. 3930k at 4.4 (1.4v)
2. GTX 690s Quad Sli @ +120 power, +100 core, +200 Mem
3. RIVE
4. Corsair vengeance 1600 @ 1600


----------



## Pingis

Just got my 690 this morning from Newegg. Didn't have much time to check it out as I was on my way to work, but I noticed there were no seals on the box. So for those of you who bought one, did yours come sealed with those clear circles?
I got the ASUS model.


----------



## Stateless

Can we discuss the power target setting a bit?

I was always told/read that you should just max the power target and then mess with the offset, but I am seeing more and more people using 120, 110 or the like instead of full power. What is this actually doing? I did a test last night at 120 power vs. 135 power, and boost clocks, voltage etc. all hit the max regardless of which power target I used. Just curious as to what it really is doing.

Thanks!

BTW... Has ANYONE seen or read about a 690 on water yet? I know Koolance is the only company that has blocks right now, and EVGA will have them as well soon, but I have yet to read that anyone has put these on water. Either it has not been done yet or my Google skills are lacking.


----------



## bnj2

Quote:


> Originally Posted by *Pingis*
> 
> just got my 690 this morning from newegg. didn't have much time to check it out as i was on my way to work but i noticed there were no seals on the box. so for those of you who bought one, did yours come sealed with those clear circles?
> i got the asus model.


AFAIK Asus products don't come with those seals. None of my previous Asus cards or motherboards had one, and the 690 was no exception.


----------



## PowerK

Quote:


> Originally Posted by *nadfink*
> 
> Hi Powerk,
> Loving that set - up. Is that the X900 case you have there? I've been looking into picking up a new case recently - that might be my next case!! How are temps with the 690's?


Hi nadfink,

Yes, it's X900 Red edition.

The temps are pretty high with two 690s.
With ambient around 28C, the first 690 goes as high as 90C.


----------



## derickwm

If anyone is looking for a 690 I have one up for trade







ordered it and then had a change of heart









www.overclock.net/t/1263953/ft-evga-690-for-intel-6-core-xeons


----------



## dboythagr8

Just ordered a 690 Sig from EVGA. Should be here by Saturday at the latest









Had my gmail opened and set to desktop notifications and just about fell out of my seat when it popped up with the 690 notification


----------



## Pingis

OK, I'm pretty ticked right now. Just got home from work and was juiced to open up my GTX 690 from Newegg. They definitely sold me a used card. There are fingerprints on it AND a scratch in the shroud. This is unacceptable, especially since I paid full price for a used card. I'll take pictures once I find my camera. Gonna call Newegg first.


----------



## Cheesemaster

Quote:


> Originally Posted by *PowerK*
> 
> Hi nadfink,
> Yes, it's X900 Red edition.
> The temps are pretty high with two 690s.
> Under ambient temp around 28C, the first 690 goes as high as 90C.


Playing BF3 online on ultra settings, MSAA off (FXAA on in the NVIDIA control panel), running three 120Hz monitors - these are my temps and speeds.


----------



## Qu1ckset

Will the GTX 690 even max out a PCIe 2.0 slot, or will it be bottlenecked? I'm thinking of upgrading my CPU/mobo, but it would be better to wait if the 690 isn't bottlenecked at PCIe 2.0.


----------



## Stateless

Quote:


> Originally Posted by *Pingis*
> 
> Ok. I'm pretty ticked right now. Just got home from work and was juiced to open up my GTX 690 from newegg. They definitely sold me a used card. There are finger prints on it AND a scratch in the shroud. This is unacceptable especially since i paid full price for a used card. I'll take pictures once i find my camera. Gonna call newegg first.


Did it have the protective plastic on the windows of the card?

It sucks that they sold a used card... kind of bizarre at the same time, since there haven't been many available, and for them to take in a used one and resell it as new is kind of bad/odd.


----------



## Stateless

What is the consensus among GTX 690 owners on a 120Hz monitor vs. a 2560x1600 monitor?


----------



## dboythagr8

Quote:


> Originally Posted by *Stateless*
> 
> What is the consensus among GTX 690 owners on a 120Hz monitor vs. a 2560x1600 monitor?


I can let you know on Saturday..I just ordered the Dell U3011 and 690 today.

Tis Smokey btw:thumb:


----------



## Pingis

It did have the plastic on the vapor chambers. I just got off the phone with customer support and they said I'll have to do an RMA for a new card. But since it's out of stock, they said I'll most likely just get a refund. Super lame.


----------



## Stateless

Quote:


> Originally Posted by *dboythagr8*
> 
> I can let you know on Saturday..I just ordered the Dell U3011 and 690 today.
> Tis Smokey btw:thumb:


Hey Smokey... so you have a 690 now (or will)... cool man! So what are you going to do with your 3D stuff? I assume it won't work on that new monitor since it is not 3D?


----------



## Stateless

Quote:


> Originally Posted by *Pingis*
> 
> it did have the plastic on the vapor chambers. i just got off the phone with customer support and they said i'll have to do an RMA for a new card. But since its out of stock they said i'll most likely just get a refund. super lame.


Were any of the cables or anything else opened? Was the static bag still sealed?

Regardless, it sucks man...especially since most of us have waited so long to get our cards and to have this happen would really piss me off.


----------



## Pingis

So now I'm just wondering if I should try it before I send it back... that's probably what the last guy did.


----------



## dboythagr8

Quote:


> Originally Posted by *Stateless*
> 
> Hey Smokey...So you got a 690 now (or will)...Cool man! So what are you going to do with your 3d stuff? I assume it wont work on that new monitor since it is not 3d?


Yeah I ordered the 690 Sig edition from Evga.com earlier this afternoon, and the U3011 from the Egg as well. For now I'm going to hold onto my Asus 3d monitor. I will probably pop the 690 in and try it on the Asus since it's already hooked up, before removing it for the U3011.

Will be able to see first hand which I prefer, 120hz or the graphical fidelity of 1600p. I took next week off for E3 and I should be getting the monitor and 690 on Saturday (Monday at the very latest). I will have plenty of time to play around:thumb:


----------



## Stateless

Quote:


> Originally Posted by *dboythagr8*
> 
> Yeah I ordered the 690 Sig edition from Evga.com earlier this afternoon, and the U3011 from the Egg as well. For now I'm going to hold onto my Asus 3d monitor. I will probably pop the 690 in and try it on the Asus since it's already hooked up, before removing it for the U3011.
> Will be able to see first hand which I prefer, 120hz or the graphical fidelity of 1600p. I took next week off for E3 and I should be getting the monitor and 690 on Saturday (Monday at the very latest). I will have plenty of time to play around:thumb:


LMAO... you are as bad as I am. I am also off next week... actually off this week too, but have next week off to enjoy all the E3 stuff. You'll definitely have to let me know which is the better solution. For me, it will be hard to tear away from my 1080p 55" HDTV for gaming, but I have heard nothing but how amazing games look at the super hi-res. 120Hz, we will see. I've read several studies saying the human eye can't really perceive 120fps... I think the report said most people can tell up to around 70-75fps. I know some people that can't tell the difference between 30 and 60... I sure as hell can though.

Have you considered going full water with your set up yet? I have parts coming in today and tomorrow to go full water.


----------



## Pingis

Quote:


> Originally Posted by *Stateless*
> 
> Were any of the cables or anything else opened? Was the static bag still sealed?
> Regardless, it sucks man...especially since most of us have waited so long to get our cards and to have this happen would really piss me off.


The static bag was sealed, but the driver CD and manual looked like they had been handled, with bent corners and wrinkled pages. I just decided that since I got screwed, I'm gonna test it out before I send it back. Hooked it up to the 27" ACD and it's just crazy to me that the 690 runs everything smoothly at that res. I've tried Batman AC and BF3 and it was pretty much locked at 60fps. Gotta find my camera's transfer cable to post some pics. Also gonna hook up my 120Hz monitor to test that out.


----------



## dboythagr8

Quote:


> Originally Posted by *Stateless*
> 
> LMAO...You are as bad as I am. I am also off next week...actually off this week too, but have next week off to enjoy all the E3 stuff. Yeah, you have to definately let me know what is the best solution. For me, it will be hard to tear away from my 1080p 55" HDTV for gaming, but I have heard nothing but how amazing games look at the super hi-res. 120mhz, we will see. I read several studies and the human eye can't really see 120fp...I think the report said most people can tell up to around 70-75fps. I know some people that cant tell the difference between 30 and 60...I sure and hell can though.
> Have you considered going full water with your set up yet? I have parts coming in today and tomorrow to go full water.


It's not so much that it looks any smoother... once you get over 60fps it will look smooth. Rather, it _feels_ much more responsive once you get around 90+ frames thanks to the 120Hz refresh. The closer you can get to 120fps, the better. Hard to explain until you've tried it. I'm interested to see how I'll make the transition back to 60Hz, since I have been using a 120Hz monitor since I built my machine last year.

Still considering full water but I don't really have any plans at the moment to really follow through on it.
Quote:


> Originally Posted by *Pingis*
> 
> static bag was sealed but the driver cd and manual looked like they had been handled due to bent corners and wrinkled pages. I just decided that since i got screwed i'm gonna test it out before i sent it back. hooked it up to the 27" ACD and its just crazy to me that the 690 runs everything smoothly at that res. I've tried Batman AC and BF3 and it was pretty much locked at 60fps. Gotta find my camera's transfer cable to post some pics. Also gonna hook my 120hz monitor to test that out.


What's the res on the ACD?


----------



## Stateless

Quote:


> Originally Posted by *Pingis*
> 
> static bag was sealed but the driver cd and manual looked like they had been handled due to bent corners and wrinkled pages. I just decided that since i got screwed i'm gonna test it out before i sent it back. hooked it up to the 27" ACD and its just crazy to me that the 690 runs everything smoothly at that res. I've tried Batman AC and BF3 and it was pretty much locked at 60fps. Gotta find my camera's transfer cable to post some pics. Also gonna hook my 120hz monitor to test that out.


Clock the **** out of it, stat! If it clocks really well, you may consider keeping it! My two 690s clock exactly the same (they are in serial number sequence)... I know others here have been able to get higher offsets than I can... so if you can live with it and it is a good clocker, you might want to consider keeping it.


----------



## Pingis

Quote:


> Originally Posted by *dboythagr8*
> 
> What's the res on the ACD?


its 2560 x 1440 (16:9)


----------



## Pingis

Quote:


> Originally Posted by *Stateless*
> 
> Clock the **** out of it stat! If it clocks really well, you may consider keeping it! My 2 690's clock exactly the same (they are in serial number sequence)...I know others here have been able to get higher offsets than I can...so if you can live with it and it is a good clocker...you might want to consider keeping it.


What are you using to overclock it? This is only my second NVIDIA card. I came from a GTX 580 Matrix that I left at stock, and before that two 6870s in CrossFire, so I'm not sure how to even begin.


----------



## dboythagr8

Quote:


> Originally Posted by *Pingis*
> 
> what are you using to overclock it? this is only my second nvidia card. Came from a gtx 580 matrix that I left stock and before that 2 6870s in crossfire so i'm not sure how to even begin.


Most recommend EVGA Precision tool. MSI Afterburner works as well.


----------



## Stateless

Quote:


> Originally Posted by *Pingis*
> 
> what are you using to overclock it? this is only my second nvidia card. Came from a gtx 580 matrix that I left stock and before that 2 6870s in crossfire so i'm not sure how to even begin.


Go to evga.com and download Precision X. Once you install it, set up a quick fan profile under settings. I basically use 1:1 up until 55C or so. So if the card is at 55C the fan is at 55%, but once it hits 60C I have the fan at 70%, and keep it scaling like that - so if the card hits 65C the fans automatically go to 75%.

Once you have a fan profile set, change the power target to 135%, then use the offset slider and start at +100. Almost every 690 that I know of can easily do +100, so you can start there (or at +120 or so) and run Unigine at full tilt for a few runs. On my cards, the highest I can go in SLI is +120; individually they go to +130 on the offset. Don't worry about memory until you find a good core overclock. If you can be stable above +140 or so, you've got yourself a good card, as most don't go past +130 - but we have users here that have hit +150 or a bit higher.
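A fan profile like the one described can be pictured as piecewise-linear interpolation between (temperature, fan %) anchor points. Only the 55C/55%, 60C/70% and 65C/75% points come from the description above; the end points are assumptions for illustration:

```python
# Sketch of a Precision X-style fan curve as linear interpolation
# between anchor points. The 55/55, 60/70 and 65/75 anchors follow the
# profile described above; the 30/30 and 80/100 ends are assumed.
CURVE = [(30, 30), (55, 55), (60, 70), (65, 75), (80, 100)]

def fan_percent(temp_c):
    """Fan duty (%) for a given core temperature (C)."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # interpolate between the two surrounding anchor points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin the fan at max above the last anchor

print(fan_percent(55))  # 55.0
print(fan_percent(60))  # 70.0
print(fan_percent(65))  # 75.0
```

The idea is just to keep the fan ramping ahead of the temperature so the card never sits at its thermal limit while you hunt for a stable offset.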


----------



## Pingis

Thanks fellas. Rep to both of you.



Here is a pic of the chip in the fan. It was the best I could get with my camera. It's in the bottom middle, on the leftmost side of that stupid shadow.


----------



## Stateless

Quote:


> Originally Posted by *Pingis*
> 
> Thanks fellas. Rep to both of you.
> 
> Here is a pic of the chip in the fan. It was the best I could get with my camera. Its in bottom middle on the left most side of that stupid shadow.


Yeah, I can see that chip. It is not too bad, but run the card to make sure it does not make odd sounds. Overclock it a bit to test it, and if it clocks well you might want to keep it. BTW, which brand is your 690? If it is EVGA, there is a sticky post in the EVGA 600 Series forum with a utility that lets you dim and do other things with the lighted LED on the card itself. It must be an EVGA card for it to work, but it is a pretty neat little utility.


----------



## bnj2

It's been a month since the Asus 690 went out of stock here. Is there a reason I shouldn't get an MSI or a Gigabyte card for the 2nd 690, or should I wait?


----------



## Kyouki

Quote:


> Originally Posted by *dboythagr8*
> 
> Yeah I ordered the 690 Sig edition from Evga.com earlier this afternoon, and the U3011 from the Egg as well. For now I'm going to hold onto my Asus 3d monitor. I will probably pop the 690 in and try it on the Asus since it's already hooked up, before removing it for the U3011.
> Will be able to see first hand which I prefer, 120hz or the graphical fidelity of 1600p. I took next week off for E3 and I should be getting the monitor and 690 on Saturday (Monday at the very latest). I will have plenty of time to play around:thumb:


I'm in for your review on what you think is better! I'm still in the market for new monitor setup but I am taking my time.


----------



## Cheesemaster

sorry I have the HAF 932 advanced.


----------



## Pingis

After playing BF3 for about an hour each on my ACD and Asus VG236H, I would have to say I prefer the ACD for the higher res. I don't play competitively, so the extra benefit of the 120Hz doesn't really apply to me. Time to sell it. If anyone is interested, hit me up. Decided to contact Newegg to see if they will give me a discount or something on the card, since it's out of stock and if I send it back for a replacement I probably won't get one.


----------



## ceteris

Quote:


> Originally Posted by *Pingis*
> 
> after playing BF3 for about an hour each on my ACD and Asus VG236H i would have to say I prefer the ACD for the higher res. I don't play competitively so the extra benefit of the 120hz doesn't really apply to me. Time to sell it. If anyone is interested hit me up. Decided to contact newegg to see if they will give me a discount or something on the card since its out of stock and if I send back for a replacement i probably won't get one.


Personally I don't think it's that big of a deal. Even if you had your system mounted on a tech bench with the card facing you, you aren't going to see that chip. While I do agree with you on the principle of having a legitimate expectation of getting a brand new card that has not been opened, inspected and/or touched, you currently have one of the rarest graphics cards on the market atm. If that card were posted on eBay and advertised as open box with a chip in the fan, it would still be swiped within seconds.

But do try getting a discount on it as well, since you did get an open-box item.
Quote:


> Originally Posted by *bnj2*
> 
> It's been a month since Asus 690 is out of stock here. Is there a reason I shouldn't get a MSI or a Gigabyte card for the 2nd 690 or should I wait?


How bad do you want it? LOL. The cards are pretty much the same. I think one of them has the company branding on it; just get the one that doesn't. For me, I've decided to hold off on getting a 2nd GTX 690 and wait for GK110, in the event they come out with a dual-GPU card like the 690. I'm playing every game maxed out as it is, and waiting for the other waterblocks to be released is just pushing my build out further.


----------



## Pingis

I read over at [H] that Newegg has 51 EVGA GTX 690s in stock. The cosmetic defects don't bug me; it's just that they would sell something that has obviously been used for full price.


----------



## Shadowness

Quote:


> Originally Posted by *Stateless*
> 
> What is the consensus among GTX 690 owners on a 120Hz monitor vs. a 2560x1600 monitor?


120Hz monitor, trust me.

To add to this: a big 40" Samsung 3DTV modified to work with NVIDIA 3D Vision - with a full cover over the TV's IR emitter so it doesn't send its own signal - using the 3D Vision glasses. Home cinema and gaming BIG.


----------



## Kimo

I just ordered a gtx 690









Might still be available


----------



## Arizonian

Quote:


> Originally Posted by *Kimo*
> 
> I just ordered a gtx 690
> 
> 
> 
> 
> 
> 
> 
> 
> Might still be available


Pulled that trigger, did you? For some of us (referring to myself) it's sort of a rush for us addicts.









Post back with some pics in your rig when you receive it. Congrats!


----------



## Marcsrx

Just ordered mine as well. I will be a member in 3 business days!


----------



## Arizonian

Quote:


> Originally Posted by *Marcsrx*
> 
> Just ordered mine as well. I will be a member in 3 business days!


Looking forward to pics as well then.









Would love to see your system specs in 'Your Profile' / 'Edit Systems'. Nice to get an idea of what rig it's going in etc.....


----------



## Marcsrx

Will do. Rig is still under construction at the moment. I'm an OCN n00by so bear with me haha


----------



## Stateless

The Koolance water block looks pretty good once it is installed. I will get some pics of it up as soon as I can. Doing my 1st water build and taking it slow right now. Got the waterblock on the 690 done, installed the top rad, installed the CPU block, and of course gutted the system earlier to be ready for the new water stuff that came in today.

I have to say, it was a bit of a pain to get all the thermal tape on everything that required it. There were 10 small chips, all the memory chips, and the SLI chip; the 2 GPUs used thermal paste. What I found interesting is that the Koolance instructions have you putting thermal tape on everything except the GPUs, but when I opened it up and removed the original shroud, it looks like NVIDIA used thermal paste on the 10 small chips, the SLI chip and the GPUs.

I had read that it does not hurt to put a dab of thermal paste on the chips, then add the thermal tape on top with another small dab, to help with heat transfer. The one thing I did learn, having never done this before, is that with thermal tape you should cut pieces slightly smaller than the chip. Once you tighten down the screws for the block, it compresses the tape and some of it pushes out - kind of like when you put too much peanut butter on bread and press the other slice on, and a little PB oozes out the sides (lol).

I should be able to put some pics up in a few hours... just taking a break and watching the NBA game for a little bit.


----------



## tonyjones

Anybody play Max Payne 3 with GTX 690 yet, god damn it's beautiful, 4K textures!


----------



## Arizonian

I didn't know that the VRAM on the GTX 690's are higher density "Samsung" modules?









Wonder if that meeting with Samsung earlier this year had something to do with the VRAM on the 690?

I thought all VRAM on the Kepler series line came from TSMC?


----------



## ReignsOfPower

It's good to be welcomed

















I'm planning to watercool it. Waiting on the EK acetal/nickel full-cover waterblock + backplate before I do. EVGA GTX 690


----------



## Qu1ckset

Quote:


> Originally Posted by *ReignsOfPower*
> 
> It's good to be welcome
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm planning to watercool it. Waiting on EK Acetal Nickel Full cover waterblock + back-plate before I do. EVGA GTX 690


Sorry, this might be a little off topic, but how do you like the Corsair K60? Is it beating the Razer BlackWidow?


----------



## Pingis

congrats on the new card Reigns. Its pretty awesome. Looks like I'm sending mine back to newegg. The best they could do was offer me a $30 gift card for selling me a used 690 for full price. What a bunch of BS. Hope you enjoy your card. Time for me to track down another one.


----------



## WaXmAn

Quote:


> Originally Posted by *Pingis*
> 
> congrats on the new card Reigns. Its pretty awesome. Looks like I'm sending mine back to newegg. The best they could do was offer me a $30 gift card for selling me a used 690 for full price. What a bunch of BS. Hope you enjoy your card. Time for me to track down another one.


Easiest way to get an EVGA GTX 690 is at evga.com - F5 right around 6PM Central daily!! Worked for me


----------



## icecpu

I got a 690 from Amazon without even trying - I was looking for an EVGA FTW, then ran across the EVGA GTX 690, ordered it, and it's shipping soon. Delivery estimate is 6/6.
Can't wait


----------



## ReignsOfPower

Quote:


> Originally Posted by *Qu1ckset*
> 
> Sorry, this might be a little off topic, but how do you like the Corsair K60? Is it beating the Razer BlackWidow?


Personally I think it's a fantastic keyboard. I don't like the clickiness of the BlackWidow (personal preference). I suggest simply going into a store and giving them a type. This one felt super solid to me, and I really like the brushed aluminium. The only drawback is that not every single key is mechanical.

Anyone have an ETA on when the EK GTX 690 waterblock will be available?


----------



## Stateless

Quote:


> Originally Posted by *ReignsOfPower*
> 
> Personally I think its a fantastic keyboard. I don't like the clickyness of the black Widow (personal). I suggest simply going into the store and giving them a type. This to me felt super solid and I really like brushed aluminiuim. The only drawback is that the not every single key is mechanical.
> Anyone have any ETA's on when the EK GTX 690 Waterblock will be available?


Last I heard it should have been out already... that is why I went with the Koolance one, which is pretty nice looking. I have not tested it yet, since this is my first water loop and I am still working on it.


----------



## ReignsOfPower

Quote:


> Originally Posted by *Stateless*
> 
> Last I heard it should have been out already... that is why I went with the Koolance one, which is pretty nice looking. I have not tested it yet, since this is my first water loop and I am still working on it.


Nice. Does it come with a backplate? PICS!!


----------



## Arizonian

Quote:


> Originally Posted by *Arizonian*
> 
> I didn't know that the VRAM on the GTX 690's are higher density "Samsung" modules?
> 
> 
> 
> 
> 
> 
> 
> 
> Wonder if that meeting with Samsung earlier this year had something to do with the VRAM on the 690?
> I thought all VRAM on the Kepler series line came from TMSC?


Confirmed - it is Samsung VRAM. NVIDIA GeForce GTX 690 Review: Ultra Expensive, Ultra Rare, Ultra Fast

Quote:


> It would appear that no one has told NVIDIA's engineers that 7GHz is supposed to be impossible, and as a result they've gone and done the unthinkable. Some of this is certainly down to the luck of the draw, but it doesn't change the fact that our GTX 690 passed every last stability test we could throw at it at 7GHz. And what makes this particularly interesting is the difference between the GTX 680 and the GTX 690 - both are equipped with 6GHz GDDR5 RAM, but while the GTX 680 is equipped with Hynix *the GTX 690 is equipped with Samsung*. Perhaps the key to all of this is the Samsung RAM?


----------



## Shadowness

Interesting.

Still waiting for the EVGA watercooled 690. Time is running short, so they'd better release it in the next 1-1.5 months.


----------



## icecpu

My EVGA GTX 690 is coming soon - can't wait for Tuesday. Please add me to the club.


----------



## Arizonian

So far a stable 24/7 overclock: +14% core, +6% memory.

Bandwidth *204.0* GB/s
Texture Fillrate *133.0* GTexel/s
Pixel Fillrate *33.2* GPixel/s

Core *1039* Memory *1594* Boost *1144*

GPU #1 *1162* Max Core Boost
GPU #2 *1188* Max Core Boost
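Those figures actually check out against the 690's per-GPU spec sheet. A quick sketch (the 128 TMUs, 32 ROPs and 256-bit GDDR5 bus per GPU are assumed from NVIDIA's public specs, not from this thread):

```python
# Derive the quoted bandwidth and fillrate figures from the overclocked
# core (1039 MHz) and memory (1594 MHz) clocks. Per-GPU unit counts are
# assumed from the GTX 690 spec sheet.
TMUS, ROPS, BUS_BITS = 128, 32, 256

def bandwidth_gbps(mem_mhz):
    # GDDR5 transfers 4x per clock; bus width in bytes is bits / 8
    return mem_mhz * 4 * (BUS_BITS // 8) / 1000

def texel_fill_gtps(core_mhz):
    return core_mhz * TMUS / 1000

def pixel_fill_gpps(core_mhz):
    return core_mhz * ROPS / 1000

print(round(bandwidth_gbps(1594), 1))   # 204.0 GB/s
print(round(texel_fill_gtps(1039), 1))  # 133.0 GTexel/s
print(round(pixel_fill_gpps(1039), 1))  # 33.2 GPixel/s
```

All three match the numbers above, which suggests the monitoring tool is just computing them from the base clocks, not from the boosted 1162/1188 MHz the GPUs actually reach.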



Spoiler: Warning: Spoiler!


----------



## alienware

I've been using the Asus 690 for around 15 days now (will provide proof later on). Anyone else notice that their GPU Tweak utility is totally ****? With it installed and running, my screen would randomly go all red... resulting in a system restart. Is there any way I can flash my 690 to the EVGA BIOS and use the EVGA Precision tool? Or does the EVGA tool work on all manufacturers' cards?


----------



## bnj2

Yeah, GPU Tweak is pure crap - I uninstalled it after 2 days.
You can use MSI Afterburner or EVGA Precision X with your Asus card with no problem.


----------



## Arizonian

Quote:


> Originally Posted by *alienware*
> 
> ive been using the asus 690 (will provide proof later on) for around 15 days now. Anyone else notice that their gpu tweak utility is totally ****? With me when i had it installed and running, my screen would randomly go all red...resulting in a system restart....is there any way i can flash my 690 to evga bios and use the evga precision tool? or would the evga tool work on all manufacturer cards?


I don't see why you couldn't use EVGA Precision X. I know you can run MSI Afterburner, and both are RivaTuner-based programs.

It wouldn't harm you to try Precision or even Afterburner. I've had no problems with EVGA Precision X on an EVGA card.

As for your screen going all red, it may not be down to your GPU utility at all. There's no way to confirm until you swap in another GPU utility and see if that fixes it.

First stop your current GPU Tweak from launching when the computer boots, then install the new program and set it to start with the system.

As for flashing the BIOS... I wouldn't flash your Asus BIOS, as it voids your warranty. I don't know of any of our members posting thus far who have flashed their BIOS either. You'd be the first brave soul to try. Not worth it IMO.

The RSoD (affectionately, 'Red Screen of Death') bug is affecting some setups, cropping up in different scenarios that trigger it. It first showed up on X79 boards running background monitoring software, then spread to other CPU chipsets as well.

Rolling back to a previous driver, not using GPU-Z (not confirmed), or turning off background monitoring software has sometimes fixed this bug, and only for some.

Most recent thread on OCN: GTX 680 RSOD fix?.

I'd give another program a try, see what happens, and go from there first.

----------



## V3teran

Quote:


> Originally Posted by *Stateless*
> 
> The Koolance Water Block looks pretty good once it is installed. I will get some pics of it up as soon as I can. Doing my 1st Water Build and taking it slow right now. Got the Waterblock on the 690 done, installed the top Rad, installed the CPU Block and of course gutted the system earlier to be ready for the new water stuff that came in earlier today.
> I have to say, it was a bit of a pain to get all the thermal tape on everything that required it. There were 10 small chips, all the memory chips, the sli chip and then the 2 GPU's used thermal paste. What i found interesting is that the Koolance instructions has you putting thermal tape on all the stuff except the GPU's, but when I opened and removed the original shroud it looks like nvidia used thermal past on the small 10 chips, the sl chip and the GPU's.
> I had done some reading and read that it does not hurt to put a dab of thermal paste on the chips and then add the thermal tape on top and another small dab to help with better heat transfer. The one thing I did learn being that I never done this before was that using thermal tape, you should cut pieces slightly smaller than the chip. Once you tighten down the screws for the block it compresses the tape and some of it pushes out, kind of like when you put too much peanut butter on bread and you put the other piece of bread you have a little PB oooze out the sides (lol).
> I should be able to put some pics up in a few hours...just taking a break and watching the NBA game for a little bit.


I look forward to it. See how much you can push each card on its own, not the two together, because together they will both run at whatever the weaker one can manage. I'm hoping in general they can hit around 1350-1400MHz for 24/7 use.


----------



## Vinnce

A wild GTX690 appears!



----------



## dboythagr8

Please add me as an owner!



Got this and my Dell U3011 today (currently using it). It is HUGGEEE. Right now I still have the MSI Lightning Xtreme 580s in my system. Going to test them out before installing the 690.


----------



## JeepsRcool

Can I join the club?








It is going in here - well, it's already in!


----------



## Cheesemaster

seems we have a lil 690gtx competition going on...


----------



## Cheesemaster

http://www.overclock.net/t/1260697/gtx-690-quad-sli-3dmark-11-extreme-preset-scores


----------



## jcde7ago

Hey guys, I've been away due to some personal stuff, but I will be scrubbing through the thread and picking up with adding people to the club. I apologize, but life got ridiculously hectic for me the past 2 weeks.

Anyway, glad to see there's a lot going on and that a lot of people are getting some 690's now!!!


----------



## Shadowness

Quote:


> Originally Posted by *Cheesemaster*
> 
> seems we have a lil 690gtx competition going on...


Cheesemaster,

I have a question about your i7-3960X. How well does it overclock, and how does the Extreme CPU feel in general compared to high-end or mid-high-end parts? I got this Extreme CPU because I just wanted to own the best out there. Since I am not building my new computer yet, though, I am just soaking up any information I can find on the internet. I am jealous even though I actually own the CPU, simply because I don't use it yet


----------



## Stateless

Quote:


> Originally Posted by *jcde7ago*
> 
> Hey guys, i've been away due to some personal stuff, but i will be scrubbing through the thread and picking up with adding people to the club. I apologize but life got ridiculously hectic the past 2 weeks for me.
> Anyway, glad to see there's a lot going on and that a lot of people are getting some 690's now!!!


Welcome back. I know you have been curious about a 690 on water, well tomorrow (Wednesday) I get my new Case Labs Case and will be setting up my water setup to include a Koolance Water Block for the 690. Actually, the block is already on, been doing some general work on other stuff until the case arrives tomorrow. If all goes well (my first water set up) I should have some impressions late in the evening of a 690 on water.


----------



## Kimo

Quote:


> Originally Posted by *jcde7ago*
> 
> Hey guys, i've been away due to some personal stuff, but i will be scrubbing through the thread and picking up with adding people to the club. I apologize but life got ridiculously hectic the past 2 weeks for me.
> Anyway, glad to see there's a lot going on and that a lot of people are getting some 690's now!!!


I just received my GTX 690 today. I can't install it until Thursday or so (finals week). How's the GTX 690 treating your monitor? We technically have the same monitor. I plan to play BF3, Skyrim, Max Payne 3, etc.


----------



## Cheesemaster

Quote:


> Originally Posted by *Shadowness*
> 
> Cheesemaster,
> i'd have a question about your i7-3960X. How well does it overclock, and how does the extreme CPU feel in general compared to high end or mid high end products ? I got this extreme CPU because i wanted to own, i just wanted to have the best, extreme, out there. Since i am not building my new computer yet though, i am just sucking any awesome information i can find on the internet. I am jealous even though i actually own the CPU simply because i dont use it yet


The highest I've scored in Cinebench is 13.90, and that was at a 4.8GHz setting. I owned an 875K i7 and a 750 i5, and I love this Extreme 3960. I can overclock to 5.0GHz, but I didn't like the temps. I ordered a Swiftech 320 HD and am gonna shoot for 5.0GHz 24/7; I am confident it will run a little more than that, though. Bigger water cooling is on the way, and then I can really play with it. In short, I love this thing!


----------



## xoleras

My latest 3dmark11 score with a 690:

http://3dmark.com/3dm11/3594537










+160 / +550 offsets, 135% power target, 24/7 stable. Needless to say, I am happy with the card.


----------



## Shadowness

Quote:


> Originally Posted by *Cheesemaster*
> 
> The highest I had scored in cinebench is 13.90 and that was 4.8ghz setting.. I owned an 875k i7 and a 750 i5 I love this extreme 3960.. I can overclock to 5.0ghz but I didnt like the temps.. I ordered an swifttech 320 HD and am gonna shoot for 5.0ghz 24/7. I am confidant it will run a lil more than that though. Bigger water cooling on the way then I can really play with it.. In short I love this thing!


Share the results when you get your WC system. I am planning on buying something similar from Swiftech and putting this CPU under water.

Also, I am really looking forward to 3D Vision, god damn. It's been like 6 months now and I need to get it back. I need to play my favorite movies in 3D, I need games in 3D...


----------



## Kimo

How do I enable SLI on a single GTX 690? Only one of its GPUs is active, and I'm getting 60 fps in Diablo 3. :/

Sorry for the noob question. After all, I am a noob; this is my first built computer.


----------



## Imprezala

You are getting 60 fps because of V-sync; turn it off and see what happens.
Also, you enable SLI in the NVIDIA Control Panel.


----------



## Kimo

Quote:


> Originally Posted by *Imprezala*
> 
> you are getting 60fps because of vsync, turn it off and see what happens.
> also you enable sli in the control panel


Thank you! + rep


----------



## Arizonian

Quote:


> Originally Posted by *xoleras*
> 
> My latest 3dmark11 score with a 690:
> http://3dmark.com/3dm11/3594537
> 
> 
> 
> 
> 
> 
> 
> 
> +160 / 550 offsets, 135%, 24/7 stable, Needless to say I am happy with the card


That's a very nice score on that GTX 690.









Hey Xoleras I didn't know you upgraded to a GTX 690. Congrats.









I like this card myself for many reasons beyond the performance level it offers; the value is in the many subtle differences that make this card great as a dual-GPU.

Welcome aboard and show that pic sitting in your rig.


----------



## icecpu

It's too hard to keep the GTX 690 under 75C;
both GPUs hover at around 80C when stress testing and gaming.


----------



## dboythagr8

Quote:


> Originally Posted by *icecpu*
> 
> Too hard to keep GTX 690 under 75C
> both GPU hovering at around 80C when stress testing and gaming.


Change those top 2 front fans in your case to exhaust if you haven't already, or at least the red one pointing towards the 690. Heat exhausts out there as well as out the rear of the 690; if that fan is an intake, you are blowing the hot air right back onto the card.


----------



## xoleras

Quote:


> Originally Posted by *Arizonian*
> 
> That's a very nice score on that GTX 690.
> 
> 
> 
> 
> 
> 
> 
> 
> Hey Xoleras I didn't know you upgraded to a GTX 690. Congrats.
> 
> 
> 
> 
> 
> 
> 
> 
> I like this card myself for many reasons more than it's performance level it offers but the value in the many subtle differences that makes this card great for a dual GPU.
> Welcome aboard and show that pic sitting in your rig.


Yeah, I love the card to death. I was very impressed from the get-go at how quiet and small it is while being ridiculously fast. My last go with NVIDIA was a pair of Lightning 580s, which I liked a lot, but I think this is probably my favorite GPU purchase in a very long time.

I'm hoping EVGA releases their HC2 block for it soon; I wanna get them on water. My card OCs really well as is, but I'm anxious to see how much water improves that.


----------



## Marcsrx

Add me to the list!



And for good measure!


----------



## mironccr345

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *xoleras*
> 
> My latest 3dmark11 score with a 690:
> http://3dmark.com/3dm11/3594537
> 
> 
> 
> 
> 
> 
> 
> 
> +160 / 550 offsets, 135%, 24/7 stable, Needless to say I am happy with the card





That is a nice score!


----------



## Sujeto 1

Quote:


> Originally Posted by *Marcsrx*
> 
> Add me to the list!
> And for good measure!


Oh, that's the Schrödinger's cat; I wonder if it's dead or alive.

Cancelled my order; I think it's too much video card for me.


----------



## Psychonaut

Hey guys, just a quick question that I didn't think was worth a new thread:

Which GTX 690's are reference models? I'm planning on throwing a waterblock on it.

Thanks.


----------



## bnj2

All of them.

All GTX 690s have reference-design PCBs.


----------



## Psychonaut

Excellent. Thank you!


----------



## mwayne5




----------



## burningrave101

EVGA GTX 690's in stock at EVGA.com:

http://www.evga.com/products/moreInfo.asp?pn=04G-P4-2690-KR&family=GeForce%20600%20Series%20Family&sw=

http://www.evga.com/products/moreInfo.asp?pn=04G-P4-2692-KR&family=GeForce%20600%20Series%20Family&sw=


----------



## Schwuar

Hey all, I have recently built a new rig (should be in my sig) and went for 3 x 24" monitors. I have them set up fine with my 7970, but I have noticed screen tearing in Diablo 3 (haven't checked other games yet). I was wondering about getting two 680s or a 690 instead, or else waiting until the newer cards are released next year. Or do 680s etc. still screen tear?


----------



## Kimo

Quote:


> Originally Posted by *burningrave101*
> 
> EVGA GTX 690's in stock at EVGA.com:
> http://www.evga.com/products/moreInfo.asp?pn=04G-P4-2690-KR&family=GeForce%20600%20Series%20Family&sw=
> http://www.evga.com/products/moreInfo.asp?pn=04G-P4-2692-KR&family=GeForce%20600%20Series%20Family&sw=


SC still up

48 mins O.O


----------



## ReignsOfPower

Dear EK.

Hurry up and release a waterblock for this card.

And a backplate.

In fact, bundle the two so I don't have to chase them up individually.

Regards,
Impatient Koolance gtx 690 waterblock design disliker.


----------



## shiloh

Quote:


> Originally Posted by *ReignsOfPower*
> 
> Dear EK.
> Hurry up any release a waterblock for this card.
> And a backplate.
> In fact, bundle the two so I dont have to chase them up individually.
> Regards,
> Impatient Koolance gtx 690 waterblock design disliker.


I could not wait any longer and ordered the Koolance block yesterday. The 690 has been sitting in my test bench (with a very old Q9650) since I got it, and I need to get this card into my main rig ASAP. I would have preferred the look of an EK block, but since I saw their new CSQ design, which I don't really like, I settled for the Koolance. The block should be here by Monday.

EDIT: typos


----------



## Shadowness

Quote:


> Originally Posted by *ReignsOfPower*
> 
> Dear EK.
> Hurry up any release a waterblock for this card.
> And a backplate.
> In fact, bundle the two so I dont have to chase them up individually.
> Regards,
> Impatient Koolance gtx 690 waterblock design disliker.


/signed.


----------



## Romin

Just ordered one, but I was really disappointed to find out a $1k card doesn't come with a backplate!


----------



## bnj2

And why would it need one?


----------



## PowerK

Quote:


> Originally Posted by *Romin*
> 
> Just ordered one, But really disappointed when found out a $1k card didn't have a backplate!


Are you serious? A backplate would be a bad idea for this card, as hot air gets trapped between the PCB and the backplate, which results in higher temps.


----------



## Arizonian

Quote:


> Originally Posted by *PowerK*
> 
> Are you serious? Backplate would be a bad idea for this card as hot air is trapped between PCB and backplate. This results in higher temp.


On the contrary. I've had a backplate on a 580 for over a year and one on my current 680, and it didn't raise or lower temps; I tested both times.

It's purely for aesthetics and protection.


----------



## Romin

Quote:


> Originally Posted by *PowerK*
> 
> Are you serious? Backplate would be a bad idea for this card as hot air is trapped between PCB and backplate. This results in higher temp.


Yes I am! Even if it did change the temp, I don't think it would be more than 1-2C!
Quote:


> Originally Posted by *Arizonian*
> 
> On the contrary. I've had a back plate on the 580 for over a year and one on a current 680 and it didn't raise or lower temps. I tested specs both times.
> It's purely aesthetic and protection only.


^^^This is so true !


----------



## Vinnce

Hello, here is a picture.


----------



## Arizonian

Quote:


> Originally Posted by *Vinnce*
> 
> Hello, here is a picture.
> 
> 
> Spoiler: Warning: Spoiler!


Looks clean Vinnce.







jcde7ago will add you to our distinguished list of club members.


----------



## burningrave101

EVGA Signature in stock at Newegg:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814130790


----------



## xoleras

Quote:


> Originally Posted by *emett*
> 
> My last card was a gtx 590, max bf3 temps were 70c & 68c.
> Your telling me the 690 is a hotter card?
> There is something wrong if your running a new card playing a game and it's throttling. I.e. bad airflow.


You can feel free to check every review on the internet, where the 690 hits 80C even at stock settings.

http://www.hardwareheaven.com/reviews/1489/pg19/nvidia-geforce-gtx-690-dual-gpu-graphics-card-review-power-temps-noise.html

That reviewer hit 85C at stock; I guess he's doing something wrong too.

Now back to the topic at hand, yes water will improve 690 overclocking.


----------



## emett

lol, what are you getting upset for? I'm not trolling you.
You're saying the 690 runs so much hotter than a 680, and then you post a chart showing the opposite. Then you tell me I have nfi?

That chart with 680 SLI @ 80C means nothing, and 89C for the 590.
I think this whole drama is caused because you haven't realised I'm talking about temps with a custom fan curve, same as the guy running the air and water temps. Might be time for your bed, mate.

Hey, if you wanna spend a few hundred on a GPU water loop, by all means do it; I just can't see the point when it's so easy to stay under 70C on air.


----------



## xoleras

Manual fan is effective, but NOT AS effective on the 690 as on the 680: there is one fan cooling two chips. I'm running a manual 95%, and it improves temps over auto for sure, but it still gets into the 80s at load in DX11 Crysis 2.

Now this is where you tell me I'm clueless and don't know how to properly cool my system. FYI, I had 680s prior to the 690, and they were much cooler with a manual fan; with the 680 I could stay below 70C most of the time. Manual fan is not as effective on the 690, and you will still get into the high 70s / low 80s in very demanding games with V-sync turned off.

Anyway, water will help prevent the 70C throttle, and more importantly, water is much better for VRM cooling, which is extremely important for OC'ing. Is water cooling cost effective? No, but it will definitely help max OCs: even if you completely ignore GPU temps, vapor-chamber cooling is inferior for VRM cooling.
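Since custom fan curves keep coming up in this back-and-forth: a curve in Afterburner/Precision-style tools is just a piecewise-linear map from GPU temperature to fan duty. A toy sketch in Python, with made-up breakpoints for illustration (not anyone's actual curve from this thread):

```python
# Piecewise-linear fan curve: list of (temp C, fan %) breakpoints,
# sorted by temperature. Values below/above the ends are clamped.
CURVE = [(30, 30), (50, 45), (65, 70), (75, 95)]

def fan_percent(temp_c, curve=CURVE):
    """Return fan duty (%) for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between adjacent breakpoints
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(57.5))  # halfway between 50C/45% and 65C/70% -> 57.5
```

Aggressive curves like this are why an air-cooled card can stay out of the 70C throttle zone at the cost of fan noise.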


----------



## emett

Quote:


> Originally Posted by *xoleras*
> 
> Now this is where you tell me i'm clueless and don't know how to properly cool my system.


Didn't happen. Go to the gym, or something I think you need to blow off some steam.

Quote:


> Originally Posted by *emett*
> 
> Yeah but I know he said he has good airflow but I don't think that's the case. I know I'm comparing sli 680's to a 690 but the hottest my top card gets is 60c and the bottom one 4c under that.
> Nowhere near the 70c throttle. Guess it just depends on how good your airflow is, as to wether its worth water cooling the 600 series.


The "he" I referred to was not you, and if you read carefully you'd see I mentioned a little disclaimer. I'll point it out to you.
Quote:


> Originally Posted by *emett*
> 
> I know I'm comparing sli 680's to a 690 but...


Next thing you're going off your ******* rocker...


----------



## icecpu

Is my card bad? Does it need an RMA?
I just noticed that Heaven at its highest settings causes the 690 to artifact at around 100-107 seconds.
Can someone test their GTX 680 or GTX 690 at the highest Heaven settings and keep a careful eye on the 100-107 second mark for slight artifacting? That's what happens to my card every time, same place, same point in the Heaven benchmark.
But the card runs just fine otherwise: no errors, no crashes.

Use these settings and look carefully from 100 to 107 seconds (the dragon) for artifacting:

Direct X 11
Tessellation: extreme
Shaders: high
Anisotropy: 16x
Anti-Aliasing: 8x


----------



## bnj2

Sorry for the offtopic, but I couldn't help it...


----------



## xoleras

I wish I could rep you twice


----------



## sotorious

Card looks great, but man, that $1k price tag. If only GPUs didn't get outdated so fast I would've been in the club! I'm waiting for the 680 4GB.


----------



## Stateless

Quote:


> Originally Posted by *V3teran*
> 
> Nice and thanks,what was your max oc on the core on air before your pc failed stability wise?Was it 120?
> Im looking forward to how much you can push this on the core on water and be stable in heaven/3d mark etc


My max on Air was +130 Offset, 135% Power Target and +200 on Memory. When I get time later today, I will try 140 with no memory to see how that works out.


----------



## Arizonian

Quote:


> Originally Posted by *Stateless*
> 
> My max on Air was +130 Offset, 135% Power Target and +200 on Memory. When I get time later today, I will try 140 with no memory to see how that works out.


Those offsets are almost identical to mine. Just had to point that out, as I'm watching others' overclocks and their results in correlation to each other.

+135% PT / +130 Core / +183 Memory = 1045 Core 1594 Memory 1150 Boost

For myself, a +183 memory offset gives me an even 204.0 GB/s bandwidth reading. I found I can do a +200 memory offset, but it's borderline stable: OK for benching, but not for many hours of BF3 multiplayer. My +200 is a 100 MHz memory overclock. I backed the memory off just a tad and it looks like I've found the sweet spot, for my card anyway: gaming and benching stable, and able to keep the +130 core offset.
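For anyone wanting to check those numbers: assuming the 690's stock memory clock of 1502 MHz (6008 MHz effective GDDR5) on a 256-bit bus per GPU, and that the tool applies half the listed memory offset to the real clock (consistent with "+200 is a 100 MHz overclock" above), the 204.0 figure falls out like this. This is a back-of-the-envelope sketch, not a monitoring tool's output:

```python
def mem_bandwidth_gbps(offset_mhz, base_mhz=1502.0, bus_bits=256):
    """Per-GPU GDDR5 memory bandwidth (GB/s) for a given memory offset.

    Assumes the offset slider applies half its value to the actual
    memory clock, and that GDDR5 transfers 4x per clock.
    """
    actual = base_mhz + offset_mhz / 2.0        # real memory clock, MHz
    effective = actual * 4.0                    # GDDR5 effective rate
    return effective * (bus_bits / 8) / 1000.0  # MB/s -> GB/s

print(round(mem_bandwidth_gbps(183), 1))  # ~204.0, matching the +183 reading
print(round(mem_bandwidth_gbps(0), 1))    # stock, ~192.3
```

The +183 offset landing on an even 204.0 is just this arithmetic: 1502 + 91.5 = 1593.5 MHz (the "1594 Memory" figure), times 4, times 32 bytes per transfer.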

I think I've met my core clock limit as well at +130. The funny thing was that I backed the core off to +124, to give me an even 133.0 texture fillrate, and ended up with a better 3DMark11 Performance score. I can't see why, unless being just a tad more stable gave a better core boost as the end result.

My card stacks up to some 680 SLI / 7970 CrossFire setups on average, give or take, from what I'm seeing, which is pretty darn good for a dual-GPU card to accomplish.

You've got some great benches, I've noticed, Stateless. CheeseMaster as well. Very nice.


----------



## xoleras

Quote:


> Originally Posted by *Stateless*
> 
> My max on Air was +130 Offset, 135% Power Target and +200 on Memory. When I get time later today, I will try 140 with no memory to see how that works out.


I've been able to get some absolutely unreal memory overclocks on my card; I think it has to do with the Samsung memory used (compared to the cheaper Hynix used on most 680s).

I'm not completely sure whether upping the memory OC substantially increases temps; so far I'm stable at +500 memory. I need to fiddle with it to see if backing off on memory helps with core overclocks.


----------



## icecpu

Help, anyone?
Is my card bad? Does it need an RMA?
I just noticed that Heaven at its highest settings causes the 690 to artifact at around 100-107 seconds.
Can someone test their GTX 680 or GTX 690 at the highest Heaven settings and keep a careful eye on the 100-107 second mark for a quick flash of artifacting? That's what happens to my card every time, same place, same point in the Heaven benchmark.
But the card runs just fine otherwise: no errors, no crashes.

Use these settings and look carefully from 100 to 107 seconds (panning around the dragon statue) for artifacting in the upper right corner:

Direct X 11
Tessellation: extreme
Shaders: high
Anisotropy: 16x
Anti-Aliasing: 8x


----------



## JeepsRcool

Just ran Heaven and didn't notice anything in the spot you mention. What drivers are you running? Have you tried reinstalling Heaven? Does it happen at stock clocks?


----------



## Stateless

Quote:


> Originally Posted by *Arizonian*
> 
> Those offsets are almost identical to mine. Just had to point that out as I'm watching others over clocks and their results in correlation to each other.
> +135% PT / +130 Core / +183 Memory = 1045 Core 1594 Memory 1150 Boost
> For myself, +183 Memory offset gives me an even 204.0 Bandwidth. I found I can do +200 Memory offset but it's border line stable, where I'd be OK benching but not many hours of BF3 multiplayer. My +200 is a 100 MHz Memory over clock. I backed off memory just a tad and looks like I've found the memory sweet spot, for my card anyway, gaming and benching stable and able to keep the Core +130 offset.
> I think I've met my Core clock limit as well at +130. Funny thing was that I backed off the Core to +124 to give me an even 133.0 Texture Fillrate and ended up with a better score in a performance 3DMark11 score. I can't see why unless being just a tad more stable gave a better Core boost as end result.
> My card stacks up to some 680's SLI / 7970's Crossfire performance from what I'm seeing on average give or take which is pretty darn good for a dual GPU to accomplish.
> You've got some great benches I've noticed Stateless. CheeseMaster as well. Very nice.


Well, I only have one 690 now. I sold the 2nd one off because I really didn't need it at the resolution I play at (1080p). I did test the two cards individually, and neither broke past a +130 offset on its own. In SLI they would top out together at +120, which is typical when you SLI two cards: your max overclock is usually a little less than with the individual cards. I am noticing more and more people around the +130 mark on average, with some being lucky enough to break that barrier.

For a single PCB with two GPUs, that is pretty dang incredible. At a +120 offset and +200 on memory I have been able to beat out some 2x 680s in SLI, though not all, as it really is about being lucky and getting two GPUs that clock very well. But the nice thing about the 690 is that you can get close to or match 2x 680s in SLI while using less power, producing less heat, and taking only one slot.


----------



## homestyle

Quote:


> Originally Posted by *icecpu*
> 
> Help , Anyone ??
> Is my card bad , need RMA ???????
> I Just notice highest setting Heaven cause 690 to artifacting at around 100 sec - 107 sec.
> can someone test their GTX 680 or GTX 690 at highest Heaven setting and carefully keep an eye at 100 sec to 107 sec, they have little quick flash artifacting . That's what happen to my card everytime , same place , same time at Heaven benchmark.
> But the card runs just fine, no errors , no crash
> Use this setting and look carefully at 100 sec to 107 sec (panning around the dragon statue) for artifacting (upper right corner)
> Direct X 11
> Tessellation: extreme
> Shaders: high
> Anisotropy: 16x
> Anti-Aliasing: 8x


it happens to everyone.


----------



## icecpu

Quote:


> Originally Posted by *JeepsRcool*
> 
> Just ran Heaven and didn't notice any thing in the spot you mention. What drivers are you running? Have you tried reinstalling Heaven? Does it happen at stock clocks?


I'm using the latest driver, and it happens at stock clocks. Some mentioned that it's a glitch in the Heaven program, so I feel relieved; I thought I'd gotten a bad card.
homestyle said it happens to everyone.


----------



## homestyle

This is where it happens. It still happens to me in Heaven 3.0 too. If you play with the settings, you can get it to disappear; I believe it was gone with AA at 4x.


----------



## JeepsRcool

OK, I see it now too; I wasn't looking in the corner, I was looking at the dragon.
I see you are at +130 core. I get artifacting and crashes at +130 and above, but I am stable with +350 mem.
Never going over 72C on either core. What are your 3DMark11 scores?


----------



## icecpu

Quote:


> Originally Posted by *JeepsRcool*
> 
> Ok i see it now too, I wasnt looking in the corner I was looking at the dragon.
> I see you are at 130 core, I get artifacting, and crashes at 130+, but I am stable with 350+mem.
> Never going over 72deg on either core, what are your 3dmark11 scores?


here is my 3dmark11 scores
http://3dmark.com/3dm11/3628636


----------



## Arizonian

Quote:


> Originally Posted by *icecpu*
> 
> Help , Anyone ??
> Is my card bad , need RMA ???????
> I Just notice highest setting Heaven cause 690 to artifacting at around 100 sec - 107 sec.
> can someone test their GTX 680 or GTX 690 at highest Heaven setting and carefully keep an eye at 100 sec to 107 sec, they have little quick flash artifacting . That's what happen to my card everytime , same place , same time at Heaven benchmark.
> But the card runs just fine, no errors , no crash
> Use this setting and look carefully at 100 sec to 107 sec (panning around the dragon statue) for artifacting (upper right corner)
> Direct X 11
> Tessellation: extreme
> Shaders: high
> Anisotropy: 16x
> Anti-Aliasing: 8x


I know which part you're talking about, and the line that flickers on the top right of the dragon statue. I've seen it when I've had too high of an overclock. It happens at the same time and place each time. No worries.
Quote:


> Originally Posted by *JeepsRcool*
> 
> Just ran Heaven and didn't notice any thing in the spot you mention. What drivers are you running? Have you tried reinstalling Heaven? Does it happen at stock clocks?
> 
> 
> Spoiler: Warning: Spoiler!


Nice score at 1920x1080.







If you like run the benchmark with below specs and see what you get. If the score is above 108 it would be worth adding yourself to the list.









*OFFICIAL Top 30 Heaven Benchmark 3.0 Scores*

Render: Direct X 11
Mode: *1680x1050* fullscreen
Shaders: high
Textures: high
Filter: trilinear
Anisotropy: *16x*
Occlusion: enabled
Refraction: enabled
Volumetric: enabled
Anti-Aliasing: *8x*
Tessellation: *extreme*

Nice to see a lot of new GTX 690 owners as of late.


----------



## Vinnce

My GPU cores always run at different temps. Is that normal?

E.G. [email protected], [email protected]


----------



## Arizonian

Quote:


> Originally Posted by *Vinnce*
> 
> My GPU's cores are always on different temps is that normal?
> E.G. [email protected], [email protected]


Yes, it's normal. My GPU #1 & #2 show different temps as well.

Same base clocks, but the dynamic OC max boost varies too: GPU #1's max boost core is 1175 MHz and GPU #2's is 1200 MHz.


----------



## Marcsrx

This card rocks so hard!


----------



## Marcsrx

BTW how do you show your "Rig" in your signature?


----------



## Arizonian

Quote:


> Originally Posted by *Marcsrx*
> 
> BTW how do you show your "Rig" in your signature?


Go to your 'personal profile' then scroll down to 'create rig'.

Once you've created your rig, you then click on your rig in your profile, it will bring you to a page that will show your specifications on rig page. There you will also have a link to 'add pictures'.


----------



## Lu(ky

I just got done picking mine up at Newegg will call about an hour ago. Slapped her inside my tech station and hooked her up to a Silverstone 600W PSU until I get my SeaSonic Platinum 860W tomorrow.








Just ran a quick Heaven bench at stock settings and scored around 94 fps with a score around 2380; not bad. But my poor little 600W PSU was begging me to stop the bench, I think...


----------



## Marcsrx

Quote:


> Originally Posted by *Arizonian*
> 
> Go to your 'personal profile' then scroll down to 'create rig'.
> Once you've created your rig, you then click on your rig in your profile, it will bring you to a page that will show your specifications on rig page. There you will also have a link to 'add pictures'.


I did, it's all there... not sure why it's not in my sig.


----------



## xoleras

Not sure if that's a good score or not; I never messed with Heaven much.


----------



## emett

Quote:


> Originally Posted by *Marcsrx*
> 
> I did, its all there... Not sure why its not in my sig


You add it to your sig yourself mate.


----------



## Arizonian

Quote:


> Originally Posted by *xoleras*
> 
> 
> 
> 
> 
> 
> 
> 
> ]
> not sure if thats a good score or not, never messed with heaven much.


That's a good score. Our highest single-690 result is 112 FPS, I believe, scored by our very own OP; mine was 109.3 FPS. It will get you on the benchmark contest thread if you submit that score.









However, they want the F12 screenshot that shows the cobbled street in the bench for a valid submission.


----------



## Marcsrx

Quote:


> Originally Posted by *emett*
> 
> You add it to your sig yourself mate.


got it, thanx!


----------



## Cheesemaster

So, I don't get some of you guys' temps. I've got a quad setup, and I run Crysis 2 ultra hi-res DX11 (3D) on triple monitors with a 135% power target, +100MHz core offset, and +200MHz memory offset, and I don't see over 60 Celsius. What gives?


----------



## Cheesemaster

Quote:


> Originally Posted by *Cheesemaster*
> 
> So, I dont get some of your guys temps.. I got a quad setup... I run crysis 2 ultra hi-res dx11 (3d) on triple monitors with 135 power target 100mhz core offset 200mhz mem off set, and dont see over 60 Celsius... what gives?


Here are some temps: quad 690 GTX, triple 120Hz monitors, ultra settings, online BF3. 135% power target, +100MHz core offset, +200MHz memory offset, fan at 95%.


----------



## emett

Quote:


> Originally Posted by *Cheesemaster*
> 
> So, I dont get some of your guys temps.. I got a quad setup... I run crysis 2 ultra hi-res dx11 (3d) on triple monitors with 135 power target 100mhz core offset 200mhz mem off set, and dont see over 60 Celsius... what gives?


I think a bit of it comes down to the season and weather/ambient temps, and of course case airflow and custom fan curves. Having 120mm fans everywhere doesn't necessarily mean your case will have good temps.
Did you read, a couple of pages back, xoleras getting all defensive over his high GTX 690 temps?


----------



## xoleras

Don't bring my name up.

Like you said, *ambient temps make a big difference.* My ambient temps are on the high side. Trust me, I know what I'm doing in terms of cooling, so there's no need to bring up stuff about airflow and manual fan speeds. I've been assembling PCs since 1996, so I know a thing or three.


----------



## emett

What are your ambient temps Cheesemaster?


----------



## Cheesemaster

Quote:


> Originally Posted by *emett*
> 
> What are your ambient temps Cheesemaster?


~27-28c


----------



## Arizonian

Ok - I'll ask here first with club members.

I've been watching idle temps on my GTX 690 and noticed that with Google Chrome open I idle at 46C just browsing OCN. With IE9 or Firefox it's back down to 32C idle; spark up Google Chrome again and it's back up to 46C. I do have the GTX 690 overclocked, so my idle is higher than when I ran it stock.

At any rate, I made sure hardware acceleration is turned off. Anyone know what's causing Chrome to take my GPU for a little power walk just to surf?


----------



## emett

Quote:


> Originally Posted by *Cheesemaster*
> 
> ~27-28c


Pretty warm, then; makes me wonder where xoleras lives.


----------



## Arizonian

EVGA GTX 690 Signature on sale right now at EVGA if anyone browsing is interested in availability.









It's been available for 8 hrs 19 mins now.


----------



## max883

I removed this plate from the front of the card and the heat and noise went down; heat blows out of the card better.

NB: don't try this if you don't have a case that lies flat. My case is an HTPC case, lying down!

Power limit: 135
Core: +110
Mem: +500

Max temp in Battlefield 3: GPU 1: 70C, GPU 2: 69C. V-sync off, 99% load on both GPUs.


----------



## bnj2

WTB high flow bracket!


----------



## xoleras

Quote:


> Originally Posted by *Arizonian*
> 
> Ok - I'll ask here first with club members.
> I've been watching idle temps on my GTX 690 and noticed when in Google Chrome I run at 46C idle just browsing OCN. I run IE9 or Firefox and it's back down to 32C idle. Spark up Google Chrome again and back up to 46C idle. I do have the GTX 690 over clocked so my idle is higher than when I ran it stock.
> In any rate, I made sure hardware acceleration is turned off. Anyone know what's causing Chrome to take my GPU for a little power walk to surf?


I have noticed this behaviour as well, arizonian.

Chrome idle clocks are around 950ish, while in complete idle mode the clock speed is 350. I'm really not sure how to fix it; a 950 GPU clock is overkill for Chrome, generally. I've noticed with the MSI Afterburner monitor that the GPU clock jumps immediately when I open Chrome.
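To put numbers on the pattern described above, here's a toy Python sketch that buckets core-clock readings (as logged by Afterburner or Precision X) into states. The 350 and 915-950 MHz values come from this thread; the thresholds are my own guesses, not NVIDIA-documented P-state boundaries:

```python
# Classify a Kepler core-clock sample: ~350 MHz when truly idle,
# ~915-950 MHz once Chrome bumps the card out of its idle state,
# and full 3D clocks under game/benchmark load.
def clock_state(core_mhz):
    if core_mhz <= 400:
        return "idle"
    if core_mhz < 1000:
        return "browser-boosted"
    return "3d-load"

samples = [350, 350, 915, 950, 1150]  # e.g. desktop -> Chrome opened -> game
print([clock_state(c) for c in samples])
```

Scanning a log this way makes it obvious whether the card ever drops back to its low-power state while the browser is open.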


----------



## Cheesemaster

Quote:


> Originally Posted by *xoleras*
> 
> I have noticed this behaviour as well, arizonian.
> Chrome idle clocks are around 950ish. In complete idle mode , the clock speed is 350. I'm really not sure how to fix it, 950 gpu clock is overkill for chrome generally....i've noticed with the msi afterburner monitor, that the gpu clock immediately jumps when I open chrome.


I do not have this problem; I'm in Chrome right now and my idle is 29C.


----------



## xoleras

Quote:


> Originally Posted by *Cheesemaster*
> 
> i do not have this prob im in chrome right now my idle is 29c


If you have Precision X open, you will notice that the GPU clock speed goes from 350 to 915 when you open Chrome.

Hence most people will see an increase in temps, especially those without AC running 24/7. I'm a hawk about keeping an eye on GPU clock speeds, and Chrome causes the card to come up out of its idle state.


----------



## Arizonian

Yeah, I can be surfing with Internet Explorer or Firefox at 32C, up to 36C if I'm streaming etc., and I can even have 4 to 5 tabs open, no problem.

The moment I spark up Chrome it immediately climbs to 46C, not even streaming. It will sit at this temperature regardless, even if I don't surf or I'm sitting on one page that doesn't have anything going on.

It's as if Chrome likes to take my GPU for a constant power walk. I haven't looked at my core clocks, but I'm sure they have gone up, hence the higher temperatures. It's not really a big deal; I'm just curious as to why it's happening.

Again, I have hardware acceleration off. Weird that it's not happening to everyone but it is to others, huh?


----------



## bnj2

I'm using Afterburner to monitor it, and I also see a jump in GPU power to roughly 35%, with temps going as high as 50C.
It happens with Firefox 13.0 as well; I suspect Flash is the culprit.


----------



## Kyouki

So I just got home and thought I would test this, because I run Chrome almost 24/7 while I'm on the computer with Pandora running, and GPU 1 sits at 37-40C with GPU 2 at 32C; I thought that was just normal for my system. I have now been running my system for about 30 minutes with both GPUs sitting around 29-30C, lower than I normally see them.







So I open Chrome and BOOM







GPU 1 at 36/37/38C and GPU 2 steady at 32C, so yes, it does raise the temps. I have now been streaming Pandora for about 30 minutes and the temps stay the same. This is interesting. I have not yet tested any other browsers, but honestly I'm not too worried about it.


----------



## Arizonian

Quote:


> Originally Posted by *Kyouki*
> 
> So I just got home and I thought I would test this! I run Chrome almost 24/7 while I'm on the computer with Pandora playing; GPU 1 sits at 37C to 40C and GPU 2 at 32C, and I thought that was just normal for my system. I have now been running my system for about 30 minutes without Chrome, and both GPUs are sitting around 29/30C, lower than I normally see them.
> 
> 
> 
> 
> 
> 
> 
> So I open Chrome and BOOM
> 
> 
> 
> 
> 
> 
> 
> GPU 1 at 36/37/38C and GPU 2 steady at 32C, so yes, it does raise the temps. I have now been streaming Pandora for about 30 minutes and the temps stay the same. This is interesting. I have not yet tested any other browsers, but honestly I'm not too worried about it.


I agree as well that there's nothing to worry about. It is just an observation I made, and I was curious whether others were experiencing the same. Hardly anything to lose sleep over.









It may very well be Flash Player working differently in Chrome than it does in Internet Explorer.


----------



## TheeNinjaCat

Does anyone know why the heck it beeps annoyingly loudly under full load? I've been playing BF3 on Ultra and it runs perfectly, and when I crank up the fan speed the temperature stays at around 60 degrees, but it emits this really loud, annoying beep that stops me from playing on Ultra. When I look in Afterburner, it shows the card is basically at full load, but aren't graphics cards supposed to run at full load? If there is nothing wrong, then why does it beep? Does anyone know how to turn this off?


----------



## Kyouki

Quote:


> Originally Posted by *Arizonian*
> 
> I agree as well that there's nothing to worry about. It is just an observation I made, and I was curious whether others were experiencing the same. Hardly anything to lose sleep over.
> 
> 
> 
> 
> 
> 
> 
> 
> It may very well be Flash Player working differently in Chrome than it does in Internet Explorer.


Yeah, Chrome is written to load graphics/JavaScript code differently to make it a faster browser. Flash may be included in that.


----------



## Marcsrx

I finally got my air/fan setup correct. I play BF3 with V-Sync off, all settings maxed, at ~120 FPS, and I never go above 70-73C.


----------



## Qu1ckset

Hey guys, thanks to Stateless, I just got my new EVGA GTX 690 4GB.








So now I can be added to the club!
On a side note, I remember reading somewhere that there was a tool for the EVGA GTX 690 to control the lights in the green "GEFORCE GTX" logo on the side of the card. I searched EVGA's website and found no such tool.














































And for fun, here is the old card I was using before my GTX 690 arrived, lol.


----------



## Arizonian

@Qu1ckset - Congrats on the card.









http://www.evga.com/forums/tm.aspx?m=1606940

EVGA GTX 690 LED Controller Thread


----------



## Stateless

Quote:


> Originally Posted by *Qu1ckset*
> 
> Hey guys, thanks to Stateless, I just got my new EVGA GTX 690 4GB.
> 
> 
> 
> 
> 
> 
> 
> 
> So now I can be added to the club!
> On a side note, I remember reading somewhere that there was a tool for the EVGA GTX 690 to control the lights in the green "GEFORCE GTX" logo on the side of the card. I searched EVGA's website and found no such tool.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And for fun, here is the old card I was using before my GTX 690 arrived, lol.


Cool, man. Glad you got it and got it installed. A suggestion due to your case: the two fans on the bottom, I would set those as exhaust, since hot air is going to blow out of both ends of the card. It will help with cooling overall, since those bottom fans are traditionally pulling air in; as it is now, that cool air will push the hot air back into the card. Just a thought!


----------



## Qu1ckset

Quote:


> Originally Posted by *Stateless*
> 
> Cool, man. Glad you got it and got it installed. A suggestion due to your case: the two fans on the bottom, I would set those as exhaust, since hot air is going to blow out of both ends of the card. It will help with cooling overall, since those bottom fans are traditionally pulling air in; as it is now, that cool air will push the hot air back into the card. Just a thought!


The difference in temps is very minor. This is the second card of this style I've owned, and it doesn't matter; I need the bottom fans to run the way they're intended, because all my HDDs and my SSD are behind the motherboard tray and those 180mm fans are the only thing feeding them airflow!


----------



## juanP

my dual cards


----------



## Qu1ckset

Man, I am shocked at how much cooler this GTX 690 runs compared to the HD 6990. I'm happy I switched, plus the performance is amazing!
With my HD 6990, with fans on manual @ 60%, the GPUs were hitting 78-83 degrees in BF3;
the GTX 690 with fans on auto hits 57-61 degrees in BF3.


----------



## Arizonian

Quote:


> Originally Posted by *Qu1ckset*
> 
> Man, I am shocked at how much cooler this GTX 690 runs compared to the HD 6990. I'm happy I switched, plus the performance is amazing!
> With my HD 6990, with fans on manual @ 60%, the GPUs were hitting 78-83 degrees in BF3;
> the GTX 690 with fans on auto hits 57-61 degrees in BF3.


Kepler is a complete ground-up redesign. It's not Fermi.

Great work. It's one of the most solidly built GPUs ever; Nvidia outdid themselves with their best dual-GPU card yet. It scales very close (within 6% or less) to the performance of two 680s in SLI.

My favorite new feature is frame metering, which syncs both GPUs when rendering frames. It has no more micro-stutter than one card does.









My single 580 ran 10C hotter than my dual-GPU 690 does.


----------



## dboythagr8

Quote:


> Originally Posted by *Arizonian*
> 
> Yeah, I can be surfing with Internet Explorer or Firefox at 32C up to 36C, depending on whether I'm streaming, etc. I can even have up to 4 to 5 tabs open, no problem.
> The moment I spark up Chrome, it immediately climbs to 46C, not even streaming. It will rev at this temperature regardless, even if I don't surf or I'm sitting on one page that doesn't have anything going on.
> It's as if Chrome likes to take my GPU for a constant power walk. I haven't looked at my core clocks, but I'm sure they have gone up, hence the higher temperatures. Not really a big deal; I'm just curious as to why it's happening.
> Again, I have hardware acceleration off. Weird that it's not happening to everyone but it is to others, huh?


This happens to me. Mine gets up to around 50C in Chrome; in IE it idles around 37C.


----------



## tonyjones

Quote:


> Originally Posted by *juanP*
> 
> my dual cards


How do they scale? Any benchmark specs?


----------



## icecpu

Quote:


> Originally Posted by *juanP*
> 
> my dual cards


Which case is that?


----------



## bnj2

Quote:


> Originally Posted by *icecpu*
> 
> Which case is that?


I think it is a Fractal Design R3


----------



## Imprezala

Looks like the Fractal Design Arc Midi.


----------



## Arizonian

The Fractal Design is missing power atm.


----------



## juanP

It's a Fractal Arc Midi case.

I have not connected the power supplies yet; I'm still waiting on the AP15 Gentle Typhoon fans for my H100.

This rig will be overkill for me. I made an impulse buy and ended up with two cards; one card is way more than enough for me.

I will post pics with specs once the build is complete.

I took the fan filters out of the top and front for better airflow. It was easy to take them out by removing the grill tabs.

I also took out the HDD cages, as I am only using one SSD for now.


----------



## shiloh

Hello

I installed the Koolance waterblock yesterday. For those who plan to install a waterblock on your card: you will need a Torx T7 screwdriver to remove the stock cooler. Now I'm waiting on my new tubing to arrive so I can install the card in my main rig. The tubing should be here tomorrow.

Here is a pic of the card with the Koolance block. It looks better than I expected!


----------



## Kyouki

I wonder whether, with some modification to the stock shroud, you could fit it back on after you remove the fan and gut it out; make openings for the two fittings that come out for the water hookup. Hmmm... would be kinda cool.


----------



## TassM

Bad luck for me. My card hasn't even seen its first boot-up and it has to be RMA'd. MicroCenter has no idea when new cards are coming in. I guess I will sit with my thumb up my butt waiting.


----------



## Stateless

Quote:


> Originally Posted by *TassM*
> 
> Bad luck for me. My card hasn't even seen its first boot-up and it has to be RMA'd. MicroCenter has no idea when new cards are coming in. I guess I will sit with my thumb up my butt waiting.


Why? Did it come physically damaged?


----------



## TassM

Two ports are bad; they won't register in Device Manager. I had to take it back to have it checked, and they confirmed a bad card. They let me keep it until another one comes in; I told them I need it to set up benchmarks and whatnot. So far it seems to work, but it's spotty, to say the least. By spotty I mean sometimes it won't boot up and sometimes it boots fine. I can't OC the system as it stands, so I need to wait.

BTW, how is the sag on the card with the waterblock installed?


----------



## tonyjones

Can't wait for my next build; it's on hold right now. I just purchased a 15" MacBook Pro (Retina display)... I know, I know.


----------



## error-id10t

Quote:


> Originally Posted by *shiloh*
> 
> Hello
> I installed the Koolance waterblock yesterday. For those who plan to install a waterblock on your card: you will need a Torx T7 screwdriver to remove the stock cooler. Now I'm waiting on my new tubing to arrive so I can install the card in my main rig. The tubing should be here tomorrow.
> Here is a pic of the card with the Koolance block. It looks better than I expected!


Looks nice, I think. Let us know if it lets you OC any better; the temps will obviously be better.


----------



## ReignsOfPower

Koolance block looks nice. I still want an EK one. Anyone heard any news regarding its release date?


----------



## Stateless

Quote:


> Originally Posted by *TassM*
> 
> Two ports are bad; they won't register in Device Manager. I had to take it back to have it checked, and they confirmed a bad card. They let me keep it until another one comes in; I told them I need it to set up benchmarks and whatnot. So far it seems to work, but it's spotty, to say the least. By spotty I mean sometimes it won't boot up and sometimes it boots fine. I can't OC the system as it stands, so I need to wait.
> BTW, how is the sag on the card with the waterblock installed?


There is no sag on the card with the block. The PCB is pretty thick and when screwed in, it does not sag at all.


----------



## juanP

Got my wiring done yesterday. These cards are really quiet when powered on. I gotta run some tests today.


----------



## PeteJM

So, people kept constantly asking me, "Why don't you use a 690?"









Since I am using 3 monitors, I pulled the trigger on one. No regrets whatsoever so far. Can't wait till it comes in and I'm able to use it.









I'm going to be using my 7970 in my workstation so I actually have some power when I am working on the build.


----------



## Forrester

just ordered mine from evga


----------



## Marcsrx

Modified my case a little more. I removed the HDD bay, as it was blocking airflow to my 690. Pain in the arse, as it was riveted into the case; thankfully they were cheap rivets. I also mounted my fan on the exterior of the back side of the mobo tray, as it was too thick and was pressing on the mobo when installed inside the case. I switched all my fans so that they are all blowing into the case except the two in the front of the case. All the changes lowered my 690 temps and CPU temp a few degrees:thumb:


----------



## Marcsrx

Quote:


> Originally Posted by *Forrester*
> 
> just ordered mine from evga


Congrats! Post pictures when done!!


----------



## Arizonian

Quote:


> Originally Posted by *Marcsrx*
> 
> Modified my case a little more. I removed the HDD bay, as it was blocking airflow to my 690. Pain in the arse, as it was riveted into the case; thankfully they were cheap rivets. I also mounted my fan on the exterior of the back side of the mobo tray, as it was too thick and was pressing on the mobo when installed inside the case. I switched all my fans so that they are all blowing into the case except the two in the front of the case. All the changes lowered my 690 temps and CPU temp a few degrees:thumb:
> 
> 


Good work with your rig, Marcsrx.









@ Forrester - Congrats. It's a sweet, well-built card. You're going to love it. Definitely post back with pics once the 690 is in its new home.


----------



## shiloh

I was finally able to install my 690 into my loop yesterday. I didn't have time to bench and test overclock potential under water; I will probably bench it over the weekend and report back. Here are a few shots of the 690 with the Koolance block installed in my rig:


----------



## xoleras

Ah man you WC guys are killin' me!

Let us know what your OC results are







I'm going water in the next month! Already bought some of my parts for it.


----------



## Cheesemaster

Fried my motherboard and grabbed a new one. Now I am running 5.0GHz 24/7 on my CPU, and I can overclock my dual GTX 690s a little more. I am now at a +140MHz core offset and +400MHz memory offset, yay!


----------



## Arizonian

Quote:


> Originally Posted by *Cheesemaster*
> 
> Fried my motherboard and grabbed a new one. Now I am running 5.0GHz 24/7 on my CPU, and I can overclock my dual GTX 690s a little more. I am now at a +140MHz core offset and +400MHz memory offset, yay!


Sweet, bud. What motherboard?

Are you saying your previous motherboard kept you from overclocking as high? How did you fry it?

Would love to see your new benches in the Heaven and 3DMark threads.









You should fill out your profile / edit your systems so we can see your rig's components.


----------



## Cheesemaster

Quote:


> Originally Posted by *Arizonian*
> 
> Sweet, bud. What motherboard?
> Are you saying your previous motherboard kept you from overclocking as high? How did you fry it?
> Would love to see your new benches in the Heaven and 3DMark threads.
> 
> 
> 
> 
> 
> 
> 
> 
> You should fill out your profile / edit your systems so we can see your rig's components.


It was a RIVE board... I had it since it first came out and beat it to all hell finding its limits and whatnot. Here are some pics of temps playing BF3 on Ultra in Surround.


----------



## Testier

How does the 690 OC compared with 680 SLI / 7970 CF?


----------



## Arizonian

Quote:


> Originally Posted by *Testier*
> 
> How does the 690 OC compared with 680 SLI / 7970 CF?


The 690 has the best scaling of any Nvidia dual-GPU card to date. In some games it's not as good and in others it's better; reviews put it within 6% of 680 SLI performance. Resolution is also a factor that can vary performance, yet it's still very close to SLI 680s and CrossFire 7970s.

Overclocking capability is luck and will vary between cards. For example, in the Heaven and 3DMark11 benchmark threads you'll see a 690 beating some 680s & 7970s, just like you'll see great-overclocking 670s & 7950s beating 680s & 7970s. Luck of the draw.

From the benchmarks I ran, I consider my 690 an average overclocker: not poor and not great.


----------



## Testier

Quote:


> Originally Posted by *Arizonian*
> 
> The 690 has the best scaling of any Nvidia dual-GPU card to date. In some games it's not as good and in others it's better; reviews put it within 6% of 680 SLI performance. Resolution is also a factor that can vary performance, yet it's still very close to SLI 680s and CrossFire 7970s.
> Overclocking capability is luck and will vary between cards. For example, in the Heaven and 3DMark11 benchmark threads you'll see a 690 beating some 680s & 7970s, just like you'll see great-overclocking 670s & 7950s beating 680s & 7970s. Luck of the draw.


Considering its price is almost the same as 2 x 680, what are its benefits besides bragging rights, the looks, power consumption, and using one PCIe slot?
I am assuming here that the 680s would OC better.


----------



## Arizonian

Quote:


> Originally Posted by *Testier*
> 
> Considering its price is almost the same as 2 x 680, what are its benefits besides bragging rights, the looks, power consumption, and using one PCIe slot?
> I am assuming here that the 680s would OC better.


Overclocking is a silicon lottery, really. If one of two cards is a poor overclocker, it limits the overclockability of both in SLI, and the same goes for the two GPUs on a single 690. Generally, the two singles will give better performance almost every time if you're looking for the best performance option.

The benefits you mentioned of a dual-GPU GTX 690, clarified:

1. *Less power consumption* than 680 SLI.

2. A single PCIe slot running both GPUs at *PCIe 3.0 x16* for greater bandwidth, which also lets you utilize other PCIe slots if needed.

3. *Better build quality.* Starting with the thick PCB. Five power phases per GPU versus four on reference 680s at the same $499 each, and high-density Samsung memory over Hynix.

_Look at the home page of this club thread and you'll see specifics on how overbuilt this card really is. One of the best-built GPUs to date, from the ground up._

4. *Fan noise* comparable to a single GTX 680; quieter than two in SLI on air.

5. *Vapor-chamber cooling* versus the heatsink cooling of reference 680s at the same $499 price x2.

6. *Runs cooler* than two in SLI. My single 680 in my second rig runs the same temps as my 690, at idle or full throttle.

_For some people this is the only option for their setup; it gives them the flexibility of using a single PCIe slot, and depending on the power supply they may not have enough for two in SLI but can get away with one dual-GPU card._

As for myself, I had never owned a dual-GPU card, and factoring in all these things, I thought it would be a great hands-on learning experience. I almost went SLI myself, but since I wasn't giving up that much performance versus SLI, I couldn't pass it up based on what I gained. To each his own.









If you're asking because you're thinking of going CrossFire with your 7970 or switching to a dual-GPU 690, you're best off going CrossFire, since you're halfway there.


----------



## Romin

I could not use my card! Even after reinstalling the drivers more than twice, Windows did not recognize the card! I had to swap back to my old 570 for now. Device Manager showed Error 43!

(Attachment: Error.png)


----------



## Testier

Quote:


> Originally Posted by *Arizonian*
> 
> Overclocking is a silicon lottery, really. If one of two cards is a poor overclocker, it limits the overclockability of both in SLI, and the same goes for the two GPUs on a single 690. Generally, the two singles will give better performance almost every time if you're looking for the best performance option.
> The benefits you mentioned of a dual-GPU GTX 690, clarified:
> 1. *Less power consumption* than 680 SLI.
> 2. A single PCIe slot running both GPUs at *PCIe 3.0 x16* for greater bandwidth, which also lets you utilize other PCIe slots if needed.
> 3. *Better build quality.* Starting with the thick PCB. Five power phases per GPU versus four on reference 680s at the same $499 each, and high-density Samsung memory over Hynix.
> _Look at the home page of this club thread and you'll see specifics on how overbuilt this card really is. One of the best-built GPUs to date, from the ground up._
> 4. *Fan noise* comparable to a single GTX 680; quieter than two in SLI on air.
> 5. *Vapor-chamber cooling* versus the heatsink cooling of reference 680s at the same $499 price x2.
> 6. *Runs cooler* than two in SLI. My single 680 in my second rig runs the same temps as my 690, at idle or full throttle.
> _For some people this is the only option for their setup; it gives them the flexibility of using a single PCIe slot, and depending on the power supply they may not have enough for two in SLI but can get away with one dual-GPU card._
> As for myself, I had never owned a dual-GPU card, and factoring in all these things, I thought it would be a great hands-on learning experience. I almost went SLI myself, but since I wasn't giving up that much performance versus SLI, I couldn't pass it up based on what I gained. To each his own.
> 
> If you're asking because you're thinking of going CrossFire with your 7970 or switching to a dual-GPU 690, you're best off going CrossFire, since you're halfway there.


I think I will be pushing it with 2 x 7970s... considering my PSU, OCs, etc. Too close for my taste.

Edit: you are right about how overbuilt this card is... It looks... AWESOME.


----------



## Arizonian

Quote:


> Originally Posted by *Testier*
> 
> I think I will be pushing it with 2 x 7970s... considering my PSU, OCs, etc. Too close for my taste.
> Edit: you are right about how overbuilt this card is... It looks... AWESOME.


Oh I forgot one other little thing highly worth noting:
Quote:


> *Frame rate Metering*
> Kepler introduces hardware based frame rate metering, a technology that helps to minimize stuttering. In SLI mode, two GPUs share the workload by operating on successive frames; one GPU works on the current frame while the other GPU works on the next frame. But because the workload of each frame is different, the two GPUs will complete their frames at different times. Sending the frames to the monitor at varying intervals can result in perceived stuttering.
> 
> The GeForce GTX 690 features a metering mechanism (similar to a traffic meter for a freeway entrance) to regulate the flow of frames. By monitoring and smoothing out any disparities in how frames are issued to the monitor, frame rates feel smoother and more consistent.


Quote:


> The GTX 690 uses hardware based frame metering for smooth, consistent frame rates across both GPUs. (In other words - say goodbye to micro-stuttering, which is *sometimes* perceived by *some* gamers in ALL multi-GPU setups, whether it's dual-GPU cards, or two/three/four single-GPUs in SLI).


It boils down to this: this dual-GPU card has the same amount of micro-stutter, if any, as a single GPU. It's truly smooth as hell, and I haven't experienced micro-stutter yet.
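
The effect of that metering is easy to see in a toy model. This sketch is purely illustrative (the function names, numbers, and pacing scheme are made up for the example; NVIDIA's actual hardware mechanism is not public): two GPUs alternating frames finish at uneven times, and holding each frame until a steady pacing slot flattens the frame-to-frame gaps:

```python
# Toy model of frame metering: uneven AFR completion times are presented
# at a steady cadence instead of the instant each GPU finishes.

def present_unmetered(completion_times):
    """Frames are displayed the moment they complete."""
    return list(completion_times)

def present_metered(completion_times, interval):
    """Hold each frame until the next steady pacing slot, whichever is later."""
    out, next_slot = [], 0.0
    for t in completion_times:
        show = max(t, next_slot)
        out.append(show)
        next_slot = show + interval
    return out

def frame_time_jitter(times):
    """Spread between the longest and shortest frame-to-frame gap."""
    gaps = [b - a for a, b in zip(times, times[1:])]
    return max(gaps) - min(gaps)

if __name__ == "__main__":
    # GPU A is fast, GPU B is slow: frames complete in bursts of two (ms).
    done = [10.0, 12.0, 26.0, 28.0, 42.0, 44.0]
    raw = present_unmetered(done)
    paced = present_metered(done, interval=16.0)  # ~60 Hz pacing
    print("unmetered jitter:", frame_time_jitter(raw))    # -> 12.0
    print("metered jitter:  ", frame_time_jitter(paced))  # -> 0.0
```

The unmetered gaps alternate 2 ms / 14 ms (the classic micro-stutter signature); metered, every gap is a uniform 16 ms.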


----------



## shiloh

I got some water-cooling results this morning for the people who have been asking for them. I played with the card for a couple of hours this morning, mainly in the Heaven benchmark.

I was able to match my best stock-cooler overclock, but I was not able to get beyond it. Anything past a +175 offset (or 1200MHz core) will crash (in Heaven at least; as I said, I didn't test other games/benchmarks). So, bottom line: water cooling will not give you a significantly higher OC. The stock card's OC capability is not limited by heat but, I think, by power delivery.
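
For reference, the arithmetic behind figures like "+175, or 1200MHz core" works out as below. A sketch only: it assumes the 690's stock 915 MHz base / 1019 MHz typical boost clocks and ignores the extra bins GPU Boost adds or removes with power and temperature headroom:

```python
# How a Precision X / Afterburner core offset maps onto the GTX 690's clocks.
# Stock clocks: 915 MHz base, 1019 MHz typical boost; an offset shifts both.
# Illustrative only: real boost clocks also move with power/thermal headroom.

BASE_MHZ = 915
BOOST_MHZ = 1019

def effective_clocks(offset_mhz: int) -> tuple[int, int]:
    """Return (base, typical boost) in MHz after applying a core offset."""
    return BASE_MHZ + offset_mhz, BOOST_MHZ + offset_mhz

if __name__ == "__main__":
    for offset in (130, 140, 175):
        base, boost = effective_clocks(offset)
        print(f"+{offset} MHz offset -> {base} MHz base, ~{boost} MHz boost")
    # +175 lands at ~1194 MHz typical boost, i.e. the "~1200 MHz core"
    # ceiling reported in this post.
```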

Here are the screenshots of my last and best run of the morning:


----------



## thestache

Quote:


> Originally Posted by *Stateless*
> 
> There is no sag on the card with the block. The PCB is pretty thick and when screwed in, it does not sag at all.


As soon as I pulled my waterblock out of the box, my first thought was: how the hell will the PCB manage with this thing, and how much sag will I get? But I'm glad it doesn't sag, because I was skeptical. Lol.

Also, is the verdict with OCing no gains other than the obvious temp and noise improvements, until voltage is unlocked, if it ever is?

Has anyone ordered the XSPC block too? They look really neat.


----------



## jam3s

I'm crying in a corner right now.

These 690's are gorgeous. I could never afford one, that's for sure.


----------



## Qu1ckset

Quote:


> Originally Posted by *jam3s*
> 
> I'm crying in a corner right now.
> These 690's are gorgeous. I could never afford one, that's for sure.


Thats were saving comes in handy!


----------



## tonyjones

Two words: Sperm Bank


----------



## Marcsrx

and or credit card!


----------



## Testier

Quote:


> Originally Posted by *Arizonian*
> 
> Oh I forgot one other little thing highly worth noting:
> Quote:
> 
> 
> 
> *Frame rate Metering*
> Kepler introduces hardware based frame rate metering, a technology that helps to minimize stuttering. In SLI mode, two GPUs share the workload by operating on successive frames; one GPU works on the current frame while the other GPU works on the next frame. But because the workload of each frame is different, the two GPUs will complete their frames at different times. Sending the frames to the monitor at varying intervals can result in perceived stuttering.
> The GeForce GTX 690 features a metering mechanism (similar to a traffic meter for a freeway entrance) to regulate the flow of frames. By monitoring and smoothing out any disparities in how frames are issued to the monitor, frame rates feel smoother and more consistent.
> 
> 
> 
> Quote:
> 
> 
> 
> The GTX 690 uses hardware based frame metering for smooth, consistent frame rates across both GPUs. (In other words - say goodbye to micro-stuttering, which is *sometimes* perceived by *some* gamers in ALL multi-GPU setups, whether it's dual-GPU cards, or two/three/four single-GPUs in SLI).
> 
> 
> It boils down to this: this dual-GPU card has the same amount of micro-stutter, if any, as a single GPU. It's truly smooth as hell, and I haven't experienced micro-stutter yet.

Even when the FPS is down to 30-ish, still no micro-stuttering? This really needs to be implemented on all dual-GPU cards...


----------



## Forrester

Does anyone know which 690 waterblock cools best? I've only seen the ones by Koolance and XSPC, and I'm unsure which one I should grab.


----------



## Stateless

Quote:


> Originally Posted by *shiloh*
> 
> I was finally able to install my 690 into my loop yesterday. I didn't have time to bench and test overclock potential under water; I will probably bench it over the weekend and report back. Here are a few shots of the 690 with the Koolance block installed in my rig:


Shiloh,

Looking at the last picture, it looks like you are missing 2 power pins in the second power connector. Just a heads up!


----------



## Stateless

Quote:


> Originally Posted by *thestache*
> 
> As soon as I pulled my waterblock out of the box, my first thought was: how the hell will the PCB manage with this thing, and how much sag will I get? But I'm glad it doesn't sag, because I was skeptical. Lol.
> Also, is the verdict with OCing no gains other than the obvious temp and noise improvements, until voltage is unlocked, if it ever is?
> Has anyone ordered the XSPC block too? They look really neat.


I have been too busy to do much work on the rig, but I always believed, and the user above who also has a waterblock has proven, that the limitation is not heat but the fact that you cannot add any more voltage to OC higher. If that were possible, these cards would own big time. I am at a +120 offset and don't crack 32C in Heaven. What water cooling will do is let you maximize the OC you could normally reach on air without worrying about the 70C downclock, because the card will never hit 70C under water. For me, my old case had terrific air cooling and I rarely hit 70C, but of course I had the card's fan almost at max and it gets a little noisy, so overall it was well worth going water for me.


----------



## Stateless

Quote:


> Originally Posted by *Forrester*
> 
> Does anyone know which 690 waterblock cools best? I've only seen the ones by Koolance and XSPC, and I'm unsure which one I should grab.


Not sure, but I can't believe this has not been posted yet: EVGA released their 690 waterblock. Here is the link:

http://www.evga.com/products/moreInfo.asp?pn=400-CU-G690-B1&family=Accessories - Hardware&sw=4



It sells for $169.99 and even has a cool little light on the top of the card. I personally have the Koolance block; it looks good, and my temps have never exceeded 32C so far.


----------



## Forrester

The EVGA Hydro Copper blocks have always been very restrictive.


----------



## Lu(ky

Not sure if anyone has posted this here yet, but there's another waterblock, the XSPC Razor GTX 690. All I can say is I really hope EK does not sell their GTX 690 FC block with circles on it...


----------



## Sujeto 1

Guys, what do you suggest I buy for my rig: a GTX 690, or a GTX 680 + a 512GB SSD?


----------



## Lu(ky

Quote:


> Originally Posted by *Sujeto 1*
> 
> Guys, what do you suggest I buy for my rig: a GTX 690, or a GTX 680 + a 512GB SSD?


If it were me, I would update your CPU, motherboard and memory first. Not sure how high a resolution your monitor goes? At 1920x1080 I would get a GTX 670 instead. Your current CPU will bog your GPU down, and your current mobo is only SATA II.

I am telling you, there are some really sweet deals out there on CPU/mobo upgrades, like at MicroCenter stores. I highly suggest a P67 up to a Z77 1155-socket board; they have some great deals on a 3570K and mobo for less than $250.00 there.
For the $1100 you are planning to spend on a GTX 690 you could get a whole new system with a GTX 670 and a Crucial 256GB SSD for that price. I am telling you, coming from an E8400, even OCed, to a new 1155 socket is a night-and-day difference, and going from a GTX 260 to a GTX 670 is huge. Go to AnandTech.com, head to their Bench section, and compare your GPU and CPU against the new ones.

Just my 2 cents...


----------



## Sujeto 1

Quote:


> Originally Posted by *Lu(ky*
> 
> If it were me, I would update your CPU, motherboard and memory first. Not sure how high a resolution your monitor goes? At 1920x1080 I would get a GTX 670 instead. Your current CPU will bog your GPU down, and your current mobo is only SATA II.
> I am telling you, there are some really sweet deals out there on CPU/mobo upgrades, like at MicroCenter stores. I highly suggest a P67 up to a Z77 1155-socket board; they have some great deals on a 3570K and mobo for less than $250.00 there.
> For the $1100 you are planning to spend on a GTX 690 you could get a whole new system with a GTX 670 and a Crucial 256GB SSD for that price. I am telling you, coming from an E8400, even OCed, to a new 1155 socket is a night-and-day difference, and going from a GTX 260 to a GTX 670 is huge. Go to AnandTech.com, head to their Bench section, and compare your GPU and CPU against the new ones.
> Just my 2 cents...


LOL, I'm sorry, I didn't explain myself. Of course I'm planning to buy another rig; I will give my current rig to my little bro. I'm planning to buy a 2500K + 16GB of RAM, and yes, the monitor stays a single 23-inch 1920x1080p. I have 1700 dollars to buy the rig, so if I buy the GTX 690 I won't get any SSD.







I already bought the case (Storm Enforcer), a 1TB hard drive, 16GB of Vengeance RAM, an H80 cooler, and a Corsair HX1050 power supply. Just the CPU and motherboard left.


----------



## Arizonian

Quote:


> Originally Posted by *Sujeto 1*
> 
> Guys, what do you suggest I buy for my rig: a GTX 690, or a GTX 680 + a 512GB SSD?


A 680 or 670 would be more than enough for a 60Hz 1920x1200 monitor, and an SSD will be a HUGE improvement to performance.

A GTX 690 is great for a single 120Hz monitor for 2D or 3D Vision gaming. On a 60Hz monitor, a 670 or 680 will easily keep that refresh rate maxed.


----------



## Sujeto 1

Quote:


> Originally Posted by *Arizonian*
> 
> A 680 or 670 would be more than enough for a 60Hz 1920x1200 monitor, and an SSD will be a HUGE improvement to performance.
> A GTX 690 is great for a single 120Hz monitor for 2D or 3D Vision gaming. On a 60Hz monitor, a 670 or 680 will easily keep that refresh rate maxed.


You are very right, it's just that I look at her and she's so sexy, and I have the money, so I said hey, let's do this. XD The world ends in 2013, so what the hell.


----------



## Arizonian

Quote:


> Originally Posted by *Sujeto 1*
> 
> You are very right, it's just that I look at her and she's so sexy, and I have the money, so I said hey, let's do this. XD The world ends in 2013, so what the hell.


I can't blame you because I couldn't resist. It's one of a kind.


----------



## Orc Warlord

I'm kind of in the same boat.

Sig rig is what I have, and I'm wondering whether to buy a 690 + 128gb SSD.

Just not sure if dumping $1000 on a graphics card is a good idea, considering how fast technology moves.

I want a card that can run BF3 on maxed out settings with at 60 fps across 3 monitors.


----------



## Stateless

I guess I was talking out of my ass when I stated here and on other forums that once you found your highest clock on AIR it would NOT matter if you went to H2O or not, because all H2O will do is run the SAME clocks but a lot cooler. Well, I was wrong. My highest clock on my 690 was +130 offset with 135% power on air, hitting close to 70C with fans at around 83-85%. When I tried to go +140 on the clock, heat would rise to around 74C and it would crash after 1 1/2 times through Unigine.

As it stands right now, I am through 7 loops of Unigine at +140 offset and going strong. Highest temp is 37C on GPU 1, 34C on GPU 2. So something unachievable on AIR is achievable on water! Who knew. Almost everyone was saying the same thing: that once you find your highest clock on AIR, H2O would not help it get higher because the crash is not due to heat but due to not enough voltage to sustain the OC. Guess everyone stating that was WRONG!

I am still in the testing phase; I generally don't consider it stable unless it can go through 10 loops, or about an hour or so, of Unigine. If it survives that, it generally will survive anything else I throw at it. So, if it passes another 10 min of Unigine, I am going to +150 to see what happens.


----------



## Arizonian

Quote:


> Originally Posted by *Stateless*
> 
> I guess I was talking out of my ass when I stated here and on other forums that once you found your highest clock on AIR it would NOT matter if you went to H2O or not, because all H2O will do is run the SAME clocks but a lot cooler. Well, I was wrong. My highest clock on my 690 was +130 offset with 135% power on air, hitting close to 70C with fans at around 83-85%. When I tried to go +140 on the clock, heat would rise to around 74C and it would crash after 1 1/2 times through Unigine.
> As it stands right now, I am through 7 loops of Unigine at +140 offset and going strong. Highest temp is 37C on GPU 1, 34C on GPU 2. So something unachievable on AIR is achievable on water! Who knew. Almost everyone was saying the same thing: that once you find your highest clock on AIR, H2O would not help it get higher because the crash is not due to heat but due to not enough voltage to sustain the OC. Guess everyone stating that was WRONG!
> I am still in the testing phase; I generally don't consider it stable unless it can go through 10 loops, or about an hour or so, of Unigine. If it survives that, it generally will survive anything else I throw at it. So, if it passes another 10 min of Unigine, I am going to +150 to see what happens.


Definitely keep us posted we are all ears.


----------



## Stateless

Quote:


> Originally Posted by *Arizonian*
> 
> Definitely keep us posted we are all ears.


Well, +140 is stable. Ran Unigine for an hour, played some Battlefield 3, and no issues. Tomorrow I will go to +150 to see what happens!!!


----------



## juanP

My scores with the CPU at 4.5 GHz on a 3770K and stock GPUs.


----------



## ceteris

Aquacomputer just posted their block as well!



http://shop.aquacomputer.de/product_info.php?products_id=2894

Of course, since I am in the US, it will take forever until it hits our local resellers, and now I am torn between getting this one or the Swiftech block. I have already been waiting a long time too.









Grats to all the new owners! I've had mine for about a month now and I'm still amazed and very happy every time I look at it through the side panel.


----------



## thestache

Quote:


> Originally Posted by *Stateless*
> 
> I have been too busy to do much work on the rig, but I always believed, and the user above who also has a waterblock has proven, that the limitation is not due to heat but to the fact that you cannot add any more voltage to OC higher. If that were possible, these cards would own big time. I am at +120 offset and don't crack 32C. What watercooling will do is let you maximize the OC you could normally do on air without worrying about the 70C downclock, because it will never hit 70C under water. For me, my old case had terrific air cooling and I rarely hit 70C, but of course I had the card's fan almost at max and it gets a little noisy, so overall for me it was well worth going water.


Alright, so there is still some headroom even with the voltage locked. Good to know. I'll have a go next week when the rest of my build arrives and see what I can get out of it.


----------



## thestache

Quote:


> Originally Posted by *ceteris*
> 
> Aquacomputer just posted their block as well!
> 
> http://shop.aquacomputer.de/product_info.php?products_id=2894
> Of course, since I am in the US, it will take forever for me until it hits our local resellers and now I am torn between getting this one or the Swiftech block. I have already been waiting a long time too
> 
> 
> 
> 
> 
> 
> 
> 
> Grats to all the new owners! I've had mine for about a month now and I'm still amazed and very happy everytime I look at it through the side panel


Probably the best looking block so far, shame about the colour.


----------



## Divineshadowx

Hey guys, so I finished my build yesterday, which includes GTX 690s in quad SLI. I'm having a problem though: during games like BF3, the GPUs will throttle themselves down to about 550 MHz, and sometimes even to the idle 324 MHz, and it reduces the FPS to 30-40, which is annoying when playing on a 120 Hz monitor. I'm using EVGA Precision with a "target" of 120 FPS, but it does not seem to do much, since the GPUs will randomly throttle themselves and create lag. Also, per the Precision monitor, the GPUs will almost NEVER go above 50% usage. Any ideas?


----------



## Arizonian

Quote:


> Originally Posted by *Divineshadowx*
> 
> Hey guys, so I finished my build yesterday which includes GTX 690's in quad sli. I'm having a problem tho, during games like BF3, the gpu's will throttle themselves to down to about 550mhz, and sometimes even the idle 324mhz, and it reduces the fps to 30-40, which is annoying when playing on a 120hz monitor. I'm using EVGA precision for the "target" 120 fps but it does not seem to do much since the gpu's will randomly throttle themselves and create lag. Also, with the precision monitor, the gpu's will almost NEVER go above 50% usage, any ideas?


Congrats Divineshadowx on your GTX 690 SLI set up. Looks great.







Also I see it's your first post on OCN....Welcome aboard as well.









Onto your issue. I'm going to go over the obvious first. Have you gone through the Nvidia Control Panel settings with a fine-tooth comb?

Make sure the 'Configure Multi-GPU, Surround, PhysX' setting has Multi-GPU enabled and 'maximize 3D performance' checked.

In the 'Manage 3D Settings' 'Global Settings' tab:

Multi-GPU rendering mode: NVIDIA recommended.
Multi-display/mixed-GPU acceleration: Single display performance mode (unless you have multiple monitors).
Power management mode: Prefer maximum performance.

_On a side note: When you find time a good thing to do is fill out your specs in Your Profile - Edit Systems section so other members can see your components in your signature when posting._


----------



## truestorybro545

Quote:


> Originally Posted by *Divineshadowx*
> 
> 
> Hey guys, so I finished my build yesterday which includes GTX 690's in quad sli. I'm having a problem tho, during games like BF3, the gpu's will throttle themselves to down to about 550mhz, and sometimes even the idle 324mhz, and it reduces the fps to 30-40, which is annoying when playing on a 120hz monitor. I'm using EVGA precision for the "target" 120 fps but it does not seem to do much since the gpu's will randomly throttle themselves and create lag. Also, with the precision monitor, the gpu's will almost NEVER go above 50% usage, any ideas?


I don't mean to intrude, guys (I can't afford this right now), but....


----------



## Divineshadowx

Well, I set the power mode to performance and it seems better, but only 100 FPS in Crysis 2, about 55-60% load on each of the 4 GPUs, and a 915 MHz clock speed. It's not reaching 120 FPS, but it's also not activating the boost clock; not sure what's wrong. Does the usage mean anything? If it is not reaching the targeted 120 FPS, doesn't that mean it should be using 100% power?


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Well, I set the power mode to performance and it seems better, but only 100 FPS in Crysis 2, about 55-60% load on each of the 4 GPUs, and a 915 MHz clock speed. It's not reaching 120 FPS, but it's also not activating the boost clock; not sure what's wrong. Does the usage mean anything? If it is not reaching the targeted 120 FPS, doesn't that mean it should be using 100% power?


What drivers are you using?

Whenever I've had low GPU usage problems it's almost always been driver related. Remove your 120 FPS target, let them run on their own, and see if that helps.
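For anyone chasing this kind of throttling, one way to confirm the clocks really are dipping mid-game is to log them with `nvidia-smi` and scan the log afterwards. A rough sketch (the query flags are standard `nvidia-smi` options; the 900 MHz cutoff is just an illustrative threshold, not anything from NVIDIA):

```python
# Sketch: flag throttling samples in an nvidia-smi log.
# Assumes the log was produced with something like:
#   nvidia-smi --query-gpu=clocks.gr,utilization.gpu,temperature.gpu \
#              --format=csv,noheader,nounits -l 1 > gpu.log
# The 900 MHz threshold below is an illustrative choice, not an NVIDIA spec.

def throttled_samples(log_lines, min_clock_mhz=900):
    """Return (clock, util, temp) tuples where the core clock dipped."""
    hits = []
    for line in log_lines:
        parts = [p.strip() for p in line.split(",")]
        if len(parts) != 3:
            continue  # skip malformed lines
        clock, util, temp = (int(p) for p in parts)
        if clock < min_clock_mhz:
            hits.append((clock, util, temp))
    return hits

sample = ["915, 98, 70", "550, 45, 71", "324, 20, 69", "1019, 99, 72"]
print(throttled_samples(sample))  # flags the 550 MHz and 324 MHz dips
```

If the dips line up with low utilization rather than high temperature, that points at a driver or CPU bottleneck rather than thermal throttling.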


----------



## juanP

Those AquaComputer blocks look nice


----------



## Kyouki

OK guys, you're going to roll on the ground cracking up, but I had to do this just to find out what kind of difference in temps I would get! This cost me nothing; it's all extra stuff I had lying around. When I ran Heaven with a mild overclock on my GTX 690 I was hitting 80C. This was due to the hot air dumping into the case. So I built a ramp/tunnel into an old CD/DVD shroud enclosure, then added a small fan I had lying around to push the hot air out the front.
Inside the back, showing the tunnel

Front, showing the fan

Added cover

Inside the computer

Outside the computer


So overall, testing with Heaven I dropped 10-15C: after 3 runs and a bench it stayed at 65-70C, where before it went up to the mid 80s. I am happy with the results. I upped my overclock and it still stayed at 68-70C, so this little mod helped! Since this was a test, I think I am going to build a better-quality one out of acrylic if I don't end up watercooling it. As for idle temps, they only changed by maybe 1-2C. What do you guys think? Hahah.


----------



## Divineshadowx

I'm using the newest drivers. During Crysis 2, sometimes the FPS goes down to about 90, but the GPUs are not close to 100% usage; I don't know if that means anything. Other times the FPS is in the range of 120-240 with V-sync off. But in Unigine Heaven, the load is about 80-90% for each of the 4 GPUs.

That's what I got. The CPU is at 4.5 GHz, RAM is at 2133 MHz @ 10-11-10-30-2T, and there's no overclock on the two 690s; I just set the power target to 130%, didn't know if that would do anything. Does that seem normal for 690s in quad SLI?


----------



## dboythagr8

You are using a 1080p monitor. In games, your GPU usage in quad SLI will rarely be 80-90% at that low a resolution.


----------



## Cheesemaster

Dude, what gives? I've got quad SLI too. Yeah, I think it's your OC on the cards. Here is my score...


----------



## xoleras

Quote:


> Originally Posted by *Cheesemaster*
> 
> Dude, what gives? I've got quad SLI too. Yeah, I think it's your OC on the cards. Here is my score...


You: 1680x1050

Him: 1920x1080

Also...please tell me you're not using quad sli on a single screen...


----------



## Kyouki

He's not, but to be placed in the rankings when you run Heaven, you have to run at that setting.


----------



## juanP

Cheesemaster, how are you getting such high scores? Is it because of your CPU, or did you OC your GPUs too?

I just ran my first Unigine test on stock GPUs and I get very low scores. Maybe I am missing some settings.


----------



## thestache

Quote:


> Originally Posted by *Kyouki*
> 
> OK guys, you're going to roll on the ground cracking up, but I had to do this just to find out what kind of difference in temps I would get! This cost me nothing; it's all extra stuff I had lying around. When I ran Heaven with a mild overclock on my GTX 690 I was hitting 80C. This was due to the hot air dumping into the case. So I built a ramp/tunnel into an old CD/DVD shroud enclosure, then added a small fan I had lying around to push the hot air out the front.
> Inside the back, showing the tunnel
> 
> Front, showing the fan
> 
> Added cover
> 
> Inside the computer
> 
> Outside the computer
> 
> So overall, testing with Heaven I dropped 10-15C: after 3 runs and a bench it stayed at 65-70C, where before it went up to the mid 80s. I am happy with the results. I upped my overclock and it still stayed at 68-70C, so this little mod helped! Since this was a test, I think I am going to build a better-quality one out of acrylic if I don't end up watercooling it. As for idle temps, they only changed by maybe 1-2C. What do you guys think? Hahah.


Great job, surprised it dropped your temps by thaaaat much.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Kyouki*
> 
> OK guys, you're going to roll on the ground cracking up, but I had to do this just to find out what kind of difference in temps I would get! This cost me nothing; it's all extra stuff I had lying around. When I ran Heaven with a mild overclock on my GTX 690 I was hitting 80C. This was due to the hot air dumping into the case. So I built a ramp/tunnel into an old CD/DVD shroud enclosure, then added a small fan I had lying around to push the hot air out the front.
> Inside the back, showing the tunnel
> SNIP**
> So overall, testing with Heaven I dropped 10-15C: after 3 runs and a bench it stayed at 65-70C, where before it went up to the mid 80s. I am happy with the results. I upped my overclock and it still stayed at 68-70C, so this little mod helped! Since this was a test, I think I am going to build a better-quality one out of acrylic if I don't end up watercooling it. As for idle temps, they only changed by maybe 1-2C. What do you guys think? Hahah.


You could get one of these:

http://www.frozencpu.com/products/5433/cpa-155/Lian_Li_Triple_525_Bay_120mm_Fan_Module_-_Black_CCFANMODULE.html?tl=g43c241s612&id=MIFT5wyx

I have one on my 800D and it does wonders. I've now mounted a 120mm rad on it.


----------



## thestache

Quote:


> Originally Posted by *juanP*
> 
> Cheesemaster, how are you getting such high scores? Is it because of your CPU, or did you OC your GPUs too?
> I just ran my first Unigine test on stock GPUs and I get very low scores. Maybe I am missing some settings.


Take a look at his resolution and settings.

Set yours to these: http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-3-0-scores


----------



## dboythagr8

Quote:


> Originally Posted by *juanP*
> 
> Cheesemaster, how are you getting such high scores? Is it because of your CPU, or did you OC your GPUs too?
> I just ran my first Unigine test on stock GPUs and I get very low scores. Maybe I am missing some settings.


Wait, you have to be running quad SLI, right? No way you're getting those scores with one 690...?


----------



## V3teran

Forget 150, go for 160; I reckon it will be stable. Whether I put mine under water depends on how they perform under water. Looking forward to your results!


----------



## Kyouki

Quote:


> Originally Posted by *Sexparty*
> 
> Great job, surprised it dropped your temps by thaaaat much.


Yea, I was surprised as well, but after thinking about it: my case setup had it dumping hot air right in front of the graphics card's intake fan on the SilverStone TJ10, which pushed the hot air right back into the GTX 690 in a loop, just feeding it hot air. This sections that fan off, only allowing it to bring in room-temperature air, and takes the hot air out completely.
Quote:


> Originally Posted by *MrTOOSHORT*
> 
> You could get one of these:
> http://www.frozencpu.com/products/5433/cpa-155/Lian_Li_Triple_525_Bay_120mm_Fan_Module_-_Black_CCFANMODULE.html?tl=g43c241s612&id=MIFT5wyx
> I have one on my 800D and it does wonders. I've now mounted a 120mm rad on it.


Yea, I have been looking at something like this as an option. Since I only had 3 bays left I was trying to keep it minimal, so I tried a single bay first, but I bet I would get slightly better results with something like that.


----------



## juanP

Quote:


> Originally Posted by *dboythagr8*
> 
> Wait, you have to be running quad SLI, right? No way you're getting those scores with one 690...?


Yes, it's a quad SLI setup.


----------



## Cheesemaster

Man, I feel stupid. I'll run it at 1920x1080 and see what I get.


----------



## Cheesemaster

Quote:


> Originally Posted by *xoleras*
> 
> You: 1680x1050
> Him: 1920x1080
> Also...please tell me you're not using quad sli on a single screen...


Here is my score @ 1920x1080


----------



## Cheesemaster

Quote:


> Originally Posted by *Sexparty*
> 
> Take a look at his resolution and settings.
> Set yours to these: http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-3-0-scores


Here it is at my max rez, 5760x1080...


----------



## Agenesis

Best stress test for a 690? I have OCCT/Heaven/Furmark/Kombustor and I'm not sure which one to use for stability testing. Just want to make sure I didn't get a $1k paperweight, lol.


----------



## juanP

Is there a good guide for Precision X somewhere that I can use for OCing the 690s?


----------



## Cheesemaster

Legion did a tutorial on YouTube.


----------



## MoMann

Hello, I plan on getting two 690s and I'm wondering what size PSU will be good?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *MoMann*
> 
> Hello, I plan on getting two 690s and I'm wondering what size PSU will be good?


Depending on the CPU, a good 850W unit will do the job just fine. If you're running a six-core SB-E CPU, go for a 1000W unit to be on the safe side.

I'd probably just get a 1000W unit anyway for a couple bucks more.
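As a rough sanity check on that advice, the arithmetic works out like this (the 300 W per GTX 690 is NVIDIA's TDP spec; the 130 W CPU figure, 100 W for the rest of the system, and 25% headroom are assumptions for illustration):

```python
# Back-of-envelope PSU sizing for two GTX 690s (300 W TDP each, per NVIDIA's
# spec) plus CPU and the rest of the system. The 130 W CPU figure, 100 W
# "everything else" allowance, and 25% headroom are rough assumptions.

def psu_estimate_w(gpu_tdp=300, gpus=2, cpu_tdp=130, rest=100, headroom=1.25):
    draw = gpu_tdp * gpus + cpu_tdp + rest  # worst-case sustained draw
    return draw, draw * headroom            # (raw draw, sized with headroom)

draw, sized = psu_estimate_w()
print(draw, round(sized))  # 830 W draw -> ~1038 W with headroom
```

That lands right around the 1000W recommendation above; a heavily overclocked SB-E chip would push the CPU term well past 130 W, which is why the safe-side advice makes sense.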


----------



## MoMann

Thanks!


----------



## MoMann

This?
http://www.ocztechnology.com/ocz-zx-series-850w-power-supply.html


----------



## Kyouki

Man, seeing more and more of these 690s in quad SLI is making me want to drop another $1k on a 2nd one! Hahah.


----------



## MoMann

Was that a good PSU?


----------



## Stateless

Quote:


> Originally Posted by *V3teran*
> 
> Forget 150, go for 160; I reckon it will be stable. Whether I put mine under water depends on how they perform under water. Looking forward to your results!


I hit +150 for over 2 hours of Unigine without a crash! So far I've been able to go an extra 20 over AIR. Going to go to +160 tonight after the basketball game to see what happens. This is completely blowing me away, though. I could not break +130 on either of my 690s (tested individually and in SLI) on AIR, and that was with temps staying at 70C or below... but on water, they are passing it with ease.


----------



## Stateless

Quote:


> Originally Posted by *Agenesis*
> 
> Best stress test for 690? I have OCCT/Heaven/Furmark/Kombustor and I'm not sure which one to use for stability testing. Just want to make sure I didn't get a 1k paperweight, lol.


I have found that Unigine works best, followed by a few runs of 3DMark 11 and playing some demanding games.

I generally will not consider it 100% stable unless it passes 1 hour of Unigine, 4-5 runs of 3DMark 11, Witcher 2, BF3 and Metro 2033.

But in all my testing, going back to my 460, if it can pass over an hour of Unigine, it usually can pass everything else.
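Stateless's pass/fail gate could be sketched as a simple checklist; the test names mirror his list, and the boolean results are placeholders you'd fill in by hand after each run:

```python
# Sketch of the stability regimen described above. Running Unigine, 3DMark 11,
# and the games is manual; the True/False flags are placeholder results.

REGIMEN = [
    ("Unigine Heaven, 1 hour loop", True),
    ("3DMark 11, runs 1-5", True),
    ("Witcher 2 session", True),
    ("BF3 session", True),
    ("Metro 2033 session", True),
]

def verdict(results):
    """Call the OC stable only if every item in the regimen passed."""
    failed = [name for name, passed in results if not passed]
    return "stable" if not failed else "unstable: " + ", ".join(failed)

print(verdict(REGIMEN))  # prints "stable"
```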


----------



## ReignsOfPower

Bang


Interesting inlet/outlet port arrangement. How would I connect a hose to the back end of the card if there is no protrusion?

For example:


http://www.ekwb.com/news/221/19/EK-introduces-EK-FC690-GTX-and-EK-FC670-GTX/


----------



## RayTrace77

Hi There,

Thought I'd join the bandwagon of 690 owners; just got mine 3 days ago. There haven't been many game benchmarks done so far, so it's quite hard to tell what's supposed to be OK or not.

Something seems a bit off though: I'm getting low frame rates in Metro 2033 (1920x1080, 4x MSAA, Very High, PhysX = around 40 fps), and when I check the reviews on different websites they show higher rates.

Also, here's my Unigine benchmark; it seems quite low for a single 690?

Any ideas?



Here are the settings I used in Unigine:



On a side note:
You guys having quad SLI setups is insane. 8 GB of VRAM, 8!

My Specs are :

Intel Core i7 980X @ 3.33Ghz @ 1.275v
Corsair H50 Hydro Cooler
Gigabyte GA-X58A-UD3R Motherboard
12GB Corsair RAM @ 1600 Mhz
1 x Intel 520 120GB SSD
1 x Western Digital 2TB Green Drive(The Games run off this drive)
1 x Corsair AX850 PSU


----------



## SimpleTech

Quote:


> Originally Posted by *ReignsOfPower*
> 
> Bang
> 
> Interesting inlet/outlet port arrangement. How would I connect a hose to the back end of the card if there is no protrusion?
> For example:
> 
> http://www.ekwb.com/news/221/19/EK-introduces-EK-FC690-GTX-and-EK-FC670-GTX/


They come with this:









If you want to do a multi-gpu setup, you are going to have to use their EK bridges.


----------



## emett

Quote:


> Originally Posted by *RayTrace77*
> 
> also here's my Unigine benchmark, seems quite low for a single 690 ?
> Any Ideas ?


Enable SLI in the nvidia control panel.


----------



## Cheesemaster

My system did not like the new beta drivers. I went back to the official drivers and ran my best 3DMark 11 yet! Starting to get it all dialed in...


----------



## Stateless

I finally found some time to discover my highest offset on my 690 under H2O. I was able to hit +160 on the core, and it's 100% stable!!! On air, even with heat not being an issue, the highest I was able to hit was +130 on the core; anything above that and it was crash city. So while not a huge jump, it does prove that going to water can improve the OC even at the same voltage. While +30 is not a lot, it is still pretty impressive. My boost is now at 1202 because of it. I will work on the memory next to see how far it can be pushed.

As with all Kepler-based cards, luck is a big factor in overclocking due to the voltage lock. I know some who are able to go higher than +160, and I can only imagine how far it could be pushed if they went H2O. Also worth noting that temps max out at 34C on GPU 1 and 31C on GPU 2!!!
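For reference, the offset arithmetic behind these numbers: the reference GTX 690 runs 915 MHz base / 1019 MHz boost, and a Precision X offset shifts both. A minimal sketch, noting that real Kepler boost usually lands a few bins higher than this floor when there's power and thermal headroom (which fits Stateless seeing 1202 rather than 1179 at +160):

```python
# Rough arithmetic for Kepler offset overclocking. Reference GTX 690 clocks
# are 915 MHz base / 1019 MHz boost; a core offset shifts both. Actual boost
# steps higher in bins depending on power/temperature headroom, so treat the
# boost figure as a floor, not an exact prediction.

BASE_MHZ, BOOST_MHZ = 915, 1019

def offset_clocks(offset_mhz):
    """Nominal (base, boost) clocks for a given core offset in MHz."""
    return BASE_MHZ + offset_mhz, BOOST_MHZ + offset_mhz

print(offset_clocks(160))  # (1075, 1179) -- actual boost may land higher
```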


----------



## RayTrace77

Hi Emett,

Thanks for the reply. I checked and am sure SLI is enabled; however, this is what my control panel looks like, and I've not been able to find any forum that says whether this is OK or not.

There are 3 DVI connections, and to get my monitor (1920x1080 @ 120 Hz) on GPU A I had to connect it to DVI connection 2, as DVI connection 1 is on GPU B.

Does the pic look correct, or is something wrong because PhysX is on GPU B?



I have the MSI 690, and when I check Afterburner for temps I see that both GPUs go up in temp during gaming, so I'm not sure what's wrong; I may try a clean install of Windows.

Thanks

Ray

UPDATE:

I went to the Control Panel's "Adjust image settings with preview" and changed it to "Let the 3D application decide".

Here's the score I got when I re-ran the Unigine benchmark. Does this look OK or is it still low?



Thanks

Ray


----------



## Stein357

Hey guys, quick question regarding cooling. My 650D currently has the front 200mm fan as an intake. The air from the 2nd GPU and it collide. Would the card be more effectively cooled if I turned the 200mm into an exhaust?


----------



## Arizonian

Quote:


> Originally Posted by *Stein357*
> 
> Hey guys, quick question regarding cooling. My 650D currently has the front 200mm fan as an intake. The air from the 2nd GPU and it collide. Would the card be more effectively cooled if I turned the 200mm into an exhaust?


I personally intake from the front and side (open, cooler air), with an extra fan on the bottom pushing up into the tower, and exhaust out the rear and top.

As long as it's directional. Usually there isn't that much cooler air available at the rear, and in theory heat rises. The best airflow scenario can vary between rigs.


----------



## dboythagr8

Quote:


> Originally Posted by *RayTrace77*
> 
> UPDATE:
> I went to the Control Panel's "Adjust image settings with preview" and changed it to "Let the 3D application decide".
> Here's the score I got when I re-ran the Unigine benchmark. Does this look OK or is it still low?
> 
> Thanks
> Ray


That looks about right. I looked at my Heaven run @ 1080p and it was 81.5 FPS.


----------



## Stein357

Quote:


> Originally Posted by *Arizonian*
> 
> I personally intake from the front and side (open, cooler air), with an extra fan on the bottom pushing up into the tower, and exhaust out the rear and top.
> As long as it's directional. Usually there isn't that much cooler air available at the rear, and in theory heat rises. The best airflow scenario can vary between rigs.


The 650D doesn't have any intakes on the bottom, unfortunately, or else it would be perfect. I'll play with it and see if it makes a significant change in temperatures.


----------



## RayTrace77

Thanks dboythagr8,

Games seem to run a lot smoother now maxed out, especially Metro (1080p, DX11, Very High, 4x MSAA, 16x AF); I was getting around 60-100 FPS I think, granted PhysX was turned off.

I also tried Max Payne 3 at maxed settings (1080p) including the 8x MSAA, which it said required 4060MB of video memory out of the 4096MB I had, haha. But it ran very smoothly, averaging between 50-90 FPS.

@ Stein357,

I agree with Arizonian. I myself have a Cooler Master HAF-X and intake at the front with the 230mm fan and the side 200mm fan, exhausting out the back (one 120mm fan) and top (one 200mm fan). Would you ever consider physical mods to the case to suit your needs? I used to have an Antec 900 case and did physical mods to it, which helped (quite time-consuming though).


----------



## bhk1004

Hello everyone, just stumbled upon this thread.

I will be reading through it for a little bit of fun.

I just got my Hydro Copper block today and overclocked my GTX 690: +185 core and +700 mem.

http://3dmark.com/3dm11/3702560

Figured I would share something before I ransacked all the knowledge that was dropped in the earlier pages.


----------



## ReignsOfPower

Quote:


> Originally Posted by *SimpleTech*
> 
> They come with this:
> 
> 
> 
> 
> 
> 
> 
> 
> If you want to do a multi-gpu setup, you are going to have to use their EK bridges.


Still doesn't make sense to me, lol.


----------



## Arizonian

I did a little test after noticing that my GTX 690's second GPU (#2) is the stronger side of the card; Kepler boost does better on that side, even synced. I switched the DVI connection to GPU #2, and PhysX auto-moved to GPU #1.

I went from a maxed 109.3 Heaven score to 109.6 with the same OC, yet no gains in 3DMark 11. Haven't tried gaming yet. Is there any reason why I shouldn't leave it like this?

This OC has been a stable 24/7 overclock for me, btw. I haven't had much luck on memory.

[Just a note: with 43 members in the club to date and 46 GTX 690s amongst us, that makes this a $46,000 club.







]


----------



## RayTrace77

Seems interesting. Also, since connection 1 is on GPU B, maybe that's the way they wanted it; that was the way I first had it. I'd imagine it's fine to leave it like that. After all, if you are using 1 monitor like me, it makes sense to connect it to connection 1.

Also, make that a $47,000 club.









Here's my rig:


----------



## dboythagr8

I can't tell, but is that a HAF X? If so what kind of temps do you see on the 690?

Looking at the Owner's Club, EVGA has made a _killing_ on the 690. I assume that extends to the entire 600 line as well.


----------



## Stein357

Quote:


> Originally Posted by *RayTrace77*
> 
> @ Stein357,
> I agree with Arizonian, I myself have a Cooler Master HAF-X and intake on the front with the 1 x 230MM Fan and side 200MM Fan and exhaust out the back(1 x 120MM Fan)/top(1 x 200MM Fan), would you ever consider physical mods to the case to suit your needs, I used to have an Antec 900 case before and did physical mods to that which helped(quite time consuming though).


Can't bring myself to cut up my case; the GPUs stay under 70 when I'm gaming, so it's not THAT big of a problem.


----------



## Tslm

Quote:


> Originally Posted by *Arizonian*
> 
> I did a little test after noticing my GTX 690's second GPU #2 is the stronger side of the card. Kepler boost does better on that side even synced. I switched the DVI connection to GPU #2 and PhysX to auto moved to GPU #1.


If you're running multi-GPU and PhysX is set to run on, say, the 2nd GPU, does the 2nd GPU render frames as well as run PhysX? I haven't actually used it since I switched to Nvidia; BL2 will probably be the first game I play with it.


----------



## Arizonian

Quote:


> Originally Posted by *Tslm*
> 
> If you're running multi-gpu and PhysX is set to run on say the 2nd gpu, does the 2nd gpu render frames as well as run PhysX? Haven't actually used it since I switched to nvidia, BL2 will probably be the first I play


It uses GPU #1 for PhysX.


----------



## Tslm

So in your case GPU 1 doesn't contribute to rendering frames in PhysX games? Or is the rendering load split, but GPU 1 also has PhysX added on top of its workload?


----------



## Arizonian

I'm not sure. When I ran a Heaven benchmark, I went from a maxed 109.3 after many tries to a 109.6 score with the same overclock on GPU and CPU after switching. I've not tested it in gaming, which I will try right now to see if I notice any difference. This is my first dual GPU, so I'm not sure whether GPU #2 works as GPU #1 if the monitor is connected to it.

Let me see what kind of numbers I get and I will report back.

So what you're thinking is that even though I have the monitor hooked to GPU #2, GPU #1 is still my main GPU regardless?


----------



## Tslm

I'm just not sure how the drivers distribute workloads in SLI when PhysX comes into play.

I'm guessing if you hooked your monitor up to GPU #2, it's probably the master GPU. So in a normal scenario GPU #2 would push out, say, every even frame and GPU #1 would take every odd frame... but with PhysX you've assigned your slave GPU (#1) to handle the PhysX calcs. I'm wondering if that means GPU #1 still helps push out frames with GPU #2 while bearing the extra load of PhysX, or if it just sits there doing PhysX work while the master GPU does the rendering.
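The even/odd split Tslm describes is alternate frame rendering (AFR); a minimal sketch of that frame-to-GPU mapping (plain AFR only; how the driver reshuffles things when one GPU is dedicated to PhysX is exactly the open question here):

```python
# Minimal sketch of the alternate-frame-rendering (AFR) split described above:
# in 2-way SLI each GPU renders every other frame. Which GPU is "master" and
# how a dedicated PhysX GPU changes the split is driver behavior this only
# illustrates, not documents.

def afr_assignment(frames, gpus=2):
    """Map frame index -> rendering GPU under plain round-robin AFR."""
    return {f: f % gpus for f in range(frames)}

print(afr_assignment(4))  # {0: 0, 1: 1, 2: 0, 3: 1}
```

With quad SLI the same round-robin just cycles over four GPUs, which is part of why per-GPU usage looks low at modest resolutions.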


----------



## Arizonian

Quote:


> Originally Posted by *Tslm*
> 
> I'm just not sure how the drivers distribute workloads in SLI when physx comes into play.
> I'm guessing if you hooked you monitor up to GPU #2 that its probably the master gpu. So in a normal scenario GPU #2 would push out pixels say every even frame and GPU #1 would take every odd frame... but with Physx you've assigned your slave GPU (#1) to handle PhysX calcs. I'm wondering if that means GPU #1 still helps push out pixels with GPU #2 while bearing the extra load of PhysX, or if it just sits there doing PhysX stuff like the master GPU does the rendering.


Great points.

I just tested Crysis 2 with GPU #2 and then GPU #1 hooked to the main monitor, and it did not affect gaming at all either way, with no increase or decrease in FPS. Nor did it make any difference to video quality, better or worse. Both GPUs had the same boost going on and I saw no gains by switching.

Since both GPUs on a dual card are in PCIe slot #1, I assumed I could designate which GPU works as the main one. This being my first dual GPU, and GPU #2 having better Kepler boost, I thought it would be worth a try. There's no way to really confirm this, and I digress, because technically both GPUs are only as good as the lesser of the two.

I'm switching back to GPU #1 and calling it a night.


----------



## RayTrace77

@dboythagr8

It is indeed a HAF X 942.

During normal game play (Metro 2033, BF3, Crysis 2, Max Payne) it normally never goes above 75 degrees. A couple of days ago I did a long 4-5 hour gaming session on Max Payne and it never went above 75; however, when my room gets hot (I'm in a loft), I've seen it reach a peak of 80 degrees. I'm a stickler for quietness. I have my PC on 24/7 so I want it as quiet as possible and as powerful; currently it is extremely quiet, so I think I got a fairly good ratio of temps to noise. A full WC system would be what I want, but I'd need to study a lot regarding what parts I need and would need more time.

I'd always like to improve the airflow in the case but I'm not really sure what else I can do. I didn't want to get messy and cable-tie fans to odd places, as I used to do that and didn't like it.

Regarding the way the driver splits the workload when PhysX is on: I'd say that if the NVIDIA Control Panel says GPU B is doing PhysX, then it must be rendering frames and doing PhysX on top, because where you can manually select the PhysX processor you have an option to tick "Dedicate to PhysX", which I'd imagine locks the selected GPU to PhysX only.


----------



## Forrester

I think I may have asked this before, but I can't remember, so sorry. Any idea which block performs the best? Or are they all so similar it would only be differences in the 1-3°C range?


And Stateless, you were saying you had great temps with your Koolance block. How much raddage are you running?


----------



## Stateless

Quote:


> Originally Posted by *Forrester*
> 
> i think i may have asked this before, but i cant remember, so sorry, but any idea which block performs the best? or are they all so similar it would only be differences in the 1-3 C range? like this:
> 
> and stateless, you were saying you had great temps with your koolance block. how much raddage you running?


My loop consists of the following:

Three XSPC 360 rads, a CPU waterblock with my 3930K at a 4.8 OC, and the GPU with a +160 OC.


----------



## juanP

my mild gpu oc , still need to scale down the power and increase the gpu speed.


----------



## Arizonian

Quote:


> Originally Posted by *juanP*
> 
> my mild gpu oc , still need to scale down the power and increase the gpu speed.
> 
> 
> Spoiler: Warning: Spoiler!


Wow that's a huge score from one GTX 690 on 3DMark11 - See *this thread* of benchmarks to compare your score.









I can't believe how high a memory overclock some of these 690's are getting. It seems I crash on anything past 1602 memory, which is a 100 MHz boost from stock. My memory offset is +200, but I run it 24/7 lower at +184, which is a 1594 memory clock.

My *3DMark11* score was *P16682* at a 4.3 GHz CPU overclock on my i7 3770K. I'm positive that even if I pushed the CPU a tad more to 4.5 GHz it wouldn't come close to that.

I've not had any luck with three cards thus far: a GTX 680, a GTX 680 SC, and my current GTX 690. All low memory overclocks. People are getting crazy memory overclocks which really help in some games and benchmarks.

You should enter your score on OCN *here* and get yourself on that list, bud.


----------



## MrTOOSHORT

^^^ two gtx690s.


----------



## juanP

Quote:


> Originally Posted by *Arizonian*
> 
> Wow that's a huge score from one GTX 690 on 3DMark11


I am running two GTX 690's. With two of them, that's a low score I think.


----------



## PowerK

Quote:


> Originally Posted by *juanP*
> 
> i am running two gtx 690's . with two of them that's a low score i think.


The score is about right.
You need to understand that the "Performance" preset of 3DMark11 runs at only 1280x720. At this low resolution, 670/680 SLI, the 690, and 690 Quad-SLI are heavily CPU limited. I think only the Extreme preset (1920x1080) gives meaningful figures for the current generation of multi-GPU systems. After all, 720p is pretty much the standard resolution of consoles like the PS3 and X360, which were released some 6~7 years ago.


----------



## Arizonian

Quote:


> Originally Posted by *juanP*
> 
> i am running two gtx 690's . with two of them that's a low score i think.


That's about right if you look at others running the Performance preset in the 3DMark11 Performance Benchoff here on OCN. I bet you've still got some room with the core overclock, though.

The others with higher scores on that list are on heavily overclocked 3930's & 3960's.


----------



## theyedi

As far as memory overclocks go, maybe a higher overclock would actually be more stable. Case in point, my 7970 was not stable at all when overclocked between 1475 and 1600. However, at 1800-1815 it's entirely stable


----------



## max883

http://3dmark.com/3dm11/3720840 is this score ok?

I was hoping for 18,000P. Maybe I can overclock my GPU more; I'm going to get a water block soon, then maybe.

power target: 135
GPU: +100
mem: +500


----------



## Arizonian

Quote:


> Originally Posted by *max883*
> 
> http://3dmark.com/3dm11/3720840 is this score ok?
> i was hoping for 18.000p maybe i can overclock my gpu more. im going to get a water block soon then maybe
> power target: 135
> GPU: +100
> mem: +500


That's a very good score.


----------



## hzac

I didn't know how powerful the GTX 690 was until I looked into this thread. It seriously boggles the mind, haha. And some of you guys have two. They are like $1700 retail in Australia. I seriously can't get my head around 80 FPS at 1600p with 4xAA in BF3. I just wanted to say I am truly jealous of you guys. I could only dream of owning something like this, haha.


----------



## 4ofus

Joined the 690 club.


----------



## tonyjones

Man, you all are ballers. What's your day job, to be able to afford a $1000 video card?


----------



## Arizonian

Quote:


> Originally Posted by *tonyjones*
> 
> man you all are ballers, what is your day job, able to afford $1000 video card


I started saving in January of last year after getting a GTX 580. No high-paying job here. I wanted to SLI it, but by the time I had enough for a second 580 I decided to hold off until the new series came out, since I was doing fine gaming, and kept saving.

Get an envelope, put aside $20 here and there, and once it's in the envelope *forget you have it*. Discipline. Eat out less. Drink beer at home rather than at a pub. Took me 14 months and some tax refund money. The delay of Kepler helped me be ready at release date.









Start now, by this time next year you'll have enough for two 770's or better in SLI.









The 690 will keep me styling for the next two years before handing it down to the second rig. Already started saving for a huge capacity SSD next year. Fingers crossed.

_Selling your current card will also help get you there when the time comes._ Oh almost forgot, and don't tell the Mrs.


----------



## juanP

Quote:


> Originally Posted by *Arizonian*
> 
> Oh almost forgot, and don't tell the Mrs.


+1 to that


----------



## Divineshadowx

So I overclocked a bit and then ran OC Scanner, only to see that I was getting a max of 120 FPS. I saw a video of a guy getting 160 FPS with a single GTX 680, and I have two GTX 690's... The boost clock jumps around every second and never stays constant, and so does the voltage. Also, Precision X says the GPU clock is 914 MHz; when was it ever 914 MHz? It came from the factory at 915 MHz stock. Also, shouldn't it be throttling itself back to 314 MHz or whatever it was when idle? NV-Z says 4-way SLI is enabled and so does the Nvidia Control Panel, so that couldn't be the problem. What's wrong? Please help.


----------



## Qu1ckset

Quote:


> Originally Posted by *tonyjones*
> 
> man you all are ballers, what is your day job, able to afford $1000 video card


Saving money, and buying used and reselling, is key!
For a while I'd only buy used cards, like 5970s, 6970s, a 6990 (obviously the best cards for the time), etc., and I'd resell them before their value went down too much. For example, I bought my last card in the summer of 2011, a 6990 I got for $450, and sold it last month for $500. I made $50 and got to use it for a year!







And in the meantime you save money for awesome releases like the GTX 690!

I don't think I will ever sell my GTX 690 because I love the card so much. Honestly the nicest graphics card I have ever held in my hand (build quality); any owner in this thread could agree!
(Plus I bought it for $1100; no way I'm getting that same price back next year, lol.)

Oh, and I've found dual-GPU cards hold their value longer than single-GPU cards.


----------



## Lu(ky

Quote:


> Originally Posted by *Qu1ckset*
> 
> Saving money, and buying used and reselling is key!
> For awhile id only buy used cards, like 5970s,6970s,6990,(obviously the best cards for the time) etc and id resell them before there value went down to much, like for example i bought my last card in the summer of 2011, which was the 6990 got it for $450 , and sold it last month for $500. i made $50 and got to use it for a year!
> 
> 
> 
> 
> 
> 
> 
> , and in the mean time you save money for awesome releases like the gtx690!
> I don't think i will ever sell my gtx690 because i love the card so much, honestly the nicest graphics card i have ever held in my hand (build quality) , any owner in this thread could agree!
> (plus i bought it for $1100 know way im getting that same price back next year lol)
> Oh and i found dual gpu cards hold there value longer then single gpu cards


Yeah, knowing when the good stuff is coming out and selling your old stuff at the right time to get most of your money back is key...
I just ran 3DMark11. My old setup was a 2600K at 4.5GHz with a 6990 & 6970 in TriFire getting around a 14K score; with my stock 3770K and stock GTX 690 I'm beating that setup by 400 points without any OC yet.
This GTX 690 is a beast...


----------



## Qu1ckset

So the evga gtx690 backplates are coming out mid july?


----------



## Divineshadowx

Omg, this is so frustrating: the same exact FPS in BF3 and benchmarks with a single 690 vs. two of them... *** is wrong with my computer.


----------



## thestache

Could someone confirm native PCIe 3.0 support for the GTX 690 on the X79 platform? More importantly, when running three monitors?

I'm buying my new set-up Tuesday for my GTX 690, which I already have, and will be using a multi-monitor set-up, so I really need PCIe 3.0 support, and support for it with multi-monitor set-ups, because this is one of the only set-ups that will actually benefit from it.

If I understand this correctly, PCIe 3.0, although disabled, has always worked on the X79 platform with a registry hack. It just didn't work with a multi-monitor set-up, which sucked because that's the only set-up that actually needed it. Now that it's been allowed by Nvidia it does work, although with mixed results, but still not for multi-monitor users.

I ask because up until now I was under the impression all 600 series cards were limited to PCIe 2.0 by Nvidia. But someone told me today that the GTX 690 has native PCIe 3.0 support on X79, and it was only the GTX 680/670 that didn't? If that's true, then it shouldn't have the same problems the GTX 680/670s are having with PCIe 3.0 and multi-monitor set-ups.

If anyone can help I really appreciate it and look forward to joining the club once I actually plug my GTX 690 into something.

Thanks.


----------



## SimpleTech

Quote:


> Originally Posted by *Sexparty*
> 
> Could someone confirm native PCIe 3.0 support for the GTX 690 on the X79 platform? More importantly running three monitors?
> Buying my new set-up tuesday for my GTX 690 which I have already and will be using a multi monitor set-up, so really need the PCIe 3.0 support and it supported with multi monitor set-ups. Because this is one of the only set-ups that will actually benefit from it/use it.
> If I understand this correctly PCIe 3.0 although disabled has always worked on the X79 platform with a registry hack. It just didn't work with a multi monitor set-up, which sucked because it's the only set-up that actually needed it. However now it's been allowed by Nvidia it does work, although with mixed results but not for multi monitor users. Still.
> I ask because up until now I was under the impression all 600 series cards were limited to PCIe 2.0 by Nvidia. But someone told me today that the GTX 690 has native PCIe 3.0 support on X79, it was only GTX 680/670 that didn't? If that's true then it shouldn't have the same problems the GTX 680/670s are having with PCIe 3.0 and multi monitor set-ups.
> If anyone can help I really appreciate it and look forward to joining the club once I actually plug my GTX 690 into something.
> Thanks.


All of Kepler is PCIe 3.0. Nvidia disabled it (_referring to X79_) through their drivers for whatever reason (certification?) but you can enable it using their "hack". I personally wouldn't worry about it since it's been tested that these cards don't saturate the bus (*source*). Though there is a member (*Vega*) who showed that he got a higher frame rate when he switched from PCIe 2.0 to 3.0. I think more testing needs to be done.


----------



## 4ofus

Quote:


> Originally Posted by *SimpleTech*
> 
> All of Kepler is PCIe 3.0. Nvidia disabled it through their drivers for whatever reason (certification?) but you can enable it using their "hack". I personally wouldn't worry about it since it's been tested that these cards don't saturate the bus (*source*). Though there is a member (*Vega*) who showed that he got a higher frame rate when he switched from PCIe 2.0 to 3.0. I think more testing needs to be done.


I did the registry hack on my X79 Sabertooth. When I benchmarked my GTX 680 before and after, my score actually dropped 200 points with PCIe 3.0 enabled.


----------



## thestache

Quote:


> Originally Posted by *SimpleTech*
> 
> All of Kepler is PCIe 3.0. Nvidia disabled it through their drivers for whatever reason (certification?) but you can enable it using their "hack". I personally wouldn't worry about it since it's been tested that these cards don't saturate the bus (*source*). Though there is a member (*Vega*) who showed that he got a higher frame rate when he switched from PCIe 2.0 to 3.0. I think more testing needs to be done.


A GTX 680 running a single monitor, no. But a GTX 690 running multiple monitors, yes. Several sites and credible sources agree it benefits from PCIe 3.0, even slightly (a few percent) at 1080p. Thanks for the link to the review, but it annoys me when sites upload rubbish like that. We all know in those situations (single card) it doesn't matter, but bring several GPUs to the same PCIe slot at multi-monitor resolutions, or choke the PCIe lanes of a motherboard (x8/x8/x8/x8), and it will.

Which is exactly what Vega's test was: x16/x8/x8/x8 with GTX 680s and three monitors. Switching to PCIe 3.0 in that situation doubles your bandwidth, and that was the reason for nearly doubling his frame rates. No single-GPU card seems to need more than PCIe 3.0 x8 = 16GB/s at the moment, but I'm pretty confident the GTX 690 can take advantage of more (PCIe 3.0 x16 = 32GB/s). Whether it uses the full extent of that bandwidth or just enough to make a difference is up to whoever benchmarks it first, I guess.
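For anyone who wants to sanity-check the bandwidth math, here's a rough Python sketch using the standard per-lane PCIe figures (my own back-of-envelope numbers, not from Vega's test). Note the 16/32 GB/s numbers quoted above are the usual both-directions totals; per direction, Gen3 x16 works out to roughly half that.

```python
# Rough per-direction PCIe bandwidth. Gen1/2 use 8b/10b encoding,
# while Gen3 switched to the much leaner 128b/130b.
GT_PER_LANE = {1: 2.5, 2: 5.0, 3: 8.0}             # signaling rate, GT/s per lane
ENCODING    = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130}  # usable fraction after encoding

def pcie_gbs(gen, lanes):
    """Usable GB/s in ONE direction for a given PCIe generation and lane count."""
    return GT_PER_LANE[gen] * ENCODING[gen] * lanes / 8  # bits -> bytes

print(round(pcie_gbs(2, 16), 2))  # 8.0   GB/s each way (Gen2 x16)
print(round(pcie_gbs(3, 8), 2))   # 7.88  GB/s each way (Gen3 x8)
print(round(pcie_gbs(3, 16), 2))  # 15.75 GB/s each way (Gen3 x16)
```

So going Gen2 to Gen3 at the same lane count roughly doubles usable bandwidth, which lines up with what Vega saw on his x8 slots.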


----------



## Arizonian

Quote:


> Originally Posted by *Qu1ckset*
> 
> So the evga gtx690 backplates are coming out mid july?


Yes - a moderator on the EVGA website forum stated to expect the GTX 690 backplate by mid-July. It was also confirmed, however, that no rear high-flow bracket will be manufactured.

Personally I feel it's unneeded; the card's stock air cooler with its vapor chamber is more than efficient.
Quote:


> Originally Posted by *SimpleTech*
> 
> All of Kepler is PCIe 3.0. Nvidia disabled it through their drivers for whatever reason (certification?) but you can enable it using their "hack". I personally wouldn't worry about it since it's been tested that these cards don't saturate the bus (*source*). Though there is a member (*Vega*) who showed that he got a higher frame rate when he switched from PCIe 2.0 to 3.0. I think more testing needs to be done.


Nvidia didn't disable PCIe 3.0 on Z77 boards. I can confirm that my dual-GPU card in PCIe slot one is running both GPUs on the single PCB at PCIe 3.0 x16.

The X79 boards are the ones being affected atm. I haven't paid much attention to the specifics on the X79 mobos, but from what I've been hearing on OCN a registry hack is being used to enable PCIe 3.0; I cannot confirm its performance gains on the X79 platform.



When I start the GPU-Z render test it reports two GPU's in PCIe 3.0 x16 as well.


----------



## Lu(ky

Quote:


> Originally Posted by *SimpleTech*
> 
> All of Kepler is PCIe 3.0. Nvidia disabled it through their drivers for whatever reason (certification?) but you can enable it using their "hack". I personally wouldn't worry about it since it's been tested that these cards don't saturate the bus (*source*). Though there is a member (*Vega*) who showed that he got a higher frame rate when he switched from PCIe 2.0 to 3.0. I think more testing needs to be done.


Is PCIe 3.0 disabled by Nvidia on Z77 boards as well, or only on the X79 platform?


----------



## Arizonian

Quote:


> Originally Posted by *Lu(ky*
> 
> Are the Z77 PCIe 3.0 disabled from Nvidia as well? Or only on the X79 platform?


It doesn't affect Z77 / Ivy combos; there was never an issue with Z77. A page back there's a post with GPU-Z clearly showing PCIe 3.0 x16.

http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/1500#post_17552233

As for the X79 boards and PCIe 3.0 - found this on Nvidia's website. Geforce 600 Series Gen3 Support On X79 Platform - dated June 19, 2012.

Found this guide on EVGA Forum regarding enabling PCIe 3.0. How to enable PCI-E 3.0 in Windows 7


----------



## Arizonian

New Question : Next topic (New post)

I've been using Precision X and GPU-Z to read my specs when over clocking. My memory offset +183 & GPU-Z reporting 1594 MHz Memory Clock steady.

Thought I'd give Afterburner another try and it is logging same offset at 3181 MHz Memory Clock.

So my question to 690 owners:

*What is your Memory Clocks offset and what is Memory Clock?*

*Why the discrepancy between GPU-Z and Afterburner readings?*


----------



## thestache

Quote:


> Originally Posted by *Arizonian*
> 
> New Question : Next topic (New post)
> I've been using Precision X and GPU-Z to read my specs when over clocking. My memory offset +183 & GPU-Z reporting 1594 MHz Memory Clock steady.
> Thought I'd give Afterburner another try and it is logging same offset at 3181 MHz Memory Clock.
> So my question to 690 owners:
> *What is your Memory Clocks offset and what is Memory Clock?*
> *Why the discrepancy between GPU-Z and Afterburner readings?*


No. Please, someone with X79, confirm either that the GTX 690 has native PCIe 3.0 and multi-monitor set-ups work, or that they at least work with the new registry/driver fix Nvidia released a few days ago. Single monitors isn't enough; I have to know if it works with Surround Vision.

I have to buy a new Z77 or X79 system for my GTX 690 on Tuesday, and it needs native PCIe 3.0 not just for single monitors but also for multi-monitor Surround Vision, because the purpose of this build is my surround set-up.


----------



## Anzial

Quote:


> Originally Posted by *Sexparty*
> 
> No please, someone with X79 please confirm the GTX 690 has either native PCIe 3.0 and multi monitor set-ups work or if they at least work with the new registry/driver fix Nvidia released a few days ago. Single monitors isn't enough, have to know if it works with surround vision.


X79 is a roll of the dice; the hack may work on some mobos but cause problems on others. That's why Nvidia refuses to officially support PCIe 3.0 on X79. The 690 itself is, of course, PCIe 3.0 and backwards compatible.


----------



## JeepsRcool

Quote:


> Originally Posted by *Sexparty*
> 
> No please, someone with X79 please confirm the GTX 690 has either native PCIe 3.0 and multi monitor set-ups work or if they at least work with the new registry/driver fix Nvidia released a few days ago. Single monitors isn't enough, have to know if it works with surround vision.
> I have to buy a new Z77 or X79 system for my GTX 690 on tuesday and it needs native PCIe 3.0 not just for single monitors but also multi monitors in surround vision. Because the purpose of this build is for my surround vision set-up.


No registry hacks here. 304.48 beta

Quote:


> Originally Posted by *Arizonian*
> 
> New Question : Next topic (New post)
> I've been using Precision X and GPU-Z to read my specs when over clocking. My memory offset +183 & GPU-Z reporting 1594 MHz Memory Clock steady.
> Thought I'd give Afterburner another try and it is logging same offset at 3181 MHz Memory Clock.
> So my question to 690 owners:
> *What is your Memory Clocks offset and what is Memory Clock?*
> *Why the discrepancy between GPU-Z and Afterburner readings?*


My memory offset in Precision X is 350; GPU-Z shows 1677, x4 = 6708 effective. Now divide by 2 and subtract the stock 3k and you have the +350 from Precision X.
Haven't tested Afterburner; I have EVGA so I'm sticking with EVGA's OC software.
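If it helps anyone decode their readings, here's a quick Python sketch of that arithmetic. This is just my interpretation of how the tools report things (the offset applying at the 2x DDR rate), assuming the 690's stock 6008 MHz effective memory; nothing official.

```python
STOCK_EFFECTIVE = 6008  # GTX 690 stock memory in MHz "effective" (quad data rate GDDR5)

def memory_readings(offset_mhz):
    """Offsets in Precision X / Afterburner apply at the DDR (2x) rate."""
    afterburner = STOCK_EFFECTIVE / 2 + offset_mhz  # DDR rate, what Afterburner logs
    gpuz = afterburner / 2                          # real clock, what GPU-Z shows
    effective = gpuz * 4                            # quad-pumped marketing number
    return gpuz, afterburner, effective

print(memory_readings(350))  # (1677.0, 3354.0, 6708.0)  - the numbers above
print(memory_readings(184))  # (1594.0, 3188.0, 6376.0)  - Arizonian's +184 -> 1594
```

This also answers the GPU-Z vs Afterburner discrepancy question from a few posts back: Afterburner is just logging the 2x (DDR) rate that GPU-Z halves.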


----------



## Anzial

Quote:


> Originally Posted by *Arizonian*
> 
> New Question : Next topic (New post)
> I've been using Precision X and GPU-Z to read my specs when over clocking. My memory offset +183 & GPU-Z reporting 1594 MHz Memory Clock steady.
> Thought I'd give Afterburner another try and it is logging same offset at 3181 MHz Memory Clock.


Well, I'd say Afterburner is simply doubling up the data rate: 1594 x 2 is 3188, which is pretty much the same as 3181.


----------



## Anzial

Quote:


> Originally Posted by *JeepsRcool*
> 
> No registry hacks here. 304.48 beta


FYI, your image got shrunk so that nothing is really readable on that GPU-Z (I assume) screenshot of yours. Just post the GPU-Z along with a CPU-Z screenshot, not the entire screen.


----------



## Arizonian

Quote:


> Originally Posted by *Anzial*
> 
> Well, I'd say Afterburner is simply doubling up data rate: 1594x2 is 3188 which is pretty much the same as 3181










I wasn't even thinking of that since the Core Clock wasn't being doubled and showing per single GPU. Thanks.


----------



## JeepsRcool

Quote:


> Originally Posted by *Anzial*
> 
> FYI, your image got shrunk so that nothing really readable on that GPU-Z (I assume) screenshot of yours. Just post the GPU-Z along with CPU-Z screenshot, not the entire screen.


Umm, just right-click on it and open it in a new window; it takes you to the full size...


----------



## Anzial

Quote:


> Originally Posted by *JeepsRcool*
> 
> Umm just right click on it and open in new window, it takes you to the full size......


Be that as it may, don't you think it'd be simpler just to post CPU-Z + GPU-Z screenshots instead of a 90% empty desktop?







or you just like to show off the size of your epee... erghm, I mean, surround screen?


----------



## JeepsRcool

Quote:


> Originally Posted by *Anzial*
> 
> be as it may, don't you think it'd be simpler just to post cpuz+gpuz screenshots instead of the 90% empty desktop?
> 
> 
> 
> 
> 
> 
> 
> or you just like to show off the size of your epee... erghm, I mean, surround screen?


Yes, showing the full screen was intended as the original question asked if it worked with surround.


----------



## Anzial

Quote:


> Originally Posted by *JeepsRcool*
> 
> Yes, showing the full screen was intended as the original question asked if it worked with surround.


You misread the question. It was not about surround but PCIe 3.0 support on X79.







It's kinda obvious that the 690 will support surround, with 2 GPUs and 4 outputs working simultaneously. (I love the 3 dual-link DVIs!)


----------



## Arizonian

3 Dual-link DVI's = 3D Surround 3D Vision capable 120 Hz monitors.


----------



## Stein357

Does the mini display port support audio? I want to hook up my TV to my 690 via a MDP -> HDMI cable. If no, what do I need to do to hook up my sound card to my TV?


----------



## Cheesemaster

Latest and best Unigine run...


----------



## thestache

Quote:


> Originally Posted by *JeepsRcool*
> 
> No registry hacks here. 304.48 beta
> 
> My memory offset in PresicionX is 350 gpu-z shows 1677, x4=6708 effective now divide by 2 and minus the stock 3k and you have the 350+ from PreX.
> Have not tested after burner, I have evga so I am sticking with evga OCsw.


Thanks mate.

I'll proceed with my X79 purchase then.

+ rep


----------



## juanP

Quote:


> Originally Posted by *Cheesemaster*
> 
> Latest and best Unigine run...


what are all your gpu and cpu settings for that score?
also are you on air or water?


----------



## Cheesemaster

Quote:


> Originally Posted by *juanP*
> 
> what are all your gpu and cpu settings for that score?
> also are you on air or water?


Air on GPU, CPU under water... [email protected], 16GB RAM @ 2400MHz 9-11-11-28-1T, dual GTX 690 @ +125 core offset, +200 memory offset, 135% power target.


----------



## Divineshadowx

Does anyone know a way of checking a CPU OC besides CPU-Z and Real Temp? They both say my CPU is OC'd, but Windows does not; it says it is running stock, and so does dxdiag, and so does my BIOS during boot and in the menu. On my previous comp, Windows and the BIOS changed with the OC; it doesn't seem to do that anymore.


----------



## Arizonian

Quote:


> Originally Posted by *Divineshadowx*
> 
> Does anyone know a way of checking a cpu oc besides cpu-z and real temp? They both say my cpu is oc'd, but windows does not, it says it is running stock, and so does dxdiag, and so does my bios during the boot and also in the menu. On my previous comp windows and the bios changed with the oc, it doesnt't seem to do that anymore though.


*OCCT*


----------



## theyedi

Hmm, the clocks (both mem and GPU) sometimes downclock and the game lags for a few seconds in D3. Using the latest beta drivers. Any idea why it does this?


----------



## 4ofus

Quote:


> Originally Posted by *theyedi*
> 
> Hmm, the clocks sometimes downclocks (both mem and gpu) and lags for a few seconds in D3. Using the latest beta drivers. Any idea why it does this?


Haven't been monitoring my system while playing D3 but I still have the stuttering issue even after upgrading from the 680 to the 690. Supposedly the latest 304.48 beta driver fixed this for some people but unfortunately I'm not one of them.


----------



## jahcson

You may add me to this fine list of users; this GTX 690 card is a polished diamond.
I couldn't resist when I found out they SLI'd the 680 to make this one card: such a work of art!









ASUS Nvidia GeForce GTX 690 - 1019MHz Boost Clock - 4GB 6008MHz GDDR5 (Quad Data Rate) PCI-E 3.0 DX11 - DVI - HDMI - HD Audio - Nvidia Cuda Technology (3072 Cores) - 2560 x 1600 Max Res


Proof of Ownership

I'll be around - we have something great in common, people. Happy, happy, happy, joy, joy, joy. Aussie, aussie, aussie, oi, oi, oi!


----------



## V3teran

Sorry Stateless, I've been away for a while while I changed ISPs... so your card is stable at 160? You gonna try, say, 180 on the core?


----------



## Bigm

So this may be a dumb question to some but would it be smarter to get a pair of these for 3d vision surround rather than 3-4 4096mb 670/680s?

Edit: Paired with a RIVE and 3930k


----------



## xoleras

Quote:


> Originally Posted by *Bigm*
> 
> So this may be a dumb question to some but would it be smarter to get a pair of these for 3d vision surround rather than 3-4 4096mb 670/680s?
> Edit: Paired with a RIVE and 3930k



That depends on your resolution. If you're doing stereo 3D, 2GB is fine. For 3D Surround at 5760 x 1200, 4GB is helpful and will allow you to run higher AA levels without running out of VRAM. AA is a VRAM hog; with 2GB some games will fail unless you run no AA or FXAA.

But for 3D Vision you're fine at 2GB, IMO.


----------



## Michalius

Anyone put on a Razor block yet? Curious to how it performs. I decided to take a leap of faith and just buy it. Shipping from Frozen CPU is a week, so hopefully I'll have the time to get it up and running next Monday.


----------



## Stateless

Quote:


> Originally Posted by *V3teran*
> 
> @Sorry Stateless i been away for a while while i change isp's...so your card is stable at 160?You gonna try say 180 on the core?


Yes, stable at 160 on the core, but not beyond that. I haven't tried 165, but 170 caused a crash within 1 minute, while 160 can run all day without an issue. I plan to work on the memory later tonight to see how far I can get on that.


----------



## pilla99

Pretty sure I know the answer to this, but what is the minimum PSU for SLI'ing these suckers? I have a 750W Gold Standard now; I don't think that cuts it. If I upgrade I might as well go 1200W or something, but what is the minimum?


----------



## Arizonian

Quote:


> Originally Posted by *Stateless*
> 
> Yes stable at 160 on the core, but not beyond that. Have not tried 165, but 170 caused a crash withing 1 minute, but 160 can run all day and not have an issue. I plan to work on doing the memory later tonight to see how far I can get on that.


Compared to my OC on the core, that's great. I'll give you a comparison to my overclocks.

My 100% stable is +124 MHz Core / +184 Memory, 24/7. I'm at 1144 Core / 1594 Memory / 1149 Boost, with Kepler Boost hitting 1170 on GPU#1 and 1189 on GPU#2.

I haven't had any luck with three cards so far: a vanilla GTX 680 which I sold for some profit, a GTX 680 SC in my current second rig, and my GTX 690.

All low memory overclocks and average-to-low core clocks.

What is your base core clock at the +160 offset, and where does Kepler Boost show when maxed?


----------



## thestache

Can I join now?

But seriously, is anyone having this problem? GPU-Z says my GTX 690s are running at PCIe 3.0 x16, but the test system I have them in is a P8P67 WS Revolution with an i7 2700K, neither of which supports PCIe 3.0. So I'm wondering: does everyone else's GTX 690 say it's running at PCIe 3.0 even when it's not, or is it just me?



Talk about a serious CPU bottleneck. I honestly didn't think an i7 2700K @ 5000 MHz would bottleneck quad-SLI GTX 690s, but my god it did. At most I was getting 50% usage across all of them. Only going to keep one with the new 2011 platform I'm buying for it today.


----------



## Divineshadowx

Are you sure it's a CPU bottleneck though? I get 30-50% GPU usage in BF3 maxed when I'm getting 90 FPS with my refresh rate at 120, but in Heaven I get 80-90% usage and well over 120 FPS. This is on my 3770K that I got to 4.5 GHz today (my H100 seems broken). I think it's something other than a CPU bottleneck, because my CPU is almost never at 100% during benchmarking.


----------



## V3teran

Quote:


> Originally Posted by *Stateless*
> 
> Yes stable at 160 on the core, but not beyond that. Have not tried 165, but 170 caused a crash withing 1 minute, but 160 can run all day and not have an issue. I plan to work on doing the memory later tonight to see how far I can get on that.


Hmm, what are your temps at 160? If they're still low/good, then an increase in voltage could give it stability.
Are there any volt mods out yet?


----------



## andygoyap

add me up, i think it might be time to replace my triple 580's


----------



## burningrave101

EVGA GTX 690 pre-order at Amazon $999.99

http://www.amazon.com/dp/B007ZRO3U4/?tag=nisa-20&m=ATVPDKIKX0DER


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Are you sure its a cpu bottleneck though? I get 30-50% gpu usage on bf3 maxed when i'm getting 90fps, when my refresh rate is 120, but on heaven i get 80-90% usage and I get well over 120fps. This is on my 3770k that i got to 4.5ghz today (my h100 seems broken). I think its something different than a CPU bottleneck, because my CPU is almost never at 100% during benchmarking.


Has to be. I overclocked the cards and 3DMark didn't see a gain at all. Not to mention my combined score in 3DMark was far lower with quad SLI than with the single GTX 690; I lost about 5 fps on the combined benchmark as well. CPU, PCIe bandwidth, something like that.

Finally got my new X79 platform up and running after a million DRAM issues, so I'm about to do some new tests and see how it changes.


----------



## burningrave101

A couple of ASUS GTX 690s are in stock at NeweggBusiness, and you can use the 10% coupon MEMKENTRY10 to get $100 off.

http://www.neweggbusiness.com/Product/Product.aspx?Item=N82E16814121636


----------



## Divineshadowx

Quote:


> Originally Posted by *Sexparty*
> 
> Has to be. Overclocked the cards and 3DMark did not see a gain at all. Not to mention my combined score in 3DMark was far less with quad SLI than the single GTX 690. Lost about 5FPS on the combined benchmark also. CPU, PCIe bandwidth, something like that.
> Finally got my new X79 platform up and running after a million DRAM issues so I'm about to do some new tests and see how it changes.


Well, I got about what you got, except you have a bit more core OC and your CPU should be slightly faster.

I'm running dual PCIe 3.0 @ x16, so that can't be the problem you've got. So maybe it is the CPU. I mean, my CPU did a horrible job on the physics test, and in theory the GPU should handle that part anyway, so it's not even an accurate test. Let us know how your socket 2011 setup does.


----------



## pilla99

Possible dumb question. My 690 is out for delivery right now and my first catleap just got here.
Can you use EVGA precision with an Asus card?


----------



## y2kcamaross

Quote:


> Originally Posted by *pilla99*
> 
> Possible dumb question. My 690 is out for delivery right now and my first catleap just got here.
> Can you use EVGA precision with an Asus card?


Yes you can


----------



## SimpleTech

Quote:


> Originally Posted by *pilla99*
> 
> Can you use EVGA precision with an Asus card?


Yes you can. It's the same reference design inside different manufacturer packaging.









Forgot to click one additional page.. LOL!


----------



## thestache

What's another program I can use to validate PCIe 3.0 speeds?

Because I'm getting the same results in GPU-Z with my new X79 system, which does support PCIe 3.0, as with my old P8P67 system, which shouldn't have been able to. Maybe GPU-Z is bugged with the GTX 690, or maybe it's the GTX 690 and its driver, but something is off.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Well, i got about what you got, except you have a bit more core oc and your cpu should be slightly faster.
> 
> I'm running on dual pci-e 3.0 @16x, so that can't be the problem you got. So maybe it is the CPU, I mean my CPU did a horrible job on the physics test, the gpu should do that part all together in theory anyway, so that's not even an accurate test. Let us know how your 2011 does.


Won't be able to test for a while, having serious issues with my PC right now...

http://www.overclock.net/t/1275002/i7-3820-cpu-stopping-major-lag-spikes-low-settings-in-bf3-with-gtx-690


----------



## clone38

Same here with mine.


----------



## Divineshadowx

Quote:


> Originally Posted by *Sexparty*
> 
> Won't be able to test for a while, having serious issues with my PC right now...
> http://www.overclock.net/t/1275002/i7-3820-cpu-stopping-major-lag-spikes-low-settings-in-bf3-with-gtx-690


Well, I dont think a CPU can just stop, your comp would just crash then. Is it just bf3 that lags? try crysis 2 maybe and heaven. A malfunctioning CPU is really rare. Your ram is at stock so I dont see that being the problem, did it pass an error test? Not sure why you turned HT off, that I dont think would change anything. I get about 30-50% load on my cpu in bf3 maxed, the game isnt very cpu intensive. Do you get about 90-120+ on max? If so then it should not be a bottleneck anywhere. First I would just reinstall everything if only bf3 has the issue. Then make sure everything is at stock, and that you have messed with the nvidia control panel settings. Just curious btw, why a 3820 lol, might as well went with a 3930k or ivy?


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Well, I dont think a CPU can just stop, your comp would just crash then. Is it just bf3 that lags? try crysis 2 maybe and heaven. A malfunctioning CPU is really rare. Your ram is at stock so I dont see that being the problem, did it pass an error test? Not sure why you turned HT off, that I dont think would change anything. I get about 30-50% load on my cpu in bf3 maxed, the game isnt very cpu intensive. Do you get about 90-120+ on max? If so then it should not be a bottleneck anywhere. First I would just reinstall everything if only bf3 has the issue. Then make sure everything is at stock, and that you have messed with the nvidia control panel settings. Just curious btw, why a 3820 lol, might as well went with a 3930k or ivy?


It stutters every few seconds. GPU usage drops to 10% when it happens and then suddenly increases again to 50-60%.

I have sitting next to me:

x2 X79 Sabertooth Motherboards
i7 3820
i7 3930k
Corsair Vengenace 16GB RAM 1866mhz
Corsair 8GB RAM 1600mhz
x2 EVGA GTX 690s
Corsair 60GB Force 3 SSD

Have tried both CPUs in both motherboards, with both sets of RAM and still have the same issue. Used the same SSD and GTX 690. So I'm really confused what's causing it. 3DMark 11 gives really low scores/bad performance but the stuttering isn't as bad P11000-13000. And this is on a clean install of WIndows 7 64bit.

The GTX 690 is fine and so is it's drivers, it worked beautifully in my P8P67 i7 2700k set-up P17000. The SSD worked fine before I formatted it. Can not correctly formatting an SSD cause these problems? Outside of that the only think I can think of is software/drivers but what on earth would cause my CPU to act like this? Both of them. Low or ultra settings happens just as bad.

I went from the i7 2700k to the i7 3820 because I'm building x2 PCs and didn't have enough for x2 i7 3930k. Will ebay it and get another one in a month or two when I have the cash. Because yeah, screw downgrading to a i7 3820 from my beast i7 2700k. Was a really good overclocking chip.

I'm going to youtube what it does. Game just stops, then starts again. Doesn't slow down it literally comes to a halt then speeds up again. Never had anything like this happen.


----------




## 4ofus

Quote:


> Originally Posted by *Sexparty*
> 
> What's another program to use to validate PCIe 3.0 speeds?
> Because I'm getting the same results in GPU-Z with my new X79 system which does support PCIe 3.0 as my old P8P67 system which shouldn't have been able to support PCIe 3.0. Maybe GPU-Z is bugged with the GTX 690 or maybe it's the GTX 690 and it's driver but something is off.


I think there's a registry value you can check to find out whether PCIe 3.0 is enabled. Unfortunately I don't have a link.
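If GPU-Z is suspect, one cross-check is the driver's own reporting: on later drivers, `nvidia-smi` can query the negotiated PCIe link generation (whether a 2012-era driver exposes these query fields is an assumption on my part). A small sketch that runs the query and parses its CSV output:

```python
import csv
import io
import subprocess

def parse_pcie_gen(csv_text):
    """Parse `nvidia-smi ... --format=csv` output into a list of dicts."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    headers = [h.strip() for h in rows[0]]
    return [dict(zip(headers, (v.strip() for v in row))) for row in rows[1:]]

def query_pcie_gen():
    """Ask the driver for current vs. maximum PCIe link generation.

    Not called below, since it needs an NVIDIA GPU and driver present.
    """
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=name,pcie.link.gen.current,pcie.link.gen.max",
         "--format=csv"],
        capture_output=True, text=True, check=True).stdout
    return parse_pcie_gen(out)

# What parsed output looks like for a card negotiating Gen 2 on a Gen 3 slot:
sample = ("name, pcie.link.gen.current, pcie.link.gen.max\n"
          "GeForce GTX 690, 2, 3\n")
print(parse_pcie_gen(sample))
```

If `pcie.link.gen.current` reads lower than `pcie.link.gen.max`, the link trained below the slot's capability, which is exactly the mismatch being discussed.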


----------



## thestache

Quote:


> Originally Posted by *4ofus*
> 
> I think there is a registry value that you can check to find out if PCIe 3.0 is enabled. Unfortunately I don't have a link.


I'll google it, thanks.


----------



## 4ofus

Quote:


> Originally Posted by *Sexparty*
> 
> I'll google it, thanks.


I forgot to mention in my previous post: I too have the X79 Sabertooth and 3820 (the CPU was free, so I can't complain). Anyway, when I enabled PCIe 3.0 my 3DMark 11 score actually dropped 200 points. That was back when I had my GTX 680; I just left it enabled when I upgraded to the 690. Haven't tried benchmarking it on PCIe 2.0.


----------



## Divineshadowx

Quote:


> Originally Posted by *Sexparty*
> 
> It stutters every few seconds. GPU usage drops to 10% when it happens and then suddenly increases again to 50-60%.
> I have sitting next to me:
> x2 X79 Sabertooth Motherboards
> i7 3820
> i7 3930k
> Corsair Vengenace 16GB RAM 1866mhz
> Corsair 8GB RAM 1600mhz
> x2 EVGA GTX 690s
> Corsair 60GB Force 3 SSD
> Have tried both CPUs in both motherboards, with both sets of RAM and still have the same issue. Used the same SSD and GTX 690. So I'm really confused what's causing it. 3DMark 11 gives really low scores/bad performance but the stuttering isn't as bad P11000-13000. And this is on a clean install of WIndows 7 64bit.
> The GTX 690 is fine and so is it's drivers, it worked beautifully in my P8P67 i7 2700k set-up P17000. The SSD worked fine before I formatted it. Can not correctly formatting an SSD cause these problems? Outside of that the only think I can think of is software/drivers but what on earth would cause my CPU to act like this? Both of them. Low or ultra settings happens just as bad.
> I went from the i7 2700k to the i7 3820 because I'm building x2 PCs and didn't have enough for x2 i7 3930k. Will ebay it and get another one in a month or two when I have the cash. Because yeah, screw downgrading to a i7 3820 from my beast i7 2700k. Was a really good overclocking chip.
> I'm going to youtube what it does. Game just stops, then starts again. Doesn't slow down it literally comes to a halt then speeds up again. Never had anything like this happen.


There's only one way to format an SSD during Windows installation, so I'm sure you did it correctly. You got 21K with two 690s, so is 13K really that bad with just one? I know it's not 100% scaling, but it could be within margin of error.

Your GPU usage drops to about 10%, but what happens to your CPU usage during the freeze? If that also drops, try Prime95 and see whether usage ever goes below 100%; if it does, it looks like a faulty CPU, or maybe that offset voltage is screwing with things, so I would try a constant voltage instead. If CPU usage does not drop during Prime, then it could be a RAM problem; try lowering the speed? The data starts from the SSD, goes to the RAM, then the CPU, and the CPU then sends it to the GPU for further processing. So if there's an error at the CPU stage the GPU won't get its feed, but it could also be the RAM not feeding the CPU. I would just reseat everything (sounds like a hassle, but might as well) and reinstall drivers.


----------



## Shadowness

I think I can add some more information. I have this spiking too, on the laptop I'm currently using as my main rig until my 3960X / EVGA 690 HC build comes together this summer.

Anyway, this seems to be happening only in Battlefield 3, on almost every server. My bet is that it's probably the game server host, as I'm more than sure all my hardware is working top notch. It could also be a result of the last patch.

Whatever it is, I highly doubt it's your hardware, because at least for me this only started happening the day before yesterday, if I recall correctly.

Hope it helps a bit.

EDIT :

http://battlelog.battlefield.com/bf3/forum/threadview/2832654489810385858/

http://battlelog.battlefield.com/bf3/forum/threadview/2832654348090627973/

Do those describe your issue?


----------



## pilla99

Can someone please post their overclock settings or a link to a working guide? I have tried basically the whole first page of Google results for "690 overclock" and have one issue or another with the card crashing most of the time.

What is a semi-mild overclock that I can achieve? I am on the current beta drivers, 304.48.
Thank you.

Edit: Right now I am running a maximum power target of 135% with stock clocks. That seems to be running great.


----------



## Stateless

Quote:


> Originally Posted by *pilla99*
> 
> Can someone please post their overclock settings or a link to a working guide. I have tried like the whole first page of google results from 690 overclock and have one issue or another with the card crashing most of the time.
> What is a semi mild overclock that I can achieve? I am on the current beta drivers.. 304.48
> Thank You
> Edit: Right I am running maximum power target of 135% with stock clocks. That seems to be running great.


Experiment! That's all there is to it, really. On air, the average 690 can easily do +100 on the core at full power. You will also need a good fan profile so that fan speed increases as the temps do. Leaving the fan at the stock setting is not a good idea, since it's not aggressive enough.

Ideally, you should overclock the core first until you find a stable speed, then begin to OC the memory. It does take a lot of time and testing to ensure it's stable, but on air I don't see why you can't get at least +100 on the core and perhaps around +200 on the memory without much issue.
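Precision X itself can't be scripted, but the core-first procedure above is just a stepwise search: raise the offset a notch, test, repeat until it fails, then back off a little for 24/7 headroom. A toy Python sketch of that loop, with the stability test (run Heaven for a while, watch for crashes/artifacts) stubbed out as a function you'd supply:

```python
def find_stable_offset(is_stable, start=0, limit=300, step=25, backoff=10):
    """Walk the offset up in `step` MHz increments until the stability
    test fails or `limit` is hit, then back off `backoff` MHz for margin."""
    offset = start
    while offset + step <= limit and is_stable(offset + step):
        offset += step
    # Step back a little from the last passing offset for 24/7 headroom.
    return max(start, offset - backoff) if offset > start else start

# Toy stand-in for "run Heaven at this offset": pretend the card is
# stable up to +130 MHz, as several cards in this thread were.
stable_up_to_130 = lambda mhz: mhz <= 130
print(find_stable_offset(stable_up_to_130))  # → 115
```

The numbers (step, limit, backoff) are illustrative defaults, not anything from Precision X; the point is only the search order, core first with memory held at stock.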


----------



## pilla99

Well, that's what I've been doing. At least I'm at a start with a 100 MHz core OC and 135% target.
Voltage scares me.

Also, how hot is too hot? The ambient temp around here is pretty high. More casual gaming (HoN, Dota, CS:GO) leaves the card at a cool 50-60°C, even the 40s sometimes.
But BF3 with this OC on auto fan control puts me at 81°C max. Is that too hot?

Manual fan control ends up being super loud and doesn't seem to cool the card that much better.


----------



## thestache

Quote:


> Originally Posted by *pilla99*
> 
> Well that's what I've been doing. At least I'm at a start with 100mhz core OC and 135% target.
> Voltage scares me.
> Also how hot is pushing it too hot? The ambient temp around here is pretty high. Gaming in more casual endeavor leaves the card at a cool 50 - 60, even the 40's sometimes. (Hon, Dota, CS GO)
> But BF3 with this OC on auto fan control puts me at 81C max. Is that too hot?
> Manual fan control ends up being super loud and doesn't seem to cool that card that much better.


Yeah, 81°C is too hot.

Don't be scared of voltage. Voltage is limited at the moment, and no setting possible within that limit will damage your GPUs. Not to mention you don't need it; you'll hit your temperature barrier long before you run into that problem.

Out of the box both of mine did 135% power target, +100 MHz core, +150 MHz memory at the default voltage, and yours should too. That's a small, nice overclock that you should be able to set and not worry about again.

Just set your fan profile to ramp up to 80-100% by 80°C and you should be fine; sooner if your card runs a bit hotter. The GTX 690 is really quiet, so don't be scared of fan speed.
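A fan profile like that is just a piecewise-linear map from temperature to fan duty, interpolated between breakpoints. A minimal sketch; the breakpoints here are my own illustration of the "ramp to 80-100% by 80°C" advice, not Precision X's defaults:

```python
def fan_speed(temp_c, curve=((40, 30), (60, 50), (75, 80), (80, 100))):
    """Linearly interpolate fan duty (%) between (temp_C, duty_%) breakpoints."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # idle floor below the first breakpoint
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]             # pinned at max above the last breakpoint

print(fan_speed(70))  # → 70.0 (between the 50% and 80% breakpoints)
print(fan_speed(85))  # → 100 (past the last breakpoint, fan pinned)
```

The steeper the segments near the top, the harder the fan fights GPU Boost's thermal downclocking, at the cost of noise, which is the trade-off being discussed in this exchange.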


----------



## pilla99

Well, I set it to that and put my fan profile to the following. Going to play a little BF3 and see what happens.
Have a feeling I'm not making it into work tomorrow.

Update: Well, that went well. This card eats through that game, never going below 60 fps at 2560x1440 with that OC on Ultra.
I turned V-sync on and put the fan profile back on auto, because it sounded like a jet taking off, and it only reached 74°C.

Every other game I play will demand less with V-sync and generate less heat, so I know ~75°C is about my ceiling under max real-world load.


----------



## s74r1

Just got mine today.

EVGA GTX 690










Overclock results stable so far:
Core: below 1200 MHz is stable, but anything above and the display driver crashes, so I have it at a +130 core offset, which results in 1162, 1176, and 1189 MHz during gaming.
Mem: 3499 MHz, but it artifacts around 3600, so I still have some headroom. (Edit: had to back down to 3445 for stability.)

Fan is at the default curve, and voltage didn't seem to do anything, but overall it's about a 15% OC on core and mem.

Are these pretty good clocks for a 690? What's everyone else getting?

P.S. Be careful removing screws if you decide to replace the TIM; I stripped one of the black screws and couldn't get the heatsink off. Don't know why NVIDIA uses such flimsy screws and makes them so tight. I'll eventually try super glue or drill it out. (Does anyone know where I can get a replacement black screw?)


----------



## thestache

Quote:


> Originally Posted by *s74r1*
> 
> Just got mine today.
> EVGA GTX 690
> 
> 
> 
> 
> 
> 
> 
> 
> Overclock results so far stable @:
> Core: <1200MHz is stable, but anything above and the display driver crashes, so I have it @ +130 core offset which results in 1162MHz, 1176MHz, and 1189MHz during gaming.
> Mem: 3499MHz, but it artifacts around 3600 so i still have some headroom.
> Fan is at default curve, and voltage didn't seem to do anything, but overall it's about a 15% OC on core and mem.
> Are these pretty good clocks for a 690? What's everyone else getting?
> P.S. Be careful removing screws if you decide to replace the TIM, I stripped one of the black screws and couldn't get the heatsink off. Don't know why NVIDIA uses such flimsy screws and makes them so tight. I'll eventually try super glue or drill it out. (Does anyone know where I can get a replacement black screw?)


I'll be careful when installing my waterblocks then.

You might be able to hacksaw a slot in it and use a large flat-head screwdriver. Beyond that, no idea, other than contacting EVGA and asking.


----------



## thestache

Quote:


> Originally Posted by *pilla99*
> 
> Well I set it to that. Put my fan profile to the following. Going to play a little BF3 and see what happens.
> Have a feeling I'm not making it into work tomorrow.
> Update: Well that went well. This card eats through that game never going below 60fps at 2560x1440 with that OC on Ultra.
> I turned vsync on and left the fan profile back to auto because it sounded like a jet taking off and only reached 74C.
> Every other game I play will demand less for vsync and generate less heat. So I know 75C is about my ceiling under max real world load.


Good stuff.


----------



## s74r1

Anyone else notice that GPU #2 has a higher temperature threshold before it downclocks? There seems to be a 5-degree difference. For example, at 85°C on GPU #1 and 80°C on GPU #2, they both run at 1150 here with +129 boost, but if GPU #2 drops to 79°C it goes to 1163. I think I'm going to keep them here, since they don't jump around as much and I'd rather not have a tornado in my computer.

I wish boost could just be disabled in favor of a steady clock. Think that's a possibility in a driver/firmware mod? Or at least removing the downclocking from temperature increases? It would make things much easier, since my GPUs are stable around 1175-ish, but to keep them there I have to risk them boosting higher and becoming unstable.

Edit: I also wonder about the impact of mismatched GPU boost clocks on micro-stutter. I know these cards have a new frame-metering technique, but it still makes me uneasy seeing the two at different clocks.
Edit 2: And I seem to need 13 MHz less on GPU #2 to keep the clocks equal during gaming.
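For what it's worth, Kepler's GPU Boost was widely reported to move clocks in ~13 MHz bins, which matches both the 1150 → 1163 jump and the 13 MHz per-GPU delta noted above. A tiny sketch of that arithmetic (the bin size is taken from reviews of the era, not an NVIDIA spec I can quote):

```python
BIN_MHZ = 13  # reported Kepler boost step size

def boost_clock(base_mhz, bins):
    """Clock after `bins` boost steps above a base clock."""
    return base_mhz + bins * BIN_MHZ

def bins_between(clock_a, clock_b):
    """How many ~13 MHz bins separate two observed clocks."""
    return round((clock_b - clock_a) / BIN_MHZ)

# The jump from 1150 to 1163 MHz observed above is exactly one bin:
print(bins_between(1150, 1163))  # → 1
```

Seen this way, the two GPUs aren't drifting randomly; they're sitting one or more discrete bins apart depending on each die's temperature headroom.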


----------



## thestache

Quote:


> Originally Posted by *s74r1*
> 
> Anyone else notice that GPU #2 has a higher temperature threshold before it downclocks? seems to be 5 degrees difference. For example at 85c GPU1 and 80c GPU2, they both run at 1150 here with +129 boost, but if GPU #2 drops to 79 it goes to 1163. I think i'm gonna keep them here since they don't jump around as much, and I'd rather not have a tornado in my computer.
> I wish boost could just be disabled and have a steady clock. Think this is a possibility in a driver/firmware mod? or at least remove the downclocking from temperature increases? It would make things much easier since my GPU's are stable around 1175ish but to keep it there, it has a chance of boosting higher and becoming unstable.
> Edit: I also wonder the impact of different GPU boosts with regards to micro-stutter. I know these have a new frame-metering technique but it still makes me uneasy seeing both at a different clock.


You can turn GPU Boost off, but benchmarks showed it made no difference to overall performance, just temperatures, because the GPU was working harder. Google it if you still want to try it. Sorry, I can't remember where I read the article.


----------



## s74r1

Quote:


> Originally Posted by *Sexparty*
> 
> You can turn GPU boost off but benchmarks showed it made no difference to overall performance, just temperatures because the GPU was working harder. Google it if you still want to try it. Sorry I cant remember where I read the article.


I just read that article, and I find it hard to believe they did that comparison test properly. If they'd set the card's clocks and voltage to the exact settings the boost was at, it would draw nearly identical power and perform the same. Plus, you can squeeze some extra MHz out of your card without worrying about it boosting above its stability point while it's first warming up.


----------



## Stray_Bullet

Hello everyone. New here, and an owner of a new GTX 690. I recall from the 590 owners club, which I watched diligently btw, that I must show proof of ownership. So here's my card, whether it's necessary or not.








EVGA GTX690 Signature



I have some questions about the 2x 6-pin to 8-pin adapters that came with my card, and whether or not they're necessary. I run a Corsair AX850. So... this...



Or just this??....



I will be here often and can't wait to see how this card performs! So stoked!!!! Thank you everyone


----------



## SimpleTech

While both are correct, I would use the second one, where you aren't using any adapters, since it minimizes cable mess.


----------



## Arizonian

@ Stray_Bullet - as far as I know the red extension cables do not need to be used. I'm using white Bitfenix cables attached to my Corsair AX 850 and haven't had any issues. See Rig pics in Ivy Cruncher BIO.

Actually I use one white and one black to go with the theme of my rig.


----------



## Stray_Bullet

Quote:


> Originally Posted by *SimpleTech*
> 
> While both are correct, I would use the second one where you aren't using any adapters since it minimizes on cable mess.


Quote:


> Originally Posted by *Arizonian*
> 
> @ Stray_Bullet - as far as I know the red extension cables do not need to be used. I'm using white Bitfenix cables attached to my Corsair AX 850 and haven't had any issues. See Rig pics in Ivy Cruncher BIO.
> Actually I use one white and one black to go with the theme of my rig.


Sweet! Thank you both!


----------



## undercoverb0ss

Finally got my GTX 690 yesterday









Using the latest nvidia beta drivers, my best OC is:

135%
100
600

Core won't budge over 100 for 100% stability in Heaven benchmarking.

Here's my best score so far in Heaven:


----------



## Arizonian

Quote:


> Originally Posted by *undercoverb0ss*
> 
> Finally got my GTX 690 yesterday
> 
> 
> 
> 
> 
> 
> 
> 
> Using the latest nvidia beta drivers, my best OC is:
> 135%
> 100
> 600
> Core won't budge over 100 for 100% stability in Heaven benchmarking.
> Here's my best score so far in Heaven:
> 
> 
> Spoiler: Warning: Spoiler!


Hi undercoverb0ss, what a way to start your first post on OCN with a GTX 690.....congrats.









Looking at offsets alone, it's hard to determine where your Kepler Boost clocks are reaching. If your memory offset is set too high, even though it's stable, it may keep the core clock from reaching its potential.

Did you start overclocking with the core clock first, finding stable with no overclock on the memory?

Though memory overclocks do help, the main performance gains seem to come from the core. I saw from your benchmark that you're on an i5 2500K, and your score seems to be where it should be for that setup.

Welcome aboard the club.


----------



## undercoverb0ss

Quote:


> Originally Posted by *Arizonian*
> 
> Hi undercoverb0ss, what a way to start your first post on OCN with a GTX 690.....congrats.
> 
> 
> 
> 
> 
> 
> 
> 
> Looking at offsets is hard to determine where your Kepler Boost clocks are reaching. Your offset for memory if set to high even though it's stable it may keep the Core clock from reaching it's potential.
> Did you start over clocking with your core clock first to find stable with no over clock to the memory?
> Though Memory over clocks do help, the main performance gains seem to come from Core. I saw from your benchmark your on an i5 2500K and your score seems to be where it should be for that set up.
> Welcome aboard the club.


Started with the memory at 0 overclock and was only able to get to +100 core.

OC Scanner shows both GPUs at 1160 MHz.


----------



## Arizonian

Quote:


> Originally Posted by *undercoverb0ss*
> 
> Started with memory at 0 overclock and was only able to get to 100 core.
> Using OC Scanner shows both GPUS at 1160Mhz


Cool then, just checking. Your memory is very nice TBH with a +600 offset; my limit is a +200 offset, giving me a 1594 MHz memory clock.

My core at a +129 offset (1044 MHz base clock) gives me a Kepler Boost of 1175 MHz on GPU #1 and 1201 MHz on GPU #2.


----------



## Shogon

It will be here Tuesday!


----------



## undercoverb0ss

Quote:


> Originally Posted by *Shogon*
> 
> ]
> It will be here Tuesday!


All I did was read this thread constantly while I was waiting for mine


----------



## Kyouki

Just wondering: after a restart, do you guys have to re-apply your OC in Precision X? Mine always defaults back to 100% / +0 / +0, and I have to go to Profiles, select mine, and apply. I would have thought it would boot with whatever my last-used profile was.


----------



## Anzial

Quote:


> Originally Posted by *Kyouki*
> 
> Just wondering after a restart of the computer do you guys have to re-OC with precisionX mine always defaults back to 100%/0+/0+ and i have to go profiles then select it and apply. I would think it would just boot with what ever my last used profile is.


Check the "Windows startup" option.


----------



## Stateless

Sorry that I have not been able to post my watercooled GTX 690 results, but I have been busy. I was finally able to get to the end of my OC on H2O, and the results were pretty significant versus what I could do on air.

On air, the best I could do was +130 on the core and +200 or so on the memory. With H2O, I am 100% stable at +160 on the core and the memory at a whopping +750! I was shocked to see the memory go that high and be fine. I did figure out a quick way to check whether the memory offset is set too high: just move the slider in Precision X to some high offset, and as soon as you hit apply it will artifact right away on the desktop. When I did this with a high offset it instantly corrupted and I got a "display driver not working" error. It did not do this at +775, but shortly after launching Unigine it did crash/corrupt.

In the end, it is interesting that this card can go much farther on water than on air, even when air temps were under control. As stated in my earlier posts, I had outstanding air cooling in my case, and anything above +130 on the core would crash even with temps not breaking 70°C. I just think temps play a part in how these cards clock. While we can say that about any card, I think Kepler is designed so that when temps are really low under 100% load it can clock higher, because of the distance between where it is and the 70°C threshold where it begins to downclock a little. My temps never break 39°C on one GPU, while the other never broke about 35°C, and this is with a few hours of Unigine Heaven and some gaming.

Will post Unigine benchmark scores a little later, but I'm getting a solid 1202 MHz on both GPUs!!!


----------



## 4ofus

Finally got a Heaven score. I was hoping for over 100 fps, but I'll take it. Everything in my system is running at stock speeds.


----------



## Arizonian

Quote:


> Originally Posted by *Stateless*
> 
> Sorry that I have not been able to post results of my Watercooled GTX-690 results, but have been busy. I was able to finally get to the end of my OC with H20 and results were pretty significant versus what I was able to do on Air.
> Under Air, the best I could do is a +130 on the Core & +200 or so on the memory. With H20, I am 100% stable at +160 on the core and the memory at a wopping +750! I was shocked to see the memory go that high and be fine. I did figure out a quick way to do check to see if too much is put for the memory offset. Just move your slider in Precision X to some high offset and as soon as you hit apply it will artifact right away on the desktop. When I did this with a +100 it instantly corrupted and got a display driver not working error. It did not do this at +775, but shortly after launching Unigine it did crash/corrupt.
> In the end, it is interesting that this card can go much farther on Water than on Air, even when Air temps were under control. As stated earlier in my posts, I had outstanding Air Cooling in my case and anything abotu +130 on the Core would crash, even with temps not breaking 70c. I just think the way they are made is temp's do have a play in how they clock. While we can say that about any card, I think Kepler is designed in a way that when temps are really low under 100% load that it allows it to clock higher because of the distance between the where it is at and the 70c threshold where it begins to downclock a little. My Temps never break 39c on GPU, while GPU never broke about 35c and this is with a few hours of Unigine, Heaven and some gaming.
> Will post Unigine Benchmark scores a little later, but getting a solid 1202 Mhz on both GPU's!!!


A very informative observation between air and water. Kepler's 28 nm fabrication showed similar findings to when Tahiti first went under water: it helped, but not as much as going from air to water did last gen. It seems 28 nm has its own breed of characteristics that are similar across the board.

Your offsets are like mine, +129 Core / +184 Memory; on air I get 1175 on GPU #1 and 1201 on GPU #2 with Kepler Boost. I'm impressed both of yours broke 1200 on a dual-GPU card. WTG!







I'll take both above 1200 MHz all day long.









I'll be looking forward to your Heaven 3.0 bench and 3DMark 11 Performance scores. Don't forget to post them in the bench-off threads too if you do well.


----------



## ReignsOfPower

I gave up waiting for the EK waterblock and just picked up the Koolance one instead. I actually really hate the port design of the new EK blocks; it makes you use a bracket and then 90-degree fittings to hook it up to the CPU loop and everything.

Anyway, I'll pick up an EVGA backplate when they become available. For the time being I'll just rock it bare.


----------



## thestache

Installed my Koolance waterblock and am having some issues.

Seems like the PCB is warped but didn't even tighten the screws that much, one GPU is reaching temps of 80deg whilst the other is a perfect 35deg and I'm getting a horrible buzzing sound from the card or the motherboard CPU directly behind it (not sure). Have I somehow seated the waterblock wrong even though it seems tight and to have contacted right?

I am getting a lot of heat from the PLX chip area and the GPU1 area on the back of the card, so would those be the problem areas? I'm super confused. Anyone got any knowledge they can throw my way to help me figure this out?


----------



## juanP

How many rads are required at a minimum to cool quad-SLI 690s?


----------



## Romin

Well, I RMAed mine for a refund, since it was faulty out of the box. Will go for a 680 Classified or two!


----------



## MrTOOSHORT

Quote:


> Originally Posted by *juanP*
> 
> How many rads are required at a minimum to cool quad-SLI 690s?


Usually it's 120mm of rad per component (per GPU), so a 480mm rad minimum is required for the four GPUs. For a 3770K and two GTX 690s, I'd go one 480 and one 240 for good cooling, IMO.


----------



## Fallendreams

Yay, I can't wait!


----------



## Arizonian

Quote:


> Originally Posted by *Fallendreams*
>
> Yay, I can't wait!


It is a good feeling.







Post a pic when you get it and let us know how it does for you. If you have any questions regarding overclocking etc., you know where to come.


----------



## Fallendreams

Quote:


> Originally Posted by *Arizonian*
> 
> It is a good feeling.
> 
> 
> 
> 
> 
> 
> 
> Post a pic when you get it and let us know how it does for you. If you have any questions regarding overclocking etc., you know where to come.


Will do! Thank you


----------



## Divineshadowx

I was thinking about selling my 3770K and mobo and getting a 3930K/3960X with an X79 board. Anyone think this is worth it? I feel the 3770K + Z77 is a bottleneck for quad-SLI 690s. Suggestions please.


----------



## Tslm

Quote:


> Originally Posted by *Divineshadowx*
> 
> I was thinking about selling my 3770k and mobo and getting a 3930k/3960x with an x79. Anyone think this is worth it? I feel the 3770k + z77 is a bottleneck for quad 690. Suggestions please.


I'd stick with what you have tbh unless you mostly just benchmark. That's a sweet setup!


----------



## lacrossewacker

Coming to this thread is like visiting the gods on Mount Olympus. I am not worthy!


----------



## emett

lol


----------



## Fallendreams

Quote:


> Originally Posted by *lacrossewacker*
> 
> coming to this thread is like visiting the gods on Mount Olympus. I am not worthy!


Yeah, my e-peen is pretty much huge now.


----------



## Shogon

<3 PLX chip

So far I'm at a 135% power target, +125 GPU offset, and +250 mem offset. Still overclocking this card, and only a max temp of 76C so far!


----------



## Lu(ky

I have been waiting for weeks now for the EK GTX 690 FB to come to the US. I might just pull the trigger on the XSPC Razor block instead. Does anyone know if this Razor block is restrictive or has any problems?
Thanks


----------



## Divineshadowx

Quote:


> Originally Posted by *Lu(ky*
> 
> I have been waiting for the eK GTX 690 FB to come in to the US for weeks now. I might just pull the trigger on the XSPC Razor block instead does anyone know if this Razor block is restricted or have any problems?
> Thanks


Is the EVGA one bad? I know EVGA isn't really focused on cooling but it looks decent.


----------



## Shogon

I believe the EVGA block is made by Swiftech; I have not had any experience with them. Both the Hydro Copper and Razor blocks look nice, and I think they will cool within a few C of each other. I'm split between those two.


----------



## Lu(ky

Quote:


> Originally Posted by *Shogon*
> 
> I believe the EVGA block is made by Swiftech, I have not had any experience with them. Both the HydroCopper and Razor block look nice, I think they will both cool within a few C of each other. I'm split between those 2.


Yeah, for the price of a full acrylic/nickel block, the EK website wants $180.00 + VAT. Ouch.







And they have the new design with the circles on it which I did not want..









Edit: just found the block at the FCPU.com site for $155.00. Hmm, should I jump on it...


----------



## egotrippin

The EVGA block is made by Swiftech. I had the EVGA GTX 580 Classified Hydro Copper previously and it was a pretty good card with low temps, but my 690 with the Koolance block runs even lower. I'm getting temps of 27C when browsing the web/watching videos, 40C gaming, and a top of 51C benchmarking while overclocked.


----------



## Divineshadowx

Well, I'm sure the 690 runs cooler than the 580 on both air and water. I haven't heard positive reviews about Koolance; is it decent? I'm going to watercool my system, my new 3930K and both 690s, and need a good waterblock.


----------



## pilla99

Man, this thing is running in the 82C range with the ambient temp at about 75F.
I have it OCed at a decent rate, but I'm not sure whether I should be concerned about that temp.

The only game that can bring it to that point is of course BF3.

Everything else I play keeps it in a cool 50-60 range.

Edit: Also this post is to test the new keyboard and mouse. Me gusta.


----------



## Arizonian

Quote:


> Originally Posted by *egotrippin*
> 
> The EVGA block is made by swiftech. I had the EVGA GTX 580 Classified Hydro Copper previously and it was a pretty good card with low temps but my 690 with the koolance block is even lower. I'm getting temps when browsing the web/watching videos etc of 27c, gaming at 40c and benchmarking when overclocked topped out at 51c.


Nice looking block on the 690 and clean set up in your case.


----------



## egotrippin

Quote:


> Originally Posted by *Divineshadowx*
> 
> Well the 690 im sure runs cooler than the 580 on air and water. I havn't heard positive reviews about koolance, is it decent? I'm going to water cool my system, my new 3930k and both 690's, and need a good waterblock.


That's going to be a hot CPU. The waterblock I have on my CPU is an Apogee HD by Swiftech. Check out martinsliquidlab.org for reviews and testing. The Swiftech Apogee HD is supposed to be one of the best, if not THE best, for watercooling your CPU, and there are 3 ports for outgoing flow so your CPU block can work like a manifold and give you more options for routing your loop.



As far as comparisons between 690 GPU waterblocks, I haven't yet seen any. Koolance was a first mover in this area and released their block about a month before anybody else. I have a bunch of Koolance products and they are all solid. They've been in the business a long time and have excellent customer service.


----------



## egotrippin

Quote:


> Originally Posted by *Arizonian*
> 
> Nice looking block on the 690 and clean set up in your case.


Thanks dude! I'm not done yet, I still have to finish making new psu cables and sleeving them.

Those are some wicked white fins on your RAM, did you paint those yourself?


----------



## Arizonian

Quote:


> Originally Posted by *egotrippin*
> 
> Thanks dude! I'm not done yet, I still have to finish making new psu cables and sleeving them.
> Those are some wicked white fins on your RAM, did you paint those yourself?


Will be looking forward to seeing a pic of your rig completed here.









Yes, I painted the RAM. My signature has a link to the job. Pretty easy. The last finishing touch will be the EVGA GTX 690 backplate, due out in approximately another week.


----------



## egotrippin

Quote:


> Originally Posted by *Arizonian*
> 
> Will be looking forward to seeing a pic of your rig completed here.
> 
> 
> 
> 
> 
> 
> 
> 
> Yes I painted the RAM. Signature has a link to the job. Pretty easy. Last finishing touch will be the EVGA GTX 690 back plate due out in another week approximately.


Backplate?! I could use one of those; the PCB is starting to deform under its own weight.


----------



## Shogon

Quote:


> Originally Posted by *pilla99*
> 
> Man this thing is running in the 82c range with the ambient temp at about 75
> I have the OC a decent rate but I am not sure if I should be concerned or not about that temp.
> The only game that can bring it to that point is of course BF3.
> Everything else I play keeps it at a cool 50-60 range.
> Edit: Also this post is to test the new keyboard and mouse. Me gusta.


Nothing to worry about. I've hit 91C in EVGA OC Scanner X testing an overclock; it ran for an hour and never crashed or artifacted. I've hit close to that in Shogun 2 and around 75C in BF3.


















I'm going to get the Razor block to match my Raystorm, so in a week or so I'll see how it performs.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Well the 690 im sure runs cooler than the 580 on air and water. I havn't heard positive reviews about koolance, is it decent? I'm going to water cool my system, my new 3930k and both 690's, and need a good waterblock.


Works really well.

My GTX 690 with the Koolance block idles around 25 and under load I have not seen over 40deg so far. That's with a 1200MHz core overclock, 3305MHz on the memory, and a mild overclock on the CPU. I'm running pump>140mm rad>GTX 690>280mm rad>3930k>res.


----------



## egotrippin

Quote:


> Originally Posted by *Sexparty*
> 
> Works really well.
> My GTX 690 with koolance block idles around 25 and under load I have not seen over 40deg soo far. Thats with a 1200mhz core overclock and 3305mhz on the memory, mild overclock on the CPU. I'm running pump>140mm rad>GTX 690>280mm rad>3930k>res.


Gaming, my GPU with the koolance block is hanging around 37-41c and during light use it's around 27c.

Can you run the Heaven benchmark with all the options maxed out and set the resolution to 1920x1080? It's a free download if you haven't used it before. My 690 hit 51c while running this benchmark. I'd be curious to see how yours performs. Here's my results with no overclock:


----------



## ReignsOfPower

Here's mine









Temps are indeed lower with the GTX 690 than the GTX 590; about 10C lower at idle.


----------



## thestache

Quote:


> Originally Posted by *egotrippin*
> 
> Gaming, my GPU with the koolance block is hanging around 37-41c and during light use it's around 27c.
> Can you run the Heaven benchmark with all the options maxed out and set the resolution to 1920x1080? It's a free download if you haven't used it before. My 690 hit 51c while running this benchmark. I'd be curious to see how yours performs. Here's my results with no overclock:


Pump pretty much exploded coolant everywhere two days ago so can't do anything with it until it's all fixed.

But going off the Heaven benchmark thread's settings, I'm scoring 2742 with my GTX 690 system that's running on air (which is fairly decent without a proper CPU overclock). Can't remember what the last score for the watercooled rig was, but it wasn't as high since the 3930K was pretty much running stock.

http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-3-0-scores


----------



## atibingler

Hey owners, just wondering: about how long will the 690 hold the ultra graphics standard, in years? 5?


----------



## egotrippin

Quote:


> Originally Posted by *atibingler*
> 
> hey owners.. just wondering about how long will the 690 hold the ultra graphics standard in years? 5?


I'm not an expert in these things, but it's been just over a year since the release of the 590, which was two 580s on one board. I think the 580 came out two years ago, so if they keep refreshing the product line every two years, that's about how much time until there are GTX 780s and, a couple months after that, 790s. That's just my barely educated guess from a couple of Google searches I did before I bought mine.


----------



## Fallendreams

This baby came in on Friday! I love it!


----------



## tonyjones

nice Fallendreams! I want!


----------



## Fallendreams

Quote:


> Originally Posted by *tonyjones*
> 
> nice Fallendreams! I want!


Thanks


----------



## Divineshadowx

Quote:


> Originally Posted by *Sexparty*
> 
> Works really well.
> My GTX 690 with koolance block idles around 25 and under load I have not seen over 40deg soo far. Thats with a 1200mhz core overclock and 3305mhz on the memory, mild overclock on the CPU. I'm running pump>140mm rad>GTX 690>280mm rad>3930k>res.


Those are some nice temps, but I would prefer a full block; seeing half of it exposed seems a bit weird to me, could be different for you though. I see you have a single loop and one rad? Maybe more, but that's all I can tell from the pic. Do you think it would be better to run two loops? I've got two 690s, so more heat than your single one, and my Cosmos 2 supports a 360 rad up top and two 240s on the bottom.


----------



## atibingler

Quote:


> Originally Posted by *egotrippin*
> 
> I'm not an expert in these things but it's been just over 1 year since the release of the 590 which was two 580s on one board. I think the 580 came out two years ago so if they keep refreshing the product line every two years I'd say that's how much time until there's GTX 780s and a couple months after that, 790s. That's just my "barely" educated guess from a couple google searches I did before I bought mine.


No problem, thanks for the answer.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Those are some nice temps, but I would prefer a full block, seing half of if seems a bit weird to me, could be different for you though. I see you have a single loop, and 1 rad? Maybe more but it is all i can tell from the pic. Do you think it would be better for 2 loops? I got 2 690's so more heat than your single one, and my Cosmos 2 supports a 360 rad top, and x2 240's on the bottom.


It is a full-cover waterblock; you just see a tiny bit on the sides, which is just Koolance's style.

http://koolance.com/image/cache/data/products/vid-nx690_p2-700x700.jpg



I have a single loop.

No need for dual loops, just a properly set-up system... Mine goes pump>140mm rad>GTX 690>280mm rad>3930k>res. As a general guide, you use 120mm of rad to cool each CPU or GPU, so I'd recommend a 360 or bigger on top for GTX 690 SLI. I'd set it up the same as mine, just with a 360mm up top and a 240mm on the bottom, and you'll never have any problems.

Always go component, then rad, component, then rad like my picture so you're always cooling the water after it's been heated by a component.
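The 120mm-of-rad-per-component guideline above reduces to simple arithmetic; here is a quick sketch (the `min_rad_mm` helper and its 120mm headroom default are my own hypothetical choices, not a real heat-load calculation):

```python
# Rough radiator sizing from the forum rule of thumb:
# ~120 mm of radiator length per water-cooled GPU core or CPU.

def min_rad_mm(gpus, cpus=1, headroom_mm=120):
    """Minimum total radiator length in mm, plus optional headroom."""
    return (gpus + cpus) * 120 + headroom_mm

# GTX 690 SLI is four GPU cores; add a 3930K and a little headroom:
print(min_rad_mm(gpus=4))  # 720 -> e.g. a 360 up top plus a 240 and a 120
```

By this rule a single 690 plus CPU fits comfortably under a 360 alone, which lines up with the 360-top/240-bottom suggestion above for SLI.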


----------



## ReignsOfPower

Quote:


> Always go component, then rad, component, then rad like my picture so you're always cooling the water after it's been heated by a component.


While this may make a small difference temperature-wise (remember, the coolant probably passes over the same point many times a minute depending on flow rate), it's not going to overcome the thermal limitations of your loop. I would rather make an aesthetically appealing loop than be limited to the block - radiator - block - radiator routine.

Let's say you go radiator - block - block - radiator (like I have, as per the picture below): you may see a few degrees' difference on some components, but the overall heat dumped into the water will be 100% identical. My temperatures are sitting at 30C on the CPU and 27C on the cores of the 690 with a 240mm rad and a 200mm rad. And no, the fans are not screaming: 2x 140mm @ 1200rpm and 1x 180mm @ 1150rpm.










UPDATE - Gave 3DMark 11 a whirl; here is my score with a quick overclock that I found to be stable from another user here. Temps on the GPU did not exceed 39C.
http://3dmark.com/3dm11/3843864 - P17735 +135% Power +100 GPU +500 Mem

Is this any good as far as GTX 690 clocks are concerned? I haven't been paying close attention to overclocking this card.


----------



## theyedi

Question about having two monitors set up. In the NVIDIA Control Panel, it shows that an individual GPU will drive each screen independently. Does that mean that only one GPU will be used per game running on one screen (a game that would normally use both GPUs if only one monitor was hooked up)?


----------



## dboythagr8

Quote:


> Originally Posted by *theyedi*
> 
> Question about having 2 monitors set up. In the nvidia control panel, it shows that an individual GPU will drive each screen independently. Does that mean that only one GPU will be used per game running on one screen (a game that would normally use both GPUs if only one monitor was hooked up)?


Yes. I have two monitors plugged into the 690. When gaming, it only uses one of the GPUs. In order to use both GPUs, you have to go into the NVCP and click "Maximize 3D performance". That turns off the 2nd monitor and frees up that GPU to be used in SLI.


----------



## theyedi

Quote:


> Originally Posted by *dboythagr8*
> 
> Yes. I have two monitors plugged into the 690. When gaming it only uses one of the GPUs. In order to use both GPUs, have to go into NVCP and click "Maximize 3D performance". That turns off the 2nd monitor and frees up that GPU to be used in SLI.


That... sucks. Single cards can render two screens while simultaneously running a game on one screen at 100%, yet a dual-GPU card can't do the same?


----------



## ReignsOfPower

That is not true at all. You can have SLI enabled and drive a main monitor + an off-screen display with full 3D SLI acceleration. I do it day in, day out.


----------



## pilla99

I read a thread that had advice on running dual monitors with both GPUs...
Something like: the top-left DVI is on the first GPU.

The top-right and bottom DVI are on the second. Plug the monitors into the top-right and bottom ports for multi-GPU use... haven't tried it myself yet.

Thread here : http://www.evga.com/forums/tm.aspx?m=1657353&mpage=1


----------



## Sujeto 1

Guys, can I ask: do any of you plan on getting a second GTX 690 in the near future?


----------



## Qu1ckset

Quote:


> Originally Posted by *Sujeto 1*
> 
> Guys, can I ask: do any of you plan on getting a second GTX 690 in the near future?


I was thinking about it, but I think I'm just going to wait until Maxwell!


----------



## Arizonian

I'm more than fine with a single 690 and will keep it for two years before handing it down to the second rig. Then it's Maxwell 890, or 880 SLI.


----------



## egotrippin

Quote:


> Originally Posted by *Sujeto 1*
> 
> Guys, can I ask: do any of you plan on getting a second GTX 690 in the near future?


Yes. It's not like I could practically use any of that additional power... *but I want it anyway*.

It's like a poor man's hot rod, except I can't drive it anywhere and it's not going to get me laid.


----------



## dboythagr8

Quote:


> Originally Posted by *ReignsOfPower*
> 
> That is not true at all. You have have SLI enabled and drive a main monitor + an off screen with full 3D SLI acceleration. I do it day in day out.


Then what am I setting wrong?

Right now I'm typing on my U3011, with my second monitor next to me. In the CP I have to click "Activate all Displays" in order for the second monitor in the GPU diagram below to turn from grey to green. If I click on "Maximize 3D performance", the secondary monitor goes back to being greyed out, with just the U3011 showing as green in the picture...

Went here: http://www.geforce.com/hardware/technology/sli/system-requirements

Clicked 690 > SLI Mode > SLI Multi-monitor Mode. The "Find configuration" button is greyed out.

With 690->Multi-GPU->Multi-GPU on, I can click the Find config button. I did that and changed the placement of one of my DVI connectors to match the picture.

Going back to the CP, Max 3D Perf is selected, but 'Span Displays with Surround' and 'Activate All Displays' are unavailable. The only other selectable option is "Disable Multi-GPU".

EDIT: Seems I got it to work; I had to change a setting in Windows. So now I am typing this on my secondary monitor while BF3 is running on the U3011. The issue now is that whenever I click on the secondary monitor to do whatever, it takes BF3 out of fullscreen mode and runs it in a window. I have to click back on the game window for it to return to full size. Is this normal?


----------



## ReignsOfPower

If you stick both your U3011 and secondary monitor on GPU A, you should be able to have both displays driven with SLI enabled. What you are experiencing is normal. If you want to freely move between your game and the offscreen display, the game has to be in windowed mode (or "windowless fullscreen"); a lot of MMOs have this feature, I know.

GPU A should give you 2x DVI ports and 1x DP. If you wanted a 4th monitor then you would need GPU B to drive a display, which would of course require you to disable SLI.


----------



## thestache

Quote:


> Originally Posted by *ReignsOfPower*
> 
> Quote:
> 
> 
> 
> Always go component, then rad, component, then rad like my picture so you're always cooling the water after it's been heated by a component.
> 
> 
> 
> While this may make a small difference temperature wise (remember the flow probably passes over the same point many times a minute depending on flow rate) it's not going to overcome the thermal limitations of your loop. I would rather make an aesthetically appealing loop than be limited to the Block - Radiator - Block - Radiator - routine.
> Lets say you go, Radiator - Block - Block - Radiator (like I have as per the below picture) you may see a few degrees differences on some components, but the overall heat dumped into the water temperature will be 100% identical. My temperatures are sitting at 30C on the CPU and 27C on the cores of the 690 with a 240mm rad and a 200mm radiator. And no, the fans are not screaming, 2x140mm @ 1200 rpm and 1x180mm @ 1150rpm
> 
> 
> 
> 
> 
> 
> 
> 
> UPDATE - Gasve 3DMark 11 a whirl - here is my score with a quick overclock that I found to be stable from another user here. Temps on GPU did not exceed 39C.
> http://3dmark.com/3dm11/3843864 - P17735 +135% Power +100 GPU +500 Mem
> Is this any good as far as GTX690 clocks is concerned? I haven't been paying close attention to overclocking this card.

So you'd rather have a loop with lower performance simply because of a personal preference in aesthetics... Interesting. Especially since my case is nothing but great looking in every way. Spending $600-1000 on a custom watercooling loop that doesn't cool your PC as well as it can because it isn't set up for optimal performance seems silly to me. Layout, theme, and components are far more important to a watercooling loop's aesthetics than an extra tube.

But anyways, I've got two GTX 690s; one is running 135% +165 core +350 memory and the other 135% +170 core +300 memory. One's under water, one isn't. So you should have more headroom on your GPUs: +150 core at least, I'd say. Memory, I haven't had enough time to find the sweet spot yet, but your 3DMark score is good. You should be able to break 18000 without too much trouble.


----------



## ReignsOfPower

^ Your loop looks neat in its current arrangement. The amount of spaghetti mess I've seen is just what leads me to say that you don't necessarily need to worry about the flow direction or arrangement of components. You can easily go GPUs first, then CPU, if you want one loop. It's all much of a muchness. It's more important to have adequate radiators and a pump up to the job.

Thanks for the tips on the GPU overclocking. I'll give it a whirl.

UPDATE - 3DMark 11 - +135% Power +150 Core +500 Mem (1.175V)
P18255 - http://3dmark.com/3dm11/3851795


----------



## dboythagr8

Quote:


> Originally Posted by *ReignsOfPower*
> 
> If you stick both your U3011 and secondary monitor on GPU A, you should be able to have both displays driven with SLI enabled. What you are experiencing is normal. If you want to freely move between your game and and the offscreen, the game has to be in windowed mode (Or 'Windowless fullscreen") Alot of MMO's have this feature I know.
> 
> GPU A should give you 2x DVI ports and 1x DP. If you wanted a 4th monitor then you would need GPU B to drive a display. That would then of course need you to disable SLI.


I plugged the monitors in according to the NVIDIA diagram for multi-GPU, dual-screen. I originally had both monitors plugged into the DVIs that you refer to as GPU A. That was wrong according to the CP; it recommended that I move one monitor to the DVI next to the DisplayPort, and the second monitor to the DVI-D.

That's what I did, and after adjusting settings under Windows resolution it now works as it should.

Sent from my EVO 4G using Tapatalk 2


----------



## Divineshadowx

Can anyone recommend a good case for watercooling? The Cosmos 2 pisses me off; it's probably the biggest case ever made, and I measured it today: from the top to the motherboard is about 45mm... a normal 120mm fan is 25mm thick, which leaves room for a 20mm rad... A 20MM RAD FOR A CASE THAT WEIGHS 50 POUNDS AND IS AS TALL AS MY DESK. Looking for something not more than $300, and I'd prefer a window because 690s are sexy and so is tubing, plus some room up top for a rad and room for rads on the bottom too. Thanks.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Can anyone recommend a good case for water cooling. The Cosmos 2 pisses me off, its the probably the biggest case ever made, and i measured it today and from the top to the motherboard is about 45mm.... a normal 120mm fan is 25mm, that leaves room for a 20mm rad.... A 20MM RAD FOR A CASE THAT WEIGHS 50 POUNDS AND IS AS TALL AS MY DESK... Looking for something not more than 300 and I prefer a window because 690s are sexy and so is tubing, and also some room on the top for a rad... and some rads on the bottom also, Thanks


NZXT Switch 810. 360mm or 420mm up top and 240mm on the bottom, both with enough room for push/pull on a 60mm rad.


----------



## egotrippin

Quote:


> Originally Posted by *Divineshadowx*
> 
> Can anyone recommend a good case for water cooling. The Cosmos 2 pisses me off, its the probably the biggest case ever made, and i measured it today and from the top to the motherboard is about 45mm.... a normal 120mm fan is 25mm, that leaves room for a 20mm rad.... A 20MM RAD FOR A CASE THAT WEIGHS 50 POUNDS AND IS AS TALL AS MY DESK... Looking for something not more than 300 and I prefer a window because 690s are sexy and so is tubing, and also some room on the top for a rad... and some rads on the bottom also, Thanks


Silverstone TJ07 - I have a 480 and a 120 radiator in the bottom (60mm thick each), and there's room at the top for another 360.


----------



## Anzial

Quote:


> Originally Posted by *Sexparty*
> 
> So you'd rather a loop that has lower performance simply because of a personal preference in aesthetics...


If the water flow is fast enough, there will be no performance penalty for going either way. If you really want maximum performance, you need to set up separate loops for each GPU and CPU, mobo, etc. But you aren't doing that, so I wonder why?

Let me ask you another question: do you really believe that a few degrees' difference is going to let you OC the GPU another 100-200MHz? I doubt it. What needs the cool water the most is your CPU; GPUs do not scale all that well, and if you place them after the CPU in the loop, they'll do about the same as if they were after a rad. The CPU, on the other hand, might suffer a little loss in OCing headroom because it won't be getting all the cool water it would otherwise.


----------



## 4ofus

Quote:


> Originally Posted by *Sujeto 1*
> 
> Guys, can i ask if you ever had a plan into getting a second GTX 690 at near future?


I've thought about it several times; I just cannot justify it for my use. I'm a single-monitor gamer. Many would argue my old GTX 680 was more than enough. My GTX 690 is just overkill, but I like it that way. lol


----------



## Divineshadowx

Quote:


> Originally Posted by *Sexparty*
> 
> NZXT Switch 810. 360mm or 420mm top and 240mm bottom, both with enough room for push pull on a 60mm rad.


Are there any decent static-pressure 140mm fans? I know for some reason Corsair doesn't have their SP series in 140mm.


----------



## burningrave101

EVGA GTX 690 $999.99 in stock:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814130781


----------



## thestache

Quote:


> Originally Posted by *Anzial*
> 
> If the waterflow is fast enough, there will be no performance penalty for going either way. If you really want maximum performance, you need to setup separate loops for each GPU and CPU, mobo etc. But you are doing it that, why I wonder?
> Let me ask you another question - do you really believe that a few degrees difference are gonna let you oc the GPU another 100-200mhz? I doubt it. What needs the cool water the most is your CPU, GPUs do not scale all that well - and if you place them after CPU in the loop, they'll do the same as if they were after a rad. CPU, on other hand, might suffer a little loss in oc'ing headroom because it won't be getting all the cool water it would otherwise.


If you want to argue that a certain "look" justifies not setting up your system to maximize performance because of one tube, then by all means go ahead. But since I don't care to hear about it, go tell someone else. It's a waste of my life hearing about how people want to spend thousands of dollars on watercooling loops and then not have them work as well as they can because of one insignificant tube.

It is important to me that my CPU be as cool as possible because I overclock them as high as they can possibly go within an acceptable temperature range. My last CPU was at 5000MHz and this one will be too; something only achievable by watercooling, and still limited by how good the loop is.

If you however don't overclock anything high enough to take advantage of the watercooling loop in your system, then that's fine, but next time save yourself $500-1000 and stick with air, because you're just wasting your money and time.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Are there any decent static pressure 140mm fans? I know for some reason corsair doesn't have their sp series in 140mm.


Corsair's new fans aren't great anyway; they just look good. Better fans are out there that are much cheaper. As for good 140mm fans, most have good pressure because of their size and run at lower RPM.


----------



## ReignsOfPower

Quote:


> Originally Posted by *Sexparty*
> 
> If you want to argue a certain 'look' justifies not setting up your system to maximize performance because of one tube, then by all means go ahead. But since I don't care or want to hear about it, go tell someone else. It's a waste of my life hearing about how people want to spend thousands of dollars on watercooling loops and then not have it work as best as it can because of one insignificant tube.
> It is important for me that my CPU be as cool as possible because I overclock them as high as they can possibly go within an acceptable temperature range. My last CPU was at 5000mhz and this will be too. Something only achievable by watercooling and still limited by how good the loop is.
> If you however don't overlock anything high enough to take advantage of the watercooling loop in your system then that's fine but next time save your self $500-1000 and stick with air because you're just wasting your money and time.


LOL mate, we're not having a go; no need to be so defensive. We're not trying to "waste your life" by having a chat about loops. Plus, we already said your loop looks good in its current configuration, so I think that speaks for itself.

Anzial is 100% correct in his statement, by the way. It really means sweet F.A. Sorry if you take that to heart as well (in advance).

http://en.wikipedia.org/wiki/Laws_of_thermodynamics


----------



## Blaze0303

690s are now in stock on Newegg if anyone is interested.


----------



## ceteris

Yep! EVGA has Signature and Hydro Copper Signature up too.


----------



## MACH1NE

Any plans on releasing a 690 Lightning?


----------



## Arizonian

Ok gentlemen let's keep calm. Not agreeing with each other is fine and debating is welcome, just not directly personal please.

We're all club members and here to help each other.


----------



## egotrippin

Has anybody here tried using their 690 for folding? After I enabled the beta client for Kepler support and started running on the 690, I immediately heard a high-pitched whine that kind of freaked me out. I don't hear this sound when gaming or benchmarking. I adjusted the power target and underclocked it to see if the sound would change, and predictably, the slower the clock speed, the lower the pitch. Likewise, overclocking made the pitch go higher. This sounds worrisome to me. I've decided not to attempt folding until they have a new version. GPUs aren't supposed to squeal like a blown capacitor.


----------



## Shogon

Quote:


> Originally Posted by *Sujeto 1*
> 
> Guys, can i ask if you ever had a plan into getting a second GTX 690 at near future?


Not really; that's why I ordered the LZP-650, so it's impossible for me.









It would be cool but SLI is plenty enough for me.

Could be coil whine, ego. I had that a bit when I first started this beast up in gaming. Haven't heard it since I stuck it in. But I have not tried folding on it yet, and I won't until there is a stable client for it.


----------



## fat_italian_stallion

Add me to the list please. Finally got to pick up my 2nd card after dealing with EVGA only sending one per household. 2x EVGA 690 Signature Editions


----------



## FiShBuRn

I'm trading my 580 SLI for one GTX 690 but I have a doubt... ASUS or MSI (EVGA is out of stock)? Or are they just the same?

Thanks


----------



## Arizonian

Quote:


> Originally Posted by *FiShBuRn*
> 
> Im trading my 580 SLI for one GTX 690 but i have i doubt...ASUS or MSI (EVGA is out of stock)? Or is just the same?
> Thanks


With reference cards it's the same, except for customer service and which vendor you prefer.


----------



## Lu(ky

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> Add me to the list please. Finally got to pick up my 2nd card after dealing with EVGA only sending one per household. 2x EVGA 690 Signature Editions


Nice setup there...


----------



## ReignsOfPower

What have other water overclockers managed? Let's collaborate and gauge just how much more we can push than the aircoolers.

So far I've managed the following:
135% power
+180 Core Offset
+500 Memory Offset.

I've set the VGPU to 1.175V which is the highest it lets you go in EVGA Precision X.

This all translates to 1095MHz / 1228MHz Boost on the core
1752 MHz (QDR) = 7GHz on the Memory

Temperatures sitting at 23C Idle / 35-42C Load in my system. 16C ambient.

This all translated into approximately P18300


----------



## egotrippin

Quote:


> Originally Posted by *ReignsOfPower*
> 
> What have other water overclockers managed? Lets collaborate and gauge just how much more we can push than the aircoolers.
> So far I've managed the following:
> 135% power
> +180 Core Offset
> +500 Memory Offset.
> I've set the VGPU to 1.175V which is the highest it lets you go in EVGA Precision X.
> This all translates to 1095MHz / 1228MHz Boost on the core
> 1752 MHz (QDR) = 7GHz on the Memory
> Temperatures sitting at 23C Idle / 35-42C Load in my system. 16C ambient.
> This all translated into approximately P18300


I was about to propose the same question. I've done a few searches to see others' overclock settings and, predictably, found conflicting information. One person said that the voltage is already maxed out when boosting so there's no point. Another said not to bother overclocking the memory since most of the gains come from the GPU clock offset, and another review I read only set the power target to +120% when overclocking.

I'm a bit hesitant to OC my card; my screen went black and the system froze up last time I overclocked it and I had to hard reset.
Windows wouldn't boot so I had to re-install.


----------



## ceteris

I'm still waiting on EVGA backplate and WC Heatkiller to be released









... or at least the backplate so I can settle with another block. I don't want to have to take out the card more than once


----------



## fat_italian_stallion

Quote:


> Originally Posted by *egotrippin*
> 
> I was about to propose the same question. I've done a few searches to see others overclock settings and, predictably, found conflicting information. One person said that the voltage is already maxed out when boosting so there's no point. Another person said not to bother overclocking the memory since most of the gains come from GPU clock offset and another review I read showed them only setting the power target at +120% when overclocking.
> I'm a bit hesitant to OC my card, my screen went black and the system froze up last time I overclocked it and I had to hard reset.
> Windows wouldn't boot so I had to re-install.


I've been hitting a wall regardless of increasing the voltage or not, especially when increasing the mem. 185 is the highest I can get core with no mem offset. Best compromise so far is 160 core 450 mem.


----------



## ReignsOfPower

Yeah, my wall is pretty much what I posted above, but if you pull back the memory overclock you can get more on the core, which will bring higher scores than a 7GHz memory overclock. I can't break 18.4k in 3DMark 11. I might be able to get more if I pushed the CPU more, but I have no intention of going over 4.6.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *ReignsOfPower*
> 
> Yeah my wall is pretty much what I posted above, but if u pull back memory overclock u can get more on the core which will bring higher scores than a 7GHz memory overclock. I can't break 18.4k in 3D Mark 11. I might be able to get more if I pushed CPU more but I have no intention of going over 4.6


It seems the sweet spot for benches is around 5.0 with watercooling to really see the most out of your cards. Everything higher I've seen, the rig is on phase or LN2.


----------



## Arizonian

Quote:


> Originally Posted by *ReignsOfPower*
> 
> What have other water overclockers managed? Lets collaborate and gauge just how much more we can push than the aircoolers.
> So far I've managed the following:
> 135% power
> +180 Core Offset
> +500 Memory Offset.
> I've set the VGPU to 1.175V which is the highest it lets you go in EVGA Precision X.
> This all translates to 1095MHz / 1228MHz Boost on the core
> 1752 MHz (QDR) = 7GHz on the Memory
> Temperatures sitting at 23C Idle / 35-42C Load in my system. 16C ambient.
> This all translated into approximately P18300


You fare a little better than me, my friend.









My 24/7 GTX 690 OC is *[+135 Power / +129 Core / +183 Memory]*, which translates to *1044 Core* / *1594 Memory* / *1149 Boost* = Kepler Boost GPU #1 *1175 MHz* + GPU #2 *1201 MHz*. In fact I can take my power target down to +125 and fare no differently on the KB.

Why +183 Memory offset, you ask? It's an even 204.0 GB/s bandwidth. As soon as I hit +201 Offset on the Memory I crash in games, and sometimes get slight artifacts on the desktop and then 'drivers fail to respond'. So I settled for an even bandwidth....
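For anyone curious, the math behind that 204.0 GB/s figure is just the effective GDDR5 rate times the bus width. A quick sketch, assuming the 690's 256-bit bus per GPU and quad-pumped (QDR) GDDR5:

```python
# Rough GDDR5 bandwidth math for one GTX 690 GPU.
# Assumes a 256-bit memory bus and 4 transfers per memory clock (QDR).
def bandwidth_gbs(mem_clock_mhz, bus_width_bits=256):
    effective_mhz = mem_clock_mhz * 4        # GDDR5 transfers 4x per clock
    bytes_per_transfer = bus_width_bits / 8  # 256 bits -> 32 bytes
    return effective_mhz * 1e6 * bytes_per_transfer / 1e9

print(round(bandwidth_gbs(1502), 1))  # stock 1502 MHz -> 192.3 GB/s
print(round(bandwidth_gbs(1594), 1))  # 1594 MHz as above -> 204.0 GB/s
```

So 1594 MHz on the memory works out to exactly the even 204.0 GB/s mentioned.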









Interesting to learn how others are faring on Memory overclocks with high-density Samsungs. See if they're really good or not. For me I'd say I should have done better. Others I've seen always post Memory as an offset rather than the actual MHz.

Temps I run at 29C-31C idle. Full load I'm borderline at 64C-69C. Took many days to figure out the exact overclock and keep me under 70C with fan settings in the mix.

I have a 3770K @ [4.5] - 3DMark11 *17011* - Heaven 3.0 *110.0 FPS*. I don't take the benching to heart in rating the performance of my card because it's synthetic rather than gaming, and it's the games I play where this card really shines.









This is my first dual GPU, unlike yourself bud.







It's new to me and it's been fun learning hands-on. I'm passing this one along when Maxwell comes out and handing it down to the kid's rig.

Around March it's all about a large-capacity epeen SSD.

On a side note: I have been having fun with an optimal 60 FPS in 3D Vision on my monitor. It does dip to the low 40s FPS, but it's sporadic and not performance-hindering like when it hits 30 FPS or below.

A lot of nice rigs at OCN, and I'm impressed with fat_italian_stallion's rig. Very clean, nice mix of components aesthetically.









Glad to see the club active, and now in a positive way.


----------



## Anzial

Quote:


> Originally Posted by *Arizonian*
> 
> I have a 3770K @ [4.5] - 3DMark11 *17011* - Heaven 3.0 *110.0 FPS*. .


which driver are you using?


----------



## egotrippin

Quote:


> Originally Posted by *ReignsOfPower*
> 
> What have other water overclockers managed? Lets collaborate and gauge just how much more we can push than the aircoolers.
> So far I've managed the following:
> 135% power
> +180 Core Offset
> +500 Memory Offset.
> I've set the VGPU to 1.175V which is the highest it lets you go in EVGA Precision X.
> This all translates to 1095MHz / 1228MHz Boost on the core
> 1752 MHz (QDR) = 7GHz on the Memory
> Temperatures sitting at 23C Idle / 35-42C Load in my system. 16C ambient.
> This all translated into approximately P18300


Crashes when running Heaven Benchmark. Display goes black and I have to hard reboot.

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> I've been hitting a wall regardless of increasing the voltage or not, especially when increasing the mem. 185 is the highest I can get core with no mem offset. Best compromise so far is 160 core 450 mem.


Crashes when running Heaven Benchmark. Display goes black and I have to hard reboot.

Quote:


> Originally Posted by *Arizonian*
> 
> You fair a little better than me my friend.
> 
> 
> 
> 
> 
> 
> 
> 
> My 24/7 GTX 690 OC is *[+135 Power / +129 Core / +183 Memory]*, which translates to *1044 Core* / *1594 Memory* / *1149 Boost* = Kepler Boost GPU #1 *1175 MHz* + GPU #2 *1201 MHz*. In fact I can take my power target down to +125 and fare no differently on the KB.
> Why +183 Memory offset, you ask? It's an even 204.0 GB/s bandwidth. As soon as I hit +201 Offset on the Memory I crash in games, and sometimes get slight artifacts on the desktop and then 'drivers fail to respond'. So I settled for an even bandwidth....
> 
> 
> 
> 
> 
> 
> 
> 
> Interesting to learn how others are faring on Memory overclocks with high-density Samsungs. See if they're really good or not. For me I'd say I should have done better. Others I've seen always post Memory as an offset rather than the actual MHz.
> Temps I run at 29C-31C idle. Full load I'm borderline at 64C-69C. Took many days to figure out the exact overclock and keep me under 70C with fan settings in the mix.
> I have a 3770K @ [4.5] - 3DMark11 *17011* - Heaven 3.0 *110.0 FPS*. I don't take the benching to heart in rating the performance of my card because it's synthetic rather than gaming, and it's the games I play where this card really shines.
> 
> 
> 
> 
> 
> 
> 
> 
> This is my first dual GPU, unlike yourself bud.
> 
> 
> 
> 
> 
> 
> 
> It's new to me and it's been fun learning hands-on. I'm passing this one along when Maxwell comes out and handing it down to the kid's rig.
> Around March it's all about a large-capacity epeen SSD.
> On a side note: I have been having fun with an optimal 60 FPS in 3D Vision on my monitor. It does dip to the low 40s FPS, but it's sporadic and not performance-hindering like when it hits 30 FPS or below.
> A lot of nice rigs at OCN and I'm impressed with Fat_Itallion_Stallion's rig. Very clean, nice mix of components aesthetically.
> 
> 
> 
> 
> 
> 
> 
> 
> Glad to see the club active, and now in a positive way.


Crashes when running Heaven Benchmark. Display goes black and I have to hard reboot.

I think if I need more power I'll just buy another card.


----------



## Arizonian

Quote:


> Originally Posted by *Anzial*
> 
> which driver are you using?


301.24 - until the next beta goes live. I stopped downloading beta drivers back in my GTX 580 days and it's been smooth sailing.


----------



## ReignsOfPower

egotrippin - you make me laugh lol. Are you sure the rest of your system is stable?
Try this - Power +135%, Core +150, Memory +300. Then go up 10 at a time on the memory until it becomes unstable. Once you hit that limit, go back 10 on the memory, then start pushing the core 5 at a time until unstable. Then pull back your memory 10 and try again at the same crashy core setting. If it's still crashy, pull the memory back another 10. Wash, rinse, repeat.
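If it helps, that search boils down to something like the loop below. The `is_stable` function is a hypothetical stand-in for an actual benchmark pass (e.g. a Heaven loop at the given offsets):

```python
# Sketch of the stepwise OC search described above. is_stable(core, mem)
# is a hypothetical callback: run a benchmark at those offsets and
# report True if it survives.
def find_limits(is_stable, core=150, mem=300, mem_step=10, core_step=5):
    # Phase 1: push memory until unstable, then back off one step.
    while is_stable(core, mem):
        mem += mem_step
    mem -= mem_step
    # Phase 2: push core, trading memory back down when it crashes.
    while True:
        if is_stable(core + core_step, mem):
            core += core_step
        elif mem > mem_step and is_stable(core + core_step, mem - mem_step):
            mem -= mem_step
            core += core_step
        else:
            return core, mem
```

With a toy stability model like `lambda c, m: 2*c + m <= 700` it walks the same trade-off: memory climbs first, then core climbs while memory gives ground.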

Here is the best run I've managed. I wouldn't say this is a 100% stable setting. My drivers started playing up after the test. Weird. Had to hard reset. At +150 on the core I've had complete stability in my limited testing; +180 barely bumped up the bench scores.

3DMark 11
http://3dmark.com/3dm11/3243901 - P11354 - GTX 590 - Highest overclock I managed.
http://3dmark.com/3dm11/3837904 - P16033 - GTX 690 - Stock GPU run.
http://3dmark.com/3dm11/3886447 - P18331 - GTX 690 - +135%, +180 GPU +500 Memory - I don't think I'll be able to hit 20k even if I dial in 5GHz and some finicky OS tweaks.










Some oldies for comparison sake (X58 -> X79) - 3D Mark Vantage
http://3dmark.com/3dmv/2033299 - P19509 - 4870x2 (OC)
http://3dmark.com/3dmv/2937943 - P31227 - 5970 (OC)
http://3dmark.com/3dmv/3042854 - P41053 - GTX 590 (Stock)
http://3dmark.com/3dmv/4183956 - P64292 - GTX 690 (OC)


----------



## thestache

Quote:


> Originally Posted by *ReignsOfPower*
> 
> egotrippin - you make me laugh lol. Are you sure the rest of your system is stable?
> Try this - Power +135% Core +150, Memory +300. Then go up 10 at a time up on the memory until it becomes unstable again. Once you hit that limit, go back 10 on the memory, then start pushing core 5 at a time until unstable. Then pull back your memory 10 and try again on the same crash-ey core state. If still crash-ey, pull memory back again another 10. Wash rinse repeat.
> Here is my best run I've managed. I wouldn't say this is a 100% stable setting. My drivers started playing up after the test. Weird. Had to hard reset. +150 on the core I have had complete stability in my limited testing. 180 barely bumped up the bench scores.
> 3DMark 11
> http://3dmark.com/3dm11/3243901 - P11354 - GTX 590 - Highest overclock I managed.
> http://3dmark.com/3dm11/3837904 - P16033 - GTX 690 - Stock GPU run.
> http://3dmark.com/3dm11/3886447 - P18331 - GTX 690 - +135%, +180 GPU +500 Memory - I don't think I'll be able to hit 20k even if I dial in 5GHz and some finnikky OS tweaks.
> 
> 
> 
> 
> 
> 
> 
> 
> Some oldies for comparison sake (X58 -> X79) - 3D Mark Vantage
> http://3dmark.com/3dmv/2033299 - P19509 - 4870x2 (OC)
> http://3dmark.com/3dmv/2937943 - P31227 - 5970 (OC)
> http://3dmark.com/3dmv/3042854 - P41053 - GTX 590 (Stock)
> http://3dmark.com/3dmv/4183956 - P64292 - GTX 690 (OC)


My screen went black and I had to hard reset. Event Viewer said the display driver stopped working at +160 on the core yesterday, and I thought that was stable. So same thing. Maybe it's not the GPU but a driver thing.


----------



## fat_italian_stallion

Finally ran into a VRAM issue with these cards. They won't allow you to play Max Payne 3 @ 2560x1440 with 8x MSAA, only 4x. The game just sits there and tells you to go to hell for not having enough, and won't even let you apply the setting regardless. Hopefully 8GB models are released at some point.


----------



## egotrippin

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> Finally ran into a VRAM issue with these cards. Won't allow you to play max payne 3 @ 2560x1440 with msaa 8x, only 4x. The game just sits there and tells you to go to hell for not having enough and won't even let you apply the setting regardless. Hopefully 8gb models are released at some point.


I'm reading on [H] about this issue. http://hardforum.com/showthread.php?t=1696720&page=11

Apparently running FXAA works and looks better anyway, according to a user who also has a 3GB card and has tried both MSAA and FXAA. I just bought Max Payne on Steam and left it downloading when I came to work. I'll check out the settings when I get home and play for the first time.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *egotrippin*
> 
> I'm reading on [H] about this issue. http://hardforum.com/showthread.php?t=1696720&page=11
> Apparently running FXAA works and looks better anyway, according to a user who also has a 3GB card and has tried both MSAA and FXAA. I just bought Max Payne on Steam and left it downloading when I came to work. I'll check out the settings when I get home and play for the first time.


FXAA and MSAA run alongside each other in this game somehow. Might be on different objects is what I'm thinking. TBH I couldn't tell the difference between 2x and 4x, so 8x probably doesn't make that much of a difference. Seems that no card on the market can run it either, since it requires 11GB of VRAM for 8x MSAA @ 2560x1440.


----------



## egotrippin

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> FXAA and MSAA run alongside each other in this game somehow. Might be on different objects is what I'm thinking. TBH I couldn't tell the difference between 2x and 4x so 8x probably doesn't make that much of a difference. Seems that no card on the market can run it either since it requires 11gb of vram for 8x msaa @ 2560x1440


Ha! Seriously? I haven't had a shot to try playing yet.... 29,334 MB is taking longer than I thought to download.... 7 hours to go.

I've read so much in forums about people badmouthing the 2GB VRAM on 680/690 but I've hardly seen any game go over 2GB by more than a few MB.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *egotrippin*
> 
> Ha! Seriously? I haven't had a shot to try playing yet.... 29,334 MB is taking longer than I thought to download.... 7 hours to go.
> I've read so much in forums about people badmouthing the 2GB VRAM on 680/690 but I've hardly seen any game go over 2GB by more than a few MB.


Yeah, quite odd. The funny thing is that even a sapphire 7970 6gb can't play it with 8x msaa at that res.


----------



## FiShBuRn

I'm testing the OC potential of mine, so I got: 135% power, core +150, mem +300, stable in Heaven and 3DMark with temps below 70C, and I'm getting 1202 Kepler Boost on both GPUs. Is this good? Playing BF3 it only reaches 61C.


----------



## Stein357

Mine is Heaven stable at 135%/+135/+175. I'd say that yours is pretty, pretty, pretty, pretty good. Or I got a dud.


----------



## FiShBuRn

Is it normal that when a game doesn't use SLI, GPU 2 is the one with the load?

EDIT: nvm, in the NV CP my primary GPU was set to the second one...


----------



## egotrippin

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> Yeah, quite odd. The funny thing is that even a sapphire 7970 6gb can't play it with 8x msaa at that res.


Works fine at 1080.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *egotrippin*
> 
> Works fine at 1080.


You're only at 1080p. I'm at 2560x1440. Apparently the 690 is the only card that can play MP3 on max at 1080p with a playable framerate.


----------



## egotrippin

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> you're only at 1080p. I'm at 2560x1440. Apparently the 690 is the only card that can play MP3 on max at 1080p with a playable framerate.


Yup. Frame rate is steady at 60 fps except the cut scenes are stuttering a bit and I'm not sure why.

I'll be upgrading monitors... not sure if I want to go with a triple monitor setup or one 30inch ZR30W at 2560 x 1600. What do you use?


----------



## fat_italian_stallion

Quote:


> Originally Posted by *egotrippin*
> 
> Yup. Frame rate is steady at 60 fps except the cut scenes are stuttering a bit and I'm not sure why.
> I'll be upgrading monitors... not sure if I want to go with a triple monitor setup or one 30inch ZR30W at 2560 x 1600. What do you use?


The ZR2740W. So far it's been great. Price is right, and the 16:9 aspect ratio is better for games than 16:10 (you actually lose part of the image instead of gaining anything with the extra pixels. Same with 1080p over 1200p).


----------



## jcde7ago

I haven't had a chance to keep up with this thread with so much real-life stuff going on, and the fact that I've been on vacation + away for weeks.

Give me some time to update the owners list (again) for those who need to be added - I know there's quite a few. Moving forward, PM me if you want to be added - this at least gives me a list to go off of and will definitely help. Thanks guys, and once again, there are some sweet lookin' rigs here from fellow 690 owners!


----------



## Arizonian

Quote:


> Originally Posted by *Stein357*
> 
> Mine is Heaven stable at 135%/+135/+175. I'd say that yours is pretty, pretty, pretty, pretty good. Or I got a dud.


You're at about what I'm at 24/7 as far as offsets go: +125 / +129 / +183 - relates to 1044 Core / 1594 Memory / 1149 Boost - GPU #1 1175 & GPU #2 1201 MHz Kepler Boost.

Offsets can vary between cards in what they actually yield in overclocked Core & Memory.

What is your Kepler Boost hitting for each of your GPUs?

Quote:


> Originally Posted by *jcde7ago*
> 
> I haven't had a chance to keep up with this thread with so much real life stuff going on, and the fact that i've been on vacation + away for weeks.
> Give me some time to update the owners list (again) for those who need to be added - i know there's quite a few. Moving forward, PM me if you want to be added - this at least gives me a list to go off of and will definitely help. Thanks guys, and once again, there are some sweet lookin' rigs here from fellow 690 owners!


Welcome back bud. Glad to hear you've been on vacation, always good.









Quite a few members have put their 690s under water as of late; it's been interesting to read the results as they're posted.


----------



## FiShBuRn

Why is my GPU usage in BF3 (multiplayer) between 55-70%, and not always 99% like in Max Payne 3 or Heaven 3.0/3DMark11? I don't get it :s

I could always have like 120fps (1080p), but I don't because of this poor GPU usage...


----------



## egotrippin

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> Yeah, quite odd. The funny thing is that even a sapphire 7970 6gb can't play it with 8x msaa at that res.


Wow, I noticed Max Payne with 8x MSAA was using 100% of my VRAM without any issues. I changed it to 4x MSAA and my VRAM usage dropped to 1GB (per GPU of course). Literally half the VRAM. MSAA is a motherf*#%ker
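That halving makes sense on a back-of-envelope basis: the multisampled buffers scale linearly with the sample count. A rough sketch, assuming 32-bit color plus 32-bit depth/stencil per sample and a single render target (real engines add more targets, so actual usage runs higher):

```python
# Back-of-envelope MSAA framebuffer cost. Assumes 8 bytes per sample
# (32-bit color + 32-bit depth/stencil) and one render target, so this
# undershoots what a real engine allocates.
def msaa_mb(width, height, samples, bytes_per_sample=8):
    return width * height * samples * bytes_per_sample / 1024**2

print(msaa_mb(2560, 1440, 8))  # 225.0 MB for one 8x target at 1440p
print(msaa_mb(2560, 1440, 4))  # 112.5 MB at 4x -- exactly half
```

Multiply that by however many render targets the engine keeps around and it's easy to see how 8x MSAA blows past 2GB.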


----------



## dboythagr8

Quote:


> Originally Posted by *FiShBuRn*
> 
> Why my GPUs usage in BF3 (Multiplayer), between 55-70%, is not allways 99% like in MaxPayne 3 or Heaven 3.0/3DMark11? I dont get it :s
> I could have allways like 120fps (1080p) but i dont because of this poor gpu usage...


A lot more variables come into play when playing an online title versus running a single player game or benchmark.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *FiShBuRn*
> 
> Why my GPUs usage in BF3 (Multiplayer), between 55-70%, is not allways 99% like in MaxPayne 3 or Heaven 3.0/3DMark11? I dont get it :s
> I could have allways like 120fps (1080p) but i dont because of this poor gpu usage...


It's a driver issue. I got much higher usage in SLI than quad SLI. Similar fps as well, only like 20 more.


----------



## egotrippin

Quote:


> Originally Posted by *FiShBuRn*
> 
> Why my GPUs usage in BF3 (Multiplayer), between 55-70%, is not allways 99% like in MaxPayne 3 or Heaven 3.0/3DMark11? I dont get it :s
> I could have allways like 120fps (1080p) but i dont because of this poor gpu usage...


If I turn off vsync I get 130fps in Battlefield 3. I think I saw a benchmark for BF3 on Anandtech which showed anywhere from 120-160fps at 1920x1200 depending on whether you use FXAA or 4XMSAA. http://www.anandtech.com/show/5805/nvidia-geforce-gtx-690-review-ultra-expensive-ultra-rare-ultra-fast/11

When I first ran BF3 I had a stuttering issue and tried a few different settings and nothing appeared to work... until I played again the next day and the issue was gone. I'm not sure what changed. Keep tinkering because it works great on my computer now.


----------



## juanP

craaaaaazzyyyyy!!!!!


----------



## AVEPICS

I guess now I'm officially in the 690 club.







Well, at least the "Hydro Copper" 690 club, that is =) I had to order directly from EVGA, and I must say they have some great service. I ordered the card last Friday and it finally came in this morning at work via UPS.
























*the coolest thing is the general manager of our dealership was stoked when he saw this arrive **pc gamer as well***










*this sucker was heavy!*


***I hope this is proof that I can be in the club too***










*Now I gotta start fitting this sucker into my new Danger Den 29 case*


----------



## thestache

Quote:


> Originally Posted by *egotrippin*
> 
> If I turn off vsync I get 130fps in Battlefield 3. I think I saw a benchmark for BF3 on Anandtech which showed anywhere from 120-160fps at 1920x1200 depending on whether you use FXAA or 4XMSAA. http://www.anandtech.com/show/5805/nvidia-geforce-gtx-690-review-ultra-expensive-ultra-rare-ultra-fast/11
> When I first ran BF3 I had a stuttering issue and tried a few different settings and nothing appeared to work... until I played again the next day and the issue was gone. I'm not sure what changed. Keep tinkering because it works great on my computer now.


BF3 performance is better with my GTX 690 than my tri-CrossFire HD 6970 setup, but the FPS numbers aren't much higher, maybe 20 FPS. The difference though is 50-70% usage compared to the AMD cards all at 99%, and no visible micro-stutter.

In GTX 690 SLI I saw no performance gain, just lower usage on all the cores. Shame, because the performance really doesn't translate at all. Just got to hurry up and get my surround monitor setup and then maybe I'll see what these cards can do.


----------



## egotrippin

Quote:


> Originally Posted by *ReignsOfPower*
> 
> egotrippin - you make me laugh lol. Are you sure the rest of your system is stable?
> Try this - Power +135% Core +150, Memory +300. Then go up 10 at a time up on the memory until it becomes unstable again. Once you hit that limit, go back 10 on the memory, then start pushing core 5 at a time until unstable. Then pull back your memory 10 and try again on the same crash-ey core state. If still crash-ey, pull memory back again another 10. Wash rinse repeat.
> Here is my best run I've managed. I wouldn't say this is a 100% stable setting. My drivers started playing up after the test. Weird. Had to hard reset. +150 on the core I have had complete stability in my limited testing. 180 barely bumped up the bench scores.
> 3DMark 11
> http://3dmark.com/3dm11/3243901 - P11354 - GTX 590 - Highest overclock I managed.
> http://3dmark.com/3dm11/3837904 - P16033 - GTX 690 - Stock GPU run.
> http://3dmark.com/3dm11/3886447 - P18331 - GTX 690 - +135%, +180 GPU +500 Memory - I don't think I'll be able to hit 20k even if I dial in 5GHz and some finnikky OS tweaks.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some oldies for comparison sake (X58 -> X79) - 3D Mark Vantage
> http://3dmark.com/3dmv/2033299 - P19509 - 4870x2 (OC)
> http://3dmark.com/3dmv/2937943 - P31227 - 5970 (OC)
> http://3dmark.com/3dmv/3042854 - P41053 - GTX 590 (Stock)
> http://3dmark.com/3dmv/4183956 - P64292 - GTX 690 (OC)


I followed your advice but started with a lower clock speed. I maxed the voltage and power target, and I can do +700 on the memory clock, but my GPU clock offset maxes out at +130. My Heaven benchmark is now:


Quote:


> Originally Posted by *fat_italian_stallion*
> 
> It seems the sweet spot for benches is around 5.0 with watercooling to really see the most out of ur cards. Everything higher I've seen the rig is on phase or ln2


Before, I had my CPU at 3.4GHz with Turbo Boost set to 4.4GHz on all cores. I have since disabled Turbo Boost and set my clock to 4.8GHz, with Hyper-Threading still on. This did not make any real difference in my benchmarks, but maybe it would if I had multiple cards.

I'm using the 304.79 beta drivers, maybe that's why my GPU offset is lower than others.


----------



## SmokinWaffle

Not an NVIDIA guy, but just want to say how nice it is to see a lot of you guys running these under water. Annoys me to hell when people get super expensive top-end cards and then just run them on air.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *SmokinWaffle*
> 
> Not an nVidia guy, but just want to say how nice it is to see a lot of you guys running these under water. Annoys me to hell when people get super expensive top end cards then just run them on air.


The sad thing is that there really isn't much benefit from doing so other than noise and extended component life, since voltages are locked and the stock cooler seems to be adequate (from the one day I ran air-cooled).


----------



## FiShBuRn

My results:


----------



## Qu1ckset

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> The sad thing is that there really isn't much benefit from doing so other than noise and extended component life since voltages are locked and the stock cooler seems to be adequate (from the one day I ran aircooled)


The stock cooler is more than adequate. This is the first dual-GPU card I've owned where I don't have to make fan profiles because it stays cool, and not to mention the cooler is sexy.


----------



## Arizonian

Quote:


> Originally Posted by *Qu1ckset*
> 
> The stock cooler is more than adequate. This is the first dual-GPU card I've owned where I don't have to make fan profiles because it stays cool, and not to mention the cooler is sexy.


The dual vapor chamber cooling on each side of the GPU with fan in the middle ended up being a very efficient cooling method. 29C-31C idle and 64C-69C full load with current OC to keep me from throttling.


----------



## thestache

Quote:


> Originally Posted by *Arizonian*
> 
> The dual vapor chamber cooling on each side of the GPU with fan in the middle ended up being a very efficient cooling method. 29C-31C idle and 64C-69C full load with current OC to keep me from throttling.


It's awesome.

Heavily overclocked, I have to run my card that is on air at 90% fan to keep it at 69deg under load, but even that is impressive. It's about half as noisy as a HD 6990 and 30deg cooler.

Still, the price of a waterblock is worth it for quiet operation and peace of mind knowing the card will never reach 40deg no matter the overclock.


----------



## Arizonian

Quote:


> Originally Posted by *Sexparty*
> 
> It's awesome.
> Heavily overclocked, I have to run my card that is on air at 90% fan to keep it at 69deg under load, but even that is impressive. It's about half as noisy as a HD 6990 and 30deg cooler.
> Still, the price of a waterblock is worth it for quiet operation and peace of mind knowing the card will never reach 40deg no matter the overclock.


Well said and point taken.


----------



## SmokinWaffle

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> The sad thing is that there really isn't much benefit from doing so other than noise and extended component life since voltages are locked and the stock cooler seems to be adequate (from the one day I ran aircooled)


True. As far as I'm concerned, if you have enough money for a 590/5970/6990/690 or any ultrahigh end GPU, you have enough money to treat it right and slap a waterblock on it.


----------



## xoleras

Quote:


> Originally Posted by *SmokinWaffle*
> 
> True. As far as I'm concerned, if you have enough money for a 590/5970/6990/690 or any ultrahigh end GPU, you have enough money to treat it right and slap a waterblock on it.


To be fair, a water setup is quite expensive if you do it right - it can easily add $500 or more. I do agree with you on principle, however.







.

You know what gets me? Its people who spend 1000$ on a multi GPU config yet they hook it up to a cheesy 23 inch TN panel at 1080p. What the hell, I would not use that investment for anything less than 2560x1600 or surround gaming. 1080p is such a waste


----------



## Arizonian

Quote:


> Originally Posted by *xoleras*
> 
> To be fair, a water setup is quite expensive if you do it right; it can easily add $500 or more. I do agree with you in principle, however.
> 
> You know what gets me? It's people who spend $1,000 on a multi-GPU config yet hook it up to a cheesy 23-inch TN panel at 1080p. What the hell; I would not use that investment for anything less than 2560x1600 or surround gaming. 1080p is such a waste.


At 1080p with 3D Vision, where FPS gets cut by roughly 45%, a $1,000 setup is very much worth it. For example, if I'm getting 100 FPS gaming at 120 Hz, switching on 3D Vision drops it to 55 FPS.

In 3D Vision, 30 FPS is the bare minimum, and anything above 60 FPS doesn't matter because that's the visual ceiling. Hence you can see where in some cases it's worth it, even if not under water in my case.
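The arithmetic in the post above can be sketched as follows; note the ~45% penalty is the poster's rule of thumb, not an official NVIDIA figure.

```python
def vision_3d_fps(base_fps: float, penalty: float = 0.45) -> float:
    """Estimate effective FPS once the 3D Vision overhead is applied."""
    return base_fps * (1.0 - penalty)

# 100 FPS in 2D at 120 Hz drops to roughly 55 FPS with 3D Vision on,
# matching the example in the post.
fps = vision_3d_fps(100)
print(round(fps))  # ≈ 55

# 30 FPS is described as the bare minimum in 3D, and anything past the
# 60 FPS ceiling brings no further visual benefit.
playable = 30 <= fps <= 60
```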


----------



## xoleras

True, I'll give you 3D Vision. A 690 could definitely help in that situation as well, since upper framerates always matter at 120Hz.


----------



## egotrippin

Quote:


> Originally Posted by *xoleras*
> 
> To be fair, a water setup is quite expensive if you do it right - it can easily add 500$ or more. I do agree with you on principle however
> 
> 
> 
> 
> 
> 
> 
> .
> You know what gets me? Its people who spend 1000$ on a multi GPU config yet they hook it up to a cheesy 23 inch TN panel at 1080p. What the hell, I would not use that investment for anything less than 2560x1600 or surround gaming. 1080p is such a waste


The 690 comes with a VGA adapter...

I can play Solitaire at 1024x768 at 1546498165132165 frames per second


----------



## ceteris

Quote:


> Originally Posted by *egotrippin*
> 
> The 690 comes with a VGA adapter...
> I can play Solitaire at 1024x768 at 1546498165132165 frames per second


Oh yeah? I can't get more than 27 FPS in Minesweeper. I think I need Quad-SLI to push past 50.


----------



## xoleras

Quote:


> Originally Posted by *ceteris*
> 
> Oh yeah? I can't get more than 27 FPS in Minesweeper. I think I need Quad-SLI to push past 50.


You should disable tessellation and advanced shadows; that should help your Minesweeper framerate.

I'm still struggling to get a solid 60 FPS in it. It's a beast of a game.


----------



## ReignsOfPower

Quote:


> Originally Posted by *Arizonian*
> 
> At 1080p with 3D Vision, where FPS gets cut by roughly 45%, a $1,000 setup is very much worth it. For example, if I'm getting 100 FPS gaming at 120 Hz, switching on 3D Vision drops it to 55 FPS.
> In 3D Vision, 30 FPS is the bare minimum, and anything above 60 FPS doesn't matter because that's the visual ceiling. Hence you can see where in some cases it's worth it, even if not under water in my case.


I would like 120Hz gaming, but I can't deal with < 27" screens and I can't deal with 1080p on a 27"+ screen. I went with a U3011 because I chose resolution over refresh rate. What I really want to see is 1440p @ 120Hz. We have the tech to do it now (our cards support 4K's worth of bandwidth); I don't know why we aren't taking advantage of it. I would pay a pretty penny for something that would do 1440p @ 120Hz.

With regards to waterblocking a GTX 690, I did it purely for the following:
1. Temperatures way, way low
2. Acoustics; I have sensitive hearing and I personally find it extremely tiring listening to anything over 36-40dB for a prolonged period
3. Aesthetics. I think water cooling looks the tits!


----------



## emett

What's wrong with the 100hz catleap monitors?


----------



## rubicsphere

I shall be joining you all on Saturday!


----------



## thestache

Quote:


> Originally Posted by *emett*
> 
> What's wrong with the 100hz catleap monitors?


Other than at 120hz.net they are hard to find, overpriced for what they are, and the warranty is like Russian roulette. I'd have tried one if they were 16:10, but they don't seem appropriate for a portrait surround set-up, which is what I would need them for.

But the fact they exist and are possible shows there's no reason manufacturers can't do the same.


----------



## egotrippin

Quote:


> Originally Posted by *rubicsphere*
> 
> 
> I shall be joining you all on Saturday!


$900?! ***? How did that happen?


----------



## rubicsphere

The guy had it listed for $1100 OBO, so I threw him a lowball of $900 and he accepted. I was pretty surprised too, since it's still sealed and new.


----------



## nicodemus

Hey gents, I'm interested in a 690 but I have a few questions I'm hoping you can help me out with.

1) Will my PCI-e 2.0 setup hold the card back?
2) Will I need to push my 2500k over 4ghz to maximize the card?
3) Can I overclock this card with Afterburner?

My setup is 1080p @ 120hz and I'd put it under water.

Thanks!


----------



## ceteris

Quote:


> Originally Posted by *nicodemus*
> 
> Hey gents, I'm interested in a 690 but i have a few questions i'm hoping you can help me out with.
> 1) Will my PCI-e 2.0 setup hold the card back?
> 2) Will I need to push my 2500k over 4ghz to maximize the card?
> 3) Can I overclock this card with Afterburner?
> My setup is 1080p @ 120hz and I'd put it under water.
> Thanks!


1) The difference is literally a couple of frames (source: HardOCP).

2) I don't think you need to go that high. I currently use the factory OC on my H80 until I put my system under water.

3) You can. I used Afterburner but eventually switched over to EVGA Precision for the skin.


----------



## nicodemus

cool, thanks! =)


----------



## ceteris

NP!

Well, I've got a question for anyone who's up to date on Thunderbolt. I'm about to pull the trigger on a M5E and now I'm interested in this whole Thunderbolt tech. Asus just happens to have a Thunderbolt 27" monitor coming out a couple of months down the road, but how will my GTX 690 work with it? Can I use the Thunderbolt connection through the M5E to connect to the monitor if I get it? Or will I have to wait for a graphics card with a Thunderbolt connection on it? If that's the case, I'm thinking I might need to ditch my 690 sometime soon.


----------



## rubicsphere

I'm pretty sure Thunderbolt is backwards compatible with Mini DisplayPort. I'm not positive though.


----------



## xoleras

Quote:


> Originally Posted by *rubicsphere*
> 
> I'm pretty sure that thunderbolt is backwards compatible with mini display port. I'm not positive though


There are 3 different versions of PC thunderbolt, and only 1 of them (the most expensive) supports miniDP or thunderbolt displays. Depends on your motherboard.


----------



## egotrippin

Quote:


> Originally Posted by *rubicsphere*
> 
> The guy had it listed for $1100 OBO so I threw him a low ball of $900 and he accepted. I was pretty surprised too since it is still sealed and new.


It must have "fallen off a truck" somewhere ;-)


----------



## egotrippin

Quote:


> Originally Posted by *nicodemus*
> 
> Hey gents, I'm interested in a 690 but i have a few questions i'm hoping you can help me out with.
> 1) Will my PCI-e 2.0 setup hold the card back?
> 2) Will I need to push my 2500k over 4ghz to maximize the card?
> 3) Can I overclock this card with Afterburner?
> My setup is 1080p @ 120hz and I'd put it under water.
> Thanks!


1) Not unless you have multiple cards.
2) No, but do it anyway; that chip is made for it.
3) Yes, but I like EVGA Precision better.


----------



## thestache

Quote:


> Originally Posted by *ceteris*
> 
> 1) the difference is literally a couple of frames (source Hard OCP)
> 2) I don't think you need to go that high. I currently use factory OC on my H80 til I put my system under water.
> 3) You can. I used Afterburner but eventually switched off to EVGA Precision for the skin.


That review/test was pointless and is not relevant to a GTX 690. Doubling the amount of data being passed through a single PCIe 2.0 x16 slot changes everything.

Anyways at 1080p 120hz it won't make enough of a difference to justify it. 3x monitor surround does though.


----------



## ceteris

Quote:


> Originally Posted by *Sexparty*
> 
> That review/test was pointless and is not relavent to a GTX 690. Double the amount of data being passed through a single PCIe 2.0 x16 lane changes everything.
> Anyways at 1080p 120hz it won't make enough of a difference to justify it. 3x monitor surround does though.


Chill, boy. It's a reference. There is obviously going to be a difference between SLI across two cards and SLI on a single card, but the difference isn't going to be like 20-50 FPS. Probably a couple more FPS depending on the entire setup, which is not worth breaking the bank for if the goal is a dual-GPU solution in a single slot.

And to my knowledge, X79 boards were even running the 600 series at PCIe 2.0 until NVIDIA released the patch last month with their sorry explanation. Despite that, the benchmarks I and many others were getting were still very impressive compared to Kepler's predecessor.


----------



## thestache

Quote:


> Originally Posted by *ceteris*
> 
> Chill, boy. It's a reference. There is obviously going to be a difference between SLI across two cards and SLI on a single card, but the difference isn't going to be like 20-50 FPS. Probably a couple more FPS depending on the entire setup, which is not worth breaking the bank for if the goal is a dual-GPU solution in a single slot.
> And to my knowledge, X79 boards were even running the 600 series at PCIe 2.0 until NVIDIA released the patch last month with their sorry explanation. Despite that, the benchmarks I and many others were getting were still very impressive compared to Kepler's predecessor.


With a surround set-up, GTX 690 SLI is going to see a difference of around 20-50fps, and even with a single GTX 690 it's still enough to justify a PCIe 3.0 slot when powering surround, because a 5-10fps difference in surround is a big difference. It can be the difference between lows in the 50s and never dropping below 60fps in games like Battlefield 3, and when you're dropping $1,000s on GPUs and monitors, it's important that things perform.

There is a reason the GTX 690 is the only native PCIe 3.0 card NVIDIA has, and that's because it can actually use the increased bandwidth of 3.0. The GTX 680 doesn't need it unless used in tri/quad SLI set-ups on boards which become choked by such set-ups, i.e. PCIe 2.0 x8-x8-x8-x8.

If I'm able to, I'll do some benchmarks with my surround set-up when it finally arrives this week, because it'll be interesting to see how much of a difference it can make, for future reference and people's peace of mind.
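As a rough sanity check on the bandwidth debate above, here is a back-of-the-envelope sketch using the standard per-generation PCIe link rates and encodings; these are generic spec figures, not GTX 690 measurements.

```python
def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s, one direction."""
    if gen == 2:
        per_lane = 5.0 * (8 / 10) / 8      # 5 GT/s, 8b/10b encoding
    elif gen == 3:
        per_lane = 8.0 * (128 / 130) / 8   # 8 GT/s, 128b/130b encoding
    else:
        raise ValueError("only gen 2 and 3 modeled here")
    return per_lane * lanes

print(pcie_bandwidth_gbps(2, 16))  # ≈ 8.0 GB/s
print(pcie_bandwidth_gbps(3, 16))  # ≈ 15.75 GB/s, roughly double
print(pcie_bandwidth_gbps(2, 8))   # ≈ 4.0 GB/s, the x8/x8/x8/x8 case
```

So a 3.0 x16 slot offers roughly twice the bandwidth of 2.0 x16, which is the headroom being argued over for a dual-GPU card sharing one slot.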


----------



## egotrippin

Quote:


> Originally Posted by *Sexparty*
> 
> With a surround set-up, GTX 690 SLI is going to see a difference of around 20-50fps, and even with a single GTX 690 it's still enough to justify a PCIe 3.0 slot when powering surround, because a 5-10fps difference in surround is a big difference. It can be the difference between lows in the 50s and never dropping below 60fps in games like Battlefield 3, and when you're dropping $1,000s on GPUs and monitors, it's important that things perform.
> There is a reason the GTX 690 is the only native PCIe 3.0 card NVIDIA has, and that's because it can actually use the increased bandwidth of 3.0. The GTX 680 doesn't need it unless used in tri/quad SLI set-ups on boards which become choked by such set-ups, i.e. PCIe 2.0 x8-x8-x8-x8.
> If I'm able to, I'll do some benchmarks with my surround set-up when it finally arrives this week, because it'll be interesting to see how much of a difference it can make, for future reference and people's peace of mind.


What monitors are you getting? I've been inches away from pulling the trigger on a Samsung MD230 x 3 setup:





They are going for $1150 refurbished on eBay. It's not a great "value" compared to similar-sized monitors, but I love the slim bezels, uniformity, and lack of branding on the monitors.


----------



## thestache

Quote:


> Originally Posted by *egotrippin*
> 
> What monitors are you getting? I've been inches away from pulling the trigger on a Samsung MD230 x 3 setup:
> 
> 
> They are going for $1150 refurbished on eBay. It's not a great "value" compared to similar-sized monitors, but I love the slim bezels, uniformity, and lack of branding on the monitors.


No, no, no my friend!

They have horrible input lag and response time according to reviews. Very cheap monitors. I was going to get them too for the tiny bezel and peace of mind, but the monitors themselves are sub-par apparently.

I've ordered 3x Dell U2412M monitors, which seem to be tied with the ASUS PA246Q as the best IPS monitors for gaming. The Dells with the plastic front bezel removed have around 10-15mm of bezel, which is really low, and are 1200p so are good for portrait, which is how I'll be using mine. You just have to take the time to remove the plastic covers and remount the stands.

Anyway, I'll start a thread about it when I get them this week, because it seems a lot of people at the moment are interested in IPS 1200p monitors for surround set-ups.


----------



## egotrippin

Quote:


> Originally Posted by *Sexparty*
> 
> No, no, no my friend!
> They have horrible input lag and response time according to reviews. Very cheap monitors. I was going to get them too for the tiny bezel and peace of mind, but the monitors themselves are sub-par apparently.
> I've ordered 3x Dell U2412M monitors, which seem to be tied with the ASUS PA246Q as the best IPS monitors for gaming. The Dells with the plastic front bezel removed have around 10-15mm of bezel, which is really low, and are 1200p so are good for portrait, which is how I'll be using mine. You just have to take the time to remove the plastic covers and remount the stands.
> Anyway, I'll start a thread about it when I get them this week, because it seems a lot of people at the moment are interested in IPS 1200p monitors for surround set-ups.


The U2412M has been my alternate choice: IPS display and 16:10, which I hear is better for portrait mode anyway. Are you going to use the stands they come on or invest in a single stand? I was thinking about a single arm mount or the Ergotron LX. I saw this setup on [H]



They are about $295 on ebay with free shipping. Did you find a better deal?


----------



## thestache

Quote:


> Originally Posted by *egotrippin*
> 
> U2412M has been my alternate choice. IPS display and 16x10 which I hear is better for portrait mode anyway. Are you going to use the stands they come on or invest in a single stand? I was thinking about a single arm mount or the Ergotron LX. I saw this setup on [H]
> 
> They are about $295 on ebay with free shipping. Did you find a better deal?


I got one from Dell direct with a $120/30%-off deal (for $270) and two off eBay for $299 with free shipping, so similar prices.

I'm going to use the stands they come with; I don't like the look of any triple-monitor stands I've found so far, and I'll be sitting them as low as they go because (my desk has a raised shelf for the monitors and) to my knowledge the included stand will get them lower than an aftermarket triple-monitor stand.

1200p gives you half an inch less height and 3 inches extra width in portrait mode, plus bezels. So that's why I went 1200p over 1080p; figured I'd enjoy the wider dimensions.

With the plastic case removed you can get the bezels down from the width of two 16mm bezels to just one, so hopefully it's enough that it doesn't bother me.
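The height/width trade-off described above can be sketched with some simple geometry. Treating both the 16:10 and 16:9 panels as 24-inch is my assumption, so the exact figures are illustrative only.

```python
import math

def panel_dims(diagonal_in: float, aspect_w: int, aspect_h: int):
    """Return (width, height) in inches for a given diagonal and aspect ratio."""
    unit = diagonal_in / math.hypot(aspect_w, aspect_h)
    return aspect_w * unit, aspect_h * unit

w10, h10 = panel_dims(24.0, 16, 10)  # landscape dims of a 24" 1200p panel
w9, h9 = panel_dims(24.0, 16, 9)     # landscape dims of a 24" 1080p panel

# Rotated to portrait, landscape width becomes height and vice versa:
print(round(w9 - w10, 1))        # ~0.6" less portrait height for 16:10
print(round((h10 - h9) * 3, 1))  # ~2.9" extra width across three panels
```

Which lines up with the "half an inch less height, 3 inches extra width" figures quoted for a three-panel portrait array.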


----------



## ceteris

Quote:


> Originally Posted by *Sexparty*
> 
> With a surround set-up, GTX 690 SLI is going to see a difference of around 20-50fps, and even with a single GTX 690 it's still enough to justify a PCIe 3.0 slot when powering surround, because a 5-10fps difference in surround is a big difference. It can be the difference between lows in the 50s and never dropping below 60fps in games like Battlefield 3, and when you're dropping $1,000s on GPUs and monitors, it's important that things perform.
> There is a reason the GTX 690 is the only native PCIe 3.0 card NVIDIA has, and that's because it can actually use the increased bandwidth of 3.0. The GTX 680 doesn't need it unless used in tri/quad SLI set-ups on boards which become choked by such set-ups, i.e. PCIe 2.0 x8-x8-x8-x8.
> If I'm able to, I'll do some benchmarks with my surround set-up when it finally arrives this week, because it'll be interesting to see how much of a difference it can make, for future reference and people's peace of mind.


For 3x monitor surround, yeah it will. But the person I responded to wasn't asking about a multi-monitor setup.

You might want to think about the VRAM as well, although there aren't any other 690 models that sport more of it. That's much of why a couple of NVIDIA users on this forum are waiting to defect to the 7990.

EDIT: NM, it was in a previous post.


----------



## Lotek420

Hello all. I am new to the forum and a GTX 690 quad-SLI owner. I see lots of talk about PCIe 3.0. Did any of you catch this article from HardOCP?

http://www.hardocp.com/article/2012/07/18/pci_express_20_vs_30_gpu_gaming_performance_review


----------



## thestache

Quote:


> Originally Posted by *Lotek420*
> 
> Hello all. I am new to the forum and a GTX 690 quad SLI owner. I see lots of talk about PCI-E 3.0 Did any of you catch this article from Hardocp?
> http://www.hardocp.com/article/2012/07/18/pci_express_20_vs_30_gpu_gaming_performance_review


Yep. That's the one I'm talking about. Their test is stupid and irrelevant.


----------



## thestache

Got my monitor and it's great (the other two come tomorrow), but when I rotate the image into portrait mode through the NVIDIA control panel, the image goes slightly blurry and text no longer looks crisp. Really unsure why.


----------



## nanoscale

Got the card last week, and I have just finished new build this afternoon.


Spoiler: Warning: Spoiler!


----------



## Arizonian

For those interested: just got word from EVGA that as of yesterday the back plates for the GTX 690 have shipped to EVGA. They expect them to be on sale at EVGA.com within a week.









I know it's just aesthetics, but it ties everything in nicely, as well as adding some protection for a $999 GPU. Someday I'm handing this card down and it'll still be in great condition.

It's the last piece to complete my computer build until next year when I can afford a huge capacity SSD.









PS - Nice build Nanoscale.


----------



## Qu1ckset

Quote:


> Originally Posted by *Arizonian*
> 
> For those interested: just got word from EVGA that as of yesterday the back plates for the GTX 690 have shipped to EVGA. They expect them to be on sale at EVGA.com within a week.
> 
> 
> 
> 
> 
> 
> 
> 
> I know it's just aesthetics but it ties everything in nicely as well as some added protection for $999 GPU. Someday I'm handing this card down and it'll still be in great condition.
> It's the last piece to complete my computer build until next year when I can afford a huge capacity SSD.
> 
> 
> 
> 
> 
> 
> 
> 
> PS - Nice build Nanoscale.


Good, I'm dying for the backplate. I have a bad itch to buy something for my computer lol


----------



## juanP

anyone watercooling their 690's yet?


----------



## thestache

Quote:


> Originally Posted by *Arizonian*
> 
> For those interested: just got word from EVGA that as of yesterday the back plates for the GTX 690 have shipped to EVGA. They expect them to be on sale at EVGA.com within a week.
> 
> 
> 
> 
> 
> 
> 
> 
> I know it's just aesthetics but it ties everything in nicely as well as some added protection for $999 GPU. Someday I'm handing this card down and it'll still be in great condition.
> It's the last piece to complete my computer build until next year when I can afford a huge capacity SSD.
> 
> 
> 
> 
> 
> 
> 
> 
> PS - Nice build Nanoscale.


Thanks for that; been wanting one for a while.


----------



## thestache

Quote:


> Originally Posted by *juanP*
> 
> anyone watercooling their 690's yet?


Yeah tons.

Koolance block on one of mine and it's fantastic.


----------



## egotrippin

Quote:


> Originally Posted by *Sexparty*
> 
> Got my monitor and it's great (other two come tomorrow) but when I rotate the image into portrait mode through nvidia control panel the image goes slightly blurry and text no longer looks crisp. Really unsure why.


Get that issue figured out yet?


----------



## Divineshadowx

Does anyone know why RAM wouldn't run at XMP? My G.Skill 2400MHz sticks were not running anything above 1600MHz without randomly crashing my system. I sent them back to Newegg, and they sent them back saying they were broken and couldn't be replaced; turns out they returned all 4 sticks completely broken. So I went out and bought a Corsair Vengeance 2133MHz quad-channel kit, and still the same problem: half the time it does not boot at XMP, and anything above 1600 crashes randomly during games. I'm on a 3770k @ 4.5GHz and running an Asus P8Z77 WS mobo.


----------



## pilla99

Quote:


> Originally Posted by *Divineshadowx*
> 
> Does anyone know why RAM wouldn't run at XMP? My G.Skill 2400MHz sticks were not running anything above 1600MHz without randomly crashing my system. I sent them back to Newegg, and they sent them back saying they were broken and couldn't be replaced; turns out they returned all 4 sticks completely broken. So I went out and bought a Corsair Vengeance 2133MHz quad-channel kit, and still the same problem: half the time it does not boot at XMP, and anything above 1600 crashes randomly during games. I'm on a 3770k @ 4.5GHz and running an Asus P8Z77 WS mobo.


Not sure. I run 2133MHz just fine, when my system decides to turn on; I have a CPU power/overheating issue. Not sure really.
Never once had a RAM issue though.


----------



## fat_italian_stallion

Up the voltage to 1.65V.


----------



## Lu(ky

Quote:


> Originally Posted by *juanP*
> 
> anyone watercooling their 690's yet?


I've been so busy at work I've had no time for my build, but in one week I go on vacation for 10 days doing stuff around the house, and my #1 project is going to be my Lian Li V354 build/mod with an EK GTX 690 block. I already have all the stuff, just no time to get it going. Also got in the mail today the modDIY cables for my SeaSonic Platinum 860W PSU; not bad for $110.00 to my door...

Got the same color below...


----------



## thestache

Quote:


> Originally Posted by *egotrippin*
> 
> Get that issue figured out yet?


Yeah, it's all good now. Lots to learn with this whole surround thing.

The three U2412Ms are intense and the bezels don't bother me at all, but I'm having performance issues. Either I'm getting serious input lag and ghosting, which I wasn't on one monitor, or the GTX 690 is struggling with the 3884x1920 resolution. I think it's the card and its VRAM limitations. Going to update drivers and have a play tonight, see if it improves. Ultra in BF3 hits the VRAM limitation and the card throttles from heat even with a minor OC, so I'm glad I waited before watercooling and committing to the GTX 690. Going to sell it and SLI some watercooled reference GTX 680 4GBs.

Will start a thread and document my experience with the monitors, de-bezeling them, and getting some GPUs that run them well, once I've had a good play with them.

But I can recommend surround gaming. It's intense.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *Sexparty*
> 
> Yeah it's all good now. Lots to learn with this whole surround thing.
> The three U2412Ms are intense and the bezels dont bother me at all but having performance issues. Either I'm getting serious input lag and ghosting which I wasn't on one monitor or the GTX 690 is struggling with the 3884x1920 resolution. Think it's the card and it's VRAM limitations. Going to update drivers and have a play tonight. See if it improves. Ultra in BF3 hits the VRAM limitation and the card throttles from heat even with a minor OC, so I'm glad I waited before watercooling and commiting to the GTX 690. Going to sell it and SLI some watercooled reference GTX 680 4GBs.
> Will start a thread and document my experience with the monitors and de-bezeling them and getting some GPUs that run them well soon once I've had a good play with them.
> But I can reccomend surround gaming. It's intense.


You should be fine at that resolution; my old setup with 3x 480s could handle it no problem, and even a slightly higher res. VRAM is only an issue for Max Payne 3 right now and 7680x1440.


----------



## FalcX

Sign me up !!

Have had mine running for almost a week. Just waiting for some cables/extensions to arrive before closing up the system.


----------



## thestache

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> You should be fine at that resolution; my old setup with 3x 480s could handle it no problem, and even a slightly higher res. VRAM is only an issue for Max Payne 3 right now and 7680x1440.


I'm confused.

I've gotten VRAM usage down a bit by changing my settings, so it doesn't stutter anymore from the VRAM limit being hit, but I'm still not feeling crisp movement and I'm getting a lot of screen tearing or ghosting (haven't decided which it is yet). Had this before, but it wasn't as intense as it is now. Not sure what's making this feel laggy.

Drivers helped a bit. Maybe it's just BF3, because it is an awful game performance-wise, or maybe it's the IPS monitors. Will have to wait for next week and the Planetside 2 beta before I make a decision.

But a GTX 690 certainly does not have enough VRAM for BF3 on ultra (full ultra, everything maxed, none of this turning AA off) at any surround resolution I've tested:

3880x1920 16:10 with bezel correction
3600x1920 16:10
3240x1920 16:9
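The resolutions listed above follow from three 1200x1920 portrait panels side by side; a minimal sketch, where the 140 px/seam bezel-correction figure is inferred from the post's numbers rather than measured:

```python
def surround_width(panel_w: int, count: int, bezel_px: int = 0) -> int:
    """Total horizontal resolution; bezel_px is the extra hidden pixels per seam."""
    return panel_w * count + bezel_px * (count - 1)

print(surround_width(1200, 3))       # 3600 -> the plain 3600x1920 mode
print(surround_width(1200, 3, 140))  # 3880 -> 3880x1920 with bezel correction
```

Bezel correction makes the GPU render the pixels "behind" each bezel, so the corrected mode is always a bit wider (and heavier on VRAM) than the raw panel total.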


----------



## AVEPICS

Quote:


> Originally Posted by *juanP*
> 
> anyone watercooling their 690's yet?


I posted a pic earlier in this thread, but I'll post this one below. *(I actually have the Signature Hydro Copper edition)*


----------



## Qu1ckset

Quote:


> Originally Posted by *AVEPICS*
> 
> I posted a pic earlier in this thread, but I'll post this one below. *(I actually have the Signature Hydro Copper edition)*


Looks sweet!


----------



## benceh

Yo guys, thinking about getting the GTX 690 but would need some advice/help first.

Setup: i5 2500k @ stock 3.3GHz with a CM 212 Evo aftermarket cooler installed (haven't dared to overclock yet).
Asus P8Z68-V PRO/GEN3
128GB Samsung 830 Series SSD
128GB Vertex 4 SSD
Seasonic X-760 Gold

I have to get myself a 120Hz monitor: either the BenQ XL2420T, 24", or the BenQ XL2410T, 24". The 10T fits my budget better.
I think the 20T and the 10T are the same, just a different design/look, no? By the way, I'm not sure if the 10T is a 120Hz monitor (if someone can confirm whether it is, that would be really nice).

So yeah, I am able to get a GTX 690 if my setup can handle it; otherwise I guess I'll get a 670.









If I overclock my 2500k CPU, would this current build fit well with a GTX 690, knowing that my board supports PCIe 2.0 and not 3.0, but that it won't make a noticeable difference in fps?
Or would it be better to get an i7 3770k and a simple but efficient Z77 motherboard, and overclock the i7 slightly?

I am using this setup for gaming only. No editing.

Now just one question out of curiosity: if I get a GTX 690 installed on my current mobo, in the PCIe 2.0 x16 slot, will the GPU run at x16 or at twice x8? Because it's a dual-GPU card, so I don't really get it.

Thanks guys!
(I don't have any experience in overclocking, so the less the better!)


----------



## Landon Heat

25c idle/40c load


----------



## Qu1ckset

Quote:


> Originally Posted by *benceh*
> 
> Yo guys, thinking about getting the GTX 690 but would need some advise/help first.
> Setup: i5 2500k @stock 3.3GHz with a CM 212 evo after market cooler installed on (didn't dare to overclock yet).
> Asus P8Z68-V PRO/GEN3
> 128GB Samsung 830Series SSD
> 128GB Vertex 4 SSD
> Seasonic X-760 Gold
> Have to get myself a 120 hz monitor. Either the BenQ XL2420T, 24" or the BenQ XL2410T, 24". The 10T fits more on my budget.
> I think the 20T and the 10T are the same, just different design/look no?? By the way, I'm not sure if the 10T is a 120hz monitor (If someone can confirm if it is or not it would be really nice).
> So yea, i am able to get a GTX 690 if my setup is able to handle it, otherwise i guess i'll get a 670
> 
> 
> 
> 
> 
> 
> 
> 
> If i overclock my 2500k cpu, would this current build fit well with a GTX 690, knowing that my board supports PCIe 2.0 and not 3.0 but it won't make a noticeable difference in fps.
> Or would it be better to get a i7 3770k and a simple but efficient Z77 motherboard? And overclock slightly the i7.
> I am using this setup for gaming only. No editing.
> Now just one question by curiosity, if i get a gtx 690 installed on my current mobo, on the PCIe 2.0 x16 slot, the gpu will run in x16 or in twice x8? Cause it's a dual gpu, so i don't really get it.
> Thanks guyss
> (Don't have any experience in overclocking, so the less the better!)


Bro, the GTX 690 is a beast. I'm using it on my Asus Maximus IV Gene-Z (PCIe 2.0) and on a 2560x1440 monitor I'm getting 100fps in BF3 at max settings, so I'm sure on a 120Hz 1080p panel the 690 will fly; PCIe 2.0 is fine!


----------



## benceh

Quote:


> Originally Posted by *Qu1ckset*
> 
> Bro, the GTX 690 is a beast. I'm using it on my Asus Maximus IV Gene-Z (PCIe 2.0) and on a 2560x1440 monitor I'm getting 100fps in BF3 at max settings, so I'm sure on a 120Hz 1080p panel the 690 will fly; PCIe 2.0 is fine!


Alright, thank you for your positive answer. I see we nearly have the same build:

an i5 2500k and a Z68 platform. Even though our motherboards aren't the same, in terms of GPU and everything else they are pretty much the same.
Overclocked at 4.6GHz, do you see any bottlenecking? And hmm, could I have your BIOS overclock settings please? (I don't have any experience in OC.)

Now for the 24-inch 120Hz monitor (my desk isn't big enough for a 27-inch), here are the monitors available in my local store:
-Samsung SyncMaster T23A750, 23" 3D LED TV-TFT, 3ms
-Samsung SyncMaster S23A750, 23" 3D LED TFT, black (don't know the difference between this and the one above), 3ms
-BenQ XL2420T, 24"
-Hanns.G HS233H3B, 23" 3D TFT, 120Hz (currently on sale for 125EUR!)
-Acer GD245HQ, 24" wide TFT 3D

I'm using it for FPS games in 2D; I have no intention of using 3D at the moment.
Price is a bit of an issue, but if it is really worth it, my wallet is flexible.


----------



## pilla99

Quote:


> Originally Posted by *Qu1ckset*
> 
> Bro, the GTX 690 is a beast. I'm using it on my Asus Maximus IV Gene-Z (PCIe 2.0) and on a 2560x1440 monitor I'm getting 100fps in BF3 at max settings, so I'm sure on a 120Hz 1080p panel the 690 will fly; PCIe 2.0 is fine!


The really great news about it is that (overclocked a little) you never drop below 60 FPS on ultra. It gives you a great, fluid experience the entire time you're playing. We have a similar build, so you should be able to expect the same thing.

A weird thing about this card, and something that can sort of suck, is that performance is not always consistent from game to game. For example, I can cream through BF3 on ultra and never see below 60 frames, yet I can't play a game like HoN on max settings without dropping into the 40s in large battles, and that's a game my old 6870 could easily handle.

It just needs some driver updates, I guess. Any major game should be fine though.


----------



## thestache

Quote:


> Originally Posted by *benceh*
> 
> Alright thank you for your positive answer. I see we nearly have the same build
> 
> 
> 
> 
> 
> 
> 
> i5 2500k and a z68 platform. Eventhough our motherboards aren't the same, in terms of gpu and everything they are pretty much the same.
> Overclocked at 4.6GHz do you see any bottlenecking?? And hmm, could i have your bios overclock settings please. (Don't have any experience in OC).
> Now for the 24inch 120hz monitor (My desk isn't big enough for a 27inch), here are the monitors available in my local store:
> -Samsung SyncMaster T23A750, 23" 3D LED TV-TFT it's 3ms.
> -Samsung SyncMaster S23A750, 23" 3D LED TFT, Schwarz (Don't know its difference between this and the one above) its 3ms.
> -BenQ XL2420T, 24"
> -Hanns.G HS233H3B, 23" 3D TFT, 120Hz (CURRENTLY ON SALE FOR 125EUR!)
> -Acer GD245HQ, 24" Wide TFT 3D
> I'm using it for Fps games in 2D, got no intention on using 3D at the moment.
> Price is a bit of an issue, but if it is really worth it, my wallet is flexible


Go the Samsung. Their 120Hz monitors are far better than anyone else's on the market at the moment and have the best colours/contrast for a TN monitor.

The GTX 690 will get you the desired 120FPS and won't be outdated performance-wise for a while, so it's a good investment.

Overclocked to 4.6GHz it won't bottleneck anything. A 2500k is fine for 1080p and will be for a while yet.


----------



## benceh

Quote:


> Originally Posted by *Sexparty*
> 
> Go the Samsung. Their 120Hz monitors are far better than anyone else's on the market at the moment and have the best colours/contrast for a TN monitor.
> GTX 690 will get you the desired 120FPS and won't be outdated as far as performance goes for a while. So is a good investment.
> Overclocked to 4.6ghz won't bottleneck anything. A 2500k is fine for 1080P and will be for a while yet.


Alright, at the beginning I was leaning more toward the BenQ XL2420T. You really think the Samsung? It's 3ms, compared to the others that are 2ms (not sure I'm 100% right on that).
And which of the two Samsungs?


----------



## egotrippin

Quote:


> Originally Posted by *benceh*
> 
> Alright, at the beginning i was more on the side of the BenQ XL2420T. You think the samsung?? oO It's 3ms, compared to the others that are 2ms? (dunno if im 100% right)
> And which one of both Samsung?


At 120hz you're going to have to wait 8.33ms for the screen to refresh anyway.
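That 8.33 ms comes straight from the refresh rate: the interval between refreshes is just 1000 divided by the Hz. A minimal sketch of the arithmetic:

```python
def refresh_interval_ms(hz):
    """Milliseconds between screen refreshes at a given refresh rate."""
    return 1000.0 / hz

print(refresh_interval_ms(120))  # ~8.33 ms per refresh at 120 Hz
print(refresh_interval_ms(60))   # ~16.67 ms per refresh at 60 Hz
```

So a 2 ms vs 3 ms panel response difference is small next to the refresh interval itself.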


----------



## benceh

So is a 120Hz monitor really worth it for FPS gaming in 2D (no interest in 3D) compared to a simple 60Hz monitor? Knowing that my GPU can get 100-120fps in every game.


----------



## egotrippin

Quote:


> Originally Posted by *benceh*
> 
> So is a 120hz monitor really worth it for FPS gaming in 2D (no interest in 3D) compared to a simple 60hz monitor? Knowing that my GPU can get 100-120fps in everygame.


I haven't tried one, so I can't say for sure. I think it's been claimed the human eye can't see that many frames per second, but maybe it adds some perceptual smoothness that, while not directly observable, still has an effect. Total input lag is going to matter more for first-person shooters if you need that split-second advantage. anandtech.com does a good job of measuring input lag, and there are some other good resources out there I've heard of.


----------



## benceh

Okay, so if I get a 120Hz monitor, which should I go with?
The ones available in my store:
-Samsung SyncMaster T23A750, 23" 3D LED TV-TFT, 3ms
-Samsung SyncMaster S23A750, 23" 3D LED TFT, black (don't know the difference between this and the one above), 3ms
-BenQ XL2420T, 24"
-Hanns.G HS233H3B, 23" 3D TFT, 120Hz (CURRENTLY ON SALE FOR 125EUR!)
-Acer GD245HQ, 24" Wide TFT 3D

The Hanns isn't top-tier, but the price is pretty attractive ^^ Most people say I should get the BenQ XL2420T or the Samsung, but the available Samsung isn't one of the newest ones :/ The newest ones aren't available yet.
Any ideas? (I live in Europe)


----------



## Arizonian

If you're not looking for an NVIDIA 3D Vision sponsored monitor, or even to keep that possibility open down the road, then the *Samsung* *SA950* for $449 is a 2ms 120 Hz monitor that should be strongly considered.

I've seen the SA950 in person and it is a beautiful monitor. I did a lot of homework researching 3D Vision and sponsored monitors from both AMD HD3D and NVIDIA 3D Vision before choosing myself.

One drawback for some: the 23" SA950 has a glossy (mirror-like) panel, so any lighting behind you will reflect. It can be very distracting when it's in constant view unless you turn off your lights. If you don't have lighting directly behind you, it won't be a problem.

Otherwise you're better off with a matte-finish monitor, where lighting doesn't reflect, that is equally good as a gaming monitor. The *Dell* *Alienware AW2310* for $449 is, from experience, a great 23" 2ms 120 Hz monitor. It's very solid with great colors; currently the kids are using it on the 2nd rig.









It boils down to the Samsung SA950, whose colors seem a bit more brilliant, versus the Alienware AW2310, which has the better implementation of 3D Vision.

Gaming at 120 Hz feels incredibly fluid to me; movement is clean and smooth at anything above 100 FPS, which the 690 handles very well on a single display at stock speeds. Going back to a 60 Hz monitor would be like going back to a CRT display for me, or pulling out an 8-track tape to listen to music.









The picture below helps illustrate how showing twice as many frames in between creates more fluid movement.



Edited to add: Video of Black Ops at 120 Hz on my ASUS VG278H monitor - FPS included upper left hand corner. Highest settings.
i7 3770K [4.0] & GTX 690 - Kepler Boost GPU #1 1175 MHz & GPU #2 1201 MHz


----------



## Levesque

I just ordered a brand new EVGA + EK waterblock for my 3rd LAN computer at home. I will use it with 3X24''.

I would really appreciate it if someone could point me to a good explanation of how to OC the 690, since I'm totally new to NVIDIA (AMD owner for the last 5 years).

Also, what are the best drivers to use with the 690 right now?

Thank you.

I will post some pics when getting everything later this week.


----------



## benceh

Quote:


> Originally Posted by *Arizonian*
> 
> If you're not looking for an NVIDIA 3D Vision sponsored monitor, or even to keep that possibility open down the road, then the *Samsung* *SA950* for $449 is a 2ms 120 Hz monitor that should be strongly considered.
> I've seen the SA950 in person and it is a beautiful monitor. I did a lot of homework researching 3D Vision and sponsored monitors from both AMD HD3D and NVIDIA 3D Vision before choosing myself.
> One drawback for some: the 23" SA950 has a glossy (mirror-like) panel, so any lighting behind you will reflect. It can be very distracting when it's in constant view unless you turn off your lights. If you don't have lighting directly behind you, it won't be a problem.
> Otherwise you're better off with a matte-finish monitor, where lighting doesn't reflect, that is equally good as a gaming monitor. The *Dell* *Alienware AW2310* for $449 is, from experience, a great 23" 2ms 120 Hz monitor. It's very solid with great colors; currently the kids are using it on the 2nd rig.
> 
> 
> 
> 
> 
> 
> 
> 
> It boils down to the Samsung SA950, whose colors seem a bit more brilliant, versus the Alienware AW2310, which has the better implementation of 3D Vision.


Thanks a lot. I have no interest in 3D, only 2D. So which one would you say is best for me, knowing that I play FPS games competitively? The Samsung you talked about seems to be a very good monitor, but it wouldn't be perfect for me, because I'm often not alone in the room my desktop is in, so I wouldn't be able to turn the lights off in the evening. Plus my desktop is fairly near a window.
I was thinking of either the Alienware or the BenQ XL2420T. What do you think?

Quote:


> Edited to add: Video of Black Ops at 120 Hz on my ASUS VG278H monitor - FPS included upper left hand corner. Highest settings.
> i7 3770K [4.0] & GTX 690 - Kepler Boost GPU #1 1175 MHz & GPU #2 1201 MHz


I saw your Crysis enthusiast video. How come you get an average of 60-65 fps with a GTX 690 on a single monitor? I thought the GTX 690 could average 100-110 fps in nearly all games.
1/ One question: I'm really interested in disabling the GTX 690's SLI from time to time when playing older games like CoD4. How does that function work?
2/ Does the GTX 690 give poor performance in certain games that don't suit it? Some people say that in some games their GTX 580 performs as well as a GTX 690.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *benceh*
> 
> I saw your Crysis enthusiast video and how come you get an average of 60-65 fps with a GTX 690 on a single monitor? I thought the GTX 690 could get 100-110 fps average on nearly/maybe all games.
> 1/ One question, i am really interested in the disabling the GTX 690 SLI from times to times when playing older games like CoD4. How does the function work?
> 2/ Is the GTX 690 giving poor performance on certain games that aren't exactly fitting it? Some people say that on some games their GTX 580 perform as good as a GTX 690 does.


It's an option in the NVIDIA Control Panel under the SLI/PhysX configuration.


----------



## Arizonian

Quote:


> Originally Posted by *benceh*
> 
> Thanks alot. I have no interest in 3D, only 2D. So which one would you say is the best for me? Knowing that i play fps games competivly. For me, the Samsun you talked about seems to be a very good monitor, but wouldn't be perfect for me, cause i'm often not alone in the room in which my desktop is in. So i wouldn't be able to turn the lights of during the evening. + My desktop is slighly near a window.
> I was thinking of either the Alienware, or the BenQ XL2420T. What do you think?
> I saw your Crysis enthusiast video and how come you get an average of 60-65 fps with a GTX 690 on a single monitor? I thought the GTX 690 could get 100-110 fps average on nearly/maybe all games.
> 1/ One question, i am really interested in the disabling the GTX 690 SLI from times to times when playing older games like CoD4. How does the function work?
> 2/ Is the GTX 690 giving poor performance on certain games that aren't exactly fitting it? Some people say that on some games their GTX 580 perform as good as a GTX 690 does.


I've heard nothing bad about the BenQ XL2420T. I can't comment on it personally, but it seems to be just as worthy a monitor.

As for the Crysis FPS, I meant to remove that video because I've learned it was incorrect. I must have had some setting in the control panel working against me. I re-monitored the game and found my framerate is much higher: 97 FPS for the lows. I'm going to re-record it as soon as I find play time.

As for disabling SLI: you're going to find you won't need to. In applications where SLI isn't supported, your 690 will act as a single 680 anyway.
Quote:


> Originally Posted by *Levesque*
> 
> I just ordered a brand new EVGA + EK waterblock for my 3rd LAN computer at home. I will use it with 3X24''.
> I would really appreciate if someone could direct me to a good explanation on how to OC the 690, since I'm totally new to Nvidia (AMD owners for the last 5 years).
> Also, what are the best drivers to use with the 690 right now?
> Thank you.
> I will post some pics when getting everything later this week.


Here are *670* & *680* guides from OCN members that have been very helpful. Kepler overclocking technique with offsets is fairly standard across the line. *Don't understand Kepler Core Clocks?* is a simple explanation.

You know better than anyone that it just boils down to luck, especially under water, where keeping temps down is no problem. You won't have any trouble keeping the card under 70C, where Kepler Boost throttles back 13 MHz on the core, and then another 13 MHz at 80C.
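The throttle steps described above can be sketched as a toy model (purely illustrative; the 13 MHz bins at 70C and 80C are the figures quoted in this post, not an official spec, and real Kepler Boost behavior is more granular):

```python
def throttled_boost_mhz(boost_mhz, temp_c):
    """Toy model of the behavior described above: Kepler Boost sheds
    one 13 MHz bin once the core hits 70C and another at 80C."""
    clock = boost_mhz
    if temp_c >= 70:
        clock -= 13
    if temp_c >= 80:
        clock -= 13
    return clock

print(throttled_boost_mhz(1175, 65))  # 1175 - under 70C, full boost
print(throttled_boost_mhz(1175, 72))  # 1162 - first 13 MHz bin dropped
print(throttled_boost_mhz(1175, 81))  # 1149 - second bin dropped
```

This is why water cooling helps: holding the card under 70C keeps it in the top bin.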

As for water-cooled 690s (not my expertise), I'll let those in the club who run them elaborate further; please chime in.









I'm currently using *301.42* on a single monitor and have found no problems. I just upgrade to each 'final' driver release to keep up to date. I don't try betas and haven't had any problems. Those with multiple monitors may have had a different experience.

Tip: When installing new drivers, choose the 'clean install' option, which wipes the previous drivers before applying the newer version.









We await your pics; knowing you, they will be monolith-worthy.


----------



## Levesque

Thank you Arizonian for the links. Will go read those right now.

And btw, I just bought a second EVGA 690 to go Quad-SLI under water.







Couldn't resist. Shiny!!!!!

I will try them with my 3X 30'' at 7680x1600. Is anyone here using that res? Is it working fine? Can I use the 3 DVI connectors on the 690 to do this with 3X 2560x1600, or do I have to go 2X DVI + 1 DP?


----------



## benceh

Quote:


> Originally Posted by *Arizonian*
> 
> The BenQ XL2420T I've heard nothing bad about. Can't comment on it but it seems to be just as worthy monitor.


Okay, hmm, since you possess the Alienware, what do you think about its input lag and colours? The BenQ is currently cheaper in my local shop (small sale).
Apparently the BenQ has only a little input lag, but I'm not sure about the colours.


----------



## benceh

I heard 120Hz doesn't work at all resolutions. Some gaming friends of mine said it only works at native resolution; is that true?


----------



## fat_italian_stallion

Why would you run anything other than native?


----------



## benceh

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> why would u run anything other than native?


Hmm, Call of Duty: Modern Warfare 1 is a game like SC2 and CSS (graphically).
You probably play your games as entertainment, to have a good time in front of some sick graphics, so the best is better.
I play for fun too, but in the competitive esports scene of CoD4. And at that level, most of us prefer not to run very high graphics.
For example, I play CoD4 at a lower resolution than a native 22/24-inch monitor, and I usually don't enable AA, or only a bit.
In those kinds of games, screen quality comes second; what matters most is the colours (in-game tweaking) and the settings that make it easiest to see your opponent.
With top graphics there are so many details that spotting a player is harder. That's briefly why.

However, when I play games like Crysis or BF, since it's for fun, I like to have most of the settings maxed out for a better and more enjoyable gaming experience. (Which I also do when I play CoD4, but it's different.)


----------



## Arizonian

Quote:


> Originally Posted by *benceh*
> 
> Okay, hmm, since you posess the Alienware one, what do you think about the input lagg and the colours. The BenQ is at the moment cheaper in my local shop (small sale).
> Apparently the BenQ only has a little input lag. But i'm not sure about the colours.


They both have equal specifications, so if it's less expensive I'd pick up the BenQ at your local shop. I think this choice answered itself.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *Levesque*
> 
> Thank you Arizonian for the links. Will go read those right now.
> And btw, I just bought a second EVGA 690 to go Quad-SLI under water.
> 
> 
> 
> 
> 
> 
> 
> Couldn't resist. Shiny!!!!!
> I will try those with my 3X 30'' at 7680X1600. is there anyone here using that res? Is it working fine? Can I use the 3 DVI connectors on the 690 to do this with 3X 2560X1600, or do I have to go 2X DVI + 1 DP?


Take a look here. NVIDIA tells you exactly which connectors you'll need for every configuration of all of their cards.
Quote:


> Originally Posted by *benceh*
> 
> Hmm Call of Duty : Modern Warfare 1 is a game like SC2 and CSS (graphically).
> You probably play your game as entainement, and to have a good time infront of some sick graphics so the best is better.
> I play for fun too, but in the competitive esport scene of CoD4. And at that point, most of us prefer to have not too high graphics.
> For example i play Cod4 on lower resolution than a native 22/24inch monitor, and i usually don't enable AA or just a bit.
> In those kind of games, the screen quality comes second, the most important is the colours (ingame tweaking) and the easiest settings to see your opponent.
> On top graphics, there are soo many details that spotting a player is harder. This is briefly why.
> However when i play games like Crysis or BF, since it's for fun and everything, i like to have most of the settings maxed out in order to get a good and more enjoyable gaming time. (Which i also do when i play Cod4, but it's different).


Time to invest in a CRT monitor for competitive play.


----------



## Trelga

Hey guys, quick question. I'm thinking of buying the 690; I've recently quit smoking and I feel like congratulating myself.

But how big of a PSU will i need for the 690? I currently have a Corsair HX650.

Everything in my computer will be:
Blu-Ray drive
ssd
Hard drive
i5-2500k
h100


----------



## Arizonian

Quote:


> Originally Posted by *Trelga*
> 
> Hey guys quick question. I'm thinking of buying the 690, I've recently quit smoking and i feel like congratulating myself.
> But how big of a PSU will i need for the 690? I currently have a Corsair HX650.
> Everything in my computer will be:
> Blu-Ray drive
> ssd
> Hard drive
> i5-2500k
> h100


650 Watt is the 'over-inflated' recommendation from NVIDIA. The lowest I'd feel comfortable with would be a good-quality 750 Watt, taking the other components into consideration. I settled on a good-quality 850 Watt for peace of mind that it wouldn't run at full load all the time. I don't think I've seen anyone go lower than 750 Watt.


----------



## Trelga

Quote:


> Originally Posted by *Arizonian*
> 
> 650 Watt is the 'over-inflated' recommendation from NVIDIA. The lowest I'd feel comfortable with would be a good-quality 750 Watt, taking the other components into consideration. I settled on a good-quality 850 Watt for peace of mind that it wouldn't run at full load all the time. I don't think I've seen anyone go lower than 750 Watt.


I'm not questioning your judgement. I guarantee you're more knowledgeable than me lol.

But I was looking at some benchmarks. They say the TDP for the 690 is around 312 watts.

I currently have two 560 Tis, and they draw around 199 watts apiece. So if my system can handle that now, shouldn't it be able to handle the 690? I really want the 690, but I don't have the cash for it and a new PSU; if I can't get the 690 I'll just go with two 670 FTWs.


----------



## pilla99

I think you would be fine with a 650W; however, if you OC you start using a lot more power. I have my 690 at a 135% power target, and that can pull a max of around 420 watts.
420 plus ~120 for the CPU and everything else would then be cutting it close.

However, these numbers all assume you are running full bore.
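Those figures can be sanity-checked with a quick back-of-envelope sum; the GPU and CPU numbers are the ones quoted above, while the allowance for drives, fans, and the board is a rough assumption:

```python
# Worst-case draw estimates from this thread (ballpark figures, not measurements)
loads_watts = {
    "GTX 690 at 135% power target": 420,
    "overclocked CPU": 120,
    "drives, fans, board, etc. (assumed)": 60,
}

def headroom_watts(psu_watts, loads):
    """PSU capacity left after summing worst-case component draw."""
    return psu_watts - sum(loads.values())

print(headroom_watts(650, loads_watts))  # 50 W spare on a 650 W unit - cutting it close
print(headroom_watts(850, loads_watts))  # 250 W spare on an 850 W unit
```

Which matches the advice above: a 650 W unit works at stock, but overclocking eats the margin fast.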


----------



## Trelga

I can't see myself OCing this beast. I really want it, but I'm going to wait a few weeks to decide between it and 670 SLI.

My CPU is currently OC'd to 4GHz, but undervolted to 1.22V from stock. I may try for 4.5, but I don't want to go above 1.3V.


----------



## fat_italian_stallion

If you have SB or IB you'll be fine, even overclocked, but SB-E is quite a bit more power hungry and would need the extra headroom of a 750W. With reasonable overclocks I see at most 1100W pulled from the wall with SLI 690s and SB-E, and that includes tons of fans, lighting (50W), and pumps that draw quite a bit. A 650W PSU will likely be more than enough for a single 690, which is more powerful than two 670s.


----------



## Trelga

Well, I will have one NZXT sleeved LED kit. I'm replacing the fans in my tower as well, but they aren't LED.

If you guys say I'm good, though, I'm buying it. I can say with 100% certainty this is the only time I'll have a spare 1000 dollars to "throw away" on a GPU.


----------



## thestache

Quote:


> Originally Posted by *Levesque*
> 
> Thank you Arizonian for the links. Will go read those right now.
> And btw, I just bought a second EVGA 690 to go Quad-SLI under water.
> 
> 
> 
> 
> 
> 
> 
> Couldn't resist. Shiny!!!!!
> I will try those with my 3X 30'' at 7680X1600. is there anyone here using that res? Is it working fine? Can I use the 3 DVI connectors on the 690 to do this with 3X 2560X1600, or do I have to go 2X DVI + 1 DP?


I hit the VRAM limit with my 3880x1920 setup, so there's no way your GTX 690 SLI will handle that resolution. Performance-wise my single GTX 690 doesn't do all that well at it either, so I'm not sure how you will go.

You can use all the dual-link DVIs on the GTX 690.


----------



## thestache

Quote:


> Originally Posted by *Trelga*
> 
> I can't see myself OCing this beast. I really want it but i'mma wait a few weeks to decide on it or 670 sli.
> My cpu is OC'd at the moment at 4Ghz, but is undervolted to 1.22 from stock. I may try to get it 4.5 but i'm not wanting to go above 1.3v


Don't be scared of overclocking. I ran mine at 1.416V and 5000MHz, really was not kind to it, and it never showed any signs of slowing down. I had it watercooled very well, but even so, as long as you don't go above 1.45V and keep it at 75C or below, it'll give you several years.


----------



## Fallendreams

EVGA *Backplates* in Stock!









CLICK HERE


----------



## pilla99

Quote:


> Originally Posted by *Fallendreams*
> 
> EVGA *Backplates* in Stock!
> 
> 
> 
> 
> 
> 
> 
> 
> CLICK HERE


I've never installed a backplate before; is it pretty straightforward? I don't think they're going to make an Asus one, and since the cards look the same anyway, I'm going to grab an EVGA plate.


----------



## Fallendreams

Quote:


> Originally Posted by *pilla99*
> 
> I've never installed a backplate before is it pretty straightforward? I don't think they're going to make an Asus one and since the cards look the same anyways I'm going to grab an EVGA plate.


It's super simple! Just screwing and unscrewing. I don't think ASUS will make a backplate for their GTX 690.


----------



## Qu1ckset

Quote:


> Originally Posted by *Fallendreams*
> 
> EVGA *Backplates* in Stock!
> 
> 
> 
> 
> 
> 
> 
> 
> CLICK HERE


I was just about to order until I saw shipping costs more than the backplate, $70 total....


----------



## Fallendreams

Quote:


> Originally Posted by *Qu1ckset*
> 
> i was just about to order until i saw shipping cost more then the back plate, $70 total....










WOW!!!


----------



## Qu1ckset

If anyone is ordering this backplate, can you *PLEASE* order me one and mail it to me? I will pay upfront for the backplate, and then when it arrives to you, I will send you more money for shipping, which should only be $10 max.

PLEASE!!! I WILL SEND REP!!!!


----------



## Arizonian

Woot snagged one! Finally my system shall be aesthetically complete.







WTG EVGA.


----------



## leignheart

Is EVGA freaking kidding? Shipping costs more than the backplate? This is a slap in the face to everyone who has been waiting to get this thing. And to top it off, it isn't even that great looking; nothing special about it. I might just go with EK on this one, because EVGA does not deserve my money.


----------



## xoleras

Quote:


> Originally Posted by *leignheart*
> 
> is evga freaking kidding? shipping is more than the backplate? this is a slap in the face to everyone who has been waiting to get this thing. and to top it off, it isnt even that great looking. nothing special about it. i might just go with ek on this one cause evga does not deserve my money.


First, take a deep breath. Then look at the picture: it is being shipped internationally. Ground shipping through EVGA is very cheap to CONUS. And another question: who cares about the price? The backplate doesn't improve temps; it's purely aesthetic, and it's optional. The shipping charge is purely due to it being international; $24.99 seems fine to me.

Just IMHO


----------



## fat_italian_stallion

Quote:


> Originally Posted by *leignheart*
> 
> is evga freaking kidding? shipping is more than the backplate? this is a slap in the face to everyone who has been waiting to get this thing. and to top it off, it isnt even that great looking. nothing special about it. i might just go with ek on this one cause evga does not deserve my money.


It's international shipping... It was only $15 to get my 690s to me in CONUS.


----------



## egotrippin

Can somebody report if the backplate will improve the rigidity of the card? I've noticed that with the Koolance block, the card isn't as straight as it should be and I don't want it to warp any further.


----------



## Qu1ckset

Quote:


> Originally Posted by *leignheart*
> 
> is evga freaking kidding? shipping is more than the backplate? this is a slap in the face to everyone who has been waiting to get this thing. and to top it off, it isnt even that great looking. nothing special about it. i might just go with ek on this one cause evga does not deserve my money.


If you live in the States, don't worry about that shipping charge. I live in Canada; that's why I want someone who lives in the States to help me out.


----------



## egotrippin

Quote:


> Originally Posted by *Qu1ckset*
> 
> If anyone is ordering this backplate can you *PLEASE* order me one and mail it to me, i will pay upfront for backplate, and then when it arrives to you, i will send you more money for shipping witch shud only my $10 max..
> PLEASE!!! I WILL SEND REP!!!!


I'll help ya out. I was about to order one anyway.


----------



## egotrippin

Anybody else getting this error on the site when they click "ADD TO CART" ?


----------



## macforth

For the last couple of months, since leaving Aussieland to visit my daughter and family in MD, I have been trying to buy two EVGA GTX 690 Hydro Copper GPUs, but unsuccessfully due to their very short supply. So I have almost concluded that buying two air-cooled ones and fitting waterblocks will have to be the way to go. I want to get them before I return in a couple of weeks... most of you know why.

To that end I started to look for all the different waterblocks available. I can find no review that has taken all available blocks and compared them, in a similar way to this review: http://translate.googleusercontent.com/translate_c?hl=en&prev=/search%3Fq%3DHEATKILLER%2B%25C2%25AE%2BGPU-X%2B%25C2%25B3%2BGTX%2B680%26hl%3Den%26biw%3D1901%26bih%3D1110%26prmd%3Dimvns&rurl=translate.google.com&sl=de&u=http://www.hardwareluxx.de/community/f137/bundymania-user-test-fullcover-wasserkuehler-fuer-die-amd-hd-7970-grafikkarte-889380.html&usg=ALkJrhgurMrW58D2x83muQWSHLudNXM0XA
Most of those who watercool look for good cooling results matched with good looks. As to what looks good... well, that's a subjective matter.
Personally, I can't seem to warm to those circles, lol.

However, I did like the Heatkiller series of blocks made by Watercool. I found some of them shown on various sites for 680s and 670s, but couldn't find any mention of 690s. So I went to their site and saw the following news item, dated 10th May 2012:
"In preparation: full cover water coolers for Geforce GTX 670 and GeForce GTX 690
We are currently developing full cover water cooler for NVIDIA GPU series GeForce GTX 670 and GeForce GTX 690. The coolers
are based on the new and improved cooling structure of the HEATKILLER ® GPU-X ³ GTX 680 and connect as usual excellent build
quality and cooling performance with one another"
That was quite a while ago, but although they list 680s and 670s, I still see no mention of 690s.

Has anyone heard any news regarding their GTX 690 blocks?

Here's some of their blocks for other GPU's : http://www.frozencpu.com/cat/l3/g/c87/s143/list/p1/b180/Heatkiller-Water_Blocks_VGA-VGA_Water_Blocks-Page1.html


----------



## Fallendreams

Quote:


> Originally Posted by *egotrippin*
> 
> Anybody else getting this error on the site when they click "ADD TO CART" ?


No, it's going through on my end. Try clearing your cache, temp files, and cookies, or try Chrome, Firefox, etc.


----------



## juanP

So if I use a 690 in an x16 slot, does it run in true x16 mode, or does it split to x8/x8 because of the dual chips?


----------



## egotrippin

Quote:


> Originally Posted by *macforth*
> 
> For the last couple of months, after leaving Aussieland and visiting my daughter and family in MD, I have been trying to buy two EVGA GTX 690 Hydro Copper GPU's, but unsuccesfully due to their very short supply. So I have almost come to the conclusion that buying two aircooled ones , and fitting water blocks will have to be the way to go. I want to get them before I return in a couple of weeks....most of you know why.
> To that end I started to look for all the different waterblocks available. I can find no review that has taken all available blocks and compared them, in a similar way to this review: http://translate.googleusercontent.com/translate_c?hl=en&prev=/search%3Fq%3DHEATKILLER%2B%25C2%25AE%2BGPU-X%2B%25C2%25B3%2BGTX%2B680%26hl%3Den%26biw%3D1901%26bih%3D1110%26prmd%3Dimvns&rurl=translate.google.com&sl=de&u=http://www.hardwareluxx.de/community/f137/bundymania-user-test-fullcover-wasserkuehler-fuer-die-amd-hd-7970-grafikkarte-889380.html&usg=ALkJrhgurMrW58D2x83muQWSHLudNXM0XA
> Most of those that watercool, all look to good cooling results matched with good looks. As to what looks good......well that's a subjective matter.
> Personally I can't seem to warm to those circles, lol.
> However, I did like the heatkiller series of blocks made by Watercool. I found some of them shown on various sites for 680's and 670's, but couldn't find mention of 690's. So I went to their site, and saw the following news message, dated 10th May, 2012:-
> "In preparation: full cover water coolers for Geforce GTX 670 and GeForce GTX 690
> We are currently developing full cover water cooler for NVIDIA GPU series GeForce GTX 670 and GeForce GTX 690. The coolers
> are based on the new and improved cooling structure of the HEATKILLER ® GPU-X ³ GTX 680 and connect as usual excellent build
> quality and cooling performance with one another"
> Now that's quite a while ago, but although they show 680's and 670's listed, I see no mention of 690's.
> Has anyone heard any news regarding their GTX 690 blocks?
> Here's some of their blocks for other GPU's : http://www.frozencpu.com/cat/l3/g/c87/s143/list/p1/b180/Heatkiller-Water_Blocks_VGA-VGA_Water_Blocks-Page1.html


If the Heatkiller block were available, I probably would have purchased it. I'm hoping they make one of the "hole"-design blocks in black on silver. If they release that, I will probably sell my Koolance block. I have no idea when they will release the new block, but I keep checking the site every few days.

Quote:


> Originally Posted by *Fallendreams*
> 
> No its going through on my end. Try to Clear your cache, Temp Files and cookies or try Chrome, Firefox, ETC.


Tried Chrome, Firefox, and IE, both logged in to my EVGA account and not. Turns out I can't add ANYTHING to my cart right now... not just the backplate. ***?


----------



## egotrippin




----------



## juanP

wow that got sold off fast


----------



## egotrippin

Quote:


> Originally Posted by *juanP*
> 
> so if i use a 690 on a X16 slot does it run in true x16 mode or does it split to x8/x8 because of the dual chips?


Full x16.


----------



## egotrippin

It's available again. I received an email notifying me it was back, but I still can't add it to my cart.


----------



## Arizonian

Quote:


> Originally Posted by *juanP*
> 
> wow that got sold off fast


Seem to be back in stock.

http://www.evga.com/products/moreInfo.asp?pn=M020-00-000243&family=Accessories%20-%20Hardware&sw=4


----------



## Shogon

I had to go on water; even at stock my system would black screen in BF3, lol. I bought the XSPC Razor block; max temp has been 55C, and under a typical gaming load it stays under 47C (highest core temp).


















Super efficient compared to SLI 580s.


----------



## egotrippin

It finally worked for me - backplate purchased, $4.99 shipping in the US


----------



## FPSandreas

do you fellas know when they will make oc versions of the gtx 690 like the gtx680 superclocked version??


----------



## egotrippin

Quote:


> Originally Posted by *FPSandreas*
> 
> do you fellas know when they will make oc versions of the gtx 690 like the gtx680 superclocked version??


I doubt there will be a superclocked version but I expect a Classified version.


----------



## Shogon

Quote:


> Originally Posted by *egotrippin*
> 
> I doubt there will be a superclocked version but I expect a Classified version.


I doubt that, Nvidia made it clear to all vendors not to alter the 690 in any way.


----------



## Romin

Guys, my card reaches 70s and starts to artifact in BF3. Is this normal?! even with sli disabled.


----------



## Qu1ckset

Quote:


> Originally Posted by *Romin*
> 
> Guys, my card reaches 70s and starts to artifact in BF3. Is this normal?! even with sli disabled.


Nah man, artifacting isn't normal, something's wrong with your card. When I play BF3 and my temps are around the same, the game runs perfectly.


----------



## thestache

Quote:


> Originally Posted by *Qu1ckset*
> 
> Nah man atifacting isn't normal somethings wrong with your card, when I play bf3 and my temps are around the same the game runs perfect.


Mine doesn't even artifact running BF3 at 3880x1920P with a 1212MHz overclock. It overheats like crazy and has to be stopped (goes above 75°C real quick), and that's seriously pushing the core and memory.

RMA it.


----------



## Qu1ckset

Quote:


> Originally Posted by *Sexparty*
> 
> Mine doesn't even artifact running BF3 at 3880x1920P with a 1212mhz overclock. Overheats like crazy and has to be stopped (above 75deg real quick) and thats seriously pushing the core and memory.
> RMA it.


Lol ya, I'm running mine at 2560x1440 and it doesn't even seem that hard of a task, and he is only running 1080p I think. Regardless, it's not normal for any card to artifact at stock clocks.


----------



## FiShBuRn

Notice: EVGA only ships to Canada, Mexico, and the United States.









They dont send the backplate to europe?


----------



## fat_italian_stallion

Quote:


> Originally Posted by *Shogon*
> 
> I had to go on water, even at stock my system would black screen in BF3 lol. Bought the XSPC Razor block, max temp has been 55C, typical gaming load, under 47C (highest core temp).
> 
> Super efficient compared to SLI 580s.


That's awful hot for water. I've never even come close to the 40s even while folding for days on end. Might want to check the mount of ur block.


----------



## macforth

For those looking for news about Heatkiller blocks to suit 690's ( eg egotrippin ), I just found this today:-

http://watercool.de/wbb/board1-watercool-support/4284-nvidia-geforce-gtx-690-update/


----------



## Qu1ckset

Quote:


> Originally Posted by *FiShBuRn*
> 
> Notice: EVGA only ships to the Canada, Mexico, and the United States.
> 
> 
> 
> 
> 
> 
> 
> 
> They dont send the backplate to europe?


You can ask someone on ocn to buy one for you and ship it to you like I did, or you just have to wait till other internet shops and eBay get it in stock .


----------



## egotrippin

Quote:


> Originally Posted by *macforth*
> 
> For those looking for news about Heatkiller blocks to suit 690's ( eg egotrippin ), I just found this today:-
> http://watercool.de/wbb/board1-watercool-support/4284-nvidia-geforce-gtx-690-update/


Thanks for the update. I've always thought that the Heatkiller blocks are classy.


----------



## jassilamba

Hey guys, I received my GTX 690 last week and sadly it was DOA. Can't wait to get the new one in and have some fun with it. Wish I could run like







and go get it myself. This wait is killing me.


----------



## egotrippin

Quote:


> Originally Posted by *jassilamba*
> 
> Hey guys, I received my GTX 690 last week and sadly it was a DOA. Can't wait to get the new one in and have some fun with it. Wish could run like
> 
> 
> 
> 
> 
> 
> 
> and go get it myself. This wait is killing me.


I think your avatar wants to eat my avatar.


----------



## pilla99

Is there any way to tell when the next set of beta drivers is coming? I really want to grab a new set and see if it helps some of my issues. I see the last beta drivers came out on July 2nd; how long do we normally wait for the next batch?


----------



## thestache

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> That's awful hot for water. I've never even come close to the 40s even while folding for days on end. Might want to check the mount of ur block.


Yeah, that's true; mine, water cooled and overclocked to 1212MHz, only hits 40°C, and that's with a loop that has limited cooling.


----------



## jassilamba

Quote:


> Originally Posted by *egotrippin*
> 
> I think your avatar wants to eat my avatar.


Haha, now that I see it, it's funny the way he/she stops chewing. LOL


----------



## wahdahale

System Specs :

Case - Corsair 650D
CPU - Intel i7 3930k @ 4.2
Cooling - Corsair H100 in pull configuration ( no room because of the heat sink on the mb :[ )
PSU - Corsair AX1200 Modular PSU
MB - ASUS P9X79 Deluxe
Memory - 16GB Corsair Vengeance RAM @ 1600
GPU - EVGA GTX690
HD - Corsair GT 180gb SSD


----------



## Romin

Anyone know how to switch between GPUs in single-GPU mode?!


----------



## fat_italian_stallion

Quote:


> Originally Posted by *Romin*
> 
> Anyone knows how to switch between GPUs in single mode?!


plug the monitor into a different dvi port.


----------



## Shogon

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> That's awful hot for water. I've never even come close to the 40s even while folding for days on end. Might want to check the mount of ur block.


It has to do with my ambient temps, and the fact that there's zero insulation in my room from when it was built. It averages 80°F in my room, 96°F+ outside lately (not counting humidity), and we don't use the AC as much as some.

During the night, and when I'm not running my Folding@Home machine for 4-5 days straight (in the same room; it adds constant hot air, and my room has no real circulation even with a ceiling fan and 2 stand fans), the temps are well below 50°C with my overclock. Where I live you either need PC parts that run cool, or you go water. Or move; someday I'll go to the Bay, where temps are 10°F lower or more. Can't imagine my temps in a place like that!


----------



## Arizonian

Back plate arrived this afternoon. Finally the little extra protection the 690 deserves.











Spoiler: Warning: GTX 690 Backplate PIC!


----------



## Qu1ckset

looks really good, cant wait to get mine!


----------



## FiShBuRn

omg i want one!


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Arizonian*
> 
> Back plate arrived this afternoon. Finally the little extra protection the 690 deserves.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: GTX 690 Backplate PIC!


That should have been included since you spent over $1000 on your card. Does look nice though.


----------



## Arizonian

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> That should have been included since you spent over $1000 on your card. Does look nice though.


+1 I agree. One would think.


----------



## Shogon

Nice backplate, in a few days I'll order mine after some more EVGA bucks. I had issues with EK blocks and EVGA backplates on my 580s, would be lame if that happened again lol.


----------



## thestache

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> That should have been included since you spent over $1000 on your card. Does look nice though.


Yeah and 4GB of VRAM for each GPU.


----------



## egotrippin

Quote:


> Originally Posted by *Sexparty*
> 
> Yeah and 4GB of VRAM for each GPU.


Any update on your triple monitor gaming? Was it a VRAM issue or something else?


----------



## juanP

Quote:


> Originally Posted by *Arizonian*
> 
> Back plate arrived this afternoon. Finally the little extra protection the 690 deserves.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: GTX 690 Backplate PIC!


Got both my plates today too. Looks nice, but not really worth the extra 60 bucks. Should have been included with the GPUs.


----------



## thestache

Quote:


> Originally Posted by *egotrippin*
> 
> Any update on your triple monitor gaming? Was it a VRAM issue or something else?


Yeah VRAM limitation.

In Battlefield 3 on Ultra settings with AA off and HBAO off I'm hitting 1700-2012MB of VRAM usage depending on the map and situation, so it's clearly a VRAM issue. I'd guess that with the 4GB cards I'd be using at least 2500MB of VRAM. Selling the GTX 690 on eBay and getting 2x EVGA 4GB with backplates for SLI and watercooling them. Shame, because I do love the GTX 690 and mine overclocks well.

Having a lot of trouble removing the bezels and re-mounting the screens also. Nowhere has any strong double-sided tape or metal repair tape to mount the panel to the floating box behind it, so that's been put on hold until I can find some, or find someone at a hardware shop who isn't totally hopeless and can assist me.

Surround gaming is worth the trouble so far though; I couldn't possibly go back to single-monitor gaming.
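A quick back-of-the-envelope check on those VRAM numbers: the raw framebuffers are only a small slice of the 1700-2012MB being reported, so the bulk is textures and render targets, which is why higher resolutions push usage up much faster than the framebuffer math alone would suggest. A rough sketch (assumes triple-buffered 32-bit color and deliberately ignores depth/stencil, MSAA, and streamed assets, which add far more):

```python
def framebuffer_mb(width: int, height: int,
                   buffers: int = 3, bytes_per_pixel: int = 4) -> float:
    """Raw framebuffer footprint in MiB for triple-buffered 32-bit color.
    Textures, depth/stencil and MSAA resolve buffers are NOT counted."""
    return width * height * bytes_per_pixel * buffers / 2**20

for name, (w, h) in {
    "single 1080p": (1920, 1080),
    "2560x1440": (2560, 1440),
    "triple 1080p surround": (5760, 1080),
}.items():
    print(f"{name}: {framebuffer_mb(w, h):.0f} MiB")
# single 1080p: 24 MiB
# 2560x1440: 42 MiB
# triple 1080p surround: 71 MiB
```

Even tripling the horizontal resolution only adds about 50 MiB of framebuffer, so the jump from ~2GB toward an estimated 2.5GB in surround is almost entirely assets and render targets, not the framebuffer itself.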


----------



## Joneszilla

May be able to buy one of these soon and join the club. Question: would I need to buy anything else? I.e., will the mobo and PSU in my sig rig work with the 690? Sorry if it's a stupid question, but I've only built this one computer and I haven't opened it up since I put it together. Little nervous about changing out working parts. Lol


----------



## egotrippin

Quote:


> Originally Posted by *Sexparty*
> 
> Yeah VRAM limitation.
> In Battlefield 3 Ultra settings with AA off and HBAO off I'm hitting 1700-2012MB VRAM usage depending on the map and situation so it's clearly a VRAM issue. Would be guessing with the 4GB cards I'd be using atleast 2500MB VRAM. Selling the GTX 690 on ebay and getting 2x EVGA 4GB with Backplate for SLI and watercooling them. Shame because I do love the GTX 690 and mine overclocks well.
> Having a lot of trouble removing the bezel and re-mounting the screens also. Nowhere has any strong double sided tape or metal repair tape to mount the panel to the floating box behind it so thats been put on hold until I can find some or someone that isn't totally hopeless and works at a hardware shop that can assist me.
> Surround gaming is worth the trouble so far though, couldn't possibly go back to single monitor gaming.
> 
> 
> Spoiler: Warning: Spoiler!


Nice screenshots... I just bought an Apple 27" monitor so I guess I'll be gaming at 2560x1440. Do you think the debezeling will make much of a difference in experience? I see these setups like Vega's triple monitors on an Ergotron LX


Spoiler: Warning: Spoiler!






which looks pretty cool from the front but pretty rough from the back or side. Has anybody come up with a more elegant solution? If I had those three Dell monitors you have, I'd just try to find a way to remove the Dell logo and leave a clean flat-black finish, and maybe place them on a single arm fixed to the desk.


----------



## USFORCES

Someone lost their butt on this card...


----------



## egotrippin

Quote:


> Originally Posted by *USFORCES*
> 
> Someone lost there butt on this card...


Newegg says OUT OF STOCK. Was that price a mistake?


----------



## thestache

Quote:


> Originally Posted by *egotrippin*
> 
> Nice screenshots... I just bought an Apple 27" monitor so I guess I'll be gaming at 2560 by 1440. Do you think the debezzeling will make much of a difference in experience? I see these setups like Vega's triple monitors on an erogtron lx
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> which looks pretty cool from the front but pretty rough from the back or side. Has anybody came up with a more elegant solution? If I had those three dell monitors you have, I'd just try to find a way to remove the dell logo and leave a clean flat-black finish and maybe place them on a single arm fixed to the desk.



With the stock stands you barely even see them, and it's a clean look.

I guess if the monitor shelf of my desk wasn't raised I might need to raise them and it'd be different, but for me it's not a problem. De-bezeling won't make a difference unless you're using surround, really. When I finally get the bezels off mine, I honestly don't mind how they look from the back as long as I'm getting what I want from the front.


----------



## egotrippin

Quote:


> Originally Posted by *Sexparty*
> 
> With the stock stands you barely even see them and it's a clean look.
> I guess if the monitor shelf of my desk wasnt raised I might need to raise them and it'd be different but for me not a problem. De-bezzling won't make a difference unless you're using surround really. When I finally get the bezels off of mine I honeslty don't mind how they look from the backs as long as from the front I'm getting what I want.


That looks nice - who cares what it looks like from the back, there's a wall there ;-)


----------



## USFORCES

Quote:


> Originally Posted by *egotrippin*
> 
> New egg says OUT OF STOCK. Was that price a mistake?


No, it was an open-box discount; either someone paid a big restocking fee, or Newegg took a loss, or both.


----------



## juanP

attached my backplates


----------



## Arizonian

^^*That's Awesome* juanP^^ Doubly sweet.







Ties it all together nicely.

All my EVGA cards have had back plates, glad the 690 has one now.


----------



## FalcX

Got my backplate today... the only thing was the $41 shipping and $13 brokerage fee when getting it. Need to arrange a group buy to Canada...

Can't wait to get it on.

I know lowering temps isn't its purpose, but has anyone seen a temp difference (up or down) since putting the backplate on?


----------



## Nizzzlle

Joining the club here with my EVGA gtx 690. So pumped!


Spoiler: Warning: Spoiler!


----------



## kxdu

Sign me up, Asus 690


----------



## dynn

I bought the GTX 690 last month, but I still can't use it because I'm building a new computer.

I just started with the GTX 690. I was looking for some information to help me decide what processor I should buy and found this forum on Google. I think you guys can help me build my computer.

I will use only one monitor, and well, it's big; 1920x1080 is my resolution, and I'm not planning to get a second GPU.

I'm a little confused about which processor I should buy (i7 3770K, i7 3930K, or i7 3960X). I want to use it only for games,
but I want a beast of a computer.
I'm saving money and right now I have $1200 (I save about $60 per week, sometimes $100 per week). I can wait for a better one and keep saving; I know it will take long to get all the parts, but that's fine, there is no rush.

Should I wait for the next processor/motherboard releases and keep saving money?

DSC06180.JPG 2877k .JPG file


----------



## egotrippin

Quote:


> Originally Posted by *dynn*
> 
> i bought the gtx690 last month, but still unable to use it because im building a new comp
> i just started with the gpu 690x. I was looking for some information that helps me to decide what processor i should buy and found this forum in google. i think you guys can help me to build my computer.
> i will use only one monitor, and well its big, (1920x1080) thats my resolution and im not planing to get two gpu
> im a little confuse of what processor should i buy (i7 3770k), (i7 3930k) or the (i7 3960x) i want to use it only for games
> but i want a beast computer
> Im saving money and now i have 1200dls,( I save like 60 dls peer week and sometimes 100 peer week) i can wait for a better one and still saving money, i know it will take long to get all the parts but its fine, there is no rush.
> may i wait for next processors /motherboard releases? and keep saving money?
> 
> DSC06180.JPG 2877k .JPG file


Welcome to the club. If you're going to be gaming on a 1920x1080 monitor, that GTX 690 is going to put out more than 60 frames per second in any game you play. The CPU will have little effect on your overall experience. If you're only interested in games, a 3770K is more than ample. $1200 after purchasing the 690 is enough to build a nice computer.

Stop waiting and start shopping!


----------



## thestache

Quote:


> Originally Posted by *dynn*
> 
> i bought the gtx690 last month, but still unable to use it because im building a new comp
> i just started with the gpu 690x. I was looking for some information that helps me to decide what processor i should buy and found this forum in google. i think you guys can help me to build my computer.
> i will use only one monitor, and well its big, (1920x1080) thats my resolution and im not planing to get two gpu
> im a little confuse of what processor should i buy (i7 3770k), (i7 3930k) or the (i7 3960x) i want to use it only for games
> but i want a beast computer
> Im saving money and now i have 1200dls,( I save like 60 dls peer week and sometimes 100 peer week) i can wait for a better one and still saving money, i know it will take long to get all the parts but its fine, there is no rush.
> may i wait for next processors /motherboard releases? and keep saving money?
> 
> DSC06180.JPG 2877k .JPG file


The GTX 690 is a good future-proof GPU for 1080p but totally unnecessary; you'd have been better off getting a single GTX 680 Lightning and overclocking it. Anyway, since you're a novice I'd stick with a 3770K or 3570K and overclock either to 4500MHz with a Corsair H100, an ASUS Sabertooth Z77 mobo, Corsair Vengeance 8GB 1866MHz RAM, a Corsair Force 3 120GB SSD, and whatever else you want. Anything more is overkill and not needed, but this should future-proof you quite well.


----------



## Qu1ckset

Quote:


> Originally Posted by *Sexparty*
> 
> GTX 690 is a good future proof GPU for 1080P but totally un-necessary, would have been better off getting a single GTX 680 Lightning and overclocking it. Anyways, since you're a novice I'd stick with a 3770K or 3570K and overclocking either to 4500mhz with a Corsair H100, ASUS Sabertooth Z77 mobo, Corsair Vengeance 8GB 1866mhz RAM, Corsair Force 3 120GB SSD and whatever else you want. Anything more is overkill and not needed but should future proof you quite well.


The H100 is overkill IMO; I have an H80 @ 4.6 and she stays nice and cool. ASUS Sabertooth, Maximus, and Rampage boards are all overkill if you're on a budget, but they're all bomb mobos







, as far as RAM and SSDs go, pretty much any one is a good choice. Right now I think the Vertex 4, Samsung 830, and Crucial M4 are the best bang for your buck!


----------



## thestache

Quote:


> Originally Posted by *Qu1ckset*
> 
> the h100 is overkill imo, i have a h80 @4.6 and she stays nice in cool, asus sabertooth, maximus, and rampage boards are all overkill if on a budget , but all bomb mobos
> 
> 
> 
> 
> 
> 
> 
> , as far as ram and ssd's pretty much anyone is a good choice, right now i think the vertex4, samsung 830, and crucial m4 are the best bang for your buck right now!


There is no such thing as overkill when it comes to cooling Ivy Bridge. Lol.

Just like you can't put a price on a good motherboard. Buy it right the first time so it never needs to be upgraded until next generation. Nothing worse than needing a new motherboard or pulling everything out to install a new one. Sabertooths are cheap anyway.


----------



## Qu1ckset

Quote:


> Originally Posted by *Sexparty*
> 
> There is no such thing as overkill when it comes to cooling Ivy bridge. Lol.
> Just like you can't put a price on a good motherboard. Buy it right first time so it never needs to be upgraded jntil next generation. Nothing worse than needing a new motherboard or pulling everything out to install a new one. Sabertooths are cheap anyways.


Ya, I have Sandy Bridge so no worries, and ya, I hear you; that's why I bought everything in my sig rig to last me till Maxwell and Haswell.


----------



## dynn

OK then, according to what you guys are saying,
I'll buy the i7 3770K (especially for heavy games) and university stuff.
Like I said before, I already got the GTX 690, and I have $1200 banked (saving $60-80 each week, sometimes more).

Now the problem is which mobo I should get.

There are a lot of them. First of all, I need to know exactly what the mobo contributes to games and CPU usage.
It doesn't matter if I spend $400 or more on the mobo alone, as long as I know it's a great choice,
but I don't really want to buy something that will be outdated next year or so. Like I said before, I can wait and keep saving money if something great is coming soon.

1.- Looking for the best performance.
2.- Overkill is fine, since it will be good for the future.
3.- Paying more for motherboard looks could be fine <--- depending on how much ($200 just for looks, of course I will not pay).

I don't know what all the specs on a mobo mean, sorry, but if someone can explain them to me with examples I'd appreciate it very much.

Another important thing: should I get a mobo with a sound card? Is that really important? Or should I get one of the best Razer headphones?

Recommendations are welcome. If there's a forum dedicated to what to get, please link it for me, because I still have some questions about what to buy.

I'm from Mexico (sorry for the bad English), and in Mexico it's a little more expensive to get stuff like processors, and for SSDs/mobos sometimes they don't have the best ones.
My parents are in New York on vacation right now; maybe I should tell them to buy something there.

Right now I'm just focusing on the most important parts of my new build (GTX 690 <-- got it). Now it's time to get the processor (i7 3770K, since it's the most recommended), and now I need help with the mobo.

If someone links a mobo, that would be great.

Sorry for taking your time, guys, and thanks to all.


----------



## Romin

Quote:


> Originally Posted by *dynn*
> 
> Ok then, according to what you guys are saying
> ill buy the i7 3770k (for heavy games especialy) and university stuff
> like i said before i already got the gtx 690, and i have banked $1200 dls (getting 60/80 dls each week and sometimes more)
> now the problem is wich mobo should i get?
> theres a lot of them, first of all ( i need to know what exactly mobo does in games and cpu usage)
> it doesnt matter if i expend 400 dls or more on mobo only, by knowing its a great choice
> but i dont really want to buy something that will be outdated the next year or something like that. Like i said before i can wait and still saving money if something great is coming soon.
> 1.- looking for best performance
> 2.- if overkill should be good because it will good for future
> 3.- motherboard colors for more money (could be fine) < --- depending on how much is it ($200 just for a nice of course will not get it)
> i dont know what a mb are, etc etc.. on mobo means, sorri but if someone can explin me with examples ill apreciate it too much.
> Another important thing (should i get a mobo with sound card?) its that really important? or should i get one of the best razers headphones?
> recomendations are welcome, if theres a forum dedicated to what to get plz link it to me, because still having some question about what to get.
> Im from mexico (sorri for bad english) and mexico is a little more expensive getting stuff like processors, and ssd /mobo sometimes they cant have the best ones
> my parent are right now in newyork in vacation, maybe i should tell them to buy something in there
> right now im just focussing in most important parts on my new build (gtx 690 <-- got it) now its time to get processor (i7 3770k because most recommended) and now need support for mobo
> if someone links the mobo should be great
> sorri for wasting time guys and thanks to all


Here is a list of parts I recommend.

And u won't need a sound card with this mobo.


----------



## thestache

Quote:


> Originally Posted by *Romin*
> 
> Here is a list of parts I recommend.
> 
> And u won't need a sound card with this mobo.


Pretty much nailed it.

Either the Sabertooth or the Formula for motherboards, and personally I prefer Corsair or Intel SSDs because of their speed. But other than that, hit those parts up and you'll be good.


----------



## dynn

Quote:


> Originally Posted by *Sexparty*
> 
> Pretty much nailed it.
> Either Sabertooth or Formula for motherboards and personally I prefer Coirsair or Intel SSD drives because of their speed. But other than that, hit those parts up and you'll be good.


I was taking a look on the internet and found these mobos:

MOTHERBOARD CROSSHAIR V FORMULA/THUNDERBOLT SOCKET AM3+ AMD 990FX USB 3.0 - ASUS
MOTHERBOARD MAXIMUS IV EXTREME B3 SK1155 - ASUS
MOTHERBOARD MAXIMUS IV EXTREME-Z B3 1155 INTEL Z68 SATA 6GB/S USB 3.0 - ASUS
MOTHERBOARD MAXIMUS V EXTREME SOCKET 1155 - ASUS
MOTHERBOARD MAXIMUS V GENE SOCKET 1155 - ASUS

Holy... which one should I get, and what's the difference?


----------



## Romin

Quote:


> Originally Posted by *dynn*
> 
> i was taking a look on internet and found thoses mobo:
> MOTHERBOARD CROSSHAIR V FORMULA/THUNDERBOLT SOCKET AM3+ AMD 990FX USB 3.0 - ASUS
> MOTHERBOARD MAXIMUS IV EXTREME B3 SK1155 - ASUS
> MOTHERBOARD MAXIMUS IV EXTREME-Z B3 1155 INTEL Z68 SATA 6GB/S USB 3.0 - ASUS
> MOTHERBOARD MAXIMUS V EXTREME SOCKET 1155 - ASUS
> MOTHERBOARD MAXIMUS V GENE SOCKET 1155 - ASUS
> holy... wichone should i get and whats the difference?


None of them!








http://www.newegg.com/Product/Product.aspx?Item=N82E16813131854


----------



## thestache

Quote:


> Originally Posted by *dynn*
> 
> i was taking a look on internet and found thoses mobo:
> MOTHERBOARD CROSSHAIR V FORMULA/THUNDERBOLT SOCKET AM3+ AMD 990FX USB 3.0 - ASUS
> MOTHERBOARD MAXIMUS IV EXTREME B3 SK1155 - ASUS
> MOTHERBOARD MAXIMUS IV EXTREME-Z B3 1155 INTEL Z68 SATA 6GB/S USB 3.0 - ASUS
> MOTHERBOARD MAXIMUS V EXTREME SOCKET 1155 - ASUS
> MOTHERBOARD MAXIMUS V GENE SOCKET 1155 - ASUS
> holy... wichone should i get and whats the difference?


You want the Z77 version shown in the link.


----------



## egotrippin

I have to vote for the Seasonic Platinum 860 PSU; AnandTech said it was quite possibly the best PSU ever, and I have the 1000-watt version.

Also, a good Gigabyte mobo can be found for $100-150 on sale, and http://www.microcenter.com/ has discounts on motherboard + CPU combination purchases. Their prices are often the best I've found anywhere. I got my Gigabyte Z68XP-UD4 there for $150 and it's stable, overclocks easily, and feels heavy and solid as a rock. Gigabyte holds many of the overclocking world records on their boards.

As for SSDs, the drives mentioned previously are good, but the OCZ Vertex 4, or what I have, the OCZ Vertex 3 MAX IOPS, hang with or exceed the performance of every drive on the market; use the benchmark comparisons at http://www.anandtech.com/bench to see how they sit at the top of every benchmark.


----------



## max883

just got my water cooling ready to go tonight







XSPC GTX 690 water block, XSPC 3x120 radiator, XSPC RayStorm CPU block, XSPC D5 Dual Bay Reservoir/Pump







All in a Thermaltake DH-101 HTPC case


----------



## macforth

Does anyone have 2x GTX 690s with XSPC waterblocks, using an XSPC Razor SLI Flow Bridge?

If so, what SLI spacing is the flow bridge made for? There are literally no specifications for it to be found anywhere.

I need a wide spacing; that is, the distance between my SLI PCIe slots is ~61mm, or 2.4 inches.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *macforth*
> 
> Does anyone have 2 X GTX 690's with XSPC waterblocks.........and using an XSPC Razor sli Flow Bridge.
> If so......what distance SLI is the flow bridge made for. There are literally no specifications on it to be found anywhere.
> I need a tri distance........that is ..the distance between my SLI PCIe slots is ~ 61mm or 2.4 inches


The distance between PCIe slots is standard across all boards; mobos wouldn't fit in cases otherwise.


----------



## macforth

You are correct that the spacing of PCIe slots is set by the ATX standard, but that is not what I was referring to.
When using SLI, different motherboards will advise using different slots.
For example: the Asus Maximus V Extreme has 4 full-length red PCIe slots, and one black full-length one situated between the 2nd and 3rd red slots.
If you are running 4x SLI, then all four red slots must be populated. However, if you are running dual SLI, then the first red slot and the black slot are to be populated.
So in dual SLI, the distance between video cards 1 and 2 is different from the distance between the first two cards in 4x SLI.

I have 2 690s and the MVE; the spacing of the PCIe slots when running these is ~61mm, or 2.4 inches.

As the XSPC SLI bridge has zero specifications listed anywhere, I was interested in whether I could use it, should I buy two XSPC water blocks.


----------



## fat_italian_stallion

On the Maximus V Extreme, for SLI you should populate the 1st red and the 3rd red slot, as stated by Asus in the manual, since you'll get the most bandwidth that way assuming no other red slots are populated. As for the Razor SLI bridge, there's only one model out right now, so you really don't have a choice of which slots to run it in, unlike EK, which has multiple lengths that skip slots.


----------



## macforth

I am sorry, but that is incorrect. The Asus manual does NOT state that; in fact, both the printed one that came with mine and the downloadable one CLEARLY state that when using two cards in SLI, slots PCIe 3.0/2.0_x16/8_1 and PCIe 3.0/2.0_x8_2B should be used.

The first one is the first red slot, and the second (2B) is the black slot.

As to whether there may be a difference if the second card is in a different slot, I simply don't know the answer. But both of the suggested slots run PCIe 3.0 and in dual SLI would be running in x8 mode (native). That's the equivalent of PCIe 2.0 x16.

As to the spacing of the XSPC SLI bridge, I now have an answer from XSPC themselves. They have notified me that the length is 21mm, so it can't be used.


----------



## Anzial

On the M5E it is _advisable_ to use the first red slot and the black x16 slot for 2x SLI, because it'll use native CPU lanes, bypassing the PLX chip, and provide the most performance. That's straight from the mouth of Shamino, the designer of the M5E board.


----------



## Supreme888

40 experimental project?


----------



## Qu1ckset

Quote:


> Originally Posted by *Supreme888*
> 
> 40 experimental project?


wow.....


----------



## thestache

Quote:


> Originally Posted by *macforth*
> 
> I am sorry, but that is incorrect. The Asus manual does NOT state that, in fact both the written one that came with mine, and the downloadable one CLEARLY state that when using two cards in SLI then slots PCIe 3.0/2.0_x16/8_1 and PCIe 3.0/2.0_x8_2B, should be used.
> The first one is the first red slot, and the second (2B) is the black slot.
> As to whether there may be a difference if using the second card in a different slot, I simply don't know the answer. But both the slots suggested run PCIe 3.0 and in SLI (dual) would be running in x8 mode (native). Thats the equivalence of x16 PCIe 2
> As to the distance of the XSPC SLI bridge, ........I now have an answer from XSPC themselves. They have notified me that the length is 21 mm and so is unable to be used.


Just get two Koolance GTX 690 blocks with the adjustable 1-2 or 2-3 SLI bridge. That way you can choose to run them in either serial or parallel and get the blocks to fit your mobo instead of the other way around.

http://koolance.com/help-video-block-connecting

Otherwise you're talking a big gap between blocks, and regular tubing would work fine for that. 61mm is a lot of room.


----------



## pilla99

40,000 dollars sitting on that table. Take those down to a dealership and grab a nice car.


----------



## Supreme888

https://www.reddit.com/r/xtp46/my_friends_work_just_got_a_shipment_of_37_gtx/


----------



## macforth

Watercool have announced that their GTX 690 waterblocks are close to completion, and that photos of the waterblocks will more than likely be out at the end of this week.

The coolers are to be based on the new and improved cooling structure of the HEATKILLER® GPU-X³ GTX 680. They have also made it known that 690 backplates will be available with the blocks.


----------



## PCModderMike

Quote:


> Originally Posted by *Supreme888*
> 
> 40 experimental project?


----------



## jagz

Quote:


> Originally Posted by *macforth*
> 
> Watercool have announced that their GTX 690 waterblocks are close to completion, and that photos of the waterblocks will more than likely be out at the end of this week.
> The coolers are to be based on the new and improved cooling structure of the HEATKILLER ® GPU-X ³ GTX 680. They have also made it known that 690 backplates will be available with the blocks.


Hope it's soon; I'm about to buy this Koolance block. I actually wandered over here to ask about backplates, since I can't find any. There is one on PPCs, but they say it's only for EK blocks.


----------



## dynn

I just bought the Maximus V Formula motherboard and an i7 3770K, just like you recommended to me.
I really appreciate your help, and I'm learning more about computers.

I will have it in 2 weeks, and as I said before, I'm buying it part by part.

What case do you suggest?

I'm thinking one that goes well with the Maximus V Formula and the GTX 690:

1. One with enough space to work on everything without problems
2. One with great looks
3. Good fans

Just enough to OC the i7 3770K, with space, great looks and good airflow.

What do you say? Or does anyone in here have a suggestion?


----------



## dynn

Sorry, forgot something: I'm also going to buy the Corsair H100 water cooler.


----------



## thestache

Quote:


> Originally Posted by *jagz*
> 
> Hope it's soon, I'm about to buy this koolance block. I actually wandered here to ask about backplates, I can't find any? There is one on PPC's but it's only for EK so they say.


The Koolance block is probably the best on the market and I can recommend it; the backplates are also available direct from EVGA.


----------



## Divineshadowx

Is there any way to make the 690 hold a stable 120fps? My 690s always throttle below the stock 915MHz. For example, I'm playing BF3 and getting 80fps with a quad-SLI setup at 1080p, when I should be getting 200 on a small map. Basically in every game I play, even very undemanding ones, I never get 120fps. I tried EVGA Precision's frame rate target, but that was an utter failure.


----------



## egotrippin

Quote:


> Originally Posted by *Divineshadowx*
> 
> Is there any way of making the 690 keep a good stable 120fps. My 690s always throttle below the stock 915. Like im playing bf3 and getting 80fps with a quad sli setup on 1080p, I should be getting 200 on a small map. Basically every game i play, even very low demanding ones im never getting 120fps. I tried evga precision frame rate target but that was an utter failure.


Are you using vsync? I don't have a 120Hz monitor, but when I was curious what theoretical top-end frame rate I could get, I turned vsync off and was getting about 130fps on average. If you're getting 120fps on a 120Hz monitor, screen tearing shouldn't be an issue anyway. I've also noticed that GPU overclocks matter more at lower framerates, but a good memory overclock matters at very high frame rates, so crank that memory clock up +300MHz or more.


----------



## Qu1ckset

I'm curious about the 690 Hydro Copper: is there a light on the side of it like on the stock 690 cooler, and if so, what color is it?
Secondly, I know this block will probably perform worse than others like Koolance and EK, but is it going to be a big hit in performance or only a little worse?


----------



## egotrippin

Quote:


> Originally Posted by *Qu1ckset*
> 
> I'm curious about the 690 hydro copper, is there a light on the side of it like the stock 690 cooler and if so what color is it?
> Secondly I know this block is going to probably have less performance then others like koolance and ek but is it going to be a big hit in performance or only a little worse?


The waterblock for the Hydro Coppers is designed by Swiftech, and they make good products. I have a Swiftech CPU block which works great. Before my GTX 690 with a Koolance waterblock, I had a GTX 580 Classified Hydro Copper, which was also designed by Swiftech. In this review http://www.xtremesystems.org/forums/showthread.php?279944-Bundymania-User-Review-10x-GTX-570-580-Fullcover-Waterblocks you can see how the Swiftech-designed Hydro Copper block for the GTX 580 delivered the lowest temperature of the test selection. If that same quality was passed along to the GTX 690 Hydro Copper, then I'd expect it to be the best as well.


----------



## Divineshadowx

Quote:


> Originally Posted by *egotrippin*
> 
> Are you using vsync? I don't have a 120hz monitor but when I was curious what theoretical top end frame rate I could get, I turned vsync off and I was getting about 130 fps on average. If you're getting 120 fps and you have 120 hz monitor, screen tearing shouldn't be an issue anyway. I've also noticed that gpu overclocks matter more at lower framerates but doing a good memory overclock matters at very high frame rates so crank up that memory clock +300 or more.


Well, I'm using adaptive vsync because it seemed like the "new" tech, but I guess it's useless since my frame rates don't actually drop below 60? And if I turn off vsync I get like 500fps in some games. Maybe I should change the power management mode, but in that case the cards stay at 915MHz and heat up my room.


----------



## egotrippin

Quote:


> Originally Posted by *Divineshadowx*
> 
> Well, im using adaptive vsync because it seemed like the "new" tech, but i guess it is useless since my frame rates dont actually drop below 60? And if i turn off vsync i get like 500 in some games. Maybe i should change the power managment mode but in that case the cards stay at 915mhz and heat my room.


Arizonian would be better suited to answer because he runs 120Hz, but I know from when I tried without vsync, the framerate was high enough that I couldn't notice any screen tearing. Try it out ;-)

And yes, bump up that power target. If I'm anywhere close to falling short of my target frame rate, I bump the power target to +135%.


----------



## Qu1ckset

Anyone on here with a watercooled 690 system: I want to know your rad setup and what temps you get with it, please!

I want to find the smallest layout of rads I can get away with. I want a minimalistic watercooling build with the least tubing possible so it looks simple and clean, but gets better temps than my current setup!


----------



## egotrippin

Quote:


> Originally Posted by *Qu1ckset*
> 
> Anyone on here with watercooled 690 system I want to know your read setup and what temps you get with them please!
> I wanna find the smallest layout of rads I can get away with, I want a minimalistic water cooling build with the least tubing I can get away with so it looks simple and clean, but gets better temps then my current setup!


Gaming, my temps are anywhere from 39c-46c. And that's overclocked, running games at 2560x1440 that use up 90% or more of the 690's capabilities.


----------



## Qu1ckset

Quote:


> Originally Posted by *egotrippin*
> 
> Gaming, my temps are anywhere from 39c-46c. And that's when overclocked running games at 2560 x 1440 that use up 90% or more of the 690s capabilities.


but what rads are you using and what cpu do you have?


----------



## egotrippin

Quote:


> Originally Posted by *Qu1ckset*
> 
> but what rads are you using and what cpu do you have?


i7 2600K @ 4.4GHz with a Swiftech Apogee HD waterblock.

I have a 480mm rad and a 120mm rad, although I bet I could get away with just a 240. I have Feser rads, which aren't made anymore, but if I were in the market I'd definitely buy Alphacool's all-copper rads like the UT60.


----------



## Qu1ckset

Quote:


> Originally Posted by *egotrippin*
> 
> i7 2600k @ 4.4 Ghz with swiftech apogee hd waterblock
> I have a 480mm rad and a 120mm rad although I bet I could get away with just a 240. I have Feser rads and they aren't made anymore but if I was in the market I'd definitely buy Alphacools all copper rads like the UT60


If I end up building my loop, I'll probably stick to HWLabs Black Ice Extreme rads; I loved them on my last build.


----------



## egotrippin

Quote:


> Originally Posted by *Qu1ckset*
> 
> if i end up building my loop, il probably stick to hwlabs black ice extreme rads, i loved them on my last build


You will find excellent information here: martinsliquidlab.org


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Well, im using adaptive vsync because it seemed like the "new" tech, but i guess it is useless since my frame rates dont actually drop below 60? And if i turn off vsync i get like 500 in some games. Maybe i should change the power managment mode but in that case the cards stay at 915mhz and heat my room.


Sorry, what?

You're running quad SLI for 1080p with vsync on? You realize you could do that with a single GTX 680 and get better performance than you're getting now.

The reason you don't use more of the GPUs can be related to a lot of things: core parking, drivers, and the fact that BF3 is a crap game that doesn't need to use the GPUs more. When I ran quad-SLI tests, mine did the same. Then I hooked up my 3880x1920 surround setup and suddenly got 97-99% usage and max performance out of my GPUs.

For a start, turn off vsync (it's useless and creates input lag), then google core parking and how to change it, and turn that off too. Select maximum performance in your drivers and you should see an improvement. But seriously, high frames don't mean smooth gameplay, just more heat and work for your system.

Use the frame rate target in Precision X to minimize screen tearing, not vsync.
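The idea behind a frame-rate target, as opposed to vsync, can be sketched in a few lines. This is a toy illustration of the concept only, not how Precision X actually works internally; the function name and numbers are made up for the example:

```python
import time

def run_capped(render_frame, target_fps=120.0, frames=120):
    """Toy frame-rate cap: render, then sleep off the rest of the frame budget.

    Unlike vsync, this never blocks waiting on the display refresh, so it
    caps the rate without adding a full refresh interval of input lag.
    """
    budget = 1.0 / target_fps
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        render_frame()                    # stand-in for the real frame render
        elapsed = time.perf_counter() - t0
        if elapsed < budget:
            time.sleep(budget - elapsed)  # burn off the leftover frame budget
    return frames / (time.perf_counter() - start)  # achieved fps

# With an effectively instant "render", the achieved rate sits at or just
# under the target instead of running uncapped.
fps = run_capped(lambda: None, target_fps=120.0)
print(round(fps, 1))
```

The point is that the cap bounds the frame rate from above (so tearing is less visible) while leaving the GPU free to fall below it without the stutter penalty vsync imposes.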


----------



## thestache

Quote:


> Originally Posted by *Qu1ckset*
> 
> Anyone on here with watercooled 690 system I want to know your read setup and what temps you get with them please!
> I wanna find the smallest layout of rads I can get away with, I want a minimalistic water cooling build with the least tubing I can get away with so it looks simple and clean, but gets better temps then my current setup!


I built a system for my brother using a 140mm and a 280mm rad for a GTX 690 and 3930K, and he never gets over 70deg on his CPU or over 45deg on his GTX 690. Both with sizable overclocks: 4700MHz on the CPU and 1212MHz on the GPUs.


----------



## Qu1ckset

Quote:


> Originally Posted by *Sexparty*
> 
> Built a system for my brother using a 140mm and a 280mm rad for a GTX 690 and 3930k and he never gets over 70deg for his CPU and or over 45deg with his GTX 690. Both with sizable overclocks. 4700mhz on the CPU and 1212mhz on the GPUs.


cpu seems pretty warm, are the rads in push/pull or?


----------



## thestache

Quote:


> Originally Posted by *Qu1ckset*
> 
> cpu seems pretty warm, are the rads in push/pull or?


It's a 3930K overclocked to 4700MHz, dude. Of course it's hot. The VRMs for the CPU core overheat to 82deg without proper cooling. I'm watercooling mine because I want 5000MHz.

The 280mm is in push and the 140mm is in push/pull. They're 30FPI rads, which could benefit from much higher-RPM fans, but that's too noisy.


----------



## jassilamba

Hey guys,

I'm one of those owners who has had this card for almost a month but never gotten to use it. I keep getting Code 43 in Device Manager. I received the replacement 690 today and got the same thing. Tried it in 2 different systems, same thing.

Running the following:
i7 - 2600K
Asus Sabertooth P67
OCZ 750W PSU (Have an X-1050 on the way)
16 GB of Gskill sniper
2 HDDs

6 Fans

Any ideas if I'm doing something wrong, or any known issues with the P67 board?

EVGA was nice enough to advance-RMA a 3rd one, so it should be here sometime next Thursday. Hopefully I can finally enjoy the awesome power of the GTX 690.


----------



## Romin

Quote:


> Originally Posted by *jassilamba*
> 
> Hey guys,
> Im one of those owners who has owned this card for almost a month but never gotten to use it. Keep getting Code 43 in the device manager. Received the replacement 690 today and the same thing. Tried it in 2 different systems same thing.
> Running the following:
> i7 - 2600K
> Asus Sabertooth P67
> OCZ 750W PSU (Have an X-1050 on the way)
> 16 GB of Gskill sniper
> 2 HDDs
> 6 Fans
> any ideas if I'm doing something wrong. Or any known issues with the p67 board,
> EVGA was nice enough to advance RMA the 3rd one in so should be here sometime next Thursday. Hopefully I can finally enjoy the awesome power of GTX 690


Fresh Windows install!


----------



## fat_italian_stallion

Quote:


> Originally Posted by *Sexparty*
> 
> Built a system for my brother using a 140mm and a 280mm rad for a GTX 690 and 3930k and he never gets over 70deg for his CPU and or over 45deg with his GTX 690. Both with sizable overclocks. 4700mhz on the CPU and 1212mhz on the GPUs.


The cooling for those components seems extremely undersized. Even at 5.2 I don't even see 70s with 5.5vcore, and 37c at the highest with +172 core on the GPUs. It might be a bad mount on the block at worst, or the fan speeds should be upped / an extra rad added.


----------



## thestache

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> The cooling for those components is extremely underkill. Even at 5.2 I don't even see 70s with 5.5vcore and highest 37c with +172 core on the gpus. Might be a bad mount for the block at worst or fans speed should be upped/ extra rad added.


I don't understand what you're trying to get at. The system is fine.


----------



## dynn

Hi there, what do you guys think about this tower?

http://www.newegg.com/Product/Product.aspx?Item=N82E16811119225

It's for:

Maximus V Formula (already bought, I'll get it in 1 week)
GTX 690 (already got it)
i7 3770K (already bought, I'll get it in 1 week)
Corsair H100 (planning to buy it soon)

I need good space.
I'd prefer a tower that has good cooling and fits the motherboard well.
I don't know if getting a full ATX tower is a waste (would I be fine with a mid?).

Remember, I'm planning on OCing.

All suggestions are welcome.


----------



## PoWn3d_0704

With a 690 and a processor that fast, you want a full tower, especially with the H100 cooler. I have the NZXT Phantom, and it comes with brackets to mount the H100 in a push/pull config with the two top 200mm fans. I don't know if the HAF has brackets like that, but it is a huge tower, and you shouldn't have a problem. There is nothing worse than having to build a system in a case that is too damn small. I love having room to work.


----------



## egotrippin

Quote:


> Originally Posted by *jassilamba*
> 
> Hey guys,
> Im one of those owners who has owned this card for almost a month but never gotten to use it. Keep getting Code 43 in the device manager. Received the replacement 690 today and the same thing. Tried it in 2 different systems same thing.
> Running the following:
> i7 - 2600K
> Asus Sabertooth P67
> OCZ 750W PSU (Have an X-1050 on the way)
> 16 GB of Gskill sniper
> 2 HDDs
> 6 Fans
> any ideas if I'm doing something wrong. Or any known issues with the p67 board,
> EVGA was nice enough to advance RMA the 3rd one in so should be here sometime next Thursday. Hopefully I can finally enjoy the awesome power of GTX 690


That's some bad luck. Hang in there. I once had three defective Blackberrys in a row. I kept returning the defective phone only to discover another hardware issue with the replacement.
Quote:


> Originally Posted by *dynn*
> 
> HI there what do u guys think about this tower
> http://www.newegg.com/Product/Product.aspx?Item=N82E16811119225
> its for:
> maximus V formula (already bought, ill get it in 1 week)
> gtx 690 (already got it)
> i7 3770k (already bought, ill get it in 1 week)
> Corsair H100 (planing to buy it soon)
> I need good space
> prefer a tower that has good cooling and be perfect for motherboard
> i dont know if its a waste getting a full atx tower (should i be fine with mid?)
> remember im planning on OC
> all suggestions are welcome


Go full ATX. I'm a fan of Lian Li and Silverstone cases unless you're reaching the end of your budget and don't wish to spend $300 or more. For cases under $300, I like Corsair and Fractal Design.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *Sexparty*
> 
> I don't understand what you're trying to get at. The system is fine.


I'm getting at the fact that something seems wrong with the temps. They are very high for those overclocks.


----------



## thestache

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> I'm getting at it seems as if there's something wrong with temps. They are very high for those overclocks.


The temps are higher than optimal because it needs a bigger radiator instead of the 140mm, but we were unable to fit a bigger one at the time. Are the temps bad? No. Is it fine? Yes. Just not as good as it could be. There is nothing wrong with 60-65deg on the CPU and 40-45 in games and gaming benchmarks.


----------



## jassilamba

Quote:


> Originally Posted by *egotrippin*
> 
> That's some bad luck. Hang in there. I once had three defective Blackberrys in a row. I kept returning the defective phone only to discover another hardware issue with the replacement.
> Go full ATX. I'm a fan of Lian Li and Silverstone cases unless you're reaching the end of your budget and don't wish to spend $300 or more. For computer cases under $300, I like Corsair and Fractal Designs.


A Windows re-install took care of it. The EVGA rep said they knew it would work, but that their products should work out of the box without you spending the rest of the night reinstalling everything. Anyways, thanks for the tips guys. I cancelled my RMA and hopefully will get to experience some 690 goodness tonight.

On a side note, I'd recommend the NZXT Switch 810 (I know it's not a Lian Li); it's a great case if you are on a budget, with great watercooling and modding capabilities.

Again thanks for your help guys.


----------



## Divineshadowx

Does anyone know if there is something else to overvolt for overclocking RAM with Ivy Bridge besides the vcore of the RAM and CPU? I've had two different RAM kits now, a G.Skill 2400MHz and a Corsair 2133MHz, and neither runs XMP. The G.Skill was rated at 1.65v @ 2400MHz, but wouldn't run anything above 1600 without randomly crashing my setup; I even went to 1.8v, and it still crashed randomly while playing games and benching. Same with the Corsair, which is rated at 1.5v @ 2133MHz: I tried XMP at 1.65v and it still crashed. Both crash at anything above 1600MHz. My 3770K is at 4.5GHz. Any help?


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Does anyone know if there is something else to over volt for overclocking ram with ivy beside the vcore of the ram and cpu? I've had two different ram kits now, one a g.skill 2400mhz, and another corsair 2133mhz and none run xmp. The g.skill was rated at 1.65v @ 2400mhz, but wouldn't run anything above 1600 without randomly crashing my setup, iIeven went to 1.8v, and still it crashed randomly while playing games and benching. Same with the corsair which is rated at 1.5v @ 2133mhz, I tried the xmp at 1.65v and it still crashed. Both crash at anything above 1600mhz. My 3770k is at 4.5ghz. Any help?


It's your motherboard.

Clear your CMOS, update the BIOS, clear the CMOS again, then google how to overclock RAM to those speeds on your motherboard. Read your board's specs carefully and you'll understand why you're only able to get it to work at 1600MHz: your board is only guaranteed to run at 1600MHz, and everything above that is an OC. There's nothing fundamentally wrong with your motherboard; you're just not overclocking it properly, or your board needs the CMOS cleared / a BIOS update.

DDR3 2800(O.C.)/2600(O.C.)/2400(O.C.)/2200(O.C.)/2133(O.C.)/2000(O.C.)/1866(O.C.)/1800(O.C.)/1600/1333 MHz


----------



## Divineshadowx

I'll try clearing the CMOS, but my board is updated. On the Asus site it specifically lists the Corsair RAM as running at 2133. And I thought the non-O.C. speeds were just stock speeds, because I've never seen a board without the O.C. label above 1866.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *Sexparty*
> 
> The temps are higher than optimal because it needs a bigger radiator instead of the 140mm but we wer unable to fit a bigger one at the time. Are the temps bad? No. Is it fine? Yes. Just not as good as it could be. But there is nothing wrong with 60-65deg on the CPU and 40-45 in games and gaming benchmarks.


No reason to get hostile, just trying to save you a few degrees. I know that with GPU blocks, some people are afraid to really crank the block down onto the card, which causes higher temps.


----------



## jagz




What kind of OCs have you guys managed? So the Core Voltage slider won't work, and I can only get more vcore by bringing the power limit up to 135%?


----------



## Divineshadowx

Quote:


> Originally Posted by *Sexparty*
> 
> It's your motherboard.
> Clear your CMOS, update BIOS, clear CMOS, then google how to overclock your RAM to achieve such speeds with your motherboard. Read the specs for you board carefully and then you'll understand why you're only able to get it to work at 1600mhz (your board is only guaranteed to run at 1600mhz, everything above that is an OC). There's nothing wrong with your motherboard you're just not overclocking it properly/or your board needs the CMOS cleared/a BIOS update.
> DDR3 2800(O.C.)/2600(O.C.)/2400(O.C.)/2200(O.C.)/2133(O.C.)/2000(O.C.)/1866(O.C.)/1800(O.C.)/1600/1333 MHz


Thanks a lot man, resetting the CMOS allowed me to run my new Vengeance at XMP. And yeah, BF3 is crap; it doesn't use the full potential of the GPU. And I noticed something too: when you go up close to a building or something in BF3 and look closely, it looks like crap. But in Heaven, when you go into the free move-around mode and get close to the grass or walls, it looks as sharp as it did from far away. Interesting.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Thanks a lot man, resetting the cmos allowed me to run my new vengeance at xmp
> 
> 
> 
> 
> 
> 
> 
> . And ya bf3 is crap, doesn't use the full potential of the gpu. And I noticed something to, when you go up close to a building or something bf3 and look closely, it looks like crap. But in heaven when you go in the free move around mode, and go close to the grass or walls, it looks as sharp as it did from far away. Interesting.


Resetting the CMOS can fix all sorts of things. Maybe get into the habit of clearing the CMOS every time you install a new motherboard or components, and before and after each time you update your BIOS. It saved my P8P67 WS from overvolting the CPU even when setting a manual voltage, and from other strange behavior in the past.

Glad it's working now man.

Just more reasons BF3 is a terrible game. I hadn't noticed that one, but the list of things I do see wrong with it engine-wise/graphically is looooong.

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> No reason to get hostile, just trying to save you a few degrees. I know with the gpu blocks some are afraid to really crank the block down onto the card which causes higher temps.


Not getting hostile. I appreciate you seeing something and trying to help; just in this instance there is no problem, and I'm aware the water loop's performance could be better.


----------



## dynn

Hi there!

I need to know if this processor and motherboard

processor:
http://www.newegg.com/Product/Product.aspx?Item=N82E16819116501

motherboard:
http://www.newegg.com/Product/Product.aspx?Item=N82E16813131854

are the same on Amazon, because my parents are getting them from Amazon, and I just want to make sure the NewEgg ones are the same as the Amazon ones:

processor:
http://www.amazon.com/Intel-i7-3770K-Quad-Core-Processor-Cache/dp/B007SZ0EOW/ref=sr_1_1?ie=UTF8&qid=1344370351&sr=8-1&keywords=i7+3770k

motherboard:
http://www.amazon.com/Maximus-FORMULA-Extended-SupremeFX-Motherboard/dp/B008CJ1KAA/ref=sr_1_1?s=electronics&ie=UTF8&qid=1344370676&sr=1-1&keywords=asus+maximus+v+formula

I wanted to buy the newegg.com products, but for some reason my parents will buy them on Amazon; I just want to be sure they're exactly the same.


----------



## egotrippin

Quote:


> Originally Posted by *dynn*
> 
> Hi there!
> I need to know if this processor and motherboard
> procesor:
> http://www.newegg.com/Product/Product.aspx?Item=N82E16819116501
> motherboard:
> http://www.newegg.com/Product/Product.aspx?Item=N82E16813131854
> are the same in amazon, because parents are getting it from amazon i just want to make sure if the ones of newegg are the same that amazon
> procesor:
> http://www.amazon.com/Intel-i7-3770K-Quad-Core-Processor-Cache/dp/B007SZ0EOW/ref=sr_1_1?ie=UTF8&qid=1344370351&sr=8-1&keywords=i7+3770k
> motherboard:
> http://www.amazon.com/Maximus-FORMULA-Extended-SupremeFX-Motherboard/dp/B008CJ1KAA/ref=sr_1_1?s=electronics&ie=UTF8&qid=1344370676&sr=1-1&keywords=asus+maximus+v+formula
> i wanted to buy the newegg.com products, but for some reason my parents will buy it in amazon, i just want to be sure if theyre the exactly the same


Yes. Pay attention to the model number of products like the CPU, which is BX80637I73770K; some products have different versions/revisions. I've never purchased anything from NewEgg, but I have purchased well over fifty items from Amazon and I've always been satisfied. If you're interested in doing more customization or liquid cooling, check out FrozenCPU.com.


----------



## Qu1ckset

Quote:


> Originally Posted by *egotrippin*
> 
> Yes. Pay attention to the model number of products like the CPU which is BX80637I73770K. Some products have different versions/revisions. I've never purchased anything from NewEgg but I have purchased well over fifty items from Amazon and I've always been satisfied. If you're interested in doing more customization or liquid cooling, check out FrozenCPU.com


I usually use performance-pcs.com for my watercooling stuff.


----------



## macforth

Dynn... they are the same.

And one thing though: parents who are prepared to spend well over $600 on the latest in computer gear are priceless...
...just make sure you show your appreciation!!!!!!!

....... signed, Grandpa Macforth


----------



## thunder1

These cards are awesome in NV Surround. Can I join?


----------



## Arizonian

Quote:


> Originally Posted by *thunder1*
> 
> These cards are awesome in NV surround, Can I join


Nice cards and rig you got there, thunder1.

To get on the list, PM 'jcde7ago', our OP, and he'll get you added. Have you run any benchmarks to see what she's got under the hood?

*OFFICIAL* Top 30 Heaven Benchmark 3.0 Scores & Top 30 3DMark 11 Scores Using Performance Settings are great places to compare some scores for fun.

Have you done any over clocking and if so what stable OC did you end up with?

Edited to add: "How to put your Rig in your Sig" - Welcome to OCN as well.


----------



## Qu1ckset

Quote:


> Originally Posted by *thunder1*
> 
> 
> These cards are awesome in NV surround, Can I join


sick RV01, makes me miss my old one lol


----------



## Kyouki

Oh boy, I would like to see these teamed up with a GTX 690. I'm tempted to sell my G.Skill and buy this RAM. I think with both having a black-and-aluminum look it would really stand out, and they share the same angles, so it should just be pleasing to the eye! What do you think?

Link to Newegg: http://www.newegg.com/Product/Product.aspx?Item=N82E16820233287


----------



## Damir Poljak

Hi guys, just got my GTX 690! I decided not to wait for the ATI 7990, and I think I made a good choice.
$1k is a lot of money, but what the heck, I earn enough. Will post some testing later on. Cheers!


----------



## Divineshadowx

Is it up to the program to use the GPUs' full potential? In Crysis 2 I get 40% usage on all GPUs and hover around 80-100fps for some reason. It never uses the boost clock either, even when I'm not at my refresh rate; if anything, it throttles. But in Heaven and Vantage I get 80-90% usage, the boost clock works, and I get the highest possible fps.


----------



## sockpirate

How is the microstuttering with the 690?

And what are some of the overclocks you guys are seeing on air?


----------



## Arizonian

Quote:


> Originally Posted by *sockpirate*
> 
> how is the micro stuttering with the 690 ?
> and what are some of the overclocks you guys are seeing on air?


No microstutter on a single 27" 120Hz display. Frame metering at the hardware level is working well.

GPU #1 reaches 1175MHz core and GPU #2 reaches 1201MHz when synced to the highest stable core clock for me on air.


----------



## sockpirate

I am really considering this! Such a beast of a card; NVIDIA really got it right this time compared to the 590!


----------



## sockpirate

Would an AX750 be enough for a 3570K and a 690? The card requires a minimum of 650W, and I plan on overclocking the CPU to at least 4.5GHz, which will be on water.

Should I bump up to an AX850 to be safe if I plan on OCing the GPUs as well?


----------



## egotrippin

Quote:


> Originally Posted by *Divineshadowx*
> 
> Is it up to a program to use full gpu potential? In Crysis 2 I get 40% usage on all gpus, and hover around 80-100fps for some reason. And it never uses the boost clock either, even when i'm not at my refresh rate, instead it throttles if anything. But in heaven and vantage I get 80-90% usage and the boost clock works, and I get the highest possible fps.


Are you using the high-resolution texture package and DX11? What resolution are you running? That's below the usage I have experienced.


----------



## egotrippin

Quote:


> Originally Posted by *sockpirate*
> 
> Would an AX 750 be enough for a 3570k and a 690 ? Requires a minimum of 650w , i plan on overclocking the CPU to at least 4.5ghz, which will be on water.
> Should i bump up to an AX 850 to be safe if i plan on OCing the GPUs as well ?


Yes, it's enough. I have a 690 now, and even though I'm using a Seasonic Platinum 1000 these days, I was using a cheap LSW 750 before. It wasn't a TRUE 750, because it only had about 600 W on the 12 V rail, which is what everything runs off of. I was able to use that to power my 690, which uses two 8-pin connectors, and before that the same LSW powered my GTX 580 Classified Hydro Copper, which used an extra 6-pin connector on top of the two 8-pins. I would run Folding, which kept my GPU and CPU at 100% (both overclocked), and I never had an issue with power.
Make sure to look at the specs on the side of your PSU to be sure you have enough watts on the 12 V rail. The 690 spec calls for at least 38 A, and 38 amps x 12 volts = 456 watts. A 690 will use 300-375 watts when overclocked, and your average overclocked i7 quad core will only use a max of around 95 watts, so you can see how you could get by on as little as a 450 watt PSU.
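To make that back-of-the-envelope math concrete, here's a quick sketch of the headroom calculation in Python. The component wattages are the rough estimates from the post above, not measured values:

```python
# Rough PSU headroom estimate for a GTX 690 build.
# All wattage figures are ballpark assumptions, not measured values.

def rail_watts(amps_12v, volts=12.0):
    """Convert a PSU's 12 V rail rating (amps) to watts."""
    return amps_12v * volts

def estimated_draw(gpu_w, cpu_w, misc_w=75):
    """Sum worst-case component draw; misc covers board, drives, fans."""
    return gpu_w + cpu_w + misc_w

# NVIDIA's spec calls for at least 38 A on the 12 V rail:
min_rail = rail_watts(38)          # 456 W
# Overclocked 690 (~375 W) plus an overclocked quad core (~95 W):
load = estimated_draw(375, 95)     # 545 W

print(f"Minimum 12 V rail: {min_rail:.0f} W")
print(f"Estimated worst-case draw: {load} W")
```

Either way, the point stands: it's the 12 V rail rating that matters, not the sticker total.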


----------



## Divineshadowx

Quote:


> Originally Posted by *egotrippin*
> 
> Are you using the high-resolution texture package and DX11? What resolution are you running? That's below the usage I have experienced.


I'm running 1080p, lol. But I'm sure quad SLI is enough for 120 fps in Crysis 2. And yes, completely maxed out, textures and DX11. It never goes above 100 fps. Same with BF3: I tried BF3 with a fourth of my GPU power, so basically a single 680, and I got the EXACT same fps, 90-100. In Heaven I got 150 fps. Is it driver issues?


----------



## jagz

690 <3 Water. [email protected] temp is a minimum 30°C lower now (~75°C to ~42°C).

I noticed something in afterburner.. On my 580's, I could see vrm temp and such.. Can't on the 690. How can I see these values?


----------



## Damir Poljak

Hey guys, I just switched to NVIDIA (GTX 690) after a long time. ATI lost this one for me.








If anybody can recommend drivers for 690 single card on two monitors, that would be awesome. Thank you all


----------



## jagz

Quote:


> Originally Posted by *Damir Poljak*
> 
> Hey guys, I just switched after a long time to NVidia, (gtx 690), ati lost this one for me
> 
> 
> 
> 
> 
> 
> 
> 
> If anybody can recommend drivers for 690 single card on two monitors, that would be awesome. Thank you all


I'm using 301.42 running 2 monitors.

Main monitor top right DVI
2nd monitor bottom DVI


----------



## egotrippin

Quote:


> Originally Posted by *Divineshadowx*
> 
> I'm running 1080p lol. But i'm sure quad sli is enough for 120fps on crysis 2. And ya completely maxed out, texture and dx11. It never goes up above 100fps. Same with bf3, I tried bf3 with a 4th of my gpu, so basically a single 680, and I got the EXACT same fps, 90-100. In heaven I got 150fps, is it driver issues?


Have you tried turning off vsync? I can get 130 fps with one 690 in BF3 at 1080p.

I'm using the 304.79 drivers right now. Here's a screenshot of 120+ fps... it can go up to 150, but usually hangs out at 100-120 with vsync off... and it's still not maxing out the GPUs.


----------



## thestache

Quote:


> Originally Posted by *Damir Poljak*
> 
> Hey guys, I just switched after a long time to NVidia, (gtx 690), ati lost this one for me
> 
> 
> 
> 
> 
> 
> 
> 
> If anybody can recommend drivers for 690 single card on two monitors, that would be awesome. Thank you all


304.48 is working best for me on a surround set-up. Other drivers do a lot of strange things, like turning my monitors off and back on during explosions in games and such.

If you're going to run more than one monitor, I'd have to recommend GTX 680 4GB SLI; the GTX 690 doesn't have enough VRAM.


----------



## Qu1ckset

Got my EVGA 690 backplate, thanks to egotrippin!









Also ordered my EVGA Hydro Copper block today!







Should have my loop finished next month; still have a bunch of stuff to order.


----------



## jagz

Quote:


> Originally Posted by *jagz*
> 
> I noticed something in afterburner.. On my 580's, I could see vrm temp and such.. Can't on the 690. How can I see these values?


Anyone know?

Also, I have set MSI Afterburner's power limit to 135%, but the vcore shown in AB (which I can't change) is 1.175 V. Is that the max? Do I need to use Precision X? I ask because even a modest OC is failing WUs ([email protected]).


----------



## Divineshadowx

Quote:


> Originally Posted by *egotrippin*
> 
> Have you tried turning off vsync? I can get 130 fps with one 690 in BF3 @ 1080
> I'm using 304.79 drivers right now - here's a screenshot of 120+ fps... it can go up to 150 but usually hangs out 100-120 with vsync off... but it's still not maxing out the GPUs


I turned off vsync and put the power management to max, but it's still the same: 30-40% usage, and it never activates the boost clock. I get as low as 60 fps, and it goes up to about 150 in some parts.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> I turned off vsync and put the power management to max, but it's still the same: 30-40% usage, and it never activates the boost clock. I get as low as 60 fps, and it goes up to about 150 in some parts.


Pretty sure you should check core parking. When I get very low usage and bad performance, it's almost always because of core parking. Sometimes it's drivers and I need a restart, but mostly core parking.

What's yours set to?


----------



## nVIDIASLiRig

COUNT ME IN! Here's my eVGA GTX 690; I'll soon get a 2nd one!


----------



## Divineshadowx

Quote:


> Originally Posted by *Sexparty*
> 
> Pretty sure you should check core parking. When I get very low usage and bad performance, it's almost always because of core parking. Sometimes it's drivers and I need a restart, but mostly core parking.
> What's yours set to?


I already did that; I just double-checked now with a tool and it says unparked.


----------



## jagz

Quote:


> Originally Posted by *nVIDIASLiRig*
> 
> COUNT ME IN! Here's my GTX 690; I'll soon get a 2nd one!
> 
> 
> Spoiler: Warning: Spoiler!


Nice. Block that bad boy! I just did on mine:



and another one







why haha


----------



## dboythagr8

Quote:


> Originally Posted by *Sexparty*
> 
> Pretty sure you should check core parking. When I get very low usage and bad performance, it's almost always because of core parking. Sometimes it's drivers and I need a restart, but mostly core parking.
> What's yours set to?


Quote:


> Originally Posted by *Divineshadowx*
> 
> I already did that, I just double checked now with a tool and it says unparked.


I am also getting weird usage issues. What is 'core parking' and how do I check it?


----------



## Divineshadowx

Quote:


> Originally Posted by *dboythagr8*
> 
> I am also getting weird usage issues. What is 'core parking' and how do I check it?


It's an issue with multicore CPUs and Windows; Google it. And I think the drivers are ****. I don't think benchmarking is even worth it right now. What is the point if your parts are not using their full potential? With vsync off you should get 100% usage in benchmarks, because it is a "benchmark".


----------



## dboythagr8

Quote:


> Originally Posted by *Divineshadowx*
> 
> It's an issue with multicore CPUs and Windows; Google it. And I think the drivers are ****. I don't think benchmarking is even worth it right now. What is the point if your parts are not using their full potential? With vsync off you should get 100% usage in benchmarks, because it is a "benchmark".


What tool did you use?


----------



## thestache

Quote:


> Originally Posted by *dboythagr8*
> 
> What tool did you use?


Quote:


> Originally Posted by *Divineshadowx*
> It's an issue with multicore CPUs and Windows; Google it. And I think the drivers are ****. I don't think benchmarking is even worth it right now. What is the point if your parts are not using their full potential? With vsync off you should get 100% usage in benchmarks, because it is a "benchmark".


You don't need a tool.


Spoiler: Warning: Spoiler!



Right-click the desktop and go to Personalize.

Go to Screen Saver.

Click Change power settings.

Click Change advanced power settings.

Then go down and click Processor power management, then Minimum processor state.

Change the setting to 100, restart, and try the game out.



If that doesn't help then that isn't your problem.

But let me just say one thing, even if you guys won't like it: GTX 690 SLI at 1080p is *bloody stupid*, and I'm not surprised you're having usage issues. I did benchmarks out of interest when I first got my two GTX 690s, had similar issues, and did not see scaling that justified running them like that. Take one of the cards out or disable it, see if you get 100% usage, and report back. Honestly, I don't see any program maxing all four of those GPUs out at such a low resolution. You'll run out of VRAM long before one or two of the GPUs start to struggle, let alone four.

I would not run quad SLI even on my 3880x1920 resolution with all the money in the world. The benefit from the fourth GPU is minimal. I will be running GTX 680 4GB SLI when my card sells, and even then the max I would install into my beastly X79 system is tri-SLI. Quad SLI doesn't scale well at all in my experience, and the price vs. performance makes it a waste of money.

I would ditch the second card and replace it with a 30-inch 2560x1600 monitor or a surround set-up.

Just something to think about.


----------



## dboythagr8

Quote:


> Originally Posted by *Sexparty*
> 
> You don't need a tool.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Right-click the desktop and go to Personalize.
> Go to Screen Saver.
> Click Change power settings.
> Click Change advanced power settings.
> Then go down and click Processor power management, then Minimum processor state.
> Change the setting to 100, restart, and try the game out.
> 
> 
> If that doesn't help then that isn't your problem.
> But let me just say one thing, even if you guys won't like it: GTX 690 SLI at 1080p is *bloody stupid*, and I'm not surprised you're having usage issues. I did benchmarks out of interest when I first got my two GTX 690s, had similar issues, and did not see scaling that justified running them like that. Take one of the cards out or disable it, see if you get 100% usage, and report back. Honestly, I don't see any program maxing all four of those GPUs out at such a low resolution. You'll run out of VRAM long before one or two of the GPUs start to struggle, let alone four.
> I would not run quad SLI even on my 3880x1920 resolution with all the money in the world. The benefit from the fourth GPU is minimal. I will be running GTX 680 4GB SLI when my card sells, and even then the max I would install into my beastly X79 system is tri-SLI. Quad SLI doesn't scale well at all in my experience, and the price vs. performance makes it a waste of money.
> I would ditch the second card and replace it with a 30-inch 2560x1600 monitor or a surround set-up.
> Just something to think about.


I only have one 690; I think he has two. Also, if you look at my rig you'll see that my main monitor is a Dell U3011:thumb:. I'm currently using my 120 Hz 1080p monitor because... I have both and love to go back and experience 120 Hz from time to time.

Thanks for the help, btw.

Edit: Did what you said; minimum processor state was already at 100%.


----------



## Divineshadowx

Quote:


> Originally Posted by *Sexparty*
> 
> You don't need a tool.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Right-click the desktop and go to Personalize.
> Go to Screen Saver.
> Click Change power settings.
> Click Change advanced power settings.
> Then go down and click Processor power management, then Minimum processor state.
> Change the setting to 100, restart, and try the game out.
> 
> 
> If that doesn't help then that isn't your problem.
> But let me just say one thing, even if you guys won't like it: GTX 690 SLI at 1080p is *bloody stupid*, and I'm not surprised you're having usage issues. I did benchmarks out of interest when I first got my two GTX 690s, had similar issues, and did not see scaling that justified running them like that. Take one of the cards out or disable it, see if you get 100% usage, and report back. Honestly, I don't see any program maxing all four of those GPUs out at such a low resolution. You'll run out of VRAM long before one or two of the GPUs start to struggle, let alone four.
> I would not run quad SLI even on my 3880x1920 resolution with all the money in the world. The benefit from the fourth GPU is minimal. I will be running GTX 680 4GB SLI when my card sells, and even then the max I would install into my beastly X79 system is tri-SLI. Quad SLI doesn't scale well at all in my experience, and the price vs. performance makes it a waste of money.
> I would ditch the second card and replace it with a 30-inch 2560x1600 monitor or a surround set-up.
> Just something to think about.



I did the min CPU state already. And the reason I got quad SLI is because benchmarks showed 60 fps in games and I wanted 120 fps. But the second one makes no difference: same exact fps in BF3 and Crysis with one GPU. The only difference I saw was in Heaven; I didn't get double the fps, but it was close. And I know quad SLI is complete overkill for 1080p, but that still doesn't explain the usage. The GPU is there to do math and render, so why would resolution matter? With vsync off it should use all of its power to get the highest fps possible. At least, that is what the term usage means.
And also, the power target is useless; it never goes above 100% when I have it set to 135%, even when the usage in Heaven is 80-90%. I'm pretty sure it is a driver issue. The last stable release from NVIDIA was 3 months ago, so they obviously see a problem. What is the point of two betas? Might as well update the last one.


----------



## Divineshadowx

Well, I tried this again and removed my second 690, and I'm getting better results than with two of them: a constant 120 fps in BF3, and the boost clock actually activates. I get 50-70% usage now.


----------



## egotrippin

Quote:


> Originally Posted by *Divineshadowx*
> 
> Well I tried this again, and removed my second 690, and I'm getting better results than two of them. Constant 120fps in bf3, and the boost clock actually activates. Get 50-70% usage now.


That's crazy... how does it scale at higher resolutions? I had considered getting a second 690, but when I realized that a single 690 exceeded my needs at 1080p and perfectly met my needs at 2560x1440, I figured a second would be a waste.


----------



## PowerK

I haven't used a 1080p display for some time, but I'm happily using my 690 Quad-SLI with a U3011 (2560x1600).
I've done a lot of comparing between the different SLI modes (2-GPU, 3-GPU, and 4-GPU) using NVIDIA Inspector: Crysis, Crysis 2, Witcher 2, GTA4, Sleeping Dogs, etc. (But I haven't played BF3.)

If I were to build my desktop again today, I'd still go for 690 Quad-SLI. I love it.

From my experience, at 2560x1600, 690 Quad-SLI is needed for:
Crysis with 4xMSAA + 4xSGSSAA
GTA4 with ENB supersampling AA
Witcher 2 with ubersampling enabled
Sleeping Dogs with Extreme AA enabled


----------



## Divineshadowx

Quote:


> Originally Posted by *PowerK*
> 
> I haven't used 1080p display for sometime. But I'm happily using my 690 Quad-SLI with U3011 (2560x1600).
> I've done a lot of comparing between different SLI modes = 2GPU, 3GPU and 4GPU using nVIDIA Inspector. Crysis, Crysis 2, Witcher 2, GTA4, Sleeping Dogs etc. (But I have't played BF3)
> If I'm to build my desktop again today, I'd still go for 690 Quad-SLI. I love it.
> From my experience, at 2560x1600, 690 Quad-SLI is needed for,
> Crysis with 4xMSAA + 4xSGSSAA
> GTA4 with ENB supersampling AA
> Witcher 2 with ulbersampling enabled.
> Sleeping Dogs with Extreme AA enabled.


There is seriously no point in two. With two I get 80-100 fps in BF3; with one I get 160 fps at 1080p. At 2560x1600/1440 you would get a constant 120 fps. In Crysis 2 I get 100 fps, but the game seems to be capped at that for some reason; even when I am looking at the sky I get 100 fps. I'm not sure if it is the drivers or the games' code simply not being able to handle normal usage with 4 GPUs. At 5760x1080 you might fall below 120, but maybe not, because I am only getting 70-80% usage in games. The only reason anyone would need two would be 7680x1600 or something crazier, and at that resolution the bottleneck would be the 2 GB of effective VRAM. Otherwise, simply for benchmarking, Vantage/Heaven would benefit from 2.


----------



## PowerK

If you don't find two 690s worth it, then by all means sell one and stick with a single 690, or you can go for 680 SLI.

I find two 690s excellent across the board with the games I play. Thanks to the horsepower provided by 4 GPUs, image quality is perfect at 2560x1600.
A single 690 (and 680 2-way SLI) isn't enough to play the games I mentioned earlier, like:
Crysis with 4xMSAA + 4xSGSSAA
GTA4 with ENB supersampling AA
Witcher 2 with ubersampling enabled
Sleeping Dogs with Extreme AA enabled


----------



## egotrippin

Quote:


> Originally Posted by *PowerK*
> 
> If you don't find two 690, then by all means, sell one and stick with one 690 or you can go for two 680 SLI.
> I find two 690s excellent across the board with the games I play. Due to the horsepower provided by 4 GPUs, image quality is perfect at 2560x1600.
> Single 690 (and 680 2-Way SLI) isn't enough to play those games I mentioned eariler like.. Crysis with 4xMSAA + 4xSGSSAA
> GTA4 with ENB supersampling AA
> Witcher 2 with ulbersampling enabled.
> Sleeping Dogs with Extreme AA enabled.


Checked out your sig rig. Red Lian Li, brave choice; I like it.


----------



## PowerK

Cheers, egotrippin.








Yes, I was brave when I decided on this red Lian Li case. I had always used black cases for my desktops, and I wanted a different color for a change. No regrets.


----------



## Divineshadowx

Well, I turned on the 4xMSAA + 4xSGSSAA and I'm still getting my capped 100 fps in Crysis 2. Just to make sure, though: what do I do after I configure the settings in Inspector? Do I just apply and leave it there?


----------



## Ruby Rabbit

Hi, I just noticed your post when I was looking to see GTX 690 temps for Metro. I'm getting 82°C with 75% fan, and it's bugging me. I have a Corsair 650D case.

However, your Heaven score is very low; in fact, it looks like you are only using one GPU. Have you checked that both GPUs are running? I'm getting 87.2 in Heaven (max settings, 1920x1080), and with an OC I get 98.2. In Metro maxed out, I get around 80-120 fps.


----------



## thestache

Quote:


> Originally Posted by *dboythagr8*
> 
> I only have one 690. I think he has two. Also if you look at my rig you'll see that my main monitor is a Dell U3011:thumb:. I'm currently using my 120hz 1080p monitor because...I have both and love to go back and experience 120hz from time to time.
> Thanks for the help btw.
> Edit: Did what you said and min processor state was at 100%.


Sorry was talking about him.

Personally, I don't like to disable core parking. I think it's a bad band-aid, since it keeps your voltage and clock speed at their max all the time, and I like my CPU to clock down and idle when using the desktop (I run offsets). I run high overclocks, and it's not great for the CPU to be running flat out all the time, even under watercooling, but sometimes it's the only way to fix the problems core parking can create.

Quote:


> Originally Posted by *Divineshadowx*
> 
> I did the min CPU state already. And the reason I got quad SLI is because benchmarks showed 60 fps in games and I wanted 120 fps. But the second one makes no difference: same exact fps in BF3 and Crysis with one GPU. The only difference I saw was in Heaven; I didn't get double the fps, but it was close. And I know quad SLI is complete overkill for 1080p, but that still doesn't explain the usage. The GPU is there to do math and render, so why would resolution matter? With vsync off it should use all of its power to get the highest fps possible. At least, that is what the term usage means.
> And also, the power target is useless; it never goes above 100% when I have it set to 135%, even when the usage in Heaven is 80-90%. I'm pretty sure it is a driver issue. The last stable release from NVIDIA was 3 months ago, so they obviously see a problem. What is the point of two betas? Might as well update the last one.
> 
> Well, I tried this again and removed my second 690, and I'm getting better results than with two of them: a constant 120 fps in BF3, and the boost clock actually activates. I get 50-70% usage now.


Seems to be driver related. Which drivers are you using? I get really low usage and strange issues with the latest drivers. I'm using 304.48, which seems to be the best for me; maybe try that out.

Glad you're getting better utilization with the single card. But resolution does matter: if I switch my surround off I'll get max usage around 70-80% at 1200p, but as soon as I switch to surround I never drop below 95%. At 1080p that second card is nothing but a paperweight.


----------



## Divineshadowx

Quote:


> Originally Posted by *Sexparty*
> 
> Sorry was talking about him.
> Personally, I don't like to disable core parking. I think it's a bad band-aid, since it keeps your voltage and clock speed at their max all the time, and I like my CPU to clock down and idle when using the desktop (I run offsets). I run high overclocks, and it's not great for the CPU to be running flat out all the time, even under watercooling, but sometimes it's the only way to fix the problems core parking can create.
> Seems to be driver related. Which drivers are you using? I get really low usage and strange issues with the latest drivers. I'm using 304.48, which seems to be the best for me; maybe try that out.
> Glad you're getting better utilization with the single card. But resolution does matter: if I switch my surround off I'll get max usage around 70-80% at 1200p, but as soon as I switch to surround I never drop below 95%. At 1080p that second card is nothing but a paperweight.


It is worse than a paperweight; it drops my performance in basically every game. I still don't think I should get lower fps, though. Lower usage, yes, because the work is split across two more GPUs. Do you think it would be useful if I add two more monitors? I really don't want to get rid of it, but as of now it is sitting on my bed because it drops my fps. If I sell it I could add two more monitors and water cool my entire system, but if I keep it I have to wait, because I'm starting college and need the money for the first couple of days.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> It is worse than a paperweight; it drops my performance in basically every game. I still don't think I should get lower fps, though. Lower usage, yes, because the work is split across two more GPUs. Do you think it would be useful if I add two more monitors? I really don't want to get rid of it, but as of now it is sitting on my bed because it drops my fps. If I sell it I could add two more monitors and water cool my entire system, but if I keep it I have to wait, because I'm starting college and need the money for the first couple of days.


It does seem stupid that it kills performance, eh. I didn't get worse performance in my GTX 690 SLI benchmarks, but there were very minimal gains, if any.

I'd sell it and get two extra monitors, or sell both and get two extra monitors plus GTX 680 4GB SLI, which will overclock higher and not run out of VRAM in surround set-ups. But it's up to you; either way you should sell the paperweight and watercool your system.


----------



## Divineshadowx

Quote:


> Originally Posted by *Sexparty*
> 
> It does seem stupid that it kills performance, eh. I didn't get worse performance in my GTX 690 SLI benchmarks, but there were very minimal gains, if any.
> I'd sell it and get two extra monitors, or sell both and get two extra monitors plus GTX 680 4GB SLI, which will overclock higher and not run out of VRAM in surround set-ups. But it's up to you; either way you should sell the paperweight and watercool your system.


Three monitors might suffer with the 690. I was playing Crysis 2 and was getting 1972 MB of VRAM usage, and that's at 1080p alone. NVIDIA/board partners should have made it 4 GB, because it gets outperformed by two 680s and costs the same; if it were 4 GB then maybe it would be worth the price. But I don't really like 3 monitors; it looks cool, but you're getting the same DPI. I would rather get a higher-res monitor, but they seem like a rip-off right now: $1,000 for a decent one, and they are not 120 Hz. I'm getting used to 120 Hz now. You would think buying expensive hardware would solve problems, but nope, it creates more, lol.


----------



## egotrippin

Quote:


> Originally Posted by *Divineshadowx*
> 
> 3 monitors might suffer with the 690, I was playing crysis 2 and was getting 1972mb vram usage, and this is on 1080p alone. Nvidia/board partners should have made it 4gb, because it gets outperformed by 2 680 and costs the same, if it was 4gb then maybe it would be worth the price. But I don't really like 3 monitors, looks cool but you're getting the same dpi. *I would rather get a higher res monitor but they seem like a rip off right now, 1000 for a decent one,* and they are not 120hz. I'm getting use to 120hz now. You would think buying expensive hardware would solve problems, but nope, it creates more lol.


You can get one of the Korean high-res monitors like the Shimian or Catleap for $350-400, and that'll give ya 2560x1440 with what I think is only a slight grade down from Apple's panels. I got an Apple 27" display for $715 on eBay and it's awesome... you don't even know what colors you're missing until you get something that has 80%+ Adobe RGB gamut. It's like I'm viewing everything for the first time; there are so many colors that I didn't even realize weren't well represented on a regular monitor. I'm not a graphic artist, so I didn't "need" that color space with the uniformity and viewing angles of an IPS display, but I can still appreciate it.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> 3 monitors might suffer with the 690, I was playing crysis 2 and was getting 1972mb vram usage, and this is on 1080p alone. Nvidia/board partners should have made it 4gb, because it gets outperformed by 2 680 and costs the same, if it was 4gb then maybe it would be worth the price. But I don't really like 3 monitors, looks cool but you're getting the same dpi. I would rather get a higher res monitor but they seem like a rip off right now, 1000 for a decent one, and they are not 120hz. I'm getting use to 120hz now. You would think buying expensive hardware would solve problems, but nope, it creates more lol.


I totally agree with you, lol. It just creates more problems and trouble trying to get everything to work perfectly in harmony.

I'm selling my GTX 690 because it doesn't have enough VRAM. Very slack and stupid of NVIDIA to only include 2GB per GPU, and it was a bad purchase on my part, but I wasn't going to run a surround set-up, so I can't be too disappointed.


----------



## tonyjones

Do you guys think upgrading from a Radeon 6990 to GTX 690 is worth it? I'm a casual gamer.


----------



## Qu1ckset

Quote:


> Originally Posted by *tonyjones*
> 
> Do you guys think upgrading from a Radeon 6990 to GTX 690 is worth it? I'm a casual gamer.


What resolution are you playing at?

I actually sold my 6990 to get the GTX 690; huge performance difference. But if you're just playing at 1080p or 1440p, the 6990 should be fine.

The GTX 690 runs a lot cooler and is nowhere near as loud as the HD 6990.


----------



## tonyjones

Right now I'm only playing at 1920x1200; I plan on getting a 27" or 30" soon, though.


----------



## Qu1ckset

Quote:


> Originally Posted by *tonyjones*
> 
> Right now I'm only playing at 1920x1200, I plan on getting a 27" or 30" soon though.


Honestly, you should be fine at 1440p and 1600p.


----------



## jagz

When I bought my 690, I don't know how I overlooked its VRAM; I thought it was 4GB. I didn't think Asus would try to sucker people that badly and say it's 4GB when they know full well that 2GB on one GPU and 2GB on the other GPU do not equal 4GB.

I hope I never run into any VRAM limitations at 1440p. Ran 3DMark11 last night; this is 5.2 GHz on the 2700K (I can go much higher, this chip is amazing) and 1161 MHz on the 690 (I can't go any higher no matter what I do).



Link

That's with Nvidia beta driver 304.79, got quite a lower score on driver 301.42


----------



## Qu1ckset

Quote:


> Originally Posted by *jagz*
> 
> When I bought my 690 I don't know how I overlooked it's vram. I thought it was 4GB.. I didn't think Asus would try to sucker people that bad and say it's 4GB when they know full well 2GB on one GPU and 2GB on the other GPU do not = 4GB.
> I hope I never run into any vram limitations at 1440p. Ran 3DMark11 last night, this is 5.2Ghz on the 2700k (I can go much higher, this chip is amazing) and 1161Mhz on the 690 (I can't go any higher no matter what I do)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Link
> That's with Nvidia beta driver 304.79, got quite a lower score on driver 301.42


Um, every dual-GPU card has been like that... they all advertise the combined VRAM total. This is nothing new; both NVIDIA and AMD do it.

Like the GTX 295, the 2GB HD 5970 (1GB x 1GB), the 2GB GTX 460 2Win (1GB x 1GB), the 2GB GTX 560 2Win (1GB x 1GB), the 4GB HD 6990 (2GB x 2GB), the 3GB GTX 590 (1.5GB x 1.5GB), and the GTX 690 (2GB x 2GB); and when the 6GB HD 7990 comes out, it will be 3GB x 3GB.

You should be perfectly fine at 1440p.
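To put a number on the advertised-vs-effective distinction, here's a small sketch. The per-GPU figures are the ones listed above; in SLI/CrossFire each GPU mirrors the frame data, so the usable pool is the per-GPU amount, not the box total:

```python
# Dual-GPU cards advertise the sum of both GPUs' memory pools, but
# because each GPU keeps its own full copy of the frame data, the
# effective pool a game can use is only the per-GPU amount.

DUAL_GPU_CARDS = {
    "HD 5970": 1.0,   # GB per GPU
    "GTX 590": 1.5,
    "HD 6990": 2.0,
    "GTX 690": 2.0,
}

def advertised_vram(per_gpu_gb, num_gpus=2):
    """What the box says: both pools summed together."""
    return per_gpu_gb * num_gpus

def effective_vram(per_gpu_gb):
    """What games can actually use: one mirrored pool."""
    return per_gpu_gb

for card, per_gpu in DUAL_GPU_CARDS.items():
    print(f"{card}: advertised {advertised_vram(per_gpu):g} GB, "
          f"effective {effective_vram(per_gpu):g} GB")
```

So a "4GB" GTX 690 behaves like a 2GB card for texture headroom, which is why the surround/VRAM worries in this thread keep coming up.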


----------



## Divineshadowx

I'm going to wait for new drivers; if they do not solve my usage problems, I'm going to get rid of one of my 690s and buy more/better monitors. It makes no sense how the power is there but games cannot utilize it. Heaven stresses them at 1080p to 70-80%, so graphical load is out of the question, IMO.


----------



## Divineshadowx

Quote:


> Originally Posted by *Qu1ckset*
> 
> Um, every dual-GPU card has been like that... they all advertise the combined VRAM total. This is nothing new; both NVIDIA and AMD do it.
> Like the GTX 295, the 2GB HD 5970 (1GB x 1GB), the 2GB GTX 460 2Win (1GB x 1GB), the 2GB GTX 560 2Win (1GB x 1GB), the 4GB HD 6990 (2GB x 2GB), the 3GB GTX 590 (1.5GB x 1.5GB), and the GTX 690 (2GB x 2GB); and when the 6GB HD 7990 comes out, it will be 3GB x 3GB.
> you should be perfectly fine at 1440p


A single 690 is overkill for 1440p and 1600p; I get 150 fps in BF3 and the capped 100 in Crysis 2 at 1080p. Since there are no 1440p/1600p monitors at 120 Hz, 60 is all you're gonna need.


----------



## egotrippin

Quote:


> Originally Posted by *Divineshadowx*
> 
> A single 690 is overkill for 1440p and 1600p, I get 150fps on bf3, and the max 100 on crysis 2 @1080p. Since there are no 1440/1600p monitors at 120hz, 60 is all you're gonna need.


*overkill is just the right amount of kill*


----------



## Qu1ckset

Quote:


> Originally Posted by *Divineshadowx*
> 
> A single 690 is overkill for 1440p and 1600p, I get 150fps on bf3, and the max 100 on crysis 2 @1080p. Since there are no 1440/1600p monitors at 120hz, 60 is all you're gonna need.


The GTX 690 is overkill right now, but in a year or two it won't be. Look at how the 5970 and 295 perform now; they aren't as overkill as they used to be.


----------



## Divineshadowx

Quote:


> Originally Posted by *Qu1ckset*
> 
> The gtx690 is overkill right now, but in a year or two it won't, look at how the 5970 and 295 perform now, they ain't so over kill like they used to be.


Those cards are 3-4 years old, not 1-2. And we won't see any upgrades in resolution until 2K TVs start coming out, or until DisplayPort becomes the new standard, because DL-DVI and HDMI can't support 1080p+ @ 120Hz. Maybe more demanding games, since new consoles are coming, but still, a single 690 will eat all the new consoles combined sevenfold.
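The bandwidth claim checks out on the back of an envelope. A rough sketch, assuming dual-link DVI's ~330 MHz pixel-clock ceiling (two 165 MHz TMDS links) and ~5% blanking overhead for CVT reduced blanking:

```python
DL_DVI_MAX_MHZ = 330  # two 165 MHz TMDS links
BLANKING = 1.05       # approx. CVT reduced-blanking overhead

def pixel_clock_mhz(width, height, hz):
    """Approximate pixel clock a given mode requires."""
    return width * height * hz * BLANKING / 1e6

for w, h, hz in [(1920, 1080, 120), (2560, 1440, 60), (2560, 1440, 120)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= DL_DVI_MAX_MHZ else "exceeds DL-DVI"
    print(f"{w}x{h}@{hz}Hz: ~{clk:.0f} MHz ({verdict})")
```

1080p@120Hz lands around 260 MHz (fits, hence the 120Hz TN panels of the day), while 1440p@120Hz needs roughly 460 MHz, well past what DL-DVI can carry.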


----------



## Divineshadowx

Quote:


> Originally Posted by *egotrippin*
> 
> You can get one of the Korean high-def monitors like Shimian or Catleap for $350-400 and that'll give ya 2560x1440 using what I think is only a slight grade down from Apple monitors. I got an Apple 27" display for $715 on eBay and it's awesome... you don't even know what colors you're missing until you get something that has 80%+ Adobe RGB Gamut... it's like I'm viewing everything for the first time. There are so many colors that I didn't even realize weren't very well represented on a regular monitor. I'm not a graphic artist so I didn't "need" that color space with the uniformity and viewing angles of an IPS display but I can still appreciate it.


That price point seems really nice. But they do 16.7 million colors (8-bit) versus the 1 billion (10-bit) of the expensive ones. Would that make a difference, especially since it's IPS? If I sell my 690 I could buy three of those, for a crazy resolution. But I think I would run out of VRAM.
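Those two figures are just panel bit depth: 8 bits per channel gives 2^24 colors, 10 bits gives 2^30. A quick check:

```python
def colors(bits_per_channel, channels=3):
    """Total displayable colors for a panel with the given bit depth."""
    return 2 ** (bits_per_channel * channels)

print(colors(8))   # 16,777,216  -> the "16.7 million" spec
print(colors(10))  # 1,073,741,824 -> the "1 billion" spec
```

Whether you can see the difference is another question; 10-bit mostly matters for smooth gradients, and the source content and GPU output path both have to support it.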


----------



## egotrippin

Quote:


> Originally Posted by *Divineshadowx*
> 
> That price point seems really nice. But they have 16.7mil colors vs the 1billion of the expensive ones. Would that make a difference, especially since it is ips? If i sell my 690 I could buy 3 of those, for a crazy resolution. But I think i would run out vram.


I'm of the mindset that the monitor should be among the most expensive components you purchase... after all, it's the part of the computer you spend time looking at. If you're the kind of person to drop the dough for GTX 690s, you can probably get a nice monitor. If I hadn't purchased an Apple display I would have bought HP's ZR30w. It has 111% of the Adobe RGB gamut and among the lowest input lag of any IPS monitor on the market. Check out the review at http://www.anandtech.com/show/3754/a-new-30-contender-hp-zr30w-review


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Those cards are 3-4 year old not 1-2. And we wont see any upgrades in resolution until 2k tv's start coming out, or when display port starts becoming the new thing, because dl-dvi and hdmi cant support 1080+ @120hz. Maybe more demanding games since new consoles are coming but still, a single 690 will eat all the new consoles combined sevenfold.


Not to mention cards back then were rather weak compared to today's tech. This is the first generation since Crysis where a single-GPU card can play any game at 1080-1200p.

That being said, they will become outdated, but not nearly as quickly as the GTX 295.

Quote:


> Originally Posted by *egotrppin*
> I'm of the mindset that the monitor should be among the most expensive components you purchase... after all, it's the part of the computer you spend time looking at. If you're the kind of person to drop the dough for GTX 690s you can probably get a nice monitor. If I didn't purchase an Apple display I would have bought HPs ZR30w. It has 111% of the Adobe RGB gamut and has among the lowest input lag of any IPS monitor on the market. Check out the review at http://www.anandtech.com/show/3754/a-new-30-contender-hp-zr30w-review


Looks like the first ever 30inch worth gaming on. Thats really exciting!


----------



## Divineshadowx

Quote:


> Originally Posted by *Sexparty*
> 
> Not to mention cards back then were rather weak compared to todays tech. This is the first generation in a while a single GPU card can play any game at 1080-1200P since crysis.
> With that being said they will become oputdated but not nearly as quickly as the GTX 295.
> Looks like the first ever 30inch worth gaming on. Thats really exciting!


That post is two and a half years old, lol; have they seriously not released anything new? I would like that higher res over my 1080p, but selling my 690 seems less future-proof. Then again, it sitting on my bed seems less useful than that. I seriously do think those monitors are overpriced, though. I mean, it costs more than a 690, a 3960X, basically any computer component out there. The Shimians are really cheap; even though people say they are Apple rejects, I would take that as a compliment, as I find Apple trash, no offense to egotrippin. But if I go that route I might as well get three, and 7680x1440 would probably be the demise of a single 690.


----------



## egotrippin

Quote:


> Originally Posted by *Divineshadowx*
> 
> That post is 2 and a half years old lol, have they seriously not released anything new. I woud like that higher res over my 1080 but selling my 690 seems less future proof. But also it sitting on my bed seems less usefull than that. I seriously do think those monitors are overpriced though. I mean if costs more than a 690, a 3960x, basically any computer component out there. The shimians are real cheap, even though people say they are apple rejects, I would take that as a compliment as I find apple trash, no offense to egotrippin. But if I go that route i might as well get 3, but 7860x1440 would probably be the demise of a single 690.


I read somewhere that HP should be doing a refresh on their Z series soon... but even though it's over two years old, it's still among the best out there, and you can get one used for around $750 on eBay.

"Apple rejects" should be a compliment, because Apple usually represents the apex of technology, so being just shy of their quality guidelines means you still have an excellent product. Unfortunately, their target market isn't gamers and their products aren't cheap, so I know there's a lot of resentment out there. The GTX 690 is supported with the release of Mountain Lion, so it looks like some of those compatibility issues may be starting to go away.


----------



## Qu1ckset

I'd rather buy a U3011 over a ZR30w any day, but I decided to go with the Crossover just for the price range and I don't regret it one bit. Honestly, I went the Eyefinity route and hated it when it came to gaming; way too many problems. If I ever were to try it again, it would be years from now, when the bugs are ironed out, higher-res screens are as mainstream as 1080p, and bezels are super slim. Until then, I'd rather own a bigger high-res screen any day.

And if you were planning on doing 1440p/1600p Eyefinity/Surround, buying 2x GTX 690s was a bad purchase due to limited VRAM. A GTX 690 is perfect for any single-screen setup. I learned my lesson with 4-way XFire/SLI when I had my crossfired 5970s; it's been proven in several benches that 3-way SLI is the best, as the fourth card usually adds barely any performance.


----------



## iARDAs

Will there ever be a 690 with 8GB of VRAM?

Qu1ckset pointed out a great fact.

The 690 is truly a beast and yes, 2GB of VRAM per core is enough, but how come there is no version of the 690 with 4GB of VRAM per core?

I hate the marketing scam of NVIDIA's dual-GPU cards.

My previous 590 was advertised as 3GB of VRAM, but to be honest it was 1.5GB.


----------



## dynn

Hi there!

Which SSD is the best one? The Crucial m4?

http://www.newegg.com/Product/Product.aspx?Item=N82E16820148442

And which RAM is better:

G.SKILL Ripjaws Z Series 8GB (2 x 4GB) 240-Pin DDR3 SDRAM DDR3 2400 (PC3 19200) Desktop Memory Model F3-2400C10D-8GZH
http://www.newegg.com/Product/Product.aspx?Item=N82E16820231585
or this one

CORSAIR Dominator Platinum 8GB (2 x 4GB) 240-Pin DDR3 SDRAM DDR3 2133 Desktop Memory Model CMD8GX3M2A2133C9
http://www.newegg.com/Product/Product.aspx?Item=N82E16820233285

Should the Ripjaws be best because they're 2400?

I have an i7 3770K and a Maximus V Formula motherboard.

Suggestions are welcome.


----------



## Qu1ckset

Quote:


> Originally Posted by *iARDAs*
> 
> Will there ever be a 690 with 8GB vram?
> 
> Qu1ckset pointed a great fact.
> 
> 690 is truly a beast and YES 2GB of vram per core is enough, but how come there is not a version of 690 with 4gb vram per core?
> 
> I hate the marketing scam of Nvidia's dual core GPUs.
> 
> My previous 590 was advertised as 3GB of vram, but it was 1.5GB to be honest.


You are the second person to comment on the advertised VRAM for dual-GPU cards. This is nothing new, and ATI/AMD do the same thing; it's been like this since the birth of dual-GPU cards. I knew this when buying my 5970, back when I was a newb to computers.

And no, they won't release an 8GB 690, unless you count the Asus Mars III, which will come with a huge price tag and only be in limited production.
Keep your fingers crossed that they release a 790.


----------



## Qu1ckset

Quote:


> Originally Posted by *dynn*
> 
> Hi there!
> wich SSD is the best one? m4 crucial?
> http://www.newegg.com/Product/Product.aspx?Item=N82E16820148442
> And what memory ram is better
> 
> G.SKILL Ripjaws Z Series 8GB (2 x 4GB) 240-Pin DDR3 SDRAM DDR3 2400 (PC3 19200) Desktop Memory Model F3-2400C10D-8GZH
> http://www.newegg.com/Product/Product.aspx?Item=N82E16820231585
> or this one
> CORSAIR Dominator Platinum 8GB (2 x 4GB) 240-Pin DDR3 SDRAM DDR3 2133 Desktop Memory Model CMD8GX3M2A2133C9
> http://www.newegg.com/Product/Product.aspx?Item=N82E16820233285
> should ripjaws be best because is 2400?
> i have i7 3770k and motherboard maximus v formula
> suggestion are welcome


Man, you're posting this stuff in the wrong thread; post these kinds of questions in the Intel build section or the SSD section. But to answer them:
For 128GB SSDs, go for the Crucial m4, Samsung 830, or OCZ Vertex 4; pick whichever one has the best price.

As for RAM, if you're just gaming there is no need for that speed; it does nothing for you unless you're benching and such. Both are good sticks, pick whichever is cheaper IMO.


----------



## iARDAs

Quote:


> Originally Posted by *Qu1ckset*
> 
> You are the second person to comment on the advertised vram for dual gpu cards, this is nothing new and ati/amd do the same thing, its been like this since the birth of dual gpu cards.... I knew this when buying my 5970 when I was newb to computers..
> And no they won't release a 8gb 690, unless you count Asus Mars III which will come with a huge price tag and only be in limited production..
> Keep your fingers crossed if the release a 790


I wonder if in the USA a customer can actually sue NVIDIA or ATI for false advertising. After all, 2+2 does not equal 4 in this case.

Anyhow, it sucks that they won't release a 4GB-per-core (a.k.a. 8GB) version of the 690.

So much power in that GPU, even better than 670 SLI, but you might hit the VRAM wall eventually.

When I had my 590, I was playing BF3 very smoothly at around 100+fps on my older 120Hz monitor, but suddenly my FPS would drop to around 40. I never checked back then, as I didn't know the importance of VRAM, but I am sure I was hitting the 1.5GB VRAM barrier.

The 590 was not a great GPU; a 570 SLI setup would easily get better results. But with the 690, a 670 or 680 SLI setup is barely or exactly equal to it. Hence I cannot understand the VRAM being stuck at 2GB.

Don't get me wrong, guys, I DO love the 690. This was not a rant or hate; it's just that things could be different.


----------



## taderater

Where did you get the backplates?
I have searched endlessly and only the EVGA ones pop up.
Please advise,
Thanks


----------



## kzinti1

Quote:


> Originally Posted by *taderater*
> 
> Where did you get the backplates?
> I have searcged endlesly and only the EVGA ones pop up.
> Please advise,
> Thanks


Have you tried here? http://www.performance-pcs.com/catalog/index.php?main_page=product_info&cPath=59_971_1018_1038&products_id=34825


----------



## Divineshadowx

Will 3x 2560x1440 be enough to stress dual 690s/quad 680s to the point where games can actually utilize the cores, or will I just get lower fps at the same usage?


----------



## y2kcamaross

Quote:


> Originally Posted by *Divineshadowx*
> 
> Will 3x 2560x1440 be enough to stress dual 690/quad 680 to the point where games and such can utilize the cores, or will I just get lower fps and have the same usage.


Quad-GPU usage is still pretty god-awful in 90% of games. I'd imagine your frame rates won't be great, and you'll have to turn down some settings in order to stay under the 2GB VRAM limit.


----------



## Qu1ckset

Quote:


> Originally Posted by *Divineshadowx*
> 
> Will 3x 2560x1440 be enough to stress dual 690/quad 680 to the point where games and such can utilize the cores, or will I just get lower fps and have the same usage.


You are going to run into VRAM problems; 3-way SLI with 4GB 680s would be better for that.


----------



## Marcsrx

Quote:


> Originally Posted by *Divineshadowx*
> 
> A single 690 is overkill for 1440p and 1600p, I get 150fps on bf3, and the max 100 on crysis 2 @1080p. Since there are no 1440/1600p monitors at 120hz, 60 is all you're gonna need.


I disagree with this. At 1440p all maxed out, BF3 cannot maintain 100fps, which is the refresh rate of my Catleap. It averages ~80fps and does hit/exceed 100fps at times, so I can vsync at 100Hz. Not to mention the temps are ~70°C. I think 1440p does just the right amount of hurt to the 690.


----------



## Marcsrx

Edit, at 60fps, yes you are right...


----------



## Divineshadowx

Quote:


> Originally Posted by *Marcsrx*
> 
> I disagree with this. At 1440p all maxed out BF3 can not maintain 100fps which is the refresh rate of my catleap. It averages ~80fps and does hit/exceed 100fps but I can vsync at 100hz. Not to mention the temps are ~70degrees C. I think 1440p does just the right amount of hurt to the 690.


Well, with your monitor I guess it holds back; did you overclock your GPU? I was thinking more fps. But for 3x 2560x1440 there would be no point in fps higher than 60, because you can use only one DisplayPort output, so I'm guessing the max refresh rate across three monitors would be 65-70, because DL-DVI runs out of bandwidth. I was getting a constant 60fps in Crysis 2 with 3D, and wouldn't 3D technically be like rendering 3840x1080 on a 1080p display, since it renders the scene twice, once per eye? If that's how it works, that's 4.15 million pixels per frame, a bit more than 2560x1440's 3.69 million, so ya...
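A quick sketch of that pixel arithmetic, assuming stereo 3D simply means two full 1920x1080 passes per frame (real stereo drivers add some per-eye overhead on top of this):

```python
def pixels(w, h):
    """Raster pixels per frame at a given resolution."""
    return w * h

stereo_1080 = 2 * pixels(1920, 1080)  # one pass per eye
qhd = pixels(2560, 1440)

print(stereo_1080, qhd, stereo_1080 / qhd)  # 4147200 3686400 1.125
```

So sustaining 60fps in 1080p 3D is roughly 12.5% more pixel throughput than 60fps at a single 1440p screen, which supports the "a single 690 can handle 1440p" reading.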


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Will 3x 2560x1440 be enough to stress dual 690/quad 680 to the point where games and such can utilize the cores, or will I just get lower fps and have the same usage.


No, it'll stress them to their maximum if needed, provided it has enough VRAM. As soon as it hits the VRAM limit (which it will in BF3 as soon as you turn the settings past High), your FPS will drop to around 20. GTX 690 SLI would power 3x 1600p at 60FPS easily if it had enough VRAM. It probably wouldn't be a constant 95%+ core usage, though, which is what you'd get in Heaven.

My single GTX 690 is enough to power my 3880x1920 surround setup at 60FPS in BF3, but once it hits the VRAM wall I'm looking at 20FPS or worse. You just can't power a surround setup with 2GB of VRAM in BF3 and similar games. It just isn't possible. In BF3, motion blur alone uses 200-300MB of VRAM, even with the slider set to 0%. In BF3 I'm running:

Ultra everything.
No AA.
No SSAO.
No motion blur.

And I'm using 1700-2012MB of VRAM depending on the map and situation. With those settings I get 60-100FPS at 70-95% on the cores, but even that can't stress them at a constant 95%+.

GTX 680 4GB SLI would be a different story, and for 3x 1080p/1200p, SLI is more than enough. For 3x 1440p/1600p I'd be using tri-SLI but nothing more. Quad SLI just doesn't scale enough to be worth it.
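For a rough feel of why 2GB evaporates at surround resolutions, here's a toy estimate of the screen-space buffers alone. The render-target count and bytes-per-pixel are illustrative guesses for a deferred renderer, not BF3's actual allocations:

```python
def screen_buffers_mb(w, h, render_targets=5, bytes_per_px=4, swap_frames=2):
    """Rough screen-space allocation: G-buffer render targets plus a
    double-buffered swap chain, at 4 bytes per pixel each."""
    return w * h * (render_targets + swap_frames) * bytes_per_px / 2**20

for w, h in [(1920, 1080), (5760, 1080), (7680, 1440)]:
    print(f"{w}x{h}: ~{screen_buffers_mb(w, h):.0f} MB of screen-space buffers")
```

Screen-space buffers scale directly with pixel count (7680x1440 has 5.3x the pixels of 1080p), and textures and geometry sit on top of that, which is how a fixed 2GB pool gets pinned no matter how fast the cores are.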


----------



## Divineshadowx

Quote:


> Originally Posted by *Sexparty*
> 
> No it'll stress it to it's maximum if it's needed, provided it has enough VRAM, soon as it hits the VRAM limit (which it will in BF3 as soon as you turn the settings up past high) your FPS will drop to around 20FPS. GTX 690 SLI would power x3 1600P at 60FPS easily if it had enough VRAM. Probably wouldn't be a constant 95% core usage or above though, which is what you'd get in heaven.
> My single GTX 690 is enough to power my 3880x1920 surround set-up at 60FPS in BF3 but once it hits the VRAM wall I'm looking at 20FPS or worse. You just can't power a surround set-up with 2GB of VRAM in BF3 and similar games. Just isn't possible. In BF3 alone motion blur, even if the slider is set tp 0% uses 200-300MB of VRAM. In BF3 I'm running:
> Ultra everything.
> No AA.
> No SBAO.
> No Motion blur.
> And I'm using 1700-2012MB of VRAM depending on the map and situation. The FPS I get with those settings are 60-100FPS and 70-95% on the cores but even that isn't able to stress them a constant 95% and above.
> GTX 680 4GB SLI would be a different story and for x3 1080P/1200P SLI is more than enough. For x3 1440P/1600P I'd be using TRI SLI but nothing more. Quad SLI just doesn't scale enough to be worth it.


Ya, I was expecting the VRAM bottleneck. The 690 would truly be awesome with 4GB on each core. I really don't know what NVIDIA was thinking.

Anyways, I fixed my usage issue. Basically I went into NVIDIA Inspector and cranked up all the eye candy I could find. I set a combined mode of 2x2 SSAA and 2x MSAA, and it ran the usage up on all four cores and the boost clock activated. I was getting my constant 120fps without any drops. Thing is, with the eye-candy override settings my VRAM shot up from a normal 1.2-1.5GB to 1.95GB in BF3. And the cool thing is the game looked really crisp with the overridden settings; normally they look like crap. This is at 1080p. I tried the same on StarCraft 2, but it doesn't look very good overridden, though I got the usage up and with it the fps. BF3 looked excellent. I tried it on Crysis 2, but no matter what settings I use nothing changes; I even tried 4x4 SSAA, which should destroy my setup, but nothing changed. Got my 100-160fps on that. I'll play with it more tomorrow.


----------



## kingchris

Hi guys, got my 690 last week and wow, what a difference in picture quality when gaming. I'm impressed. Put it on water and temps are <40°C.


----------



## Descadent

Hello everyone, I just bought two more Korean Crossover monitors since they are on sale.

Is it better to go SLI 670 4GB or get a 690? (SLI 680 4GB costs more than a 690, so I won't be doing that.) Resolution is 7680x1440, and I won't be running MSAA, just FXAA if I run any AA at all.

Thanks.


----------



## Qu1ckset

Quote:


> Originally Posted by *Descadent*
> 
> hello everyone, I just bought two more Korean Crossover monitors since they are on sale.
> Is it better go to sli 670 4gb or get a 690? (sli 680 4gb are more than a 690 so I won't be doing that) Resolution is 7680x1440 and I won't be running msaa, just fxaa if I run any aa at all
> Thanks.


The GTX 690 holds its own against 3-way SLI 670s, but obviously the added VRAM is definitely needed at that resolution.
http://venkateshcv.blogspot.ca/2012/05/geforce-gtx-690-vs-3-way-sli-gtx-670.html


----------



## Qu1ckset

My first batch of stuff arrived for the watercooling build I'm doing, which includes the 690 Hydro Copper block and backplate.









Feel free to sub my build log!


----------



## Divineshadowx

Quote:


> Originally Posted by *Descadent*
> 
> hello everyone, I just bought two more Korean Crossover monitors since they are on sale.
> Is it better go to sli 670 4gb or get a 690? (sli 680 4gb are more than a 690 so I won't be doing that) Resolution is 7680x1440 and I won't be running msaa, just fxaa if I run any aa at all
> Thanks.


GTX 690 > 670 SLI, but the VRAM will hold you back, so it would probably be a wash if you don't turn up the eye candy a lot. I don't think SLI 670s would be enough for 7680x1440 anyway. As stated before, 80fps is normal in BF3 at 2560x1440 with a single 690, which is more powerful than 670 SLI, and you are running three times that resolution, so you're probably looking at 35-40fps, and I think he was running 4x MSAA.


----------



## thestache

Quote:


> Originally Posted by *Descadent*
> 
> hello everyone, I just bought two more Korean Crossover monitors since they are on sale.
> Is it better go to sli 670 4gb or get a 690? (sli 680 4gb are more than a 690 so I won't be doing that) Resolution is 7680x1440 and I won't be running msaa, just fxaa if I run any aa at all
> Thanks.


GTX 670 4GB SLI is the only option there, so yes.

A GTX 690 won't power 3x 1440p, period. FPS won't be a problem, but VRAM will. FPS-wise, GTX 670 4GB SLI will be able to maintain 60FPS at that resolution. It'll give them a good workout overclocked, but I would expect good performance; overclocked GTX 670 SLI isn't far off a GTX 690. So it's up to you. Personally I'd be running highly overclocked GTX 680 SLI or tri-SLI for 1440p surround, but for 1080p surround I'd say GTX 670 SLI.

You'll also be able to run as much AA as your FPS can handle with 4GB cards, so don't worry about that. But at such high resolutions, AA isn't as visually impactful anyway. I run none in BF3 on my surround setup, at least until I change my GTX 690 for GTX 680 4GB SLI, and don't notice it much. And I'm super anal about this type of stuff; I have to have graphics maxed.

But either way, the GTX 690 won't work, so it'll need to be changed.


----------



## Descadent

Thanks everyone, I wound up going with the 670 SC 4GB in SLI. We'll see how it runs; I can deal with it until next gen and just step up, as long as I can still play. Hopefully the monitors will be here Friday and the cards tomorrow, and I will see. Yes, I don't run AA at 1440p; you really gotta be up close and look hard to see the difference unless you have superhuman eyesight. 60fps is a big deal to me too, so we'll see what I'll have to compromise in graphical settings to get that. Unfortunately not many people have such a setup, but with the Korean monitors so cheap now, it's going to start happening.


----------



## thestache

Quote:


> Originally Posted by *Descadent*
> 
> thanks everyone, I wound up going with the 670 sc 4gb sli. We'll see how it runs. I can deal with it until next gen and just step up. As long as I can still play, but hopefully monitors will be here friday and cards tomorrow and I will see. Yes I don't run aa at 1440p, you really really gotta be up close and look hard to see it unless your super human eyesight. 60fps is a big deal to me too so we'll see what I'll have to do to compromise in graphical settings to get that. Unfortunately not many people have such a setup but with the korean monitors so cheap now, it's going to start happening.


You'll stay over 60FPS easily without it, and you're right, it isn't really needed. My surround setup doesn't use the full extent of my GTX 690 (70-95% usage), and I stay around 70-100 FPS in BF3. The only time I get bad performance is when I hit the VRAM wall, which is often.


----------



## Descadent

Ouch, yeah, the 2GB of RAM on each GPU really hurts the 690, as everyone says, even though they advertise it as 4GB.


----------



## Qu1ckset

Quote:


> Originally Posted by *Descadent*
> 
> ouch, yeah the 2gb ram on each gpu really hurts 690 as everyone says, even know they advertise it as 4gb


Why does everyone say that? It's been like this for years from both NVIDIA and AMD; they have always doubled up the RAM total on all their dual-GPU cards.


----------



## max883

Hope NVIDIA releases a new driver that would allow us water-cooled GTX 690 owners to increase the voltage from 1.75 to something like 2.35.








And maybe we could overclock it up to +300. That would have been nice!!









I have an XSPC water-cooled 690; idle is 24°C, max load 40°C.

PT +135
GPU +150
Mem +500


----------



## y2kcamaross

Quote:


> Originally Posted by *max883*
> 
> Hope nvidia realeses a new driver that would allow us GTX 690 water cooled owners to increse the voltage from 1.75 to something like 2.35
> 
> 
> 
> 
> 
> 
> 
> 
> and baybe we can overclokc it up to +300 That had ben nice!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have a xspc 690 water cooled card and idle is: 24c max load 40c.
> 
> pt +135
> gpu +150
> mem +500


A 690 would burn up at that kind of voltage on anything that wasn't sub-zero cooling.


----------



## Divineshadowx

Quote:


> Originally Posted by *Qu1ckset*
> 
> Why does every one say that, its been like this for years from both nvidea and amd, they have always doubled up the ram total on all there dual gpu cards


Not sure; their site clearly states 2048MB per GPU. Maybe because the 690 costs more than 2GB 680 SLI and performs slightly worse, so they assume it should have 4GB per GPU like the 4GB 680s. I kinda feel ripped off. I know it uses less power and cools better, but really, if someone can afford 690 SLI/quad-SLI, they can probably afford decent case airflow/water cooling and a sufficiently powerful PSU.


----------



## egotrippin

Quote:


> Originally Posted by *max883*
> 
> Hope nvidia realeses a new driver that would allow us GTX 690 water cooled owners to increse the voltage from 1.75 to something like 2.35
> 
> 
> 
> 
> 
> 
> 
> 
> and baybe we can overclokc it up to +300 That had ben nice!!
> 
> 
> 
> 
> 
> 
> 
> 
> I have a xspc 690 water cooled card and idle is: 24c max load 40c.
> pt +135
> gpu +150
> mem +500


It's that kind of talk that caused so many people to burn up their 590s... but hey, I guess this is OCN right?


----------



## Qu1ckset

Quote:


> Originally Posted by *Divineshadowx*
> 
> Not sure, their site clearly stares 2048mb per gpu. Maybe because the 690 costs more than the 2gb 680 sli and performs slightly less so they think it would be 4gb per gpu like the 4gb 680's. I kinda feel ripped off. I know it uses less power and cools better, but really if someone can afford 690sli/quadsli they can probablyafford decent case airflow/water cooling a power sufficient psu.


Not a lot of people are running dual 690s; I think it's a waste. A single 690 merks 1600p or lower, only takes up a single slot, runs cooler, and uses less power like you said, so why run two? If I planned on running 3x 1440p/1600p surround, I would have never considered the 690 and would have gone with 3-way SLI 4GB 680s.

Most benches show the fourth card in 4-way SLI adds very little performance, so why buy 2x 690s?

May I ask what your expectations were when you bought two GTX 690s? Because the GTX 590 was no different when it came out, except for the fact it had crappy VRAM.


----------



## Divineshadowx

Quote:


> Originally Posted by *Qu1ckset*
> 
> Not alot of people are running dual 690s, I think its a waste, a single 690 merks 1600p or lower, and only takes up a single slot, runs cooler, and uses less power like you said, so why run two? If I planned on running 3x 1440p/1600p surround I would have never considered the 690 and would have went wit 3way sli 4gb 680s...
> Most benches show the forth card in 4way sli adds very little performance so why buy 2x 690s.
> May I ask what your expectations where when you bought two gtx 690s?? , because the gtx 590 was no different when it came out except for the fact it had crappy vram


The 590 was much different from the 690; 580 SLI destroyed a 590, while 680 SLI doesn't destroy a 690. And I wanted to play at 120Hz, and benchmarks showed a single 690 getting about 60fps in BF3, so I bought two. But obviously that isn't true, because I get 150fps in BF3 with one. Also I didn't plan on going triple-screen 1440p, so ya, two 690s is a waste.


----------



## Qu1ckset

Quote:


> Originally Posted by *Divineshadowx*
> 
> The 590 was much different the 690. 580 sli destroyed a 590, 680sli don't destroy a 690. And I wanted to play at 120hz, and benchmarks showed a single 690 getting about 60fps in bf3 so I bought two. But obviously that isn't true because because I get 150fps in bf3 with one. Also I didn't plan on going tripple screen 1440p, so ya 2 690s is a waste.


Why don't you sell one to make most of your money back, while there is still demand for it?


----------



## Divineshadowx

Quote:


> Originally Posted by *Qu1ckset*
> 
> why dont you sell one to make most your money back, while there is still a demand for it?


Because if I sell it I would have $1000, and I would use that on three 2560x1440 monitors. And the single 690 I have left would run into VRAM problems and wouldn't actually be able to drive that resolution, I'm guessing. Same if I sell both and get 4x 4GB 680s; then I would have to go water cooling because there would be literally no airflow, and that's another $500. I guess I could go with three 680s and have $500 left over for more monitors after I sell my current one. Not sure if three 680s would be enough for 7680x1440 with max eye candy.


----------



## Qu1ckset

Quote:


> Originally Posted by *Divineshadowx*
> 
> Because if I sell it I would have $1000, and I would use that on 3 2560x1440p monitors. And the single 690 I have left would run into vram problems and wouldn't actually be able to drive that resolution i'm guessing. Same if I sell both and get 4x 4gb 680s, then I would have to go water cooling because there would be literally no airflow, that's another 500. I guess I could go 3 680s and have 500 left over for more monitors after I sell my current monitor. Not sure if 3 680s would be enough for 7680x1440 with max eye candy.


3x 680s should be more than enough; the fourth card will not add much performance. 3-way SLI is the sweet spot.


----------



## thestache

Quote:


> Originally Posted by *Qu1ckset*
> 
> 3x 680s should be more then enough, the fourth card will not add very much performance, 3 sli is the sweet spot


Just sell both of them and go GTX 670 4GB Tri SLI and watercool them. More than enough for 1440P surround.


----------



## Qu1ckset

Quote:


> Originally Posted by *Sexparty*
> 
> Just sell both of them and go GTX 670 4GB Tri SLI and watercool them. More than enough for 1440P surround.


Tri-SLI 670s is only a tad better than a single 690, but obviously the added VRAM would help.


----------



## Shogon

Quote:


> Originally Posted by *Qu1ckset*
> 
> Why does every one say that, its been like this for years from both nvidea and amd, they have always doubled up the ram total on all there dual gpu cards


I guess some are new to the dual GPU card world


----------



## thestache

Quote:


> Originally Posted by *Qu1ckset*
> 
> Tri SLI 670s is only a tad better then a single 690, but obviously the added vram would help


Not true.

Not sure why people in this thread seem to think GTX 670 SLI is slower than a GTX 690, but on guru3d their benchmarks at 1600p show a difference of 1 FPS at stock clocks between the two, and GTX 670 tri-SLI being 20 FPS faster than a GTX 690 (in BF3).

I would expect GTX 670 SLI to perform slightly below a GTX 690 at stock, and to surpass it when both are overclocked.

A GTX 690 with enough VRAM would run 1440p surround, and with GTX 670 tri-SLI being faster (around 25-35% from what I've seen), I'd expect it to handle it comfortably. If my motherboard took tri-SLI, that's what I'd be running, but because it doesn't, I'm stuck with GTX 680 4GB SLI as my only option.


----------



## Divineshadowx

Quote:


> Originally Posted by *Sexparty*
> 
> Not true.
> Not sure why people in this thread seem to think GTX 670 SLI is slower than a GTX 690, but on guru3d their benchmarks at 1600P show a difference of 1 FPS at stock clocks for both, and GTX 670 Tri SLI being 20 FPS faster than a GTX 690 (in BF3).
> I would expect GTX 670 SLI to perform slightly below a GTX 690, and overclocked to surpass an overclocked GTX 690.
> A GTX 690 with enough VRAM would run 1440P surround, and with GTX 670 Tri SLI being faster (around 25-35% from what I've seen) I'd expect it to comfortably handle it. If my motherboard took Tri SLI that's what I'd be running, but because it doesn't I'm stuck with GTX 680 4GB SLI as my only option.


Well, according to basically everyone here there is no difference in GPUs. I might as well get a 660 Ti since it is basically the same as a 670, which is the same as a 680. But somehow they all beat the 690 because they decided to give it 2GB of VRAM when freaking gaming notebooks have more. Maybe the 7990 will be good. I don't generally like AMD, seeing their crap CPUs, but at least they release drivers every 2 weeks. Unlike NVIDIA, whose last damn stable release was 4 months ago?


----------



## egotrippin

I've decided to sell my GTX 690 backplate. There's nothing wrong with it; I haven't even taken it out of the wrapper. I decided that since my motherboard is inverted it won't be visible anyway. I could probably still return it, since it's only been a couple of weeks, but I figured I'd give somebody in here a shot because they are still out of stock.

Message me if you're interested.



*EDIT: SOLD*


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Well, according to basically everyone here there is no difference in GPUs. I might as well get a 660 Ti since it is basically the same as a 670, which is the same as a 680. But somehow they all beat the 690 because they decided to give it 2GB of VRAM when freaking gaming notebooks have more. Maybe the 7990 will be good. I don't generally like AMD, seeing their crap CPUs, but at least they release drivers every 2 weeks. Unlike NVIDIA, whose last damn stable release was 4 months ago?


Well, unlike every other idiot on here I know what I'm talking about or keep my mouth closed. Lol. So here: of course it doesn't scale well at 1200P (as you know from your own tests), but once you actually strain the cards you see more of a lead and what they are capable of.

All of them being within a frame of each other at 1600P until Tri SLI comes along and ruins the party.

So honestly, best bang for buck has and always will be GTX 670 SLI and Tri SLI.




AMD: honestly, if I could get over how bad their drivers were I'd be getting an HD 7990 or HD 7970 CrossFire myself, but I can't, since my last generation's GPUs and the problems I had with them were so bad. They've released one stable set of drivers since BF3 came out, and I was having problems at 1080P. Would have hated to have had my surround set-up back then.

The memory interface size and increased VRAM is tempting but the issues that come with AMD just aren't worth it until they sort their stuff out.


----------



## Qu1ckset

Quote:


> Originally Posted by *Sexparty*
> 
> Not true.
> Not sure why people in this thread seem to think GTX 670 SLI is slower than a GTX 690, but on guru3d their benchmarks at 1600P show a difference of 1 FPS at stock clocks for both, and GTX 670 Tri SLI being 20 FPS faster than a GTX 690 (in BF3).
> I would expect GTX 670 SLI to perform slightly below a GTX 690, and overclocked to surpass an overclocked GTX 690.
> A GTX 690 with enough VRAM would run 1440P surround, and with GTX 670 Tri SLI being faster (around 25-35% from what I've seen) I'd expect it to comfortably handle it. If my motherboard took Tri SLI that's what I'd be running, but because it doesn't I'm stuck with GTX 680 4GB SLI as my only option.


Quote:


> Originally Posted by *Sexparty*
> 
> Well, unlike every other idiot on here I know what I'm talking about or keep my mouth closed. Lol. So here: of course it doesn't scale well at 1200P (as you know from your own tests), but once you actually strain the cards you see more of a lead and what they are capable of.
> All of them being within a frame of each other at 1600P until Tri SLI comes along and ruins the party.
> So honestly, best bang for buck has and always will be GTX 670 SLI and Tri SLI.
> 
> 
> AMD. Honestly if I could get over how bad their drivers were I'd be getting a HD 7990 or HD 7970 XFire myself but I can't since my last generations GPUs and the problems I had with them were soo bad. They've released one stable set of drivers since BF3 released and I was having problems at 1080P. Would have hated to have had my surround set-up back then.
> The memory interface size and increased VRAM is tempting but the issues that come with AMD just aren't worth it until they sort their stuff out.


I never said that 3-way SLI GTX 670s are slower than a single GTX 690; I said the 690 holds its own against them. Even in that benchmark you included, you can see it matched them up to 1200P, and only at 1600P is it 15fps slower, which proves me right when saying the GTX 690 holds its own against 3-way SLI 670s.

And if you look, the single 670 is close to the single 680.
Quote:


> Originally Posted by *Divineshadowx*
> 
> Well, according to basically everyone here there is no difference in GPUs. I might as well get a 660 Ti since it is basically the same as a 670, which is the same as a 680. But somehow they all beat the 690 because they decided to give it 2GB of VRAM when freaking gaming notebooks have more. Maybe the 7990 will be good. I don't generally like AMD, seeing their crap CPUs, but at least they release drivers every 2 weeks. Unlike NVIDIA, whose last damn stable release was 4 months ago?


AMD is not releasing the HD 7990, according to a lot of rumors going around on the internet, so you would have to rely on OEMs to make them, and so far PowerColor is the only one confirmed making it, and that card's a triple-slot monster.


----------



## thestache

Quote:


> Originally Posted by *Qu1ckset*
> 
> I never said that 3-way SLI GTX 670s are slower than a single GTX 690; I said the 690 holds its own against them. Even in that benchmark you included, you can see it matched them up to 1200P, and only at 1600P is it 15fps slower, which proves me right when saying the GTX 690 holds its own against 3-way SLI 670s.
> And if you look, the single 670 is close to the single 680.
> AMD is not releasing the HD 7990, according to a lot of rumors going around on the internet, so you would have to rely on OEMs to make them, and so far PowerColor is the only one confirmed making it, and that card's a triple-slot monster.


Yeah, they are all a frame away from each other, so people's misconception about them is bogus. I'm sure in a surround set-up they would separate a bit further, but not by much. The GTX 670 is a GTX 680 with one CUDA core cluster disabled, so it's not slow enough to make a difference to most people. Especially when overclocked, it closes the distance.

You said only slightly faster, and 25% faster in a benchmark that doesn't utilize the cards 100% is substantial. Overclocked GTX 690s can't easily reach past 1200MHz, but GTX 670s can. So, overclocked and in SLI, they will surpass a GTX 690 easily. That's what I'm trying to get at. I'd sell the cards and buy cards you'll actually use, that are faster and have 4GB of VRAM. Sure, they don't look as pretty, but under water and actually performing, who cares.


----------



## V3teran

I love my 690. Then again, I'm gaming on a U2410 at 1920x1200 and I don't plan on going any higher. I tried a U2711 and it was too big for me; at my current resolution the 690 is perfect.


----------



## Marcsrx

Quote:


> Originally Posted by *V3teran*
> 
> I love my 690. Then again, I'm gaming on a U2410 at 1920x1200 and I don't plan on going any higher. I tried a U2711 and it was too big for me; at my current resolution the 690 is perfect.










not sure if serious or trolling... You could have had the same results w/a 670/680 and kept $500 in your wallet.


----------



## Arizonian

Here is an unscientific comparison of the GTX 690 vs GTX 670 Tri-SLI by Linus Tech Tips on YouTube. The only error he made was commenting that the GTX 690 costs up to $1200 in comparison to three GTX 670s in Tri-SLI, when in fact it's very easy to get a GTX 690 for $999.99, as I did. So his dollar-per-value figure is off, because the GTX 690 is $200 less expensive than a $1200 Tri-SLI GTX 670 setup.






For 3D Vision gaming, or just 2D gaming at a 120 Hz refresh rate, the GTX 690 fits like a glove with great performance; it's where it shines. In 3D Vision I'm getting 60 FPS+, right where 3D Vision gaming demands for the best experience. In normal 2D gaming at 120 Hz refresh I'm seeing lows of 90 FPS and averages of 120 FPS+ in the most demanding games with graphics settings maxed.

However, being effectively only 2GB, I can see how, depending on game demand, it might be overwhelmed and insufficient with multiple monitors. If I were using multiple monitors I would much rather go for the 670 or 680 4GB SLI offering for the best performance regardless of game. Price/performance aside, why would anyone invest in multiple monitors and cut themselves short on graphics performance in certain games? Just impractical.


----------



## Divineshadowx

Quote:


> Originally Posted by *Marcsrx*
> 
> 
> 
> 
> 
> 
> 
> 
> not sure if serious or trolling... You could have had the same results w/a 670/680 and kept $500 in your wallet.


Very true. I don't even see a point in getting a 690. I mean, at 1080p a 670 would be enough for all games. For 2560x1600/1440, SLI 660 Tis would give 60fps; save $400 right there. At higher resolutions the 690 would fail because of the VRAM. Maybe in your case, since you have your monitor OC'd to 100Hz, but at that point you might as well get 670/680 4GB SLI and have better performance.


----------



## V3teran

Quote:


> Originally Posted by *Marcsrx*
> 
> 
> 
> 
> 
> 
> 
> 
> not sure if serious or trolling... You could have had the same results w/a 670/680 and kept $500 in your wallet.


I'm serious, and I wanted a 690. I was one of the first on here to get one, and no, you don't get the same performance as a 680, especially when you play modded games like Skyrim and GTA 4, which would bring a single 670/680 to its knees. I also paid £900 for my GTX 690, a lot more than what you paid for yours. I actually had 2 at one point, but only 1 is needed, so I sent one back.

Check out my modded GTA 4 and 2x690's that i had.

690's









GTA 4: a single 670/680 would be crippled with this, and a heavily modded Skyrim with all the HD mods and texture packs is just as demanding. Max Payne 3 maxed out will also cripple a single 680 trying to run at a constant 60fps.


----------



## MrTOOSHORT

GTA 4 doesn't support SLI or Crossfire. It ran butter smooth with my former 680, no problem with all the mods. The 480 I have now can't do it though.

My GTX 680 was at 1.45GHz, btw, when playing GTA 4.


----------



## V3teran

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> GTA4 doesn't support sli or crossfire. It ran butter smooth with my former 680 no problem with all the mods. The 480 I have now can't do it though.
> My gtx680 was at 1.45Ghz btw when playing gta4.


What mods did you use? I mean, I'm using some very heavy mods, and when you max the game out using the mods I'm using it can cripple your card; I know this from people over at Guru3D. I'm using the HD texture pack, fonias roads, L3VO ENB, a car pack, etc. Take a look at my vid: did your GTA 4 look as good as that while running at a constant 60fps?

GTA maxed out is not a problem.
GTA maxed and modded will max your card out, and if you don't have the VRAM you will get serious texture pop-up.
Skyrim is something else: use all the HD texture mods and it maxes my GPUs while using 8x AA, so I know a single card would struggle. The STEP mod has loads of mods; check this out if you want to really max your game out with some great features:
http://skyrim.nexusmods.com/mods/11

Here's another vid of mine.


----------



## MrTOOSHORT

Just using iCEnhancer 2.1.

But still, unless something has changed with the new mods, GTA 4 doesn't support SLI (GTX 690 = SLI 680 chips) or CrossfireX.

Correct me if I'm wrong on the SLI GTA 4 issue.


----------



## Semiregular

So here it is, the link to my build log. You can find the GTX 690 in there (just scroll down a bit):
1) http://www.overclock.net/t/1298715/build-log-the-voyager-watercooled-prodigy-gtx-690/10#post_18024151
2) ASUS


----------



## V3teran

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Just using iCEnhancer 2.1.
> But still, unless something has changed with the new mods, GTA 4 doesn't support SLI (GTX 690 = SLI 680 chips) or CrossfireX.
> Correct me if I'm wrong on the SLI GTA 4 issue.


iCEnhancer is not the only ENB, and it works differently for each person. I had much better results with L3VO; iCEnhancer for me looks oversaturated. I also used many other mods that affect the roads, textures and lighting.

The SLI issue works much better than it did before by using commandline.txt.

Also put this in your commandline.txt:

```
-norestrictions
-nomemrestrict
-percentvidmem 100
-memrestrict 681574400
-memrestrict 629145600
-availablevidmem 4.0
-noprecache
-novblank
```

(-percentvidmem 100 lets the game use 100 percent of your video memory; -availablevidmem 4.0 reports the full 4.0 GB of VRAM as available.)

And also custom anti-aliasing flags in Nvidia Inspector: NVIDIA's original SLI compatibility bits for GTA 4 are "0x03400405", but the new SLI bits "0x42500045" seem to improve SLI scaling nicely.
http://www.forum-3dcenter.org/vbulletin/showpost.php?p=8828733&postcount=26


----------



## Ruby Rabbit

Hi guys,
I really want to water cool my 690. I have read the water cooling results you guys are getting and now want to take the plunge.
My question is: I already have an H100 CPU cooler, and whilst I know it's not the best solution, it does keep my 3930K at a 4.6GHz overclock, 41C idle and 67C load with an ambient of 26C. So I don't really want to change this until I upgrade when Haswell hits.
What I would like to do is just cool the 690, and I was wondering, if I went with the following option:
- Phobya 200mm rad, mounted at the front of my 650D case
- Alphacool D5 pump
- Masterkleer 3/8" tubing
- Mayhems coolant
- 5.25" dual bay res
could I get 30s to low-40s temps on the 690?


----------



## V3teran

I don't think the 690 is worth watercooling unless there's a volt mod, because people that WC them are generally only getting an increase of around 30MHz on the core compared to people on air. Even though the GPU stays very cool at full load, it needs more power put through it to achieve stability once going over a 30MHz increase on the core. This is what put me off from watercooling mine: all the hassle of setting it all up for a small increase... no thanks.


----------



## Ruby Rabbit

Quote:


> Originally Posted by *V3teran*
> 
> I don't think the 690 is worth watercooling unless there's a volt mod, because people that WC them are generally only getting an increase of around 30MHz on the core compared to people on air. Even though the GPU stays very cool at full load, it needs more power put through it to achieve stability once going over a 30MHz increase on the core. This is what put me off from watercooling mine: all the hassle of setting it all up for a small increase... no thanks.


@V3teran - Excellent point! It's just the screaming 690 fan running at 80%-85% just to keep the temps in the low 80s C. It's bugging the crap out of me. But you are right about the small GPU payoff. Maybe I should go the full monty and just cool the CPU as well. That way I can go the 5GHz option.







Again a small payoff on what I already have.

Thanks


----------



## jagz

Quote:


> Originally Posted by *V3teran*
> 
> I don't think the 690 is worth watercooling unless there's a volt mod, because people that WC them are generally only getting an increase of around 30MHz on the core compared to people on air. Even though the GPU stays very cool at full load, it needs more power put through it to achieve stability once going over a 30MHz increase on the core. This is what put me off from watercooling mine: all the hassle of setting it all up for a small increase... no thanks.


80C to 40C for me at 1150MHz going water. Quiet, cool, and it didn't cost too much: just some fittings, a block and a 240 rad. It was a relatively inexpensive project for me to put this thing on water.

For casual use and gaming with a lot of idling, sure, no need really. The stock fan was surprisingly quiet even at some 75%+.


----------



## thestache

Quote:


> Originally Posted by *Ruby Rabbit*
> 
> @ V3teran - Excellent Point! It's just the screaming 690 fan running at 80%-85% just to keep the temp's in the low 80's c. It's bugging the crap out of me. But you are right the small GPU payoff. Maybe I should go the full monte and just cool the CPU as well. This way I can go the 5ghz option
> 
> 
> 
> 
> 
> 
> 
> Again a small payoff on what I already have.
> Thanks


Pointless doing a custom loop for the GPU and not including the CPU for a few dollars extra. You'll need a 240mm rad minimum to cool the GTX 690, a 240mm for the CPU, and I'd recommend the Koolance blocks. These cards throttle once they get too hot, so for that alone and the noisy fan it's worth it. A custom loop done right is priceless and fun to install, so don't be deterred or put it off until Haswell.


----------



## Marcsrx

Quote:


> Originally Posted by *Divineshadowx*
> 
> Very true. I don't even see a point in getting a 690. I mean, at 1080p a 670 would be enough for all games. For 2560x1600/1440, SLI 660 Tis would give 60fps; save $400 right there. At higher resolutions the 690 would fail because of the VRAM. Maybe in your case, since you have your monitor OC'd to 100Hz, but at that point you might as well get 670/680 4GB SLI and have better performance.


I disagree with your point. 1 card, less power, less heat, less noise, less to fail. I <3 my 690 GTX. Lovely card!


----------



## Shogon

Quote:


> Originally Posted by *V3teran*
> 
> I don't think the 690 is worth watercooling unless there's a volt mod, because people that WC them are generally only getting an increase of around 30MHz on the core compared to people on air. Even though the GPU stays very cool at full load, it needs more power put through it to achieve stability once going over a 30MHz increase on the core. This is what put me off from watercooling mine: all the hassle of setting it all up for a small increase... no thanks.


Lol I'm getting +135 on the core and +325 on the memory on water. I could do that on air, except the 28C+ ambient and 50% or more humidity is a killer.
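As a reference point, an offset like that can be turned into an absolute clock with a quick sketch (the 1019 MHz figure is the GTX 690's rated boost clock; the actual running clock varies with power target and temperature, so treat this as approximate):

```python
# Offset-to-clock math for GPU Boost cards (a rough sketch).
# Assumption: GTX 690 rated boost clock of 1019 MHz.
STOCK_BOOST_MHZ = 1019

def boosted_clock(offset_mhz: int) -> int:
    """Approximate running clock for a given software core offset."""
    return STOCK_BOOST_MHZ + offset_mhz

print(boosted_clock(135))  # +135 offset -> ~1154 MHz
```

That lines up with the ~1150 MHz figures others in the thread report.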


----------



## Divineshadowx

Quote:


> Originally Posted by *Marcsrx*
> 
> I disagree with your point. 1 card, less power, less heat, less noise, less to fail. I <3 my 690 GTX. Lovely card!


Less power? Yeah, like 50 watts, which also wouldn't produce much more heat, and when you're water cooling it especially doesn't matter. All 600-series cards are fairly quiet. And less to fail? The VRAM in general is the biggest fail here. I'm not saying it's a bad card, it's great, just that there are better options for $1000. If 4GB 680s/670s didn't exist it would be the deal; that isn't the case though.


----------



## Marcsrx

I guess I don't have a problem w/the VRAM on a single monitor. Kind of a non-issue for me. Not to mention cooling two cards adds to the cost differential in your comparison. As I said, I could not be happier w/my 690.


----------



## Bayu

Yea







I am in da club since yesterday


----------



## Michalius

Quote:


> Originally Posted by *V3teran*
> 
> *I don't think the 690 is worth watercooling* unless there's a volt mod, because people that WC them are generally only getting an increase of around 30MHz on the core compared to people on air. Even though the GPU stays very cool at full load, it needs more power put through it to achieve stability once going over a 30MHz increase on the core. This is what put me off from watercooling mine: all the hassle of setting it all up for a small increase... no thanks.


Are you kidding me? My rig is completely silent now that I've put it under water. At load with even a moderate OC, the thing was crazy loud. In addition to that, you are dealing with added heat being dumped into the system.

Watercooling isn't just about performance.


----------



## V3teran

Quote:


> Originally Posted by *Michalius*
> 
> Are you kidding me? My rig is completely silent now that I've put it under water. At load with even an moderate OC, the thing was crazy loud. In addition to that, you are dealing with added heat being dumped into the system.
> Watercooling isn't just about performance.


Maybe not for you, but for me it's all about performance. I have my 930 running at 4.4GHz with HT on, and it's fully stable in LinX, OCCT, IBT and Prime95. For me watercooling will always be about performance, and anything else positive that comes out of it is an added bonus. Also, the weather here in England is not as hot as where some of you other PC owners are, so I don't need to crank up my GPU fan; 65% fan speed does me in any game that I play. Temps don't go over 78 degrees, most games max out at 75 degrees, and it's all nice and quiet.


----------



## Chamezz

Could someone clarify why the GTX 690 doesn't/can't(?) have 4GB of VRAM per GPU? Does it physically not fit on the board, or is it a cost issue?


----------



## dboythagr8

The Mars III card will have it.

Nvidia chose not to put an extra 2GB on each GPU; it's really that simple.


----------



## Lotek420

Asus told TechPowerUp that the Mars III card was just for show. It will not be a production-run card. JFYI.


----------



## thestache

Quote:


> Originally Posted by *Lotek420*
> 
> Asus told TechPowerup that the Mars III card was just for show. It will not be a production run card. JFYI


That'd be a big mistake.


----------



## PhantomTaco

Quote:


> Originally Posted by *Sexparty*
> 
> That'd be a big mistake.


If I'm not mistaken, didn't NVIDIA tell all the manufacturers that they weren't allowed to change anything about the PCB design or power/speeds of the card including the shroud?


----------



## jprovido

Hi. I just got my GTX 690 a few hours ago. Got it overclocked to 1215 core (boost) and 7200 memory. Is this an average clocking card? Got a feeling I got a golden one here. It slightly beats the overclock I got with my GTX 6*8*0.


----------



## Arizonian

Quote:


> Originally Posted by *jprovido*
> 
> hi. I just got my gtx 690 a few hours ago. got it overclocked to 1215 core(boost) and 7200 memory. is this an average clocking card? got a feeling I got a golden one here. it slightly beats the overclock I got with my gtx 6*8*0


I'd say if your GPU #1 is getting *1215* MHz Core then yes, you do have a great overclocker on air.









What is your second GPU Core Clock getting out of curiosity?

I'm running GPU #1 at *1176* MHz and GPU #2 at *1202* MHz Core, so basically both at *1176* MHz Core is my best on air; I'm only as good as my first GPU's core clock.

Still, even at a low overclock, on a single 1920 x 1080 120 Hz monitor I'm getting lows of 90 FPS and as high as 120+ FPS with gaming settings turned up.


----------



## jprovido

Quote:


> Originally Posted by *Arizonian*
> 
> I'd say if your first GPU #1 is getting *1215* MHz Core then yes you do have a great over clocker on air.
> 
> 
> 
> 
> 
> 
> 
> 
> What is your second GPU Core Clock getting out of curiosity?
> I'm running GPU #1 *1176* MHz and GPU #2 *1202* MHz Core so basically both at *1176* MHz Core is my best on air and only as good as my first GPU Core Clock.
> Still even at a low over clock the GTX 690 for me on single 1920 x 1080 120 Hz monitor I'm getting lowest 90 FPS as high as 120+ FPS gaming settings turned up.


I have them synced so both are at 1215MHz; I think I can push it a tiny bit more. Finally I got a decent clocking card; my last two cards have been a bust. TY SILICON GODS!

The reason I got this card is because I wanted to get 120fps in my games, or at least close to it, on my 120Hz screen. The performance increase over my GTX 680 is incredible, and even with PCI-E 2.0 and my aging X58 system I don't really feel like I'm being bottlenecked that much.









edit;

btw what do you think is a better OC setting for my gtx 690. 4.4ghz HT off or 4.2ghz HT on?


----------



## thestache

Quote:


> Originally Posted by *jprovido*
> 
> hi. I just got my gtx 690 a few hours ago. got it overclocked to 1215 core(boost) and 7200 memory. is this an average clocking card? got a feeling I got a golden one here. it slightly beats the overclock I got with my gtx 6*8*0


All the cores on both of my GTX 690s are clocked at 1212MHz stable, so yours would be a good overclocker in that department. However, I can't get my memory that high on either card; I can only get it to about 6612MHz on both, so that's a pretty damn good memory overclock.

With unlocked voltage I'm sure I could, but that will never happen.

Quote:


> Originally Posted by *PhantomTaco*
> If I'm not mistaken, didn't NVIDIA tell all the manufacturers that they weren't allowed to change anything about the PCB design or power/speeds of the card including the shroud?


Yeah, they did say that about the GTX 690, but the Mars III is hardly a GTX 690. It's a custom-PCB GTX 680 X2, which should be fair game. However, they might have altered the PCB for full voltage control, and if I understand it correctly Nvidia needs to approve the card before sale, so that might be holding it back. Or Nvidia might not like how much performance it has to offer and could see it taking sales from the GTX 690, etc. Who knows. Whoever is in charge of business decisions for these guys is an idiot, just like with locked voltage control and stopping MSI from offering software voltage control.


----------



## thestache

Quote:


> Originally Posted by *jprovido*
> 
> I have them synced so both are at 1215mhz I think I can push it a tiny bit more. finally I got a decent clocking card. my last two cards has been a bust. TY SILICON GODS!
> the reason I got this card is because i wanted to get 120fps on my games or atleast close to it on my 120hz screen. the performance increase is incredible from my gtx 680 and even with a pci-e 2.0 and my aging x58 system I don't really feel like I'm being bottlencked that much
> 
> 
> 
> 
> 
> 
> 
> 
> edit;
> btw what do you think is a better OC setting for my gtx 690. 4.4ghz HT off or 4.2ghz HT on?


Why do you have to lower your clock speed to enable HT? I understand it requires more voltage, but what's the limitation?

I'd say the faster core speed would be more desirable. HT makes no difference in most games (it can, but in most it doesn't).


----------



## PhantomTaco

Quote:


> Originally Posted by *Sexparty*
> 
> All the cores on both of my GTX 690s are clocked at 1212mhz stable. So yours would be an good overclocker in that department. However I can't get my memory that high on either card. Can only get it to about 6612mhz on both, so thats a pretty dam good memory overclock.
> With unlocked voltage I'm sure I could but that will never happen.
> Yeah they did say that about the GTX 690 but the MARS3 is hardly a GTX 690. It's a custom PCB GTX 680 X2. Which should be fair game, however they might have altered the PCB for full voltage control and if I understand it correctly Nvidia needs to approve the card before sale so that might be holding it back or even Nvidia might not like how much performance it has to offer and could see it taking sales from the GTX 690 etc. Who knows. Whoever is in charge of buisness decisions for these guys is an idiot and who knows the reason, just like locked voltage control and stopping MSI from offering software voltage control.


Jesus, that thing has three 8-pin power connectors? That's insane. How many watts does it pull?! Ah, just found it... 525 watts... compared to the 690's 300 watts. Wait, does that even make sense? That's a lot of extra power for overclocking...
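For what it's worth, those two numbers measure different things: 300 W is the 690's TDP, while 525 W is the ceiling of the Mars III's power connectors. A quick sketch of the budget math (75 W from the slot and 150 W per 8-pin are the PCIe spec limits, not measured draw):

```python
# PCIe power-budget math (spec ceilings, not measured draw).
SLOT_W = 75        # power available from the x16 slot itself
EIGHT_PIN_W = 150  # per 8-pin PCIe auxiliary connector

def board_budget(eight_pins: int) -> int:
    """Maximum in-spec power for a card with the given 8-pin count."""
    return SLOT_W + eight_pins * EIGHT_PIN_W

print(board_budget(2))  # GTX 690 (two 8-pins): 375 W ceiling vs. its 300 W TDP
print(board_budget(3))  # Mars III (three 8-pins): 525 W ceiling
```

So the extra connector buys overclocking headroom, not a 525 W stock draw.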


----------



## jprovido

Quote:


> Originally Posted by *PhantomTaco*
> 
> Jesus that thing has 3 8 pin power connectors? That's insane. How many watts does it pull?! Ah just found it...525 watts...compared to the 690s 300 watts. Wait, does that even make sense? That's a lot of extra power for overclocking...


I came from two GTX 480s, and they were power-hungry, energy-sucking, global-warming-inducing machines. I have a watt meter, and at full load that setup pulled 800W from the wall. The GTX 690 is very power efficient and hardly goes above 500 watts, and the performance isn't even close; not to mention my system is overclocked, and X58 systems aren't really power efficient.


----------



## jagz

Quote:


> Originally Posted by *jprovido*
> 
> hi. I just got my gtx 690 a few hours ago. got it overclocked to 1215 core(boost) and 7200 memory. is this an average clocking card? got a feeling I got a golden one here. it slightly beats the overclock I got with my gtx 6*8*0


Quote:


> Originally Posted by *Arizonian*
> 
> I'd say if your first GPU #1 is getting *1215* MHz Core then yes you do have a great over clocker on air.
> 
> 
> 
> 
> 
> 
> 
> 
> What is your second GPU Core Clock getting out of curiosity?
> I'm running GPU #1 *1176* MHz and GPU #2 *1202* MHz Core so basically both at *1176* MHz Core is my best on air and only as good as my first GPU Core Clock.


Quote:


> Originally Posted by *Sexparty*
> 
> All the cores on both of my GTX 690s are clocked at 1212mhz stable.


All 3 of you have better-clocking 690s than mine... I top out at 1150MHz.


----------



## Divineshadowx

Is 2x 4GB 680s enough for 7680x1440? Or should I wait for GK110 or whatever it is. I really want to get rid of my second 690; it is so worthless if not benchmarking. And at that point I would probably invest in monitors. I could go for water, but my current 1080p seems noob.
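On the VRAM side of that question, a back-of-envelope sketch helps (assuming 4 bytes per pixel per buffer; real usage is dominated by textures and AA render targets, so treat these as floors, not estimates):

```python
# Back-of-envelope framebuffer math for 7680x1440 surround.
# Assumption: 4 bytes per pixel per full-resolution buffer.
def framebuffer_mb(width: int, height: int, buffers: int = 3,
                   bytes_per_px: int = 4) -> float:
    """MB consumed by `buffers` full-resolution render targets."""
    return width * height * bytes_per_px * buffers / 1024**2

print(round(framebuffer_mb(7680, 1440)))              # ~127 MB for bare buffers
print(round(framebuffer_mb(7680, 1440, buffers=12)))  # 4x MSAA-style multiplier: ~506 MB
```

Even the padded figure fits in 2 GB; it's high-resolution textures stacked on top of this that push modded games past it, which is why the 4GB cards keep coming up.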


----------



## Qu1ckset

Quote:


> Originally Posted by *Divineshadowx*
> 
> Is 2x 4gb 680's enough for 7680x1440? Or should I wait for gk110 or w/e it is. I really want to get rid of my second 690, it is so worthless if not benchmarking. And at that point I would probably invest in monitors, I could go for water but my current 1080p seems noob


I'd sell your second GTX 690, save your money, and wait for the 780 and, if they make it, the 790.


----------



## Divineshadowx

Quote:


> Originally Posted by *Qu1ckset*
> 
> Id sell your second gtx 690, save your money and wait for the 780 and if they make it the 790


So sell my second 690, get a 2560x1440 monitor, and then when GK110 comes I'll sell my other 690, and with the money I got from my second one I'll add two more monitors and upgrade to GK110? Sounds good to me.


----------



## Semiregular

Quote:


> Originally Posted by *Divineshadowx*
> 
> So sell my second 690, get a 2560x1440 monitor, and then when gk110 comes i'll sell my other 690 and with the money I got from my second one i'll add two more monitors and upgrade to gk110? Sounds good to me.


Sounds like a great plan


----------



## jprovido

Got a weird problem with my GTX 690: it coil whines when it's *at idle* but totally disappears at full load. I've never heard of coil whine the other way around like this. It's kinda annoying tbh.


----------



## Semiregular

Quote:


> Originally Posted by *jprovido*
> 
> got a weird problem with my gtx 690. it coil whines when it's *at idle* but totally disappears when at full load. never heard of coilwhine the other way around like this . it's kinda annoying tbh


Sounds like what Linus has with his motherboard:




It's at about 1:20. I don't have a solution though.







(maybe add some noise dampening to the case or send it back for a new one)


----------



## jprovido

Quote:


> Originally Posted by *Semiregular*
> 
> Sounds like what linus has with his motherboard:
> 
> 
> 
> 
> it's at about 1:20. I don't have a solution though
> 
> 
> 
> 
> 
> 
> 
> (maybe add some noise dampening to the case or send it back for a new one)


That's exactly the same sound I'm hearing, and I'm 100% sure it's from the GPU. I've read somewhere that I should leave it at load overnight, preferably in a game, then see if it disappears; I'm going to try it later. It's almost inaudible when I close my side panel, but it still annoys me that I paid a big premium for a card and still got an issue like this, albeit a minor one.


----------



## Divineshadowx

Oh look, I'm back to my old RAM problems, amazing. Now my computer boots at XMP and crashes at the Windows loading screen, even at 1.65V, so I'm back to 1600MHz. My first kit, with a different mobo, wouldn't run dual channel; the one after that, with a new mobo, randomly crashed at XMP; and now this one doesn't work at all. I hate RAM.


----------



## egotrippin

Quote:


> Originally Posted by *jprovido*
> 
> got a weird problem with my gtx 690. it coil whines when it's *at idle* but totally disappears when at full load. never heard of coilwhine the other way around like this . it's kinda annoying tbh


I've had the opposite problem: silent at idle and while gaming or benchmarking, but there's a high-pitched whine when folding. If I adjust the power target or GPU clock up or down, the pitch of the whine goes up or down.


----------



## jprovido

Quote:


> Originally Posted by *egotrippin*
> 
> I've had an opposite problem... silent at idle and while gaming or benchmarking but there's a high pitch whine when folding. If I adjust the power target or gpu clock up or down, the pitch of the whine will go up or down.


I'd gladly prefer a card that whines at load over one that whines at idle; mine is really annoying, and 90% of the time you're at the desktop anyway. I ran Crysis 2 for 8 hours and the coil whine disappeared for a while, but it came back just now, so I thought I'd post it here. I'm going to try leaving it longer this time, with a heavier load too.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Oh look, i'm back to my old ram problems, amazing. Now my comp boots at xmp and crashes at windows screen. Even at 1.65v, back to 1600mhz. My first stick with a different mobo wouldnt run dual channel, my one after that with a new mobo randomly crashed at xmp, and now this one doesn't even work. I hate ram.


What changed?


----------



## Divineshadowx

Quote:


> Originally Posted by *thestache*
> 
> What changed?


What do you mean, what changed? I didn't change anything. Sometimes it boots at XMP, sometimes it doesn't and needs a few restarts; now it doesn't load Windows at all. So I changed it to 1600MHz...


----------



## jagz

I seemingly can't play Wake Island in BF3. I've never had this error prior to adding the 690.










I've already checked that DirectX is up to date, among other things.


----------



## PhantomTaco

Hey guys, I have a favor/question to ask. Does anyone have a 690 that's currently not installed, or has anyone just gotten one and is planning to put a waterblock on it? If so, could you measure and send me the locations of the eight screws around the GPUs (four per GPU)? I've been talking with Dwood about making a backplate, but I really, really don't want to take apart my loop and pull my card out to measure (I just got things up and running a little under a week ago). Any help is much appreciated, thanks.


----------



## Sujeto 1

Guys, which components would you recommend to match with a GTX 690? I need advice on an entire rig built around it.


----------



## Arizonian

Quote:


> Originally Posted by *Sujeto 1*
> 
> Guys which components you would recommend me to match with a GTX 690? i need advice in an entire rig using the GTX 690.


Well, IMO anything from an Intel 2500K on up would be great, with at least 8GB of 1600MHz RAM. A 750W PSU or higher would run your GTX 690 nicely. You'll be styling for quite some time on anything 2560x1440 or less.

Good luck with the new build you're considering.









Edited to add: Personally I felt more comfortable with a good quality 850 Watt PSU.


----------



## jprovido

I'm running OK with an older i7 950 at 4.4GHz with HT off; the bottleneck seems minimal.


----------



## FiShBuRn

Now with the backplate that I got from egotrippin.


----------



## Yerph

Proud owner of a GTX 690.



The brand is Gainward.


----------



## Sujeto 1

OK, just a 2500K? I was thinking you'd all be matching it with Sandy Bridge-E and 32GB of RAM or the like; since the GTX 690 is a top video card, I was guessing you'd be picking top components too.


----------



## jprovido

Quote:


> Originally Posted by *Sujeto 1*
> 
> Ok just 2500k? i was thinking you all were matching Sandy Bridge- E and 32 GB ram or such of things as GTX 690 is a top video card, i was guessing you were pickin top components also.


TBH, I was a little worried when I was buying a GTX 690. First of all, I had an older, very aging X58 system with an i7 950 and only PCIe 2.0 x16. I created a thread here on OCN and members advised me it wouldn't bottleneck as much as I thought it would, and voila, when I got the card the performance was awesome. I'm only playing at 1080p on my 27" 120Hz monitor, and my GPU usage is pretty high in demanding games, especially once I turned off HT and clocked my i7 a little higher at 4.4GHz. 120fps + 120Hz monitor = pr0n.


----------



## thestache

Quote:


> Originally Posted by *Sujeto 1*
> 
> Ok just 2500k? i was thinking you all were matching Sandy Bridge- E and 32 GB ram or such of things as GTX 690 is a top video card, i was guessing you were pickin top components also.


Honestly, don't buy one. There are few scenarios a GTX 690 is good for, 120Hz and 1440p being the only ones, and even then you're VRAM-limited in games like Skyrim and Max Payne. The HD 7970 with 3GB of VRAM is a much better choice, and cheaper too. It's just a shame AMD are unreliable when it comes to drivers.

Get 4GB GTX 680 SLI if you're at 1440p or higher, or just a single MSI Lightning for 1200p or lower. Otherwise it's just a waste of money. And just stick with Z77; it's more than fast enough for games. Unless you need the features of X79, which most people don't, don't worry about it.

Don't get me wrong, I love mine; it's just not the premium card they made it out to be. The VRAM limitation and small memory bus kill it.


----------



## jprovido

Quote:


> Originally Posted by *thestache*
> 
> Honestly don't buy one. There are few scenarios a GTX 690 is good for. 120hz and 1440P being the only ones and even then you're VRAM limited in games like skyrim and max payne. The HD 7970 with 3GB of VRAM is a much better choice and cheaper too. Just a shame AMD are unreliable when it comes to drivers.
> Get 4GB GTX 680 SLI if your at 1440P or higher or just a single MSI Lightening for 1200P or lower. Otherwise it's just a waste of money. And just stick to Z77. It's more than fast enough for games. Unless you need the features of X79 which most don't, don't worry about it.
> Don't get me wrong I love mine it's just not the premium card they made it out to be. The VRAM limitation and small memory bus kills it.


I agree. I would've gotten the 7990 Devil 13 if it were available where I'm from; it's just an overall better card.


----------



## Sujeto 1

Quote:


> Originally Posted by *jprovido*
> 
> I agree. would've gotten the 7990 devil 13 if it was available from where I'm from. just an overall better card


Really? I saw reviews of the 7990 Devil; that thing is huge, power-hungry, and hot. According to reviews it's a bit faster than the GTX 690 but louder. I wouldn't put that thing in my case, never in my life; I think AMD should work on a more elegant cooling solution. For 1000 USD? 2x 7970 is much cheaper.


----------



## Divineshadowx

Quote:


> Originally Posted by *Sujeto 1*
> 
> really? i saw reviews of 7990 devil that thing is too huge and hungry and horny. according to reviews it's a bit faster than GTX 690 but louder, i woud'nt put that thing on my case never in my life, i think AMD should work on a more elegant cooling solution. for 1000 USD? 2 x 7970 are much cheaper.


Why is a 3-slot card such a huge problem? You can only use one, so it's still smaller than two cards taking 4 slots. And I doubt it beats a 690 in outright performance, as 680 SLI beats 7970 CrossFire in almost every test, so at best it should be even. The only decent choice right now is probably 670/680 SLI/tri-SLI, or waiting for GK110, whenever the hell that comes out.


----------



## MrTOOSHORT

Anyone know if there is a 1.21V modded BIOS for the EVGA GTX 690? I have one on hold at a local computer store and may pick it up tomorrow. Really thinking hard about it!


----------



## burningrave101

Quote:


> Originally Posted by *jprovido*
> 
> I agree. would've gotten the 7990 devil 13 if it was available from where I'm from. just an overall better card


I'm curious as to what exactly would make you think Powercolor's 7990 is a better card than the GTX 690.


----------



## DeadLink

Quick question for the GTX 690 owners with single 2560x1440 monitors: for BF3, would you say you can manage a 60-80 FPS minimum with settings on ultra? If you were to increase the AA and AF settings, would the FPS start dipping below 60?

For owners with two GTX 690s, where would the bottlenecking begin with certain CPUs? I.e., does the 3770K seem a comfortable fit for something along the lines of a quad-SLI setup, or would the 3930K/3960X be better suited to handle 4-way SLI?

Thanks for your time,
-Matt


----------



## thestache

Quote:


> Originally Posted by *DeadLink*
> 
> Quick question for the GTX 690 owners with single 2560x1440P monitors. For BF3, would you say that you can manage 60-80 FPS minimum at settings on ultra? Considering AA and AF if you were to increase those settings would the FPS start dipping below 60?
> For owners with 2 GTX 690's where would the hindering begin with certain CPU's. IE the 3770K, does that CPU seem to be a comfortable fit with something along the lines of a Quad SLI setup or would the 3930K/3960x be better suited to handle 4 Way SLI?
> Thanks for your time,
> -Matt


A single GTX 690 is fine, but I'd get GTX 670 SLI instead; it's cheaper and will offer the same performance, plus better overclocking potential, and you can even use the voltage-unlocked BIOS.

Gone are the days when you needed more than one GPU for 1080p 60FPS, so 1440p 60FPS is easily handled by two GPUs.

http://www.guru3d.com/article/evga-geforce-gtx-690-review/19


----------



## thestache

Quote:


> Originally Posted by *burningrave101*
> 
> I'm curious as to what exactly would make you think Powercolor's 7990 is a better card than the GTX 690.


The extra VRAM and larger bus make it usable for Eyefinity, whereas the GTX 690 struggles in a surround setup. Unlocked voltage control and better overclocking. All reasons it could be considered better.

However, AMD drivers are a good reason why it could be considered not as good. The latest GeForce drivers have much-needed improvements but are still full of bugs at the desktop; Flash videos flash green for me and are barely watchable.

I'd say they are even when you weigh the pros and cons. However, I'm leaning towards HIS HD 7970 X Edition crossfire and its pros to power my surround setup until the GTX 780/770 comes out.


----------



## DeadLink

The price difference between two 670s and a 690, with waterblocks, is about 100 bucks. So if I can get the same performance, stock for stock, with one video card rather than two, I'd prefer the single-PCB solution: less wiring and less work overall. Even 680 SLI averages only ~1 FPS above the 690.


----------



## jprovido

Quote:


> Originally Posted by *thestache*
> 
> Single GTX 690 is fine but I'd get GTX 670 SLI instead, it's cheaper and will offer the same performance plus better overclocking potential/can even use the voltage unlocked BIOS.
> Gone are the days when you need more then one GPU for 1080P 60FPS. So 1440P 60FPS is easily handled by two GPUs.
> http://www.guru3d.com/article/evga-geforce-gtx-690-review/19


I'm playing on a single 1080p 120Hz monitor; pushing 120fps is hard even with an overclocked GTX 690







(max settings with max AA, though). People sometimes overrate cards; for some reason I'm not as blown away as I was when I first got my HD 5970 a few years ago. Dunno if it's just me, lol.


----------



## thestache

Quote:


> Originally Posted by *DeadLink*
> 
> For two 670's Vs 690 in price with waterblocks is ~100 bucks. So if I can get the same performance stock for stock comparison with one video card rather than 2 I would prefer the single PCB solution. Less wiring and Less work entirely to deal with. Even the 680 SLI is only ~1 FPS average above the 690.


It's not that simple, and two GPUs on two separate cards will always offer better performance, but if that's what you want, then by all means spend your money that way.

As for quad SLI, it's a waste of money, and anyone who does it is doing just that: wasting money. (To answer the CPU question: my 2700K at 5000MHz handled quad SLI fine.)


----------



## DeadLink

Quote:


> Originally Posted by *thestache*
> 
> It's not that simple and two GPUs on two seperate cards will always offer better performance but if thats what you want then by all means spend your money that way.
> As for quad SLI it's a waste of money and anyone who does it, is just that. Wasting money. My 2700k at 5000mhz handled quad SLI fine.


From the link you provided, the two GTX 670s did not perform better than the single GTX 690; in fact, the total power usage of two 670s is higher by at least 26 watts as well. As far as overclocking goes, that is up to my dad, but with two cards to add instead of one and no plans for additional upgrades for a while, the two-card option still doesn't offer anything over the single PCB besides OC headroom, and that is not of any interest at this time. (Still taking into account that the power figures are at 100% load.)

I would still appreciate owners who use 2560x1440 posting their experience with this, please. Thanks again,

-Matt


----------



## pilla99

Quote:


> Originally Posted by *DeadLink*
> 
> Quick question for the GTX 690 owners with single 2560x1440P monitors. For BF3, would you say that you can manage 60-80 FPS minimum at settings on ultra? Considering AA and AF if you were to increase those settings would the FPS start dipping below 60?
> For owners with 2 GTX 690's where would the hindering begin with certain CPU's. IE the 3770K, does that CPU seem to be a comfortable fit with something along the lines of a Quad SLI setup or would the 3930K/3960x be better suited to handle 4 Way SLI?
> Thanks for your time,
> -Matt


Yes. With an overclock I hold a 60FPS minimum on ultra at 1440p.


----------



## DeadLink

If you don't mind sharing, could you PM me the settings and clocks you use?


----------



## pilla99

Quote:


> Originally Posted by *DeadLink*
> 
> If you dont mind sharing could you possibly PM me your settings and clocks that you use?


These OC settings have been great for me all summer. I currently have my CPU at only a 4.2GHz clock; however, I saw a 60FPS minimum even at the stock 3.8GHz speed.
Just set everything to ultra, get the latest beta drivers, and you should carve through the game.


----------



## DeadLink

Thank you^.


----------



## rationalthinking

Question here..

I have a 3 x 23" Monitor setup but really hate playing in surround. So I use the center monitor to game with the other two streaming movies and such.

When I select Use All Displays in the NVIDIA control panel, my 690 is taken out of SLI. Now the card is really going to waste: I'm effectively playing on one 680, with the other 680 doing PhysX.

The only options to keep it in SLI are to run Surround or use one monitor.

Am I going about this wrong or is there another option?

Thanks in advance!


----------



## Sujeto 1

Quote:


> Originally Posted by *burningrave101*
> 
> I'm curious as to what exactly would make you think Powercolor's 7990 is a better card than the GTX 690.


Well, clearly the reviews say so. "Better" is a relative word, but AMD gives more FPS; too bad the drivers need to be fixed. The cons are that it's hotter and bigger, at the same price as the GTX 690.


----------



## Marcsrx

Quote:


> Originally Posted by *DeadLink*
> 
> Quick question for the GTX 690 owners with single 2560x1440P monitors. For BF3, would you say that you can manage 60-80 FPS minimum at settings on ultra? Considering AA and AF if you were to increase those settings would the FPS start dipping below 60?
> For owners with 2 GTX 690's where would the hindering begin with certain CPU's. IE the 3770K, does that CPU seem to be a comfortable fit with something along the lines of a Quad SLI setup or would the 3930K/3960x be better suited to handle 4 Way SLI?
> Thanks for your time,
> -Matt


This is how I play: all maxed at 1440p @ 100Hz, typically around 80-120 fps, no overclock here. I've been reading that I should turn AA off with a 1440p monitor, as it's not as necessary with the extra pixels. Going to try it and see if that further elevates my average FPS.


----------



## iARDAs

Quote:


> Originally Posted by *Marcsrx*
> 
> This is how I play, all maxed 1440p @ 100hz, typically around 80-120 fps. No overclock here. I'm reading that I should turn AA off w/a 1440p monitor as it is not necessary w/the extra pixels.. Going to try and see if that further elevates my average FPS.


Yeah, AA is barely noticeable at 1440p. Turn it off, I would say; I don't believe it's worth the performance drop.

In another thread, however, someone mentioned that AA is not only for eliminating jaggies but also does something with lighting? If so, I still don't notice much difference.

I play all my games without AA of any kind, including FXAA. I mean, I can see the benefit, but it's not worth such a performance drop.

Also, with AA my VRAM usage hits over 2200MB in BF3 multiplayer. You might turn it off to keep VRAM below 2000MB.


----------



## Marcsrx

Yeah, I think with a decent OC on the CPU and 690, plus turning off AA, I'm hoping to sit much closer to 100fps on average to meet the refresh rate of the Catleap. I'm really not that far off right now...


----------



## iARDAs

Quote:


> Originally Posted by *Marcsrx*
> 
> Yeah I think w/a decent OC on the CPU and 690 + turning off AA, I'm hoping to sit much closer to 100fps on average to meet the refresh rate of the catleap. I'm really not that far off right now...


Lucky that you have the 2B version; my Catleap almost died when I tried to take it to 65Hz.


----------



## thestache

Quote:


> Originally Posted by *rationalthinking*
> 
> Question here..
> I have a 3 x 23" Monitor setup but really hate playing in surround. So I use the center monitor to game with the other two streaming movies and such.
> When I select Use All Displays in the nVidia control panel my 690 is taken out of SLi. Now, the card is really a waste and just playing on 1 680 and the other 680 as PhysX.
> The only option to keep it in SLi is to do Surround or use 1 monitor.
> Am I going about this wrong or is there another option?
> Thanks in advance!


Sounds like your drivers.

I've had similar problems with new drivers and had to roll back. You should be able to activate Surround with all three monitors connected via DVI, no dramas. Always.


----------



## juanP

deleted


----------



## Divineshadowx

Quote:


> Originally Posted by *iARDAs*
> 
> Yeah AA is barely noticeable in 1440p. Turn it off I would say. I dont believe it is worth the performance drop.
> 
> In another thread however someone mentioned that AA is not only for eleminating jaggies but also do something with lighting? If so I still dont notice the difference much.
> 
> I play all my games without AA of any kind including FXAA. I never see the benefit. I mean I see it but its not worth such a performance drop.
> 
> Also with AA, my vram usage hits over 2200 in BF3 multi. YOu Might turn it off to have Vram below 2000


You have a 670 and you play all your games without AA? OK then; my 5770 CrossFire setup got 60fps in BF3 without AA, and a 670 is probably 3x stronger than that.


----------



## dboythagr8

How can people say AA is barely noticeable at 1440p? In BF3 especially, I definitely notice when it's on or off, and I'm at 1600p...


----------



## iARDAs

Quote:


> Originally Posted by *Divineshadowx*
> 
> You have a 670 and you play all your games w/o AA? Ok then, my 5770crossfire got 60fps in bf3 w/o aa and a 670 is probably 3x stronger than that.


I was talking about 1440p gaming. With 4xMSAA, at times the game would drop to 30 fps when the battle got heated.

690s also handle that game great, but I wonder if the VRAM cap would create issues; in one scenario I saw my VRAM usage in BF3 spike to 2350MB.

Quote:


> Originally Posted by *dboythagr8*
> 
> How can people say AA is barely noticeable at 1440p? Especially in BF3 I definitely notice when it's on or off and I'm at 1600p...


Honestly, we have been debating this over at the 1440p+ Gaming Club, and most users believe the same thing. I mean, YES, there is a difference between no AA and 2xMSAA, but the difference is not worth the extra performance drop, and at 1440p you already need roughly 30-40% more GPU power.

I also notice the difference in BF3, but it's not vast, in my opinion.

I've been playing Mafia 2, Dirt 2 and BF3 lately, and I am perfectly fine without AA.

But 2xMSAA vs 4xMSAA, I see no difference; that's just me.

(Sorry for going off topic, btw, guys. I know this is the 690 thread.)
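For what it's worth, the raw pixel counts back this up; here's a quick back-of-the-envelope sketch. Pixel count alone is only a rough proxy for GPU load, so treat the numbers as illustrative:

```python
# Rough pixel-count comparison between common resolutions.
# Pixel count is only a proxy for GPU load; AA, the game engine and
# CPU limits all shift the real numbers.

def pixels(width, height):
    """Total pixels rendered per frame."""
    return width * height

def relative_load(w1, h1, w2, h2):
    """How many times more pixels resolution 1 pushes than resolution 2."""
    return pixels(w1, h1) / pixels(w2, h2)

if __name__ == "__main__":
    # 1440p vs 1080p: ~1.78x the pixels, hence the extra GPU power needed
    print(round(relative_load(2560, 1440, 1920, 1080), 2))  # 1.78
    # Triple-1440p surround vs a single 1440p panel: exactly 3x
    print(round(relative_load(7680, 1440, 2560, 1440), 2))  # 3.0
```

So a triple-1440p surround setup pushes three times the pixels of one panel, which is why the VRAM and bus complaints in this thread bite hardest there.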


----------



## Divineshadowx

Quote:


> Originally Posted by *iARDAs*
> 
> I was talking about 1440p gaming. With 4xMSAA at times the game would drop to 30 fps when the battle is heated.
> 
> 690s also are handling that game great but I wonder if the Vram cap would create issues. At one scenario I saw my Vram usage in BF3 spiking to 2350.
> Honestly we have been debating this over at 1440p+ Gaming Club and most users also believe the same thing. I mean YES there is a difference between no AA and 2xMSAA, but the difference is not worth the extra performance drop and in 1440p gaming you need roughylu 30-40% more GPU power.
> 
> I also notice the difference in BF3, but its not vast in my opinion.
> 
> I've been playing Mafia 2, Dirt 2 and BF3 lately and I am perfectly fine witout AA.
> 
> But 2xMSaa vs 4xMSaa i see no difference but thats just me.
> 
> (sorry for going out of topic btw guys. I know this is the 690 thread


The 690 is perfect for 1440p/1600p; anything more and the VRAM limits it. The 690 can run any game maxed with 4xMSAA and achieve 60+fps. It's a different story, of course, for the 670: there is no point in AA if you are getting below 60fps, so that should probably be your first goal. 2GB of VRAM is fine for 1440p with MSAA, though probably not for crazy SSAA and SGSSAA, but those aren't very optimized anyway. You can see the AA, though, IMO; you've got to look for it sometimes, but it does improve the overall picture. Anyway, there isn't much to talk about other than NVIDIA failing to give the 690 proper VRAM and bus width...
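As a rough sketch of why MSAA eats into that 2GB at 1440p, here's a back-of-the-envelope framebuffer estimate. The 4 bytes/pixel figures for color and depth are assumptions for illustration; real engines add G-buffers, shadow maps and textures on top of this:

```python
# Back-of-the-envelope MSAA framebuffer cost at a given resolution.
# Assumes 4 bytes/pixel for the color buffer and 4 bytes/pixel for
# depth/stencil (e.g. RGBA8 and D24S8); real renderers vary widely.

BYTES_PER_PIXEL = 4

def msaa_framebuffer_mb(width, height, samples):
    """Approximate color + depth framebuffer size in MiB with MSAA."""
    color = width * height * BYTES_PER_PIXEL * samples
    depth = width * height * BYTES_PER_PIXEL * samples
    return (color + depth) / (1024 * 1024)

if __name__ == "__main__":
    # 2560x1440, no MSAA vs 4x MSAA
    print(round(msaa_framebuffer_mb(2560, 1440, 1), 1))  # 28.1 MiB
    print(round(msaa_framebuffer_mb(2560, 1440, 4), 1))  # 112.5 MiB
```

The framebuffers alone roughly quadruple with 4xMSAA, and with multiple render targets in a deferred engine like BF3's, that multiplier lands on a much larger base, which is how usage creeps past 2000MB.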


----------



## iARDAs

Quote:


> Originally Posted by *Divineshadowx*
> 
> 690 is perfect for 1440p/1600p, anything more and the vram limits it. But the 690 can run any game maxed with 4xmsaa and achieve 60+fps, different story ofc for the 670, there is no point in AA if you are getting below 60fps, that should probably be your first goal. 2Gb vram is fine for 1440p with msaa, probably not crazy ssaa and sgssaa, but that isn't very optimized. You can see the AA though imo, got to look for it sometimes but if does improve the overall picture. Anyway, there isn't much to talk about other than Nvidia failing to give the 690 proper vram and bus width...


I wish selling a dual-GPU card were easy in Turkey; then I would perhaps go for a 690.

I always liked dual-GPU cards, but it was a nightmare to sell my 590 last year. I lost tons of money on it and barely sold it.

That was the only reason I stayed with single-GPU cards this generation, and probably will for the next generations as well.

For me, dual-GPU cards have always been a technological marvel, and I like their horsepower and how much framerate they produce in the end, especially at 1440p. I have yet to see anyone unhappy with a 690 at 1440p so far. I wonder if there will be a 695 with more VRAM in the future?


----------



## gizmo83

Hi guys, I'm looking for an LED controller utility for my Gigabyte 690. Can you point me to anything like the EVGA utility? Thank you!


----------



## Kyouki

For the LED controller on the GTX 690: the link below is from the home page of this club. The EVGA utility works with all GTX 690s.

ftp://ftp.evga.com/utilities/EVGA_LED_Controller.zip


----------



## jprovido

Quote:


> Originally Posted by *Kyouki*
> 
> For the LED controler on the GTX 690, The link is from the home page of this club. The EVGA utility works with all the GTX 690's
> ftp://ftp.evga.com/utilities/EVGA_LED_Controller.zip


it's a cool little program.


----------



## gizmo83

Quote:


> Originally Posted by *Kyouki*
> 
> For the LED controler on the GTX 690, The link is from the home page of this club. The EVGA utility works with all the GTX 690's
> ftp://ftp.evga.com/utilities/EVGA_LED_Controller.zip


The program says that it did not find an EVGA card







and doesn't start







help me...


----------



## gizmo83

Can nobody help me?


----------



## n99127

So I've been trying to get my GTX 690 quad-SLI setup overclocked.

It seems the most I can currently pull off is about a +150 offset on the GPUs and about +250 on the memory clocks. I tried +200/+500 but couldn't get the system to remain stable, which I thought was odd given that the fans were barely past 40-50% speed (not loud enough to be really audible). I set the power target to 135% as well.

Any suggestions on what I could possibly do?
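For reference, the arithmetic behind those offsets, sketched with the GTX 690's published stock clocks (915MHz base, 1019MHz typical boost, 6008MHz effective memory). Actual boost bins vary per card with power and thermal headroom, so these are nominal figures only:

```python
# Resulting clocks from an overclocking offset, using the GTX 690's
# published stock clocks. Actual boost behaviour depends on the
# power/thermal headroom of each individual card.

BASE_MHZ = 915
BOOST_MHZ = 1019       # typical boost clock
STOCK_MEM_MHZ = 6008   # effective (data-rate) memory clock

def with_offset(stock_mhz, offset_mhz):
    """Clock after applying a Precision/Afterburner-style offset."""
    return stock_mhz + offset_mhz

if __name__ == "__main__":
    print(with_offset(BOOST_MHZ, 150))      # 1169 MHz typical boost
    # assuming the memory offset is expressed in effective MHz:
    print(with_offset(STOCK_MEM_MHZ, 250))  # 6258 MHz effective
```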


----------



## thestache

Quote:


> Originally Posted by *n99127*
> 
> So I've been trying to get my GTX 690 Quad SLI setup overclocked.
> It seems that currently the most I can pull off is about a +150 offset on the GPUs and about +250 on the memory clocks. I think I tried +200/+500, but couldn't get the system to remain stable, which I thought was odd given that the fans were barely past maybe 40%-50% speed (not loud enough to be really audible). I set the power target to 135% as well.
> Any suggestions on what I could possibly do?


Nope, that's what should happen.

+150 on the cores and +250 on the memory are good results for quad SLI. Crank the fans up enough that they keep the GPUs at 70°C under full load and you'll never have them overheating and throttling (even though I doubt you'd manage to in quad SLI).


----------



## n99127

Hmm, well, that kinda sucks, I suppose. Oh well. I was looking into getting a water cooling system for my GTX 690s, but we'll see. The temperature where I live is starting to drop pretty dramatically, so I might be able to just rely on, uh... "external cooling."

I suppose trying to push the quad-SLI system isn't really worth it anyway; I can't really think of any games that would absolutely require that kind of power.


----------



## Kyouki

Quote:


> Originally Posted by *gizmo83*
> 
> the program say that not found an evga card
> 
> 
> 
> 
> 
> 
> 
> and doesn't start
> 
> 
> 
> 
> 
> 
> 
> help me...


I'm not sure; I was advised, and figured, that it worked with all GTX 690s, since they're the same card just branded by whoever sells them. I'll do some searching around to see if this is common.


----------



## gizmo83

Quote:


> Originally Posted by *Kyouki*
> 
> I am not sure i was advised and figured it worked with all GTX 690 since they are the same card just branded by who sales them. I will try to do some searching around to see if this is common.


thank you


----------



## duox

Anyone running one of these on a Seasonic X750? If I get this upcoming job I might grab one, provided I don't have to upgrade my PSU.


----------



## pilla99

Quote:


> Originally Posted by *duox*
> 
> anyone run one of these on a seasonic x750 ? If I get this upcoming job I might grab one if I don't have to upgrade my PSU.


I do. However, see my thread here on actual power usage; you could get away with a 500W PSU if you wanted.

http://www.overclock.net/t/1290091/gtx-690-true-power-measurement
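The headroom claim is easy to sanity-check with simple arithmetic. The GTX 690's 300W rated board power is published; the CPU and rest-of-system figures below are rough, illustrative assumptions:

```python
# Quick PSU headroom estimate for a GTX 690 build.
# CPU and rest-of-system figures are rough, illustrative estimates.

GTX_690_TDP_W = 300    # NVIDIA's rated board power
CPU_OC_W = 150         # overclocked quad-core, generous estimate
REST_OF_SYSTEM_W = 75  # board, RAM, drives, fans (assumption)

def estimated_draw():
    """Worst-case system draw under combined CPU + GPU load."""
    return GTX_690_TDP_W + CPU_OC_W + REST_OF_SYSTEM_W

def headroom_pct(psu_watts):
    """Spare capacity as a percentage of the PSU rating."""
    return round(100 * (psu_watts - estimated_draw()) / psu_watts, 1)

if __name__ == "__main__":
    print(estimated_draw())   # 525 W worst case
    print(headroom_pct(750))  # 30.0 -> an X750 has plenty of headroom
```

Even with generous estimates, a quality 750W unit keeps roughly 30% in reserve, which is consistent with the measured figures in the linked thread.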


----------



## dboythagr8

Will post this here since apparently my thread is getting ignored...

I just brought my 120hz monitor back out for use as my secondary monitor. I tried BF3 on my main display (U3011) and kept Precision, CPU-Z, and real temp up on my secondary. In Precision I noticed that there was no base clock or boost clock marker and and the the GPUs were running at 705mhz. I thought that was odd so I restarted my computer and did the same setup as listed above. This time the markers were back. Load up BF3 and play as normal and see that the GPUs are running at the boost clock through the Precision overlay, except now my secondary monitor goes black. It's still on though (blue led on monitor indicating activity). I exit BF3 and secondary monitor and the monitoring programs come back up...

I don't think this is normal could anybody offer advice?


----------



## dbterp

Hey guys, does the 690's VRAM limit Skyrim on max settings with some texture mods at 1440p?


----------



## duox

Quote:


> Originally Posted by *pilla99*
> 
> I do. However see my thread here on actual power usage, you can get away with a 500w PSU if you wanted.
> http://www.overclock.net/t/1290091/gtx-690-true-power-measurement


Thanks, pal. Rep for that.


----------



## thestache

Quote:


> Originally Posted by *dboythagr8*
> 
> Will post this here since apparently my thread is getting ignored...
> I just brought my 120hz monitor back out for use as my secondary monitor. I tried BF3 on my main display (U3011) and kept Precision, CPU-Z, and real temp up on my secondary. In Precision I noticed that there was no base clock or boost clock marker and and the the GPUs were running at 705mhz. I thought that was odd so I restarted my computer and did the same setup as listed above. This time the markers were back. Load up BF3 and play as normal and see that the GPUs are running at the boost clock through the Precision overlay, except now my secondary monitor goes black. It's still on though (blue led on monitor indicating activity). I exit BF3 and secondary monitor and the monitoring programs come back up...
> I don't think this is normal could anybody offer advice?


Sounds like drivers.

I had similar issues with my surround setup and sold mine because of it. I had yet to find a driver that wouldn't produce sticky clocks or that could play Flash videos in browsers. MSI Lightning HD 7970 crossfire is on the way, and we'll see if it's better or worse.


----------



## thestache

Quote:


> Originally Posted by *dbterp*
> 
> hey guys, does the 690's vram limit skyrim on max settings with some texture mods at 1440p?


It can at 1080p, so you can bet it will at 1440p.


----------



## thestache

Quote:


> Originally Posted by *n99127*
> 
> Hmm well that kinda sucks I suppose. Oh well, I was looking into getting a water cooling system for my GTX 690s, but we'll see. The temperature where I live is starting to drop pretty dramatically, so I might be able to just rely on uh..."external cooling."
> I suppose trying to push the Quad SLI system isn't really worth it anyways...can't really think of any games that would absolutely require that kind of power.


There's no point water cooling quad SLI unless you're running surround, and even then it'll be useless with the VRAM wall. Nothing will use the cards enough to push the fan speed to the point of becoming audible or hitting the 95% limit.

Read the thread and you'll see how pointless quad SLI is. I had it for a day and gave up on it, and recently gave up on a single GTX 690. I feel sorry for the guy who bought mine to run triple 1440p on it; mine didn't even run triple 1200p.


----------



## JonnyKovsH

Hi. Can anyone save the BIOS from an ASUS 690 and an EVGA 690? Please send me both BIOSes (g1 and g2) in a private message.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *JonnyKovsH*
> 
> Hi. Who can save bios for Asus 690 and Evga 690? send me both bios in a private message, please (g1 and g2)


http://www.mvktech.net/component/option,com_joomlaboard/Itemid,/func,view/catid,10/id,63047/#63047

Scroll down 8 posts.


----------



## JonnyKovsH

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> http://www.mvktech.net/component/option,com_joomlaboard/Itemid,/func,view/catid,10/id,63047/#63047
> Scroll down 8 posts.


Thanks


----------



## gizmo83

I'd like to use the EVGA LED controller on my Gigabyte 690. Is it possible to overwrite the Gigabyte BIOS with the EVGA one, or to change the manufacturer code?


----------



## s74r1

So I've had the chance to test two 690's last month. With boost disabled and voltage forced to 1.175v, the first one did 1163MHz Core & 6445MHz effective memory, and the second one did 1124MHz Core and 7200MHz effective memory (fully stable). After an exhaustive suite of benchmarks, I found the one with higher mem to come out on top most of the time.

With boost and normal voltage enabled, the two GPU's even out at 1.150v/1.165v respectively after heating up past 80c so this one is only stable at about 1100MHz core with those voltages (boosts to 1110 at first, then backs down to 1097 with +52 on core and +596 on mem). I believe nvidia made GPU1 boost to slightly lower volts because it's always hotter due to airflow restriction. This card also seems to make more noticeable electrical noise too.

Pretty bad overclockers though, right? Never had good luck with overclocking EVGA cards...


----------



## Arizonian

Quote:


> Originally Posted by *s74r1*
> 
> So I've had the chance to test two 690's last month. With boost disabled and voltage forced to 1.175v, the first one did 1163MHz Core & 6445MHz effective memory, and the second one did 1124MHz Core and 7200MHz effective memory (fully stable). After an exhaustive suite of benchmarks, I found the one with higher mem to come out on top most of the time.
> With boost and normal voltage enabled, the two GPU's even out at 1.150v/1.165v respectively after heating up past 80c so this one is only stable at about 1100MHz core with those voltages (boosts to 1110 at first, then backs down to 1097 with +52 on core and +596 on mem). I believe nvidia made GPU1 boost to slightly lower volts because it's always hotter due to airflow restriction. This card also seems to make more noticeable electrical noise too.
> Pretty bad overclockers though, right? Never had good luck with overclocking EVGA cards...


Interesting, because I've found the same scenario, where GPU1 does worse @ 1176 MHz core & GPU2 does 1202 MHz core.

However, my memory overclock is very poor. The most I was able to bench was 3200 MHz memory effectively.

I've not had any luck overclocking the last four GPUs I've had in my hands either. GTX 580, 680, 680SC, & 690 all got approximately a 14% overclock on core and a poor 4-7% on memory.

I've found that for my single 120 Hz monitor the GTX 690 is more than enough even at stock. I've settled on a stable 24/7 overclock of 1170 MHz core / 3121 MHz memory for both GPUs, unsynched, with GPU2 running lower offsets to match GPU1 at 1170 MHz.

I did want to try a dual-GPU card for the first time, and confirmed for myself that it has its pros and cons. I may try two singles in SLI next time instead.


----------



## thestache

Quote:


> Originally Posted by *s74r1*
> 
> So I've had the chance to test two 690's last month. With boost disabled and voltage forced to 1.175v, the first one did 1163MHz Core & 6445MHz effective memory, and the second one did 1124MHz Core and 7200MHz effective memory (fully stable). After an exhaustive suite of benchmarks, I found the one with higher mem to come out on top most of the time.
> With boost and normal voltage enabled, the two GPU's even out at 1.150v/1.165v respectively after heating up past 80c so this one is only stable at about 1100MHz core with those voltages (boosts to 1110 at first, then backs down to 1097 with +52 on core and +596 on mem). I believe nvidia made GPU1 boost to slightly lower volts because it's always hotter due to airflow restriction. This card also seems to make more noticeable electrical noise too.
> Pretty bad overclockers though, right? Never had good luck with overclocking EVGA cards...


The second one did well on the memory, but neither overclocked that well. Typically the higher you clock the memory, the lower your core clock will be; so think higher on one, lower on the other. You're voltage limited, so it's up to you where you send the voltage and what's more important. I'd say core is, on the GTX 690. I saw a bigger boost in Heaven and games from overclocking the core that last little bit than from getting the memory a bit higher.

Both my cards overclocked to 1200MHz+ on the core but only 6600-6800MHz on the memory. I'd consider them good but not great.


----------



## Divineshadowx

Hey guys, got my 2560x1440 Crossover 27Q today. I ordered the perfect-pixel version and I seem to have got it. But I don't know if it's the gold version, which is the 10-bit one; does anyone know how I can tell, since I can't read Korean? The 1440p looks great so far, but my color is terrible and I don't know how to fix it. Is there any decent software calibrator? I don't feel like $250 for a hardware one is really needed for non-pro work. Oh, and BTW, I was playing Crysis 2 maxed and VRAM hit over 2000, lol.


----------



## Buzzkill

Quote:


> Originally Posted by *JonnyKovsH*
> 
> Hi. Who can save bios for Asus 690 and Evga 690? send me both bios in a private message, please (g1 and g2)


Techpowerup.com has a video card BIOS database. It only has the ASUS GTX 690 BIOS for download. This might help, JonnyKovsH.

http://www.techpowerup.com/vgabios/index.php?page=1&architecture=NVIDIA&manufacturer=&model=GTX+690&interface=&memSize=0


----------



## MrTOOSHORT

Proud new owner of an eVGA GTX 690:





Ordered a full cover EK block and back plate from EK a few minutes ago. Should be here on Thursday or Friday!


----------



## Arizonian

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Proud new owner of an eVGA GTX 690:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Ordered a full cover EK block and back plate from EK a few minutes ago. Should be here on Thursday or Friday!


Congrats.







It's going into a great system there.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Hey guys, got my 2560x1440p crossover 27q today. I ordered the perfect pixel version and I seem to have got it. But I don't know if it is the gold version, which is the 10bit, anyone know how can I tell since I can't read korean? The 1440p looks great so far, however my color is terrible and idk how to fix it. Is there any decent software calibrator, because I don't feel like $250 for a hardware one is really needed for none pro work. Oh and btw, was playing crysis 2 maxed and vram hit over 2000, lol.


Go to the crossover/catnap thread and ask what settings they're using. You don't need software or hardware calibrators when other people have them and can do it for you.


----------



## s74r1

Quote:


> Originally Posted by *thestache*
> 
> Typically the higher you clock the memory the lower your core clock will be. So think higher on one lower on the other. You're voltage limited so it's up to you where you send the voltage and what's more important. I'd say core is on the GTX 690. Saw the biggest boost in heaven and games over clocking the core that last little bit than getting the memory a bit higher.


I tested both of my 690's with memory at stock and never got anything more than marginally higher core clocks (like 13MHz). It's true that Heaven seems to favor core clocks slightly, but all my other benches seemed to favor memory.


----------



## Divineshadowx

Quote:


> Originally Posted by *thestache*
> 
> Go to the crossover/catnap thread and ask what settings they're using. You don't need software or hardware calibrators when other people have them and can do it for you.


I did; I messed around and somehow the default ICC profile seems the best now, not sure what happened. The 690's VRAM is basically not enough for 1440p. I was getting 2048+ in BF3, and BF3 doesn't even use any SGSSAA or high AA. If more demanding titles come out, I bet most would have to upgrade to 3GB/4GB cards.


----------



## Qu1ckset

Quote:


> Originally Posted by *Divineshadowx*
> 
> I did, messed around and somehow the default icc profile seems the best now, not sure what happened. The 690's vram is basically not enough for 1440p. Was getting 2048+ in bf3, and bf3 doesn't actually use any sgssaa or high AA. If more demanding titles come out I bet most would have to upgrade to 3gb/4gb cards.


Just because BF3 uses 2048MB of VRAM doesn't mean it needs it; BF3 will make use of available VRAM if it's there to be used. 2GB of VRAM is more than enough for 1440p with any game (that's not including mods, e.g. Skyrim).

If you didn't have enough VRAM your game would stutter and lag like crazy. I've had it happen before when I played Metro 2033 at 1600p with max settings and max AA on SLI 1.5GB 580s; I thought my computer froze, it was lagging so bad, lol. Turned down the AA and it ran perfectly.


----------



## Divineshadowx

Quote:


> Originally Posted by *Qu1ckset*
> 
> just because bf3 uses 2048mb of ram doesnt mean it needs it, bf3 will take use of available vram if its there to be used, 2gb of vram is more then enough for 1440p with any game. (thats not including mods ex. skyrim).
> if you didnt have enough vram your game would stutter and lag like crazy, ive had it happen before when i played metro 2033 on 1600p with max setting and max AA on sli 1.5gb 580s, thought my computer froze it was lagging so bad lol, turned down AA and it ran perfect


Ya, I guess. BF3 was running fine, same for Crysis 2. I can't say the same for The Witcher 2, however, and that wasn't even caused by the VRAM; I was getting 38fps with ubersampling. It seems really unoptimized.

2GB seems fine for games at normal settings, but if you start adding SSAA and SGSSAA to some older titles it kills them; they run at like 0.4fps. I know what you're talking about, lol, the game basically doesn't respond.

The drivers are new though, so we might see some improvements. I don't think we've seen the full potential of the 690 yet, besides the crappy VRAM integration, of course.


----------



## ban916

Getting my 690 tomorrow. Coming from a 7970.


----------



## thestache

Quote:


> Originally Posted by *s74r1*
> 
> I tested both of my 690's with memory at stock and never got anything more than marginally higher core clocks (like 13mhz). This is also true that Heaven seems to favor core clocks slightly, but all my other benches seemed to favor memory.


Faulty cards, mate.

Reference GTX 680 clock speeds are 1006MHz base and 1058MHz boost. If your GTX 690 couldn't even achieve that, then there was something wrong with it.

But you're right, some do favor memory. The software I was using preferred core clock speed, so I went that route. I never tried stock core to see how high the memory would go.

I shouldn't be able to post here anymore since I sold mine; my membership should be revoked. Lol.


----------



## s74r1

Quote:


> Originally Posted by *thestache*
> 
> Faulty cards mate.
> Reference GTX 680s clock speeds are 1006mhz base and 1058mhz boost. If your GTX 690 couldn't even achieve that then there was something wrong with it.
> But you're right some do favor memory. Software I was using preferred clock speed so went with that route. Never tried stock core and seeing how high the memory would go.
> I shouldn't be able to post here anymore. Since I sold mine and should have my membership revoked. Lol.


I meant the cards only got 13MHz over the usual overclock after removing the memory overclock (like 1137/3000 instead of 1124/3600), so I don't see how memory overclocks affect the core clock much at all. Bad overclockers maybe, but not faulty. I currently run mine at +52 and +600, which lands them at 1097-1110 core in-game and 7200MHz effective GDDR5. Max stable core is 1124, and only with voltage forced to 1.175 and boost disabled. With default boost enabled and normal voltage, the 1.150/1.165 on GPU0/GPU1 in-game made it unstable.

And for the record, both of my 690's had identical default Kepler boost (1058MHz, then drops to 1045MHz after heating up). So I've only got a 5% core overclock, since mine now boosts to 1110, then 1097 after heating up with +52.
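
To put those numbers in percentage terms, here's a quick sketch of the arithmetic (the stock figures are assumed reference GTX 690 values, ~1045MHz sustained boost and 6008MHz effective memory, and the helper name is mine):

```python
# Overclock expressed as a percentage over stock. Stock figures are
# assumed reference-GTX 690 values, not measured ones.
def oc_percent(clock_mhz: float, stock_mhz: float) -> float:
    return 100.0 * (clock_mhz - stock_mhz) / stock_mhz

print(round(oc_percent(1097, 1045), 1))  # core: ~5.0%, matching the +52 offset
print(round(oc_percent(7200, 6008), 1))  # memory: ~19.8%
```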


----------



## Arizonian

Quote:


> Originally Posted by *s74r1*
> 
> I meant the cards only got 13mhz over the usual overclock after removing the memory overclock (like 1137/3000 instead of 1124/3600), so i don't see how memory overclocks effect the core clock much at all. Bad overclocker maybe, but not faulty. I currently run mine at +52 and +600 which lands it at 1097-1110 core clock in-game and 7200MHz effective GDDR5. Max stable core is 1124 only with voltage forced to 1.175 and boost disabled. With default boost enabled and normal voltage the 1.150/1.165 on GPU0/GPU1 in-game made it unstable.
> And for the record, both of my 690's had identical default kepler boost (1058Mhz, then drops to 1045MHz after heating up). So I've only got a 5% core overclock since mine now boost to 1110, then 1097 after heating up with +52.


I found that messing with the voltage didn't yield any benefits. Lowering the voltage may have given me 1C-2C lower temps. So I leave it on auto and let it use what it needs, volting down when it's not needed; that seems best for performance and for the card's longevity as well.

I'm an old-school overclocker and I always start with the core: max the core stable first, then work on the memory. That might be why my highest memory is 3200 MHz effectively. My memory is a poor overclocker, I've found. When I reversed the strategy and overclocked memory first, my performance was worse. So I'm now at a 14% OC on core and a 5% OC on memory, 24/7 stable; the best I could do on air, and technically way more than enough for my needs anyway.

I've been floored by the memory overclocks I'm seeing others have success with on the single cards. It's unprecedented and sick.









PS: What percentage overclock does your 7200 MHz memory amount to?


----------



## s74r1

Speaking of voltage, did you actually see the voltage change at all? Messing with voltage in Precision-X or Afterburner didn't seem to do anything on my 690's.

The only way I was able to force 1.175v was with Nvidia Inspector via the command line, but then the voltage sticks there even while idling, so ultimately it wasn't worth the extra heat/noise. The default 1.150v/1.165v under load works well, since GPU0 is always hotter than GPU1; unfortunately my GPU0 is the weaker one that wants more volts for high OCs.

For those looking to squeeze every last ounce of performance out of these, I'd recommend backing down the core one or two notches if your memory OCs well; you may reach 7GHz+ effective, which should outweigh the loss, especially in 3DMark since it seems to prefer memory clocks. But some just won't budge past 6.4-6.5GHz, like my first 690 (maxed at 6445MHz effective).

My method of OC'ing is a bit different: I start high and then back down from there, judging by how fast it artifacts. (When core OC'ing, if it artifacted within 5 minutes of Heaven, backing it down 3 notches (39MHz) was almost always rock stable.)
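
In code terms, that top-down method looks roughly like this (just a sketch; `is_stable` stands in for a manual Heaven run, and the 13MHz notch is the granularity Kepler offsets move in):

```python
NOTCH_MHZ = 13  # Kepler clock offsets move in 13MHz notches

def find_stable_offset(start_offset_mhz, is_stable, backoff_notches=3):
    """Start from a deliberately high offset and back down a fixed number
    of notches after each failed stability run, instead of creeping up."""
    offset = start_offset_mhz
    while offset > 0 and not is_stable(offset):
        offset -= backoff_notches * NOTCH_MHZ  # back down 39MHz at a time
    return max(offset, 0)

# With a pretend card that artifacts above a +100MHz offset:
print(find_stable_offset(130, lambda mhz: mhz <= 100))  # 91
```

It converges in far fewer Heaven runs than notching up from stock, at the cost of a few artifacted runs early on.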


----------



## Divineshadowx

I know this is overclock.net, but seriously, why do you guys need to OC the 690 that high? I doubt there's anything you can't run at 1080p or 1440p/1600p maxed out at 60/120fps. If you're at a higher res the VRAM will kill you, and if you're benchmarking and aiming for high scores you might as well get quad 680s. I'd just enjoy the card, and not end up killing it to resell it for GK110 or something along those lines.


----------



## pilla99

Quote:


> Originally Posted by *Divineshadowx*
> 
> I know this is overclock.net, but seriously why do you guys need to actually oc the 690 that high. I doubt there is anything that you cannot run on 1080p,1440p/1600p maxed out at 60/120fps. If you're at a higher res the vram will kill you, and if you are benchmarking and aiming for high scores you might as well get quad 680s. I would just enjoy my card, and not end up killing it to resell it for gk110 or something along those lines.


On my system, to run BF3 maxed at 1440p at a minimum of 60 fps, I have to OC; otherwise I get 50's here and there. Metro 2033 at 1440p can't hold 60 all the time even with an OC at max settings. It eats through everything else I play, though. Even some games like StarCraft II, if enough is going on, can drop me into the 90's.


----------



## Divineshadowx

Quote:


> Originally Posted by *pilla99*
> 
> On my system to run BF3 max at 1440p at 60 fps minimum I have to OC. Otherwise I can get 50's here and there. Metro 2033 on 1440p can't run 60 all the time even with an OC at max settings. Everything else I play though it eats through. Even some games like Starcraft II though if enough is going on can drop me to 90's.


Really? I was getting 60fps on a 64-player map, same for Metro. SC2 is CPU limited; the GPUs don't actually work hard to run a 4v4 or something, and the game isn't optimized to use the full power of high-end Sandy/Ivy. I'm on a mild OC.


----------



## Marcsrx

Yeah, I float between 70 and 110 fps in BF3 @ 1440p all maxed.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> I know this is overclock.net, but seriously why do you guys need to actually oc the 690 that high. I doubt there is anything that you cannot run on 1080p,1440p/1600p maxed out at 60/120fps. If you're at a higher res the vram will kill you, and if you are benchmarking and aiming for high scores you might as well get quad 680s. I would just enjoy my card, and not end up killing it to resell it for gk110 or something along those lines.


GPUs nowadays are good enough that overclocking isn't an issue and never should be. As long as you have adequate cooling and stay below the recommended temps, you'll never have any problems. So it's a non-issue.


----------



## Arizonian

Quote:


> Originally Posted by *Divineshadowx*
> 
> *I know this is overclock.net, but seriously why do you guys need to actually oc the 690 that high*. I doubt there is anything that you cannot run on 1080p,1440p/1600p maxed out at 60/120fps. If you're at a higher res the vram will kill you, and if you are benchmarking and aiming for high scores you might as well get quad 680s. I would just enjoy my card, and not end up killing it to resell it for gk110 or something along those lines.


A few reasons.
To use a phrase most avid overclockers can relate to: 'I'd overclock my toaster if I could'.









In short - Why does one climb a mountain? Challenge. Why do we love going fast in cars? It's fun.









The second reason, for me, is 3D gaming, which I do like; it cuts FPS by 45%-50%, and 60 FPS is optimal for the smoothest 3D gameplay. As an example, if one is getting 100 FPS in a game, turning on 3D Vision drops it to 50-55 FPS. During normal gameplay dips to 70, I'm crawling at 35 FPS in 3D.
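
The math is simple enough to sketch (the ~50% cost is the figure from my own experience above, not an official number):

```python
# Framerate after enabling 3D Vision, assuming each eye needs its own
# frame so the effective rate roughly halves (45-50% cost, per the post).
def fps_in_3d(fps_2d, cost=0.50):
    return fps_2d * (1.0 - cost)

print(fps_in_3d(100))  # 50.0, already below the 60 FPS ideal
print(fps_in_3d(70))   # 35.0, a crawl, as described
```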

I couldn't agree more that the GTX 690 can handle up to 1600p, even at stock clocks in games. I dabbled in some benchmarking this year; if I could afford another $999, I'd rather spend it on a huge-capacity SSD.

As for killing it, not being able to add voltage past 1.175v protects the card from some degradation, but that's just part of overclocking anything.

I'm eventually going to put this 690 in the second rig the kids use if I decide to buy GK110. I purchased the extended warranty, so it's going to be around for the long haul.

Next series I'm going to try something new and be patient until the non-reference cards come out; better cooling for even higher overclocks.









Edited to add: After your post I tried the card at stock clocks; Kepler Boost hits 1045 MHz on GPU#1 & 1058 on GPU#2. It didn't matter whether I left the Power Target offset at default or +135; I got the same results. Still a nice boost for stock.


----------



## workthis

EVGA

http://imageshack.us/a/img189/6976/sam0029converted.jpg

http://imageshack.us/a/img838/5387/sam0036converted.jpg

Side note: T-shirt is waaaaaaaaaaaaaay too big for me. I wish they gave an M sized T-shirt instead of XL.


----------



## Divineshadowx

Okay, what is this BS? I can't complete 2 minutes of OCCT with my 3770K @ 4500MHz @ 1.38 volts... is my CPU just completely faulty?


----------



## Divineshadowx

It's official, I have the worst CPU ever manufactured. At 1.38v I can't get 4.8, 4.7, 4.6, or 4.5. Basically the only OC I can do is 4.5 @ 1.25v; so I can run that at 1.25 but not at 1.38, the stupidest thing I've seen in my life. It probably has a broken memory controller as well, which is why I can't run anything stable over 1600MHz with different branded 2133 and 2400MHz kits.


----------



## Arkanor

I got one of these, EVGA.



http://imgur.com/xEspG


----------



## Arizonian

Quote:


> Originally Posted by *Arkanor*
> 
> I got one of these, EVGA.
> 
> 
> http://imgur.com/xEspG


----------



## gizmo83

Can nobody tell me about an LED controller for my Gigabyte 690? The EVGA utility is just for EVGA cards...


----------



## Arizonian

Quote:


> Originally Posted by *gizmo83*
> 
> nobody can tell me about a led controller for my gigabyte 690? the evga utility is just for evga cards..


If you've downloaded the *EVGA GTX 690 LED Controller v1.0.0.9* and it didn't control the LED, the only other option you can 'TRY at your own risk' is to flash an EVGA reference BIOS onto the GTX 690. It may work once the controller thinks it's an EVGA card. Anyone else try this?

Otherwise you're out of luck, as you're going to depend on Gigabyte to come up with LED controller software for their own card, and I wouldn't hold your breath. The cards are all reference, so one would think the EVGA controller would work on the Gigabyte or ASUS version.

Vendor product support after the card is manufactured is one of the reasons I choose EVGA personally. Reference cards are the same, but the vendors really stand apart in their own ways of backing up their products.

Good luck.


----------



## jagz

Posted this some time ago, but I seem to have issues playing certain maps on BF3 since I added the 690. These are max settings but with no AA.










I get it every time on Wake Island after 15 minutes or so.


----------



## Qu1ckset

Quote:


> Originally Posted by *jagz*
> 
> Posted this some time ago, but I seem to have issues playing certain maps on BF3 since I added the 690. These are max settings but with no AA.
> 
> 
> 
> 
> 
> 
> 
> 
> Get it everytime on Wake Island after 15 minutes or so.


are you using max AA??


----------



## jagz

Quote:


> Originally Posted by *Qu1ckset*
> 
> are you using max AA??


None I believe.


----------



## gizmo83

Thank you. I have to look for an EVGA BIOS, but I hope it's possible to flash the new BIOS...


----------



## MrTOOSHORT

Quote:


> Originally Posted by *jagz*
> 
> Posted this some time ago, but I seem to have issues playing certain maps on BF3 since I added the 690. These are max settings but with no AA.
> 
> 
> 
> 
> 
> 
> 
> 
> Get it everytime on Wake Island after 15 minutes or so.


The driver looks old; try the new official one released recently:

http://www.nvidia.com/Download/index.aspx?lang=en-us


----------



## jagz

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Driver looks older, try the new official one released recently:
> http://www.nvidia.com/Download/index.aspx?lang=en-us


Thanks, I'll give it a shot.


----------



## MrTOOSHORT

^^^

You're welcome, hope it works for you.








Quote:


> Originally Posted by *gizmo83*
> 
> thank you. i have to look for an evga bios but i hope it's possible flashing new bios..


The original EVGA GTX 690 BIOSes can be downloaded here:

http://www.mvktech.net/component/option,com_joomlaboard/Itemid,34/func,view/id,63892/catid,10/


----------



## Le0Heart

Quote:


> Originally Posted by *Divineshadowx*
> 
> Well, I don't think a CPU can just stop; your computer would just crash. Is it just BF3 that lags? Try Crysis 2 maybe, and Heaven. A malfunctioning CPU is really rare. Your RAM is at stock, so I don't see that being the problem; did it pass an error test? I'm not sure why you turned HT off; I don't think that would change anything. I get about 30-50% load on my CPU in BF3 maxed; the game isn't very CPU intensive. Do you get about 90-120+ on max? If so, then it shouldn't be a bottleneck anywhere. First I would just reinstall everything if only BF3 has the issue. Then make sure everything is at stock, and that you haven't messed with the NVIDIA Control Panel settings. Just curious, BTW: why a 3820, lol? Might as well have gone with a 3930K or Ivy.


Bro, when I read "GTX 690 quad SLI" in your computer specs (signature), does it mean you have 4 GTX 690 cards running in 4-way SLI? Goshhhh, I wish I had the liberty to spend that much.


----------



## AngelKnight

Got my Galaxy GTX 690 today. After a short test I found the max core offset is +147MHz, but the memory can go sky-high. I'm now testing TDP +135, core +147, mem +815; I'll post the results later.


----------



## AngelKnight

Passed!
TDP +135%
Core +147MHz (1192 boost)
Mem +815MHz
P.S. The time shown in my OS is wrong.

This is not the limit; I'll try higher later. My 2600K is a bottleneck, so I'm considering buying an X79 platform. And... sorry for my poor English.


----------



## MrTOOSHORT

^^^

Awesome results, I'm jelly!









I thought my memory maxing out at +750ish was gravy.


----------



## KaRLiToS

can someone tell me about GTX 690 Quad Sli and Surround gaming in 7680x1440 ?


----------



## AngelKnight

Thanks, my bro. The +850MHz mem has passed 3DMark 11 and Vantage; testing FurMark now.


----------



## Qu1ckset

Quote:


> Originally Posted by *KaRLiToS*
> 
> can someone tell me about GTX 690 Quad Sli and Surround gaming in 7680x1440 ?


Honestly, GTX 690 SLI is a waste of your money due to the limited VRAM. If you're going 7680x1440, get 3x 7970s or 3x 4GB 670/680s.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Qu1ckset*
> 
> Honestly GTX 690 Sli is waste of your money due to the limited vram, if your going 7680x1440 get 3x7970s or 3x 4gb 670/680s.


+1 to this.

I'm a single screen kind of guy, so this card is perfect for me. Thinking of going 120Hz soon!


----------



## Qu1ckset

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> +1 to this.
> I'm a single screen kind of guy, so this card is perfect for me. Thinking of going 120Hz soon!


1440p is so much better than 120Hz 1080p in my opinion, but there's an American company putting the Korean panels on a 120Hz board soon.


----------



## Divineshadowx

Quote:


> Originally Posted by *Qu1ckset*
> 
> 1440p is so much better then 120hz 1080p in my opinion, but there is a American company putting the korean panels with 120hz board soon


Of course there is; knowing my luck, after I buy something new, something newer comes out the next day. Maybe we'll see DisplayPort 1.2 actually get used.


----------



## Qu1ckset

Quote:


> Originally Posted by *Divineshadowx*
> 
> Ofc there is, knowing my luck after I buy something new something newer comes the next day. Maybe we will see display port 1.2 actually get used


Trust me, I know what you mean. I'm putting together my watercooling build and buying a new backup GPU (GTX 650 2GB), and then I'm waiting till 2014, when hopefully Maxwell/Haswell will be out along with some 4K screens. Upgrading every gen is such a waste; man, I've gone 5970 CrossFire > SLI 580s > 6990 > 690.

Yes, the 780 is gonna be a beast if they make it GK110, but who cares, I've got a beast of a card already!


----------



## thestache

Quote:


> Originally Posted by *KaRLiToS*
> 
> can someone tell me about GTX 690 Quad Sli and Surround gaming in 7680x1440 ?


It's not going to happen, that's what. It doesn't have the VRAM. It doesn't have the VRAM for triple 1200P, so there's no way it'll do triple 1440P.

Tri-SLI 4GB GTX 680 is what you want for triple 120Hz or triple 1440P/1600P.

Quad SLI doesn't scale well enough to justify the purchase, and when you think about it, a single GTX 680 can easily power a 1440P/1600P monitor, so one per monitor is enough.

I sold mine and went MSI Lightning HD 7970 CrossFire for the DisplayPorts, because I was sick of screen tearing.


----------



## Divineshadowx

Quote:


> Originally Posted by *Qu1ckset*
> 
> Trust me I know what you mean, I'm putting together my watercooling build and buying a new backup gpu (gtx 650 2gb) and then I'm waiting till 2014 when hopefully maxwell/haswell will be out along with some 4k screens. Upgrading every Gen is such a waste, man I've gone xfire5970>sli580s>6990>690
> Yes the 780 is gunna be beast is the make it gk110 but who cares I got a beast O a card already!


Ya, I agree. 4K is going to be my upgrade point; I don't see a reason to upgrade before then, because our GPUs already destroy everything at 1440/1600. Maybe we'll actually use our GPUs when the ports come from the new consoles in 2013.


----------



## Qu1ckset

Quote:


> Originally Posted by *Divineshadowx*
> 
> Ya i agree. 4k is going to be my upgrade stage, I dont see a point to upgrade before because our gpus now already destroy everything on 1440/1600. Maybe we will actually use our gpus when the console ports come from the new consoles in 2013.


New consoles aren't coming out till fall 2013 at the earliest, and it takes about a year or so after their release to start pumping out decent games.


----------



## iARDAs

Sometimes I feel like I should purchase a 690 and sell my 670.

Then I think of Crysis 2 hitting 2250MB of VRAM in DX11 with the high-res texture pack.

And then I start worrying about Crysis 3, Battlefield 4, and GTA5, and how the VRAM will hold up at 1440p.

Then I back up.

I hate it. I really do.


----------



## Divineshadowx

Quote:


> Originally Posted by *Qu1ckset*
> 
> New consoles aren't coming out till fall 2013 earliest, and it takes about a year or so after there release to start pumping out decent games


Really? I thought they were around the corner. Oh well, I guess there's a long wait for 4K/Maxwell/Haswell then. Maybe Crysis 3 and BF4 will entertain us; I think they're using the same engines, though. Maybe improved graphics?


----------



## Vrait

Quote:


> Originally Posted by *iARDAs*
> 
> Sometimes i feel like i should purchase a 690 and sell my 670
> 
> than i think of Crysis 2 hitting 2250 vram in DX11 with high res texture pack
> 
> And than I start worrying about Crysis 3 and Battlefield 4, GTA5 and how the vram will be in 1440p
> 
> than i back up.
> 
> I hate it. I really do.


Why not just buy a second 670?


----------



## dboythagr8

Couple questions:

1. For PhysX, what's the best way to handle it with the 690? One GPU for PhysX and the other for gameplay? Getting ready for Borderlands 2.

2. Is anybody getting massive drops in GPU usage in BF3 Armored Kill? I'll play for a while and all of a sudden my fps will take a dive from 60+ down to like 30-40, and usage will be terrible. The only way to fix it is to Alt-Tab out and back in. Am I hitting the VRAM wall or something?


----------



## Divineshadowx

Quote:


> Originally Posted by *dboythagr8*
> 
> Couple questions:
> 1. For PhysX... what's the best way to handle it with the 690? One GPU for PhysX and the other for gameplay? Getting ready for Borderlands 2.
> 2. Is anybody getting massive drops in GPU usage in BF3 Armored Kill? I'll play for a while and all of a sudden my fps takes a dive from 60+ down to 30-40, and usage goes terrible. The only way to fix it is to Alt-Tab out and then back in... am I hitting the VRAM wall or something?


I think when you dedicate one of your GPUs to PhysX, it still runs the game on it too, because PhysX doesn't use much power. And if you were hitting the VRAM wall you'd drop to literally 1 fps. I've had it happen in Crysis 1 with lots of mods.


----------



## dboythagr8

Quote:


> Originally Posted by *Divineshadowx*
> 
> I think when you dedicate one of your GPUs to PhysX, it still runs the game on it too, because PhysX doesn't use much power. And if you were hitting the VRAM wall you'd drop to literally 1 fps. I've had it happen in Crysis 1 with lots of mods.


Mehh, I don't know what's going on then.


----------



## Jessekin32

It's on its way!

I've waited for this moment for so long.

(EVGA, if you can't read it.)


----------



## iARDAs

Quote:


> Originally Posted by *Vrait*
> 
> Why not just buy a second 670?


My GPU hits 75 degrees under heavy load. I'm scared that if I get another 670 the top card might be as hot as the mid 80s, maybe even more.


----------



## Arizonian

Quote:


> Originally Posted by *iARDAs*
> 
> My GPU hits 75 degrees under heavy load. I'm scared that if I get another 670 the top card might be as hot as the mid 80s, maybe even more.


My experience shows the 690 idling at 30-31C, as does my 680 in the second rig with the same fan settings. Gaming (depending on title), the highest temps I've seen are 77-78C. It may be the extra fan (in the main rig) blowing from the bottom of the case directly up into the 690 that keeps it as cool as my 680.

You know the pros and cons of dual GPU better than anyone, having had a GTX 590. (Please correct me if I'm wrong.)

If anyone else running anything similar to a Yamakasi Catleap Q270 1440p would like to comment on their GTX 690 gaming experience, please share.

It won't be hard to recoup most of the money from your 670 right now should you decide to switch. A second GTX 670 4GB for SLI sounds pretty darn good, since you're halfway there and won't have to deal with selling. If you have decent overclockers, a roughly 26 MHz dynamic downclock for hitting 80C is all you're looking at. In the long run, 4GB of VRAM will let you carry those cards easily past the next series until Maxwell, with peace of mind.
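Kepler's GPU Boost drops the clock in small fixed steps as the card warms up, which is where that ~26 MHz figure comes from (two ~13 MHz bins). A toy sketch of that behavior; the ~13 MHz bin size matches reported Kepler behavior, but the exact thresholds and the single-bin step at 70C are my assumptions, not measured values:

```python
# Toy model of Kepler GPU Boost thermal downclocking. The ~26 MHz drop at
# 80C is the figure discussed above; bin size and thresholds are assumptions.
BIN_MHZ = 13  # Kepler boost adjusts clocks in ~13 MHz steps

def boost_clock(base_boost_mhz, temp_c):
    """Estimate the effective boost clock: lose one bin at 70C, two at 80C."""
    if temp_c >= 80:
        return base_boost_mhz - 2 * BIN_MHZ
    if temp_c >= 70:
        return base_boost_mhz - BIN_MHZ
    return base_boost_mhz

print(boost_clock(1150, 65))  # 1150 -- full boost
print(boost_clock(1150, 80))  # 1124 -- the ~26 MHz penalty discussed above
```

This is only a mental model for sizing the thermal penalty; real boost behavior also depends on the power target and voltage.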


----------



## Jessekin32

Anyone know if I can run two 690s in quad SLI and also have a dedicated PhysX card?


----------



## Divineshadowx

Quote:


> Originally Posted by *Jessekin32*
> 
> Anyone know if I can run two 690s in quad SLI and also have a dedicated PhysX card?


Why would you need to? PhysX takes almost no power from the GPU. Biggest waste of money ever.


----------



## iARDAs

Quote:


> Originally Posted by *Arizonian*
> 
> My experience shows the 690 idling at 30-31C, as does my 680 in the second rig with the same fan settings. Gaming (depending on title), the highest temps I've seen are 77-78C. It may be the extra fan (in the main rig) blowing from the bottom of the case directly up into the 690 that keeps it as cool as my 680.
> You know the pros and cons of dual GPU better than anyone, having had a GTX 590. (Please correct me if I'm wrong.)
> If anyone else running anything similar to a Yamakasi Catleap Q270 1440p would like to comment on their GTX 690 gaming experience, please share.
> It won't be hard to recoup most of the money from your 670 right now should you decide to switch. A second GTX 670 4GB for SLI sounds pretty darn good, since you're halfway there and won't have to deal with selling. If you have decent overclockers, a roughly 26 MHz dynamic downclock for hitting 80C is all you're looking at. In the long run, 4GB of VRAM will let you carry those cards easily past the next series until Maxwell, with peace of mind.


Yep, I had a 590.

I always liked it and never needed to OC it, but it was a pain in the back to sell in Turkey. I lost more money than I expected.

This time, though, I believe, as you're saying, I'll upgrade to 670 4GB SLI and wait for the 8xx series. If it weren't for the VRAM cap I would probably still go 690.

So far, though, no one in the 1440p+ club has had a single VRAM issue with their 690s at 1440p. Crysis 3 is making me worried, though.

Besides Crysis 2 with DX11 and the high-res texture pack, and some extreme mods in Skyrim, I never used my extra 2GB of VRAM, to be honest.

Yeah, BF3 uses 2200 MB of VRAM, but as people discussed in the club, it seems the game uses more VRAM if you have it.


----------



## Ruby Rabbit

Quote:


> Originally Posted by *Jessekin32*
> 
> Anyone know if I can run two 690s in quad SLI and also have a dedicated PhysX card?


Best to keep your old Nvidia GPU in your rig and dedicate it to PhysX. Works a treat!


----------



## V3teran

I reckon a 9800 GT or better is enough for PhysX, or maybe a 250.
I know it's nice to have PhysX if you have a spare card *lying around*, but I wouldn't go spending money on a card just for PhysX, as there are hardly any games that use it. Mafia II, Batman, Borderlands 2, and Metro 2033 are all I can think of.

Also, a single 690 at 1920 res is all you need for any game maxed out.


----------



## Divineshadowx

Anyone have ideas about a laptop? I want one for school and was looking at gaming ones with a 660M and such; they're quite cheap, actually. Kind of ironic, a gaming laptop for school... but yeah, I don't really need a gaming one, as you can tell from my desktop, but non-gaming ones feel so cheap to me and lack hardware. Any help?


----------



## Marcsrx

Not the thread for such inquiries...


----------



## Divineshadowx

Quote:


> Originally Posted by *Marcsrx*
> 
> Not the thread for such inquiries...


There's a GPU in the post, isn't there? qq


----------



## Marcsrx

This is the GTX690 club thread, not the "Which laptop should I buy" thread...


----------



## Shogon

Quote:


> Originally Posted by *Divineshadowx*
> 
> Anyone have ideas about a laptop? I want one for school and was looking at gaming ones with a 660M and such; they're quite cheap, actually. Kind of ironic, a gaming laptop for school... but yeah, I don't really need a gaming one, as you can tell from my desktop, but non-gaming ones feel so cheap to me and lack hardware. Any help?


For you.


----------



## Divineshadowx

Quote:


> Originally Posted by *Shogon*
> 
> For you.


Eww...


----------



## Marcsrx

Quote:


> Originally Posted by *Shogon*
> 
> For you.


yolololol


----------



## Divineshadowx

Quote:


> Originally Posted by *Marcsrx*
> 
> yolololol


I wonder if that link was an insult or if he was being serious. And by the way, I asked here because there are some people who know a lot, unlike some other threads.


----------



## pilla99

Quote:


> Originally Posted by *Shogon*
> 
> For you.


As someone who has owned 3 of those plus an Air, I can attest that they are indeed great laptops. Sure, you aren't going to be running games very well (especially on the new Retina ones; holy crap, 2880x1800), but that's not what laptops are for. I love OS X; the design plus the touch gestures in the operating system make it the best laptop in my opinion. Plus I think they look the best.

I would use OS X/Linux as my primary OS if it weren't for the gaming limitations.


----------



## Divineshadowx

Quote:


> Originally Posted by *pilla99*
> 
> As someone who has owned 3 of those plus an Air, I can attest that they are indeed great laptops. Sure, you aren't going to be running games very well (especially on the new Retina ones; holy crap, 2880x1800), but that's not what laptops are for. I love OS X; the design plus the touch gestures in the operating system make it the best laptop in my opinion. Plus I think they look the best.
> I would use OS X/Linux as my primary OS if it weren't for the gaming limitations.


I couldn't care less about the display. There's no point in getting a screen with a higher res than 1080p, because laptops are used for mainstream content, which is all 1080p. And on my current 27" 1440p, things already look tiny. For $2k I can get a 680M laptop with an amazing CPU. For $1200 I can get a 660M with a 1080p display, spend $150 on a much better SSD than the one in the Mac, and get a better-clocked i7. OS X? Sorry, I prefer properly coded C++ that doesn't crash random applications and actually runs most of the software that devs create.


----------



## Shakal

Sorry guys, I didn't go over the whole thread, but is it worth water-cooling it? And how easy is it to install the waterblock? The card looks so nice and fancy. Right now I have a 7970 with a Komodo waterblock, so if I got the waterblock for the 690 it would fit right into my loop without me having to mess with it. What do you recommend, and how much further will it go under water versus on air? I am really considering this card.


----------



## pilla99

Quote:


> Originally Posted by *Divineshadowx*
> 
> I couldn't care less about the display. There's no point in getting a screen with a higher res than 1080p, because laptops are used for mainstream content, which is all 1080p. And on my current 27" 1440p, things already look tiny. For $2k I can get a 680M laptop with an amazing CPU. For $1200 I can get a 660M with a 1080p display, spend $150 on a much better SSD than the one in the Mac, and get a better-clocked i7. OS X? Sorry, I prefer properly coded C++ that doesn't crash random applications and actually runs most of the software that devs create.


Again, laptops are media consumption devices, not meant for any kind of gaming or serious work. That's why desktops exist. In my experience, anyone who owns a gaming laptop does so because they were too poor to afford both a desktop and a laptop or tablet.

I haven't met anyone with a "gaming" laptop over the age of 24. I use 1440p every day at work and at home. Having the extra screen real estate for videos, browsers, etc. would be great; just because you aren't going to max Crysis on it doesn't mean it's useless. The MacBook Pro does decently well with games anyway; e.g., StarCraft II can play at native resolution and keep over 30 frames on modest settings.

OS X is a great OS. In my experience (which is more than most everyday PC users'), OS X is more stable and user-friendly than Windows. My job is IT. I maintain and work on over 150 PCs and 50 Macs, all 2 years old or newer. I almost never go out to fix the Macs; 9/10 times it's a PC problem when something arises. Driver issues, viruses, people who don't know what they're doing: all of these things the Mac can handle for you. It can also be a complex OS; if you know what you're doing it can do anything you want. I am also a student. Some of the smartest computer science professors I know, including the department chair, use Macs. And I guarantee they know a little more about C++ than you do.

Personal preference, but in the notebook world I would never buy a PC, and in the desktop world I would never get a Mac. You can install any OS natively on any system with a little work, so it's more the hardware you should be concerned with when buying. The Retina screen, excellent touchpad, and great build on the MacBook Pro make it the clear winner in that department.

In the get-things-done power world of desktops, building your own is the only way to go.


----------



## Divineshadowx

Quote:


> Originally Posted by *pilla99*
> 
> Again, laptops are media consumption devices, not meant for any kind of gaming or serious work. That's why desktops exist. In my experience, anyone who owns a gaming laptop does so because they were too poor to afford both a desktop and a laptop or tablet.
> I haven't met anyone with a "gaming" laptop over the age of 24. I use 1440p every day at work and at home. Having the extra screen real estate for videos, browsers, etc. would be great; just because you aren't going to max Crysis on it doesn't mean it's useless. The MacBook Pro does decently well with games anyway; e.g., StarCraft II can play at native resolution and keep over 30 frames on modest settings.
> OS X is a great OS. In my experience (which is more than most everyday PC users'), OS X is more stable and user-friendly than Windows. My job is IT. I maintain and work on over 150 PCs and 50 Macs, all 2 years old or newer. I almost never go out to fix the Macs; 9/10 times it's a PC problem when something arises. Driver issues, viruses, people who don't know what they're doing: all of these things the Mac can handle for you. It can also be a complex OS; if you know what you're doing it can do anything you want. I am also a student. Some of the smartest computer science professors I know, including the department chair, use Macs. And I guarantee they know a little more about C++ than you do.
> Personal preference, but in the notebook world I would never buy a PC, and in the desktop world I would never get a Mac. You can install any OS natively on any system with a little work, so it's more the hardware you should be concerned with when buying. The Retina screen, excellent touchpad, and great build on the MacBook Pro make it the clear winner in that department.
> In the get-things-done power world of desktops, building your own is the only way to go.


I don't think anyone poor can afford a gaming notebook. Someone poor who wants to game will game on a console, not a PC. SC2 is a horrible game to judge GPUs with; my phone can run the graphical load of SC2, and it's the spam of units that makes even quad 690s lag. OS X is harder to use than Windows, and the icons are stupid; name a time in history when eagles carried mail. Tiny things can ruin big things. How can you buy a product from a company that advertises and sells Beats Audio and Bose? I really doubt they do it for the money. How can you buy a product from a company that sues others for copying their style? We are humans, we work together; as long as they don't make a laptop and put an apple on it, it's not copyrighted. iPhone: better phones exist. iPod: better devices exist. Mac Pro server: the most overpriced garbage ever. MacBook Air can't even run League on low. MacBook Pro: overpriced. I might seem like a hater, but seriously, gather up the iSheep here and give me one good reason why Apple is any better than any other decent company.


----------



## pilla99

Quote:


> Originally Posted by *Divineshadowx*
> 
> I don't think anyone poor can afford a gaming notebook. Someone poor who wants to game will game on a console, not a PC. SC2 is a horrible game to judge GPUs with; my phone can run the graphical load of SC2, and it's the spam of units that makes even quad 690s lag. OS X is harder to use than Windows, and the icons are stupid; name a time in history when eagles carried mail. Tiny things can ruin big things. How can you buy a product from a company that advertises and sells Beats Audio and Bose? I really doubt they do it for the money. How can you buy a product from a company that sues others for copying their style? We are humans, we work together; as long as they don't make a laptop and put an apple on it, it's not copyrighted. iPhone: better phones exist. iPod: better devices exist. Mac Pro server: the most overpriced garbage ever. MacBook Air can't even run League on low. MacBook Pro: overpriced. I might seem like a hater, but seriously, gather up the iSheep here and give me one good reason why Apple is any better than any other decent company.


- My point was that if you really want to, you can do some gaming on the MacBook Pro.

- When I was 17 years old I saved up $1200 and bought a 7805u gaming laptop. I did so because that was all I could afford.
The same goes for every other person I have met with one.

- OS X being harder or easier is an opinion; I'm not going to argue it anymore.

- Selling Beats Audio and Bose should be a consideration in buying a PC? I am actually confused about your point here. You can't find an Apple commercial that has a pair of Bose or Beats in it, so I'm not sure what you are talking about. You can, however, find whole laptops from companies like HP that are dedicated to things like "Beats", e.g.:
http://www.samsclub.com/sams/hp-pavilion-dm4-beats-edition-laptop-intel-core-i5-3210m-500gb-14-0/prod7190064.ip?refcd=GL05251200010028&pid=_CSE_Google_PLA_Office-Electronics&ci_src=17588969&ci_sku=sku7784065S

As for the phone, that is another personal opinion. I look for different things in different markets. In the desktop world I want raw power. In the tablet, phone, and laptop world I want ease of use and fluidity along with solid build quality. I don't intend to do serious work on devices like that, so my priorities change.

That's why I used an iPhone from '07 to '10 and have been using a Windows Phone for over a year and a half now. Sure, there might be a new Android phone with more GHz and RAM, but in the phone market, I could give a ****. I want fluidity and cohesion, not benchmarks, when it comes to phones.

This is obviously the wrong forum for any kind of pro-Apple argument, but I think they are a great company. Sure, overpriced, but lots of things in this world are. Mercedes are overpriced for what you get too, but they look nice, are solidly built, and hold value. Much like Apple products.


----------



## Divineshadowx

Quote:


> Originally Posted by *pilla99*
> 
> - My point was that if you really want to, you can do some gaming on the MacBook Pro.
> - When I was 17 years old I saved up $1200 and bought a 7805u gaming laptop. I did so because that was all I could afford.
> The same goes for every other person I have met with one.
> - OS X being harder or easier is an opinion; I'm not going to argue it anymore.
> - Selling Beats Audio and Bose should be a consideration in buying a PC? I am actually confused about your point here. You can't find an Apple commercial that has a pair of Bose or Beats in it, so I'm not sure what you are talking about. You can, however, find whole laptops from companies like HP that are dedicated to things like "Beats", e.g.:
> http://www.samsclub.com/sams/hp-pavilion-dm4-beats-edition-laptop-intel-core-i5-3210m-500gb-14-0/prod7190064.ip?refcd=GL05251200010028&pid=_CSE_Google_PLA_Office-Electronics&ci_src=17588969&ci_sku=sku7784065S
> As for the phone, that is another personal opinion. I look for different things in different markets. In the desktop world I want raw power. In the tablet, phone, and laptop world I want ease of use and fluidity along with solid build quality. I don't intend to do serious work on devices like that, so my priorities change.
> That's why I used an iPhone from '07 to '10 and have been using a Windows Phone for over a year and a half now. Sure, there might be a new Android phone with more GHz and RAM, but in the phone market, I could give a ****. I want fluidity and cohesion, not benchmarks, when it comes to phones.
> This is obviously the wrong forum for any kind of pro-Apple argument, but I think they are a great company. Sure, overpriced, but lots of things in this world are. Mercedes are overpriced for what you get too, but they look nice, are solidly built, and hold value. Much like Apple products.


I've had 2 iPhones, an iHome, and a bunch of other Apple crap. Until I got techie I figured it was the best. I'm not saying what Apple makes is low-end; I'm saying you can get more for your dollar. My target audience is those who buy everything Apple, literally everything. I don't like what Apple does as a company; it seems to me they are trying to take over. Their marketing is absolute bullcrap. Tim Cook? The way they talk pisses me off. But I guess you can't judge their engineers by their stupid spokespeople. They don't actually make anything, though: Intel, Nvidia, and other companies do, so calling it Apple seems ignorant. And they have Bose and Beats all over their stores.


----------



## Arizonian

You'll get more input discussing your laptop topics (good info, BTW) over at Laptops and Netbooks.

Questions being asked about the GTX 690 could get buried, but I'm not too concerned because this isn't a fast-moving thread.

I digress.
Quote:


> Originally Posted by *Shakal*
> 
> Sorry guys, I didn't go over the whole thread, but is it worth water-cooling it? And how easy is it to install the waterblock? The card looks so nice and fancy. Right now I have a 7970 with a Komodo waterblock, so if I got the waterblock for the 690 it would fit right into my loop without me having to mess with it. What do you recommend, and how much further will it go under water versus on air? I am really considering this card.


I wish I could answer that myself, but I haven't taken the plunge.

Keep in mind the GTX 690 does have its overclocking limits, being a dual GPU, compared to singles in SLI. Mine maxes on air at 1176 MHz & 1202 MHz core, 3200 MHz memory. Temps go into the low-to-mid 70s max. I had to back off a tad on the OC, and it now seems to hover in the high 60s more often. Personally I don't feel that putting it under water would improve the OC by much, if at all. However, I'm on a single 120 Hz 1080p monitor, so it's not running multiple monitors.

I'm not sure what you're currently running, and it sometimes helps those giving advice to know what the questioner has for a system. There are a lot of members in the club who are under water with their GTX 690, and I'm sure they will chime in.

How to put your Rig in your Sig


----------



## Shogon

Quote:


> Originally Posted by *Shakal*
> 
> Sorry guys, I didn't go over the whole thread, but is it worth water-cooling it? And how easy is it to install the waterblock? The card looks so nice and fancy. Right now I have a 7970 with a Komodo waterblock, so if I got the waterblock for the 690 it would fit right into my loop without me having to mess with it. What do you recommend, and how much further will it go under water versus on air? I am really considering this card.


It is surely worth water-cooling a 690; the only downside is removing the stock heatsink and its sheer beauty.

Mine with an XSPC Razor block gets to 1185 MHz on the core and 3317 on the memory. Though I was able to achieve those clocks on air, it was in no way possible 24/7 without overheating. Now, instead of high 80s and restarts, I'm around 50C and lower; with the latest temp drop, max load has been 44C in BF3.


----------



## V3teran

Quote:


> Originally Posted by *Shakal*
> 
> Sorry guys, I didn't go over the whole thread, but is it worth water-cooling it? And how easy is it to install the waterblock? The card looks so nice and fancy. Right now I have a 7970 with a Komodo waterblock, so if I got the waterblock for the 690 it would fit right into my loop without me having to mess with it. What do you recommend, and how much further will it go under water versus on air? I am really considering this card.


I posted this earlier; I'll post it again to save myself typing it out.....

I don't think the 690 is worth water-cooling unless there's a volt mod, because people who water-cool them are generally only getting an increase of around 30 MHz on the core compared to people on air. Even though the GPU stays very cool at full load, it needs more power put through it to stay stable once you go over a 30 MHz increase on the core. That's what put me off water-cooling mine: all the hassle of setting it up for a small increase... no thanks.

That's my opinion.


----------



## thestache

Should have stayed with my buggy GTX 690... These MSI Lightning HD 7970s are the worst cards I've ever bought.


----------



## rationalthinking

Quote:


> Originally Posted by *thestache*
> 
> Should have stayed with my buggy GTX 690... These MSI Lightning HD 7970s are the worst cards I've ever bought.


I'm guessing driver issues? That was the exact reason I stopped purchasing AMD cards. I have one in my home server, but that is it.


----------



## Divineshadowx

Quote:


> Originally Posted by *thestache*
> 
> Should have stayed with my buggy GTX 690... These MSI Lightning HD 7970s are the worst cards I've ever bought.




Lol, what's wrong with them?


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Lol whats wrong with them?


It has taken me a good 8 hours of installing and uninstalling the cards separately, using Driver Sweeper, CCleaner, etc., just to get them to load up and play BF3 properly, but now I get horrible screen tearing (worse than the GTX 690) even though I'm using DisplayPort on all my monitors, and as soon as I OC the cards even 10 MHz they crash.

AMD drivers doing what they do best, and that's making me want to throw my computer out the window. Not to mention I suspect a bad card, given the OCing problems.


----------



## rationalthinking

Quote:


> Originally Posted by *thestache*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Divineshadowx*
> 
> Lol whats wrong with them?
> 
> 
> 
> It has taken me a good 8 hours of installing and uninstalling the cards separately, using Driver Sweeper, CCleaner, etc., just to get them to load up and play BF3 properly, but now I get horrible screen tearing (worse than the GTX 690) even though I'm using DisplayPort on all my monitors, and as soon as I OC the cards even 10 MHz they crash.
> 
> AMD drivers doing what they do best, and that's making me want to throw my computer out the window. Not to mention I suspect a bad card, given the OCing problems.

Grass isn't always greener.


----------



## Divineshadowx

Quote:


> Originally Posted by *thestache*
> 
> It has taken me a good 8 hours of installing and uninstalling the cards separately, using Driver Sweeper, CCleaner, etc., just to get them to load up and play BF3 properly, but now I get horrible screen tearing (worse than the GTX 690) even though I'm using DisplayPort on all my monitors, and as soon as I OC the cards even 10 MHz they crash.
> AMD drivers doing what they do best, and that's making me want to throw my computer out the window. Not to mention I suspect a bad card, given the OCing problems.


First of all, how would using DP reduce tearing at all? I'm using a DL-DVI cable and I get zero tearing. If that's why you switched, then wow... DP 1.2's bandwidth exceeds what 1440p needs by like threefold. And yeah, AMD drivers blow; I had to go into safe mode and uninstall the drivers about 5 times, and then one day they fixed themselves.


----------



## Buzzkill

Please add me to the 690 club. I have two EVGA 690s. I'm waiting for HEATKILLER® GPU-X³ GTX 690 "Hole Edition" Ni-Bl waterblocks to become available so I can add them to the loop. Watercool said two weeks to availability, but that was three weeks ago.



DSCF2800.JPG 732k .JPG file


----------



## azaroth

This has got to be one of the best-looking cards I've seen. Had I the money, I'd totally sport one in my rig. Some day I will, when the **** sucking for money pays off. Gotta love having your own business.


----------



## Buzzkill

Quote:


> Originally Posted by *thestache*
> 
> It has taken me a good 8 hours of installing and uninstalling the cards separately, using Driver Sweeper, CCleaner, etc., just to get them to load up and play BF3 properly, but now I get horrible screen tearing (worse than the GTX 690) even though I'm using DisplayPort on all my monitors, and as soon as I OC the cards even 10 MHz they crash.
> AMD drivers doing what they do best, and that's making me want to throw my computer out the window. Not to mention I suspect a bad card, given the OCing problems.


You're going to have to do a complete Windows reinstall to get the drivers working properly when switching from Nvidia to AMD. Even though you uninstall, there are still remnants of the Nvidia drivers on your hard drive. I got my 690 and just put it in, and I got the red screen of death when I took out my 5870. Make sure to do a full reformat of your hard drive before reinstalling.


----------



## Divineshadowx

Quote:


> Originally Posted by *Buzzkill*
> 
> You're going to have to do a complete Windows reinstall to get the drivers working properly when switching from Nvidia to AMD. Even though you uninstall, there are still remnants of the Nvidia drivers on your hard drive. I got my 690 and just put it in, and I got the red screen of death when I took out my 5870. Make sure to do a full reformat of your hard drive before reinstalling.


That might work, but I don't think it's required. How would test benches work if you had to install Windows and everything else again every time you switched a GPU? I think a Windows re-installation would be sufficient.


----------



## thestache

Quote:


> Originally Posted by *Buzzkill*
> 
> You're going to have to do a complete Windows reinstall to get the drivers working properly when switching from Nvidia to AMD. Even though you uninstall, there are still remnants of the Nvidia drivers on your hard drive. I got my 690 and just put it in, and I got the red screen of death when I took out my 5870. Make sure to do a full reformat of your hard drive before reinstalling.


Brand new SSD and a fresh copy of Windows.

Quote:


> Originally Posted by *Divineshadowx*
> 
> First of all, how would using dp reduce tearing at all. I'm using a dl-dvi cable and I get 0 tearing. If that's why you switched then wow.. dp 1.2 bandwidth exceeds 1440p res by like 3fold. And ya amd drivers blow, I had to go into safe mode and uninstall the drivers about 5 times, and then one day they fixed themselves.


DP has more bandwidth than DVI, so that was supposed to reduce tearing. It doesn't though, at all. So I don't know what people are talking about saying DP is amazing, because it's made no difference for me.

Drivers seem to be working now, but I just get bad performance: I can't OC the cards even 10 MHz or they crash, MSI Afterburner has so many issues on 2.2.3 and 2.2.4 it's not even funny, and bezel correction doesn't work properly. Honestly kicking myself right now. Shouldn't have listened to people and should have stayed with my original route of GTX 680 SLI.


----------



## Buzzkill

Sorry that didn't help you out. As the drivers mature it will work better; about the time they get the drivers right, a new model comes out and it all starts over. My 7770 wouldn't wake up after it went to sleep, but with 12.8 that was fixed. Having a 690 or two, you should be good for a while, or you can drop it in a different rig and buy the next best thing in the future.


----------



## ceteris

Quote:


> Originally Posted by *Buzzkill*
> 
> Please add me to the 690 club. I have two EVGA 690s. I'm waiting for HEATKILLER® GPU-X³ GTX 690 "Hole Edition" Ni-Bl waterblocks to become available so I can add them to the loop. Watercool said two weeks to availability, but that was three weeks ago.
> 
> 
> DSCF2800.JPG 732k .JPG file


Haha, I've been waiting forever for that block too! But at this point I've been running on air for so long that I don't care anymore. I'm thinking of even upgrading to the Mars III when it comes out and just upgrading my CPU cooler from an H80 to an H100. For now I'm looking into the new ASUS VG series VG278HE 27-inch LED-lit monitor and seeing what I can squeeze out of the 690. Albeit only 1080p, it will be a big upgrade from my old 23" college LCD.


----------



## Divineshadowx

As the drivers mature? The 7970 has been out for almost a year; if they haven't "matured" by now, I don't know what to say. DP has more bandwidth so it reduces tearing? What? A 1440p screen requires about 7.87 Gbit/s of bandwidth; DP 1.2 has 21.6 Gbit/s and dual-link DVI has 9.9 Gbit/s, more than enough for 1440p, with more than a gigabit to spare for 1600p. And what exactly does tearing have to do with bandwidth anyway? Why on earth would anyone sell a 690 and get 7970s when a 680 is basically the same cost and performs better? The only reason I would ever get rid of my 690 is for world-record benchmarking, with an LN2-cooled 7 GHz CPU and quad 680s for raw power; other than that, get 4GB 680 SLI/tri-SLI to run triple 1440/1600p, or maybe 670s if you're short on money. Who told you to go that route, one of the iSheep probably?
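Those figures can be sanity-checked with a quick back-of-envelope calculation. A sketch: the CVT-RB blanking totals (2720x1481) are an assumption about the monitor's timings, and published "required bandwidth" figures like 7.87 Gbit/s vary depending on whether they include blanking and 8b/10b encoding overhead:

```python
# Back-of-envelope check of the display bandwidth numbers above.

def video_bandwidth_gbps(h, v, hz, bpp=24):
    """Bandwidth in Gbit/s for an uncompressed video signal."""
    return h * v * hz * bpp / 1e9

# 2560x1440 @ 60 Hz, 24-bit colour: active pixels only.
raw = video_bandwidth_gbps(2560, 1440, 60)

# Including CVT-RB blanking (total raster ~2720x1481, ~241.7 MHz pixel clock).
with_blanking = video_bandwidth_gbps(2720, 1481, 60)

print(f"raw payload:   {raw:.2f} Gbit/s")            # ~5.31
print(f"with blanking: {with_blanking:.2f} Gbit/s")  # ~5.80

# Link capacities for comparison: DisplayPort 1.2 HBR2 has a 21.6 Gbit/s
# link rate; dual-link DVI carries ~7.9 Gbit/s of pixel data (~9.9 Gbit/s
# on the wire including TMDS 8b/10b encoding). Either way, capacity
# comfortably exceeds the 1440p requirement, which is why extra bandwidth
# has nothing to do with tearing: tearing is a buffer-swap/V-sync issue,
# not a link-rate one.
```

Either link therefore has headroom at 1440p; the 7.87 Gbit/s figure quoted above most plausibly corresponds to a blanking- and encoding-inclusive requirement.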


----------



## Buzzkill

The copper versions, HEATKILLER® GPU-X³ GTX 690 "Hole Edition" and HEATKILLER® GPU-X³ GTX 690 LT, have been available since the end of August, with the nickel version said to be two weeks after, but it isn't out yet. I got a second GTX 690 while waiting for Watercool. I almost got an Aquacomputer AquagraFX GTX 690 G1/4, but they said there's no nickel-plated version like they have for the GTX 680. The HEATKILLER® GPU-X³ GTX 690 "Hole Edition" Ni-Bl will be $20 or more extra for each one, but I like the silver and black. It just looks better.
I will have to wait for a USA retailer to get them in, but I've got Performance-PCs trying to get some when Watercool makes these available to the public. As for drivers, they get better with time. More FPS is always better, and the newest drivers support old hardware too. I don't know how long you have been building your own computers, or if you buy pre-built, but when you get the newest hardware, drivers and BIOSes are never perfect at launch, and a year later you would think progress would be made. 306.02 Beta was not a good driver for me; it had many issues. 306.23 is better, but I still would like a new version. Everyone has their own opinion, so take it all with caution. Same with saying the 7970 is better than the 690 or 680: *I* don't think so, but spend your money on whatever you want. If you don't own it, you really don't know how good it will be in your system.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> As drivers mature? The 7970 has been out for almost a year; if they haven't "matured" by now, then I don't know what to say. DP has more bandwidth so it reduces tearing? A 1440p screen requires about 7.87 Gbit/s of bandwidth, DP 1.2 has 21.6 Gbit/s, and dual-link DVI has 9.9 Gbit/s, more than enough for 1440p, with room to spare for 1600p. And what exactly does tearing have to do with bandwidth anyway? Why on earth would anyone sell a 690 and get 7970s when a 680 is basically the same cost and performs better? The only reason I would ever get rid of my 690 is for world-record benchmarking with LN2 cooling, a 7GHz CPU and quad 680s for raw power. Other than that, get 4GB 680 SLI/tri-SLI to run triple 1440p/1600p, or maybe 670s if you were short on money. Who told you to go that route, one of the iSheep probably?


Well either way, MSI Lightning HD 7970 Crossfire with DisplayPort cables was supposed to be the best setup for triple monitors, and the cards were a huge bust. Swapped them for a PowerColor Devil 13 HD 7990 and it is doing better (it actually overclocks, unlike the Lightnings) but still has awful screen tearing and laggy performance.

Have no idea how to get rid of it.

Think I might just give up on triple monitors. The idea is amazing but nothing seems to work properly.


----------



## V3teran

Personally I would never buy an AMD/ATI card, as I've heard far too many horror stories regarding driver issues. Some people get problems with them and some people don't; the problems seem to crop up once you have more than one card. With one card you're fine. I've always stuck with Nvidia since the 3dfx Voodoo days and long before that. If it's not broke, why fix it!


----------



## Jessekin32

Anyone having issues with Surround and their 690 with Borderlands 2? I'm running 6050x1080 and I'm getting only 30fps with mild to high settings. This seems a little nuts for a 690 :/


----------



## thestache

Quote:


> Originally Posted by *Jessekin32*
> 
> Anyone having issues with Surround and their 690 with Borderlands 2? I'm running 6050x1080 and I'm getting only 30fps with mild to high settings. This seems a little nuts for a 690 :/


Check your VRAM mate.

I always had problems with my GTX 690 and surround so I sold it.


----------



## Arizonian

Quote:


> Originally Posted by *V3teran*
> 
> Personally I would never buy an AMD/ATI card, as I've heard far too many horror stories regarding driver issues. Some people get problems with them and some people don't; the problems seem to crop up once you have more than one card. With one card you're fine. I've always stuck with Nvidia since the 3dfx Voodoo days and long before that. If it's not broke, why fix it!


I was the opposite. After having all three 3DFX Voodoo cards







I ended up switching to AMD. It wasn't until the GTX 580 came out that I gave Nvidia and 3D Vision a shot.

Can't say anyone is perfect, but in my experience the driver track record seems to favor Nvidia. With a little knowledge, though, those driver issues can be remedied most of the time.

Nvidia feels like a smoother ride at a higher cost due to R&D, whereas AMD's pricing (when it finally has competition) is hard to beat for the same performance, and that's what makes them attractive.

Quote:


> Originally Posted by *Buzzkill*
> 
> Please add me to the 690 club. I have 2 EVGA 690s. Waiting for HEATKILLER® GPU-X³ GTX 680 "Hole Edition" Ni-Bl waterblocks to be available to add them to the loop. Watercool said two weeks for availability, but that was three weeks ago.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> DSCF2800.JPG 732k .JPG file


Nice rig.







The 690s speak for themselves.









PS - our OP has not been online for some time, and there are a lot of members who've not had their names listed. I'm sure he will be back soon. We may have to PM him to get the list updated.


----------



## Divineshadowx

Quote:


> Originally Posted by *thestache*
> 
> Well either way, MSI Lightning HD 7970 Crossfire with DisplayPort cables was supposed to be the best setup for triple monitors, and the cards were a huge bust. Swapped them for a PowerColor Devil 13 HD 7990 and it is doing better (it actually overclocks, unlike the Lightnings) but still has awful screen tearing and laggy performance.
> Have no idea how to get rid of it.
> Think I might just give up on triple monitors. The idea is amazing but nothing seems to work properly.


Why not just try 680 tri-SLI? I think two of them might struggle. And for PhysX games you can just dedicate the third to it and still run two 680s. That should be enough until 4K and Maxwell.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Why not just try 680 tri-SLI? I think two of them might struggle. And for PhysX games you can just dedicate the third to it and still run two 680s. That should be enough until 4K and Maxwell.


Yeah I've pretty much had it with AMD.

Taking the Devil 13 back Monday and getting GTX 680 4GB SLI until next gen. I would have kept and enjoyed the Lightnings if they overclocked more than 30MHz and MSI Afterburner actually worked.


----------



## Buzzkill

Quote:


> Originally Posted by *thestache*
> 
> Yeah I've pretty much had it with AMD.
> Taking the Devil 13 back monday and getting GTX 680 4GB SLI until next gen. I would have kept and enjoyed the Lightnings if they overclocked more than 30mhz and MSI Afterburner actually worked.


If you have the PowerColor Devil 13, post a picture of it. Or you got MSI Lightnings? Why don't you have system specs in your profile, or pictures, or a CPU-Z validation, or anything at all? You can say you have anything, but without pictures it's just talk. And if you say you don't have a camera or a picture phone to take a picture, run CPU-Z and post the validation. It's the Internet, and only the truth gets posted on the Internet. Or maybe it's the opposite and everyone lies to act cool.

How to: Enter your system information.
http://www.overclock.net/t/511845/how-to-enter-your-system-information


----------



## Cheesemaster

Quote:


> Originally Posted by *Jessekin32*
> 
> Anyone having issues with Surround and their 690 with Borderlands 2? I'm running 6050x1080 and I'm getting only 30fps with mild to high settings. This seems a little nuts for a 690 :/


I have two GTX 690s and a triple-monitor setup with Acer 3D monitors. I have every single graphics option maxed with high PhysX and get over 80 FPS average.


----------



## thestache

Quote:


> Originally Posted by *Buzzkill*
> If you have the PowerColor Devil 13, post a picture of it. Or you got MSI Lightnings? Why don't you have system specs in your profile, or pictures, or a CPU-Z validation, or anything at all? You can say you have anything, but without pictures it's just talk. And if you say you don't have a camera or a picture phone to take a picture, run CPU-Z and post the validation. It's the Internet, and only the truth gets posted on the Internet. Or maybe it's the opposite and everyone lies to act cool.
> How to: Enter your system information.
> http://www.overclock.net/t/511845/how-to-enter-your-system-information


Lol. Are you serious?

Everyone in this thread knows me and my setup so I don't need to post it, and I'm pretty sure going from a GTX 690 to Lightning HD 7970 Crossfire to a Devil 13 in a matter of a week because you can't get triple monitors to work without terrible screen tearing is hardly something to make up or brag about. But whatever, here's your proof...

My set-up.


Both my GTX 690s when I got them in my old Thermaltake Level 10 set-up. (Quad SLI benchmarking.)


The GTX 690 I kept and put in my rig (benchmarking before my case came). Used the other one in a rig for my little brother.


The rig I built my brother. I have since copied it because I like it so much.


My Lightning crossfire.


My Devil 13.


My girlfriend holding a sign with my user name in front of my set-up.


Can we continue now?


----------



## Divineshadowx

Quote:


> Originally Posted by *thestache*
> 
> Lol. Are you serious?
> Everyone in this thread knows me and my setup so I don't need to post it, and I'm pretty sure going from a GTX 690 to Lightning HD 7970 Crossfire to a Devil 13 in a matter of a week because you can't get triple monitors to work without terrible screen tearing is hardly something to make up or brag about. But whatever, here's your proof...
> My set-up.
> 
> Both my GTX 690s when I got them in my old Thermaltake Level 10 set-up. (Quad SLI benchmarking.)
> 
> The GTX 690 I kept and put in my rig (benchmarking before my case came). Used the other one in a rig for my little brother.
> 
> The rig I built my brother. I have since copied it because I like it so much.
> 
> My Lightning crossfire.
> 
> My Devil 13.
> 
> My girlfriend holding a sign with my user name in front of my set-up.
> 
> Can we continue now?


LOL, funniest post I've ever seen. You just made my day, ty.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> LOL, funniest post i've ever seen. You just made my day ty.


Lol. No problem.


----------



## KaRLiToS

It's funny to see your girlfriend holding a paper with "thestache" written on it.

Nice girlfriend...I mean setup...or both?


----------



## SimpleTech

Quote:


> Originally Posted by *KaRLiToS*
> 
> Nice girlfriend...I mean setup...or both?


We need more pictures of the former.


----------



## EDGERRIES

Quote:


> Originally Posted by *thestache*
> 
> Lol. No problem.


Hahahahaha, mine too! Can't believe that accusation!


----------



## EDGERRIES

Quote:


> Originally Posted by *Divineshadowx*
> 
> LOL, funniest post i've ever seen. You just made my day ty.


Quote:


> Originally Posted by *thestache*
> 
> Lol. No problem.


Sorry for the double post, was laughing too hard and wasn't concentrating!







Hope "thestache" sorts out his surround problem though, nothing more frustrating.


----------



## dboythagr8

Is my GTX 690 dead?

Was playing Borderlands 2. I exited and stuff seemed weird. Reset the machine, went back in, and it was fine up until I left BL2 for good some time later. Before I left I noticed my GPU clocks were not right; it wasn't boosting correctly. Left and went into the menu to turn on SLI, the screen went black and the picture never came back. Turned the machine off. Turned it back on and my motherboard was stuck on the Asus screen.

Then I tried moving it to a different PCIe slot and I could get a picture, but EVGA Precision had a popup that said unidentified GPU or something, and it would not display at my native res of 2560x1600. Put in my backup 580 SLI setup and they worked fine. Tried the 690 again and the same thing happened as before. The GeForce logo still lights up, and when I went into GPU-Z some of the 690's specs showed, but not all. So what's going on here?


----------



## marcmartyn

Hi.

I want to join the club.

This is my card: EVGA GTX 690 Hydro Copper


----------



## Buzzkill

Quote:


> Originally Posted by *thestache*
> 
> Lol. Are you serious?
> Everyone in this thread knows me and my setup so I don't need to post it, and I'm pretty sure going from a GTX 690 to Lightning HD 7970 Crossfire to a Devil 13 in a matter of a week because you can't get triple monitors to work without terrible screen tearing is hardly something to make up or brag about. But whatever, here's your proof...
> My set-up.
> 
> Both my GTX 690s when I got them in my old Thermaltake Level 10 set-up. (Quad SLI benchmarking.)
> 
> The GTX 690 I kept and put in my rig (benchmarking before my case came). Used the other one in a rig for my little brother.
> 
> The rig I built my brother. I have since copied it because I like it so much.
> 
> My Lightning crossfire.
> 
> My Devil 13.
> 
> My girlfriend holding a sign with my user name in front of my set-up.
> 
> Can we continue now?


I see how you have over 400 posts and talk badly about quad SLI, but no system specs to see what your hardware really is. When something doesn't work for you, is it your inability to use the hardware, or is it the hardware? Everyone has different systems. I don't know if you are A+ certified or self-taught, but people want to learn from someone who is qualified to teach, not just opinions. And because quad SLI didn't float your boat, it's crap, right? You have been here 6 months, so I am sorry I didn't know your rig, but I have been reading posts here for 6 years. It states in the rules to list system specs; maybe reread them, because you missed that part, and there are some other parts you missed that I am not going to list here or now.

P.S. You have the best hardware but always have problems. I wonder why?? Still can't figure out how to post system specifications and keep them current?? If you sell something or return it, you don't have it anymore.

Maybe this would help you.
The Ultimate Newbie Guide And General Information Thread

Quote: franz

"Please post your system specs." This is important to OCN members, because we use that information to help diagnose problems. It becomes even more important if you are posting overclocking-related questions.


----------



## Divineshadowx

Quote:


> Originally Posted by *Buzzkill*
> 
> I see how you have over 400 posts and talk badly about quad SLI, but no system specs to see what your hardware really is. When something doesn't work for you, is it your inability to use the hardware, or is it the hardware? Everyone has different systems. I don't know if you are A+ certified or self-taught, but people want to learn from someone who is qualified to teach, not just opinions. And because quad SLI didn't float your boat, it's crap, right? You have been here 6 months, so I am sorry I didn't know your rig, but I have been reading posts here for 6 years. It states in the rules to list system specs; maybe reread them, because you missed that part, and there are some other parts I am not going to list.
> P.S. You have the best hardware but always have problems. I wonder why?? Still can't figure out how to post system specifications??


He has problems because no hardware or software ever created was perfect; every single one of them had some type of problem. This is the 690 club; we are the extreme of the extreme of gaming, and guess what, problems arise when dealing with top-notch systems because they sit at the edge of the technology. It's the first time in history that a card like the 690 has been created, so even Nvidia can't foresee all the problems. Guess what, quad SLI 690 sucks. I would know; deal with it. There is absolutely no reason to get quad 690 SLI over 4GB 680s. A single 690 is the perfect card for all current resolutions. Qualified? Since you're so obsessed with posting specs, post a picture of your Ph.D. in computer engineering and your status as an Engineering Manager at Nvidia.


----------



## Scooby-Snack

Off subject from Girlfriend holding a sign









Should I use my Hydro Copper GTX 580 alongside my GTX 690? (I have never used a dedicated PhysX card before and am a little unsure of the benefits.)

I'm running 3 S23A700D's-

I figured there's no sense losing my ass selling the GTX 580 if keeping it will provide a noticeable benefit.

Thanks for any advice-

Scoob

PS.

I'm wondering if I should post my Girlfriend holding a sign in front of my TJ11... She is pretty hot hahahaha


----------



## Buzzkill

The GTX 590 was a dual-GPU card.

The 6990 was a dual-GPU card also.
Use Google to look up more dual-GPU models, because I am not listing them all. If you really think this is the first and only one, then you need to do some reading.

I know you need a Ph.D. to follow the rules. Everything I asked for is simply the forum rules.

I don't post that something is crap because I don't know how to use it. You do it because you want to. It will never be perfect and something better is always in the works. Just because you can't use all the graphics power today doesn't mean you won't need it tomorrow. Go back and look through this thread and see how many people have their system specifications listed. If you're giving advice, you should have them. I know, it's totally unreasonable to ask for factual info.


----------



## Divineshadowx

Quote:


> Originally Posted by *Buzzkill*
> 
> The GTX 590 was a dual-GPU card.
> The 6990 was a dual-GPU card also.
> Use Google to look up more dual-GPU models, because I am not listing them all. If you really think this is the first and only one, then you need to do some reading.
> I know you need a Ph.D. to follow the rules. Everything I asked for is simply the forum rules.
> I don't post that something is crap because I don't know how to use it. You do it because you want to. It will never be perfect and something better is always in the works. Just because you can't use all the graphics power today doesn't mean you won't need it tomorrow. Go back and look through this thread and see how many people have their system specifications listed. If you're giving advice, you should have them. I know, it's totally unreasonable to ask for factual info.


Seriously, what are you talking about? Everyone in this forum knows that there is no point in 690 quad SLI. There are benchmarks that utilize the power of quad SLI, not all of it but most. The problem is the VRAM; at anything above 1600p you are limited. If you are benchmarking, get quad 680s: more VRAM, and they OC higher, leading to better results. There is no need for me to own everything I talk about, because I read what others say about it and watch videos. And as for thestache, he just posted pics of everything he had, so stop with the useless rhetoric.


----------



## Buzzkill

Once I get my Overlord OC Pixel Perfect 1440p monitor, then I will see if there is a VRAM issue. Pre-ordered one on 9-12-12, the first day pre-orders were offered. With 8GB total VRAM and new drivers, I don't think it will be a big issue. You can always adjust settings if VRAM is topping 2GB per GPU. Driver updates can solve issues, but you have to wait for them to be released. I hope Lucid Logix fixes the MVP software so the Ivy Bridge GPU can be used to give a boost to gameplay. This is software related, so if it happens in one game, it is not going to happen in every game out there. This is the 690 owners club, right? Looks like people playing Borderlands 2 could help with quad SLI. I am not going back and forth anymore, because it will not fix a thing, and people are trying to use the forum for info and advice. Thank you.

I asked for system specifications because he seems to have a problem; he has had many video cards but still has problems, so maybe it is not the video cards but something else. But it must be the video cards, so spend more money buying video cards; Nvidia is not going to complain.


----------



## Divineshadowx

Let me educate you. 4GB + 4GB is 8GB, but guess what, you have an effective 2GB of VRAM. The 690 has two GK104 chips in SLI, each with 2GB of VRAM. When you add two more, you get four, hence the term quad SLI, but SLI doesn't combine the effective VRAM, so you still have 2GB.
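The mirrored-VRAM point above can be sketched in a few lines. This is a simplification of how alternate-frame-rendering SLI behaves, and `sli_vram` is an illustrative helper, not a real NVIDIA API:

```python
# In AFR SLI, each GPU keeps its own full copy of textures and buffers,
# so adding GPUs adds "total" VRAM on the spec sheet, but the usable
# framebuffer for a game stays at one GPU's worth.
def sli_vram(per_gpu_mb, num_gpus):
    total = per_gpu_mb * num_gpus   # what the box quotes
    effective = per_gpu_mb          # what a game can actually address
    return total, effective

# Two GTX 690s = four GK104 GPUs with 2 GB each:
total, effective = sli_vram(2048, 4)
print(total, effective)  # 8192 2048
```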

Rudeness removed


----------



## Ruby Rabbit

At last I have ordered my water cooling gear, inclusive of a Koolance GTX 690 water block. However, I still have the two 680s I purchased before the 690, so I was wondering if it would be better to water cool the 680s over the 690? Which would give the better OC performance on water? Any thoughts?


----------



## TFL Replica

Thread cleaned. Now be nice to each other!


----------



## Jessekin32

Anyone with any news or other issues with Borderlands 2 and their GTX 690? I get around 30-39 FPS with about 50-58% usage on each GPU while only using half my memory. What gives? D:
Quote:


> Originally Posted by *thestache*
> 
> Check your VRAM mate.
> I always had problems with my GTX 690 and surround so I sold it.


VRAM is at around 1085MB on each GPU, so it's not hitting a VRAM limit. The GPUs are NOT running at 100% either. Also, whenever I get headshots or "crits" the game lags or freezes for about 3/4 of a second. As a sniper, this drives me NUTS. I hope a driver update fixes this.


----------



## DinaAngel

Anyone got to 1300MHz, as Nvidia stated?
My card doesn't want to go further than +140 on the core and +200 on the memory, which means 1200MHz on the core, and I've got no idea what the memory ends up at.


----------



## thestache

Quote:


> Originally Posted by *EDGERRIES*
> 
> Sorry for the double post, was laughing too hard and wasn't concentrating!
> 
> 
> 
> 
> 
> 
> 
> Hope "thestache" sorts out his surround problem though, nothing more frustrating.


Thanks. Thinking price-wise, and from my experiences so far, I'll go with 4GB GTX 680 SLI and try that, or MSI Lightning HD 7970 Crossfire again and hope I get cards that aren't lemons, because they would have been good if they worked.

Quote:


> Originally Posted by *Buzzkill*
> 
> I see how you have over 400 posts and talk badly about quad SLI, but no system specs to see what your hardware really is. When something doesn't work for you, is it your inability to use the hardware, or is it the hardware? Everyone has different systems. I don't know if you are A+ certified or self-taught, but people want to learn from someone who is qualified to teach, not just opinions. And because quad SLI didn't float your boat, it's crap, right? You have been here 6 months, so I am sorry I didn't know your rig, but I have been reading posts here for 6 years. It states in the rules to list system specs; maybe reread them, because you missed that part, and there are some other parts you missed that I am not going to list here or now.
> P.S. You have the best hardware but always have problems. I wonder why?? Still can't figure out how to post system specifications and keep them current?? If you sell something or return it, you don't have it anymore.
> Maybe this would help you.
> The Ultimate Newbie Guide And General Information Thread
> Quote: franz
> "Please post your system specs." This is important to OCN members, because we use that information to help diagnose problems. It becomes even more important if you are posting overclocking related question


GTX 690 Quad SLI sucks. Period.

I have benchmarked it and it was useless. Proof in the top 30 3DMark 11 score thread; out of the whole forum I have a top 10 score: http://www.overclock.net/t/872945/top-30-3d-mark-11-scores-using-performance-settings

There is no point to it. It doesn't have enough VRAM for surround at 1080p, and 1600p screens are easily handled by a single GTX 690, so that rules out everything. GTX 680 4GB SLI is needed for anything over 1600p, like Divineshadowx has said, because he actually knows what he's talking about from past experience. Unlike yourself, who seems to think the card magically has 8GB total VRAM when it has 4GB: 2GB for each GTX 680, because the GTX 680 reference design comes with 2GB. Nvidia never designed a 4GB GTX 680; that was up to the companies that make the cards to do in their own time, and the GTX 690 is a total reference design that is not allowed to be altered in any way. Trust me, if it were, we would have 8GB versions, and you can bet I'd have bought that one instead and we wouldn't be having this conversation.

I don't have my system specs posted because my computer isn't finished yet, and in every thread I constantly post in, users know my build and don't need a constant reminder of it. But since you care for them so much, my system specs as they are right now:

x3 Dell U2412Ms in portrait running 3860x1920P

Intel i7 3820 at 4700MHz (will buy a 3930K once I figure this GPU thing out; not going to buy one and re-mount the CPU until I watercool it, too much effort)
ASUS X79 Sabertooth
Corsair Vengeance RAM 8GB 1600mhz (have Dominator Platinum 4x4GB 1866mhz on order)
Powercolor Devil 13 HD 7990
ASUS Xonar Phoebus
Corsair Force GS 240GB SSD
Corsair Force 3 60GB SSD
WD Velociraptor 360GB HDD
Silverstone Strider Gold 1000w PSU

Cougar Vortex PWM Fans x3 120mm
Cougar Vortex fans x2 120mm and x2 140mm
NZXT Sentry Mesh Fan controller
Koolance 280mm Radiator
Koolance 240mm Radiator
(will buy the rest of my watercooling and set it up once I have this GPU thing figured out)
ModSmart braided cables for everything except GPU. (will get them when I figure this GPU thing out)

Razer BlackWidow BF3 keyboard
Razer Mamba 2012 mouse
Razer IronClad surface
Beats by Dre Detox headphones

Since when do I have problems with my computer? The only problem I ever recall having is with my GPUs, because they don't perform as they should. Last time I ran 3DMark 11 I got a healthy score of 16850 on performance settings, and I never have issues with my system. The only issues I had were some screen tearing, Flash videos not working in browsers (but that was a new 306.02 driver issue and it didn't bother me) and the cards' VRAM limitation. Here are the last few cards I can remember and what issues I had with them:

Palit GTX 295 - great card, no issues
PowerColor HD 6990 - micro stutter and driver trouble, because of a major design flaw in the card, and AMD drivers are very poor
Sapphire HD 6970 - good card, no problems other than bad AMD drivers, though tri-Crossfire did help with micro stutter a bit
EVGA GTX 690 - screen tearing and VRAM limitation in surround (not exactly issues), plus drivers causing green screens and Flash videos not working. Other than that, the best and favourite card I've ever owned.
MSI Lightning HD 7970 Crossfire - could not load games with an overclock of more than 30MHz on the core, even with 1250mV, or 100MHz on the memory. Would run at 1300mV for a few minutes then crash. MSI Afterburner would cause strange errors on the desktop and create lag and delays with the mouse and normal tasks. Screen tearing even with x3 DP connections and limiting frame rate to 59FPS or Vsync.
PowerColor Devil 13 HD 7990 - good card, but loud coil whine and screen tearing to the point it's too distracting to play games, so I will be taking it back tomorrow.

Quote:


> Originally Posted by *Buzzkill*
> 
> Once I get my Overlord OC Pixel Perfect 1440p monitor, then I will see if there is a VRAM issue. Pre-ordered one on 9-12-12, the first day pre-orders were offered. With 8GB total VRAM and new drivers, I don't think it will be a big issue. You can always adjust settings if VRAM is topping 2GB per GPU. Driver updates can solve issues, but you have to wait for them to be released. I hope Lucid Logix fixes the MVP software so the Ivy Bridge GPU can be used to give a boost to gameplay. This is software related, so if it happens in one game, it is not going to happen in every game out there. This is the 690 owners club, right? Looks like people playing Borderlands 2 could help with quad SLI. I am not going back and forth anymore, because it will not fix a thing, and people are trying to use the forum for info and advice. Thank you.
> I asked for system specifications because he seems to have a problem; he has had many video cards but still has problems, so maybe it is not the video cards but something else. But it must be the video cards, so spend more money buying video cards; Nvidia is not going to complain.


The only 8GB dual-GK104 GPUs that will ever exist on the open market are the new Tesla C3000 and the ASUS MARS, if they ever come out. That's it.

Graphics cards are always listed with their total VRAM, which you then divide in half for each GPU's share. The HD 6990 had 4GB shared by 2 GPUs also. The GTX 690 is a 4GB card, not an 8GB card.

And yes, the problem is the graphics cards.

Quote:


> Originally Posted by *DinaAngel*
> Anyone got to 1300MHz, as Nvidia stated?
> My card doesn't want to go further than +140 on the core and +200 on the memory, which means 1200MHz on the core, and I've got no idea what the memory ends up at.


Only ever got to 1222MHz, which was stable for a few hours but then crashed. That was with about +165-170 on the core and +300 on the memory. Air or water, neither of mine ever went over that. The one under water was stable longer, but not by much; I think it crashed the next day, but that one was running 1080p and not being pushed as hard.

Quote:


> Originally Posted by *Ruby Rabbit*
> 
> At last I have ordered my water cooling gear inclusive of a Koolance gtx690 water block. However I still have the two 680's I purchased before the 690. So I was wondering if it would be better to water cool the 680's over the 690? Which would give the better OC performance on water? Any thoughts?


Each of your GTX 680s should be able to be overclocked to around 1300MHz if they are good eggs. Your GTX 690 will have a hard time going above 1200MHz. Plus, two separate cards normally clock a little better than a dual-GPU card, though not by enough that you'll notice it.

I'd overclock all of them, run 3DMark 11 and Heaven, and watercool whichever setup wins, to keep it stable. But my money would be on the GTX 680 SLI winning.


----------



## Cheesemaster

I am running quad SLI and I am not sure why you guys are having strange issues. Or I just might be the strange one. I game in surround and I don't have any problems. For example, I play Borderlands 2 maxed to the extreme on three monitors and rarely go below 80 FPS. In BF3 I hover around 80 FPS on ultra with no MSAA. And across a long list of games I am getting awesome frame rates with maxed settings; BF3 is the one exception (no MSAA). What gives, guys? This is breaking my heart.

P.S. I am second on this forum in 3DMark 11 Performance, and I think I am still in the top twenty hall of fame and top ten on the Extreme preset. So what's the difference with my rig?


----------



## dboythagr8

Quote:


> Originally Posted by *dboythagr8*
> 
> Is my GTX 690 dead?
> Was playing Borderlands 2. I exited and stuff seemed weird. Reset the machine, went back in, and it was fine up until I left BL2 for good some time later. Before I left I noticed my GPU clocks were not right; it wasn't boosting correctly. Left and went into the menu to turn on SLI, the screen went black and the picture never came back. Turned the machine off. Turned it back on and my motherboard was stuck on the Asus screen.
> Then I tried moving it to a different PCIe slot and I could get a picture, but EVGA Precision had a popup that said unidentified GPU or something, and it would not display at my native res of 2560x1600. Put in my backup 580 SLI setup and they worked fine. Tried the 690 again and the same thing happened as before. The GeForce logo still lights up, and when I went into GPU-Z some of the 690's specs showed, but not all. So what's going on here?


Any comment on my 690 issue above?


----------



## Arizonian

Edited one post as a courtesy rather than just delete the entire post since there was positive feedback that contributed to the thread.

As for everyone posting, As TFL Replica just stated please be kind to each other. Please ask yourself am I positively contributing to the thread and did I maintain a friendly and professional atmosphere. Please keep any personal comments toward one another to yourselves.

Thanks guys.


----------



## ceteris

Quote:


> Originally Posted by *dboythagr8*
> 
> Any comment on my 690 issue above?


Sounds like a problem to me. I've never had to re-tweak anything in Precision, change resolutions, etc. after crashing out of games and other programs. Your best bet is just to RMA it. Lucky that you have a 580 sitting around as a backup until you get your 690 back, though.


----------



## Arkanor

I lock my frames to 60 running Borderlands 2 and typically see 65-80% GPU usage, with spikes in a couple of areas.

Memory use seems to be around 800-1100 MB at 2560x1600 with everything maxed.


----------



## pilla99

Just a question: has anyone used, or does anyone currently use, Ubuntu or another Linux distro with their 690, and how does it work? I am thinking of using CrossOver for the games I'd want, but I want to make sure the card is supported correctly on something besides Windows.


----------



## thestache

Quote:


> Originally Posted by *Cheesemaster*
> 
> I am running quad SLI, and I'm not sure why you guys are having strange issues. Or maybe I'm the strange one. I game in Surround and I don't have any problems. For example, I play Borderlands 2 maxed to the extreme on three monitors and rarely drop below 80 FPS. In BF3 I hover around 80 FPS on Ultra with no MSAA, and across a long list of games I'm getting awesome frame rates at maxed settings. BF3 is the one exception (no MSAA). What gives, guys? This is breaking my heart.
> P.S. I'm in second place on this forum in 3DMark 11 Performance, and I think I'm still in the top twenty of the hall of fame and top ten in the Extreme preset. So what's the difference with my rig?


You're missing something very important, my friend: VRAM and AA.

GTX 690 SLI is fast; nobody would be stupid enough to doubt that. But the applications it can be used for are limited by its VRAM. A single GTX 690 runs 2560x1600 above 60 FPS, and that's the maximum resolution at which a GTX 690 can be effective. So that in itself renders a second card useless.

The card can't run surround resolutions at maximum settings at 5760x1080 or above, because at those resolutions today's badly optimised, heavily AA'd games need more than the 2 GB each GPU has available. And that's why, for such an expensive product, it's considered useless.

It's one thing to win benchmarks on a single screen at 720p and 1080p with a VRAM-limited system, but once you start running your native surround resolutions the benchmarks won't even load. Heaven on the GTX 690 system I had would not even load at 3880x1920, and it won't on yours either.

And that's what we're getting at. The GTX 690 is the pinnacle of video game hardware and pushes the boundaries of graphics card design, but it has a flaw, and that's the memory available to each GPU. Nobody wants to pay $1,000 or even $2,000 on GPUs, and the same again on monitors, to run them without all the eye candy turned up. Max settings include AA, and playing without AA isn't pushing anyone's boundaries; it's a giant cop-out. Especially when three 4GB GTX 680s are far cheaper and easier to find.

I don't doubt you have great frame rates in BF3, but once you turn AA on, your GTX 690 SLI will fall flat on its face. I know mine did, and it really upsets me that no review site has even mentioned this or tested it properly. On maps like Strike at Karkand in BF3 I was using a minimum of 1900 MB of VRAM with AA off, HBAO off, and motion blur off, because I had to cut down on VRAM. The level would hit minimums of 50 FPS in some areas, but as soon as the VRAM limit was hit it would stutter down to 20 FPS and be unplayable.

It's great to have the speed the GTX 690 has, but it means nothing once the card hits its VRAM limit, which happens often when it's running a surround setup. For this reason GTX 670/680 4GB SLI is better in every way than any GTX 690 setup. Which is a shame, because the GTX 690 is the most beautiful piece of graphics card engineering to date, as far as I'm concerned.

I hope you follow what I'm trying to get at.
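For anyone who wants to sanity-check the scale of the problem, here's a rough back-of-the-envelope pixel-count comparison (a sketch only: real VRAM use is dominated by textures, render targets, and AA buffers, so raw pixel counts are just a lower-bound indicator of how much harder surround hits the framebuffer):

```python
# Rough comparison of raw pixel counts at common single-screen vs.
# surround resolutions. This says nothing about total VRAM use, which
# depends on the game's textures and AA mode; it only shows how much
# more raw framebuffer area surround demands.

def pixels(width, height, monitors=1):
    """Total pixels pushed per frame across all monitors."""
    return width * height * monitors

single = pixels(2560, 1600)            # single 30" screen
surround_1080 = pixels(1920, 1080, 3)  # 5760x1080 surround
surround_1200 = pixels(1920, 1200, 3)  # 5760x1200 surround

print(f"2560x1600: {single:,} px")
print(f"5760x1080: {surround_1080:,} px ({surround_1080 / single:.2f}x)")
print(f"5760x1200: {surround_1200:,} px ({surround_1200 / single:.2f}x)")
```

So surround pushes roughly 1.5x to 1.7x the pixels of a single 2560x1600 screen, and every MSAA sample multiplies the buffer cost on top of that.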


----------



## un-nefer

Quote:


> Originally Posted by *thestache*
> 
> The card can't run surround resolutions at maximum settings with 5760x1080P or above because such resolutions in todays badly optimised/heavily AAed games require above the 2012mb limit.


Guessing you're just quoting other people who have not actually tested this...

Go check out Vega's surround tests, mate. He had a much higher-resolution surround setup (3x 2560x1440 monitors), and in only one of his tests did he hit the 2GB limit of the cards, and that was only when using added texture packs. In fact, from the results he got with 2GB GTX 680s, and considering the resolution he tested at was almost double the one you have listed above, you would have no problem at all with 2GB cards.

To quote Vega's GTX 680 results when running 3x 2560x1440 2B Catleaps at 4320x2560 (cleaned up to better show results):
Quote:


> *Skyrim - All Ultra/4x MSAA*
> 2x 680 - FPS: 72 / VRAM: 1863
> 
> *BF3 - Texture H/Shadow M/Effects L/Mesh L/Terrain L/Decor L/MSAA 0X/FXAA H/Blur Off/AF 16X/HBAO*
> 2x 680 - FPS: 59 / VRAM: 1848
> 
> *Diablo III - All in-game settings maxed*
> 2x 680 - FPS: 109 / VRAM: 1242
> 
> *Metro2033 - Normal/0x MSAA*
> 2x 680 - FPS: 82 / VRAM: 1323
> 
> *Witcher 2 Enhanced - Medium*
> 2x 680 - FPS: 39 / VRAM: 1491
> 
> *Crysis 2 - High/DX11/Texture Pack*
> 2x 680 - FPS: 60 / VRAM: 2025
> 
> *WoW - Max/4x MSAA*
> 2x 680 - FPS: 107 / VRAM: 1702


All look like playable FPS to me, and higher FPS could be achieved with lower in-game visual quality settings.


----------



## Cheesemaster

I can run BF3 with 2x MSAA and get over 60 FPS. If I go straight Ultra, you're right, it's a no-go. If I turn MSAA off altogether I average 80 FPS on three monitors. But I agree: at $2k on graphics cards I should be running maxed out supreme! Kinda sucks. If these had 4 GB they would be unstoppable, and cool to boot. I should have waited for the Mars cards. I'll be good till next gen, but how did this get past us and the developers? I have no idea. My two cards were a birthday present, though... I might yank these out for two Mars 690s, but I won't switch them for anything else.


----------



## Divineshadowx

Quote:


> Originally Posted by *un-nefer*
> 
> Guessing you're just quoting other ppl who have not actually tested this...
> Go check out Vega's surround tests mate, he had a much higher resolution surround setup (3x 2560x1440 monitors) and only in one of his tests did he hit the 2GB limit of the cards, and that was only when using added texture packs. In fact, from the results he got with 2GB GTX680's, and considering the resolution he tested at was almost double that of the one you have listed above, you would have no problem at all with 2GB cards.
> To quote Vega's GTX680 results when running 3x 2560x1440 2B Catleap's at 4320x2560 (cleaned up to better show results):
> All look like playable FPS to me - and higher FPS could even be achieved with lower in game visual quality settings.


What's with the hypocritical statement? You say he is quoting others when you don't even have a 690? At 2560x1440 in Crysis 1 I run into VRAM problems with mods; at random times my FPS will drop to 0.3. There isn't any way that 2GB could run any res higher than 2560x1600 with the eye candy on. There's no point in getting a 690 anyway if you're not running AA. thestache already proved he has all the hardware he speaks about; go look at the pics.


----------



## DinaAngel

Has anyone had a VDDC issue with their 690? GPU 2's VDDC is unstable at times under max load, dropping to 1168 and back up repeatedly. It also makes overclocking very difficult.


----------



## Arizonian

*Club update*: After speaking with our club OP, jcde7ago, I'm going to be assisting him in keeping up with the club member list. I've got about 50 more pages or so to sift through before I've covered the entire thread, so I apologize if names aren't yet in order by date joined.

If you don't see your name, feel free to PM me with a link to your GTX 690 photo in this thread to be added. Also, if you were added and no longer have a GTX 690 and would like to be removed from the list, please PM me as well.

I was thinking of maybe a club 3DMark 11 Extreme bench-off with our GTX 690s. I'd be glad to keep track and update a list on the second post of the club thread.
_Hope alancsalt doesn't mind me using his icon_.


----------



## Buzzkill

Quote:


> Originally Posted by *Divineshadowx*
> 
> What's with the hypocritical statement? You say he is quoting others when you don't even have a 690? At 2560x1440 in Crysis 1 I run into VRAM problems with mods; at random times my FPS will drop to 0.3. There isn't any way that 2GB could run any res higher than 2560x1600 with the eye candy on. There's no point in getting a 690 anyway if you're not running AA. thestache already proved he has all the hardware he speaks about; go look at the pics.


So one person has problems and we are all going to have problems? Sorry, everyone who just bought a 690: there's no point in getting a 690. Don't get the Devil 13 or 7970 CrossFire either; more problems. This is according to thestache and Divineshadowx. How many of you guys feel this way and believe there's no point in getting your 690?


----------



## Divineshadowx

Quote:


> Originally Posted by *Buzzkill*
> 
> So one person has problems and we are all going to have problems? Sorry, everyone who just bought a 690: there's no point in getting a 690. Don't get the Devil 13 or 7970 CrossFire either; more problems. This is according to thestache and Divineshadowx. How many of you guys feel this way and believe there's no point in getting your 690?


The 690 is great for 2560x1600, probably the best solution. Two of them are useless, though, especially for higher res. Learn to read, maybe?


----------



## ceteris

Quote:


> Originally Posted by *Cheesemaster*
> 
> I can run BF3 with 2x MSAA and get over 60 FPS. If I go straight Ultra, you're right, it's a no-go. If I turn MSAA off altogether I average 80 FPS on three monitors. But I agree: at $2k on graphics cards I should be running maxed out supreme! Kinda sucks. If these had 4 GB they would be unstoppable, and cool to boot. I should have waited for the Mars cards. I'll be good till next gen, but how did this get past us and the developers? I have no idea. My two cards were a birthday present, though... I might yank these out for two Mars 690s, but I won't switch them for anything else.


I read that quad Mars II was a PITA driver-wise, though, compared with quad 590. Hope that won't be the case with the Mars III.

According to a rep, Asus has the final Mars III production design finished and ready to go, but they are waiting for permission from nVidia to release it after 690 sales have settled. That was almost a month and a half ago, though, despite 690s always being in stock now. Hard to really say if you "should've waited," since Asus posts ambiguous ETAs and rarely gives solid release dates for most of their computer components.


----------



## Scooby-Snack

Quote:


> Originally Posted by *Divineshadowx*
> 
> Whats with the hypocritical statement? You say he is quoting others when you don't even have a 690? At 2560x1440 on crysis 1 I run into vram problems with mods, at random times my fps will drop to 0.3. There isn't anyway that 2gb could run any res higher than 2560x1600 with eye candy. No point to get a 690 anyway if youre not running AA. Thestache already proved he has all the hardware he speaks about go look at the pics.


OK, so based on the cards running out of VRAM, GTX 690 SLI is a waste for my 3-monitor setup?

In your opinion, would quad-SLI GTX 680 4GB be the way to go?

Do you think I will see much improvement going with tri 680s vs. quad?

My main concern is getting the best performance possible out of my $2,400 investment in GPUs. I have no problem selling off my single Hydro GTX 690 and cancelling my new order for a GTX 690, lol!

I will be pissed if I spend that much and don't get **** for results LOL!

Thank you for the advice; it's greatly appreciated for sure!

-Scoob


----------



## Qu1ckset

I just installed my EVGA HydroCopper water block and backplate; thought I'd share some pics.


----------



## Arizonian

Are the 6990 4GB owners running multiple monitors complaining about the same limitations?


----------



## MrTOOSHORT

Looks very nice Qu1ckset!

Congrats!


----------



## ceteris

Quote:


> Originally Posted by *Scooby-Snack*
> 
> Ok so based on the Cards Running out of Vram 2 GTX 690 SLI is a waste for my 3 Monitor Setup?
> In your opinion, would quad-SLI GTX 680 4GB be the way to go?
> Do you think I will see much improvement going with Tri 680's vs Quad?
> My main concern is getting the best performance possible out of my $2,400 investment in GPU's. I have no problem selling off my single Hydro GTX 690 and cancelling my new Order for GTX 690 lol!
> I will be pissed if I spend that much and don't get **** for results LOL!
> Thank you for the advice it's greatly appreciated for sure!
> -Scoob


I'd wait for GK-110 next year, since it is almost October. Right now my 690 is pretty much a placeholder for that or the Mars III. I'm sure nVidia will have upped the VRAM on the next chip, which might come out sometime in the 2nd quarter of next year at the earliest (speculation atm).


----------



## MrTOOSHORT

Quote:


> Originally Posted by *ceteris*
> 
> I'd wait for GK-110 next year since it is almost October. Right now, my 690 is pretty much a placeholder for that or the Mars III. I'm sure nVidia will have up'd the VRAM on the next chipset which might come out sometime 2nd quarter of next year at the earliest (speculation atm).


You must be rich to say your GTX 690 is a placeholder.

Most people would use an old GTX 470 as a placeholder.

I had to think long and hard about putting $1,000 into getting this GPU; actually $1,300 with the waterblock.


----------



## ceteris

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> You must be rich to say your GTX 690 is a placeholder.
> Most people would use an old GTX 470 as a placeholder.
> I had to think long and hard about putting $1,000 into getting this GPU; actually $1,300 with the waterblock.


I didn't mean to come off that way. I actually wanted this 690 to be THE card for a while, until all these reviews and all this info on voltage restrictions, VRAM limitations, etc. came to light. The card I had before this was actually a reference GTX 480 on a Koolance waterblock. It would just be too much trouble to go back to the 480, so I'm currently running on air until I recoup money from moving, replacing the car my wife trashed, etc. to finish buying the rest of the parts I need to go back on water.

If I were rich, I'd upgrade to SLI or tri-SLI EVGA 680 Classifieds or MSI 680 Twin Frozr 4GBs as placeholders for GK-110.

Trust me, I'm as keen on not wasting any more money as you are. I spent around $1,100+ with tax and shipping to be among the first people to buy this card at launch, and I frown at the idea of taking a $200-$300 loss liquidating it on Craigslist to upgrade to separate 4GB GTX 680 cards this late in the year.


----------



## Ruby Rabbit

Quote:


> Originally Posted by *Qu1ckset*
> 
> I just installed my Evga HydroCopper water block and backplate on, thought id share some pics


I've got the Koolance block on the way. Can you post your temps and OC range? I must say I'm still hesitant to pull such a nice-looking card apart.


----------



## Qu1ckset

Quote:


> Originally Posted by *Ruby Rabbit*
> 
> I've got the Koolance block on the way. Can you post your temps and OC range? I must say I'm still hesitant to pull such a nice-looking card apart.


Won't have it in my computer till Saturday at the earliest; I'm waiting on a few more things to arrive in the mail.


----------



## Divineshadowx

Quote:


> Originally Posted by *Qu1ckset*
> 
> Won't have it in my computer till Saturday at the earliest, waiting on a few more things to arrive in the mail


The card looks 100% beast with that waterblock, congrats.


----------



## Divineshadowx

Quote:


> Originally Posted by *Scooby-Snack*
> 
> Ok so based on the Cards Running out of Vram 2 GTX 690 SLI is a waste for my 3 Monitor Setup?
> In your opinion, would quad-SLI GTX 680 4GB be the way to go?
> Do you think I will see much improvement going with Tri 680's vs Quad?
> My main concern is getting the best performance possible out of my $2,400 investment in GPU's. I have no problem selling off my single Hydro GTX 690 and cancelling my new Order for GTX 690 lol!
> I will be pissed if I spend that much and don't get **** for results LOL!
> Thank you for the advice it's greatly appreciated for sure!
> -Scoob


It is a waste. I'm going to sell my second 690 and probably water cool my system and buy other components, like a monitor calibrator. There is no point in quad SLI at 1440p, because the second card adds no benefit besides benchmarking scores; the VRAM will hold you back before the power of the cards does. I'm running into VRAM problems in Crysis 1 and 2 with light mods at 1440p. It's not serious, but the game will freeze from time to time for about 2 seconds; still very playable otherwise. Most people say quad over tri-SLI 680 does not give much benefit. What I would do is buy the 3x 4GB 680s and see how that works. If it isn't enough and you get below 60 FPS, then I would throw one more in there, though I'm sure three would be enough. You can wait for GK110, but no one actually knows when it's coming out. Rumors said the 690 was coming two months before it was released, so if you can wait, do it; just don't be shocked if they come out six months to a year from now, who knows. And the 4GB 680 will not lose its value anytime soon. I would really water cool them too; with 3-4 cards they would receive barely any airflow.


----------



## Qu1ckset

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Looks very nice Qu1ckset!
> Congrats!


Quote:


> Originally Posted by *Divineshadowx*
> 
> The card looks 100% beast with that waterblock, congrats.


Thanks guys, will post more pics when it's installed in my system.


----------



## Arizonian

Quote:


> Originally Posted by *Qu1ckset*
> 
> I just installed my Evga HydroCopper water block and backplate on, thought id share some pics
> 


Congrats Qu1ckset, nice work; it looks sweet.


----------



## pilla99

I asked this question earlier but no one responded, so I'll ask again: has anyone used, or does anyone use, their 690 with Ubuntu or another Linux distro? I am doing a lot of Python and programming this semester and I want to get more familiar with Linux as an OS.

I used to use a Mac and used CrossOver Games to play some of my favorites, like Counter-Strike: Source, without having to use Windows. There have been two or three major version releases since then, so I am not worried about being able to play in the native OS, but I had poor luck getting GPU drivers working with my 6870s when I had them.

I see there is an official driver on Nvidia's site, but I just wanted some input from anyone that has already done this. Any tips? I am thinking about blowing away my Windows install tonight if I really want a project. Or just watching Breaking Bad instead.


----------



## thestache

Quote:


> Originally Posted by *un-nefer*
> 
> Guessing you're just quoting other ppl who have not actually tested this...
> Go check out Vega's surround tests mate, he had a much higher resolution surround setup (3x 2560x1440 monitors) and only in one of his tests did he hit the 2GB limit of the cards, and that was only when using added texture packs. In fact, from the results he got with 2GB GTX680's, and considering the resolution he tested at was almost double that of the one you have listed above, you would have no problem at all with 2GB cards.
> To quote Vega's GTX680 results when running 3x 2560x1440 2B Catleap's at 4320x2560 (cleaned up to better show results):
> All look like playable FPS to me - and higher FPS could even be achieved with lower in game visual quality settings.


Lol.

Check the settings he's using.

High textures
Medium shadows
Low mesh
Low decor
Low effects
HBAO on
AA off
FXAA high
Motion blur off

And I'm the one who doesn't know what I'm talking about... Learn to read before you start firing off stupid accusations.

I've had an EVGA GTX 690 (and ran two in SLI), Lightning HD 7970 CrossFire, a Devil 13 HD 7990, and Gainward 4GB GTX 680 SLI, all in the last few weeks, so when it comes to surround and how they perform at 3x 1920x1200 I know what I'm talking about. All of them may have screen tearing with my setup, but all are quite good in their own right. Except for the GTX 690's VRAM limit.

BF3 cannot be run in surround at Ultra on a GTX 690, period. It doesn't have the VRAM. Sure, some maps use more than others, but on maps like Strike at Karkand and Gulf of Oman I would hit the VRAM limit (and the game would stop) with everything on Ultra except no AA, no HBAO, and no motion blur. The same settings would run Caspian Border with lows in the 70-80s.

As for that other guy, I'm done talking to brick walls; go back and re-read all our posts. Several of us have tested this, and several of us have acknowledged and agreed that the GTX 690 and surround is a no-go.


----------



## Beez

Quote:


> Originally Posted by *Jessekin32*
> 
> Anyone with any news or other issues with Borderlands 2 and their GTX 690? I get around 30-39fps with about 50-58% usage on each GPU while only using half my memory. What gives? D:
> VRAM is at 1085MB ish on each GPU. It's not reaching a VRAM limit. The GPU's are NOT running at 100% either. Also, whenever I get headshots or "crits" the game lags or freezes for about 3/4 a second. As a Sniper, this drives me NUTS. I hope a driver update fixes this.


I'm having this SAME issue, including the sniping one. It stutters with PhysX on high. My machine is overkill: i7-3820 @ 4.7 GHz, 16 GB RAM, and the 690, using September's drivers and PhysX. Turn PhysX to low and the game runs like butter.

Have you been able to figure anything out about this?


----------



## Arizonian

I'd like to ask a question regarding multiple monitors with greater resolution than 2560x1600 and the VRAM limits being seen with just 2 GB of VRAM.

Can anyone confirm whether current 6990 owners running Eyefinity are running into the same VRAM limit with 2 GB per GPU?

Lastly, I don't know why this is a surprise, because right on *Nvidia's* web site the GTX 690 specs clearly say up to 4 displays, a maximum digital resolution of 2560x1600, and a maximum VGA resolution of 2048x1536.

I do applaud Nvidia for finally coming to the table on running surround with one card, but they forgot to bring the extra 1 GB with them. If Nvidia wants to contend in the multiple-monitor arena, they need to at the very least match AMD's VRAM on their cards. Nvidia does Surround the way AMD does HD3D: it's just enough to function, but not enough to be doing it right.

Edited to add: On a positive note, the GTX 690 does fit many, many roles I won't go into listing here, because most of us know them. It's a great card for the roles it does fill.


----------



## zkalra

Quote:


> Originally Posted by *Beez*
> 
> I'm having this SAME issue, including the sniping one. It stutters with PhysX on high. My machine is overkill: i7-3820 @ 4.7 GHz, 16 GB RAM, and the 690, using September's drivers and PhysX. Turn PhysX to low and the game runs like butter.
> Have you been able to figure anything out about this?


Guys, even on my setup it runs like crap in some areas; check below. In the beacon-protection mission, where I am trying to get to Sanctuary, the damn thing drops into the low 40s. The SLI profile needs work, as I get the same performance with 1, 2, 3, or 4 cards.


----------



## xoleras

Good grief, people! PhysX has always caused a performance drop in every game it has been in! Why are people trying to run PhysX on high and acting surprised when it slows down in some areas?

If you have stuttering with PhysX on high, the obvious solution is to turn the PhysX effects down. PhysX hasn't changed; it causes a performance drop no matter what!


----------



## rationalthinking

Quote:


> GTX 690 Quad Sli and GTX 680 Dedicated to Physx


5 680s and 1 dedicated to PhysX on just a 23" 120hz monitor?

WOW.


----------



## zkalra

Quote:


> Originally Posted by *rationalthinking*
> 
> 5 680s and 1 dedicated to PhysX on just a 23" 120hz monitor?
> WOW.


Ordering two more when I move the computer into my study, where there is enough desk space. It's currently in my bedroom, and I doubt I'll be allowed to sleep there if the wifey sees another two hanging around.


----------



## Divineshadowx

Not sure how many of you guys are getting bad performance. I have my single 690 running Borderlands 2, and then one GPU of my second 690 dedicated to PhysX. I get a constant 60 FPS; the cards aren't even being stressed. I have PhysX turned up to high and never see over 13% usage on the GPU it's dedicated to.


----------



## Beez

Quote:


> Originally Posted by *xoleras*
> 
> Good grief people! Physx has always caused a performance drop in every game it has been in! Why are people trying to run physx high and acting surprised when it slows down in some areas?
> If you have stuttering with physx high obvious solution is to turn the physx effects down. Physx hasn't changed. It does cause a performance drop no matter what!


Actually, this is false.

There are plenty of boards out there where they've tested a 200/400-series card as dedicated PhysX and it used around 30% of its power. Of course PhysX causes a performance drop, but it should not be at this level. Computers with these specs and cards this powerful should not be having this problem when a single GTX 680/580 can do it alone. That's why we're concerned.

I ran a few more tests, and when it happens, both cards' usage drops to 10-16% and then spikes back to a normal/average 40%.

It only happens in Borderlands. Skyrim, BF3, Metro 2033, etc.: no problems whatsoever.

Annnd... I feel like an idiot. I'm using an ASRock board, and looking over my CPU voltages, VTT was too high. I set it back to 1.124 and bumped vcore a little to 1.36. No longer having any issues; it runs amazingly, and PhysX in Borderlands is no longer acting up. Hope this helps anyone else having the same problem.



----------



## zkalra

Quote:


> Originally Posted by *Divineshadowx*
> 
> Not sure how many of you guys are getting bad performance. I have my single 690 running borderlands 2 and then 1 gpu of my second 690 dedicated to physx. I get a constant 60fps, cards arn't even being stressed. I have physx turned up to high and never got over 13%usage on the gpu that it is dedicated to it.


I don't know, mate. Guess you are lucky. I'm getting crap performance, down into the low 40s in certain intense firefights. Pretty much the same config as yours, except I have a 680 entirely dedicated to PhysX, and when I try 1, 2, 3, or 4 cards in SLI the performance is unchanged, with the single-GPU config perhaps giving the smoothest experience.


----------



## thestache

Quote:


> Originally Posted by *zkalra*
> 
> I dont know mate. Guess you are lucky. I'm getting crap performance down into the low 40s in certain intense firefights. Same config as yours pretty much except I have a 680 entirely dedicated to Physx and then when I try 1,2,3 or 4 cards in SLI the performance is unchanged with perhaps Single GPU config giving the smoothest experience.


What drivers are you using?


----------



## thestache

Quote:


> Originally Posted by *Arizonian*
> 
> I'd like to ask a question regarding multiple monitors with greater resolution than 2560x1600 and the VRAM limits being seen with just 2 GB VRAM.
> Can anyone confirm if the current 6990 owners running Eyefinity are running into the same VRAM limit sporting 2 GB's per GPU?
> Lastly, I don't know why this is a surprise because right on *Nvidia's* web site GTX 690 specs clearly say up to 4 displays Maximum Digital Resolution 2560x1600 & Maximum VGA Resolution 2048x1536.
> I do applaud Nvidia finally coming the the table for running surround with one card but they forgot to bring the extra 1 GB to the table. If Nvidia wants to contend in the multiple monitor arena they need to at the very least match AMD's VRAM on cards. Nvidia does Surround like AMD does HD3D Vision, it's just enough to function but not enough to being doing it right.
> Edited to add: On a positive note the GTX 690 does fit many many roles I won't go into listing here because most of us know them.
> 
> 
> 
> 
> 
> 
> 
> It's a great card for those roles it does fill.


What it can do, it does better than any other card, period. It's just a shame it didn't have that extra VRAM, or it'd do everything better than any other card instead of just some things.


----------



## Divineshadowx

Quote:


> Originally Posted by *zkalra*
> 
> I dont know mate. Guess you are lucky. I'm getting crap performance down into the low 40s in certain intense firefights. Same config as yours pretty much except I have a 680 entirely dedicated to Physx and then when I try 1,2,3 or 4 cards in SLI the performance is unchanged with perhaps Single GPU config giving the smoothest experience.


Try updating to the newest driver or reinstalling the game. I would make sure the CPU isn't running PhysX. And a single 690 is better in almost every game (except The Witcher 2 with ubersampling, and benchmarks), because no current game can make use of all four GPUs. With both cards enabled I get horrible FPS in games such as SC2, but not Borderlands 2.


----------



## Buzzkill

New Beta NVIDIA GeForce 306.63

These have the same larger shared system memory size as the regular GeForce drivers: 305.53 and 305.67 had 1024 MB, while 306.63 has 2816 MB.

You can now download the NVIDIA GeForce 306.63 driver, which comes directly from NVIDIA's developer website. Windows driver version 306.63 and Linux driver version 304.15.00.03 provide beta support for OpenGL 4.3 and GLSL 4.30 on capable hardware.

OpenGL 4.3 Driver Release Notes
You will need any one of the following Fermi or Kepler based GPUs to get access to the OpenGL 4.3 and GLSL 4.30 functionality:

Quadro series: 6000, 600, 5000, 410, 4000, 400, 2000D, 2000
GeForce 600 series: GTX 690, GTX 680, GTX 670, GT 645, GT 640, GT 630, GT 620, GT 610, 605
GeForce 500 series: GTX 590, GTX 580, GTX 570, GTX 560 Ti, GTX 560 SE, GTX 560, GTX 555, GTX 550 Ti, GT 545, GT 530, GT 520, 510
GeForce 400 series: GTX 480, GTX 470, GTX 465, GTX 460 v2, GTX 460 SE v2, GTX 460 SE, GTX 460, GTS 450, GT 440, GT 430, GT 420, 405
All the extensions listed below are part of the OpenGL 4.3 core specification, but they can also be used in contexts below OpenGL 4.3 on supported hardware.

For OpenGL 3 capable hardware, these new extensions are provided:

ARB_arrays_of_arrays
ARB_clear_buffer_object
ARB_copy_image
ARB_ES3_compatibility
ARB_explicit_uniform_location
ARB_fragment_layer_viewport
ARB_framebuffer_no_attachments
ARB_internalformat_query2
ARB_invalidate_subdata
ARB_program_interface_query
ARB_robust_buffer_access_behavior
ARB_stencil_texturing
ARB_texture_buffer_range
ARB_texture_query_levels
ARB_texture_storage_multisample
ARB_texture_view
ARB_vertex_attrib_binding
KHR_debug
For OpenGL 4 capable hardware, these new extensions are provided:

ARB_compute_shader
ARB_multi_draw_indirect
ARB_shader_image_size
ARB_shader_storage_buffer_object


----------



## zkalra

Quote:


> Originally Posted by *Divineshadowx*
> 
> Try reinstalling to the newest driver or reinstall the game. I would make sure that the cpu isnt running physx. And a single 690 is better in almost every game except the witcher 2 with ubersampling and benchmarks because no current game can make use of the 4 cores. With both enabled I get horrible fps in games such as sc2, but not borderlands 2.


1. Sleeping Dogs runs way better with quad SLI if the Extreme AA option is enabled; it's a slideshow on a single 690.
2. Max Payne 3 is outstanding, 120 FPS smooth, with quad SLI and 8x MSAA. Again, a slideshow on a single 690.
3. Metro 2033 has almost perfect scaling with quad SLI, with FPS 4x that of a single 680.


----------



## Divineshadowx

Quote:


> Originally Posted by *zkalra*
> 
> 1. Sleeping Dogs runs way better with quad SLI if the Extreme AA option is enabled. It's a slideshow on a single 690.
> 2. Max Payne 3 is outstanding and 120fps smooth with quad SLI and 8xMSAA. Again, a slideshow on a single 690.
> 3. Metro 2033 has almost perfect scaling with quad SLI, with fps 4x that of a single 680.


The first 2 aren't even demanding. And Metro is so old it's not even optimized; I got 120fps on a single 690 with my old monitor in Metro. I get 60fps in any single game with a single 690. And you're on 1080p, so I'm not sure how you get a slideshow. And besides, 60 is all you need. 1440p > 120Hz all day.


----------



## Divineshadowx

Did some testing on Crysis 2 with DX9 and 2x2 SSAA enabled, so a theoretical resolution of four 2560x1440 monitors (roughly 14mil pixels). Somehow I only ran into VRAM issues once or twice for about a second. It was lagging a bit, 30-50fps, and all 4 GPUs had usage of 80-100%. I'm not sure if running SSAA is less demanding than 4 actual monitors, but it's what I've got. This was with everything else maxed out, just no other AA like MSAA besides the SSAA. Any thoughts?
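The pixel count quoted in the post above checks out; a quick sketch of the arithmetic:

```python
# 2x2 SSAA renders the scene at twice the width and twice the height,
# i.e. 4x the pixels of the native 2560x1440 resolution.
width, height = 2560, 1440
ssaa_pixels = (2 * width) * (2 * height)
print(ssaa_pixels)                      # 14745600 -- the "roughly 14mil pixels"
print(ssaa_pixels // (width * height))  # 4 -- four monitors' worth of pixels
```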


----------



## zkalra

Quote:


> Originally Posted by *Divineshadowx*
> 
> The first 2 aren't even demanding. And Metro is so old it's not even optimized; I got 120fps on a single 690 with my old monitor in Metro. I get 60fps in any single game with a single 690. And you're on 1080p, so I'm not sure how you get a slideshow. And besides, 60 is all you need. 1440p > 120Hz all day.


Lol. Did you even try the games with Extreme AA and 8xMSAA respectively? MP3 would be a slideshow. Please enable the options I am referring to and then revert.


----------



## Divineshadowx

Quote:


> Originally Posted by *zkalra*
> 
> Lol. Did you even try the games with Extreme AA and 8xMSAA respectively? MP3 would be a slideshow. Please enable the options I am referring to and then revert.


Ugh, yeah, but Extreme AA isn't actually built into the game. You can enable super AA in any game through Inspector and get a slideshow; I get negative fps in Crysis 2 with that kind of AA lol. Those games without those options are cake to run. Games like Crysis 2 and BF3 are difficult to run with AA, not for the 690 of course, but you know what I mean.


----------



## zkalra

Quote:


> Originally Posted by *Divineshadowx*
> 
> Ugh, yeah, but Extreme AA isn't actually built into the game. You can enable super AA in any game through Inspector and get a slideshow; I get negative fps in Crysis 2 with that kind of AA lol. Those games without those options are cake to run. Games like Crysis 2 and BF3 are difficult to run with AA, not for the 690 of course, but you know what I mean.


Uhhhh, you are incorrect. Extreme AA is an option in Sleeping Dogs. I would AGAIN request you to enable it, provided you even have the game, and then run the in-game benchmark. I get 107 with quad SLI and 57 with a single 690.
And the game looks positively next-gen with Extreme AA, especially in the nighttime missions with the reflections etc. on the road.


----------



## Hokies83

Quote:


> Originally Posted by *zkalra*
> 
> 1. Sleeping Dogs runs way better with quad SLI if the Extreme AA option is enabled. It's a slideshow on a single 690.
> 2. Max Payne 3 is outstanding and 120fps smooth with quad SLI and 8xMSAA. Again, a slideshow on a single 690.
> *3. Metro 2033 has almost perfect scaling with quad SLI, with fps 4x that of a single 680.*


I do not see how you are not heavily VRAM-limited with the 2GB of VRAM you have.

Also, 2x GTX 680 = 20-30% faster than 1 GTX 690, so I do not see how this "4x a single GTX 680" could ever be posted..

I run all those games maxed out at 2560x1440 and 1920x1080 @ 120Hz, smooth as silk. But I also have 4GB of VRAM.

Also, with high-res AA etc., Kepler's weakness is bandwidth, which can be helped by overclocking the memory. I can overclock mine to 7200MHz; how far can you with all those cards stuffed in there like that? For a simpler reference, that is +600 on the memory offset.
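One way the "+600 offset = 7200MHz" numbers line up, assuming (as is common for these tools) that the overclocking utility reports and offsets the double-pumped GDDR5 clock while the marketing figure is the quad-pumped effective rate:

```python
# The GTX 690's GDDR5 runs at 1502MHz actual, advertised as 6008MHz effective.
# Tools like Precision X typically show the double-pumped clock (3004MHz) and
# apply the offset to it; the effective rate is then twice the reported clock.
reported_stock = 3004               # MHz, as shown by the overclocking tool
offset = 600                        # MHz, the "+600 memory offset"
effective = 2 * (reported_stock + offset)
print(effective)                    # 7208 -- the "7200MHz" figure, rounded
```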


----------



## Divineshadowx

Quote:


> Originally Posted by *zkalra*
> 
> Uhhhh, you are incorrect. Extreme AA is an option in Sleeping Dogs. I would AGAIN request you to enable it, provided you even have the game, and then run the in-game benchmark. I get 107 with quad SLI and 57 with a single 690.
> And the game looks positively next-gen with Extreme AA, especially in the nighttime missions with the reflections etc. on the road.
> EDIT - apologies for the double post. iPad working its wonders.


Did you read what I said? Extreme AA is a name they gave the option in the game to enable SSAA, which renders the game at a higher resolution to make it look crisp. Basically the same thing as ubersampling in The Witcher 2, just another name the devs gave it to make it seem epic. And it is epic, but it's not their technology; it's built into the GPU.


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> I do not see how you are not heavily VRAM-limited with the 2GB of VRAM you have.
> Also, 2x GTX 680 = 20-30% faster than 1 GTX 690, so I do not see how this "4x a single GTX 680" could ever be posted..
> I run all those games maxed out at 2560x1440 and 1920x1080 @ 120Hz, smooth as silk. But I also have 4GB of VRAM.


Which is what I would expect with 2 680s; at least that's what I think you're running. Those games should never need more than a single 690 / 2 680s; I've never seen any of those games need more in any benchmark. SSAA/8xMSAA really doesn't take as much VRAM as you would think. I'm not sure why yet, but I'm fairly sure that more physical monitors are more demanding; I guess SSAA doesn't render everything at a higher res (shadows, post-processing, etc.). Together they would destroy anything, though; it's probably not possible to run 2x2 SSAA combined with 8xMSAA or higher, maybe in some games... And 20-30% is a bit too much: 2 680s are within 5-7% of a single 690, and with a good OC on water probably up to 15%. The scaling of quad 690s vs quad 680s is pretty much the same; the VRAM kills it, though.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> Which is what I would expect with 2 680s; at least that's what I think you're running. Those games should never need more than a single 690 / 2 680s; I've never seen any of those games need more in any benchmark. SSAA/8xMSAA really doesn't take as much VRAM as you would think. I'm not sure why yet, but I'm fairly sure that more physical monitors are more demanding; I guess SSAA doesn't render everything at a higher res (shadows, post-processing, etc.). Together they would destroy anything, though; it's probably not possible to run 2x2 SSAA combined with 8xMSAA or higher, maybe in some games... And 20-30% is a bit too much: 2 680s are within 5-7% of a single 690, and with a good OC on water probably up to 15%. The scaling of quad 690s vs quad 680s is pretty much the same; the VRAM kills it, though.


My % figures were factoring in overclocks and benchmarks. Lurking over on AnandTech you see 1000000000 benchmarks a day lol..

But anywho, I have run into VRAM bumps myself; that is why I sold my 2GB 680s to get 4GB 680s. Massive AA and texture mods and 2GB goes up in dust.

Then again, I could have kept all my 680s and been running quad SLI @ 2GB VRAM, but knowing that 680 SLI is more than enough to max anything out with smooth gameplay, I opted to sell the two 2GB cards I had.
VRAM is the more important factor. Also put funds in my bank for GTX 780s... Can't wait..

The real factor is the memory bus of these Kepler cards; with AA, textures, etc. it is the bottleneck.

A card running these settings with, let's say, a stock core clock and a +500 memory overclock will steamroll a card with a 1250MHz core clock and no memory overclock.

Again, just visit the AnandTech GPU section and you will see the benchmarks posted 100000000x lol... I swear they love to go into detail...


----------



## zkalra

Quote:


> Originally Posted by *Divineshadowx*
> 
> Did you read what I said? Extreme AA is a name they gave the option in the game to enable SSAA, which renders the game at a higher resolution to make it look crisp. Basically the same thing as ubersampling in The Witcher 2, just another name the devs gave it to make it seem epic. And it is epic, but it's not their technology; it's built into the GPU.


The simple point is that ubersampling/extreme sampling, or whatever you want to call it, provides incredible scaling when offloaded to the GPUs, as is evident in MP3 and Sleeping Dogs. Where is the argument in that?


----------



## DinaAngel

bump


----------



## Divineshadowx

Quote:


> Originally Posted by *zkalra*
> 
> The simple point is that ubersampling/extreme sampling, or whatever you want to call it, provides incredible scaling when offloaded to the GPUs, as is evident in MP3 and Sleeping Dogs. Where is the argument in that?


There is no argument; they provide good scaling in every game. I'm just saying that those 2 games aren't as demanding as others. Take off the extreme sampling and see what you get. SSAA destroys any GPU in mostly every game; at 3x3 SSAA I get 0.4fps in Crysis 2. Maybe it's a VRAM issue, I'm not even sure, because I can't actually see anything as the game freezes.


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> My % figures were factoring in overclocks and benchmarks. Lurking over on AnandTech you see 1000000000 benchmarks a day lol..
> But anywho, I have run into VRAM bumps myself; that is why I sold my 2GB 680s to get 4GB 680s. Massive AA and texture mods and 2GB goes up in dust.
> Then again, I could have kept all my 680s and been running quad SLI @ 2GB VRAM, but knowing that 680 SLI is more than enough to max anything out with smooth gameplay, I opted to sell the two 2GB cards I had.
> VRAM is the more important factor. Also put funds in my bank for GTX 780s... Can't wait..
> The real factor is the memory bus of these Kepler cards; with AA, textures, etc. it is the bottleneck.
> A card running these settings with, let's say, a stock core clock and a +500 memory overclock will steamroll a card with a 1250MHz core clock and no memory overclock.
> Again, just visit the AnandTech GPU section and you will see the benchmarks posted 100000000x lol... I swear they love to go into detail...


Yeah, no point in 2 690s or 4 680s if you don't have 4GB, because the VRAM will be the issue before the power of the cards; I've posted this about 500 times. 780s as in the rumored GK110, or Maxwell? Because I don't see a point in GK110 since my 690/690s destroy everything so far. I will upgrade at Maxwell/4K res.


----------



## DinaAngel

Hi, I got my EVGA 690 Signature edition now. Does anyone know if it's worth replacing any parts on the PCB to be able to OC further? How is the new driver performance-wise? One came out a few days ago.
I managed +145 on the core and +200 on the memory. I'm a bit scared to raise the memory a lot, but +145 on the core seems to crash the drivers often. Why is this?

Does anyone have tips to share on how to OC it? I use EVGA's OC tool, Precision X.
I always think there are things I can learn.

In games my min fps is quite low, and I do have drops, and often my 690 doesn't show it's being fully used, for example in Borderlands 2.
Is there some way I can try to fix this, or is it just a driver issue, or is it my CPU not being fully used by Borderlands 2?
I would doubt it would be my CPU, as I've seen the overlay say the 690 is being used at 50% or so on each GPU in single-core games; pretty much the same with Borderlands 2 maxed out, but in Borderlands 2 I get 120 or so fps with it maxed out, though I have drops sometimes.

Kind regards
Dina


----------



## Divineshadowx

Quote:


> Originally Posted by *DinaAngel*
> 
> Hi, I got my EVGA 690 Signature edition now. Does anyone know if it's worth replacing any parts on the PCB to be able to OC further? How is the new driver performance-wise? One came out a few days ago.
> I managed +145 on the core and +200 on the memory. I'm a bit scared to raise the memory a lot, but +145 on the core seems to crash the drivers often. Why is this?
> Does anyone have tips to share on how to OC it? I use EVGA's OC tool, Precision X.
> I always think there are things I can learn.
> In games my min fps is quite low, and I do have drops, and often my 690 doesn't show it's being fully used, for example in Borderlands 2.
> Is there some way I can try to fix this, or is it just a driver issue, or is it my CPU not being fully used by Borderlands 2?
> I would doubt it would be my CPU, as I've seen the overlay say the 690 is being used at 50% or so on each GPU in single-core games; pretty much the same with Borderlands 2 maxed out, but in Borderlands 2 I get 120 or so fps with it maxed out, though I have drops sometimes.
> Kind regards
> Dina


Replace part of the PCB to do what, remove the voltage regulator? You'd probably end up destroying your card if you mod it that far, and the power target is 135% max, so you would be limited in your OC anyway, since voltage is part of the power draw. NVIDIA already designed the 690 to be more than overclocking-ready; I wouldn't go further. A higher memory clock results in overall better fps, so raise the memory. You can go much higher than +200; I've gotten to +350 without trying and without raising voltage. I get good fps in Borderlands 2, others don't, not sure why; I'm on the latest driver.


----------



## DinaAngel

Isn't there a resistor for it? I've seen you can change it on the 680 or so to make more volts go through.
Thanks, I'm unsure how far I could go. I tried going quite high on the memory a few days ago, but that wasn't stable at all.


----------



## Arizonian

Quote:


> Originally Posted by *DinaAngel*
> 
> Hi, I got my EVGA 690 Signature edition now. Does anyone know if it's worth replacing any parts on the PCB to be able to OC further? How is the new driver performance-wise? One came out a few days ago.
> I managed +145 on the core and +200 on the memory. I'm a bit scared to raise the memory a lot, but +145 on the core seems to crash the drivers often. Why is this?
> Does anyone have tips to share on how to OC it? I use EVGA's OC tool, Precision X.
> I always think there are things I can learn.
> In games my min fps is quite low, and I do have drops, and often my 690 doesn't show it's being fully used, for example in Borderlands 2.
> Is there some way I can try to fix this, or is it just a driver issue, or is it my CPU not being fully used by Borderlands 2?
> I would doubt it would be my CPU, as I've seen the overlay say the 690 is being used at 50% or so on each GPU in single-core games; pretty much the same with Borderlands 2 maxed out, but in Borderlands 2 I get 120 or so fps with it maxed out, though I have drops sometimes.
> Kind regards
> Dina


Your overclock is very on par with a lot of GTX 690s on air, and I'm noticing a trend.

My offset of +129 Core / +196 Memory gives me 1176 Core on GPU #1 and 1202 Core on GPU #2. Bringing memory to stock doesn't improve my stable core overclock either. I always find the top stable core clock first, then work on memory. After +200, my memory offset forces the GPU driver to fail.

There is no need at all to mod the GTX 690, as well built as it is, unless it's going under water. Even going under water has minimal gains in overclock, but that's more of a 28nm fabrication limitation than anything specific to Kepler or Tahiti.

I don't have Borderlands 2, so I can't help with your other question.

On a side note: I have a question for everyone regarding *Power Target*.

Long story short... I turned down Power Target to +100 and found I'm still getting the same Kepler boost overclocks as when my Power Target is set to +130.

Turning down Power Target to +100 seems to keep my temps below 70C more often than not.

I'm still seeing my core hit 1170MHz regardless of Power Target.

Has anyone else messed with Power Target and seen the same results regardless of setting?


----------



## DinaAngel

Quote:


> Originally Posted by *Arizonian*
> 
> Your overclock is very on par with a lot of GTX 690s on air, and I'm noticing a trend.
> My offset of +129 Core / +196 Memory gives me 1176 Core on GPU #1 and 1202 Core on GPU #2. Bringing memory to stock doesn't improve my stable core overclock either. I always find the top stable core clock first, then work on memory. After +200, my memory offset forces the GPU driver to fail.
> There is no need at all to mod the GTX 690, as well built as it is, unless it's going under water. Even going under water has minimal gains in overclock, but that's more of a 28nm fabrication limitation than anything specific to Kepler or Tahiti.
> I don't have Borderlands 2, so I can't help with your other question.
> 
> On a side note: I have a question for everyone regarding *Power Target*.
> Long story short... I turned down Power Target to +100 and found I'm still getting the same Kepler boost overclocks as when my Power Target is set to +130.
> Turning down Power Target to +100 seems to keep my temps below 70C more often than not.
> I'm still seeing my core hit 1170MHz regardless of Power Target.
> Has anyone else messed with Power Target and seen the same results regardless of setting?


Ohh, thanks!!

I've read about and tested Power Target. Try getting NVIDIA Inspector and looking at power %, but they say it has nothing to do with it since you can't go higher than the max, so the Power Target isn't being used, at least from what I've read has been checked and tested.
I wish someone could mod the BIOS and turn power savings off; I might watercool my 690 if unlocked BIOSes appear. You can increase the voltage, but then it doesn't downclock again; apparently you have to use the NVIDIA Inspector console.


----------



## Arkanor

I don't really run it anymore, but from what I've heard NVIDIA's support for Linux has gone markedly downhill. It used to be the other way around.


----------



## Qu1ckset

Got my GTX690 underwater!


----------



## Ruby Rabbit

Nice One! Looks great!

How are the temps and OC range?


----------



## Qu1ckset

Quote:


> Originally Posted by *Ruby Rabbit*
> 
> Nice One! Looks great!
> 
> How are the temps and OC range?


I've never OC'd my 690 yet, never had to; it destroys everything the way it is for now. But here are my before and after temps.

Before:
CPU @ 4.6GHz: Idle 45C, Load 58-62C
GTX 690: Idle 45C, Load 60-65C (most games), 75-80C (BF3)

After:
CPU @ 4.6GHz: Idle 29C, Load 45C max
GTX 690: Idle 22C, Load 35C max (haven't tested BF3 yet, but ran Heaven at max settings)


----------



## V3teran

I love my 690 and wouldn't change it for anything, even 2x 680s at 4GB, which I could easily do.
I also had those in SLI and they aren't worth it, but 1x 690 is great with MSAA+SGSSAA at 1920x1200.
I play lots of games and this card eats all of them at my res with some nice AA and all the GPU options set to maximum.
The only time I will upgrade from this card is when the 790 or 890 dual-GPU appears.


----------



## PhantomTaco

Thought I asked to be added to the club, regardless here's mine:



And a photo of my build right now:



Really want to trade my white corsair dominator ram for something in black now that my case interior is no longer white...oh wells


----------



## Arizonian

Quote:


> Originally Posted by *PhantomTaco*
> 
> Thought I asked to be added to the club, regardless here's mine:
> 
> 
> 
> 
> 
> 
> And a photo of my build right now:
> 
> 
> 
> Really want to trade my white corsair dominator ram for something in black now that my case interior is no longer white...oh wells


Rig looks really cool with the green tubing. I got you added - Welcome to the club.


----------



## PhantomTaco

Quote:


> Originally Posted by *Arizonian*
> 
> Rig looks really cool with the green tubing. I got you added - Welcome to the club.


Thanks! Tubing isn't green though...Mayhem's


----------



## Qu1ckset

I'm honestly getting annoyed. I keep getting a bit of screen flickering on loading screens for SC2 and LoL; I haven't really tried any other games just yet. At first I thought it was my monitor going, but this only happens at loading screens. I installed different drivers and it still happens. Anyone else having these issues?


----------



## pilla99

Quote:


> Originally Posted by *Qu1ckset*
> 
> I'm honestly getting annoyed. I keep getting a bit of screen flickering on loading screens for SC2 and LoL; I haven't really tried any other games just yet. At first I thought it was my monitor going, but this only happens at loading screens. I installed different drivers and it still happens. Anyone else having these issues?


SC2 is about the only game I have been playing lately. Max at 1440p gives me 120's to 80's for FPS depending. No vsync but no weird tearing or flickering.


----------



## Qu1ckset

Quote:


> Originally Posted by *pilla99*
> 
> SC2 is about the only game I have been playing lately. Max at 1440p gives me 120's to 80's for FPS depending. No vsync but no weird tearing or flickering.


I have no issues in-game, and I'm not talking about a V-sync issue; it's only on the loading screens.


----------



## Ruby Rabbit

Quote:


> Originally Posted by *PhantomTaco*
> 
> Thought I asked to be added to the club, regardless here's mine:
> 
> And a photo of my build right now:
> 
> Really want to trade my white corsair dominator ram for something in black now that my case interior is no longer white...oh wells


@PhantomTaco

Nice Job! Your rig looks great.
Is that a 650D case? I notice you are running a single 240 or 280 radiator.
I am still waiting for my cooling gear to arrive, and I have a 650D case. I was wondering about your temps (CPU and 690) with the single rad.
I have ordered a 280 for the top of the case and a 200 for the front. However, if I can get away with a single rad at the top, that would be preferable.


----------



## PhantomTaco

Quote:


> Originally Posted by *Ruby Rabbit*
> 
> @PhantomTaco
> Nice Job! Your rig looks great.
> Is that a 650D case? I notice you are running a single 240 or 280 radiator.
> I am still waiting for my cooling gear to arrive, and I have a 650D case. I was wondering about your temps (CPU and 690) with the single rad.
> I have ordered a 280 for the top of the case and a 200 for the front. However, if I can get away with a single rad at the top, that would be preferable.


Thanks for the praise haha. Unfortunately (for you I guess in this scenario lol) you got a few pieces wrong. I'm actually using an NZXT Switch 810 case, and I've got a 240mm rad on bottom and a slim (XSPC EX) 420mm rad up top. You should not run a 690 and processor off a single 240 or 280mm rad (maybe a 280mm Alphacool Monsta? even then it may be too much for the rad to dissipate)


----------



## Ruby Rabbit

Quote:


> Originally Posted by *PhantomTaco*
> 
> Thanks for the praise haha. Unfortunately (for you I guess in this scenario lol) you got a few pieces wrong. I'm actually using an NZXT Switch 810 case, and I've got a 240mm rad on bottom and a slim (XSPC EX) 420mm rad up top. You should not run a 690 and processor off a single 240 or 280mm rad (maybe a 280mm Alphacool Monsta? even then it may be too much for the rad to dissipate)


LOL, I guess I did get it wrong.
Hmm, I was going to run the 280 and a 200 on the same loop; images of another guy's build below, which I was going to follow. I was also hoping to run 5GHz and the 690 (+180) for the rendering I do, along with gaming. It's probably a bit overkill for gaming.


----------



## Jessekin32

Figured I'd better post my proof again since I still haven't been added xD

Signature EVGA. Link below in case you need it.



http://i.imgur.com/gYVma.jpg


----------



## max883

Just got my HTPC case with an XSPC water cooling kit.


----------



## Ruby Rabbit

Quote:


> Originally Posted by *max883*
> 
> Just got my HTPC case with an XSPC water cooling kit.


Very Nice! What model is that case?


----------



## Ruby Rabbit

Quote:


> Originally Posted by *Ruby Rabbit*
> 
> Very Nice! What model is that case?


Sorry also wanted to ask which xspc cooling kit you purchased?


----------



## PhantomTaco

Quote:


> Originally Posted by *Ruby Rabbit*
> 
> Sorry also wanted to ask which xspc cooling kit you purchased?


From the looks of it it's the Raystorm RX360 with 2 extra 120mm rads?


----------



## SimpleTech

Quote:


> Originally Posted by *Ruby Rabbit*
> 
> Very Nice! What model is that case?


Look at his sig:

Case
Thermaltake DH101


----------



## Arizonian

Quote:


> Originally Posted by *max883*
> 
> Just got my HTPC case with an XSPC water cooling kit.


Quote:


> Originally Posted by *Jessekin32*
> 
> Figured I'd better post my proof again since I still haven't been added xD
> Signature EVGA. Link below in case you need it.
> 
> 
> 
> 
> 
> 
> 
> 
> http://i.imgur.com/gYVma.jpg


Congrats gentlemen. As always these cards look sooooo good in a system done up well.


----------



## DinaAngel

Hi, I updated my rig with video now, and I got GPU-Z validation of the 690 as well as CPU-Z; I OC'd the 690 and the CPU too. I could provide more proof if needed to get added to the club.

I got the EVGA 690 Signature edition.
My Milkshake brings all the boys to the yard!!!


----------



## Arizonian

Quote:


> Originally Posted by *DinaAngel*
> 
> Hi, I updated my rig with video now, and I got GPU-Z validation of the 690 as well as CPU-Z; I OC'd the 690 and the CPU too. I could provide more proof if needed to get added to the club.
> 
> I got the EVGA 690 Signature edition.
> My Milkshake brings all the boys to the yard!!!


Well that's a first.

A simple picture would have been good too.

Welcome to the club; glad to have you with us.


----------



## Divineshadowx

Ahhh, every time I see the water-cooled systems I want one.

I listed my second 690 on eBay today; once it sells I will watercool my system. But I have to get rid of my Cosmos 2 first, it's so bad for water cooling.

Btw, add me to the club please.


----------



## dynn

Quote:


> Originally Posted by *Divineshadowx*
> 
> Ahhh, every time I see the water-cooled systems I want one.
> 
> I listed my second 690 on eBay today; once it sells I will watercool my system. But I have to get rid of my Cosmos 2 first, it's so bad for water cooling.
> Btw, add me to the club please.


Is the Cosmos II bad for watercooling, like with an H100?

Because I'm about to buy a Cosmos II soon....


----------



## PhantomTaco

The Cosmos 2 is a fantastic case for watercooling. Take a look at some build logs in the liquid cooling section or do a Google search. Either that or the Switch 810 is what I'd go for these days as a water-cooling case.


----------



## Tweetbix

Quote:


> Originally Posted by *dynn*
> 
> Is the Cosmos II bad for watercooling, like with an H100?
> Because I'm about to buy a Cosmos II soon....


Have a look at the Cosmos 2 Club for pics of water cooling:
http://www.overclock.net/t/1199098/cooler-master-cosmos-2-club
Although I only have an H100, from what I've seen it's great for WC.


----------



## Divineshadowx

The Cosmos 2 is a horrible case in general, trust me. I barely fit an H100 in the top with fans, and that was with modding; half the screws are not in. The case is huge but horribly organized. There is a max of about 40mm in the top; the Switch 810 has 90mm, I think. No decent-size radiator with push/pull can fit there; you would have to put a skinny radiator (20-25mm) and a set of fans inside, and then a set of fans in the top mounting compartment. The bottom can hold a single 240mm radiator, with modding another one; too much work for a $350 case. And it also weighs about 100lbs fully loaded, which is insane. Do yourself a favor and get a normal case; for $350 there are many better options.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> The Cosmos 2 is a horrible case in general, trust me. I barely fit an H100 in the top with fans, and that was with modding; half the screws are not in. The case is huge but horribly organized. There is a max of about 40mm in the top; the Switch 810 has 90mm, I think. No decent-size radiator with push/pull can fit there; you would have to put a skinny radiator (20-25mm) and a set of fans inside, and then a set of fans in the top mounting compartment. The bottom can hold a single 240mm radiator, with modding another one; too much work for a $350 case. And it also weighs about 100lbs fully loaded, which is insane. Do yourself a favor and get a normal case; for $350 there are many better options.


50mm up top, with many area options for other rads; the Cosmos 2 is the best case ever made..

I do not know by the love of god how you needed to mod it for an H100 to fit?????? I've put mine in 3 different locations with no modding...

Just because you have an issue doesn't mean it sucks; the other 100k people with one know how great the case is..


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> 50mm up top, with many area options for other rads; the Cosmos 2 is the best case ever made..
> I do not know by the love of god how you needed to mod it for an H100 to fit?????? I've put mine in 3 different locations with no modding...
> Just because you have an issue doesn't mean it sucks; the other 100k people with one know how great the case is..


Lol. I measured it; it was 40mm. Maybe find a ruler next time. And even if it was 50mm, the H100 is 25mm, plus a 120mm fan which is 25mm, is 50; no more room, or the CPU 8-pin is blocked. Is it really this hard to do simple math? With the Switch 810 you can fit a 60mm rad plus fans. Best case my ass. The doors are an inch thick; what am I using it as, a shield? Half the time it wobbles since it's so damn heavy.


----------



## DinaAngel

Thanks!!!

I found out that my GPU 1 can do +150 on the core and +300 on the memory, and GPU 2 can do +140 on the core and +300 on the memory. Does it help to have the 8-pin connected to PCIE on the RIVE?
Does PCIE spread spectrum help?

Thanks!


----------



## irul77

thats awesome stuff


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> Lol. I measured it; it was 40mm. Maybe find a ruler next time. And even if it was 50mm, the H100 is 25mm, plus a 120mm fan which is 25mm, is 50; no more room, or the CPU 8-pin is blocked. Is it really this hard to do simple math? With the Switch 810 you can fit a 60mm rad plus fans. Best case my ass. The doors are an inch thick; what am I using it as, a shield? Half the time it wobbles since it's so damn heavy.


Maybe it is 40mm for you, but it is 50mm for me.
Also, if you took the time to look, you would see you can mount a 38mm fan on the inside, then a rad above that on the outside, but since there is a cover over it, it is still on the inside, giving it about 100mm of space.

Wobbles? Seems the surface you have it on is uneven.. Or you need to RMA.. Really, if you're going to put down the best case out there besides a custom case, you should sell it and get something else.


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> Maybe it is 40mm for you, but it is 50mm for me.
> Also, if you took the time to look, you would see you can mount a 38mm fan on the inside, then a rad above that on the outside, but since there is a cover over it, it is still on the inside, giving it about 100mm of space.
> Wobbles? Seems the surface you have it on is uneven.. Or you need to RMA.. Really, if you're going to put down the best case out there besides a custom case, you should sell it and get something else.


Well, maybe you have a different model. 50mm is still too small for a case that big; a decent-size rad is 60mm, good luck fitting that in. And the top is nowhere near 100mm; I have a set of fans installed on the top, mounted to the radiator through the top panel, and the cap fits on with a little room to spare, maybe 10mm, not 65.
And I will sell it, but I don't have the original box or any of the original screws. Gonna be tough.


----------



## Cheesemaster

Update on my quad-SLI with the new drivers...


----------



## Qu1ckset

Have any of you experienced screen flickering on the loading screens of your games on your 690? I'm getting it now and don't know why. I even tried a different monitor and the same thing happens; when actually playing a game everything is perfectly fine. I don't understand why it's doing this.


----------



## ceteris

Quote:


> Originally Posted by *Qu1ckset*
> 
> Have any of you experienced screen flickering on the loading screens of your games on your 690? I'm getting it now and don't know why. I even tried a different monitor and the same thing happens; when actually playing a game everything is perfectly fine. I don't understand why it's doing this.


Not I. Mine has been running fine and dandy since I got it after launch. Have you tried using another monitor or checking the cabling? I hope someone who knows exactly what's going on will respond, because that really sucks.


----------



## pilla99

No problems; latest drivers with OC.
Crappy quality, but the point is no flickering.


----------



## Qu1ckset

Ugh, this is annoying. I wanted to finish this build and be done with it, but now after installing the water block I get this small messed-up problem, and everything else works perfectly!? I don't get it.


----------



## ceteris

Quote:


> Originally Posted by *Qu1ckset*
> 
> Ugh, this is annoying. I wanted to finish this build and be done with it, but now after installing the water block I get this small messed-up problem, and everything else works perfectly!? I don't get it.


A quick Google check says to check your monitor's refresh rate. Have you messed with that yet?


----------



## Qu1ckset

Quote:


> Originally Posted by *ceteris*
> 
> A quick Google check says to check your monitor's refresh rate. Have you messed with that yet?


I tried it on a completely different monitor and it does the same thing.


----------



## Qu1ckset

So disabling SLI gets rid of the problem....


----------



## ceteris

Quote:


> Originally Posted by *Qu1ckset*
> 
> So disabling SLI gets rid of the problem....


Sucks, bro. Maybe you can RMA both and get a new one with the Hydro Copper already installed?


----------



## PhantomTaco

I remember reading somewhere about someone else having this issue on here, but I don't know if they solved it either. Honestly you should get in touch with EVGA and see what they say about it. It's hard to believe it's a faulty card if it's only during loading screens, but I have no idea what else could cause it. Does this only happen with one game or any game during a loading screen?


----------



## Jessekin32

So I resolved my Borderlands 2 issue, for those of you who still have it and need a fix.

All I did was go from "Fullscreen Windowed" to "Fullscreen" and dropped my PhysX to Low (I'm not seeing ANY noticeable graphical difference).

NEW ISSUE:

6050x1080 in Battlefield 3. My GPU usage hangs around 70%-86%, and this is on Ultra with zero MSAA, getting around 32-40 FPS. I KNOW my rig is more than capable of at least a steady 60 FPS maxed in BF3; I just need to find out what's going wrong.

Anyone with any ideas? This is now the only game I'm running into issues with.

Also, before anyone says it's VRAM, please don't tell me that. My single 2GB 680 gave me better performance and higher frames... (granted, it was an MSI Lightning 680).


----------



## Qu1ckset

Quote:


> Originally Posted by *PhantomTaco*
> 
> I remember reading somewhere about someone else having this issue on here, but I don't know if they solved it either. Honestly you should get in touch with EVGA and see what they say about it. It's hard to believe it's a faulty card if it's only during loading screens, but I have no idea what else could cause it. Does this only happen with one game or any game during a loading screen?


I just opened a ticket with them last night. I've only tried it in three games, and it does it in all three: StarCraft 2, Battlefield 3, and League of Legends.


----------



## DinaAngel

I have noticed that my 690 isn't being fully used at either 4.8GHz or 5GHz on my 3930K. Is this a driver thing? In Borderlands 2 with everything maxed out I get only 60% on each GPU, using the latest drivers.

I'm getting 120+ FPS though, so maybe I can't complain, but maybe someone knows why this is happening.
It jumps down in some places and then goes up, and I can just turn and it goes down.
It's not causing lag, but I think Borderlands 2 only uses 2 cores of your CPU.

Update: Oh yeah, it's just Borderlands 2 using 2 cores or so; in Heaven benchmark it's at 99%.


----------



## Divineshadowx

You're not going to see max GPU usage in games like Borderlands 2, or most games in general, because they don't scale well with SLI due to the way the game was made. It's not your CPU or drivers, although drivers sometimes improve SLI performance. Remember, a 690 is technically SLI (2 GPUs). Borderlands 2 is by no means demanding; I have to force 8xMSAA with 8xSGSSAA to get any lag, and that is probably because it doesn't work too well with the game. There is a reason why benchmarks are made, and that is to test the full potential of the GPUs. Even benchmarks don't stress quad-SLI to its max; a single 690, yes, is almost always at 99%.


----------



## DinaAngel

Quote:


> Originally Posted by *Divineshadowx*
> 
> You're not going to see max GPU usage in games like Borderlands 2, or most games in general, because they don't scale well with SLI due to the way the game was made. It's not your CPU or drivers, although drivers sometimes improve SLI performance. Remember, a 690 is technically SLI (2 GPUs). Borderlands 2 is by no means demanding; I have to force 8xMSAA with 8xSGSSAA to get any lag, and that is probably because it doesn't work too well with the game. There is a reason why benchmarks are made, and that is to test the full potential of the GPUs. Even benchmarks don't stress quad-SLI to its max; a single 690, yes, is almost always at 99%.


ok thanks!!


----------



## Sterling84

Just put the bad boy in this week.


----------



## thestache

Quote:


> Originally Posted by *Jessekin32*
> 
> So I resolved my Borderlands 2 issue, for those of you who still have it and need a fix.
> All I did was go from "Fullscreen Windowed" to "Fullscreen" and dropped my PhysX to Low (I'm not seeing ANY noticeable graphical difference).
> NEW ISSUE:
> 6050x1080 in Battlefield 3. My GPU usage hangs around 70%-86%, and this is on Ultra with zero MSAA, getting around 32-40 FPS. I KNOW my rig is more than capable of at least a steady 60 FPS maxed in BF3; I just need to find out what's going wrong.
> Anyone with any ideas? This is now the only game I'm running into issues with.
> Also, before anyone says it's VRAM, please don't tell me that. My single 2GB 680 gave me better performance and higher frames... (granted, it was an MSI Lightning 680).


VRAM would not cause low GPU usage. But it's worth checking your usage with MSI Afterburner or EVGA Precision, because you will hit the limit at some point in that game and your frame rate will halve, if not stall. It's a fact, and you should check when it's happening so you aren't alarmed and think something else is wrong.

As for low GPU usage, Windows core parking could be responsible, but it's more likely drivers. When it happens, do your clock speeds change or get stuck lower than they should be? It used to happen to me all the time; sometimes several Windows restarts would not fix it, and it's just drivers. BF3 was unplayable for a week or so for me with this problem.

I have not had that problem with my 4GB GTX 680 SLI so far, which is good.

What drivers are you using?


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Ahhh, every time I see the water-cooled systems I want one
> 
> 
> 
> 
> 
> 
> 
> I listed my second 690 today on ebay, once it sells I will water cool my system. But I have to get rid of my cosmos 2 first, so bad for water cooling.
> Btw, add me to the club please.


They really are. The NZXT Switch 810 is the best at the moment, I think.


----------



## Arizonian

Quote:


> Originally Posted by *Sterling84*
> 
> Just put the badboy in this week
> 
> 
> Spoiler: Warning: Spoiler!


Congrats.

Let us know how she performs for you should you do any overclocking and/or have any questions.

Welcome to the '690 Owners Club' list and OCN with your "bad boy" first post.

Helpful site for new members - "How to put your Rig in your Sig". See you around the threads.


----------



## Jessekin32

Quote:


> Originally Posted by *thestache*
> 
> VRAM would not cause low GPU usage. But it's worth checking your usage with MSI Afterburner or EVGA Precision, because you will hit the limit at some point in that game and your frame rate will halve, if not stall. It's a fact, and you should check when it's happening so you aren't alarmed and think something else is wrong.
> As for low GPU usage, Windows core parking could be responsible, but it's more likely drivers. When it happens, do your clock speeds change or get stuck lower than they should be? It used to happen to me all the time; sometimes several Windows restarts would not fix it, and it's just drivers. BF3 was unplayable for a week or so for me with this problem.
> I have not had that problem with my 4GB GTX 680 SLI so far, which is good.
> What drivers are you using?


Clock speeds are steady at the max, except for a rare short dip whenever I look at nothing but the ground xD

I'm using the latest official drivers (306.23).

The GPU power is EVERYWHERE in Afterburner, roughly mirroring GPU usage, of course. It's just odd that everything is so weird. A single 680 was giving me better performance and "smoothness."

I just want to figure out what's going on. I might find out when I do a clean install of Windows when I get my new SSD, BUT that won't be for a while, so I want to see if I can get this resolved sooner rather than later.


----------



## thestache

Quote:


> Originally Posted by *Jessekin32*
> 
> Clock speeds are steady at the max, except for a rare short dip whenever I look at nothing but the ground xD
> I'm using the latest official drivers (306.23).
> The GPU power is EVERYWHERE in Afterburner, roughly mirroring GPU usage, of course. It's just odd that everything is so weird. A single 680 was giving me better performance and "smoothness."
> I just want to figure out what's going on. I might find out when I do a clean install of Windows when I get my new SSD, BUT that won't be for a while, so I want to see if I can get this resolved sooner rather than later.


I hate to say it, but clean Windows of the drivers and try again. Maybe even try 306.63.

Normally, when I got low GPU usage and bad performance, my clocks would get stuck at default speeds of 925 or 1070, or one core would get stuck, so it's odd that the power usage is all over the place while the clocks are steady. Is power management set to prefer maximum performance? But I'd have to say it's drivers, unless it was doing it in other games, and the only cure for that is a fresh Windows and a fresh, successful driver installation, or a successful clean and re-install. So good luck, because there is nothing worse than bad drivers messing with GPUs.


----------



## Divineshadowx

Quote:


> Originally Posted by *thestache*
> 
> They really are. NZXT Switch 810 is the best at the moment I think.


Yeah, I would've preferred an aluminium case, but I like the huge roof clearance of the 810; 90mm will fit a very fat 3x140mm rad. How many rads do you think I should get altogether to leave room for the pump and res?


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> Yeah, I would've preferred an aluminium case, but I like the huge roof clearance of the 810; 90mm will fit a very fat 3x140mm rad. How many rads do you think I should get altogether to leave room for the pump and res?


Sorry, I was wrong: the Cosmos 2 does not have 90mm of space, it has 100mm.

2 inches on the inside and 2 inches under the top cover; I just used my ruler.

It is the best case.

The only thing that beats it is something like Mountain Mods.

If you were a case modder and did not want the fan holes there, you could cut them away and open up the whole 100mm of space with nothing in between.

Then there is still room for a 240mm, as fat as you want it, in the bottom.

Mount the SSDs on the back side of the case, remove the upper HDD bay, and all the res/pump room in the world is yours.


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> Sorry, I was wrong: the Cosmos 2 does not have 90mm of space, it has 100mm.
> 2 inches on the inside and 2 inches under the top cover; I just used my ruler.
> It is the best case.
> 
> The only thing that beats it is something like Mountain Mods.
> If you were a case modder and did not want the fan holes there, you could cut them away and open up the whole 100mm of space with nothing in between.
> Then there is still room for a 240mm, as fat as you want it, in the bottom.
> Mount the SSDs on the back side of the case, remove the upper HDD bay, and all the res/pump room in the world is yours.


How on earth does it have 100mm of space below the roof? Go to my profile and look at my pic; notice I don't have any fans under my H100, because they simply would not fit. Finally, using a very small screwdriver and pliers, I mounted 2 more on the bottom, and this is with half the screws not in the fans, because any tighter and the fan would block my CPU 8-pin. That is an H100 (25mm) plus a fan (25mm), which is 50mm.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> How on earth does it have 100mm of space below the roof? Go to my profile and look at my pic; notice I don't have any fans under my H100, because they simply would not fit. Finally, using a very small screwdriver and pliers, I mounted 2 more on the bottom, and this is with half the screws not in the fans, because any tighter and the fan would block my CPU 8-pin. That is an H100 (25mm) plus a fan (25mm), which is 50mm.


To be honest, there is more than 50mm inside, more like 55mm inside and 50mm under the cover on top.

2 inches = 50.8mm


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> To be honest, there is more than 50mm inside, more like 55mm inside and 50mm under the cover on top.
> 2 inches = 50.8mm


Yeah, either you have a different model (I know there is more than one), or your motherboard is laid out differently; with my current config, 1 more mm would block the CPU 8-pin. And I want a 60mm rad, so I will probably go with the 810.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> Yeah, either you have a different model (I know there is more than one), or your motherboard is laid out differently; with my current config, 1 more mm would block the CPU 8-pin. And I want a 60mm rad, so I will probably go with the 810.


An 80mm 240 in the bottom and a 50mm mounted on top, with 38mm fans mounted inside in push config, would be a better option.

You could also mod the top by cutting it to fit the 60mm rad and still be able to have 38mm fans in push config.

When picking a case you have to step back and look at all the options; they are there if you are willing to do the mods.


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> An 80mm 240 in the bottom and a 50mm mounted on top, with 38mm fans mounted inside in push config, would be a better option.
> You could also mod the top by cutting it to fit the 60mm rad and still be able to have 38mm fans in push config.
> When picking a case you have to step back and look at all the options; they are there if you are willing to do the mods.


If I cut the top off, there would be nowhere to mount it. I just don't see how a case that weighs 50lbs, is absolutely gigantic, and costs $350 is a good case when you actually have to mod it and STILL get lower performance than a Switch 810. With the 810, you have 90mm for a 420 rad and can install two 280s in the bottom with no alterations; you could even get a third if you tried. I don't see how that can be a bad deal. Also, it looks cool and has a window so you can actually see your build. And don't tell me to get one for the Cosmos 2, because that is just more money in an already expensive case, it takes patience for someone to create it, and there are risks.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> If I cut the top off, there would be nowhere to mount it. I just don't see how a case that weighs 50lbs, is absolutely gigantic, and costs $350 is a good case when you actually have to mod it and STILL get lower performance than a Switch 810. With the 810, you have 90mm for a 420 rad and can install two 280s in the bottom with no alterations; you could even get a third if you tried. I don't see how that can be a bad deal. Also, it looks cool and has a window so you can actually see your build. And don't tell me to get one for the Cosmos 2, because that is just more money in an already expensive case, it takes patience for someone to create it, and there are risks.


The 810 is a budget case, but you can fit the 420 rad in it, yes...

What you can fit in the Cosmos 2, if you really want to get into detail, is this:

4 360 rads and 2 240mm rads, or 3 360 rads and 3 240mm rads.

You can also remove the top and fit a 420 rad on the top of the Cosmos 2 if you like.

So the only thing the 810 has over the Cosmos 2 is a side window.


----------



## ceteris

Hey Divineshadowx! If you were prepared to drop $350+ on a case for watercooling, why not take a look at CaseLabs? It has the aluminium you want.

The Switch 810 is awesome for the price, but I didn't like how you had to cut out a portion of the drive bay area to fit push/pull.


----------



## Qu1ckset

So after doing further testing, I found that when I set my games to windowed mode the problem goes away... but in fullscreen it happens...


----------



## Buzzkill

Quote:


> Originally Posted by *Qu1ckset*
> 
> So after doing further testing, I found that when I set my games to windowed mode the problem goes away... but in fullscreen it happens...


Have you tried all the different drivers?

Xtreme-G Nvidia Graphics Drivers

Xtreme-G 305.67

Xtreme-G 306.63

NVIDIA GeForce 306.63 Driver Download


----------



## Qu1ckset

Quote:


> Originally Posted by *Buzzkill*
> 
> Have you tried all the different drivers?
> Xtreme-G Nvidia Graphics Drivers
> Xtreme-G 305.67
> Xtreme-G 306.63
> NVIDIA GeForce 306.63 Driver Download


I'll try these ones, but how come the highest they have on the NVIDIA site is 306.23??


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> The 810 is a budget case, but you can fit the 420 rad in it, yes...
> What you can fit in the Cosmos 2, if you really want to get into detail, is this:
> 4 360 rads and 2 240mm rads, or 3 360 rads and 3 240mm rads.
> You can also remove the top and fit a 420 rad on the top of the Cosmos 2 if you like.
> So the only thing the 810 has over the Cosmos 2 is a side window.


Okay, now I wonder if you're trolling. 4 360 rads and 2 240mm rads? Using the way you do your calculations, I can fit an infinite number of rads in the Switch 810, because I can stack them on top of each other until I reach the ionosphere...

The Cosmos 2 literally has 3 places for rads: the top, one bottom, and another bottom one with a mod. How on earth did you get 6? Did you count the side door, because somehow it doesn't interfere with your GPU? You probably taped another rad to the right side door... and maybe superglued one to the back I/O ports? Lol.

If you love the Cosmos 2 so much, then by all means go ahead and use it; I'm just giving the facts, as an owner of the case, to those looking to buy one.


----------



## Divineshadowx

Quote:


> Originally Posted by *Qu1ckset*
> 
> I'll try these ones, but how come the highest they have on the NVIDIA site is 306.23??


306.63 is a dev driver, which you can find by signing up on NVIDIA's site and going through some steps. I wouldn't use it, as it isn't even in beta. I guess you can try it, though. Since you're having problems, did you try reinstalling the games? That usually works when I find a glitch like that.


----------



## Qu1ckset

Quote:


> Originally Posted by *Divineshadowx*
> 
> 306.63 is a dev driver, which you can find by signing up at nvidia's site by going through some steps. I wouldn't use it as it isn't even in the beta state. I guess you can try it though. Since you're having problems, did you try reinstalling the games? Usually works if I find a glitch like that.


It doesn't work anyway; EVGA just approved an RMA. I'm just trying to ask if I can ship it with my Hydro Copper block on and have them ship me a Hydro Copper card, because I don't have enough thermal pads to put the stock air cooler back on.


----------



## Buzzkill

OpenGL Driver Support

Here is the Nvidia link. I am using 306.63 with EVGA 690s.

Here is a thread on the forum if you want to read through it:
306.23 SLI Driver issues - fixed and would like to share my experience with OCN


----------



## Buzzkill

Quote:


> Originally Posted by *Qu1ckset*
> 
> It doesn't work anyway; EVGA just approved an RMA. I'm just trying to ask if I can ship it with my Hydro Copper block on and have them ship me a Hydro Copper card, because I don't have enough thermal pads to put the stock air cooler back on.


The instructions for backplates say to keep all parts, because if you have a problem you have to return the card the same as you received it. So I would say you need to put the air cooler back on.

The EVGA Warranty Policy
EVGA Support
All products must be returned in its original condition. Products received by EVGA for replacement that include 3rd-party attachments (CPU heatsink backplate, memory chip heatsinks, etc) will have all attachments removed and discarded. EVGA holds the discretion to return the product to the sender if necessary.

The product must be returned to EVGA in the original factory configuration and condition. All aftermarket modifications must be reversed before sending in the product for replacement.

EVGA reserves the right to claim for shipping fees as well as a service charge from you for any incomplete or modified product that is returned and requires repair or replacement, or when you are not entitled to any coverage under EVGA's warranty. Service charges may vary based upon the actual material and labor cost to replace missing or modified parts back to their original factory condition.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> Okay, now I wonder if you're trolling. 4 360 rads and 2 240mm rads? Using the way you do your calculations, I can fit an infinite number of rads in the Switch 810, because I can stack them on top of each other until I reach the ionosphere...
> The Cosmos 2 literally has 3 places for rads: the top, one bottom, and another bottom one with a mod. How on earth did you get 6? Did you count the side door, because somehow it doesn't interfere with your GPU? You probably taped another rad to the right side door... and maybe superglued one to the back I/O ports? Lol.
> If you love the Cosmos 2 so much, then by all means go ahead and use it; I'm just giving the facts, as an owner of the case, to those looking to buy one.


And I'm giving the facts. My Photoshop skills suck, but you can make it out for yourself... the 810 is too small to do all this... and a budget case.

Sometimes we have to help shine a light on things for new members like yourself, but in the end, if you understand it, it will change your way of thinking when you look at a computer.
Click on the picture and it will get larger; you can see the 360 rad locations, all hidden within the case, all of which have been done by members before and proven to work.


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> And I'm giving the facts. My Photoshop skills suck, but you can make it out for yourself... the 810 is too small to do all this... and a budget case.
> Sometimes we have to help shine a light on things for new members like yourself, but in the end, if you understand it, it will change your way of thinking when you look at a computer.
> Click on the picture and it will get larger; you can see the 360 rad locations, all hidden within the case, all of which have been done by members before and proven to work.


Why, thank you for saving me a 100lb lift and for showing that there are 3 rad locations yourself. As anyone with eyes can see: 1 rad on the roof, another on top, each 25mm (as you have the Corsair H100), and maybe 2 more on the bottom. The top location wouldn't actually work, because there is no way to run tubing up there other than externally; you would probably have to remove the cover, which would make the rig look unpolished. May I ask why you have not watercooled your system? Oh yeah, because there is no room in that case to do it. The bay area where you have the fan would be reserved for the pump and reservoir. Two 240mm on the bottom with a mod, one 25mm top in a push config, and one 25mm rad on the top, with no fans unless you remove the top cover. NZXT Switch 810: a 420 rad of any depth on the top, and up to three 280mm rads. GG.

Btw, what is the point of being a senior member if you can't give accurate information to a "noob" like myself? What does that tell the people on this forum, hmm? Now I will stop talking to a brick wall and move on to something productive.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> Why, thank you for saving me a 100lb lift and for showing that there are 3 rad locations yourself. As anyone with eyes can see: 1 rad on the roof, another on top, each 25mm (as you have the Corsair H100), and maybe 2 more on the bottom. The top location wouldn't actually work, because there is no way to run tubing up there other than externally; you would probably have to remove the cover, which would make the rig look unpolished. May I ask why you have not watercooled your system? Oh yeah, because there is no room in that case to do it. The bay area where you have the fan would be reserved for the pump and reservoir. Two 240mm on the bottom with a mod, one 25mm top in a push config, and one 25mm rad on the top, with no fans unless you remove the top cover. NZXT Switch 810: a 420 rad of any depth on the top, and up to three 280mm rads. GG.
> Btw, what is the point of being a senior member if you can't give accurate information to a "noob" like myself? What does that tell the people on this forum, hmm? Now I will stop talking to a brick wall and move on to something productive.


It seems the brick wall is yourself, my friend; if you made the picture larger, you would see I outlined the rad locations in red.

And yes, I have watercooled in the past but do not any more. All of these can be done.

For the rad on top you would have to use a drill bit to make holes for the tubes; the whole front area under the 5.25" bays can hold a res/pump, along with a res/pump inside a 5.25" bay.

Anyway, do you know the cooling results? One 50mm 240mm rad is enough for 3 overclocked GPUs; one skinny 360 rad is enough for a CPU.

Or, by modding the top of the case, you could do a fat 360 rad for the CPU and 3 GPUs.

But why did you buy a Cosmos II if you did not want it? All the info about it is out there plain as day, including the weight of the case.

All the rad location info is out there and has been done. Use Google...

But at this point, as nothing is sinking in, I'd suggest selling the Cosmos 2 and picking up the cheaper budget case you seem to like so much. Why you did not in the first place, I do not know.

An Antec 620 cooler can keep a GPU under 55C silently, for a reference point.

And I have 118 rep for a reason; 75% of it has come from cooling advice, and I'm quite good at case modding as well.

My former case: a Cosmos 1000 with lots of custom work, including red powder coating, making my own side window, custom fan cuts, etc.


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> It seems the brick wall is yourself my friend if u made the picture larger u would see i outlined rad locations in red.
> And yes i have water cooled in the past but do not any more.. all of these can be done..
> The rad on top you would have to use a drill bit to make holes for tubes the whole front area under the 5.25 bays can have res/pump along with res/pump inside 5.25 bay.
> Anyways do you know cooling results? One 50mm 240 mm rad is enough for 3 Gpus over clocked one skinny 360 rad is enough for a cpu..
> Or modding the top of the case you could do a Fat 360 rad for cpu/3gpus..
> But why did you buy a Cosmos II if you did not want it? all the info about is out there plan as day including the weight of the case..
> All the rad locations info is out there and been done.. Use Google...
> But at this point as nothing is sinking in id suggest selling the Cosmos 2 and pick up the cheaper build budget case you seem to like so much.. Why you did not in the first place i do not know?
> An antec 620 cooler can keep a Gpu under 55c silent.. for a Ref stand point.
> And i have 118 rep for a reason.. 75% of it has come from Cooling advice im quite good at case modding aswell...
> 
> My Former case Cosmos 1000 Lots of custom work.. Including Red powder coating and making my own Side window Custom fan cuts etc.


If I wanted a 50mm rad I would get a $50 case.

I bought the Cosmos 2 a year ago thinking it was the best, since it cost a lot and was big. That isn't the case, though (Apple, Alienware, Bose, Beats by Dre).

Now, since you're so eager to prove your point, watercool the Cosmos 2 or post a pic of someone that has 4 360 rads and 2 240 rads. Until then, my point stands, because I know it can't be done. End of story.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> If I wanted a 50mm rad I would get a $50 case.
> I bought the Cosmos 2 a year ago thinking it was the best, since it cost a lot and was big. That isn't the case, though (Apple, Alienware, Bose, Beats by Dre).
> Now, since you're so eager to prove your point, watercool the Cosmos 2 or post a pic of someone that has 4 360 rads and 2 240 rads. Until then, my point stands, because I know it can't be done. End of story.


It can be done quite easily.

Nobody has done it because it is overkill.

But all of those locations have had rads in them.

And like I said and showed, there is over 100mm of space up top, lol...

Modding the Cosmos 2 top will be easier than swapping your system into that budget case and re-selling the Cosmos 2.

Steps:

1. Buy the 360 60mm rad you want.

2. Measure the rad and where the screw holes are.

3. Cut the Cosmos 2 top to fit where there is clearance for the fans, and drill holes to match the rad's screw holes.

4. Take pleasure in having done something yourself.

To install it at a mid point, so you can push/pull 25mm fans, buy L-brackets from your local hardware store.

To do the same with a 480mm rad, do all the above, then remove the top handle, remove the fan controller, and get a power switch from another case or buy one for $5. Done.

An example of what I told you, with a 480 50mm rad in push/pull.

The Cosmos 2 club is there to help you if you have questions; stop by and ask them.


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> Can be done quite easy..
> Nobody has done it cause it is over kill.
> But all of those locations have had rads in them
> 
> 
> 
> 
> 
> 
> 
> 
> And like I said and showed, there is over 100mm of space up top lol...
> Modding the Cosmos 2 top will be easier than swapping a system into that budget case and reselling the Cosmos 2.
> Steps:
> 1. Buy the 360 60mm rad you want.
> 2. Measure the rad and where the screw holes are.
> 3. Cut the Cosmos 2 top where there is clearance for the fans and drill holes for the rad's screw holes.
> 4. Take pleasure in having done something yourself.
> To install it at a mid point, so you can push/pull 25mm fans, buy L brackets from your local hardware store.
> To do the same with a 480mm rad, do all the above.
> Then remove the top handle.
> Then remove the fan controller.
> Then get a power switch from another case or buy one for $5. Done.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> An example of what I told you, with a 480 50mm rad in push/pull.
> 
> 
> The Cosmos 2 club is there to help; if you have questions, stop by and ask.


1st one you can do with any case; much easier with others because the Cosmos uses rivets.
2nd one looks terrible lol; it has that "wow, that doesn't belong there" feel.
3. Admit the Cosmos 2 is a train wreck and let's talk about the 690.


----------



## Divineshadowx

Quote:


> Originally Posted by *ceteris*
> 
> Hey Divineshadowx! If you were prepared to drop $350+ for a case with watercooling, why not take a look at Caselabs? Has the aluminium you want
> 
> 
> 
> 
> 
> 
> 
> 
> Switch 810 is awesome for the price, but I didn't like how you had to cut out a portion of the bay drive area to fit push/pull.


Now that's a case lol. Two sides... custom everything... can probably fit six 480s lol. I really want to buy that... $400, but it will probably last a lifetime.


----------



## Qu1ckset

Hey guys, so I removed my GTX 690 and am just waiting for the last approval email; then my GTX 690 will be mailed out for a replacement!
Do you guys know where I can get thermal pads like in the second pic?


----------



## SimpleTech

Quote:


> Originally Posted by *Qu1ckset*
> 
> Hey guys, so I removed my GTX 690 and am just waiting for the last approval email; then my GTX 690 will be mailed out for a replacement!
> Do you guys know where I can get thermal pads like in the second pic?
> 
> http://i1134.photobucket.com/albums/m619/qu1ckset/20121004_142029.jpg
> 
> http://i1134.photobucket.com/albums/m619/qu1ckset/20121004_135120-Copy.jpg


Assuming they're 0.5mm thick:

http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=35537
http://www.frozencpu.com/cat/l3/g8/c487/s1288/list/p1/Thermal-Thermal_Pads_Tape-Thermal_Pad_-_05mm_-Page1.html


----------



## Buzzkill

Frozen CPU
Performance PCs
SideWinder

For any of you wondering, thermal pad thickness is 0.5mm for all of the EVGA graphics cards. Any 0.5mm thermal pad you can find at places like FrozenCPU.com will work fine if you need another.

If you need them fast, Performance PCs ships the same day most of the time, and First Class mail should get your item in 3 days. And you don't need to pay for rush processing.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> Ya, I would've preferred an aluminium case, but I like the huge roof clearance of the 810, 90mm will fit a very fat 3x140mm rad. How many rads do you think I should get all together to leave room for the pump and res?


Yeah the plastic is what ruins it.

I'd say just a 360mm and a 240mm. Plenty of room to still mount a non-bay res and pump artistically. The NZXT Switch 810 thread has some of the best watercooled cases I've seen in years. Check it out.

If I lived in the states I'd get a MountainMods H2GO. Awesome case.

http://www.guru3d.com/articles_pages/guru3d_rig_of_the_month_december_2011,2.html


----------



## dynn

Can someone link me, on Newegg or Amazon, to a really good 1TB HDD?


----------



## Buzzkill

Quote:


> Originally Posted by *dynn*
> 
> Can someone link me, on Newegg or Amazon, to a really good 1TB HDD?


Western Digital WD Black WD1002FAEX 1TB 7200 RPM 64MB Cache SATA 6.0Gb/s 3.5" Internal Hard Drive -Bare Drive $109.99

WD Velociraptor WD1000DHTZ 1TB 3.5" SATA Hard Drive $269.99

The VelociRaptor is discontinued on Newegg, so you have to get it from Amazon or maybe eBay, but at 10,000 RPM it is faster than the Western Digital Black. It's also 2.5 times as expensive, so you decide if you want the best performance.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Qu1ckset*
> 
> Hey guys, so I removed my GTX 690 and am just waiting for the last approval email; then my GTX 690 will be mailed out for a replacement!
> Do you guys know where I can get thermal pads like in the second pic?


Here:

https://www.dazmode.com/store/category/thermal_solutions/


----------



## Stray_Bullet

GTX690 Arctic Accelero review. http://www.kitguru.net/components/graphic-cards/zardon/arctic-accelero-twin-turbo-690-cooler-review-w-asus-gtx690/


----------



## Arizonian

Quote:


> Originally Posted by *Stray_Bullet*
> 
> GTX690 Arctic Accelero review. http://www.kitguru.net/components/graphic-cards/zardon/arctic-accelero-twin-turbo-690-cooler-review-w-asus-gtx690/


Welcome to OCN Stray_Bullet and thanks for the link to that review. I added it to the front page for additional info.


----------



## dynn

Quote:


> Originally Posted by *Buzzkill*
> 
> Western Digital WD Black WD1002FAEX 1TB 7200 RPM 64MB Cache SATA 6.0Gb/s 3.5" Internal Hard Drive -Bare Drive $109.99
> WD Velociraptor WD1000DHTZ 1TB 3.5" SATA Hard Drive $269.99
> The VelociRaptor is discontinued on Newegg, so you have to get it from Amazon or maybe eBay, but at 10,000 RPM it is faster than the Western Digital Black. It's also 2.5 times as expensive, so you decide if you want the best performance.


Thanks so much, ordered the Western Digital.


----------



## dynn

I'm about to order the Cooler Master Cosmos II (it's expensive, but it's great). I also like the Cooler Master HAF X.

Is there any reason to choose the HAF X over the Cosmos II?

Motherboard: Maximus V Formula
Cooling: H100
GPU: GTX 690 (I'm not planning to SLI)

The thing is that if I get the HAF X, I can afford the Ducky DK9008 Shine II Blue LED Backlit Mechanical Keyboard (Red Cherry MX):

http://www.amazon.com/Ducky-DK9008-Backlit-Mechanical-Keyboard/dp/B009E9GNM2/ref=sr_1_7?ie=UTF8&qid=1349241325&sr=8-7&keywords=ducky+shine+2

If not, I can wait until December, when my sister goes to NY again, and buy the keyboard.

I'm looking for full-tower cases with good looks, and the Cosmos II / HAF X seem to be the best ones.


----------



## Big Shabazz

Hey guys, not a 690 owner YET, but just letting you know that they sell these at Micro Center now as well. Didn't see any indication of whether you knew this yet or not.

I'm planning on going the instant-gratification route and buying it from MC for the extra $80.


----------



## pilla99

Any of you guys playing Metro 2033 at 1440p with a single 690? That game is kicking this system's ass. 2560x1440 DX9 very high with an overclock on the GPU gives me 30s for average FPS. High settings gives me 50s.

Is that just me or..?


----------



## Divineshadowx

Quote:


> Originally Posted by *pilla99*
> 
> Any of you guys playing Metro 2033 at 1440p with a single 690? That game is kicking this system's ass. 2560x1440 DX9 very high with an overclock on the GPU gives me 30s for average FPS. High settings gives me 50s.
> Is that just me or..?


Metro 2033 is a ****ty game. Badly optimized. DX11, even though it has limited GPU eye candy besides tessellation and improved shadows, runs much better. If you can get 60+fps in Heaven with a single 690, you should get 60+ in Metro. And for those who say the game uses all the GPUs: yeah, it does, but that doesn't mean it's scaling well in SLI and quad-SLI. Usage literally has zero relationship to how a GPU scales in SLI. Before my 690, I had a 5770. Got another one; usage was at 100% for both, but did I get double the fps? No. I added another one just to test the scaling; all 3 GPUs were at 100%. Did I get triple the fps? No, I got about 210% instead of 300%.


----------



## Hokies83

Quote:


> Originally Posted by *pilla99*
> 
> Any of you guys playing Metro 2033 at 1440p with a single 690? That game is kicking this system's ass. 2560x1440 DX9 very high with an overclock on the GPU gives me 30s for average FPS. High settings gives me 50s.
> Is that just me or..?


That does not seem right; my 2 680s give me an average of about 80-90fps with everything turned up.
I did a Metro 2033 bench-off with a guy on AnandTech and he was getting an average of 55fps, so for some odd reason you're 20fps off from him.

On an i7 930 @ 5GHz and 2 heavily clocked GTX 470s I was able to hold 30-35fps.


----------



## lukeman3000

I'm in the market for a 690. Which specific version should I be looking at? I noticed that there's a "signature series" on newegg.. what are the differences? Different look?

What is "the" 690 to get?


----------



## Arizonian

Quote:


> Originally Posted by *lukeman3000*
> 
> I'm in the market for a 690. Which specific version should I be looking at? I noticed that there's a "signature series" on newegg.. what are the differences? Different look?
> What is "the" 690 to get?


The Signature series comes with a T-shirt and mouse pad for an extra $50; that's all it means.

Any vendor making the GTX 690 (ASUS, EVGA, GIGABYTE, etc.) is fine; they're all the reference card, so there is no difference in specs or components.

It will really do well on your single 23" IPS monitor.


----------



## Hokies83

Quote:


> Originally Posted by *lukeman3000*
> 
> I'm in the market for a 690. Which specific version should I be looking at? I noticed that there's a "signature series" on newegg.. what are the differences? Different look?
> What is "the" 690 to get?


The cheapest one you can find









That Signature crap etc. is just EVGA marketing...


----------



## Buzzkill

Quote:


> Originally Posted by *lukeman3000*
> 
> I'm in the market for a 690. Which specific version should I be looking at? I noticed that there's a "signature series" on newegg.. what are the differences? Different look?
> What is "the" 690 to get?


The EVGA Hydro Copper is factory overclocked. Look at the specifications from EVGA.com:

EVGA GeForce GTX 690 Hydro Copper Signature $1199.99
*Performance*
3072 CUDA Cores
993 MHz GPU
1045 MHz Boost Clock
*Memory*
4096 MB, 512 bit GDDR5
6008 MHz (effective)
384.52 GB/s Memory Bandwidth

EVGA GeForce GTX 690 Signature $1049.99
*Performance*
3072 CUDA Cores
915 MHz GPU
1019 MHz Boost Clock
*Memory*
4096 MB, 512 bit GDDR5
6008 MHz (effective)
384.52 GB/s Memory Bandwidth
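As a sanity check on the spec lines above, the quoted bandwidth follows directly from the listed memory clock and bus width (the GTX 690 advertises its two per-GPU 256-bit buses as one combined "512 bit" figure). A quick back-of-the-envelope calculation:

```python
# Memory bandwidth = effective transfer rate x bus width in bytes.
# The 690 spec sheet lists the two GPUs' 256-bit buses as 512 bit combined.
effective_clock_hz = 6008e6   # 6008 MHz effective GDDR5
bus_width_bytes = 512 / 8     # 512-bit combined bus -> 64 bytes per transfer

bandwidth_gbs = effective_clock_hz * bus_width_bytes / 1e9
print(bandwidth_gbs)  # 384.512, i.e. the ~384.52 GB/s EVGA lists
```

Note both cards list the same bandwidth because the memory clock and bus width are identical; only the core clocks differ.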


----------



## thestache

Quote:


> Originally Posted by *Stray_Bullet*
> 
> GTX690 Arctic Accelero review. http://www.kitguru.net/components/graphic-cards/zardon/arctic-accelero-twin-turbo-690-cooler-review-w-asus-gtx690/


That's all well and good, but my GTX 690, even overclocked to 1212MHz core and 6600MHz memory, never reached over 74°C in Heaven, PlanetSide 2 or Battlefield 3, with my fan profile at 90% once the temp hit 70°C.

They must have some serious airflow problems in the case they were using.

Pretty pointless; the stock cooler is more than enough for that card and its overclocking abilities, and if it isn't, then your case is the problem, not the fans.

Interesting that the Hydro Copper is clocked higher than the regular version. Breaches all this Green Light rubbish, doesn't it?


----------



## MrTOOSHORT

I would never change out this work-of-art cooler for that crappy-looking Arctic on the GTX 690.

It has to be one of the best, if not the best, reference air coolers on a GPU. And it looks fantastic doing it too!

I want that BIOS from the Hydro Copper GTX 690; might be able to OC a tad more!


----------



## thestache

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> I would never change out this work-of-art cooler for that crappy-looking Arctic on the GTX 690.
> It has to be one of the best, if not the best, reference air coolers on a GPU. And it looks fantastic doing it too!
> I want that BIOS from the Hydro Copper GTX 690; might be able to OC a tad more!


Interesting point.

Want to see one of them overclocked now to see if they do get any extra out of them. Doubt it though.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *thestache*
> 
> Interesting point.
> Want to see one of them overclocked now to see if they do get any extra out of them. Doubt it though.


One thing to note is that higher board power and power limit ceilings are nearly as important as voltage for getting higher overclocks on Kepler, imo.

My old ASUS DCII 680 had this weird throttle with the stock BIOS: when I was over 1.45GHz (1.36-ish volts) for benchies, it just didn't score the same as the Lightning at the same speed with a similar system.

I used a custom LN2 BIOS and then finally got the real 1.45-1.5GHz scores. I believe it has to be something about the power limit and board power within a Kepler BIOS.

I'm hoping the Hydro Copper BIOS has more headroom in that department.


----------



## max883

Updated the BIOS on my GTX 690. Now I have a 150% power target and a stable 1200MHz core and 7000MHz memory.









power target 150
core clock +150
Mem +500


----------



## lukeman3000

Quote:


> Originally Posted by *thestache*
> 
> That's all well and good, but my GTX 690, even overclocked to 1212MHz core and 6600MHz memory, never reached over 74°C in Heaven, PlanetSide 2 or Battlefield 3, with my fan profile at 90% once the temp hit 70°C.
> They must have some serious airflow problems in the case they were using.
> Pretty pointless; the stock cooler is more than enough for that card and its overclocking abilities, and if it isn't, then your case is the problem, not the fans.
> Interesting that the Hydro Copper is clocked higher than the regular version. Breaches all this Green Light rubbish, doesn't it?


But what about noise level? Does it stay quiet in heaven? What about when overclocked? One of my big things is that I want a quiet card even under heavy load. Does the 690 fit the bill?


----------



## pilla99

Quote:


> Originally Posted by *max883*
> 
> Updated the BIOS on my GTX 690. Now I have a 150% power target and a stable 1200MHz core and 7000MHz memory.
> 
> 
> 
> 
> 
> 
> 
> 
> power target 150
> core clock +150
> Mem +500


Yeah, I'm going to need to know more about this. Are you on air? Where can I get a guide or some steps on this, and where can I grab the BIOS for the card you're using?


----------



## max883

http://www.mvktech.net/component/option,com_joomlaboard/Itemid,34/func,view/id,63918/catid,10/limit,10/limitstart,20/

I'm on XSPC water cooling!


----------



## DinaAngel

I've figured out the overclocking and power target now, and why people don't get stable clocks so easily, and why the Hydro Copper is overclocked higher.

I've flashed my 690 with the power target unlocked to 150%, and board power is maxed.
I've seen that 66 degrees is the temp limit before the GPUs downvolt because of heat.
So if you cool it down a lot it doesn't downvolt, and you can get stable at almost 1300MHz, depending on your GPUs; I've gotten to 1180 stable, but it downvolts because of temp.

Amps seem to go higher unlocked; volts don't go up, but amps do.

http://www.techpowerup.com/downloads/2165/NVFlash_5.127.html
It works with the 690 perfectly.
The commands to use depend on your `nvflash --list` output.
If it's:
0 690
1 690
then the commands are:
nvflash -i0 "nameofbios0"
nvflash -i1 "nameofbios1"
(typed without the quotes).
But if you have something else occupying index 1, or some other arrangement, then the 690's 0 and 1 will be different.

Remember:
GPU 0 is on SP8 ("main GPU 0 BIOS").
SP16 is GPU 1 ("1 BIOS").
Look here to see which is the main BIOS:
http://i48.tinypic.com/orqmqb.jpg

I hope this helps!

Modded bioses 690 110k .zip file
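The per-GPU flashing steps above can be sketched as a tiny helper that builds one nvflash command per adapter index reported by `nvflash --list`. This is only an illustration of the indexing; the `.rom` filenames are placeholders (not real BIOS files), and the actual flash has to be run from DOS as described.

```python
# Hypothetical sketch of the dual-GPU flash sequence described above.
# Adapter indices come from "nvflash --list"; the .rom names below are
# placeholders, not real BIOS files.
def flash_commands(adapters):
    """Build one nvflash command per (index, bios_file) pair, lowest index first."""
    return [f"nvflash -i{idx} {bios}" for idx, bios in sorted(adapters.items())]

# A GTX 690 normally shows up as two adapters, 0 and 1:
for cmd in flash_commands({0: "gpu0_modded.rom", 1: "gpu1_modded.rom"}):
    print(cmd)
# nvflash -i0 gpu0_modded.rom
# nvflash -i1 gpu1_modded.rom
```

The point of the `-i` index is simply that each of the 690's two GPUs carries its own BIOS, so you flash each adapter separately; if other NVIDIA cards are installed, check `nvflash --list` first rather than assuming indices 0 and 1.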


----------



## thestache

Quote:


> Originally Posted by *lukeman3000*
> 
> But what about noise level? Does it stay quiet in heaven? What about when overclocked? One of my big things is that I want a quiet card even under heavy load. Does the 690 fit the bill?


Quiet under heavy load with the performance of a GTX 690? Never going to happen.

The GTX 690 is really quiet without a big overclock, but once you up the power target and clock speeds and it gets hotter, it's a lot louder. Nowhere near as loud as regular reference cards, and tolerable at 90% while gaming with the sound low, but if you want quiet you'll always have to go water.

The quietest cards I've had were the new Lightnings; they were really good in terms of temps and noise. Pretty much the best at the moment.


----------



## Buzzkill

Anyone who flashes their BIOS and is unable to re-flash the card before sending it in for warranty can have the claim denied. NVIDIA can also deny cards they suspect of overvolting due to degradation of transistors from electromigration. Giving more voltage than 1.175V will void the warranty. The limit on max voltage is very simply to prevent damage to the GPU chips. Running over 1.175V over time will damage cards.

*Here is the official NVIDIA response to Overvolting:*
Some of our best and most passionate customers have told us (through forums, partners and directly) that they are frustrated with our position on GPU Overvoltaging. So we feel that it is important to explain exactly what our position is and why we feel that it is important.

We love to see our chips run faster and we understand that our customers want to squeeze as much performance as possible out of their GPUs. However there is a physical limit to the amount of voltage that can be applied to a GPU before the silicon begins to degrade through electromigration (http://en.wikipedia.org/wiki/Electromigration). Essentially, excessive voltages on transistors can over time "evaporate" the metal in a key spot destroying or degrading the performance of the chip. Unfortunately, since the process happens over time it's not always immediately obvious when it's happening. Overvoltaging above our max spec does exactly this. It raises the operating voltage beyond our rated max and can erode the GPU silicon over time.

In contrast, GPU Boost always keeps the voltage below our max spec, even as it is raising and lowering the voltage dynamically. That way you get great performance and a guaranteed lifetime.
So our policy is pretty simple:
1. We encourage users to go have fun with our GPUs. They are completely guaranteed and will perform great within the predefined limits.
2. We also recommend that our board partners don't build in mechanisms that raise voltages beyond our max spec. We set it as high as possible within long term reliability limits.
3. The reason we have a limit on max voltage is very simply to prevent damage to the GPU chips. At NVIDIA we know that our customers want to push their GPUs to the limit. We are all for it, and as a matter of fact NVIDIA has always prioritized support for hardware enthusiasts by providing tools to access hardware settings and by supporting our board partners in creating overclocked enthusiast products. Leading up to the GeForce GTX 680 release for example, we worked closely with developers of 3rd party overclocking utilities to make sure they fully supported GeForce GTX 680 and GPU Boost on the day of launch.


----------



## Divineshadowx

Quote:


> Originally Posted by *thestache*
> 
> Quiet under heavy load with the performance of a GTX 690? Never going to happen.
> The GTX 690 is really quiet without a big overclock, but once you up the power target and clock speeds and it gets hotter, it's a lot louder. Nowhere near as loud as regular reference cards, and tolerable at 90% while gaming with the sound low, but if you want quiet you'll always have to go water.
> The quietest cards I've had were the new Lightnings; they were really good in terms of temps and noise. Pretty much the best at the moment.


The 690, loud? I can't hear a difference between 40% and 100% fan speed. Are my case fans really that loud? Wow... What kind of fans do people use for water? Because I don't want 3000rpm jet engines; I'm using 2000-2500rpm atm and it's a bit loud. What kind do you think I'll need for 2-4 480 rads, and maybe more in the future?


----------



## PhantomTaco

Quote:


> Originally Posted by *Divineshadowx*
> 
> The 690, loud? I can't hear a difference between 40% and 100% fan speed. Are my case fans really that loud? Wow... What kind of fans do people use for water? Because I don't want 3000rpm jet engines; I'm using 2000-2500rpm atm and it's a bit loud. What kind do you think I'll need for 2-4 480 rads, and maybe more in the future?


Two 480s is pretty massive in cooling space. Are you cooling the CPU and GPUs? You could probably go with quieter fans; I'd go for B-gears, Cougars, Noctuas, Scythe GTs or Noiseblockers. Of course, you've also got the Corsair silent/performance series, which are good, and Xigmateks.


----------



## fLaXi0n

Has anyone tested DPC latency with their GTX 690?
In other forums, people got high latencies (normal latencies are under ~60) and sound crackle, stutters or lags in-game because of the high latency.
I have sent mine back for a new one, which comes next week.
With my old GTX 580 I had no problems, always under 50 microseconds, but with the GTX 690 it's about 400-800 or more in Windows and in games like D3, SC2 or others; especially in windowed fullscreen I get sound crackle after 20 or 30 seconds.

I think the GTX 690 is broken in hardware like the GTX 460, where no updates could fix the issue.
Sounds like the GTX 690 is a beta and it can't be fixed -.-

You can all easily check it with DPC Latency Checker.


----------



## DinaAngel

Quote:


> Originally Posted by *Buzzkill*
> 
> Anyone who flashes their BIOS and is unable to re-flash the card before sending it in for warranty can have the claim denied. NVIDIA can also deny cards they suspect of overvolting due to degradation of transistors from electromigration. Giving more voltage than 1.175V will void the warranty. The limit on max voltage is very simply to prevent damage to the GPU chips. Running over 1.175V over time will damage cards.
> *Here is the official NVIDIA response to Overvolting:*
> Some of our best and most passionate customers have told us (through forums, partners and directly) that they are frustrated with our position on GPU Overvoltaging. So we feel that it is important to explain exactly what our position is and why we feel that it is important.
> We love to see our chips run faster and we understand that our customers want to squeeze as much performance as possible out of their GPUs. However there is a physical limit to the amount of voltage that can be applied to a GPU before the silicon begins to degrade through electromigration (http://en.wikipedia.org/wiki/Electromigration). Essentially, excessive voltages on transistors can over time "evaporate" the metal in a key spot destroying or degrading the performance of the chip. Unfortunately, since the process happens over time it's not always immediately obvious when it's happening. Overvoltaging above our max spec does exactly this. It raises the operating voltage beyond our rated max and can erode the GPU silicon over time.
> In contrast, GPU Boost always keeps the voltage below our max spec, even as it is raising and lowering the voltage dynamically. That way you get great performance and a guaranteed lifetime.
> So our policy is pretty simple:
> 1. We encourage users to go have fun with our GPUs. They are completely guaranteed and will perform great within the predefined limits.
> 2. We also recommend that our board partners don't build in mechanisms that raise voltages beyond our max spec. We set it as high as possible within long term reliability limits.
> 3. The reason we have a limit on max voltage is very simply to prevent damage to the GPU chips. At NVIDIA we know that our customers want to push their GPUs to the limit. We are all for it, and as a matter of fact NVIDIA has always prioritized support for hardware enthusiasts by providing tools to access hardware settings and by supporting our board partners in creating overclocked enthusiast products. Leading up to the GeForce GTX 680 release for example, we worked closely with developers of 3rd party overclocking utilities to make sure they fully supported GeForce GTX 680 and GPU Boost on the day of launch.


But then just buy a new one!!


----------



## Hokies83

Quote:


> Originally Posted by *DinaAngel*
> 
> But then just buy a new one!!


Why would you lose $1100 for 1-5 fps?


----------



## pilla99

Quote:


> Originally Posted by *max883*
> 
> Updated the BIOS on my GTX 690. Now I have a 150% power target and a stable 1200MHz core and 7000MHz memory.
> 
> 
> 
> 
> 
> 
> 
> 
> power target 150
> core clock +150
> Mem +500


Anyone have a link or know what this guy did? I can't find any alternate BIOS for this card.


----------



## DinaAngel

Quote:


> Originally Posted by *pilla99*
> 
> Anyone have a link or know what this guy did? I can't find any alternate BIOS for this card.


I did the same; just follow what I wrote x.x, it's much more stable.


----------



## pilla99

OK, thanks. Another question: I have my main monitor plugged in via DVI, but my second is now a 32" Vizio via Mini DisplayPort. The picture does not quite fit on the second screen, however; it's too wide and too tall, so things are getting cut off. How can I resize this?

Edit: Just used the NVIDIA Control Panel and got it to work correctly. Never mind.


----------



## Divineshadowx

I Can't decide on a case









The 810 feels cheap, kinda small. The Cosmos 2 sucks. Mountain Mods cases look like sheet metal... and CaseLabs is overpriced; I can't get a 480 rad case without going into the $500+ range.


----------



## Hokies83

The Cosmos 2 is great and you already have it; the 810 is indeed cheap.

CaseLabs is overpriced. Mountain Mods builds theirs the same as CaseLabs, and it is a great case that will do anything... for 25% less than CaseLabs. And it is aluminum, not sheet metal lol.


----------



## PhantomTaco

Quote:


> Originally Posted by *Divineshadowx*
> 
> I Can't decide on a case
> 
> 
> 
> 
> 
> 
> 
> 
> The 810 feels cheap, kinda small. The Cosmos 2 sucks. Mountain Mods cases look like sheet metal... and CaseLabs is overpriced; I can't get a 480 rad case without going into the $500+ range.


I've got a Little Devil PC-V8 I'm trying to sell.







Check my thread in the Cases section of the marketplace if you're interested; it fits up to 3 480mm rads.


----------



## Crip

-Gigabyte ofc

yeyy :3


----------



## Arizonian

Quote:


> Originally Posted by *Crip*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> -Gigabyte ofc
> 
> yeyy :3


Congrat on the 690 Crip and welcome to OCN.







Would love to see it when it's sitting in its new home.


----------



## Crip

Ty, it's in and its performance is unspeakably stunning.







Btw.. Gigabyte, you wrote EVGA.


----------



## Arizonian

Quote:


> Originally Posted by *Crip*
> 
> Ty, it's in and its performance is unspeakably stunning.
> 
> 
> 
> 
> 
> 
> 
> Btw.. Gigabyte, you wrote EVGA.


Whoops... so many EVGA cards; as I was typing I thought Gigabyte, but it came out EVGA...









/fixed


----------



## DinaAngel

Quote:


> Originally Posted by *pilla99*
> 
> OK, thanks. Another question: I have my main monitor plugged in via DVI, but my second is now a 32" Vizio via Mini DisplayPort. The picture does not quite fit on the second screen, however; it's too wide and too tall, so things are getting cut off. How can I resize this?
> Edit: Just used the NVIDIA Control Panel and got it to work correctly. Never mind.


Each screen should have its own settings in the NVIDIA Control Panel. Follow NVIDIA's recommendation with two screens; two screens need DVI, connected like this.
Red is the second GPU.


----------



## DinaAngel

http://www.techpowerup.com/gpuz/9c4h3/
150%+ power target
lol, +500 on core
and +300 on memory

I'm looking forward to watercooling the 690.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *Divineshadowx*
> 
> I Can't decide on a case
> 
> 
> 
> 
> 
> 
> 
> 
> 810 feels cheap, kinda small. Cosmos 2 sucks. Mountain mods look like sheet metal... and case labs are overpriced, and can't get a 480 rad case w/o the $500+ range.


The Danger Den Tower 29 or Double Wide Tower 29. You can fit nearly anything in those cases; you just have to plan the build ahead to make sure you order the correct panels from DD.


----------



## FiShBuRn

Quote:


> Originally Posted by *DinaAngel*
> 
> http://www.techpowerup.com/gpuz/9c4h3/
> 150%+ power target
> lol, +500 on core
> and +300 on memory
> I'm looking forward to watercooling the 690.


Wow, +500 on core, that's hot! Is it stable? How about the temps?


----------



## Divineshadowx

Quote:


> Originally Posted by *DinaAngel*
> 
> http://www.techpowerup.com/gpuz/9c4h3/
> 150%+ power target
> lol, +500 on core
> and +300 on memory
> I'm looking forward to watercooling the 690.


How do you have that without WC lol...

Or did you just not stress test it yet? That's pretty awesome; can't wait to start my WC project. That's like 2 680s in SLI with another 670 lol... hope it doesn't kill anything.


----------



## PhantomTaco

Quote:


> Originally Posted by *DinaAngel*
> 
> http://www.techpowerup.com/gpuz/9c4h3/
> 150%+ power target
> lol, +500 on core
> and +300 on memory
> I'm looking forward to watercooling the 690.


So wait, where is this custom BIOS from exactly?


----------



## DinaAngel

Quote:


> Originally Posted by *PhantomTaco*
> 
> So wait where is this custom bios from exactly?


Already wrote about that.
Here are the BIOSes, but you need this to flash them; the DOS version is the only one that works for flashing the 690, as far as I know as of today:
http://www.techpowerup.com/downloads/2165/NVFlash_5.127.html

Modded bioses 690 110k .zip file


Quote:


> Originally Posted by *Divineshadowx*
> 
> How do you have that without WC lol...
> Or did you just not stress test it yet? That's pretty awesome; can't wait to start my WC project. That's like 2 680s in SLI with another 670 lol... hope it doesn't kill anything.


Yeah, it was stable, but as the temp climbed higher it got unstable as the volts dropped.
It's a power-saving feature built into the 690 BIOS; I wish it were turned off.
So literally you can say these cards can't run at more than 50 degrees before downvolting.

What I mean is that at 65 degrees or so it tries to clock down so it doesn't get so hot.

I'd say use at least a 360 for the 690 and get your ambient quite low, and you can probably get stable 24/7 at 1300MHz.

Temperature is everything on the 690.

I'm sorry if it's confusing that I write in lines; I'm just trying to explain as simply and well as I can. For us deep OCers, I'd say that I've gotten hold of some liquid metal thermal paste, new stuff straight from the factory in Germany; I'll try to add its name to my rig topic when I get around to using it. But I fully recommend that most of you wait for updated versions of the programs and such.
But the updated http://www.techpowerup.com/downloads/2165/NVFlash_5.127.html
does work fully for flashing.
Here's my guide to DOS-flashing the 690:
http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/2430#post_18310991


----------



## lukeman3000

Just pulled the trigger on a 690. Can't wait!

What are some of the most demanding/best looking games I can throw at it?


----------



## Arizonian

Quote:


> Originally Posted by *lukeman3000*
> 
> Just pulled the trigger on a 690. Can't wait!
> What are some of the most demanding/best looking games I can throw at it?


Congrats...post pic when it arrives.









In my arsenal, the most demanding are BF3 64-player multiplayer, the BF3 campaign and the Crysis 2 campaign (my favorite for finding stability); they keep my core maxed with usage at 97%-98%. Max settings, play each game for a few hours, and you've most likely found stability, or come real close.

Games like Crysis & Metro 2033 are OK, but they're so outdated and unoptimized that I stopped using them as a gauge for stability.


----------



## lukeman3000

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats...post pic when it arrives.
> 
> 
> 
> 
> 
> 
> 
> 
> In my arsenal, the most demanding are BF3 64-player multiplayer, the BF3 campaign and the Crysis 2 campaign (my favorite for finding stability); they keep my core maxed with usage at 97%-98%. Max settings, play each game for a few hours, and you've most likely found stability, or come real close.
> Games like Crysis & Metro 2033 are OK, but they're so outdated and unoptimized that I stopped using them as a gauge for stability.


Has anyone tried playing Day Z?

With my GTX 670, my FPS is anywhere from 55-80 in the wilderness. But as soon as I start looking at a city, it drops to mid-forties/high-thirties.

Should a 690 solve this problem? I have an i5 3570K at 4.2 GHz, so I don't think my processor is the bottleneck.


----------



## Divineshadowx

Quote:


> Originally Posted by *DinaAngel*
> 
> I already wrote about that.
> Here are the BIOSes, but you need this to flash them; the DOS version is the only one that works for flashing the 690, as far as I know today:
> http://www.techpowerup.com/downloads/2165/NVFlash_5.127.html
> 
> Modded 690 BIOSes, 110k .zip file
> 
> Yeah, it was stable, but as the temp climbed higher it got unstable as the volts dropped.
> It's a power-saving feature built into the 690 BIOS; I wish it could be turned off.
> So literally you could say these cards can't run above 50 degrees before they down-volt.
> What I mean is that at around 65 degrees it tries to clock down so it doesn't get so hot.
> I'd say use at least a 360 rad for the 690 and keep your ambient quite low, and you can probably get a stable 24/7 1300 MHz.
> Temperature is everything on the 690.
> Sorry if it's confusing that I write in short lines; I'm just trying to explain as simply and clearly as I can. For us deep overclockers: I've gotten hold of some new liquid metal thermal paste straight from the factory in Germany. I'll add its name to my rig topic when I get around to using it, but I fully recommend that most of you wait for updated versions of the tools.
> The updated http://www.techpowerup.com/downloads/2165/NVFlash_5.127.html
> does work fully to flash with.
> Here's my guide to DOS-flashing the 690:
> http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/2430#post_18310991


I'm going for 3x 480rads, so my temps will be quite low. Not trying this bios thing until someone actually confirms that it works lol.


----------



## Arizonian

Quote:


> Originally Posted by *lukeman3000*
> 
> Has anyone tried playing Day Z?
> With my GTX 670, my FPS is anywhere from 55-80 in the wilderness. But as soon as I start looking at a city, it drops to mid-forties/high-thirties.
> Should a 690 solve this problem? I have an i5 3570K at 4.2 GHz, so I don't think my processor is a bottleneck..


I don't own it, but my kids have ARMA 2 on their Steam account and play it with a 680. They don't play with any mods, however, so I can't compare directly.

Your CPU is more than enough. No bottleneck on an i5 3570K [4.2].


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> I'm going for 3x 480rads, so my temps will be quite low. Not trying this bios thing until someone actually confirms that it works lol.


3x 480 rads? What are you cooling, 12 GPUs and 3 CPUs?

Having 8x too much capacity does not get you lower temps; the water will still heat up the same... All it gives you is a lot of wasted money. Or, if you want to build three computers in one, the cooling power to do so.
If you want better cooling, get LN2 or a phase change unit.

Seen here: http://www.overclock.net/t/58355/cooler-express-phase-change-systems/0_20

Overvolting an $1100 GPU for a few FPS = total fail, unless you have a limitless cash supply and just don't care... Details below...


Spoiler: Warning: Spoiler!



*A source inside one of Nvidia's largest graphics manufacturing partners, who spoke to us on the condition that they remain anonymous, explains: 'The fact is Nvidia is stopping ALL partners from allowing any form of hardware/software overvolting, or providing hardware mods beyond its very limited restrictions. They threaten to cut allocation [of GK100 parts] if hardware mods aren't removed or avoided entirely.'*

While homebrew soldering-iron-and-prayer overvolting is still permitted, manufacturing partners aren't allowed to make it easy for buyers. 'We're not allowed to openly advertise the PCB markings [for overvoltage adjustment] on the GTX 680,' our source continues.
Quote:
Claims that manufacturers aren't being restricted in their designs beyond the confines of the Green Light programme are soundly denied by our source, however. We've been told that the secretive restrictions on board partners go yet further: 'They [Nvidia] also threaten allocation if you make a card faster than the [stock] GTX 690.'

These restrictions are not limited to just a couple of companies, either: they appear to stretch right across the board, and are responsible for product cancellations and - as with EVGA's removal of the EVBot header from the GTX 680 Classified - hardware modifications from multiple manufacturers. They're also leaving a bad taste in board partners' mouths: where in previous generations each company has been able to push its own cards to the limit in order to beat the competition, under Nvidia's alleged new rules all GTX 680 boards will be more or less identical in performance and features.

The hardware restrictions are a loss for the consumer, too: EVGA has already stated that it won't be reducing the price of the GeForce GTX 680 Classified, despite removing the EVBot header and corresponding facility for custom voltages outside Nvidia's recommended limits - meaning buyers now get less card for their cash than before the company capitulated to Nvidia's alleged demands.

We've approached other board partners, but thus far none have been willing to comment on the record regarding our source's claims of hardware restrictions - *and with our source alleging that Nvidia may even cut chip allocations for companies that talk publicly about the matter, that's no surprise.*

*We contacted Nvidia for comment and received a response from their Senior PR Manager, Bryan Del Rizzo, with the following:*
"Green Light was created to help ensure that all of the GTX boards in the market all have great acoustics, temperatures, and mechanicals. This helps to ensure our GTX customers get the highest quality product that runs quiet, cool, and fits in their PC. GTX is a measureable brand, and Green Light is a promise to ensure that the brand remains as strong as possible by making sure the products brought to market meet our highest quality requirements.

Reducing RMAs has never been a focus of Green Light.

We support overvoltaging up to a limit on our products, but have a maximum reliability spec that is intended to protect the life of the product. We don't want to see customers disappointed when their card dies in a year or two because the voltage was raised too high.

Regarding overvoltaging above our max spec, we offer AICs two choices:

· Ensure the GPU stays within our operating specs and have a full warranty from NVIDIA.

· Allow the GPU to be manually operated outside specs in which case NVIDIA provides no warranty.

We prefer AICs ensure the GPU stays within spec and encourage this through warranty support, but it's ultimately up to the AIC what they want to do. Their choice does not affect allocation. And this has no bearing on the end user warranty provided by the AIC. It is simply a warranty between NVIDIA and the AIC.

With Green Light, we don't really go out of the way to look for ways that AICs enable manual OV. As I stated, this isn't the core purpose of the program. Yes, you've seen some cases of boards getting out into the market with OV features only to have them disabled later. This is due to the fact that AICs decided later that they would prefer to have a warranty. This is simply a choice the AICs each need to make for themselves. How, or when they make this decision, is entirely up to them.

With regards to your MSI comment below, we gave MSI the same choice I referenced above -- change their SW to disable OV above our reliability limit or not obtain a warranty. They simply chose to change their software in lieu of the warranty. Their choice. It is not ours to make, and we don't influence them one way or the other.

In short, Green Light is an especially important program for a major, new product introduction like Kepler, where our AICs don't have a lot of experience building and working with our new technologies, but also extends the flexibility to AICs who provide a design that can operate outside of the reliability limits of the board. And, if you look at the products in the market today, there is obviously evidence of differentiation. You only need to look at the large assortment of high quality Kepler boards available today, including standard and overclocked editions."

*What does this mean for consumers?*
This essentially breaks down to giving consumers fewer options between their cards and limits the innovation that AIBs are capable of implementing in their products. If Nvidia is limiting the AIBs within a set of parameters on their non-reference cards, then they are hurting those board vendors' most profitable products. This gives consumers less choice, while enabling Nvidia to theoretically have lower RMAs. Such a program does, however, make sense if you think about the perception of Nvidia if all of their board partners are running amok. They obviously have to have a certain level of control over what their AIBs do with their GPUs if they are going to warranty them. But, we believe that Nvidia has gone too far in their restrictions on board partners and amount of control they exercise in the process.

So, the Green Light program is a program that we believe hurts AIBs and consumers while enabling Nvidia to reduce their RMA rate and improve their margins. If you are an Nvidia investor, this is great news, but if you are a consumer, this is clearly bad news. Nvidia claims that this has to do with the quality of the product and smoothness of launches, however, we believe that in the end it's all about money.

*Sources...*

http://forum-en.msi.com/index.php?topic=162220.0
http://www.bit-tech.net/news/hardware/2012/10/05/nvidia-crippling-partners/1
http://www.xbitlabs.com/news/graphics/display/20120521120817_Nvidia_Denies_Plans_to_Recall_GeForce_GTX_600_Due_to_Performance_Degradation.html
http://www.tomshardware.co.uk/MSI-GTX-660-670-overvolting-PowerEdition,news-40278.html

*Nvidia forces Evga to remove EVBot from the classy*

Originally Posted by EVGA Forum, 1st October 2012
It was removed in order to 100% comply with NVIDIA guidelines for selling GeForce GTX products, no voltage control is allowed, even via external device.

I have just made some phone calls...

It appears that Nvidia is attempting to keep the performance differential intact from its unreleased "original" 680 that was withheld at the 600 series launch.
From what I gather with people who know about such things, this may portend a much earlier release of the withheld card than was projected just a few months ago.

This appears to be all about Nvidia milking the customer to the last red cent (since AMD has fallen flat on its face); removing the ability to truly push one's cards to higher performance levels is being done simply to increase the profit margin.

It appears that Nvidia has forgotten the key to its overwhelming success: providing the experienced user with the options to increase performance levels.

Once you start removing features to increase your profit levels, you are heading downhill as a company.
It appears that Nvidia is doing just that. In my opinion this is a serious mistake that may increase profits "now" but lead to issues should such behavior be continued by Nvidia.

+1 for profit margins, -5 for end users
Greed is a terrible thing, if allowed to become company policy. Nvidia may be able to get away with this given the current state of videocard development between AMD and itself, but further down the road this could open the door to Nvidia becoming marginalized.

The massive profit margin increase with the 600 series over the 500 series, has appeared to make Nvidia management "profit drunk". As we have seen with other companies, such things can lead to driving into a ditch.....


----------



## MrTOOSHORT

Quote:


> Originally Posted by *DinaAngel*
> 
> http://www.techpowerup.com/gpuz/9c4h3/
> 150+ power target
> lol, 500+ on core
> and 300 on memory
> I'm looking forward to watercooling the 690


Can you fix your post? It says +500 on the core, which would equate to 1500 MHz core boost clocks. Telling people a GTX 690 BIOS can get 1.5 GHz gets everyone excited to flash their cards when they don't know what they're doing.

A few posts back you said 1180 MHz stable, so please correct the +500 core offset remark.
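For reference, the offset arithmetic at issue (the GTX 690's stock boost clock is 1019 MHz; the two calls below just restate the two claims in the thread):

```python
# GTX 690 stock boost clock; an Afterburner/Precision-style core offset adds on top.
STOCK_BOOST_MHZ = 1019

def implied_boost(offset_mhz):
    """Boost clock (MHz) implied by a given core offset."""
    return STOCK_BOOST_MHZ + offset_mhz

print(implied_boost(500))  # 1519 -> the ~1.5 GHz claim being questioned
print(implied_boost(161))  # 1180 -> the earlier "stable" figure
```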


----------



## tonyjones

I'll pick one up when they drop to $800


----------



## MrTOOSHORT

Hey guys: GPU-Z says the GTX 690 runs at PCIe 3.0 on X79 natively, without tweaks, but this isn't the case. It's running 3.0 through the onboard PLX chip, but not PCIe 3.0 from the CPU to the slot. You'll need NVIDIA's registry patch if you want full PCIe 3.0 x16:

http://www.techpowerup.com/downloads/2148/NVIDIA_GeForce_Kepler_PCIe_3.0_mode-enabling_patch_for_Sandy_Bridge-E_systems.html

You don't need to do this at all, but it does give a very small performance boost; actually not enough to even bother, but some. You might have to lower your overclock a tad, like 5-10 MHz, to compensate.

Again, this is for X79 and GTX 690 owners. Ivy Bridge is fine.


----------



## thestache

Quote:


> Originally Posted by *Divineshadowx*
> 
> 690 loud? I can't hear a difference between 40% and 100% fans. Are my case fans really that loud? Wow... What kind of fans do people use for water? I don't want 3000 RPM jet engines; I'm using 2000-2500 RPM atm and it's a bit loud. What kind do you think I'll need for 2-4 480 rads, and maybe more in the future?


Lol. They must be pretty loud.

I'm running 18 dB 1500 RPM fans and even they are, in my opinion, audible. Best I've found for CFM and static pressure in the sub-20 dB range, and they're PWM: Cougar Vortex PWM 120mm.

I run these for the pull side and my CPU exhaust fan (set to wind up and down between desktop and games), and I run regular 3-pin versions of the same fan for the push side, which are always maxed but slightly quieter.

It isn't the size of the rad that's important but its quality and effectiveness. Its thickness, fin density, and how it performs with your fan at its max RPM determine what you need. Preferably, if you can fit thick, low-fin-density rads that work well with low-RPM, low-noise fans, that's what you want; but in some cases space restrictions force thinner, higher-density rads and higher-RPM fans. The thickest I can fit is 35mm.

Typically you want a rad that works well at 1600 RPM or less, because once you add 6-8 of those fans in push/pull together they become quite loud. But I'd recommend this one, as it's pretty much the best rad around at all fan speeds, and two of these are all you'd need for any system, no matter how crazy:

Alphacool NexXxoS UT60 360 Radiator

http://martinsliquidlab.org/2012/04/09/hardware-labs-gtx-360-radiator/4/

Or if thickness/sourcing becomes a restriction, the Swiftech MCR320 XP is also quite good. I run Koolance rads simply because they match my other Koolance parts, are cheap, and have always performed well.


----------



## DinaAngel

Quote:


> Originally Posted by *Divineshadowx*
> 
> I'm going for 3x 480rads, so my temps will be quite low. Not trying this bios thing until someone actually confirms that it works lol.


Nice!!


----------



## DinaAngel

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Can you fix your post? It says +500 on the core, which would equate to 1500 MHz core boost clocks. Telling people a GTX 690 BIOS can get 1.5 GHz gets everyone excited to flash their cards when they don't know what they're doing.
> A few posts back you said 1180 MHz stable, so please correct the +500 core offset remark.


I could make a video of it xD (thinking of doing my own rig videos).
The new BIOS really works great, but I'd still say it's up to them. I always have one card lying around, and I doubt people just throw them away.

At +549 core the screen went red and froze, so that's not stable.
+500 was kind of stable; it could run for a few hours before going red, though it was at 69-71 degrees even on idle.

And yes, I said 1180 MHz was stable before, but that was before this BIOS;
most likely a mix of 15-degree ambient, max fans, and the extra power target.

And NVIDIA's statement said 1300 MHz core boost was possible, so something is a bit fishy.
I believe the only issues might be the BIOS and heat.
http://www.techpowerup.com/gpuz/c4ed5/
http://www.techpowerup.com/gpuz/rfhs/
I'm going to try Borderlands 2 with the new clocks.
Update: yeah, at 80 degrees it instantly got unstable.
http://www.techpowerup.com/gpuz/82sug/
http://www.techpowerup.com/gpuz/mze78/
You might need liquid nitrogen for the last one under load, though.









However, the warranty goes away with a BIOS flash. Personally I never use the warranty, I just buy new, but that's just me, I guess.


----------



## DinaAngel

Is it possible to use longer screws and mount a water block on top of some TECs (Peltiers) on the 690?


----------



## fat_italian_stallion

Quote:


> Originally Posted by *Hokies83*
> 
> 3x 480 rads? What are you cooling, 12 GPUs and 3 CPUs?
> Having 8x too much capacity does not get you lower temps; the water will still heat up the same... All it gives you is a lot of wasted money. Or, if you want to build three computers in one, the cooling power to do so.
> If you want better cooling, get LN2 or a phase change unit.


More radiators will yield better temps, right up until the added restriction drops flow to the point where the lower water temp pulls less heat away from the block than the lost flow costs you. Until water temp reaches ambient, component temps will keep improving as more dissipation is added. 3x 480s isn't really that much for a high-end system, either.
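The trade-off both sides are arguing can be put in rough numbers. A minimal sketch, assuming a made-up figure of about 50 W of dissipation per degree of water-air delta for each thick 480 rad (real rads vary widely with fan speed and fin density):

```python
# Steady-state water temp above ambient: heat into the loop must equal heat out.
# 50 W/C per 480 rad is an illustrative assumption, not a measured spec.
W_PER_C_PER_480_RAD = 50.0

def water_delta_c(heat_load_w, num_480_rads):
    """Water-to-ambient delta (C) once the loop reaches equilibrium."""
    return heat_load_w / (W_PER_C_PER_480_RAD * num_480_rads)

# ~500 W load (roughly one GTX 690 plus an overclocked CPU):
for rads in (1, 2, 3):
    print(rads, water_delta_c(500.0, rads))
```

Under these assumed numbers the diminishing returns are visible: going from one rad to two halves the delta (10 C to 5 C), a third rad only buys about another 1.7 C, and no amount of area pushes the delta below zero.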


----------



## DinaAngel

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> More radiators will yield better temps until flow decreases to the point that the lower water temp pulls less heat away from the block than the increased flow would. Until water temps are at ambient components temps will continue to get better as more dissipation is added. 3x 480s isn't really that much either for a high end system.


totally true


----------



## Hokies83

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> More radiators will yield better temps until flow decreases to the point that the lower water temp pulls less heat away from the block than the increased flow would. Until water temps are at ambient components temps will continue to get better as more dissipation is added. 3x 480s isn't really that much either for a high end system.


What I was saying was that 3x 480s, depending on load, will still cool no better than room temp; I think he can get the exact same results with less.

Whether you have 1000 480 rads or 2, the water will never get below room temp no matter what you do. You just need enough to displace the heat he has.

And 4 GPUs and 1 CPU is not enough to warrant 3x 480s...

I was able to just about hold room temps on two 50mm 360 rads with San Ace fans, running an i7 930 + 4x GTX 480, which is quite a bit more heat than his system.

How much he really needs is a question for the water cooling section.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *Hokies83*
> 
> What I was saying was that 3x 480s, depending on load, will still cool no better than room temp; I think he can get the exact same results with less.
> Whether you have 1000 480 rads or 2, the water will never get below room temp no matter what you do. You just need enough to displace the heat he has.
> And 4 GPUs and 1 CPU is not enough to warrant 3x 480s...
> I was able to just about hold room temps on two 50mm 360 rads with San Ace fans, running an i7 930 + 4x GTX 480, which is quite a bit more heat than his system.
> How much he really needs is a question for the water cooling section.


It's impossible to get the loop down to ambient temp; physics won't allow it. You can get close, though. That's also not enough cooling area to get under a 2C water/air delta at that TDP, even if you're running 5000 RPM San Aces.


----------



## Hokies83

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> It's impossible to get to ambient temp in the loop. Physics won't allow it. You can get close tho. That's also not enough cooling area, regardless if you're running even 5000 rpm san aces to get to under 2C water/air delta with that tdp.


Yes, that is what I'm saying... No matter how many rads you have, you can never do better than room temp. You can get close, but you will never reach or beat it.

You have to find out what enough is and go with it.

My temps were close to room temp, meaning within 10C; my system also sits beside the AC duct, and the AC air goes into it first, lol.


----------



## DinaAngel

Quote:


> Originally Posted by *Hokies83*
> 
> Yes that is what im saying... No matter how many rads you have you can never do any better then room temp you can get close.. but you will never reach it/ beat it.
> You have to find out what enough is and go with it..
> My temps were close to room temp meaning within 10c my System also sits beside the AC duct and the AC air goes into it first lol.


I got the same as room temp by lowering one rad into dry ice and water.


----------



## Hokies83

Quote:


> Originally Posted by *DinaAngel*
> 
> i got same as room temp with lowering one rad into dry ice and water


Hooray for condensation. If that weren't an issue, I would have put my rads in a fridge, lol.


----------



## Divineshadowx

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> It's impossible to get to ambient temp in the loop. Physics won't allow it. You can get close tho. That's also not enough cooling area, regardless if you're running even 5000 rpm san aces to get to under 2C water/air delta with that tdp.


Well, if you have a good enough thermal conductor you can move nearly 100% of the heat, so technically it can get close. I'm not even sure why water is used; probably because having giant heat pipes running through your system would be impractical. (Copper actually conducts heat far better than water; water's advantage is its heat capacity and the fact that you can pump it.)


----------



## fat_italian_stallion

Quote:


> Originally Posted by *Hokies83*
> 
> Yes that is what im saying... No matter how many rads you have you can never do any better then room temp you can get close.. but you will never reach it/ beat it.
> You have to find out what enough is and go with it..
> My temps were close to room temp meaning within 10c my System also sits beside the AC duct and the AC air goes into it first lol.


Good luck getting even mediocre OCs stable with a 10C delta. I've tested my setup with different numbers of fans, letting it run at 15, 11, 8, 6, 3, and 2C deltas at TDP, and at each step there was a change not only in the maximum overclock attainable but also in stability. There was a 400 MHz difference in 24-hour stability for the CPU and a 50 MHz offset difference on the GPUs between the 15C and 2C deltas. There is a HUGE difference in performance between an adequate WC setup and an overkill one. The price/performance ratio is crap, but there comes a point where that really is irrelevant (really, any WC setup $2k+).


----------



## Divineshadowx

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> Good luck getting even mediocre OCs stable with a 10C delta. I've tested my setup with different numbers of fans, letting it run at 15, 11, 8, 6, 3, and 2C deltas at TDP, and at each step there was a change not only in the maximum overclock attainable but also in stability. There was a 400 MHz difference in 24-hour stability for the CPU and a 50 MHz offset difference on the GPUs between the 15C and 2C deltas. There is a HUGE difference in performance between an adequate WC setup and an overkill one. The price/performance ratio is crap, but there comes a point where that really is irrelevant (really, any WC setup $2k+).


What is adequate, and what is normal for you? I have a budget of about $1400 to spend, and that includes a $500 case. Considering I'll be buying basically the best water blocks, pump, res, and thick rads, it should be overkill. Water in general is overkill; some laptops don't even have fans. You seriously can't get anything better past $1000 or so for water; there is nothing more. The next step is sub-zero cooling, which costs a lot. LN2 isn't even doable for an HTPC, because either your components will die, you will run out of money, or you'll end up freezing your hand.


----------



## DinaAngel

Quote:


> Originally Posted by *Hokies83*
> 
> Hoorah for condensation if that was no issue i would have put my rads in a Fridge lol.


Yeah, only for short periods of time, like benches.


----------



## fat_italian_stallion

Quote:


> Originally Posted by *Divineshadowx*
> 
> Well, if you have a good enough thermal conductor you can remove near 100% of the heat, so technically it can be possible. I'm not even sure why water is used, its probably because having giant heat pipes running through your system would be stupid. The copper pipes on an air system are actually close to water's thermal conductivity.


The water in the loop will never reach the room's ambient unless you are using a sub-ambient source to cool the loop. Component temps will be far off from it; even with phase change, many components will be above ambient.
Quote:


> Originally Posted by *Divineshadowx*
> 
> What is adequate and what is normal for you? I have a budget of about 1400 that I will spend, this includes a $500 case. But considering I will buy basically the best water blocks and pumps, res, and thick rads, it should be overkill. Water in general is overkill, some laptops don't even have fans. You seriously can't get anything better at $1000+ for water, there is nothing more, the next step is sub zero cooling, which costs a lot. Ln2 isn't even doable for a htpc because either your components will die or you will run out of money, or end up freezing your hand


You can easily get better; I have that much just in fittings. Until you're under a 1C delta at TDP (generally too small for most monitoring setups to display), there's always more you can do. And after you reach that, you can do more by making it silent while still keeping that delta.


----------



## jassilamba

Is it too late to be added to the list? Here is my proof pic. Or do I need another one?



Also, is there an article anywhere that compares the water blocks? I want to get the Heatkiller and want to know if it's as good as I hear.


----------



## Arizonian

Quote:


> Originally Posted by *jassilamba*
> 
> Is it too late to be added to the list, here is my proof pic. Or do I need another one?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Also, is there an article anywhere that compares the water blocks? I want to get the Heatkiller and want to know if it's as good as I hear.


Never too late to join.....congrats on the 690.







Welcome to the club.


----------



## Divineshadowx

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> The water in the loop will never be at the room's ambient unless you are using a source with sub ambient temps to cool the loop. Components will be far off from it, even with phase many components will be above ambient
> You can easily get better, I have that in fittings. Until you're under a 1C delta (generally too small for most monitoring systems to display) at TDP there's always more you can do. Then after you reach that you can do more by making it silent with a goal of still keeping that delta.


How? I mean, getting the best CPU block and GPU block sets that at one point. The thing is, with my past air cooling heatsinks and the H100, the rads didn't even get hot, so how will 3x 480x60mm rads get hot?

And I am going for 1500 RPM mid-range fans for a near-silent system. My current 2000-2500 RPM fans are annoying.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> How? I mean getting the best cpu block and gpu block sets it at one point. The thing is, with my past air cooling heatsinks, and h100, the rads don't even get hot, so how will 3x480x60mm rads get hot?
> And I am going for 1500rpm mid range fans for a near quiet system. My current 2000-2500rpm fans are annoying.


Because the water still heats up, whether you have 100,000 rads or not.

The block heats up and is cooled by that water; it can't do better than the water temp.

One thick 480 rad is enough for your needs. Two is overkill; three is extra overkill.


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> Cause the water still heats up no matter if you have 100000 rads..
> The block will heat up and be cooled by that water... it can not be done..
> 1 Thick 480 Rad is enough for your needs.. 2 is overkill 3 is extra overkill.


The water does not cool the block by itself; the water acts as a medium to carry the heat to the cooler rad.

And he said 3x 480 isn't high end, which is what I responded to.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> The water does not cool the block, the water acts as a medium to carry the heat to the cooler rad.
> And he said 3x480 isnt high end which is what i responded to.


And once the water heats up, it cannot cool the block below the water temp...

I already know all of this; I knew it six years ago...

If you really want to do something, delid that 3770K... then you could get 5 GHz on an average air cooler:
http://www.overclock.net/t/1313179/delidded-ivy-bridge-club/0_20
Instead of doing this...


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> And once the water heats up.. it can not cool the block anymore then the water temp...
> I already know all of this i knew this 6 years ago...
> If you really want to do something Delid that 3770k.... then you could get 5ghz on an Avg Air cooler....
> http://www.overclock.net/t/1313179/delidded-ivy-bridge-club/0_20
> Instead of doing this...


Yes, I know; I never said it can cool below ambient temps. I said going past a certain price point won't make a difference. And water destroys air: 30-40C load on a 690 compared to 70C, that is huge. And I will change the TIM; the 3770K is a failure compared to socket 2011, so let it die if I screw up.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> Yes I know I never said it can cool more than ambient temps, I said going past a certain price point wont make a difference. And water destroys air, 30c-40c load on a 690 compared to 70, that is huge. And I will change the tim, 3770k is a failure compared to 2011 so let it die if I screw up.


Guess you do not know that in the 99% of the computing world that uses no more than 8 threads, a 3770K runs even with a 3930x... and a delidded 3770K that does 5.3-5.5 GHz is faster than a 3930x...

There is already one person running at 5.4 GHz on an H100.

I can do 5 GHz at 1.35V. I am waiting on my Coollaboratory Liquid Pro to get here, and I'm gunning for 5.3 GHz plus.


----------



## jassilamba

Quote:


> Originally Posted by *Divineshadowx*
> 
> Yes I know I never said it can cool more than ambient temps, I said going past a certain price point wont make a difference. And water destroys air, 30c-40c load on a 690 compared to 70, that is huge. And I will change the tim, 3770k is a failure compared to 2011 so let it die if I screw up.


And that's why I decided to stay with SB for my build.


----------



## Buzzkill

http://www.geforce.com/drivers *NEW* GeForce 306.97 Driver


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> Guess you do not know that a 3770k in 99% of the computing world that uses no more then 8 threads runs even with a 3930x... a delidded 3770k that does 5.3ghz = 5.5 ghz.. Is faster then a 3930x....
> There is already one person running at 5.4ghz on a H100..
> 
> 
> 
> 
> 
> 
> 
> 
> I can do 5ghz with 1.35v.. I am waiting on my Cool labs Liquid Pro to get here and im gunning for 5.3ghz Plus.


3930k*

Any program that uses 4 cores and 8 threads most likely also uses 6 cores and 12 threads, and those programs scale linearly with more cores and threads. 2011 destroys Ivy Bridge in every way: more cache, more cores, and basically the same overclocking. You can't even get 5GHz+ with Ivy without serious cooling, like dice; you can, however, with SB.

How come in every post you write, you have someone else's info? Test for yourself; you get the most accurate results, trust me.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> 3930k*
> Any program that uses 4 cores and 8 threads most likely uses 6 cores and 12 threads, and those programs scale linearly with more cores and threads. *2011 destroys ivy bridge in every way. More cache, more cores,* and basically same overclocking, you can't even get 5ghz+ with ivy w/o serious cooling, like dice, you can however with sb.
> How come every in every post you write, you have someone else's info. Test for yourself, you get the most accurate results trust me.


Those benchmarks are almost every CPU-related benchmark there is.... The results are there and everybody knows it.... it is AnandTech... a trusted source....

Hardly any programs use more than 8 threads; where on god's earth did you get that info from??

IB clock for clock trades blows back and forth with a 3930x; everybody knows this... common knowledge.. and if you add gaming to the picture the 3770k beats a 3930x in overall performance...

And you can get lower temps on a 3770k for about $1 from your hardware store....

Read my signature...

How do you own a 3770k and not know any of this.. you talk like somebody who has been in a SB troll thread and not peeked out of it other than to post here lol..


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> 3930x *
> Those bench marks are almost every Cpu related Benchmark there is.... The results are there and everbody knows it.... it is AnAndtech... a Trusted Source ....
> Hardly any programs use more then 8 threads where in gods earth did u get that info from??
> IB clock for clock trades blows back and forth with a 3930x * everybody knows this... Common knowledge..
> And you can get lower temps on a 3770k for about 1$ from your hardware store....
> Read my Signature...
> How do you own a 3770k and not know any of this.. you talk like somebody who has been in a SB troll thread and not peeked out of it other then ti post here lol..


Give me a link to a store that sells the "3930x"; how can you even talk about benchmarks if you don't know the accurate name of the CPU?

Anandtech? I've seen an official post there showing a 660 Ti beating a 680; very trusted.

http://community.futuremark.com/hardware/cpu/Intel+Core+i7-3770K+Processor/review

Futuremark, a company that creates benchmarks, not a guy sitting behind a computer making pretty graphs with inaccurate info. The 3930k is close to twice the 3770k's performance, and when overclocked the score scales further due to more cores/threads. Now you seem to be judging Intel: if a 3770k OC beats a 3930k OC, then why on earth would they put a price tag almost $300 higher on the 2011 platform?


----------



## Arizonian

Let's settle down, gentlemen. Please keep your points / disagreements free of personal attacks. I don't mind 690 club owners having off-topic discussions as long as it doesn't derail too much and become more of a CPU debate. The 690 club thread is not the proper forum for Intel CPU arguments.

Thread reopened.


----------



## lukeman3000

So.. uh.. Got my 690 yesterday!


----------



## iARDAs

I have a question folks.

If I get a 690 one day and hit the VRAM wall in 1440p gaming with, let's say, Skyrim + lots of texture mods,

what will happen?

Artifacting? Stuttering? Slowing down?


----------



## Hokies83

Quote:


> Originally Posted by *iARDAs*
> 
> I have a question folks.
> If i get a 690 one day and hit the vram wall in 1440p gaming with lets say Skyrim + looots of texture mods.
> What will happen?
> Artifacting? Stutering? Slowing down?


FPS will take a massive hit because it will have to tap into your system RAM.

At 2560x1440 with texture mods you need more than 2GB of VRAM; I can tell you that in Skyrim with mods I've hit about 3GB of VRAM use..

Same with Crysis 2, and you can also do it in Max Payne 3.

Getting another 4GB 670 is a far better investment for you.


----------



## iARDAs

Quote:


> Originally Posted by *Hokies83*
> 
> FPS will take a Massive hit cause it will have to tap into your system ram.
> at 2560x1440 with texture mods you need more then 2gb of Vram i can tell you in Skyrim with mods ive hit about 3gb Vram use..
> Same with Crysis 2 you can also do it in Max Payne 3.
> Getting another 4gb 670 is a far better investment for you.


Yeah I will, but my temps might be too hot, so I might have to go watercooling, which will be even more expensive.

Best to wait for the 790 then, if such a GPU is released.

Thanks for the answer.


----------



## Hokies83

Quote:


> Originally Posted by *iARDAs*
> 
> Yeah I will but my temps might be too hot that I might have to go watercooling which will be even more expensive.
> Best to wait for 790 than if such a GPU is released though.
> Thanks for the answer.


Get Dwood's Antec 620 mod; a very cheap way to watercool, and it works great.

http://www.overclock.net/t/1237219/tript-cc-620-920-h50-h70-gpu-brackets-fan-grills-custom-case-badges/0_20

And you can keep reusing them,

and he will ship to Turkey..


----------



## iARDAs

I will look into it, thank you.


----------



## Hokies83

Quote:


> Originally Posted by *iARDAs*
> 
> I will look into it thank you


NP, any advice you need on a GPU, just PM me.


----------



## dmaffo

This is a quick shot of my newly acquired GPU.
If it's not an inconvenience, would you please update the owners club with my name?
Thank you


----------



## Arizonian

Quote:


> Originally Posted by *dmaffo*
> 
> 
> 
> This is a quick shot of my new aquired gpu.
> If not an inconvenience would please update owners club with my name it
> Thnak you


Congrats - welcome aboard.


----------



## Jessekin32

Random question:

Anyone have any good suggestions for a MicroATX chassis that will hold a Rampage IV Gene, an Antec 1200W PSU, and two 690's? xD I want the smallest form factor possible for my computer, and want to see if it's even possible. So far I've found a SilverStone chassis, but I don't think my PSU will fit. (1200's are rather lengthy...)

EDIT: Found this - http://www.newegg.com/Product/Product.aspx?Item=N82E16811163218

It looks big enough....


----------



## DinaAngel

Quote:


> Originally Posted by *Jessekin32*
> 
> Random question:
> Anyone have an good suggestions for a MicroATX chassis that will hold a Rampage IV Gene, Antec 1200w psu, and two 690's? xD I want the smallest form factor possible for my computer, and want to see if it's even possible. So far I've found a silverstone chassis, but don't think my PSU will fit. (1200's are rather lengthy...)
> EDIT: Found this - http://www.newegg.com/Product/Product.aspx?Item=N82E16811163218
> It looks big enough....


Nice XD


----------



## Nocturin

Quote:


> Originally Posted by *Divineshadowx*
> 
> Give me a link to a store that sells the "3930x", how can you even talk about benchmarks if you don't know the accurate name of the cpu.
> Anandtech? I've seen an official post showing a 660ti beating a 680, very trusted.
> http://community.futuremark.com/hardware/cpu/Intel+Core+i7-3770K+Processor/review
> Futuremark, a company who creates benchmarks, not a guy sitting behind a computer making pretty graphs with inaccurate info. 3930k is close to twice the 3770k's performance, when overclocked the score scales exponentially due to more cores/threads now you seem to be judging intel, if 3770k oc beats 3930k oc then why on earth would they put a price tag of almost $300 for the 2011?


this is the funniest thing I've seen all week long. Thank you, I needed the laugh.


----------



## lukeman3000

I have a 1920x1080 Dell IPS monitor. It seems that the GTX 690 could handle a higher resolution, so what should I be looking at if I want to upgrade? I just want a single, 23" (or so) panel. I'm not current with monitor technologies, but I really like the way my IPS panel looks. Does anyone else here use IPS?


----------



## iARDAs

Quote:


> Originally Posted by *lukeman3000*
> 
> I have a 1920x1080 Dell IPS monitor. It seems that the GTX 690 could handle a higher resolution, so what should I be looking at if I want to upgrade? I just want a single, 23" (or so) panel. I'm not current with monitor technologies, but I really like the way my IPS panel looks. Does anyone else here use IPS?


The 690 can certainly handle a higher resolution, but I don't believe there are any 23" 1440p displays on the market.

Your best bet would be to opt for a 27" 1440p panel:

a Dell U2711 and the like,
or a Korean 1440p panel, which are cheaper.


----------



## lukeman3000

Quote:


> Originally Posted by *iARDAs*
> 
> 690 certainly can handle higher resolution but I dont believe there are ny 23" 1440p displays in the market.
> Your best bet could be to opt for a 27" 1440p panel.
> Dell U2711 and a like
> or a Korean 1440p panel which are cheaper.


To me, 27" is too big for a computer monitor. I really like the 23" size, and I don't think I'd want to get one much bigger than that unless I could put it further away from me.

That said, is there really any reason for me to be looking for a higher-resolution monitor if I don't care to have a larger one than 23"? At the distance I sit away from mine, I'm pretty sure I can't see individual pixels. I'll have to check again when I get home.


----------



## iARDAs

Quote:


> Originally Posted by *lukeman3000*
> 
> To me, 27" is too big for a computer monitor. I really like the 23" size, and I don't think I'd want to get one much bigger than that unless I could put it further away from me.
> That said, is there really any reason for me to be looking for a higher-resolution monitor if I don't care to have a larger one than 23"? At the distance I sit away from mine, I'm pretty sure I can't see individual pixels. I'll have to check again when I get home.


Trust me, even if you want a 23" monitor, 1440p games or content will still look sharper and better than 1080p. We are moving in a direction where pretty soon 1440p or 1600p will be standard even in 23" monitors, because that's what people will demand and GPUs can provide.

That being said, I really don't believe there are 23" IPS panels at 1440p, but I might be wrong.

Check out the 1440p+ gaming club in my signature. All the 1440p and 1600p panels are 27" or above.
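The sharpness argument above really comes down to pixel density. A quick sketch, using the panel sizes and resolutions mentioned in the posts (the numbers are plain arithmetic, not measurements):

```python
# Pixels per inch (PPI) for the panels discussed above.
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 23" 1080p (the Dell IPS in question) vs 27" 1440p (U2711-class panels).
print(round(ppi(1920, 1080, 23)))  # 96 PPI
print(round(ppi(2560, 1440, 27)))  # 109 PPI
```

Even at the larger 27" size, the 1440p panel is noticeably denser, which is why 1440p content looks sharper despite the bigger screen.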


----------



## Hokies83

I sit 3-4 feet away from my monitors...

Therefore a 27-inch monitor is ideal for me.. anything smaller is just too small for me.


----------



## jassilamba

Quote:


> Originally Posted by *iARDAs*
> 
> Trust me even if you want a 23" monitor a 1440p game or a content will still look sharper and better than 1080p. We are moving into a direction where pretty soon 1440p or 1600p will be standard even in 23" monitors because that's what people will demand and GPUs provide.
> That being said I really don't believe that there are IPS panels at 1440p resolution with 23" but I might be wrong.
> Check out the 1440p+ gaming club in my signature. All the 1440p and 1600p panels are 27" or above.


I have been thinking about getting those Korean monitors, and I wanna run 3 of them. I'm pretty sure a single 690 can handle them for your basic stuff; how is the performance in gaming? I mostly play shooters and of course would love to run them at as high a quality as I can. I'm not a big fan of 3D so I won't be doing that. From what I understand, a little overclock on the 690 can deliver 60 FPS in your average games.

If it matters, I'm running a 2700K and 16GB RAM.

Would love some advice before I go and make decisions that I regret.

Quote:


> Originally Posted by *Hokies83*
> 
> I sit 3 = 4 foot away from my Monitors...
> There for a 27inch monitor is ideal for me.. anything smaller is just to small for me.


I have a 26" at home and three 21.5"s at work, and I will say I love my home setup for gaming, as I do sit about 3 to 4 feet away.


----------



## Divineshadowx

Quote:


> Originally Posted by *jassilamba*
> 
> I have been thinking about getting those korean monitors, and I wanna run 3 of em. I'm pretty sure a single 690 can handle them for your basic stuff, How is the performance on gaming. I mostly play shooters and of course would love to run them at as high quality as I can. Im not a big fan of 3D so wont be doing that. From what I understand a lil over clock on the 690 can deliver 60 FPS on your average games.
> If it matters, Im running a 2700K, 16GB ram.
> Would love some advise before I go and make decisions that I regret.
> I have a 26 at home and have 3 21.5"s at work and will say I love my home setup for gaming as I do sit about 3 to 4 foot away.


If you get 3x high-res monitors you will run into an instant VRAM problem in games like Crysis 2. With a single one you won't; I'm not sure what the people above are complaining about. I ran Crysis 2 with every single mod and had no VRAM problem, same with Skyrim. Just because a GPU uses more VRAM does not mean the game requires it. I run Crysis 2 maxed with all the mods and get 2GB; if you see higher, it is because the game loads textures beforehand so that if you turn quickly into a new area in an FPS it won't stutter for a second. No noticeable difference, though, with high-end hardware.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> If you get 3x high res monitors you will run into an instant vram problem on games like crysis 2. With a single you wont, not sure what the people above are com plaining about. I ran crysis 2 with every single mod and no vram problem, same with skyrim. Just because a gpu uses more vram does not mean the game requires it. I run crysis 2 max with all the mods and get 2gb, if you get higher it is because the game is using the load the textures before hand so if you turn quickly into a new area in a fps it wont stutter for a second, no noticible difference though with high end hardware.


I do not know what you're talking about; more misinformation, it seems..

If your card has 2GB of VRAM and you're using 3GB, it then uses your system RAM to cover the offset, which is WAYYY slower than GPU VRAM...

So let's say you were at 120 FPS with 1.9GB VRAM use.. then you jump to 2.5GB use; well dude, you're going to go from 120 FPS down to 30 FPS.. common knowledge.

And if you're running 2560x1440 in Skyrim/Crysis 2 with texture mods and not going over 2GB of VRAM, then you're doing something wrong.. because it has been proven that it goes over very easily.. upwards of 3GB VRAM use.
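For a sense of scale in the back-and-forth above: the render targets themselves are a small slice of VRAM even in surround; it's the texture pool from high-res mods that blows past 2GB. A rough budgeting sketch with illustrative numbers (the buffer counts and texture counts are assumptions, not measured from any game):

```python
# Rough VRAM budgeting sketch. 32-bit colour, uncompressed textures assumed;
# all figures are illustrative, not measured.

def render_target_bytes(width, height, buffers=3, msaa=1):
    """Front/back/depth buffers at 4 bytes per pixel each."""
    return width * height * 4 * buffers * msaa

def texture_bytes(count, size, bytes_per_texel=4):
    """`count` square textures of `size` x `size` texels."""
    return count * size * size * bytes_per_texel

MB = 1024 * 1024

# Even a triple-2560x1440 surround setup's render targets are modest...
rt = render_target_bytes(3 * 2560, 1440)
# ...while a texture-mod pack of 150 uncompressed 2048x2048 textures adds up fast.
tex = texture_bytes(150, 2048)

print(rt // MB)   # ~126 MB for the render targets
print(tex // MB)  # 2400 MB for the textures alone
```

This is why resolution alone rarely breaks a 2GB card, but texture mods at any resolution can.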


----------



## brettjv

Quote:


> Originally Posted by *Hokies83*
> 
> I do not know what your talking about. more miss information it seems..
> If your card has 2gb Vram and your using 3gb Vram it then uses your system ram to cover the offset which is WAYYY Slower then GPU Vram...
> So lets say you were at 120fps with 1.9gb Vram Use.. Then you jump to 2.5gb Use well dude your going to go from 120fps down to 30fps.. Common knowledge.
> And if your running 2560x1440 Res with Skyrim/Crysis 2 with texture mods and not going over 2gb Vram then your doing something wrong.. Cause it has been proved that it goes over very easy.. Upwards to 3gb Vram use.


VRAM 'usage', as seen in Afterburner or whatnot, is not a real-time measurement of exactly how much memory is required to render the scene you're looking at; it's a measurement of memory space that's reserved by the application. Many apps will 'use' (i.e. reserve) a lot more memory than they actually need in order to run smoothly, just because it's there. Crysis 2 is one of those games. It ran perfectly on my SLI 470's (1280MB), although the mem usage was nearly constantly pegged.

When I later switched to my 670, I discovered that with the same settings the app was 'using' 1.8GB of VRAM. Yet the gameplay experience between the two setups was virtually identical; the benchmark scores were within a couple FPS of each other.

It's really not like the minute your 'usage' goes over your 'allotment', your performance tanks. There's typically somewhere between 10% and 50% of leeway before the perf actually starts plummeting, depending on how the engine/driver are managing the memory.
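The reserved-vs-needed distinction above can be put as a toy model: monitoring tools report what an app has *reserved*, but frame time depends on the working set actually touched each frame. All numbers below are invented for illustration, not taken from any benchmark:

```python
# Toy model: frame time only degrades once the per-frame working set
# spills past physical VRAM, regardless of how much is 'reserved'.

def frame_time_ms(working_set_mb, vram_mb, base_ms=8.0, spill_penalty_ms=0.5):
    """Base frame time plus a penalty per MB that spills to system RAM."""
    spill_mb = max(0, working_set_mb - vram_mb)
    return base_ms + spill_mb * spill_penalty_ms

# A 2GB card showing ~1.9GB 'usage' can still be smooth if the scene only
# touches ~1.4GB per frame...
print(frame_time_ms(1400, 2048))   # 8.0 ms
# ...but a working set that genuinely exceeds VRAM balloons the frame time.
print(frame_time_ms(2300, 2048))   # 134.0 ms
```

That leeway between reserved and touched memory is why a pegged usage counter doesn't always mean a performance cliff.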


----------



## Hokies83

Quote:


> Originally Posted by *brettjv*
> 
> Vram 'usage', as seen in afterburner or whatnot, is not a real-time measurement of exactly how much memory is required to render the scene you're looking at, it's a measurement of memory space that's reserved by the application. Many apps will 'use' (aka reserve) a lot more memory than they 'actually need in order to run smoothly'. Crysis 2 is one of those games. It ran perfectly on my SLI 470's (1280MB), although the mem usage was constantly pegged. THen when I switched to my 670, I discovered that with the same settings, the app was 'using' 1.8MB of vram.
> It's really not like the minute your 'usage' goes over your 'allotment', that your performance tanks. There's typically somewhere between 10% and 50% of 'leeway' before the perf actually starts plummetting, depending on how the engine/driver are managing memory usage.


We're talking high-res textures @ 2560x1440 here; they use a huge amount of VRAM.. Skyrim and Crysis 2 both would tank my 2GB 680s when I started using more than, let's say, 2.1/2.2GB of VRAM..

As an example of just how much high-res textures change things at 2560x1440: Skyrim with no texture mods would run at about 1.8GB VRAM use... with texture mods I have seen 3.1GB VRAM use...
Now what is going to happen to a 2GB card when the app needs that much VRAM? It is going to tank.

I'd go from over 100 FPS to a choppy 30-40 FPS..

Now with my 4GB 680s the game is smooth no matter the VRAM use, because I have plenty to spare.


----------



## pilla99

Did anyone have problems with the latest driver update? Last night I installed the driver and was met with a nice black screen near the end of the install. After some time I restarted the computer and had nothing but black, but I could tell the computer was booting normally and was at the login screen from the keyboard backlight timing.

I had to boot into safe mode and uninstall only the 3D Vision portion of the driver. Things work fine now, but that was weird, man.


----------



## Hokies83

Quote:


> Originally Posted by *pilla99*
> 
> Did anyone have problems with the latest driver update? last night I installed the driver and was met with a nice black screen near the end of the install. After some time I restarted the computer and had nothing but black, but could tell the computer was booting normally and at the login screen from the keyboard backlight timing.
> I had to boot in safe mode and uninstall only the 3d vision portion of the driver. Things work fine now but that was weird man.


Played Darksiders II for 50 hours on 306.23 with no issues..

With the new driver it has crashed about 8x in 2 hours.


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> Were talking High Res textures @ 2560x1440 Res here it uses a huge amount of Vram.. Skyrim and Crysis 2 Both would tank my 2 gb 680s when i started using more then lets say 2.1 / 2.2 GB of Vram..
> As an example of just how much high res textures change things at 2560x1440 Skyrim with no texture mods would run at about 1.8gb Vram use... With Texture mods i have seen 3.1gb Vram use...
> Now what is going to happen to a 2gb card when the app needs that much Vram.. it is going to tank.
> Id go from over 100fps to a choppy 30-40fps..
> Now with my 4GB 680s the game is smooth no matter the Vram use cause i have plenty to spare.


Do you not read what anyone posts? Brettjv said what I said, and you still go on ranting about your 3GB. A game that is running fine at the same settings you're using would not need a gig more of RAM. The only thing that really uses VRAM is resolution. And if you ran into a VRAM problem you would get 0.4 FPS, not 40.
And where did you get that system RAM usage? That is an option in Windows to allocate disk space for virtual RAM. The GPU is not in any way related to system RAM; system RAM is reserved for the CPU and the CPU only. That is why GPU RAM was created; if it were your way, companies would focus on creating 16GB RAM at 6000MHz. Never once have I seen RAM usage go up due to a GPU VRAM bottleneck. Talk about misinformation.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> Do you not read what anyone posts? Brettjv said what I said and you still go on ranting about your 3gb. A game that is running fine at the same settings youre using would not need a gig more of ram. The only thing that really uses vram is resolution. And if you ran into a vram problem you would get 0.4 fps not 40.
> And where did you get that system ram usage, that is an option in windows to allocate disk space for virtual ram. The gpu is not in anyway related to system ram, system ram is reserved for the cpu and the cpu only, that is why gpu ram was created, if it was your way companies would focus on creating 16gb ram at 6000mhz. Never once have I seen ram usage go up due to a gpu vram bottleneck. Talk about missinformation.


I'm a GPU company hardware rep / reviewer; I know exactly what I'm talking about.

When a GPU runs out of VRAM it has to take it from somewhere else.. SYSTEM RAM.


----------



## Divineshadowx

Quote:


> Originally Posted by *Hokies83*
> 
> Im a Gpu Company Hardware rep / reviewer i know exactly what im talking about.
> When a Gpu runs out of Vram it has to take it from somewhere else.. SYSTEM RAM.


How does being a GPU reviewer for a GPU company mean you know anything? Next thing we're gonna hear is that you're Bill Gates. You're probably the one who told NVIDIA to put 2GB instead of 4GB of VRAM in the 690. I haven't seen you post a single thing that makes sense.


----------



## Hokies83

Quote:


> Originally Posted by *Divineshadowx*
> 
> How does being a gpu reviewer for a gpu comapany mean you know anything. Next thing were gonna hear is that youre bill gates. Youre probably the one who told nvidia to put 2gb instead of 4gb vram in the 690. I havnt seen you post a single thing that makes sense.


Nothing you post makes sense, like your claim that a 3930k is twice as fast as a 3770k...

When you run out of VRAM your system has to take it from somewhere else: your system RAM.

I messaged my guy who works for Nvidia on AnandTech to post the graphs, results, and information on this, just to show you have no idea...

And it is not my fault the GTX 690 only has 2GB of usable VRAM; that was Nvidia's fail. For the price of that thing it should have had 4GB, not 2GB.


----------



## jassilamba

Quote:


> Originally Posted by *Hokies83*
> 
> Nothing you post makes sense like your 3930k is twice as fast as a 3770k...
> When you run out of Vram your sysystem has to take it from somewhere else Your System ram.
> I msged my guy who works for Nvidia on AnAndtech to post the Grafs and results and information on this just to show you have no idea...
> And it is not my fault the Gtx 690 only has 2gb of Vram that was Nvidia's Fail for the price of that thing it should of had 4gb not 2gb.


But doesn't the 690 have a total of 4GB of RAM, 2 for each GPU??


----------



## Alatar

Clock for clock the 3770K is around ~5-7% faster in less-than-4-threaded stuff. The 3930K is around 45% faster in stuff that can use all 12 threads.

(sorry for OT)
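Those two numbers fall out of a simple scaling model. A sketch, taking a ~5% per-clock edge for the quad-core and perfect scaling across physical cores as assumptions (Hyper-Threading is ignored here, so this is only a rough illustration):

```python
# Back-of-envelope throughput model for a 4-core vs 6-core CPU at equal clocks.
# The 5% per-clock edge and linear core scaling are assumptions, not benchmarks.

def relative_throughput(threads_used, cores, per_clock=1.0):
    """Work per unit time, assuming perfect scaling up to the core count."""
    return per_clock * min(threads_used, cores)

# 4-core part with a ~5% per-clock edge vs a 6-core baseline part.
for threads in (1, 4, 8, 12):
    quad = relative_throughput(threads, 4, per_clock=1.05)
    hexa = relative_throughput(threads, 6, per_clock=1.00)
    print(threads, round(hexa / quad, 2))  # 12 threads -> ~1.43x for the 6-core
```

Under these assumptions the quad-core wins by ~5% up to 4 threads, and the 6-core pulls ~43% ahead once all its cores are loaded, which roughly matches the figures quoted above.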


----------



## Hokies83

Quote:


> Originally Posted by *jassilamba*
> 
> But dosen't the 690 has total of 4gb of ram, 2 for each GPU??


Yes, 2GB per GPU, but it does not add up to 2GB + 2GB; it is only 2GB of usable VRAM.. same as if you were to SLI two 2GB cards.

You'll possibly also see some odd behaviour at times - where, if you're asking the GPU to load too much in terms of textures, it doesn't even try to fit them into VRAM, and just goes straight to system RAM... in these scenarios, you can be seeing what amounts to a random VRAM usage number in monitoring software, and high system RAM usage levels, but also at the same time terrible framerates.

From my Nvidia guy 2 mins ago..
Also in here http://www.overclock.net/t/1131405/do-you-need-2gb-vram/0_20
Quote:


> Originally Posted by *Alatar*
> 
> clock for clock the 3770K is around ~5-7% faster in less than 4 threaded stuff. The 3930K is around 45% faster in stuff that can use all the 12 threads.
> (sorry for OT
> 
> 
> 
> 
> 
> 
> 
> )


Yes, but how many things use more than 8 threads in the year 2012... not many at all.. We're still 5-10 years away from when more than 8 threads is needed by the avg user.


----------



## Alatar

Quote:


> Originally Posted by *Hokies83*
> 
> Yes but how many things use more then 8 threads in the year 2012... Not many at all.. Were still 5 = 10 years away from when more then 8 threads is needed by the Avg user.


Aside from games, most of the stuff that actually benefits from a CPU as fast as the 3770K or the 3930K does use more than 4 threads... and most of the time that means the software can utilize all 12 threads of the 3930K as well.

No point in comparing CPUs like that in mainstream programs, browser usage, or something like that. Basically the market is gaming and workstation use. Gaming gives a small edge to the 3770K (SB-E does really well when extra cache is useful, though), and rendering/video or image editing/encoding etc. workstation usage pretty clearly favors the 3930K.

We can do a bench-off when I get my RAM back from RMA if you want.


----------



## jassilamba

Quote:


> Originally Posted by *Hokies83*
> 
> Yes 2gb per card but it does not add 2gb+2gb it is only 2gb of useable Vram.. Same as you were to SLI 2 2gb cards.
> You'll possibly also see some odd behaviour at times - where, if you're asking the GPU to load too much in terms of textures, it doesn't even try to fit them into VRAM, and just goes straight to system RAM... in these scenarios, you can be seeing what amounts to a random VRAM usage number in monitoring software, and high system RAM usage levels, but also at the same time terrible framerates.


So should I stay away from 3 1440p monitors and stick with 1, or do I go with 3 1080p ????


----------



## Hokies83

Quote:


> Originally Posted by *Alatar*
> 
> Aside from games, most of the stuff that actually benefit from a CPU as fast as the 3770K or the 3930K do use more than 4 threads... And most of the time that means that the software can utilize all the 12 of the 3930K as well.
> No point in comparing CPUs like that in mainstream programs, browser usage or something like that. Basically the market is gaming and worstation use. Gaming gives a small edge to the 3770K (SB-E does really well when extra cache is useful though) and rendering/video or img editing/ encoding etc. workstation usage pretty clearly favors the 3930K.
> We can do a bench off when I get my RAM back from rma if you want


Sure; I'm going to delid my 3770k this weekend and pump it way up, 5.3-5.5GHz.. as I do not need a lot of vcore....

But as far as 3DMark 11 goes.. my CPU was fail sauce; I was scoring about 1000 points less than I should have...
Quote:


> Originally Posted by *jassilamba*
> 
> So I should stay away from 3 1440P monitors and stick with 1, or do I go 3 1080P ????


3x 1080p monitors can use more than 2GB of VRAM too.

How long do you plan on keeping the 690? That is the question of concern here.

One 2560x1440 panel will not, at this date, go over 2GB of VRAM use if you do not use texture mods; keep that in mind. But that is just today's games; you do not know what will happen 6 months from now..

The GTX 790 will have 3GB of VRAM.


----------



## Alatar

Quote:


> Originally Posted by *Hokies83*
> 
> Sure im going to De lid my 3770k this week end and pump it way up 5.3 = 5.5ghz.. as i do not need a lot of Vcore....
> But as far as 3D mark 11 goes.. My cpu was fail sauce i was scoreing about 1000 points less then i should have...


Sure, I'll crank my CPU up to 5.5 as well.

Also clock the memory higher and tighten the timings for a better physics score.


----------



## iARDAs

Quote:


> Originally Posted by *Hokies83*
> 
> The Gtx 790 will have 3gb of Vram.


Honestly, I would wish for 4GB of VRAM.

But still, 3 is not bad. I am going to wait until it comes out and that will be my next purchase.


----------



## Hokies83

Quote:


> Originally Posted by *Alatar*
> 
> Sure, I'll crank my CPU up to 5.5 as well
> 
> 
> 
> 
> 
> 
> 
> 
> Also clock memory higher and tighten timings for a better physics score.


My memory will do no more than 2400MHz.

But even with 2400MHz RAM my 3DMark physics score was 1000 points lower than it should have been.

http://www.3dmark.com/3dm11/4159887


----------



## Alatar

Quote:


> Originally Posted by *Hokies83*
> 
> My memory will do no more then 2400mhz.
> But even with 2400mhz ram my 3d mark physx score was 1000 points lower then it should have been.


Compared to what scores? Those specific TridentX sticks aren't really special: single-sided Hynix AFAIK; the 2600MHz+ versions have much better chips (dual-sided as well).

Anyway, can't you get the timings any tighter?


----------



## jassilamba

Quote:


> Originally Posted by *Hokies83*
> 
> Sure im going to De lid my 3770k this weekend and pump it way up 5.3 = 5.5ghz.. as i do not need a lot of Vcore....
> But as far as 3D mark 11 goes.. My cpu was fail sauce i was scoreing about 1000 points less then i should have...
> 3x 1080i monitors can use more then 2gb of Vram to.
> How long do u plan on keeping the 690 is the question of concern here?
> 1 2560x1440 panel will not at this date go over 2gb Vram use if you do not use texture mods keep that in mind but that is just todays games you do not know what will happen 6 months from now..
> The Gtx 790 will have 3gb of Vram.


I do plan to stay with this card for a while, to be honest (well, that's what I said 6 months ago). And I just dropped $200.00 on a Heatkiller block with backplate, so I would love to keep this for a while, at least a year. And yes, if a game comes out that I want to play and can't play at 60 FPS, I will get another card (given it's as sexy as the 690).

And on an off note, why does NVIDIA market the card as 4GB....


----------



## lukeman3000

Quote:


> Originally Posted by *Hokies83*
> 
> Were talking High Res textures @ 2560x1440 Res here it uses a huge amount of Vram.. Skyrim and Crysis 2 Both would tank my 2 gb 680s when i started using more then lets say 2.1 / 2.2 GB of Vram..
> As an example of just how much high res textures change things at 2560x1440 Skyrim with no texture mods would run at about 1.8gb Vram use... With Texture mods i have seen 3.1gb Vram use...
> Now what is going to happen to a 2gb card when the app needs that much Vram.. it is going to tank.
> Id go from over 100fps to a choppy 30-40fps..
> Now with my 4GB 680s the game is smooth no matter the Vram use cause i have plenty to spare.


So.. is a 690 not such a hot decision then? I am a little surprised it only has 2GB of usable VRAM for being the Mercedes-Benz of graphics cards. Not very future-proof..? Or am I missing something here?

Will I run into problems gaming at 1440p with a 690?


----------



## pilla99

Quote:


> Originally Posted by *lukeman3000*
> 
> So... is a 690 not such a hot decision then? I am a little surprised it only has 2 GB of usable VRAM for being the Mercedes-Benz of graphics cards. Not very future-proof? Or am I missing something here?
> Will I run into problems gaming at 1440 with a 690?


I've had no problems personally. The game that kicks my system the hardest right now is Metro 2033. On DX9 very high settings at 1440p I get FPS in the 30s. I don't think that's a VRAM thing, though.

BF3 and everything else run 60 FPS minimum.


----------



## dmaffo

Thank you! Having lots of fun simply playing around with stress tests, but I would like to try out some games. Any good flight- or driving-style games you'd recommend?
I know, I have this card and I'm not a gamer... go figure...


----------



## Hokies83

Quote:


> Originally Posted by *dmaffo*
> 
> Thank you! Having lots of fun simply playing around with stress tests, but I would like to try out some games. Any good flight- or driving-style games you'd recommend?
> I know, I have this card and I'm not a gamer... go figure...


Download Steam and have a look through everything.


----------



## lukeman3000

Quote:


> Originally Posted by *pilla99*
> 
> I've had no problems personally. The game that kicks my system the hardest right now is Metro 2033. On DX9 very high settings at 1440p I get FPS in the 30s. I don't think that's a VRAM thing, though.
> BF3 and everything else run 60 FPS minimum.


Whoa, what? I thought you should be getting at least a solid 60 FPS in Metro? What res are you at? How many monitors?

Edit: Never mind, just saw your 2x 1440 Catleaps. Well, that's probably why lol. I bet I'll be fine with a single monitor at 1920x1080.

Also, add me to the list!


----------



## jassilamba

Quote:


> Originally Posted by *lukeman3000*
> 
> Whoa, what? I thought you should be getting at least a solid 60 FPS in Metro? What res are you at? How many monitors?
> Edit: Never mind, just saw your 2x 1440 Catleaps. Well, that's probably why lol. I bet I'll be fine with a single monitor at 1920x1080.
> Also, add me to the list!
> 
> 
> Spoiler: Warning: Spoiler!


Every time I see this sexy card, I get a nice warm feeling.


----------



## pilla99

Quote:


> Originally Posted by *lukeman3000*
> 
> Whoa, what? I thought you should be getting at least a solid 60 FPS in Metro? What res are you at? How many monitors?
> Edit: Never mind, just saw your 2x 1440 Catleaps. Well, that's probably why lol. I bet I'll be fine with a single monitor at 1920x1080.
> Also, add me to the list!


I should probably edit my sig; I no longer have two Catleaps, just one. I sold the second one because it turns out that here at college I just don't have the room for two of those bad boys. Right now I have a Catleap and a 32" Vizio.

I am playing Metro on one Catleap at 1440p and getting those poor results. I am going to tweak some settings tonight and see if the new driver does anything. But yeah, the game wasn't being kind in the opening scene. I did much better performance-wise when I had the two 6870s at 1920.

Also, another thing I didn't think about: I don't have multi-GPU mode enabled because of the two monitors. Hmm, OK, I'll report back after changing that and see what happens.

Edit 2: OK, that was it. Multi-GPU being disabled on account of the two monitors was why my performance sucked. I am getting 60s for average FPS now at very high settings at 1440p. Much better.


----------



## Hokies83

Quote:


> Originally Posted by *pilla99*
> 
> I should probably edit my sig; I no longer have two Catleaps, just one. I sold the second one because it turns out that here at college I just don't have the room for two of those bad boys. Right now I have a Catleap and a 32" Vizio.
> I am playing Metro on one Catleap at 1440p and getting those poor results. I am going to tweak some settings tonight and see if the new driver does anything. But yeah, the game wasn't being kind in the opening scene. I did much better performance-wise when I had the two 6870s at 1920.
> Also, another thing I didn't think about: I don't have multi-GPU mode enabled because of the two monitors. Hmm, OK, I'll report back after changing that and see what happens.


Aftermarket cooler = FTW


----------



## Arizonian

Quote:


> Originally Posted by *lukeman3000*
> 
> Woah what? I thought you should be getting at least 60FPS solid on Metro? What res are you at? How many monitors?
> Edit: Nevermind, just saw your 2x 1440 Catleap. Well, that's probably why lol. I bet I'll be fine with a single monitor at 1920x1080.
> Also, add me to the list!
> 
> 
> Spoiler: Warning: Spoiler!


Congrats and welcome to the club.







Looks good.


----------



## lukeman3000

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats and welcome to the club.
> 
> 
> 
> 
> 
> 
> 
> Looks good.


Thanks!

And to the guy with poor Metro 2033 performance: I'm glad you got it sorted! You were essentially playing with a single 680 LOL


----------



## DinaAngel

Quote:


> Originally Posted by *jassilamba*
> 
> Every time I see this sexy card, I get a nice warm feeling.


Heat related?

We can recommend radiators for you:








http://en.wikipedia.org/wiki/Liquid_Cooling_and_Ventilation_Garment


----------



## jassilamba

Quote:


> Originally Posted by *DinaAngel*
> 
> Heat related?
> 
> We can recommend radiators for you:
> 
> 
> 
> 
> 
> 
> 
> 
> http://en.wikipedia.org/wiki/Liquid_Cooling_and_Ventilation_Garment


What don't we have out there? That is awesome, but I guess this time of year in MN any heat is welcome lol.


----------



## Bkirca

Hi,

I have just bought an Asus GTX 690 and, as an overclocking noob, have some questions.








Well, I don't have any kind of OC experience, but after reading some posts here I wonder why some values look weird on my card:


Spoiler: Warning: Spoiler!
























Any idea why the "current clock" line shows values that low? Also, I am using an Asus P8Z68-V Pro Gen3 motherboard, which supports PCI-E 3.0, but NVIDIA Inspector shows PCI-E 3.0 x16 *@ x16 2.0* (Asus GPU Tweak even shows it *@ x16 1.1*). The card is seated in the uppermost PCI-E slot (as seen in the photos above), and I am pretty sure that the slot is suitable for PCI-E 3.0/2.0/1.1.

I want to get the full performance out of this card, so please help me with it.


----------



## Hokies83

Quote:


> Originally Posted by *Bkirca*
> 
> Hi,
> I have just bought an Asus GTX 690 and, as an overclocking noob, have some questions.
> 
> 
> 
> 
> 
> 
> 
> 
> Well, I don't have any kind of OC experience, but after reading some posts here I wonder why some values look weird on my card:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any idea why the "current clock" line shows values that low? Also, I am using an Asus P8Z68-V Pro Gen3 motherboard, which supports PCI-E 3.0, but NVIDIA Inspector shows PCI-E 3.0 x16 *@ x16 2.0* (Asus GPU Tweak even shows it *@ x16 1.1*). The card is seated in the uppermost PCI-E slot (as seen in the photos above), and I am pretty sure that the slot is suitable for PCI-E 3.0/2.0/1.1.
> I want to get the full performance out of this card, so please help me with it.


The card throttles down at idle to reduce power use.

When you stress the card it ramps up, depending on how hard you push it, toward its max boost clock.
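The throttle/boost behavior can be sketched as a toy control loop. This is purely illustrative, not NVIDIA's actual GPU Boost algorithm (the step size and thresholds here are made up), though the 915/1019/324 MHz figures match the 690's published base, boost, and idle clocks:

```python
# Toy model of GPU Boost-style clock management (illustrative only --
# the real algorithm, bin size, and thresholds are NVIDIA's and not public).

BASE_CLOCK = 915     # MHz, GTX 690 base clock
BOOST_MAX = 1019     # MHz, typical boost clock
IDLE_CLOCK = 324     # MHz, 2D/idle clock
POWER_TARGET = 100   # percent of rated board power

def next_clock(current, load_pct, power_pct):
    """Step the clock toward boost under load, or drop to idle."""
    if load_pct < 10:
        return IDLE_CLOCK                      # idle: throttle down to save power
    current = max(current, BASE_CLOCK)         # under load, at least base clock
    if power_pct < POWER_TARGET and current < BOOST_MAX:
        return min(current + 13, BOOST_MAX)    # headroom left: step up one bin
    if power_pct > POWER_TARGET:
        return max(current - 13, BASE_CLOCK)   # over power target: back off
    return current

clock = IDLE_CLOCK
for _ in range(20):                            # sustained 3D load with headroom
    clock = next_clock(clock, load_pct=95, power_pct=80)
print(clock)   # climbs from idle up to the max boost bin: 1019
```

So the low "current clock" readings at the desktop are expected; under a real load the card walks itself up toward boost as long as power and temperature allow.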


----------



## DinaAngel

Quote:


> Originally Posted by *Bkirca*
> 
> Hi,
> I have just bought an Asus GTX 690 and, as an overclocking noob, have some questions.
> 
> 
> 
> 
> 
> 
> 
> 
> Well, I don't have any kind of OC experience, but after reading some posts here I wonder why some values look weird on my card:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any idea why the "current clock" line shows values that low? Also, I am using an Asus P8Z68-V Pro Gen3 motherboard, which supports PCI-E 3.0, but NVIDIA Inspector shows PCI-E 3.0 x16 *@ x16 2.0* (Asus GPU Tweak even shows it *@ x16 1.1*). The card is seated in the uppermost PCI-E slot (as seen in the photos above), and I am pretty sure that the slot is suitable for PCI-E 3.0/2.0/1.1.
> I want to get the full performance out of this card, so please help me with it.


nothing wrong there


----------



## Bkirca

Great!









Then what about the PCI-E 3.0 thing? Also, I was wondering if it's that important, or whether it makes a difference if it's 2.0 or 3.0...


----------



## PhantomTaco

Honestly, it makes zero difference. There are no cards out there that entirely saturate PCI-E 2.0 (or maybe the 690 was at the limit, I forget). It'll make a difference maybe in a generation of cards or so, but not now.


----------



## Hokies83

Quote:


> Originally Posted by *Bkirca*
> 
> Great!
> 
> 
> 
> 
> 
> 
> 
> 
> Then what about the PCI-E 3.0 thing? Also, I was wondering if it's that important, or whether it makes a difference if it's 2.0 or 3.0...


Your hardware is older.

PCI-E 3.0 means a Z77 chipset with an Ivy Bridge CPU; the PCI-E lanes actually come from the CPU.
Quote:


> Asus P8Z68-V Pro Gen3 mainboard


The Z68 "Gen3" boards like yours can run PCI-E 3.0, but only with an Ivy Bridge CPU installed; with a Sandy Bridge CPU you are limited to PCI-E 2.0.

I don't know about the 690 and bandwidth specifically; you would think it would matter more for a GTX 690 than for a single GPU. I do know that on a single GPU like a 680, PCI-E 2.0 vs 3.0 under normal use is a 1-2% performance difference.

With extreme resolutions and multiple GPUs it is a lot more.
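To put numbers on the 2.0 vs 3.0 question, here is a quick sketch of the theoretical one-direction bandwidth of an x16 link per generation, using the standard per-lane transfer rates and encoding overheads (real-world throughput is lower):

```python
# Rough theoretical x16 bandwidth per PCI-E generation (one direction).
# 1.1 and 2.0 use 8b/10b encoding; 3.0 uses 128b/130b.

def x16_bandwidth_gbs(gt_per_s, encoded_bits, payload_bits, lanes=16):
    """Payload bandwidth in GB/s: per-lane rate x encoding efficiency x lanes."""
    return gt_per_s * (payload_bits / encoded_bits) / 8 * lanes

pcie_11 = x16_bandwidth_gbs(2.5, 10, 8)      # 4 GB/s
pcie_20 = x16_bandwidth_gbs(5.0, 10, 8)      # 8 GB/s
pcie_30 = x16_bandwidth_gbs(8.0, 130, 128)   # ~15.75 GB/s

print(f"1.1: {pcie_11:.2f}  2.0: {pcie_20:.2f}  3.0: {pcie_30:.2f} GB/s")
```

Worth noting for this thread: the 690 routes both GPUs through an onboard PLX PCI-E 3.0 switch sharing the one x16 slot, so each GPU effectively sees half the slot's bandwidth, which is part of why dual-GPU cards are somewhat more sensitive to the link generation than single-GPU cards.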


----------



## Bkirca

thank you for the replies!









I am not 100% confident when they say they "do support" something, but as written in the specs (link below), the motherboard supports PCI-E 3.0:

http://www.asus.com/Motherboards/Intel_Socket_1155/P8Z68V_PROGEN3/

Anyway... can anyone tell me how far I can push this card? I saw the settings below on the first page; can it be pushed further?


----------



## DinaAngel

Quote:


> Originally Posted by *Bkirca*
> 
> thank you for the replies!
> 
> 
> 
> 
> 
> 
> 
> 
> I am not 100% confident when they say they "do support" something, but as written in the specs (link below), the motherboard supports PCI-E 3.0:
> http://www.asus.com/Motherboards/Intel_Socket_1155/P8Z68V_PROGEN3/
> Anyway... can anyone tell me how far I can push this card? I saw the settings below on the first page; can it be pushed further?
> 
> 
> Spoiler: Warning: Spoiler!


Doesn't hurt to go a bit extreme,

but you need very good cooling for this. Not nitrogen, but good enough water cooling. I mostly get stable at +150 on the core; +500 on the core is only half stable, and I can only get that stable with dry ice and alcohol.


----------



## pilla99

Figured I'd post a pic of my system and desk since I never really did. This is my college / no-room-anywhere setup. The poor baby has to operate at 30,000 feet. It's been a tank, though. With all the MacBooks around here, it confuses the hipsters when they see it.


----------



## lukeman3000

Quote:


> Originally Posted by *pilla99*
> 
> Figured I'd post a pic of my system and desk since I never really did. This is my college / no-room-anywhere setup. The poor baby has to operate at 30,000 feet. It's been a tank, though. With all the MacBooks around here, it confuses the hipsters when they see it.


Is your 690 water cooled?


----------



## pilla99

Quote:


> Originally Posted by *lukeman3000*
> 
> Is your 690 water cooled?


Nope all air.


----------



## lukeman3000

Quote:


> Originally Posted by *pilla99*
> 
> Nope all air.


What kind of temps do you get under load? I think I've seen mine hit 84 at the max.

Also, if I wanted to water-cool my 690, what is the cheapest solution? I don't want to sacrifice a ton of quality, but would it make sense to get a radiator/pump/reservoir combo along with a 690 water block instead of doing a full-on custom loop?

Honestly, I don't really care about water cooling my CPU, because I was just planning on getting a Hyper 212, even though they are extremely cliché. They have been shown to work as well as, if not better than, some of the all-in-one CPU coolers such as the Antec 620.


----------



## Divineshadowx

Recommendations for a GTX 690 water block? I ordered my TH10 from Case Labs, with an XXL window and matte white finish, and I want to go with a blue tubing scheme, so I was looking at the XSPC water block. The Koolance seems to give good temps, but I don't really like the look. The EVGA also looks cool, but I think it's overpriced, and it seemed to have problems in the reviews I looked at.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Divineshadowx*
> 
> Recommendations for a gtx 690 water block? I ordered my TH10 from case labs, with a xxl window and white matte finish, and want to go with a blue tubing scheme so I was looking to get the xspc water block. The koolance seems to give good temps, but I don't really like the look. The evga also looks cool, but I think it's overpriced and seems to have problems when I looked at reviews.


I have the EK full-cover block on my GTX 690. It cools great and looks pretty good too. I'm happy with it.


----------



## jassilamba

Quote:


> Originally Posted by *Divineshadowx*
> 
> Recommendations for a gtx 690 water block? I ordered my TH10 from case labs, with a xxl window and white matte finish, and want to go with a blue tubing scheme so I was looking to get the xspc water block. The koolance seems to give good temps, but I don't really like the look. The evga also looks cool, but I think it's overpriced and seems to have problems when I looked at reviews.


TH10, nice choice. I would like to have a case like that in my next build.

I ordered the Heatkiller 690 Hole Edition. It's a heavy yet very pretty-looking block, and I'm a fan of all-metal blocks. From what I have heard, Heatkillers perform really well too.


----------



## DinaAngel

Alphacool NexXxoS Monsta 480mm,
Bitspower fans,
and a 690 water block:
ordered!!


----------



## Divineshadowx

Quote:


> Originally Posted by *DinaAngel*
> 
> Alphacool NexXxoS Monsta 480mm
> and bitspower fans
> and 690 waterblock
> ordered!!


Which block? I'm going to order two of those fat rads for now; they seem like the best on the market for their FPI rating.


----------



## DinaAngel

http://www.ekwb.com/shop/blocks/vga-blocks/fc-geforce/geforce-gtx-6x0-series/ek-fc690-gtx-acetal-nickel.html

Sweet!

Yeah, I wanted the 360 one, but given the price I figured the 480 is worth it in the long run.


----------



## DinaAngel

Hi, is this loop OK, or would a different order be better?


Spoiler: Warning: Big picture!


----------



## jassilamba

Quote:


> Originally Posted by *DinaAngel*
> 
> Hi, is this loop OK, or would a different order be better?
> 
> 
> Spoiler: Warning: Big picture!


Looks like you are headed for:

Res/Pump > 240 > CPU > 480 > GPU > Res/Pump.

How about this:

Res/Pump > GPU > 480 > CPU > 240 > Res/Pump (your GPU is going to dump the most heat into the loop).

That said, from what I have been told (and it kind of makes sense to me), the water flows fast enough that the order doesn't really matter, as long as you start and end at the res and keep the tubing runs as short as possible.
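The "order barely matters" point is easy to sanity-check with a back-of-the-envelope energy balance. The 500 W heat load and ~1 GPM flow below are assumptions for illustration, not measurements of anyone's loop:

```python
# Coolant temperature rise across an entire loop: dT = Q / (m_dot * c_p).
# At typical flow rates the rise is only a couple of degrees, so component
# order changes each block's inlet temperature by a fraction of that.

HEAT_LOAD_W = 500          # assumed: OC'd CPU + GTX 690 under load
FLOW_LPM = 3.8             # assumed: ~1 GPM, a common loop flow rate
C_P = 4186                 # J/(kg*K), specific heat of water
DENSITY = 1.0              # kg/L

m_dot = FLOW_LPM * DENSITY / 60          # mass flow in kg/s
delta_t = HEAT_LOAD_W / (m_dot * C_P)    # K between hottest and coolest point
print(f"{delta_t:.1f} C spread across the whole loop")
```

With roughly a 2 °C spread across the entire loop, whether a block sits before or after a radiator moves its inlet temperature by at most a degree or so, which is why run length usually wins over ordering.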


----------



## DinaAngel

Not sure that's a better idea, since with

Res/Pump > 240 > CPU > 480 > GPU > Res/Pump

the 480 feeds the GPU and the 240 feeds the CPU, which I feel is smartest. But thanks for the idea; I look forward to more people giving their thoughts.

Thanks


----------



## Divineshadowx

Quote:


> Originally Posted by *DinaAngel*
> 
> Not sure that's a better idea, since with
> Res/Pump > 240 > CPU > 480 > GPU > Res/Pump
> the 480 feeds the GPU and the 240 feeds the CPU, which I feel is smartest. But thanks for the idea; I look forward to more people giving their thoughts.
> Thanks


That is the optimal way: Res > Pump > Rad > GPU/CPU > Rad > GPU/CPU > Res. The only time you wouldn't follow that is if it would require significantly more tubing to get there; that would just reduce your flow for minimal difference.


----------



## DinaAngel

Quote:


> Originally Posted by *Divineshadowx*
> 
> That is the optimal way: Res > Pump > Rad > GPU/CPU > Rad > GPU/CPU > Res. The only time you wouldn't follow that is if it would require significantly more tubing to get there; that would just reduce your flow for minimal difference.


Thanks


----------



## Methos07

Count me in


----------



## Arizonian

Quote:


> Originally Posted by *Methos07*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Count me in


Congrats and consider yourself in.


----------



## Methos07

Thanks! Maybe you guys would know, but how does the GeForce logo on the side of the 690 heatsink work, exactly? Are the lights inside actually green, or could the color be changed to something other than green?


----------



## lukeman3000

Quote:


> Originally Posted by *Methos07*
> 
> Thanks! Maybe you guys would know, but how does the GeForce logo on the side of the 690 heatsink work exactly? Are the lights inside actually green, or could the color be changed to something not green?


I'm pretty sure you can't change the color, but you can change its behavior. You can dim it, make it "breathe", or turn it off altogether if you want.

Here's the tool


----------



## Methos07

Indeed, it's a nifty piece of software. I'm pretty sure it's just colorless backlighting, but I wanted to make sure. I would love to make it red to match my RIVE.


----------



## DinaAngel

Can't be that hard to hard-mod on the PCB.


----------



## DinaAngel

BTW guys, I recommend this thermal paste: http://coollaboratory.com/shop/product_info.php/products_id/3/osCsid/117470e92e67e2692ac16801c75f02cc


----------



## max883

EVGA released the new Precision X 3.04 tool with EVGA K-Boost voltage.


----------



## thestache

Quote:


> Originally Posted by *max883*
> 
> EVGA released the new Precision X 3.04 tool with EVGA K-Boost voltage.


Didn't work for me or my little brother.

His GTX 690 reported 0% usage on the cores in games, with weird errors, and my GTX 680 SLI did the same. K-Boost was a nice idea to fix the possibility of the sticky-clock bug, but it didn't work for me.


----------



## Jamesk

I got a new toy today:


----------



## Arizonian

Quote:


> Originally Posted by *Jamesk*
> 
> I got a new toy Today:
> 
> 
> Spoiler: Warning: Spoiler!


And we got a new member







She looks good lurking in the dark......Congrats and welcome aboard.


















"How to put your Rig in your Sig"


----------



## OverClocker55

OK, so I have a friend who owns a PC site and has tons of contacts and wholesalers. He says he could get me a GTX 690 for around $400 OEM. Is that even possible?


----------



## grassy

I have had one of these cards for a while now and thought I would get another, so now I am running them in quad SLI. I haven't overclocked anything and have never messed with my BIOS. I have been doing some testing, and so far I have come to the conclusion that when I was using one of these cards, my gaming was a little better than using two in SLI. Has anyone else found this to be true? When I was using only the one, my games ran a lot smoother. I must be missing something or doing something wrong.


----------



## DinaAngel

Quote:


> Originally Posted by *grassy*
> 
> I have had one of these cards for a while now and thought I would get another, so now I am running them in quad SLI. I haven't overclocked anything and have never messed with my BIOS. I have been doing some testing, and so far I have come to the conclusion that when I was using one of these cards, my gaming was a little better than using two in SLI. Has anyone else found this to be true? When I was using only the one, my games ran a lot smoother. I must be missing something or doing something wrong.


It's the reason I never bought two: two often does worse than one, since almost no games can stress two (barely even one). TinyTomLogan tested with two and found that two was worse than one, so no thanks.


----------



## iARDAs

I never tested or used a 690, but from what I've read here on OCN and elsewhere, although 690 quad SLI is a very hardcore setup and I admire people who run it, the gains are minimal and it's often actually worse.

Maybe it's because of the games, or maybe driver problems; I don't know.

Again, I never used a 690 or quad 690s (I used to own a 590, though), and what I said above is just what I've read in various places.


----------



## Methos07

Even when quad SLI doesn't hurt performance, the gains you get from another card are nothing compared to the gains of a single 690. Even at its best, the gain from adding another card is what, 35-45%? Hardcore diminishing returns.


----------



## MrTOOSHORT

2 GB of VRAM hurts quad-SLI GTX 690s.

On to other news, here is my watercooled gtx690:


----------



## bitMobber

How has everyone's GTX 690 been holding up? Has anyone had to RMA a card?

I've had no problems with mine for over 5 months, until now. It all started when I began to receive the error: "Display driver stopped responding and has recovered". Now I don't even receive the error; instead, when I'm playing a game, the game will freeze for a split second and then my monitor screen goes black. It loses the video signal completely, but the computer does not freeze; I can still hear the game running in the background. I'm forced to hard-restart my computer at this point.

Nothing in my computer is overclocked, and I haven't changed any hardware or software. I've tried downgrading to older NVIDIA drivers, and I've cleaned all traces of the NVIDIA drivers/software and reinstalled the newest drivers, but the problem persists.


----------



## lukeman3000

Quote:


> Originally Posted by *bitMobber*
> 
> How has everyone's GTX 690 been holding up? Has anyone had to RMA a card?
> I've had no problems with mine for over 5 months, until now. It all started when I began to receive the error: "Display driver stopped responding and has recovered". Now I don't even receive the error; instead, when I'm playing a game, the game will freeze for a split second and then my monitor screen goes black. It loses the video signal completely, but the computer does not freeze; I can still hear the game running in the background. I'm forced to hard-restart my computer at this point.
> Nothing in my computer is overclocked, and I haven't changed any hardware or software. I've tried downgrading to older NVIDIA drivers, and I've cleaned all traces of the NVIDIA drivers/software and reinstalled the newest drivers, but the problem persists.


Interesting. I received a similar message the other day, about how the display kernel had stopped responding and had been recovered, or something like that.

However, it seemed to be an isolated incident until one day, when I restarted my computer, Windows would not load. I saw my BIOS POST and the Windows loading screen, but when I should have seen the desktop, I just saw some funky lines of color at the top of an otherwise black screen.

I rebooted into safe mode and uninstalled the driver. I installed Driver Fusion, rebooted, and cleaned out all traces of the NVIDIA graphics driver. Then I reinstalled the driver from my local NVIDIA folder, C:/Nvidia/306.97 (or whatever the most recent driver is).

I restarted and Windows loaded just fine. I've been playing games and everything has been good thus far. I'm thinking/hoping it was just a driver issue. If not, no big deal; EVGA has a great warranty.


----------



## tonyjones

What resolution are you guys gaming at to take advantage of the beastly GTX 690?


----------



## bitMobber

Quote:


> Originally Posted by *lukeman3000*
> 
> Interesting. I received a similar message the other day, about how the display kernel had stopped responding and had been recovered, or something like that.
> However, it seemed to be an isolated incident until one day, when I restarted my computer, Windows would not load. I saw my BIOS POST and the Windows loading screen, but when I should have seen the desktop, I just saw some funky lines of color at the top of an otherwise black screen.
> I rebooted into safe mode and uninstalled the driver. I installed Driver Fusion, rebooted, and cleaned out all traces of the NVIDIA graphics driver. Then I reinstalled the driver from my local NVIDIA folder, C:/Nvidia/306.97 (or whatever the most recent driver is).
> I restarted and Windows loaded just fine. I've been playing games and everything has been good thus far. I'm thinking/hoping it was just a driver issue. If not, no big deal; EVGA has a great warranty.


That's basically what happened to me at first. The issue only happened once or twice, a few weeks apart, but now it's happening every time I play a game. Oddly enough, it sometimes even happens when I'm just at the Windows desktop. I've tried cleaning all traces of the driver using Driver Sweeper and installing the latest 306.97 as well.

I've already contacted EVGA support, and they suggested I use Driver Fusion to clean the drivers and reinstall them. They also asked me to run EVGA OC Scanner X for a while and see if any artifacts appear; so far I've been running it for 1.5 hours with no artifacts. They also suggested putting the 690 in another computer to see if the problem follows the card; I'm going to try this later today.


----------



## Hokies83

Quote:


> Originally Posted by *tonyjones*
> 
> What resolution are you guys gaming at to take advantage of the beastly GTX 690?


It has 2 GB of VRAM, so 1080p should be a 100% safe zone without running dry. (Though I have heard reports of people running out of VRAM even at 1080p; the most I have used there with extreme AA is around 1800 MB.)

The eye-candy games I play on my 2560x1440 monitor.

2560x1440 could be an issue in some games, but on a single monitor nearly all should be fine without texture mods, extreme AA, and such.

I know it says 4 GB in the ad, but they're just adding the 2 GB per GPU together; in SLI each GPU mirrors the same data, so only 2 GB is usable. That was a fail on NVIDIA's/the board partners' part... The GTX 690 should have shipped with 4 GB of usable VRAM.
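For a rough sense of where the 2 GB goes, the fixed render-target cost at each resolution can be ballparked. The buffer count and bytes-per-pixel below are assumptions for illustration; textures, geometry, and driver overhead come on top, and that is exactly the part texture mods inflate:

```python
# Ballpark render-target memory at a given resolution and MSAA level.
# Assumes 4 bytes/pixel and three full-res surfaces (double-buffered color
# plus depth/stencil) -- a simplification, not a profiler readout.

def render_target_mb(width, height, msaa=1, bytes_px=4, buffers=3):
    """Approximate render-target footprint in MiB, scaled by MSAA samples."""
    return width * height * bytes_px * buffers * msaa / 2**20

for w, h in [(1920, 1080), (2560, 1440), (2560, 1600)]:
    print(f"{w}x{h}: {render_target_mb(w, h):>4.0f} MB plain, "
          f"{render_target_mb(w, h, msaa=8):>4.0f} MB at 8x MSAA")
```

Even at 2560x1440 with 8x MSAA the render targets are only a few hundred MB; the 3.1 GB Skyrim figure quoted earlier is overwhelmingly modded texture data, which is why resolution alone rarely breaks 2 GB but texture mods do.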


----------



## lukeman3000

Quote:


> Originally Posted by *tonyjones*
> 
> What resolution are you guys gaming at to take advantage of the beastly GTX 690?


I'm just playing at 1920x1080 on my 23" Dell IPS. Love it


----------



## tonyjones

Damn, I was hoping it would be OK maxed out on any game at 2560x1600. I currently have the Radeon 6990, but I only play at 1920x1200, so I haven't bitten the bullet yet.


----------



## Hokies83

Quote:


> Originally Posted by *tonyjones*
> 
> Damn, I was hoping it would be OK maxed out on any game at 2560x1600. I currently have the Radeon 6990, but I only play at 1920x1200, so I haven't bitten the bullet yet.


If you're gaming at 2560x1600, look at 4 GB 680s or a 7970/7950,

depending on which brand you like better.

At 1920x1200 a 690 is more than enough. I am not sure what 4K texture mods and extreme AA will do to it, though.

Somebody here would have to chime in on that.


----------



## tonyjones

Thanks for the insight; I'll research that some more. I plan on getting 2x 27" or 1x 30" monitors soon.


----------



## qiplayer

Hi everybody!
In June I bought two GTX 690s. At first I was running them on a P8P67 motherboard with an i7-2600K.
Then, as I wasn't getting better performance than with two GTX 680s, I upgraded the motherboard and CPU to a 3930K.
I was still getting better performance with the two 680s.
I went back and forth with EVGA tech support for about 3 weeks, without a positive outcome.
I even changed the PSU, as they were saying that an 850 W PSU wasn't enough, but nothing.

Now I'm back with the two GTX 680s. I just wanted to say it was a big waste of money, and tech support made me wait days for every answer, ending with no answer at all after I ran tests and upgraded the rest of the PC.

Just to say: don't be loyal to a brand, it isn't worth it, and buy where you can return things, especially with dual-GPU cards.

One single 690 with an OC, a little OC, was 20% slower than two 680s.
This was at 6000x1080.

That's it.
If you're thinking about adding another, don't!
Wait for GK110.


----------



## Arizonian

Quote:


> Originally Posted by *grassy*
> 
> I have had one of these cards for a while now and thought I would get another, so now I am running them in quad SLI. I haven't overclocked anything and have never messed with my BIOS. I have been doing some testing, and so far I have come to the conclusion that when I was using one of these cards, my gaming was a little better than using two in SLI. Has anyone else found this to be true? When I was using only the one, my games ran a lot smoother. I must be missing something or doing something wrong.
> 
> 
> Spoiler: Warning: Spoiler!


Welcome to OCN. Great first post showing your 690 SLI setup.


















I can't answer your question regarding the 690 SLI issue, not having one myself. I do recall another 690 member in this thread who had those issues with his SLI setup, but I forget what he did to resolve it.


----------



## bobbavet

G'day guys,

New to the club, and a modder of cases.

I'm looking for info on the minimum water cooling for a 3770K / GTX 690 build.

Going for a mini-ITX build with space at a minimum. Silence is the goal, not absolute cool, but in saying that, I do OC the 3770K.

The rule of thumb says I'm looking at 360mm of total rad.

I was considering a push/pull 280 x 56mm rad for the entire loop.

Any experience, info, and links appreciated. 8)

Cheers,

Bob
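Bob's rule of thumb and his proposed 280 can be compared on raw radiator frontal area. This is a crude proxy that ignores thickness, FPI, and fan speed, so treat it as a sanity check rather than a guarantee:

```python
# Compare radiator frontal area: the classic "one 120 mm section per
# block, plus a spare" rule (3770K + GPU + spare -> 360 mm) vs a single
# 280 mm rad. A 56 mm thick rad in push/pull dissipates more per cm^2
# than a slim one, so area alone understates the 280's capacity.

def rad_area_cm2(fan_mm, fans):
    """Frontal area of a rad sized for `fans` square fans of `fan_mm` side."""
    return (fan_mm / 10) ** 2 * fans

rule_of_thumb = rad_area_cm2(120, 3)   # 360 mm rad: 432 cm^2
proposed = rad_area_cm2(140, 2)        # 280 mm rad: 392 cm^2

print(f"360 mm: {rule_of_thumb:.0f} cm^2, 280 mm: {proposed:.0f} cm^2")
```

By area the 280 lands about 9% short of the 360 rule of thumb, and the extra thickness plus push/pull should make up that gap at low fan speeds, so a quiet, modest-OC loop on a single 280 x 56 mm rad looks plausible.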


----------



## Falcon3




----------



## Arizonian

Quote:


> Originally Posted by *Falcon3*
> 
> 
> 
> Spoiler: Warning: Spoiler!


Wow....two new members to OCN and the 690 club in one day....welcome.







Congrats on the card.


















How to put your Rig in your Sig


----------



## Methos07

Right now I'm using dual 120 Hz 1080p displays with my 690. It actually takes more graphics horsepower to drive 1080p @ 120 Hz than 1440p @ 60 Hz.
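That claim checks out on raw pixel throughput alone (shader cost per pixel is ignored, so this is only a rough comparison):

```python
# Pixels per second the GPU must deliver to hold each setup at its
# refresh rate. Raw throughput only -- per-pixel shading cost is ignored.

def mpix_per_s(width, height, hz):
    return width * height * hz / 1e6

per_1080p_120 = mpix_per_s(1920, 1080, 120)   # ~249 Mpix/s
per_1440p_60 = mpix_per_s(2560, 1440, 60)     # ~221 Mpix/s

print(f"1080p@120: {per_1080p_120:.0f} vs 1440p@60: {per_1440p_60:.0f} Mpix/s")
```

So even a single 1080p panel at 120 Hz demands more pixels per second than 1440p at 60 Hz, and spanning a game across both 120 Hz panels would double that figure.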


----------



## grassy

Quote:


> Originally Posted by *bitMobber*
> 
> How has everyone's GTX 690 been holding up? Has anyone had to RMA a card?
> I've had no problems with mine for over 5 months, until now. It all started when I began to receive the error: "Display driver stopped responding and has recovered". Now I don't even receive the error; instead, when I'm playing a game, the game will freeze for a split second and then my monitor screen goes black. It loses the video signal completely, but the computer does not freeze; I can still hear the game running in the background. I'm forced to hard-restart my computer at this point.
> Nothing in my computer is overclocked, and I haven't changed any hardware or software. I've tried downgrading to older NVIDIA drivers, and I've cleaned all traces of the NVIDIA drivers/software and reinstalled the newest drivers, but the problem persists.


Try switching your DVI monitor cable to another port and see what happens. That's the only thing I can think of as to why your picture would totally break down like that. That can also bring up the blue screen telling you that the computer has recovered from a serious error. Don't downgrade your drivers, as you shouldn't have to do that. Go into your NVIDIA Control Panel, have a browse in there, and see how your config is set up, but try the monitor cable first.


----------



## bobbavet

Quote:


> Originally Posted by *Arizonian*
> 
> Wow....two new members to OCN and the 690 club in one day....welcome.
> 
> 
> 
> 
> 
> 
> 
> Congrats on the card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How to put your Rig in your Sig


Yeah, I've had it about 3 weeks; I had to sell heaps of stuff and my soul on eBay for my first ever top-line card. Well, I had the 6990, but meh. lmao


----------



## bitMobber

Quote:


> Originally Posted by *grassy*
> 
> Try switching your DVI monitor cable to another port and see what happens. That's the only thing I can think of as to why your picture would totally break down like that. That can also bring up the blue screen telling you that the computer has recovered from a serious error. Don't downgrade your drivers, as you shouldn't have to do that. Go into your NVIDIA Control Panel, have a browse in there, and see how your config is set up, but try the monitor cable first.


I'm pretty sure the monitor goes black because of that driver-not-responding error. I think the NVIDIA drivers crash but do not recover.

I did 4 hours of stress testing on the 690 and there were no artifacts. Right now I'm testing the card in another computer to see if the issue follows the card to the other machine. I'm running my old GTX 295 in my machine and having no problems. Hmm...


----------



## Buzzkill

Quote:


> Originally Posted by *bitMobber*
> 
> I'm pretty sure the reason the monitor goes black is because of that driver not responding error. I think the nvidia drivers crash but they do not recover.
> I did 4 hours of stress testing the 690 and there were no artifacts. Right now I'm testing the card in another computer to see if the issue follows the card to the other machine. I'm running my old GTX 295 in my machine and I'm having no problems. Hmm...


I have the same kind of issue. Sometimes while surfing the internet I get a black screen, and I hit Escape or Ctrl+Alt+Del and it comes back, or I can wait and the desktop comes back. It has not done it while gaming. I don't get the driver-not-responding error message when it goes black. It has done it with the last three drivers, counting the beta; using 306.97 WHQL now. I am looking into getting a bigger power supply, but I think it's more of a driver issue than a power supply one.


----------



## bitMobber

Quote:


> Originally Posted by *Buzzkill*
> 
> I have the same kind of issue. Sometimes while surfing the internet I get a black screen, and I hit Escape or Ctrl+Alt+Del and it comes back, or I can wait and the desktop comes back. It has not done it while gaming. I don't get the driver-not-responding error message when it goes black. It has done it with the last three drivers, counting the beta; using 306.97 WHQL now. I am looking into getting a bigger power supply, but I think it's more of a driver issue than a power supply one.


Your power supply looks like it's powerful enough to handle it.

When my screen goes black there is nothing I can do, not even ctrl+alt+del. The signal to the video card is completely lost.

I'm beginning to think it's a software or driver issue as well. So far my 690 is acting perfectly fine in my other machine.


----------



## OverClocker55

So I contacted a wholesaler today and there is a GTX 690 3GB??? Legit? $260 OEM. No branding or anything.


----------



## grassy

Quote:


> Originally Posted by *Arizonian*
> 
> Welcome to OCN. Great first post showing your 690 SLI set up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can't answer your question regarding 690 SLI issue not having it myself. I do recall another 690 member who had those issues in this thread with his SLI set up and I forgot what that individual did to resolve it.


Thanks for the welcome, it's good to be here. It's good to know that there are people out there with the same card as me, to be able to share the ins and outs of the 690.


----------



## Venatik

*points at sig*

Built the rig at the beginning of this month, but cannot post (or have any) pictures from work.









Will post some when I get home in about 30 days!









And yeah, I haven't used the PC yet. A friend keeps reminding me of the fact every day... the bastard.


----------



## pilla99

So sort of a nightmare this morning. Last night per usual I left the computer on so that it can charge my phone. It goes to sleep after an hour of no activity but still charges USB devices. I turned the monitor off before it actually fell asleep because it's sorta bright and the girlfriend complained.

Ok cool.

This morning I wake it up and turn the monitor back on.
This is happening, the lines are moving around and changing direction, this is not static.



Commence freak out.
I restart the computer and everything boots up just fine. I didn't touch a single cable to the computer or monitor, and everything is working fine now.
These are my overclock settings. Vsync off in BF3 I hit 82C max after an hour or so of playing. So I don't think it's getting too hot.



I have no idea what this was. Again, everything is fine right now. I have never had artifacting or anything weird with this card, ever. I am going to game for a bit and see if anything weird happens. Could this just be a fluke?


----------



## Hokies83

Quote:


> Originally Posted by *pilla99*
> 
> So sort of a nightmare this morning. Last night per usual I left the computer on so that it can charge my phone. It goes to sleep after an hour of no activity but still charges USB devices. I turned the monitor off before it actually fell asleep because it's sorta bright and the girlfriend complained.
> Ok cool.
> This morning I wake it up and turn the monitor back on.
> This is happening, the lines are moving around and changing direction, this is not static.
> 
> Commence freak out.
> I restart the computer and everything boots up just fine. I didn't touch a single cable to the computer or monitor, and everything is working fine now.
> These are my overclock settings. Vsync off in BF3 I hit 82C max after an hour or so of playing. So I don't think it's getting too hot.
> 
> I have no idea what this was. Again, everything is fine right now. I have never had artifacting or anything weird with this card, ever. I am going to game for a bit and see if anything weird happens. Could this just be a fluke?


Did it go into sleep mode?

Sleep mode can cause all kinds of little issues... I had a GTX 470 die in sleep mode because I forgot to turn it off...

Also your board does not have the option of USB power to be on with the computer off?


----------



## pilla99

No, it has gone to sleep many times without any issues, almost every night. I haven't checked; maybe it does support it.


----------



## Hokies83

Quote:


> Originally Posted by *pilla99*
> 
> No it has gone to sleep many times without any issues. Almost every night. I haven't checked maybe it does support it.


So have others; that doesn't mean it happens every time.

First thing I always do now is turn sleep mode off.
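If you want to follow the advice above and turn sleep off entirely, a quick way on Windows (assuming an AC-powered desktop, run from an elevated prompt) is:

```
powercfg /change standby-timeout-ac 0
powercfg /change hibernate-timeout-ac 0
```

A timeout of 0 disables automatic sleep/hibernate on AC power without touching the rest of the power plan.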


----------



## thestache

Quote:


> Originally Posted by *qiplayer*
> 
> Hi everybody!
> In June I bought 2 GTX 690s; first I was running them on a P8P67 mobo with an i7-2600K.
> Then, as I didn't get better performance than with 2 GTX 680s, I upgraded mobo and CPU to a 3930K.
> Still, I was getting better performance with the 2 680s.
> I wrote back and forth with the EVGA tech support for about 3 weeks, without a positive outcome.
> I even changed PSUs, as they were saying that an 850 watt PSU wasn't enough, but nothing.
> Now I'm back with the 2 GTX 680s. Just wanted to say it was a big waste of money, and the tech support made me wait days for every answer and ended with a non-answer after I ran tests and upgraded the rest of the PC.
> Just to say: don't be loyal to a brand, it isn't worth it, and buy where you can bring things back, especially with dual GPUs.
> One single 690 with an OC, a little OC, was 20% slower than 2 680s.
> This at a resolution of 6000x1080.
> That's it.
> Just if you're thinking about adding another, don't!
> Wait for GK110


Why are people still bothering with GTX 690 SLI when so many people in this thread have shown it's pointless?

Tri 4GB GTX 680 SLI is all you need for surround. Anything more is pointless and for your useless e-peen.


----------



## DinaAngel

Quote:


> Originally Posted by *thestache*
> 
> Why are people still bothering with GTX 690 SLI when so many people in this thread have shown it's pointless?
> Tri 4GB GTX 680 SLI is all you need for surround. Anything more is pointless and for your useless e-peen.


Totally agree.
I've got a friend who is all like, "ooh, I got two 690s!!" and then we line up scores and he gets worse fps than my single card at stock, but he has major lag issues with scaling all the time. I don't have issues with lag, so I'm pleased.

My 690 has had issues in games lately with the newest drivers; most games are at about 20% load on each GPU at the lowest.
It really seems to be the drivers not allowing full loads, so two 690s must be really, really bad.

But in the Heaven benchmark it's 99% load on each GPU core on mine.


----------



## Arizonian

My Core clock went from 1170 MHz to 1163 MHz Core without me touching anything. Memory has been at stock. Noticed during 306.23 driver & persisted in 306.42 using Windows 7. Rolling back to 301.42 did not fix the lower Core clocks achievable any longer.

Seems my normal +Core offset that used to get me 1170 MHz Core is only getting me 1163 MHz Core.

Increasing the Core Offset to the max OC I've achieved, 1176 MHz, it still only reached 1163 MHz, even though GPU usage was 97%.

Only took +10 on Core Offset to crash game.

Installed Windows 8 Pro Preview tonight and still no fix for the Core overclock decrease. I'm leaning toward chip degradation even though voltage was never added. It's strange, because I'm gaming very stably ATM and maxed settings in games are still giving stellar performance, except for the fact that I'm not going over 1163 MHz on the first GPU anymore.

My second GPU is suffering the same decrease, as it used to run at 1197 MHz and now only 1189 MHz alongside the other GPU.

What do you guys think?









Edited to add: Whether I have my Power Target at +135% or +100% results in 1163 MHz Core clock max OC.


----------



## thestache

Quote:


> Originally Posted by *Arizonian*
> 
> My Core clock went from 1170 MHz to 1163 MHz Core without me touching anything. Memory has been at stock. Noticed during 306.23 driver & persisted in 306.42 using Windows 7. Rolling back to 301.42 did not fix the lower Core clocks achievable any longer.
> Seems my normal +Core offset that used to get me 1170 MHz Core is only getting me 1163 MHz Core.
> Increasing the Core Offset to the max OC I've achieved, 1176 MHz, it still only reached 1163 MHz, even though GPU usage was 97%.
> Only took +10 on Core Offset to crash game.
> Installed Windows 8 Pro Preview tonight and still no fix for the Core overclock decrease. I'm leaning toward chip degradation even though voltage was never added. It's strange, because I'm gaming very stably ATM and maxed settings in games are still giving stellar performance, except for the fact that I'm not going over 1163 MHz on the first GPU anymore.
> My second GPU is suffering the same decrease, as it used to run at 1197 MHz and now only 1189 MHz alongside the other GPU.
> What do you guys think?
> 
> 
> 
> 
> 
> 
> 
> 
> Edited to add: Whether I have my Power Target at +135% or +100% results in 1163 MHz Core clock max OC.


My brother's card has also dropped 10 MHz off one core since we got it, and they both used to run at 1212 MHz. Could it be possible the GPUs don't like the overclock and, as a result of it, are degrading and needing more voltage to maintain it?

Could be why Nvidia doesn't want overclocking on this generation.


----------



## DinaAngel

Quote:


> Originally Posted by *Arizonian*
> 
> My Core clock went from 1170 MHz to 1163 MHz Core without me touching anything. Memory has been at stock. Noticed during 306.23 driver & persisted in 306.42 using Windows 7. Rolling back to 301.42 did not fix the lower Core clocks achievable any longer.
> Seems my normal +Core offset that used to get me 1170 MHz Core is only getting me 1163 MHz Core.
> Increasing the Core Offset to the max OC I've achieved, 1176 MHz, it still only reached 1163 MHz, even though GPU usage was 97%.
> Only took +10 on Core Offset to crash game.
> Installed Windows 8 Pro Preview tonight and still no fix for the Core overclock decrease. I'm leaning toward chip degradation even though voltage was never added. It's strange, because I'm gaming very stably ATM and maxed settings in games are still giving stellar performance, except for the fact that I'm not going over 1163 MHz on the first GPU anymore.
> My second GPU is suffering the same decrease, as it used to run at 1197 MHz and now only 1189 MHz alongside the other GPU.
> What do you guys think?
> 
> 
> 
> 
> 
> 
> 
> 
> Edited to add: Whether I have my Power Target at +135% or +100% results in 1163 MHz Core clock max OC.


Quote:


> Originally Posted by *thestache*
> 
> My brother's card has also dropped 10 MHz off one core since we got it, and they both used to run at 1212 MHz. Could it be possible the GPUs don't like the overclock and, as a result of it, are degrading and needing more voltage to maintain it?
> Could be why Nvidia doesn't want overclocking on this generation.


I doubt it's degrading; it might be something getting hotter than 70 degrees, at which point it will automatically clock down the current. Try setting fan speed to max, open the Heaven benchmark and test; don't use any games to test.
Setting the power target to +135 isn't going to do anything at all, since your core clock is too low for that.

I'd say your power draw is 65 to 75% out of 100%, so why set it to 135%? It's not going to do much at all; it's not even going to make the card hotter.

Roughly, the way Kepler overclocking works:

requested MHz -> boost controller decides stable or not, calculating by risk and the amount of current needed
temperature -> current needed: the controller only raises the current if the temperature is OK according to its own calculations or some built-in table

For example:
request 1300 MHz -> controller asks "is this allowed?"
at low temperature the sensors report it's perfectly fine to supply max current -> request granted -> stable

Higher volts aren't needed; on Kepler the voltage is just the voltage, it isn't the current. Power draw and the power limit depend on the clock: if you set +150 on the core, it's going to draw more power, but it might not need more than 100% power target, while the temps may be too risky for its sensors to raise the current to where it's needed.

At around 1280 MHz it might go over 100% power and need +35 power target; 1300 MHz needs 135 or 150 to stay stable, and I'd say it might need a little bit more, I haven't calculated it. But yeah, I've got the 150% power target BIOSes, so when I get my waterblock at the end of the week or next Monday I'll try for 1300 or so.

You can also try reflashing the BIOS.

I don't understand how you guys don't know this?
This is dynamic!!

The 690 has a built-in adjuster to allow more or less current in, so if the temp is OK it can set the clock higher.
I bet all of you could manage 1300 MHz with low enough temps; mine sits at about 70% out of 100% power at 1180 MHz.
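For what it's worth, the temperature-dependent behaviour described above can be sketched as a toy model. The 13 MHz bin size comes up later in the thread; the 70C threshold and the one-bin-per-10C rate are assumptions for illustration, not NVIDIA specs:

```python
STEP_MHZ = 13          # Kepler boost moves in ~13 MHz bins
THROTTLE_START_C = 70  # assumed temperature where boost starts shedding bins
C_PER_STEP = 10        # assumed degrees per bin shed

def boost_clock(max_boost_mhz, temp_c):
    """Toy model of temperature-dependent GPU Boost."""
    if temp_c <= THROTTLE_START_C:
        return max_boost_mhz
    # Shed one bin for every started C_PER_STEP degrees over the threshold.
    over = temp_c - THROTTLE_START_C
    bins_shed = (over + C_PER_STEP - 1) // C_PER_STEP
    return max_boost_mhz - bins_shed * STEP_MHZ

print(boost_clock(1202, 65))  # cool card holds its top bin -> 1202
print(boost_clock(1202, 82))  # warmer card drops two bins -> 1176
```

The point of the sketch is only that small clock drops line up with discrete bins being shed as the card warms, which is why keeping temps down matters more than raising the power target.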


----------



## Brocky-LJ

New GTX690 owner here, awesome card.


----------



## Methos07

The Noctuas we have certainly do a wonderful job of just smothering a motherboard.

Like the case. Is that some sort of HAF-X green edition?


----------



## Brocky-LJ

I think they look awesome, but best of all they work very well. It is the Nvidia edition HAF-X case with custom SMD LED lighting under the panels and inside the case. All the lighting runs connected to the stock fan light switch on the front panel of the case, so the lights can be switched off at any time. The idea had been done before; I liked it, so I did it myself, but with a bit more customisation.


----------



## Methos07

Looks pretty good. I've been wanting to do something aesthetically pleasing based around the glowing logo, but, nothing yet.



My RIVE has red lights, the card has green, and my ocz revo 3 x2 has blue. Derp.


----------



## qiplayer

Quote:


> Originally Posted by *thestache*
> 
> Why are people still bothering with GTX 690 SLI when so many people in this thread have shown its pointless.
> Tri 4GB GTX 680 SLI is all you need for surround. Anything more is pointless and for your useless e-peen.


I didn't read the other 260 pages; anyway, I think it's something Nvidia itself should tell people.


----------



## pilla99

Sorry, can someone explain why 2 690s actually do worse than a single card?
I was playing BF3 today and some kid was bragging that he gets about 150 fps on 1440p ultra. This isn't actually possible, right?

For reference, I get 70-120 on ultra 1440p. Decent overclock too.


----------



## Nocturin

Quote:


> Originally Posted by *pilla99*
> 
> Sorry, can someone explain why 2 690s actually do worse than a single card?
> *I was playing BF3 today and some kid was bragging* that he gets about 150 fps on 1440p ultra. This isn't actually possible, right?
> For reference, I get 70-120 on ultra 1440p. Decent overclock too.


that's your problem







.


----------



## Hokies83

Quote:


> Originally Posted by *pilla99*
> 
> Sorry, can someone explain why 2 690s actually do worse than a single card?
> I was playing BF3 today and some kid was bragging that he gets about 150 fps on 1440p ultra. This isn't actually possible, right?
> For reference, I get 70-120 on ultra 1440p. Decent overclock too.


150 fps is nothing to brag about; I can hit between 175 and 200 fps at 2560x1440 on Ultra, lol.


----------



## pilla99

Quote:


> Originally Posted by *Nocturin*
> 
> thats your problem
> 
> 
> 
> 
> 
> 
> 
> .


He basically sat there and listed every dream component you could imagine that his tower supposedly possessed, but I was genuinely curious: two 690s (what he claimed) do worse, correct? Is 150 FPS even possible to average?

Quote:


> Originally Posted by *Hokies83*
> 
> 150 fps is nothing to brag about; I can hit between 175 and 200 fps at 2560x1440 on Ultra, lol.


That's pretty much what the server chat turned into.


----------



## Nocturin

Quote:


> Originally Posted by *Hokies83*
> 
> 150 fps is nothing to brag about; I can hit between 175 and 200 fps at 2560x1440 on Ultra, lol.


Proof?








Quote:


> Originally Posted by *pilla99*
> 
> He basically sat there and listed every dream component you could imagine that his tower supposedly possessed, but I was genuinely curious: two 690s (what he claimed) do worse, correct? Is 150 FPS even possible to average?
> That's pretty much what the server chat turned into.


CPU performance has a lot to do with BF3 MP.

Although 120+ FPS is kinda unnoticeable.

I want me some 120Hz monitors so I can test it for myself.


----------



## Hokies83

Quote:


> Originally Posted by *Nocturin*
> 
> Proof?
> 
> 
> 
> 
> 
> 
> 
> 
> CPU performance has a lot to do with BF3 MP.
> Although 120+ FPS is kinda unnoticeable.
> I want me some 120Hz monitors so I can test it for myself.


LoL, proof is in the 25k GPU scores.









I do not have BF3 installed atm; I did not find it very fun... Also, I've just got a clean install of Windows on here after I got my overclock stable.


----------



## Nocturin

Quote:


> Originally Posted by *Hokies83*
> 
> LoL, proof is in the 25k GPU scores.
> 
> 
> 
> 
> 
> 
> 
> 
> I do not have BF3 installed atm; I did not find it very fun... Also, I've just got a clean install of Windows on here after I got my overclock stable.


Bah. I'm too simple for points.


----------



## OverClocker55

Quote:


> Originally Posted by *Brocky-LJ*
> 
> New GTX690 owner here, awesome card.
> 
> 
> Spoiler: Warning: Spoiler!


Looks so good! Loving the colors.


----------



## pilla99

Quote:


> Originally Posted by *Brocky-LJ*
> 
> New GTX690 owner here, awesome card.


Mkay, as an owner of a HAF X I'm gonna need to know what the hell is going on here. Did you LED those panels?


----------



## Brocky-LJ

Thanks guys. I'm also using a Gigabyte G1 Sniper 3 board, so the colours are going well.
The Nvidia edition has sort of transparent, rubber-feel green panels. This is what it looks like with lights behind it. I had to cut tabs out from under the panels, then I laid SMD LED light strips, soldered them together, and soldered them into the main front fan wires, with ground going to ground on the fan power and positive going to the fan light switch ground, so I can turn them on and off. It looks awesome in person; it just glows. I'm just in need of two 200mm green fans, preferably with clear casing, that must move decent air, then a 140mm green fan.


----------



## Renairy

Add me please....








-


----------



## Arizonian

Quote:


> Originally Posted by *Brocky-LJ*
> 
> New GTX690 owner here, awesome card.
> 
> 
> Spoiler: Warning: Spoiler!


I like the case; coming from a modder perspective, personally I've not seen that one before.









Welcome to the GTX 690 Club and OCN.









How to list a Rig

Quote:


> Originally Posted by *Renairy*
> 
> Add me please....
> 
> 
> 
> 
> 
> 
> 
> 
> -
> 
> 
> Spoiler: Warning: Spoiler!


Nice looking red theme rig Renairy. Congrats on the 690 and welcome to the club.









Both


----------



## Renairy

Do you think the 690s got the lower-binned chips?
Both GPUs' ASIC qualities are in the 60s, but they OC to 1202MHz stable.


----------



## Arizonian

Quote:


> Originally Posted by *thestache*
> 
> My brothers card has dropped 10mhz off one core also since we got it and they both used to run at 1212mhz. Could it be possible the GPUs don't like the overclock and as the result of it are degrading and needing more voltage to maintain it?
> Could be why Nvidia don't want overclocking on this generation.


The verdict isn't out, but for the Core to drop to 1163 MHz even though it's under 70C, where before I hit 1170 MHz with the same gaming (everything), seems weird. I've not touched settings now for months on my 24/7 OC; I set and forget. I was stable, so I didn't bother to look back at settings once I found my ceiling.

I've ruled out drivers as previously posted. If by some means my Core drops further, we'll know something is up. At any rate, 7 MHz translates to less than 1 FPS, so it's no biggie.

As for the voltage & Nvidia... one thing I've noticed is they come out with pretty high clocks right off the bat. Personally I've used voltage bumps on my GTX 580 only for benching; otherwise I'm against running my cards out of voltage spec. At first I thought they were being conservative with voltage, and now I'm on the fence as to whether they just don't want to be replacing everyone's cards within two years of purchase.

@DinaAngel - Kepler down-clocks in straps of 13 MHz, so it's not my temps. I did rule that out, as my temps didn't change.

Quote:


> Originally Posted by *Renairy*
> 
> Do you think the 690's got the lower binned chips?
> Both GPU ASIQ quality's are in the 60's but they OC to 1202Mhz stable.


Both GPUs hitting 1202 MHz stable is great. My second does 1202 MHz, but my first GPU is holding it back.

My ASIC is 62.1% and I hit 1170 MHz, eerrrr, I mean 1163 MHz Core. I don't feel they are poorly binned at all. I think they did test the chips at 915 MHz, but why check any higher when binning batches? So it's possible we do get good chips. Another reason I don't feel they do: I ask myself, why would any company put out their highest-priced card and use poorly binned chips on purpose?

Companies bin chips for advertised Core clocks and nothing more. Hence the lottery we all hope has a golden chip in the box.
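Arizonian's "7 MHz is less than 1 FPS" point checks out with quick arithmetic, if you make the (optimistic) assumption that frame rate scales linearly with core clock:

```python
old_clock, new_clock = 1170, 1163  # MHz, from the post above
fps = 120                          # illustrative frame rate

# Optimistic assumption: FPS scales linearly with core clock.
fps_lost = fps * (old_clock - new_clock) / old_clock
print(round(fps_lost, 2))  # -> 0.72, i.e. well under 1 FPS
```

In practice games rarely scale perfectly with clock, so the real loss would be even smaller.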


----------



## Renairy

Quote:


> Originally Posted by *Arizonian*
> 
> Both GPUs hitting 1202 MHz stable is great. My second does 1202 MHz, but my first GPU is holding it back.
> My ASIC is 62.1% and I hit 1170 MHz, eerrrr, I mean 1163 MHz Core. I don't feel they are poorly binned at all. I think they did test the chips at 915 MHz, but why check any higher when binning batches? So it's possible we do get good chips. Another reason I don't feel they do: I ask myself, why would any company put out their highest-priced card and use poorly binned chips on purpose?
> Companies bin chips for advertised Core clocks and nothing more. Hence the lottery we all hope has a golden chip in the box.


I would say they are below-average binned chips, since the GTX 690 is clocked lower and therefore there's no chance of instability...
It makes logical sense.
Putting underperforming chips into a card that doesn't require them to clock high seems plausible.
I could be wrong.

I had SLI 680s before this and they were both in the high 80s.

I dunno, but most 690s won't hit 1200MHz.


----------



## Ripa1

http://i11.aijaa.com/b/00706/11141599.jpg Count me in


----------



## Ripa1

http://i11.aijaa.com/b/00706/11141599.jpg One more, closer picture.


----------



## Brocky-LJ

Quote:


> Originally Posted by *Arizonian*
> 
> I like the case; coming from a modder perspective, personally I've not seen that one before.
> 
> 
> 
> 
> 
> 
> 
> 
> Welcome to the GTX 690 Club and OCN.
> 
> 
> 
> 
> 
> 
> 
> 
> How to list a Rig
> Nice looking red theme rig Renairy. Congrats on the 690 and welcome to the club.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Both


Just a note: my card is Gigabyte, not EVGA. Sorry, I should have mentioned that.


----------



## DinaAngel

Quote:


> Originally Posted by *Renairy*
> 
> I would say they are below-average binned chips, since the GTX 690 is clocked lower and therefore there's no chance of instability...
> It makes logical sense.
> Putting underperforming chips into a card that doesn't require them to clock high seems plausible.
> I could be wrong.
> I had SLI 680s before this and they were both in the high 80s.
> I dunno, but most 690s won't hit 1200MHz.


I've already explained why in an earlier post:

http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/2610#post_18422267
This is dynamic overclocking, and it's a bit special on the 690, since it doesn't want to give the bump in current needed to be stable, because it won't give it above 50 degrees Celsius.

You also have frequency and ripple and many other factors. There is a diagram of how the overclocking works; I'll try to find it, but it's hyper-sensitive on 690s.

This is for the 680, but for the 690 I think it's much more sensitive to temperature:
http://cdn.overclock.net/3/34/34a9aeaa_clock_vs_temp.gif
Don't forget that the 690s have built-in power saving that makes this difficult.


----------



## jassilamba

Had my Heatkiller GTX 690 block come in; thought I would share a pic:


----------



## Hokies83

Quote:


> Originally Posted by *jassilamba*
> 
> Had my Heatkiller GTX 690 block come in, thought would share a pic:


That's a Sexy block Sir.


----------



## bobbavet

Gday Guys

I have now started a worklog on my new build around the GTX690.

The "Silent but Deadly" Bitfenix Prodigy

enjoy

Bob


----------



## Renairy

This card is amazingly powerful.
My previous 2x 680s @ 1230MHz in SLI scored *102* FPS

This monster on its own scores almost *101* FPS


----------



## bobbavet

Quote:


> Originally Posted by *Renairy*
> 
> This card is amazingly powerful.
> My previous 2x 680's @ 1230Mhz SLI scored *102* FPS
> This monster on its own scores almost *101* FPS










Exactly. 680 SLI owners can blah blah blah all they like, with just as much hot air and energy as what is generated by their setup.

I jumped from a 680 to the 690; put simply, it is a POWERPACK!


----------



## iARDAs

A 590 was not even as powerful as 570 SLI, but the 690 is as powerful as 680 SLI.

I would always choose the 690 over 680 2GB SLI.


----------



## jassilamba

Quote:


> Originally Posted by *bobbavet*
> 
> 
> 
> 
> 
> 
> 
> 
> Exactly, 680 Sli owners can blah blah blah all they like with just as much hot air and energy as what is generated by their setup.
> I jumped from 680 to the 690, put simply it is a POWERPACK!


That was my main reason to go for the 690, and I think it's cheaper to run a 690 than 680 SLI. Takes less space too.


----------



## iARDAs

Quote:


> Originally Posted by *jassilamba*
> 
> That was my main reason to go for the 690, and I think it's cheaper to run a 690 than 680 SLI. Takes less space too.


Spot on.

670 SLI vs 690 could be debatable in a few ways, but 680 SLI vs 690 is not as debatable.

The only upside of a 680 SLI setup is that, unfortunately, dual-GPU cards tend to lose a bit more value in the future. This is a fact here in Turkey, but I have a feeling it could be the same in some other countries as well.


----------



## jassilamba

Quote:


> Originally Posted by *iARDAs*
> 
> Spot on.
> 670 SLI vs 690 could be debatable in a few ways, but 680 SLI vs 690 is not as debatable.
> The only upside of a 680 SLI setup is that, unfortunately, dual-GPU cards tend to lose a bit more value in the future. This is a fact here in Turkey, but I have a feeling it could be the same in some other countries as well.


I think its the same all across the globe.


----------



## Cobolt005

Sup guys, just figured I'd let you know that Watercool has released their HEATKILLER® GPU-X³ GTX 690 "Hole Edition" Ni-Bl. I haven't seen that anyone other than Watercool has it atm.
http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Products/15531


----------



## pilla99

So..... This summer I am thinking of putting my whole system under water. I have never really seen such a system in person, and I have obviously never had to put one together or work on one.

I was wondering: if I wanted to put my 690 and CPU under water, what would I need, and/or where is a good tutorial on how to put it together? The GPU and CPU just run too hot for me to be comfortable.


----------



## jassilamba

Quote:


> Originally Posted by *Cobolt005*
> 
> Sup guys, just figured I'd let you know that Watercool has released their HEATKILLER® GPU-X³ GTX 690 "Hole Edition" Ni-Bl. I haven't seen that anyone other than Watercool has it atm.
> http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Products/15531


I got mine from Performance Pcs. LINK

*Edit: Never mind spoke too soon.* That block looks great.


----------



## Buzzkill

Quote:


> Originally Posted by *Cobolt005*
> 
> Sup guys, just figured I'd let you know that Watercool has released their HEATKILLER® GPU-X³ GTX 690 "Hole Edition" Ni-Bl. I haven't seen that anyone other than Watercool has it atm.
> http://shop.watercool.de/epages/WatercooleK.sf/en_GB/?ObjectPath=/Shops/WatercooleK/Products/15531


Was released Monday


----------



## jassilamba

Quote:


> Originally Posted by *pilla99*
> 
> So..... This summer I am thinking of putting my whole system under water. I have never really seen such a system in person, and I have obviously never had to put one together or work on one.
> I was wondering: if I wanted to put my 690 and CPU under water, what would I need, and/or where is a good tutorial on how to put it together? The GPU and CPU just run too hot for me to be comfortable.


Start looking at build logs of other people and that will get you an idea.

At minimum you will need the following:

CPU Waterblock
GPU Waterblock
2 Radiators (1 for CPU and 1 for GPU)
A water pump
A Reservoir
Tubing and fittings

What radiators you can fit will depend on what case you have.

Here is link to a basic guide. LINK

Martin's Liquid Lab is a great source for learning about water cooling. I'm building my first loop, and I will say it took me about a month of reading, understanding and planning to get all the parts.

Here is a link to a very good beginner kit that comes with everything you need to water cool your CPU - LINK
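As a rough sanity check on radiator sizing for a CPU + 690 loop like the one above, guides often quote a ballpark of about 120 mm of radiator length per ~100 W of heat. The wattages below are ballpark assumptions for illustration, not measured figures:

```python
import math

CPU_HEAT_W = 150    # assumed heat from an overclocked CPU
GPU_HEAT_W = 300    # GTX 690 board power, approximate
MM_PER_100W = 120   # common rule-of-thumb radiator length per ~100 W

total_w = CPU_HEAT_W + GPU_HEAT_W
rad_mm = math.ceil(total_w / 100) * MM_PER_100W
print(rad_mm)  # -> 600 mm, e.g. one 360 mm plus one 240 mm radiator
```

More radiator than that just buys you quieter fans; the rule of thumb is only a starting point.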


----------



## thestache

Quote:


> Originally Posted by *DinaAngel*
> 
> I doubt it's degrading; it might be something getting hotter than 70 degrees, at which point it will automatically clock down the current. Try setting fan speed to max, open the Heaven benchmark and test; don't use any games to test.
> Setting the power target to +135 isn't going to do anything at all, since your core clock is too low for that.
> I'd say your power draw is 65 to 75% out of 100%, so why set it to 135%? It's not going to do much at all; it's not even going to make the card hotter.
> Roughly, the way Kepler overclocking works:
> requested MHz -> boost controller decides stable or not, calculating by risk and the amount of current needed
> temperature -> current needed: the controller only raises the current if the temperature is OK according to its own calculations or some built-in table
> For example: request 1300 MHz -> controller asks "is this allowed?" -> at low temperature the sensors report it's perfectly fine to supply max current -> request granted -> stable
> Higher volts aren't needed; on Kepler the voltage is just the voltage, it isn't the current. Power draw and the power limit depend on the clock: if you set +150 on the core, it's going to draw more power, but it might not need more than 100% power target, while the temps may be too risky for its sensors to raise the current to where it's needed.
> At around 1280 MHz it might go over 100% power and need +35 power target; 1300 MHz needs 135 or 150 to stay stable, and I'd say it might need a little bit more, I haven't calculated it. But yeah, I've got the 150% power target BIOSes, so when I get my waterblock at the end of the week or next Monday I'll try for 1300 or so.
> You can also try reflashing the BIOS.
> I don't understand how you guys don't know this?
> This is dynamic!!
> 
> The 690 has a built-in adjuster to allow more or less current in, so if the temp is OK it can set the clock higher.
> I bet all of you could manage 1300 MHz with low enough temps; mine sits at about 70% out of 100% power at 1180 MHz.


Card has been watercooled out of the box. So it's got nothing to do with that. Card is at its max overclock which is why it's running max power target.
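The boost behaviour being argued about here can be sketched as a toy model: Kepler picks the highest clock bin that fits under both a power target and a temperature limit. This is only an illustration; the 70°C threshold and the per-bin penalties below are made-up assumptions, not NVIDIA's actual firmware tables (the 13 MHz bin size is the one commonly reported real-world detail):

```python
# Toy model of Kepler-style GPU Boost (illustrative numbers only).
# The card steps its clock down in 13 MHz bins whenever it runs past
# its power target or its throttle temperature.

def boost_clock(temp_c, power_pct, power_target_pct=100.0,
                base_mhz=915, max_boost_mhz=1176, bin_mhz=13):
    """Return the clock in MHz: one bin down per degree over the
    assumed throttle temperature, and per 5% over the power target."""
    bins = 0
    if temp_c > 70:                      # assumed throttle point
        bins += int(temp_c - 70)
    if power_pct > power_target_pct:     # over the slider's limit
        bins += int((power_pct - power_target_pct) / 5) + 1
    return max(base_mhz, max_boost_mhz - bins * bin_mhz)
```

In this model, raising the power-target slider only helps when the card is actually bumping into the power limit, which matches the point above: at ~65-75% draw, setting +135% on the slider changes nothing.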


----------



## Arm3nian

Do you guys increase the voltage when increasing clock and memory speeds? I got 150mhz on the clock and 500 on the memory with the stock voltage. Either I have the god of 690s or something is up.


----------



## pilla99

Quote:


> Originally Posted by *Arm3nian*
> 
> Do you guys increase the voltage when increasing clock and memory speeds? I got 150mhz on the clock and 500 on the memory with the stock voltage. Either I have the god of 690s or something is up.


Jesus Christ is that card melted or what. What temps are you getting under load?


----------



## DinaAngel

Quote:


> Originally Posted by *thestache*
> 
> Card has been watercooled out of the box. So it's got nothing to do with that. Card is at its max overclock which is why it's running max power target.


Yeah, but it was much different for me than for you guys with that BIOS I used; it was drawing 55 A from the PCIe slot, so I had much more overclocking headroom with the 690 BIOSes I had. It was barely drawing power from the 8-pins at all, but with the normal BIOS now it draws a lot of power from the 8-pins and rarely much from the PCIe slot.


----------



## Brocky-LJ

I upgraded from GTX 680 SLI; performance is definitely on par.


----------



## Renairy

Quote:


> Originally Posted by *Arm3nian*
> 
> Do you guys increase the voltage when increasing clock and memory speeds? I got 150mhz on the clock and 500 on the memory with the stock voltage. Either I have the god of 690s or something is up.


Nothing is up... that's pretty normal (actually *slightly* above average), since the memory chips on the 690 are Samsung, as opposed to the Hynix chips on the 670s and 680s.

I get +145 on the core and +550 on the memory..


----------



## lukeman3000

Quote:


> Originally Posted by *fLaXi0n*
> 
> Has anyone tested DPC latency with their GTX 690?
> On other forums, people are getting high latencies (normal is under ~60 µs), with sound crackle, stutter, or lag in games caused by the high latency.
> I've sent mine back for a new one, which comes next week.
> With my old GTX 580 I had no problems, always under 50 µs, but with the GTX 690 I get about 400-800 µs or more in Windows and in games like D3 and SC2; especially in windowed fullscreen I get sound crackle after 20 or 30 seconds.
> I think the GTX 690 is broken in hardware, like the GTX 460 where no update could fix the issue.
> Sounds like the GTX 690 is a beta and it can't be fixed -.-
> You can all easily check it with DPC Latency Checker.


Yes, I just noticed that my DPC latency is around 700 µs, and when I disable the GTX 690 driver in Device Manager it drops down to ~50.
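For anyone without DPC Latency Checker handy, a crude user-mode proxy is to spin on a high-resolution timer and record the worst gap between reads; large spikes on an idle desktop often coincide with the DPC spikes the tool reports. Note this also picks up ordinary scheduler preemption, so it's a rough hint, not a real kernel DPC measurement:

```python
import time

def worst_timer_gap(duration_s=2.0):
    """Spin on time.perf_counter() for duration_s seconds and return
    the largest gap between consecutive reads, in microseconds."""
    end = time.perf_counter() + duration_s
    last = time.perf_counter()
    worst = 0.0
    while last < end:
        now = time.perf_counter()
        worst = max(worst, now - last)
        last = now
    return worst * 1e6  # seconds -> microseconds

if __name__ == "__main__":
    # Run once with the GPU driver enabled and once with it disabled
    # in Device Manager to compare, as described above.
    print(f"worst gap: {worst_timer_gap():.0f} us")
```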


----------



## pilla99

Quote:


> Originally Posted by *Renairy*
> 
> Nothing is up... that's pretty normal (actually *slightly* above average), since the memory chips on the 690 are Samsung, as opposed to the Hynix chips on the 670s and 680s.
> I get +145 on the core and +550 on the memory..


Edit: I'm a dumbass


----------



## Arm3nian

Quote:


> Originally Posted by *pilla99*
> 
> Jesus Christ is that card melted or what. What temps are you getting under load?


My case cooling is really bad: 25°C idle and 60°C load at max fan speed. I got my Case Labs TH10 the other day though, and ordered about $800 of watercooling gear, which will arrive on Tuesday thanks to Performance-PCs' terrible shipping. The problem is that filling 960mm of rad in push-pull will cost $250 in fans... which I don't have, considering I just blew $1400. I guess I'll make do with my CM SickleFlows for now; they match my build, but they're cheap and don't deliver rad performance for their RPM/noise. Got the XSPC block, by the way.


----------



## Arizonian

I'll be a monkey's uncle. Running Windows 8 (Preview) and the 310.33 beta drivers has somehow tamed Google Chrome from taking my GTX 690 for a jog. Looks promising.










Once I took the beta Windows plunge, I decided what the heck and went for beta drivers too. I'm sure I'll be reformatting soon enough.


----------



## DinaAngel

The latest driver doesn't seem to work for me, I'm afraid; it just crashes all the time.


----------



## Arizonian

Quote:


> Originally Posted by *DinaAngel*
> 
> The latest driver doesn't seem to work for me, I'm afraid; it just crashes all the time.


I haven't tried gaming yet. I've got some other quirks that have kept me busy, as well as new mouse software that won't load with Windows 8. I'll let you know how gaming goes this weekend.

I usually just stick with final WHQL releases and never have issues. Now that I'm here, there's no turning back......


----------



## thestache

Quote:


> Originally Posted by *DinaAngel*
> 
> The latest driver doesn't seem to work for me, I'm afraid; it just crashes all the time.


Has crashed a few times for me also with my GTX 680 4GB SLI. Not enough for me to roll back because it does perform well.


----------



## DinaAngel

Quote:


> Originally Posted by *pilla99*
> 
> Edit: I'm a dumbass


I get +140 on the core max with stability; well, sometimes it's only stable at +130, but it's going to be much better when I get my waterblock next week.

+500 on the memory is stable.

Aww, don't you guys love it when one pixel is dead on one of your screens? Got a small dead pixel on my old screen (well, my second screen, I'd call it). It makes me want to kill it with fire xD


----------



## pilla99

Quote:


> Originally Posted by *DinaAngel*
> 
> I get +140 on the core max with stability; well, sometimes it's only stable at +130, but it's going to be much better when I get my waterblock next week.
> 
> +500 on the memory is stable.
> 
> Aww, don't you guys love it when one pixel is dead on one of your screens? Got a small dead pixel on my old screen (well, my second screen, I'd call it). It makes me want to kill it with fire xD


How can you play on air at those clocks? I turned mine up to those clocks for a minute to see what would happen, and even at idle I went up to a nice 42°C.
I can imagine gaming at those clocks on air puts you easily into the 90s under full GPU load.


----------



## MrTOOSHORT

Remember sometimes with a new driver, you'll have to figure out your overclock again.


----------



## Renairy

The 690s are loud, right?
Mine is at max load :/


----------



## Divineshadowx

Quote:


> Originally Posted by *Renairy*
> 
> 690's are so loud right?
> Mine is on max load :/


I can't hear a difference between 30% and 100% fan speed over my 2000rpm fans. But then again, the fans I have are really bad. I'm planning on getting GTs at 1500-1850rpm, but push-pull will cost $250 on my 2x 480mm rads; maybe those will actually work as fans and not sound generators.


----------



## DinaAngel

Quote:


> Originally Posted by *pilla99*
> 
> How can you play on air at those clocks? I turned mine up to those clocks for a minute to see what would happen, and even at idle I went up to a nice 42°C.
> I can imagine gaming at those clocks on air puts you easily into the 90s under full GPU load.


It maxes at 70°C, but my ambient is normally 20°C; at the moment it's 15°C, with snow and ice outside. It's 4°C outdoors right now, but on the weekend it's going to hit -14°C.


----------



## thestache

Quote:


> Originally Posted by *pilla99*
> 
> How can you play on air at those clocks? I turned mine up to those clocks for a minute to see what would happen, and even at idle I went up to a nice 42°C.
> I can imagine gaming at those clocks on air puts you easily into the 90s under full GPU load.


Not if your case has proper airflow and fans.

Ran mine at +165 core and +300 memory with my surround setup and it would hover in the low 70s. Never over 74-75°C.


----------



## Arizonian

What a long night with a reformat, Win 7 installation, followed by Win 8 upgrade at midnight and finally back up to 100%.









A few pages back I discussed a drop in the max core clock on my 690, and I can now confirm I'm no longer able to reach 1176 MHz as my card once did. [Proof]

A core offset past +130 eventually crashes, and where it once gave 1176 MHz of Kepler Boost it now maxes out at 1163 MHz. If the power target is maxed I hit 70-71°C and run mostly at 1150 MHz. If I lower the core offset and power target to +120, my max core is still able to reach 1163 MHz and barely throttles anymore, with temps hovering just below 70°C.

So running my stable 24/7 OC has degraded my GPU by 13 MHz, from 1176 to 1163 MHz; realistically, a 14% core overclock down to 13%.

Will be watching to see if it degrades further, as I intend to keep my stable max OC, now 1163 MHz, running 24/7.
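The percentage arithmetic above is straightforward; a sketch, assuming the overclock is measured against the 690's advertised boost clock of 1019 MHz (the post doesn't say which baseline it uses, so the rounded figures come out about a point higher than quoted):

```python
def oc_percent(observed_mhz, stock_boost_mhz=1019):
    """Percentage overclock of an observed boost clock relative to an
    assumed stock boost clock (1019 MHz advertised for the GTX 690)."""
    return 100.0 * (observed_mhz - stock_boost_mhz) / stock_boost_mhz

# The 13 MHz drop discussed above, against this baseline:
# oc_percent(1176) -> ~15.4, oc_percent(1163) -> ~14.1
```

Whatever the baseline, one 13 MHz boost bin works out to roughly one percentage point of overclock, which is why the loss barely registers in framerates.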


----------



## DinaAngel

Quote:


> Originally Posted by *Arizonian*
> 
> What a long night with a reformat, Win 7 installation, followed by Win 8 upgrade at midnight and finally back up to 100%.
> 
> 
> 
> 
> 
> 
> 
> 
> A few pages back I discussed a drop in the max core clock on my 690, and I can now confirm I'm no longer able to reach 1176 MHz as my card once did. [Proof]
> A core offset past +130 eventually crashes, and where it once gave 1176 MHz of Kepler Boost it now maxes out at 1163 MHz. If the power target is maxed I hit 70-71°C and run mostly at 1150 MHz. If I lower the core offset and power target to +120, my max core is still able to reach 1163 MHz and barely throttles anymore, with temps hovering just below 70°C.
> So running my stable 24/7 OC has degraded my GPU by 13 MHz, from 1176 to 1163 MHz; realistically, a 14% core overclock down to 13%.
> Will be watching to see if it degrades further, as I intend to keep my stable max OC, now 1163 MHz, running 24/7.


As I said before, I doubt it has degraded; NVIDIA has tried hard to safeguard against that. If you overclock your 690 they can still RMA it, but if you modify your BIOS your warranty is gone.
So most likely the newest driver is making overclocking worse, or your card is faulty.
Also, the GTX 690 can't hold its overclocked speeds above 50 degrees.
And Windows 8 will give lower GPU scores because the CPU scheduling isn't handled right yet; that might get fixed, but it's up to you whether you want to use Windows 8. GPU Boost will be worse on Win8 for the same reason.

Try going back to Win7 and the older drivers you used; it will probably work like it used to.


----------



## thestache

Quote:


> Originally Posted by *DinaAngel*
> 
> As I said before, I doubt it has degraded; NVIDIA has tried hard to safeguard against that. If you overclock your 690 they can still RMA it, but if you modify your BIOS your warranty is gone.
> So most likely the newest driver is making overclocking worse, or your card is faulty.
> Also, the GTX 690 can't hold its overclocked speeds above 50 degrees.
> And Windows 8 will give lower GPU scores because the CPU scheduling isn't handled right yet; that might get fixed, but it's up to you whether you want to use Windows 8. GPU Boost will be worse on Win8 for the same reason.
> Try going back to Win7 and the older drivers you used; it will probably work like it used to.


I still don't understand this 50-degree thing you keep going on about. My brother's GTX 690 on air at 60-70°C did the same as it does under water at 36°C. In fact less now; one of the cores does 10-15 MHz less.


----------



## DinaAngel

Quote:


> Originally Posted by *thestache*
> 
> I still don't understand this 50-degree thing you keep going on about. My brother's GTX 690 on air at 60-70°C did the same as it does under water at 36°C. In fact less now; one of the cores does 10-15 MHz less.


I say 50 degrees because then you've got 10 degrees of headroom; at 70 degrees the 690 won't be fully stable.
98°C is the absolute max for the GTX 690.
I've tried to explain this countless times, even in detail.
The 690 doesn't use the same parts as the 680; even the memory chips aren't Hynix like on the 670 and 680, the 690's memory chips are Samsung.
Sensitivity to temperature also varies from card to card, but 70°C is its edge; it won't be stable there.


----------



## Tslm

Quote:


> Originally Posted by *Arizonian*
> 
> What a long night with a reformat, Win 7 installation, followed by Win 8 upgrade at midnight and finally back up to 100%.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A few pages back I discussed a drop in the max core clock on my 690, and I can now confirm I'm no longer able to reach 1176 MHz as my card once did. [Proof]
> 
> A core offset past +130 eventually crashes, and where it once gave 1176 MHz of Kepler Boost it now maxes out at 1163 MHz. If the power target is maxed I hit 70-71°C and run mostly at 1150 MHz. If I lower the core offset and power target to +120, my max core is still able to reach 1163 MHz and barely throttles anymore, with temps hovering just below 70°C.
> 
> So running my stable 24/7 OC has degraded my GPU by 13 MHz, from 1176 to 1163 MHz; realistically, a 14% core overclock down to 13%.
> 
> Will be watching to see if it degrades further, as I intend to keep my stable max OC, now 1163 MHz, running 24/7.


Degradation is pretty unlikely. 1176 @ ~70ish at stock voltage is honestly really good when you think about how warm older generations used to run and the fact they were often overvolted. It could be something to do with drivers, or maybe a slight increase in ambient temps which in turn is pushing your 690 just above what it was usually at?

I think about how long my 6970 was running between 85-95c overclocked and overvolted with no degradation and I honestly just can't see it ever happening to Kepler.


----------



## DinaAngel

Quote:


> Originally Posted by *Tslm*
> 
> Degradation is pretty unlikely. 1176 @ ~70ish at stock voltage is honestly really good when you think about how warm older generations used to run and the fact they were often overvolted. It could be something to do with drivers, or maybe a slight increase in ambient temps which in turn is pushing your 690 just above what it was usually at?
> I think about how long my 6970 was running between 85-95c overclocked and overvolted with no degradation and I honestly just can't see it ever happening to Kepler.


Finally, someone who understands.


----------



## Arizonian

Quote:


> Originally Posted by *DinaAngel*
> 
> finally someone who understands some


It's not that I'm not understanding what you said; I do see what you mean. I just haven't ruled it out. You're assuming I can't follow what you were trying to say and don't know what ambient temps, drivers, or a new OS can do. LOL.

Ambient temps in Arizona have come down outside and I maintain same temp all year long inside.

I noticed this on Win 7 and three drivers ago. It's not drivers, ambient temp, or Win 8. It could be faulty which may be the case.

I do appreciate the input. Looks like RMA might be in my future either way. Thanks to all who replied with their take.


----------



## Renairy

Quote:


> Originally Posted by *Arizonian*
> 
> It's not that I'm not understanding what you said; I do see what you mean. I just haven't ruled it out. You're assuming I can't follow what you were trying to say and don't know what ambient temps, drivers, or a new OS can do. LOL.
> Ambient temps in Arizona have come down outside and I maintain same temp all year long inside.
> I noticed this on Win 7 and three drivers ago. It's not drivers, ambient temp, or Win 8. It could be faulty which may be the case.
> I do appreciate the input. Looks like RMA might be in my future either way. Thanks to all who replied with their take.


Wait, wait, wait........ you're going to RMA your 690 because you lost 8 MHz?
Dude, with all due respect... that's just OCD.

No joke, I'm shocked at your concern. I can guarantee there is zero degradation happening; it's just the way this series works with boost.
My advice, for what it's worth: don't worry in the slightest about that lost 8 MHz, because you won't see a difference even in synthetic benchmarks.


----------



## Arizonian

Quote:


> Originally Posted by *Renairy*
> 
> Wait, wait, wait........ you're going to RMA your 690 because you lost 8 MHz?
> Dude, with all due respect... that's just OCD.
> No joke, I'm shocked at your concern. I can guarantee there is zero degradation happening; it's just the way this series works with boost.
> My advice, for what it's worth: don't worry in the slightest about that lost 8 MHz, because you won't see a difference even in synthetic benchmarks.


No no no. LOL. I wouldn't RMA unless it was unstable at base clocks. 13 MHz is like 1 FPS.









I'm waiting to see what happens down the road. I said it MAY come down to it.


----------



## Tslm

Let's hope not. It would be an ominous sign for everyone if it were happening at what is a fairly mild clock for GK104; my 680 SCs both boost to 1176 out of the box, for example.

Most likely it isn't, but let us know how it goes. Who knows, maybe one day you'll mysteriously get that 13 MHz back.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Arizonian*
> 
> No no no. LOL. I wouldn't RMA unless it was unstable at base clocks. 13 MHz is like 1 FPS.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm waiting to see what happens down the road. I said it MAY come down to it.


That's good.

Consider also that our cards might need a *working-in* period, like Sandy Bridge-E chips for instance. For the first couple of weeks my CPU needed a certain voltage for a certain overclock; after that period, it needed more voltage for the same speed.

I think it's the same situation here. I pushed 1.5V into my GTX 680 and it was fine; 1.21V won't do a thing to our cards. I understand the frustration though, paying $1000 for a GPU and being cautious.


----------



## icanhasburgers

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> That's good.
> Consider also that our cards might need a *working-in* period, like Sandy Bridge-E chips for instance. For the first couple of weeks my CPU needed a certain voltage for a certain overclock; after that period, it needed more voltage for the same speed.
> I think it's the same situation here. I pushed 1.5V into my GTX 680 and it was fine; 1.21V won't do a thing to our cards. I understand the frustration though, paying $1000 for a GPU and being cautious.


My 2500K has been the same recently. The voltage has crept up bit by bit (my own doing, obviously) since my first stable overclock with it. Hungry little thing, bless it.


----------



## Arm3nian

When do you need to put a kill coil or biocide in the loop? I got all my parts today, just no kill coil. Would my system be fine until maybe Wednesday, when I can get one? I don't want stuff growing and have to disassemble the entire thing. I'll be using distilled water.


----------



## DinaAngel

Quote:


> Originally Posted by *Arm3nian*
> 
> When would you need to put a kill coil or biocide in the loop? I got all my parts today, just don't have a kill coil. Would my system be fine until maybe Wednesday when I can get one. I don't want stuff growing and have to disassemble the entire thing. Will use distilled water.


Just use premix; they use glycol instead of a kill coil. A silver kill coil leaches into the distilled water and makes it conduct current, and then it's just a matter of time before a part gets shorted by the pump or something else.
This has been tested so many times that people don't usually talk about it anymore.


----------



## Arm3nian

Quote:


> Originally Posted by *DinaAngel*
> 
> Just use premix; they use glycol instead of a kill coil. A silver kill coil leaches into the distilled water and makes it conduct current, and then it's just a matter of time before a part gets shorted by the pump or something else.
> This has been tested so many times that people don't usually talk about it anymore.


Yeah, but will my loop be fine for 5 days with nothing but distilled water? I want to build it now, but I'd need to buy the biocide/coil first.


----------



## DinaAngel

Quote:


> Originally Posted by *Arm3nian*
> 
> Yeah, but will my loop be fine for 5 days with nothing but distilled water? I want to build it now, but I'd need to buy the biocide/coil first.


Well, the distilled water won't stay distilled once it touches something, so it's up to you; distilled water reacts with almost anything and becomes polluted. I'd recommend a coolant, but it's your call.

http://specialtech.co.uk/spshop/customer/Mayhems-Ice-Dragon-Nano-Fluid-for-Extreme-Water-Cooling-1-Litre---Nano-Fluid-pid-15118.html

That's what I use and I fully recommend it. I've got 2 litres stocked up for my 480 and 690 waterblocks this coming week.


----------



## DinaAngel

3dmark vantage
GTX 690 "newest beta drivers"

extreme
http://www.3dmark.com/3dmv/4370702

performance
http://www.3dmark.com/3dmv/4370729


----------



## DinaAngel

newer scores at 5ghz cpu

extreme
http://www.3dmark.com/3dmv/4372556

performance
http://www.3dmark.com/3dmv/4372538


----------



## Qu1ckset

I finally got my GTX 690 back in my computer! Having no computer for 3 weeks blows!


----------



## DinaAngel

Quote:


> Originally Posted by *Qu1ckset*
> 
> I finally got my GTX 690 back in my computer! Having no computer for 3 weeks blows!


Congratulations!! I know how it feels.


----------



## OverClocker55

Quote:


> Originally Posted by *DinaAngel*
> 
> Congratulations!! I know how it feels.


I have been without a gaming pc for 2 months now.. I'm slowly dying.


----------



## DinaAngel

Quote:


> Originally Posted by *OverClocker55*
> 
> I have been without a gaming pc for 2 months now.. I'm slowly dying.


It's been a whole year for me, since I was in the USA working. Plus I was in a coma for 3 months, July 2011 to October 2011; well, it felt like one long sleep. Amazing what a shock can do to someone. Still trying to recover.......


----------



## Arm3nian

My loop so far...

The problem I'm having is that there is barely any pressure in the tubes. My pump pulls water straight from the res, which is in the bay, then discharges it out the top; when I put my hand on that tube I can feel the water moving. However, on any tube after that: nothing.

From the pump it goes into the XSPC 690 block, which I think I have installed the fittings on correctly; there is literally no diagram or instructions. From there it goes to the Raystorm CPU block, then into rad 1, then rad 2, both XSPC 480 rads. The water seems to be flowing, but I'm unsure; there's no pressure on any tubing besides the pump-to-690-block run. Maybe I've installed the compression fittings too tight? There's been no leakage for about an hour so far. Turning the pump up from speed 1 to 5 makes no difference in the pressure on the tubes, but the flow is increased, I think. The pump cost $100, so it's not a cheap piece of crap. Any help would be nice, thanks.


----------



## ceteris

It should be working, but you can't tell unless you power the system on and check temps, or buy a flow meter and connect it to the loop. Just make sure you top off your res and tip the case backwards on its rear corner to help get rid of air bubbles before powering the system. You have to do this because you're using a bay res as opposed to a cylinder one. I usually run the loop alone for at least 2 hours before powering her up.

Are you going to buy a top for the pump? The extra fitting holes really help with loop routing.


----------



## DinaAngel

Quote:


> Originally Posted by *Arm3nian*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> My loop so far...
> The problem I'm having is that there is barely any pressure in the tubes. My pump pulls water straight from the res, which is in the bay, then discharges it out the top; when I put my hand on that tube I can feel the water moving. However, on any tube after that: nothing. From the pump it goes into the XSPC 690 block, which I think I have installed the fittings on correctly; there is literally no diagram or instructions. From there it goes to the Raystorm CPU block, then into rad 1, then rad 2, both XSPC 480 rads. The water seems to be flowing, but I'm unsure; there's no pressure on any tubing besides the pump-to-690-block run. Maybe I've installed the compression fittings too tight? There's been no leakage for about an hour so far. Turning the pump up from speed 1 to 5 makes no difference in the pressure on the tubes, but the flow is increased, I think. The pump cost $100, so it's not a cheap piece of crap. Any help would be nice, thanks.


Get a flow sensor; I usually blow air through waterblocks to check that they work -w-. With sensors you might find that you need a second pump in series, or that the waterblock isn't flowing. You can't just touch the tube to check pressure, and pressure isn't what matters most in a sealed loop anyway. I use a D5 at 1200 L/h, which cost 130 euro, and I'm considering another one =w=


----------



## DinaAngel

Damn, my main GPU is getting hot with the waterblock, but the second GPU actually seems to be touching the block properly. What have I done wrong?

The first GPU's temperature does drop very fast when not in use, but the second GPU stays at 36°C on load and 26°C at idle. The first GPU idles at 28°C, so only a few degrees' difference; but why does the main GPU shoot up to ~80°C in milliseconds under load and then drop back to 28°C almost instantly when I turn off the FurMark test?

Tightening the screws improved temps somewhat, but there's still about a 50% difference between the two. Right now all the screws are very tight.

If I leave them idle, they settle at the same 27°C.
At +170 on the core, GPU 2 is 42°C and GPU 1 is 72°C.


----------



## DinaAngel

I tried to tighten the screws a bit more and two screws broke, sheesh; ordered a new waterblock, this time an acetal and copper one instead of acetal and nickel. (ㄒ﹏ㄒ)

Temps seem to have dropped a little on the first GPU.
Update: 10 degrees!! 64°C at +170 on the core and +500 on the memory, no artifacts; but GPU 2 is at 41°C, sitting there like "LOOK at me, I'm just fine."
I hope it will go better with a fresh mount on the new waterblock I just ordered.

Hopefully I will have enough thermal paste for the pad areas too.


----------



## DinaAngel

GPU 2 is stable at +210 on the core and +500 on the memory, 38°C load: 1254 MHz.

GPU 2 is stable at +240 on the core and +500 on the memory, 40°C load: 1280 MHz.

+250 on the core crashes the NVIDIA driver; however, I might be able to set the memory higher.

I was right about the temps and the GPU Boost reduction at 50 degrees; I saw it on GPU 1 all the time.

GPU 2 is drawing 115% power, 15% over its normal limit.

It's 25°C ambient at the moment; if I put more clothes on and let the -6°C air in, it's going to go far.

Update: almost stable at 1293 MHz with +100 on the memory. 1752 MHz memory at 1280 MHz core, fully stable.

When I get both GPUs to 1280 MHz core and 1752 MHz memory, I'll run some benchmarks; might be next week hopefully. Off to shower, I've got coolant all over me.


----------



## DinaAngel

Got my waterblock today.

Does anyone know the optimal tightening?


----------



## ceteris

Quote:


> Originally Posted by *DinaAngel*
> 
> Got my waterblock today.
> 
> Does anyone know the optimal tightening?


I just tighten it to the point where it's decently snug. I don't do it too tight out of fear of busting up the PCB. There might be a sweet spot for mounting pressure, but I usually only see people worry about it for CPUs.


----------



## DinaAngel

Quote:


> Originally Posted by *ceteris*
> 
> I just tighten it to the point where it's decently snug. I don't do it too tight out of fear of busting up the PCB. There might be a sweet spot for mounting pressure, but I usually only see people worry about it for CPUs.


Yes, but on a dual-GPU card both dies need to be mounted just right, or one or both won't sit flush against the waterblock.


----------



## gizmo83

Hi guys. I have a Gigabyte GTX 690 and I'd like a tool to control the LED lighting on this board. The EVGA tool doesn't recognize my card. Any alternative tool?


----------



## Arizonian

Quote:


> Originally Posted by *gizmo83*
> 
> Hi guys. I have a Gigabyte GTX 690 and I'd like a tool to control the LED lighting on this board. The EVGA tool doesn't recognize my card. Any alternative tool?


The only way it might be possible for the EVGA LED software to work on your Gigabyte card will be to flash the Gigabyte with EVGA's BIOS. I'm betting it will afterward. However flash at your own risk and back up your own Gigabyte BIOS before attempting.

Otherwise I've not heard of any other software that controls the lighting.

PS: Add a pic of your GTX 690 and join our club list.


----------



## jassilamba

Got the PCIE cables sleeved. I will say that the 690 looked awesome with the stock heatsink, but oh well, a heatkiller would do.


----------



## Arizonian

Quote:


> Originally Posted by *jassilamba*
> 
> Got the PCIE cables sleeved. I will say that the 690 looked awesome with the stock heatsink, but oh well, a heatkiller would do.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> ]


Nice job on the colored sleeving and really slick looking rig.


----------



## Buzzkill

Quote:


> Originally Posted by *jassilamba*
> 
> Got the PCIE cables sleeved. I will say that the 690 looked awesome with the stock heatsink, but oh well, a heatkiller would do.


How was the Heatkiller waterblock installation? Do they include all the thermal pads? And what about the GPU dies, is thermal paste included? I ordered the nickel version with extra thermal pads and the dual-link. I'm going to try to use EVGA backplates with them.

Order was received: 02/11/2012 (14:33)
Paid: 02/11/2012 (14:33)

2x HEATKILLER® GPU-X³ GTX 690 "Hole Edition" Ni-Bl (product no. 15531): 134.42 € each, 268.84 €
1x GPU-X² / X³ Dual-Link (1-Slot) (product no. 10191): 12.56 €
2x Thermal Pads HEATKILLER® GPU-X³ 690 (product no. 79363): 5.84 € each, 11.68 €

Subtotal: 293.08 €
Delivery method: worldwide shipping, 39.95 €
Payment method: PayPal
Tax area: non-EU country
Total amount (without VAT): 333.03 €
Total amount: 333.03 €
Additional duties, taxes and fees may be charged for shipments outside the EU.


----------



## jassilamba

Quote:


> Originally Posted by *Buzzkill*
> 
> How was the Heatkiller waterblock installation? Do they include all the thermal pads? And what about the GPU dies, is thermal paste included? I ordered the nickel version with extra thermal pads and the dual-link. I'm going to try to use EVGA backplates with them.
> 
> Order was received: 02/11/2012 (14:33); paid: 02/11/2012 (14:33)
> 2x HEATKILLER® GPU-X³ GTX 690 "Hole Edition" Ni-Bl (product no. 15531): 134.42 € each, 268.84 €
> 1x GPU-X² / X³ Dual-Link (1-Slot) (product no. 10191): 12.56 €
> 2x Thermal Pads HEATKILLER® GPU-X³ 690 (product no. 79363): 5.84 € each, 11.68 €
> Subtotal: 293.08 €
> Delivery method: worldwide shipping, 39.95 €
> Payment method: PayPal
> 
> Tax area: non-EU country
> Total amount (without VAT): 333.03 €
> Total amount: 333.03 €
> Additional duties, taxes and fees may be charged for shipments outside the EU.


Thermal pads are included, but there aren't enough unless you cut them to size like the stock pads, so good call on ordering extras. No thermal paste is included (at least in mine); I had some Tuniq TX-4 left over, so I used that.

Overall the installation was super easy. I'm also using the Heatkiller backplate, primarily because of the weight of the block; it is really heavy, and I'm not sure the EVGA backplate could handle it.

Great choice on the block by the way.


----------



## Buzzkill

http://watercool.de/sites/default/files/downloads/MA_GPU-X3_GTX_690_A5.pdf

HEATKILLER® GPU-X³ GTX 690

Prior to the first commissioning of the graphics card with the back plate installed, the distances between the soldering lugs and the back plate should be inspected visually. The soldering lugs of the electronic components may not touch the back plate; this could result in an electrical short circuit that would damage the hardware. The distance is generally sufficiently dimensioned, but production tolerances may cause soldering lugs that are too long on a few graphics cards. In that case, the card must not be commissioned with the back plate installed. If the card is nevertheless commissioned, then this is at one's own risk.

Thanks for the info. The instructions say that on some video cards the backplate can cause a short by touching the soldering lugs, so the backplates are not mandatory. I'll check the clearance before I buy the Heatkiller backplates.


----------



## jassilamba

Quote:


> Originally Posted by *Buzzkill*
> 
> http://watercool.de/sites/default/files/downloads/MA_GPU-X3_GTX_690_A5.pdf
> 
> HEATKILLER® GPU-X³ GTX 690
> 
> Thanks for the info. The instructions say that on some video cards the backplate can cause a short by touching the soldering lugs, so the backplates are not mandatory. I'll check the clearance before I buy the Heatkiller backplates.


The Heatkiller backplate came with some standoffs to create separation between the board and the backplate. Looks cool, too.


----------



## gizmo83

Hi. Is it possible to flash my Gigabyte 690 with the latest updated EVGA BIOS without any issues? If it's possible, can you show me a guide? Thank you.

Sent from my GT-N7000 using Tapatalk 2


----------



## DinaAngel

Quote:


> Originally Posted by *gizmo83*
> 
> Hi. Is it possible to flash my Gigabyte 690 with the latest updated EVGA BIOS without any issues? If it's possible, can you show me a guide? Thank you.
> Sent from my GT-N7000 using Tapatalk 2


I don't believe you can, since EVGA has master and slave BIOSes and, as far as I'm aware, the other brands don't; at least that's from reading the GTX 690 BIOS editing threads. They are making an editor, though, so wait until January and you'd probably have an editor so you can enable it for your 690.


----------



## gizmo83

Quote:


> Originally Posted by *DinaAngel*
> 
> I don't believe you can, since EVGA has master and slave BIOSes and, as far as I'm aware, the other brands don't; at least that's from reading the GTX 690 BIOS editing threads. They are making an editor, though, so wait until January and you'd probably have an editor so you can enable it for your 690.


Thank you. Anyway, I'll post some pics of my lovely card to join the club.


----------



## Arizonian

Quote:


> Originally Posted by *gizmo83*
> 
> Thank you. Anyway, I'll post some pics of my lovely card to join the club.
> 
> 
> 
> 
> 
> 
> 
> 
> 


Very nice gizmo83 congrats and welcome to the list.


----------



## D3skt0pG4m3r

Hey guys, I was just wondering where the list of required tests is, to see how well my OC'd 690 is doing. I remember reading back where you guys had posted what you used during your tests; maybe someone could link me those so I can see how my 690 stacks up. I'll get some pics up to join the club ASAP.


----------



## D3skt0pG4m3r

Here we go; I received it in the mail Wednesday.


----------



## Arizonian

Quote:


> Originally Posted by *D3skt0pG4m3r*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here we go, I received it in the mail Wednesday


Congrats D3skt0pG4m3r and welcome aboard.







Time to get that baby in your rig and take her for an FPS ride.


----------



## DinaAngel

Quote:


> Originally Posted by *D3skt0pG4m3r*
> 
> Hey guys, I was just wondering where the list of required tests is, to see how well my OC'd 690 is doing. I remember reading back where you guys had posted what you used during your tests; maybe someone could link me those so I can see how my 690 stacks up. I'll get some pics up to join the club ASAP.


Check my scores and try to reach them yourself with your 690.

If you get close, or not too far off, or you go over, then you're fine.

In a few days it's going to change, though, since I'll most likely be running 1280 MHz on both cores then, thanks to the thermal paste I'm awaiting.

3DMark Vantage scores


----------



## iARDAs

Hey folks, I need to ask you guys something.

We all know the 4 GB advertising isn't really true for the 690: 2 + 2 does not equal 4 for VRAM.

But is the same true for memory bandwidth?

Can the 690 take advantage of the full 384 GB/s of memory bandwidth?

Because lately, thanks to this forum, I've come to think that since the 690 has this much memory bandwidth, it might help at resolutions above 1080p, so the 2 GB of VRAM might not be a giant problem.

Am I on the right thinking track here?


----------



## HeadlessKnight

Quote:


> Originally Posted by *iARDAs*
> 
> Hey folks, I need to ask you guys something.
> 
> We all know the 4 GB advertising isn't really true for the 690: 2 + 2 does not equal 4 for VRAM.
> 
> But is the same true for memory bandwidth?
> 
> Can the 690 take advantage of the full 384 GB/s of memory bandwidth?
> 
> Because lately, thanks to this forum, I've come to think that since the 690 has this much memory bandwidth, it might help at resolutions above 1080p, so the 2 GB of VRAM might not be a giant problem.
> 
> Am I on the right thinking track here?


GTX 690 = GTX 680 SLI.

The new benchmarks show how GTX 680 SLI is bandwidth-starved at high resolutions, even against HD 7950s in CrossFire. The situation will be the same with the GTX 690. Sleeping Dogs with extreme AA is enough to kill a 2 GB card at 5760x1080.
2 GB is good for unmodded games; once you mod your games with heavy textures, you should look at something else. Even at 1080p, my GTX 670 can fill its 2 GB of VRAM and stutter in heavily modded games like Skyrim.
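To put rough numbers on the bandwidth side of this question: the sketch below uses the GTX 690's published memory specs (a 256-bit bus per GPU at 6008 MHz effective) and just shows why the advertised ~384 GB/s is really two independent ~192 GB/s pools, in the same way that 2 + 2 GB of VRAM isn't a usable 4 GB.

```python
# Sketch: GDDR5 bandwidth = (bus width in bytes) x (effective clock).
# Like VRAM, bandwidth is per GPU and does not pool across SLI/AFR;
# each GPU renders its frame with only its own 256-bit bus.

def memory_bandwidth_gbs(bus_width_bits: float, effective_clock_mhz: float) -> float:
    """Bandwidth in GB/s from bus width (bits) and effective memory clock (MHz)."""
    return (bus_width_bits / 8) * (effective_clock_mhz / 1000)

per_gpu = memory_bandwidth_gbs(256, 6008)  # each GK104 on the 690
total = 2 * per_gpu                        # the marketing aggregate

print(round(per_gpu, 1))  # ~192.3 GB/s actually available to one frame
print(round(total, 1))    # ~384.5 GB/s advertised across both GPUs
```

So the card does not "take advantage of all 384" for a single frame; each GPU works from its own ~192 GB/s, which is why high-resolution, high-AA workloads can still starve it.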


----------



## Arm3nian

I'm getting about 55°C on GPU 1 and 40°C on GPU 2 in Crysis 2, running +120 core and +400 memory. I'm on two 480 rads; do the temps seem a bit high? Ambient is about 75°F. I got 70°C/60°C on air with a much smaller OC. How are my temps?
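A side note for anyone comparing these numbers: the ambient figure here is in Fahrenheit while the GPU readings are in Celsius. A quick conversion sketch:

```python
# Convert the Fahrenheit ambients quoted in this thread to Celsius
# so they can be compared directly against the GPU core readings.

def fahrenheit_to_celsius(f: float) -> float:
    return (f - 32) * 5 / 9

print(round(fahrenheit_to_celsius(75), 1))  # 23.9 C ambient for this post
print(round(fahrenheit_to_celsius(60), 1))  # 15.6 C, the "60 F room" mentioned later
```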


----------



## DinaAngel

Quote:


> Originally Posted by *Arm3nian*
> 
> I'm getting about 55°C on GPU 1 and 40°C on GPU 2 in Crysis 2, running +120 core and +400 memory. I'm on two 480 rads; do the temps seem a bit high? Ambient is about 75°F. I got 70°C/60°C on air with a much smaller OC. How are my temps?


A bit high; try tightening the screws a bit more.
I can do +240 on both GPUs and +500 on the memory,
at 42°C max:
1280 MHz core
and
1750 MHz memory.

Be sure to check that your thermal pads aren't badly placed, since that can keep the GPU surfaces from making proper contact. Remember to apply even pressure and an even amount of thermal paste. I recommend liquid metal, since it's much better than anything I've tried before.


----------



## Arm3nian

Wow, the 690 is a monster overclocker. I didn't even have to try to get +150 MHz and +500 MHz. I guess you get what you pay for.


----------



## DinaAngel

Quote:


> Originally Posted by *Arm3nian*
> 
> Wow, the 690 is a monster overclocker. I didn't even have to try to get +150 MHz and +500 MHz. I guess you get what you pay for.


Yup. Try +210 on both GPUs.


----------



## Rei86

Got my replacement card after having to RMA my first card because I stripped some T6 screws on it.









Putting one of these on it when I have time









At stock (CPU and GPU both at stock clocks), my old card was able to hit P14299; I'm going to see if this one can do better stock for stock.

And I hope this one is a better OCer.


----------



## rationalthinking

Quote:


> Originally Posted by *Arm3nian*
> 
> Wow 690 is a monster overclocker. I didn't even have to try to get 150mhz and 500mhz. I guess you get what you pay for


+150 core and +650 memory seem to be my max stable.

Anything over +150 seems to crash midway through tests and games.


----------



## gregcade

Hi,

Does anyone have the EVGA LED Controller, please?

I've tried to download it several times from the EVGA FTP site, but it always fails.

Thanks


----------



## rationalthinking

Quote:


> Does anyone have the EVGA LED Controller, please?
> 
> I've tried to download it several times from the EVGA FTP site, but it always fails.


I can email it to you when I get home. Have it saved on my gaming rig.


----------



## gregcade

Oh, thanks!


----------



## DinaAngel

Quote:


> Originally Posted by *rationalthinking*
> 
> 150 and 650 seem to be my max stable.
> Anything over 150 for me seems to crash midway in tests and games.


Yeah, it's a temperature issue. My GPU 2 can do 1280 MHz fine at +240, with a max load temperature of 34°C.


----------



## rationalthinking

Quote:


> Originally Posted by *DinaAngel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rationalthinking*
> 
> 150 and 650 seem to be my max stable.
> Anything over 150 for me seems to crash midway in tests and games.
> 
> 
> 
> yeah its temp issue. my gpu 2 can do 1280mhz fine on 240+ on 34 degrees max load

Must be. I'm still on air and will keep it on air, simply because I upgrade every refresh cycle.

I'm hitting 55-58°C at full load. Arkham City is the only game that puts real stress on them; they might hit 60°C with ambient temps at 20°C.

Edit:

I have to say, it stays pretty cool compared to the 670s I had in SLI.

Also, I could not be happier with the one-card solution over my 670s in SLI. I had an easier time setting up Surround than when I had those, with fewer problems.


----------



## Arm3nian

Has anyone tried the new Need for Speed: Most Wanted? The first core on my 690 seems to be maxing out, yet the game doesn't even have any AA settings and doesn't really look all that great. Also, my second core sits idle, so it doesn't even support SLI. Horrible optimization, or just early drivers, it seems.


----------



## juanP

I'm having trouble running all three monitors on the 690; I think one of the DVI ports is not sending any signal.

Am I doing anything wrong, or do I need to set something up to enable three monitors?


----------



## ceteris

FYI: Sidewinder Computers just got the Heatkiller GTX 690 "Hole Edition" blocks at a really nice price of $139.99.









I would totally jump on them if I weren't still holding out for the nickel-plated one... Really tempted to just get this one anyway, even though it would clash a bit with the nickel-plated CPU block I'm going to be installing.


----------



## DinaAngel

Quote:


> Originally Posted by *Arm3nian*
> 
> Has anyone tried the new need for speed most wanted? My 1st core on my 690 seems to maxing out, and the game does not even have any AA settings, and doesn't really look all that great. Also, my second core is idle, so it doesn't even support sli. Horrible optimization, or just early drivers it seems like.


Yup, I tried it. It's a console port for sure, but it was fun until I found out you have to beat the game with all the cars; it wasn't that fun after that. The SLI support is bad and it's poorly optimized; your 690 could run that game fine at 10% load if it were made much better. Crysis 2 was reasonably well optimized, but its textures and graphics could have been much better too; then again, it wasn't finished.
We could play some game sometime, Arm3nian; I've got some on my Steam.


----------



## DinaAngel

Quote:


> Originally Posted by *juanP*
> 
> i am having trouble running all three monitors on the 690, one of the dvi ports is not sending any signal i think.
> am i doing anything wrong or do i need to setup something to enable three monitors?


OK, listen:

GeForce GTX 690, Multi-GPU On
> Maximum resolution: 2560x1600 per display
> Maximum displays: 3

Special instructions:
> With Multi-GPU enabled, displays must be connected using DVI connectors 2 and 3, plus the mini-DisplayPort connector.
> Orientation can be defined per display.


----------



## juanP

So I clicked the "activate all displays" checkbox, and all three displays work. But I guess SLI is disabled then.


----------



## pilla99

Exactly; that's why I sold my second Catleap. I want maximum performance on my display.


----------



## gizmo83

Hi guys, I'm looking for information about the M020-00-000243 EVGA GTX 690 backplate. Does it benefit GPU temperatures? If so, by how many degrees?


----------



## Marcsrx

So I also need info. What safe and stable OCs are you guys finding on the 690's stock air cooler? I understand that at 70°C it throttles the clock. I get up to about 68-69°C at stock, so would I get any benefit from OCing my 690?


----------



## Arizonian

Quote:


> Originally Posted by *gizmo83*
> 
> Hi guys, I'm looking for information about the M020-00-000243 EVGA GTX 690 backplate. Does it benefit GPU temperatures? If so, by how many degrees?


No temperature difference. Protection and aesthetics only.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *gizmo83*
> 
> Hi guys, I'm looking for information about the M020-00-000243 EVGA GTX 690 backplate. Does it benefit GPU temperatures? If so, by how many degrees?


The backplate is for looks and PCB protection; it doesn't really help in the cooling department.
Quote:


> Originally Posted by *Marcsrx*
> 
> So I also need info. What safe and stable OCs are you guys finding on the 690's stock air cooler? I understand that at 70°C it throttles the clock. I get up to about 68-69°C at stock, so would I get any benefit from OCing my 690?


The GTX 690 cooler can easily handle a 1200 MHz overclock if your card can get that high. I'd get some more airflow in your case, if possible, to help OC your card. You don't really need to overclock it, though; it's a beast already at stock clocks.
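As a rough sketch of how the offsets quoted in this thread map to clocks: Kepler overclocking applies an offset on top of the stock boost clock (the 690's published boost is 1019 MHz), and the sustained clock still depends on GPU Boost's power and thermal headroom, so treat this as the ideal case with no throttling.

```python
# Sketch of Kepler core-offset arithmetic, ignoring GPU Boost throttling.
# Stock figures are the GTX 690's published base/boost clocks; real cards
# bin to slightly different boost clocks.

BASE_CLOCK_MHZ = 915
BOOST_CLOCK_MHZ = 1019  # published typical boost

def target_clock(offset_mhz: int) -> int:
    """Approximate sustained core clock for a given offset, before throttling."""
    return BOOST_CLOCK_MHZ + offset_mhz

print(target_clock(150))  # 1169 MHz, the common stable offset reported here
print(target_clock(180))  # 1199 MHz, near the ~1200 MHz the stock cooler handles
```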


----------



## Marcsrx

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> The backplate is for looks and PCB protection. Doesn't really help any in the cooling department.
> The gtx 690 cooler can easily handle a 1200MHz overclock if your card can get that high. I'd get some more airflow in your case, if it's possible, to help OC your card. You really don't need to overclock your card though as it's a beast already at stock clocks.


Thanks for the reply. I understand, and I'm happy with it. I'd just like to sustain higher FPS to match the 100 Hz I'm driving my 1440p monitor at.


----------



## gizmo83

Thank you for the reply.

I have another question: I'd like to try connecting my monitor with a male-to-male mini-DisplayPort to HDMI cable. Is that a good choice? Is there any impact on framerate? Right now I'm using a DVI-to-VGA connection.

Sent from my GT-N7000 using Tapatalk 2


----------



## thestache

Quote:


> Originally Posted by *DinaAngel*
> 
> ok, listen
> GeForce GTX 690 Multi-GPU On
> > Maximum Resolution: 2560x1600 per display
> > Maximum Displays: 3
> Special Instructions:
> > With Multi-GPU enabled, displays must be connected using DVI connectors 2 and 3, plus the
> mini-DisplayPort connector.
> > Orientation can be defined per display.


Wrong.

Using all three DVI connections does work in SLI. Why it says that I don't know, but that's how I ran my GTX 690, and I have pictures from yesterday of my brother's rig with his GTX 690 playing BF3 set up that way. You don't want to mix display connection types, because that can cause screen tearing.

Quote:


> Originally Posted by *juanP*
> so i clicked on the "activate all displays" checkbox. and all the three displays work. but i guess sli is disabled then.


Why do you want to "activate all displays" instead of "span displays with Surround"? Connect all your monitors via DVI, turn on Surround, and "configure" your displays. You want all three on DVI or you'll get horrible screen tearing.

If you are running three 60 Hz monitors, connect them all DVI to DVI, run them in Surround, and then in the NVIDIA Control Panel, under 3D Settings > Manage 3D Settings, turn on vertical sync, turn on triple buffering, and set power management to maximum performance. Limit your FPS to 60 in EVGA PrecisionX or MSI Afterburner, and that's it. Never turn on vsync in-game.

That way you'll have no screen tearing in any situation, no added input lag, and you never have to configure it per game. In games like BF3 where it's available, enable triple buffering in-game, but never vsync, and let EVGA PrecisionX handle limiting the framerate.
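The arithmetic behind capping at the refresh rate is simple: each refresh gives the GPU a fixed frame-time budget, and an external limiter keeps frames arriving on that schedule while vsync handles tearing. A sketch:

```python
# Frame-time budget per refresh: the GPU must finish each frame inside
# this window, which is why the FPS cap is set equal to the refresh rate.

def frame_budget_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

print(round(frame_budget_ms(60), 2))   # 16.67 ms per frame on a 60 Hz panel
print(round(frame_budget_ms(100), 2))  # 10.0 ms at the 100 Hz mentioned earlier
```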


----------



## thestache

...and my brother's GTX 690 is now running at 1200 MHz and 1187 MHz on its two cores with the latest drivers. It used to run at 1212 MHz on both cores. It has been watercooled from day one, and I still have no idea why it's slowly dropping. It's overclocked at +165 core and +350 memory with the power target at 135%.

I have not seen any change in the overclocked clock speeds of my 4 GB GTX 680 SLI cards, and those have been running as hot as 90°C playing PlanetSide 2 at 3856x1920 for nearly two months. So it's safe to say I'm abusing the hell out of them until I finish my watercooling loop, and they seem to be fine.


----------



## Arizonian

Quote:


> Originally Posted by *thestache*
> 
> ...and my brothers GTX 690 is now running at 1200mhz and 1187mhz on both cores with the latest drivers. Used to run at 1212mhz on both cores. Has been watercooled from day one, still have no idea why it's slowly dropping. and Is overclocked 165+ core, 350+ memory at 135+ power.
> Have not seen any change in my 4GB GTX 680 SLI clock speeds for their overclock and these have been running as high as 90deg playing planetside 2 at 3856x1920 for nearly 2 months. So it's safe to say I'm abusing the hell out of them until I finish my watercooling loop and they seem to be fine.


Very interesting. I personally experienced this same drop in core clock, but not from the current beta drivers. It went from the normal 1176 MHz core to 1163 MHz at best for no apparent reason, regardless of the driver used, even back to 301.xx, and no matter how high I set the core offset. You know what I'm talking about from previous posts. Keep us informed if anything changes further.

Just today I had to roll back from the 310.33 beta to 306.97 WHQL. The card was no longer downclocking to 2D clocks at idle; I had to force NVIDIA Inspector's multiple-display power-saving mode to get the 2D clocks to kick in and let it idle.

After rolling back to the latest WHQL, the 2D clocks are fine without NVIDIA Inspector needing to run in multiple-display power-saving mode.


----------



## thestache

Quote:


> Originally Posted by *Arizonian*
> 
> Very interesting. I personally experienced this same drop in Core but not from current beta drivers. Went from normal 1176 MHz Core to 1163 MHz at best for no apparent reason. Regardless of drivers used even back to 301.xx No matter how high I set Core Offset. You know what I'm talking about from previous posts. Keep us informed if anything changes further.
> Just today I had to roll back from 310.33 beta to 306.97 WHQL. I was no longer down clocking to 2D clocks in idle. Had to force Nvidia Inspector to run multiple display power savings in order to force 2D clocks to kick in and allow me to idle.
> I went rolled back to latest WHQL and all is fine with 2D clocks without need for Nvidia Inspector to be running multiple display power savings mode.


Yeah, I'll keep monitoring it, because it's interesting. No idea what's causing it at all. It'll be interesting to see whether it's a GTX 670/680/690 thing or just a GTX 690 thing.


----------



## Arizonian

Quote:


> Originally Posted by *thestache*
> 
> Yeah I'll keep monitoring it because it's interesting. No idea whats causing it at all. Be interesting to see if it's a GTX 670/680/690 thing or just a GTX 690 thing.


I'd say it's worth keeping an eye on; any correlation between the two cases could confirm this. You never know.


----------



## DinaAngel

Quote:


> Originally Posted by *thestache*
> 
> Wrong.
> Using all three DVI connections does work in SLI. Why it says that I don't know but that's how I ran mine (my GTX 690) and I have pictures of my brothers rig with his GTX 690 playing BF3 with it set-up that way from yesterday. You don't want to mix display connections because it can cause screen tearing.
> Why do you want to "activate all displays" instead of "span displays with surround"? Connect all your monitors via DVI, turn on surround and "configure" your displays. You want all three via DVI or you'll get horrible screen tearing.
> If you are running three 60hz monitors have them all connected via DVI to DVI, run them in surround and then in NVidia control panel - 3d Settings - Manage 3D settings - turn on vertical sync, turn on triple buffering and have pwer management at maximum performance. Limit your FPS to 60FPS in EVGA PrecisionX or MSI Afterburner and thats it. Never turn on Vsync in game.
> That way you'll have 0% screen tearing in all situations with 0% input lag and never have to set it in game etc. In games like BF3 where it's available enable triple buffering in game but never vsync and let EVGA PrecisionX handle limiting the framerate.


Why does NVIDIA state this, then? Yes, you can run it on all DVI, but then you're not running all the screens off one GPU. Just test it; you get about a 20-30% performance drop.

I think I'd run all of mine off the master GPU to get all the performance and avoid any scaling issues.

You know, input lag is mostly the monitor's fault anyway. Why say I'm wrong when you haven't tested it?


----------



## DinaAngel

Quote:


> Originally Posted by *Arizonian*
> 
> I'd say worth keeping an eye on and if there is any correlation with each other that could confirm this. Never know.


It's implanted by NVIDIA so you have to buy a 790 next year, since the 690 dies over time.

Yeah, same here; running K mode at the moment.


----------



## Rei86

Quote:


> Originally Posted by *thestache*
> 
> Yeah I'll keep monitoring it because it's interesting. No idea whats causing it at all. Be interesting to see if it's a GTX 670/680/690 thing or just a GTX 690 thing.


Most people I know have rolled back from the 310 drivers to 306.97 on their single-GPU cards, and most people with 690s have gone back to 306.23.

I've found I was taking a performance hit here and there even on 306.97 compared to the older 306.23 drivers. As for holding boost, I haven't had many issues with the card getting stuck at boost.


----------



## juanP

Quote:


> Originally Posted by *thestache*
> 
> Wrong.
> Using all three DVI connections does work in SLI. Why it says that I don't know but that's how I ran mine (my GTX 690) and I have pictures of my brothers rig with his GTX 690 playing BF3 with it set-up that way from yesterday. You don't want to mix display connections because it can cause screen tearing.
> Why do you want to "activate all displays" instead of "span displays with surround"? Connect all your monitors via DVI, turn on surround and "configure" your displays. You want all three via DVI or you'll get horrible screen tearing.
> If you are running three 60hz monitors have them all connected via DVI to DVI, run them in surround and then in NVidia control panel - 3d Settings - Manage 3D settings - turn on vertical sync, turn on triple buffering and have pwer management at maximum performance. Limit your FPS to 60FPS in EVGA PrecisionX or MSI Afterburner and thats it. Never turn on Vsync in game.
> That way you'll have 0% screen tearing in all situations with 0% input lag and never have to set it in game etc. In games like BF3 where it's available enable triple buffering in game but never vsync and let EVGA PrecisionX handle limiting the framerate.


OK, I followed your notes and got SLI up on all three displays. Now I'm wondering: will I lose performance with this connection method, or will I be OK without using the DisplayPort connection?


----------



## juanP

OK, finally got my triple-monitor setup done.


----------



## Arizonian

Quote:


> Originally Posted by *juanP*
> 
> ok finally got my triple monitors setup
> 
> 
> 
> 
> 
> 
> 
> 
> 


That is a beautiful setup, juanP.









Yet somehow I still feel bad for you being so close to all that screen.







Might want to try a wider desk to get some distance between you and your sweet monitors.







You must feel like you're actually inside that car, driving with the scenery rushing past on either side of your peripheral vision.









Nice system.


----------



## ceteris

Quote:


> Originally Posted by *juanP*
> 
> ok finally got my triple monitors setup
> 
> 
> 
> 
> 
> 
> 
> 
> 


Indeed. You need a proper "Altar of Worship," my friend.


----------



## Arizonian

It's been really nice with the past few days of cold in Arizona. My 690, which used to sit at 32°C and 30°C idle, is now sitting at 29°C and 27°C idle.

The ambient temperature inside the house is 3°C lower during the winter months, and it's reflected in my GPU.

My single GTX 580 idled about 10°C higher than my dual-GPU GTX 690. I'd say that's quite an architectural accomplishment, IMO, regardless of where the credit is due.

The fan is running so quietly I had to look underneath to check whether it was still spinning.


----------



## thestache

Quote:


> Originally Posted by *juanP*
> 
> ok i followed ur note and got the uad sli up. now i was wondering if i will loose performance in this connection method or will i be ok without using the displayport connection?


Good to see you got it working. Looks great.

If you set it up like I said, it'll run perfectly. NVIDIA Surround has some issues, like green-screen flashing on Flash videos and some other odd video-playback quirks, and if you run bezel correction on the desktop but not in-game, your monitors will turn off and on in-game when explosions happen, but generally it's very solid overall. Much better than Eyefinity at the moment. Eyefinity is good, but vsync is broken there, which pretty much ruins the whole experience.


----------



## Jessekin32

Can anyone help me fix an issue I'm having with my games? The shadows in most games flicker and look really jagged. They start tweaking out; it's really noticeable when I run the Heaven benchmark, and also in BF3.

Clean install of Windows, and I think 306.xx drivers (not at my PC right now). Any tips would be appreciated. Thanks!


----------



## iARDAs

Quote:


> Originally Posted by *Jessekin32*
> 
> Can anyone help me fix an issue I'm having with my games? The shadows in most games will flicker, and look really jaggity. They start tweaking out, and it's really noticeable when I run the heaven benchmark. Also really noticeable in bf3.
> Clean install of Windows, and I think 306.xx drivers. (But at my pc right now) any tips will be appreciated. Thanks!


Is this problem only happening with the latest WHQL driver?

I remember having a similar problem with my previous GTX 590, and updating the drivers fixed it for me.

You might have a clean install of Windows, but maybe your driver is conflicting with something. Update to the latest beta driver, 310-something, and let's see if the problem happens again. 310 is also a very good driver; you won't regret it.


----------



## Arm3nian

I have been without a computer for 6 days now. I was getting rather high temps with MX-4, so I applied CL Ultra, and now I can't even turn the machine on because it instantly crashes from my CPU OC; at stock 3500 MHz it idles at 70°C in a 60°F room... The GPU idles at 35-50°C; I haven't even tried gaming. It's almost as if I botched the mount so badly that there's no contact. I'll order a pump top and a tube reservoir, plus some TIM, since I'll be reassembling my loop again; it's impossible to mount waterblocks properly with the tubing still attached, and I basically destroyed my loop. Meh...


----------



## DinaAngel

Quote:


> Originally Posted by *Arm3nian*
> 
> I have been without a computer now for 6 days. I was getting rather high temps with mx-4, so I applied CL ultra and now I can't even turn it on because it instantly crashes from my cpu oc, at stock 3500mhZ it idles at 70c in a 60faren room.... gpu idles at 35-50c, have not even tried gaming. It is almost like I failed mounting so bad that there is no contact. I will order a pump top and tube res plus some tim since I will reassembling my loop again because it is impossible to mount waterblocks with tubing still on it properly, as I basically destroyed my loop. Meh...


It's totally not the Ultra's fault if it's only happening on the CPU, but remember that it's electrically conductive, since it's 100% metal.
You also need to apply this compound differently from any other: a very thin layer over the whole CPU.

Tighten the cooler until you hear the screws creak <3


----------



## Arm3nian

Quote:


> Originally Posted by *DinaAngel*
> 
> its totally not the ultras fault if its only on the cpu but remember tht its electric conductive since its 100% metal.
> u need to place the thermal compound differently than any other compound too btw, very thin layer on whole cpu.
> fasten cooler it until u hear sqrews make sounds <3


My GPU is at 35°C idle; before remounting I had 25°C idle with MX-4 in a hotter room. I screwed everything down all the way, both CPU and GPU, so it has to be making contact.

After sticking it on, though, I did move both around a lot, because mounting was difficult with the tubing already on the block. I think I applied the paste normally, since I did the same on my delidded CPU and nothing blew up. Whatever; I'll try to remount when I get new parts.


----------



## DinaAngel

Quote:


> Originally Posted by *Arm3nian*
> 
> My gpu is at 35c idle, before remounting I had 25c idle with mx-4 in a hotter room. I screwed it down all the way, both cpu and gpu, so it has to make contact.
> After sticking it on though, I did move both around A LOT because it was difficult with tubing already on the block to mount, and I do think I applied the paste normally, since I did it on my delided cpu and nothing blew up, whatever will try to remount when I get new parts.


The Ultra doesn't work if you slide the block sideways; you have to reapply it then. It's kind of a pain like that.

The mount has to be direct, with zero twisting or lateral movement, only a straight-down motion onto the CPU.


----------



## Arm3nian

Quote:


> Originally Posted by *DinaAngel*
> 
> the ultra doesnt work if u move block to the side since then u need to reapply, its kinda **** like that.
> like it hafto be direct and mount with 0 twisting or movement only down motion to cpu


Yeah, well, that explains my failure temps. I don't know if I just suck at mounting, but CPU backplates are horribly designed IMO and make it hard to install the waterblock without twisting. The 690 seems better, but it has thermal pads and a lot of screws, which also make it hard, not to mention CL Ultra being more slippery than MX-4.


----------



## pilla99

What is the expected time frame for the 7-series cards? And the ATI 8-series? I just want to start planning for when we might see 790s and 8970s on Newegg.


----------



## iARDAs

Quote:


> Originally Posted by *pilla99*
> 
> What is the expected time frame for the 7 series cards? That and 8 series ATi? Just want to start planning on when we could see 790's and 8970's on newegg.


Nothing certain yet but I would expect April. Not before that.

The current GPUs sell quite well.


----------



## hoody

Hi Guys

I'm looking at getting 2x 690s and triple monitors etc., but I can't decide on a motherboard. I like the look of the Sabertooth Z77, but I'm worried about throttling and/or bottlenecks; would an X79 board be better?

Any help or suggestions would be great.

Thanks
Hoody


----------



## iARDAs

Quote:


> Originally Posted by *hoody*
> 
> Hi Guys
> I'm looking at getting 2x 690s and triple monitors etc., but I can't decide on a motherboard. I like the look of the Sabertooth Z77, but I'm worried about throttling and/or bottlenecks; would an X79 board be better?
> Any help or suggestions would be great.
> Thanks
> Hoody


Hey buddy. Please look into getting 2 690s for triple monitors more carefully. Although 2 690s have the beef, they will lack VRAM with 3 monitors IMO. But we have real 690 users here who can help you better.

Besides that, I have the Sabertooth Z77 and it's kick ass. It should serve you well with 2 690s if you want to go that route.

My real advice is that if you want 3 monitors, think of 680 4GB 3-way SLI... and if you go 3-way SLI, avoid the Sabertooth Z77.

Do we have someone in the club with 2 690s and 3 monitors though? Maybe he/she can share the experience


----------



## Qu1ckset

2x GTX 690s are a waste of time for large multi-monitor resolutions, as the lack of VRAM will hurt in the end, and on top of that, scaling for 4-way SLI is not worth it, so getting 2 GTX 690s for any setup is a waste. The fourth card or GPU adds such a small % of performance over 3-way SLI that it's not worth the added price of buying that fourth card.

If I were you I'd go 3-way SLI with 4GB GTX 670/680s, or go with AMD's 7970s, which are also a good choice.


----------



## thestache

Quote:


> Originally Posted by *hoody*
> 
> Hi Guys
> I'm looking at getting 2x 690s and triple monitors etc., but I can't decide on a motherboard. I like the look of the Sabertooth Z77, but I'm worried about throttling and/or bottlenecks; would an X79 board be better?
> Any help or suggestions would be great.
> Thanks
> Hoody


You do not want GTX 690 SLI; you want 3-way GTX 670/680 4GB SLI for three monitors. Read this thread in more detail and you'll realize why: not enough VRAM.

And for 3-way SLI you don't want a motherboard that 'looks cool' but only has one x16 PCIe 3.0 slot; you want an X79 with 40 PCIe 3.0 lanes in total. Get the Sabertooth or Formula X79.


----------



## rationalthinking

Quote:


> Originally Posted by *pilla99*
> 
> What is the expected time frame for the 7 series cards? That and 8 series ATi? Just want to start planning on when we could see 790's and 8970's on newegg.


Sometime right before spring, March-April.

Also looking forward to what has to be a TRUE 4GB GTX 790. Before that I want Ivy-E to come out, though; really looking forward to jumping in on the extreme side.


----------



## juanP

Quote:


> Originally Posted by *hoody*
> 
> Hi Guys
> I'm looking at getting 2x 690s and triple monitors etc., but I can't decide on a motherboard. I like the look of the Sabertooth Z77, but I'm worried about throttling and/or bottlenecks; would an X79 board be better?
> Any help or suggestions would be great.
> Thanks
> Hoody


As all the others above me mentioned, it's recommended that you don't go for triple monitors with quad-SLI 690s, especially if you do heavy FPS gaming.

As for me, I own triple 30s with quad-SLI 690s, and all I use them for is racing games, so it's huge overkill for me in fact. But I made an impulse buy back then and bought the cards without reading the reviews about the 2GB limitation and the 4GB marketing gimmick by EVGA. I am hoping they will last me the next 5 years, especially since I just play racing games.

On a side note, I've been playing with triple monitors for the last couple of days and it's freakin' awesome.


----------



## jassilamba

Quote:


> Originally Posted by *thestache*
> 
> You do not want GTX 690 SLI, you want 3-Way GTX 670/680 4GB SLI for three monitors. Read this thread in more detail and you'll realize why. Not enough VRAM.
> And for 3-way SLI you don't want a motherboard that 'looks cool' but only has one x16 PCIe 3.0 slot; *you want an X79 with 40 PCIe 3.0 lanes in total*. Get the Sabertooth or Formula X79.


I have the Z77, and the next upgrade is going to be the X79 platform, depending on how IB-E turns out. But an X79 will be better than a Z77 for the reasons mentioned above.


----------



## hoody

Many thanks guys. Upon reading reviews I think I'm going to go for the Asus Formula and GTX 670 4GB x3. Many thanks for your help!

Hoody


----------



## iARDAs

Quote:


> Originally Posted by *hoody*
> 
> Many thanks guys. Upon reading reviews I think I'm going to go for the Asus Formula and GTX 670 4GB x3. Many thanks for your help!
> Hoody


Much better call. Good job.


----------



## Buzzkill

The HEATKILLER® GPU-X³ GTX 690 "Hole Edition" Ni-Bl in nickel and black looks good. They came with thermal pads but no TIM for the GPUs; what's the recommended TIM?


----------



## ceteris

Quote:


> Originally Posted by *Buzzkill*
> 
> The HEATKILLER® GPU-X³ GTX 690 "Hole Edition" Ni-Bl The Nickel and Black looks good. They came with thermal pads but no TIM for GPU's whats the recomended TIM?


A lot of people like to use MX-4; it doesn't take long to cure. I like to use PK-1, mostly because I still have a lot of it lol.

BTW, are you in Europe? Been waiting for those babies to hit the US shores.


----------



## jassilamba

Quote:


> Originally Posted by *Buzzkill*
> 
> The HEATKILLER® GPU-X³ GTX 690 "Hole Edition" Ni-Bl in nickel and black looks good. They came with thermal pads but no TIM for the GPUs; what's the recommended TIM?


I'm using Tuniq TX-4 on my copper edition, and so far my average temps are around 30-32 under load (no OC).

Quote:


> Originally Posted by *ceteris*
> 
> Alot of people like to use MX-4. Doesn't take long to cure. I like to use PK-1, mostly because I still have alot of it lol.
> BTW, are you in Europe? Been waiting for those babies to hit the US shores.


PK-1 is really great.


----------



## Buzzkill

Quote:


> Originally Posted by *ceteris*
> 
> Alot of people like to use MX-4. Doesn't take long to cure. I like to use PK-1, mostly because I still have alot of it lol.
> BTW, are you in Europe? Been waiting for those babies to hit the US shores.


No, USA. Ordered from Watercool; DHL only took two weeks.


----------



## Qu1ckset

Quote:


> Originally Posted by *Buzzkill*
> 
> The HEATKILLER® GPU-X³ GTX 690 "Hole Edition" Ni-Bl The Nickel and Black looks good. They came with thermal pads but no TIM for GPU's whats the recomended TIM?


Pictures when its complete!


----------



## ceteris

^ I second the pics
Quote:


> Originally Posted by *Buzzkill*
> 
> No USA. Ordered from Watercool. DHL only took two weeks


Deep pockets, high roller! Even if the exchange rate was right for me, the last time I ordered directly from them it took almost a month to get here to Cali.


----------



## Arm3nian

Anyone try liquid pro on the 690? Using an xspc block


----------



## Jessekin32

Still only getting about 50-60% usage on each GPU while playing BF3 (6050x1080), and no, not all the RAM is being used. It uses MUCH less when I enable adaptive V-Sync, too. (I know V-Sync uses less memory, but 500-600MB less?)

This is dramatically affecting my FPS in game, and it's driving me nuts. I've been monitoring the system and GPUs while in game with Afterburner, and I know I'm not maxing my RAM at that resolution. Plus the low usage tells me that my GTX 690 is not at fault here.

Any ideas? Clean install of Windows, clean install of BF3, and I'm using the 310.54 beta drivers. I was using 306.xx before, and I still had the same problem.

Edit: nearly the same results no matter the graphics settings. I'm trying to play on Ultra minus the AA.
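For anyone who would rather log per-GPU usage over time than eyeball Afterburner, here is a rough Python sketch. It assumes `nvidia-smi` is on your PATH (the query flags shown are the standard ones); treat it as a starting point, and note the sample string at the bottom is made-up output, not a real capture from a 690.

```python
# Rough sketch: parse per-GPU utilization/VRAM from nvidia-smi CSV output.
# Assumes nvidia-smi is on the PATH; the sample string below is invented.
import subprocess

QUERY = [
    "nvidia-smi",
    "--query-gpu=index,utilization.gpu,memory.used",
    "--format=csv,noheader,nounits",
]

def parse_smi(text):
    """Turn 'index, util%, MiB used' CSV lines into a list of dicts."""
    gpus = []
    for line in text.strip().splitlines():
        idx, util, mem = (field.strip() for field in line.split(","))
        gpus.append({"index": int(idx), "util_pct": int(util), "mem_mib": int(mem)})
    return gpus

def poll():
    """Run nvidia-smi once and parse its output."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    return parse_smi(out.stdout)

# Hypothetical output for two GPUs sitting at ~55% load:
sample = "0, 55, 1450\n1, 52, 1430"
for gpu in parse_smi(sample):
    print(gpu)
```

Stick `poll()` in a loop with `time.sleep()` if you want a running log while BF3 is up.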


----------



## 5150jester

Don't know if this has been discussed, but can someone tell me: is there a driver or fix somewhere that I haven't found yet that allows you to run 2 690s on Windows 8? Because I haven't been able to find anything allowing them both to operate in my system.


----------



## Buzzkill

Installed the Heatkiller 690 blocks with EVGA backplates. Had to get some new screws so it would work.
Moved the radiator from the floor to the front and made a drive cage fit in front of the power supply. An AX1200i replaced the HX1000. The copper Raystorm was installed 2 days before the GPU blocks, and the PrimoChill clear tubing plasticized within a few days.


----------



## Buzzkill

Quote:


> Originally Posted by *5150jester*
> 
> don't know if this has been discussed ..but can someone tell me is there a driver or fix somewhere that i haven't found yet that allows you to run 2 690's on Windows 8 ...because i haven't been able to find anything allowing them both to operate in my system??


I use the 310.53 beta with 2 690s and Windows 8 Pro, and I have not had any problems with SLI on Windows 8. On Windows 7 I had to make sure the monitor was plugged into the right card and the right DVI port, or quad SLI would not enable; you would get tri-SLI instead. The water blocks keep the cards around 30C and they run nice and cool. Before, my cards would get to 90C because they sit on top of each other, blocking airflow. On Windows 7 I would install one card, then the drivers, then reboot, install the second card, and install the driver again. On Windows 8 I had both cards installed when I installed Win 8, and after Windows Update I could enable quad SLI, though I still used the 310 beta driver.


----------



## Rei86

Got the card in the case and ran 3DMark with the beta drivers... lost 200 pts off my P score because the driver is not supported.


----------



## rationalthinking

Quote:


> Originally Posted by *Buzzkill*
> 
> 
> 
> Installed the Heatkiller 690 blocks with EVGA backplates. Had to get some new screws so it would work.
> Moved the radiator from the floor to the front and made a drive cage fit in front of the power supply. An AX1200i replaced the HX1000. The copper Raystorm was installed 2 days before the GPU blocks, and the PrimoChill clear tubing plasticized within a few days.


Very nice!

Really like those blocks.


----------



## rationalthinking

Quote:


> Originally Posted by *Rei86*
> 
> Got the card in the case and ran 3DMarks with Beta drivers....lost 200pts for my P score because the driver is not supported


What did you score?


----------



## jaredmergel

Can't wait to use mine. Ordered it 2 days ago, but I won't be able to use it till I get off deployment. So excited


----------



## iARDAs

You are going to love it, jaredmergel.

For a 1080p 60Hz monitor the 690 will be great for the next 2-3 years at least.


----------



## Rei86

Quote:


> Originally Posted by *rationalthinking*
> 
> What did you score?


Couldn't save it because I don't have pro but it was around 13XXX. Nothing OCed btw.


----------



## dynn

Thanks to everyone for helping me with my new comp. Here are the photos of it.


----------



## Rei86

lol your system is pretty much a twin to my system


----------



## dynn

Quote:


> Originally Posted by *Rei86*
> 
> lol you're system is pretty much a twin to my system


The keyboard is a joke! Ahaha, I ordered the Ducky Shine 2; it will arrive in 2 weeks.


----------



## Rei86

Quote:


> Originally Posted by *dynn*
> 
> the keyboard is a joke! ahhaha, i ordered the ducky shine 2, it will arrive in 2 weeks


Using a N7 edition Widow

Can't move on without a mechanical keyboard anymore.


----------



## Arizonian

Quote:


> Originally Posted by *dynn*
> 
> thanks to everyone for helping me with my new comp. Here are the photos of this.
> 
> 


Welcome aboard the club Dynn, very nice setup.

On a further note: I've been very busy (I'll be honest, self-absorbed) these past few weeks upgrading my system's storage as I did my homework. I'm afraid I may have missed adding someone to the club. I've rescanned the pages, but the entry post I could have sworn I briefly glanced at one day is eluding me.

*IF there is anyone I've missed adding to the club list, please let me know and link me to your original post, and I apologize up front.*

I've got a gut feeling one person has slipped by, and I did not want them to think I did not want to enter them.


----------



## dynn

Comp specifications:
CPU: 3770K OC to 4.4
GPU: GTX 690 ASUS
Motherboard: maximus V Formula
PSU: corsair AX GOLD 850W
SSD: SAMSUNG 830 256GB
HDD: WD 1.5 T
RAM: 8GB- 2400
Cooler: H100

Notes:
CPU and motherboard: Well, I'm new to all this stuff. I'm overclocking the CPU to 4.4 with the H100, and I really have no idea if my motherboard is toasting or not (I think it's fine).
GPU: I'm playing SC2 at 1820 res with max graphics quality in 4v4 games, and it looks like the GPU is suffering (it gets to 49C). I really don't know how to configure it or keep it stable, and I'm very worried.

Is there any program you all would suggest to monitor everything, run tests and benchmarks, and configure everything? Stuff that is very useful that I don't know about? I'm very noob at all this.


----------



## Arizonian

Quote:


> Originally Posted by *dynn*
> 
> Comp specifications:
> CPU: 3770K OC to 4.4
> GPU: GTX 690 ASUS
> Motherboard: maximus V Formula
> PSU: corsair AX GOLD 850W
> SSD: SAMSUNG 830 256GB
> HDD: WD 1.5 T
> RAM: 8GB- 2400
> Cooler: H100
> Notes:
> CPU and Motherboard: well im new with all this stuff, im overcloacking CPU to 4.4 with H100, i really dont have idea if my motherboard is toasting or not (i think is fine).
> GPU: Im playing sc2 with 1820 res. with max graphics quality with 4v4 games, and looks like GPU is suffering (gets to 49C), i really dont know how to configure or use it to get it stable. But im very worried.
> Is there any programs that you all suggest to monitor everything, tests, benchamark, how to configure everything, stuff that are very usefull that i dont know? im very noob at all this.


Easier way to show your system. *"How to put your Rig in your Sig"*

*3DMark11* for GPU benchmarking. *GPU-Z* for specs. Since you have an ASUS 690, try *ASUS GPU Tweak* for your GPU overclocking utility, but technically you can use *EVGA Precision X*, which is my favorite.

On a side note - I don't put much weight on the Windows 8 System Performance Benchmark, but to get a 7.2 on a GTX 690.....

Guess one needs a quad 690 to score a top 9.9....


----------



## dynn

Is SpeedFan useful for monitoring the entire comp? Or is there a better program?


----------



## dynn

Quote:


> Originally Posted by *dynn*
> 
> is the speedfan usefull for monitoring entire comp? or theres another better program?


Here is my 3DMark bench:

http://www.3dmark.com/3dm11/5023565

Is it a good bench?


----------



## alienware

Here's my contribution to this thread.

Build log can be found here:

http://www.overclock.net/t/1298433/build-log-project-iamextreme


----------



## MrTOOSHORT

Pretty darn nice there buddy!


----------



## Arizonian

Quote:


> Originally Posted by *alienware*
> 
> Here's my contribution to this thread.
> 
> Build log can be found here:
> http://www.overclock.net/t/1298433/build-log-project-iamextreme


Sharp rig you got there, alienware; nice work. Got you included in the club list.

Edited to add: If I'm not mistaken you make *the 91st member in the club with the 100th GTX 690 for a total of $100,000 in cards*.


----------



## Falcon3

Hello Buzzkill - 2 questions if I may:
1. In your opinion, is Win 8 a better operating system than Win 7 for 690s in SLI?
2. First-time water cooler here - are you suggesting that before the water is filled in the res, one should install one 690 at a time... if I opt to go with Win 7 Pro?

Thanks for your help.


----------



## Kaapstad

One of the guys on the 3dmark11 thread asked for a pic of my GTX 690s



http://imageshack.us/photo/my-images/195/056fullsize.jpg/

I took this some time ago but nothing has changed.


----------



## Arizonian

Quote:


> Originally Posted by *Kaapstad*
> 
> One of the guys on the 3dmark11 thread asked for a pic of my GTX 690s
> 
> http://imageshack.us/photo/my-images/195/056fullsize.jpg/
> I took this some time ago but nothing has changed.


That was me, because I had to see the rig that took down the *Top 3DMark11* score on *OCN* for quad since the 7970's release.

In defense of those 7970s, they were on the original drivers. I'm sure your new score may spark a revisit with current drivers, but in the meantime we can bask in the glow of your 690s' rather impressive bench; you did the other 690 owners proud.

Nice job with your sleeving and color scheme btw. Looks great, and I hope you don't mind me adding you to the club list.

I normally would spoiler your pic, but your rig deserves a second look.

Welcome to OCN with an impressive showing.

- "How to put your Rig in your Sig"


----------



## Kaapstad

Thank you for the warm welcome

I would be very happy to join the GTX 690 club

Kaapstad


----------



## Buzzkill

Quote:


> Originally Posted by *Falcon3*
> 
> Hello Buzzkill - 2 questions if I may:
> 1. In your opinion, is Win 8 a better operating system than Win 7 for 690s in SLI?
> 2. First-time water cooler here - are you suggesting that before the water is filled in the res, one should install one 690 at a time... if I opt to go with Win 7 Pro?
> Thanks for your help.


I would install your operating system and get everything set up with drivers installed before you put blocks on the GPU (or GPUs), because it will be a lot more work to install one card, fill and bleed, and then add the second after you're set up. Windows 8 supported SLI 690s after Windows Update. I used the latest beta driver (310.61) for Black Ops 2. I had my OS installed and then I installed the GPU blocks. I was running one 690, then added the second one, reinstalled the driver, and went to the Nvidia Control Panel to enable quad SLI.

Windows 8 and 7 are about the same, but Windows 8 will have driver support.
How to Build a Quad SLI Gaming Rig
SLI MINIMUM SYSTEM REQUIREMENTS


----------



## saifbukhari

Here goes my Asus GTX 690... Just waiting on a decent PSU shortly, although the one I currently own is bearing the brunt without a hitch... I'm impressed.

Sent from my GT-N7100 using Tapatalk 2


----------



## Arizonian

Quote:


> Originally Posted by *saifbukhari*
> 
> Here goes my Asus GTX 690... Just waiting on a decent PSU shortly, although the one I currently own is bearing the brunt without a hitch... I'm impressed.
> 
> Sent from my GT-N7100 using Tapatalk 2


Very nice saifbukhari - welcome aboard.


----------



## saifbukhari

Thanks Arizonian.

Any inputs from your side regarding my setup? Would be glad to know.


----------



## Falcon3

Thanks mate, I think I will go with W7 Pro and do as you suggest: run on air, installing the cards one at a time, and if there are no issues, water cool.
I'm on holidays - can't wait to get back home and get stuck in; my FrozenQ res and fan controllers from Aqua Computer should have arrived by mail.
I see you have had issues with dyes - would you recommend the PrimoChill red coolant?


----------



## OmniScience

Hey all,

Recently popped in my OEM *GTX 690*, straight from Nvidia apparently :S (as per GPU-Z).

It's replacing my twin Gigabyte GTX 670s, as I plan to go dual 690s in time. Currently one 670 is running dedicated PhysX.

Now a question. I really like the EVGA version of the 690 due to the ability to control the LED; I don't get this feature with mine.
Is it possible to flash to another manufacturer's BIOS/firmware safely? How different are they really? I haven't had the opportunity to comb through all 200+ pages yet to see if this has been discussed.
Any help would be appreciated!


----------



## MrTOOSHORT

^^^^

All GTX 690s are the reference design, so any brand's BIOS will work on your OEM GTX 690. I flashed my card with a modded BIOS, and I find it's easy to do and safe imo.


----------



## OmniScience

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> ^^^^
> All the gtx 690s are reference design so any brand bios will work for your OEM gtx 690. I flashed my card with a modded bios and I find it's easy to do, safe imo.


How did you go about doing so safely? Nvflash? Which modded bios did you use? (advantages of yours or some others?)


----------



## MrTOOSHORT

Used nvflash in DOS mode with a usb stick. The bios I'm using didn't help much if any. It allows 150% power target compared to the stock 135%.

Only thing to worry about when flashing your card is if the power goes out, otherwise I feel it's very safe and easy to do.
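For anyone following along at home, the typical nvflash sequence from a DOS-bootable USB stick looks roughly like this. This is a sketch from memory, not gospel; run `nvflash --help` on your own copy to confirm the switches, and always save a backup of each BIOS first (the 690 has two GPUs, hence two images):

```text
REM List the adapters and their indices (a 690 shows two GPUs).
nvflash --list

REM Back up the existing BIOS of each GPU before touching anything.
nvflash --index=0 --save gpu0-backup.rom
nvflash --index=1 --save gpu1-backup.rom

REM Flash the new image to each GPU. The -6 switch overrides the
REM PCI subsystem ID mismatch check, which is what blocks flashing
REM another vendor's BIOS onto your card.
nvflash --index=0 -6 newbios0.rom
nvflash --index=1 -6 newbios1.rom
```

The filenames here are placeholders. As MrTOOSHORT says above, the main real risk is losing power mid-flash, so do it on a UPS if you're paranoid.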


----------



## gizmo83

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Used nvflash in DOS mode with a usb stick. The bios I'm using didn't help much if any. It allows 150% power target compared to the stock 135%.
> Only thing to worry about when flashing your card is if the power goes out, otherwise I feel it's very safe and easy to do.


I would like to install the most recent EVGA BIOS on my Gigabyte GTX 690. Can anyone give me a guide?


----------



## Methos07

I feel silly since I actually own a GTX 690, but I'm at work and can't test this out. Will 3 displays work with this card in non-surround? I remember when I had tri-SLI 470s I had to have a separate non-SLI GPU in order to run a third display. I dunno if this would be any different since it's a dual-GPU card, but SLI effectively disables the outputs on the slave GPU(s).

I was contemplating getting a few displays for a portrait-landscape-landscape setup. If anything I can just add a small GPU for the third display, but ideally I'd like to just use the one card.


----------



## OmniScience

Quote:


> Originally Posted by *Methos07*
> 
> I feel silly since I actually own a 690 gtx, but I'm at work and can't test this out. Will 3 displays work with this card in non-surround? I remember when I had tri-sli 470's I had to have another seperate non-sli gpu in order to run a third display. I dunno if this would be any different since it's a dual GPU card, but SLI effectively disables those outputs on the slave GPU(s).
> I was contemplating getting a few displays for a Portrait-Landscape-Landscape setup. If anything I can just add a small gpu for the third display, but ideally I'd like to just use the one card.


You should be able to; I had 3 displays running on it 2 days ago. But do you want to replicate the same image, or extend the desktop? I know for a fact my buddy runs 3 monitors on his Windows 8 machine with a 660 Ti.


----------



## OmniScience

So after doing some research, it would appear that EVGA is the only company updating their 690s.

And for the life of it, nvflash won't let me flash my NVIDIA 690 to EVGA's BIOS.

Zotac/Gigabyte have the same BIOS as the stock reference NVIDIA cards (the one I have).

Asus has a revision, updated slightly.

EVGA has the latest. Here are the two links for GPU 1 and GPU 2.

GPU1: http://www.techpowerup.com/vgabios/128173/EVGA.GTX690.2048.120430_1.html

GPU2: http://www.techpowerup.com/vgabios/127971/EVGA.GTX690.2048.120430.html


----------



## jassilamba

Quote:


> Originally Posted by *OmniScience*
> 
> So after doing some research. It would appear that EVGA seems to be the only company updating their 690's.
> And for the life of it NVflash wont let me flash my NVIDIA 690 to EVGA's bios
> Zotac/Gigabyte have the same BIOS as the Stock Reference NVIDIA cards (the one I have).
> Asus has a revision. Updated slightly.
> EVGA has the latest. Here are the two links for GPU 1 and GPU 2.
> GPU1: http://www.techpowerup.com/vgabios/128173/EVGA.GTX690.2048.120430_1.html
> GPU2: http://www.techpowerup.com/vgabios/127971/EVGA.GTX690.2048.120430.html


+1 for you sir.


----------



## zalbard

Quote:


> Originally Posted by *OmniScience*
> 
> EVGA has the latest. Here are the two links for GPU 1 and GPU 2.
> GPU1: http://www.techpowerup.com/vgabios/128173/EVGA.GTX690.2048.120430_1.html
> GPU2: http://www.techpowerup.com/vgabios/127971/EVGA.GTX690.2048.120430.html


These are all the same revision, not more recent; they were compiled on the same day as the others, just with different sub-revisions for different vendors.

The EVGA GTX 690 *Hydro Copper Signature*, however, runs a specially tweaked BIOS with higher clocks. It would be interesting to give it a try if someone owns one.


----------



## OmniScience

A few guys on another site have raised the power target on their cards, but it only seems applicable to EVGA and really doesn't do much for them overall.

Also, those were posted by a guy who got his emailed from EVGA; apparently they have a few fixes for issues people were having.

Now, is there any way to force a flash from my version to an EVGA BIOS? It keeps telling me my subsystem IDs are off. Same deal when I tried Zotac, Gigabyte, Galaxy, and EVGA. The only flashes that work are when I back up my own and reflash it to test.

My card came out of an Alienware machine running dual 690s, so I assume they get those as direct supply from Nvidia.

I'm interested in the LED controller for the EVGA cards.


----------



## alpout79

Help please...
I have a Gigabyte GTX 690.
I use 3ds Max with the Nitrous driver (because the OpenGL and Direct3D shaders are not as good as the Nitrous driver). My scene is 20 million polygons and I get 15-20 fps.
I checked my GPUs with GPU Temp and GPU-Z; only 1 GPU is working with 3ds Max. I'm really sick of it because I'm using my graphics card at 50% performance. I changed my Nvidia settings, and I talked with Nvidia; they told me to do the same things I had already done.
I am sure all my GPUs are working normally, because when I start V-Ray RT (real-time rendering) with 3ds Max, one GPU works to show my viewport and the other works for V-Ray RT. But I don't want that; I need both GPUs to work with 40 million polygons on my viewport screen.
Could you help me with this problem?
And sorry for my bad English typing.
Thanks for the help


----------



## pilla99

Someone call the UN

Just kidding. Make sure that you have SLI enabled... also, does 3ds Max utilize more than one GPU?


----------



## alpout79

lol, yeah dude, I am sure it's enabled. It's really annoying.

If it worked across all the GPUs, I could work with the Nitrous driver in 3ds Max, because the GTX 690 with its 3000+ CUDA cores is better than a Quadro; I mean according to the price/performance balance.


----------



## C0re0per4tive

Submitting the following to join the big spenders club, the *NVIDIA GTX 690* Owners Club.


----------



## jassilamba

Quote:


> Originally Posted by *C0re0per4tive*
> 
> Submitting the following to join the big spenders club, the *NVIDIA GTX 690* Owners Club.
> 
> http://www.overclock.net/content/type/61/id/1151751/ http://www.overclock.net/content/type/61/id/1151749/


Welcome to the club, and have the Adult Signature ready....


----------



## Arizonian

Quote:


> Originally Posted by *C0re0per4tive*
> 
> Submitting the following to join the big spenders club, the *NVIDIA GTX 690* Owners Club.
> 
> http://www.overclock.net/content/type/61/id/1151751/ http://www.overclock.net/content/type/61/id/1151749/


Welcome to the GTX 690 Owners Club.


----------



## rationalthinking

I have been looking everywhere for the SLI bridge that is posted in the OP.

Where can I get one of those for my 690s?


----------



## ceteris

Quote:


> Originally Posted by *rationalthinking*
> 
> I have been looking everywhere for the SLI bridge that is posted in the OP.
> Where can I get one of those for my 690s?


Sorry, bro. A lot of people earlier in this thread wanted to get those too, but that bridge never made it to market. I guess they made that one just for promotional purposes.


----------



## rationalthinking

Quote:


> Originally Posted by *ceteris*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rationalthinking*
> 
> I have been looking everywhere for the SLI bridge that is posted in the OP.
> Where can I get one of those for my 690s?
> 
> 
> 
> Sorry, bro. A lot of people earlier in this thread wanted to get those too, but that bridge never made it to market. I guess they made that one just for promotional purposes.

Wow what a major disappointment.


----------



## Kaapstad

Quote:


> Originally Posted by *rationalthinking*
> 
> Wow what a major disappointment.


Yes, for that SLI bridge they could have charged silly money and people would have paid it.

There must be a market for non-standard SLI or CrossFire bridges.

I hope some company is reading this with the same idea.


----------



## Rei86

Was bored so I finally got around to doing this


----------



## iARDAs

Are you happy with those coolers?


----------



## Rei86

Complicated answer incoming....

I should've water-cooled my whole system, but the thought of going to the next 2011 socket put a hold on it for me. Which doesn't answer your question.

At $169.99 MSRP it's as expensive as the best waterblocks for the 690. The only thing is, with waterblocks you'll also need the res, pump, tubes, barbs, rad, etc. So I didn't feel bad buying this.

My 1st 690 with IC Diamond was hitting a high of 69 on GPU 1 and 65 on GPU 2 when loaded; when idle it was 29/30 on GPU 1 and 27/28 on GPU 2. With the stock TIM my 690 was sitting around 70-75 load and 30-40 idle. With a custom fan profile the damn thing was audible once it hit 45% fan speed, which made it annoying to me. It was the loudest thing in my case besides the optical drives (but it's not like I have a disc in those all the time).

My 2nd 690 (the one pictured in post http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/2840#post_18738869 ) was hitting the same highs as my 1st 690, and the fans were audible once they hit 45% fan speed... Went ahead and put on the AC 690 Turbo and it's whisper silent. However, the temps aren't what I'd like. I used the tube of MX-4 that was provided with the cooler, and when I was testing it in 3DMark 11 I was hitting a high of 71 on GPU 1. Not what I was looking for, as I want to keep it below 70. Probably gonna re-TIM with IC Diamond or something else soon. And even with the fans turned up to 100 percent I can't hear it.

So for the money it's pretty good. Installation wasn't hard at all. Took all of 20 mins to take the GPU out of the case, take off the stock cooler, install the 690 Turbo, and run the test.


----------



## mxthunder

Quote:


> Originally Posted by *Rei86*
> 
> Was bored so I finally got around to doing this


Blasphemy!!

I bet those Nvidia engineers cringe when they see stuff like that.


----------



## Rei86

Quote:


> Originally Posted by *mxthunder*
> 
> Blasphemy!!
> 
> I bet those Nvidia engineers cringe when they see stuff like that.


Meh, it runs cooler and quieter.

If I was gonna WC my system it would've had a Heatkiller Hole Edition.


----------



## zalbard

Aren't you going to need a backplate? Doesn't the card sag and bend a lot due to cooler being in two pieces instead of one?


----------



## Arizonian

Quote:


> Originally Posted by *zalbard*
> 
> Aren't you going to need a backplate? Doesn't the card sag and bend a lot due to cooler being in two pieces instead of one?


I don't think the 690 has the same length/weight sagging issues as other cards of the same length, due to it having a motherboard-thick PCB. Only time will tell. When I didn't have the backplate I didn't foresee sag being an issue. I did put a backplate on, but more for protection of a single uber card, and for aesthetics.


----------



## Rei86

Quote:


> Originally Posted by *zalbard*
> 
> Aren't you going to need a backplate? Doesn't the card sag and bend a lot due to cooler being in two pieces instead of one?


They give you some foam sticks to support the card when it's installed inside the case, letting you support it from the bottom. However, I've noticed that if you have something old like a CM Stacker case it won't help, since it's too short. It works for my CM Cosmos II tho. For now I'm using a Powercolor Power Jack to support the card.

I have the EVGA GTX 690 backplate, but since you replace the screws when installing the heatsink I don't wanna mess with it.


----------



## Alex132

Gonna be joining this club soon









Gonna get a 690 in a few weeks, hopefully pick up an EVGA backplate too! I love backplates (totally my 5870's fault








)

Apart from uninstalling the ATI drivers / using driversweeper in safemode etc. Anything else I should know about?


----------



## jassilamba

Quote:


> Originally Posted by *Alex132*
> 
> Gonna be joining this club soon
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna get a 690 in a few weeks, hopefully pick up an EVGA backplate too! I love backplates (totally my 5870's fault
> 
> 
> 
> 
> 
> 
> 
> )
> Apart from uninstalling the ATI drivers / using driversweeper in safemode etc. Anything else I should know about?


Don't know about anyone else in here. I had to RMA my 690 as Windows would not recognize it at all; I think it was code 43. After 2 RMAs, I ended up re-installing Windows. EVGA does not want ppl re-installing Windows, so they will do whatever they can to figure it out. I hope I was the only one who had that issue.


----------



## Arizonian

Quote:


> Originally Posted by *Alex132*
> 
> Gonna be joining this club soon
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna get a 690 in a few weeks, hopefully pick up an EVGA backplate too! I love backplates (totally my 5870's fault
> 
> 
> 
> 
> 
> 
> 
> )
> Apart from uninstalling the ATI drivers / using driversweeper in safemode etc. Anything else I should know about?


Pretty straightforward installation of drivers. The only thing I suggest is that anytime you install drivers, always use the 'clean install' option; it removes previous drivers before installing new ones. If you stick with WHQL only you'll never have a problem. Obviously beta drivers come with their own share of issues at times.

Post pics back when you get her installed.


----------



## ceteris

Quote:


> Originally Posted by *jassilamba*
> 
> Dont know about any other person in here. I had to RMA my 690 as windows would not recognize it at all, I think it was code 43. After 2 RMAs, I ended up re-installing windows. EVGA does not want ppl re-installing windows so they will do what every they can to figure it out. I hope I was the only one who had that issue.


That sucks... Out of curiosity, did you have to reinstall the stock cooler before sending it back? Or did you just need to include it in the package going back to EVGA?


----------



## jassilamba

Quote:


> Originally Posted by *ceteris*
> 
> That sucks... Out of curiosity, did you have to reinstall the stock cooler before sending it back? Or did you just need to include it in the package going back to EVGA?


I could never get the card to work out of the box and that was before I water cooled it. They just asked me to send nothing but the card.


----------



## Rei86

Quote:


> Originally Posted by *jassilamba*
> 
> I could never get the card to work out of the box and that was before I water cooled it. They just asked me to send nothing but the card.


When I RMAed mine that's all they asked for. Clean the card, package it up, have the RMA number on the outside of the box, and include no accessories as you won't get them back.

Hope you can get this worked out, man.


----------



## fat_italian_stallion

Just got mine back from EVGA's RMA today. Running like a charm. Gonna fold on it for a few days to see if everything is OK, then waterblock it.


----------



## Arizonian

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> Just got mine back from evga's Rma today. Running like a charm. Gonna fold for a few days on it to see if everything is ok then waterblock it


Good to hear bud.









Out of curiosity, was it refurbished or new?


----------



## fat_italian_stallion

Brand spankin' new, thankfully. EVGA seems to be good with that, even when I sent back a 480 last year.


----------



## Arizonian

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> Brand spankin new thankfully. Evga seems to be good with that, even when I sent back a 480 last year


Sweet.


----------



## Arm3nian

Max voltage, 21C idle, 35C full load after an hour on both cores, on a hot day. Can OC more, but too lazy.


----------



## jassilamba

Quote:


> Originally Posted by *Arm3nian*
> 
> Max voltage, 21c idle, 35c full load after an hour on both cores, on a hot day. Can oc more but too lazy.


That is impressive.


----------



## rationalthinking

Quote:


> Arm3nian


Wow, nice temps!


----------



## Alex132

So what's an average overclock with the 690? And what is the 'safe' limit for overvolting?


----------



## Arm3nian

Quote:


> Originally Posted by *Alex132*
> 
> So what's an average overclock with the 690? And can what is the 'safe' limit for overvolting?


I would say +120-150 is in the range that most can get, about +180 or more on the core with high voltage if you have a nice card; on the memory clock I've seen up to +500, I think. The max safe voltage is the highest voltage the default BIOS gives you; you can go higher by flashing another BIOS. I've done +220 on the core and +700 on the memory with overvolting, but I don't want to run that 24/7, even with really low temps.


----------



## Alex132

Quote:


> Originally Posted by *Arm3nian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> So what's an average overclock with the 690? And can what is the 'safe' limit for overvolting?
> 
> 
> 
> I would say 120-150 is in the range that most can get, about 180+ on core with high voltage if you have a nice card, memory clock I've seen up to 500 I think. The max safe voltage is the highest voltage the default bios gives you, you can go more by flashing another bios, i've done 220 on the core and 700 on the memory with overvolting but I don't want to run that 24/7, even with really low temps.

I think I'll probably just stick with overclocking it to roughly 1050MHz for 24/7, so at least it's equal to two 680s.


----------



## max883

Can I have your GTX 690 BIOS, Arm3nian?


----------



## rationalthinking

Quote:


> Originally Posted by *Arm3nian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> So what's an average overclock with the 690? And can what is the 'safe' limit for overvolting?
> 
> 
> 
> I would say 120-150 is in the range that most can get, about 180+ on core with high voltage if you have a nice card, memory clock I've seen up to 500 I think. The max safe voltage is the highest voltage the default bios gives you, you can go more by flashing another bios, i've done 220 on the core and 700 on the memory with overvolting but I don't want to run that 24/7, even with really low temps.

The memory clock I could push to +750/800 without a problem; it has always been the core clock that held me to +125-150.


----------



## chabze

Help, I'm a bit confused here. OK, here goes. I recently bought a new Asus GTX 690 and installed it in my machine to replace my HD 7970. The confusing part is that it performs far, far worse than the HD 7970 over NVIDIA Surround at 5760 x 1080. I'm getting almost unplayable frame rates in BF3, MoH Warfighter and other games. I have tried numerous drivers, including beta drivers, but no joy. I tested the card with various benchmarking software and it performs as it should. Any ideas? Just to clarify, my rig is running a 3770K and yes, a decent PSU. It's almost a new build.
Not sure if I should just go back to the HD 7970; at least I was getting high frame rates on maxed out settings with that.
Just feel like I've wasted £800


----------



## iARDAs

Quote:


> Originally Posted by *chabze*
> 
> Help I'm a bit confused here. Ok here goes. I have recently bought a new asus gtx 690 and installed it in my machine to replace my hd7970, confusing part is it performs far far worse than the hd7970 over nVIDIA surround at 5880 x 1080. I'm getting almost unplayable frame rates on bf3 and moh warfighter and other games. I have tried numerous drivers including beta drivers but no joy. Tested the card on various benchmarking software and it performs as it should. Any ideas? Just to clarify my rig is Running a 3770k and yes a descent psu. It's a new build almost.
> Not sure if I should just go back to the hd7970 at least I was getting high frame rates on maxed out settings on that.
> Just feel like I've wasted £800


Hey buddy

I believe you might be suffering from a lack of VRAM. For 1080p Surround setups, at least 3 or 4GB of VRAM is recommended for modern games.

The 690 has 2GB of VRAM per GPU, which does not stack, but it's incorrectly advertised as 4GB of VRAM.

So basically the 690 is a 2GB VRAM GPU, not 4GB.

Even at 2560x1440 I can see my VRAM usage go above 2000MB, but luckily I have a 4GB GPU.
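To put rough numbers on the framebuffer side of this, here's a back-of-the-envelope sketch (Python, purely illustrative; it's texture and render-target data at high resolutions, not the raw framebuffer, that actually fills the 2GB):

```python
# Raw size of a single 32-bit framebuffer at triple-1080p Surround resolution.
width, height = 5760, 1080
bytes_per_pixel = 4  # 32-bit RGBA

buffer_mb = width * height * bytes_per_pixel / 1024**2
print(round(buffer_mb, 1))  # 23.7 MB - tiny next to 2048 MB of VRAM, so it's
                            # everything else (textures, AA buffers, render
                            # targets) that pushes past the 2GB limit
```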


----------



## chabze

Hmm, my 2GB HD 6970 pulls better (playable) frame rates tho :-(


----------



## fat_italian_stallion

Welp 2nd 690 off for RMA. Apparently they don't take too kindly to folding 24/7. EVGA must hate me.

Also, in regards to the above comments, VRAM doesn't always matter. I use my monitors for things other than games, so the extra VRAM offered by other cards is moot; I just need the GPU itself for number crunching.


----------



## mxthunder

What happened? Did it blow a VRM or anything?


----------



## fat_italian_stallion

Quote:


> Originally Posted by *mxthunder*
> 
> What happend? Did it blow a VRM or anything?


Funky lines across the screen after a few hours of folding.


----------



## Coldmud

Please explain overvolting on the GTX 690. I've tried to find any successful voltmod for the 690, but to no avail. One guy named TiN on kingpincooling.com succeeded in doing a voltmod to the VRM, but he's an elitist bastard and only helps LN2 guys... Does anyone know how to give this puppy more than 1.175V? The 150% power target BIOS is useless; it still throttles the core, just now at 98% instead of 133% power percentage.


----------



## dpoverlord

I have a GTX 690 and 2 monitors. I am looking at running SLI but am wondering if there is a way to keep the 2nd monitor running on the side while being in SLI. I want the second monitor so I can exit in and out and do work when I need to. Any ideas? I have a spare GTX 460 but I am not sure if that would make any sense.

If not, would it have been better to have just purchased two 680s, and would that allow me to utilize both monitors in SLI?


----------



## Shogon

Quote:


> Originally Posted by *dpoverlord*
> 
> I have a GTX 690 and 2 monitors. I am looking at running SLI but am wondering if there is a way to keep the 2nd monitor running on the side while being in SLI. I want the second monitor so I can exit in and out and do work when I need to. Any ideas? I have a spare GTX 460 but I am not sure if that would make any sense.
> If not would it have been better to have just purchased to 680's, and would this allow me to utilize both monitors in SLI?




I'm currently running 2 monitors on my 690, an Asus VG278HE and a U2711, and I am in SLI. Hook up your monitors to the green ports shown in the image. Someone posted this earlier, luckily, or else I wouldn't have been able to find it.


----------



## KrynnTech

What do i need to post to join? I have 2 GTX 690s in SLI...


----------



## Alex132

Quote:


> Originally Posted by *KrynnTech*
> 
> What do i need to post to join? I have 2 GTX 690s in SLI...


Read the OP.
Quote:


> To be added to the members list, please post the following:
> 
> A) Proof of card ownership (picture, etc.)
> B) Brand of card


----------



## MrTOOSHORT

Well, I tried the 150% power limit BIOS, and the new KGB tool to edit your current BIOS. Neither does anything for the GTX 690.

The GTX 690 BIOS is a weird one for the pros to figure out.

I'm stuck at 1180MHz, oh well.


----------



## Kaapstad

I have never used any mods for my cards, but I have found the clock speeds a bit of a lottery. I have 3 GTX 690s:

The first is good for about 1130MHz.

The second around 1170-1180MHz.

The third, on a good day, 1180-1200MHz, and it will boost to about 1260MHz.

On another forum I post on, one of the guys has a GTX 690 that gives performance figures very close to my best card but with much lower clocks. The only difference is he runs his on an X58 system.


----------



## PinzaC55

Hi everybody, this is my first post on Overclockers. I took delivery of my GTX 690 on Tuesday. It is an OEM card obtained from a dealer on eBay at a considerable saving over a consumer card. One thing I am curious about is that it came with a stabilising bracket fixed on the back; I haven't seen this actually used in any photos of 690s. Initially I thought it wouldn't fit in my HAF-X case, but it fits perfectly and just sits on top of the HDD cage. The card itself is running as sweet as a nut, and one of my most impressive results was when I ran the Unigine Heaven benchmark: my "old" card (XFX Radeon 7950) got a score of 35 FPS with all settings on maximum, but the 690 gets 86!! Nearly 2.5 times the score.
Photos of my build are here on my Flickr pages: http://www.flickr.com/photos/pinzac55/



And fresh out of the box, showing the bracket-


----------



## MrTOOSHORT

^^^

Welcome to the club and to OCN for that matter!









I'm not sure what that piece is at the end of your card, but there is no doubt that you could just remove it.

Enjoy your new GTX 690, it's a beast!


----------



## rationalthinking

Quote:


> Originally Posted by *Kaapstad*
> 
> I have never used any mods for my cards but have found the clock speeds a bit of a lottery I have 3 GTX 690s
> 
> The first is good for about 1130mhz
> 
> The second around 1170-1180mhz
> 
> The third on a good day 1180 - 1200mhz and will boost to about 1260mhz


Your weakest card is still better than my card. Mine is extremely stable with clocks under 1050MHz, but it does not go a hair over that.


----------



## PinzaC55

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> ^^^
> Welcome to the club and to OCN for that matter!
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not sure what that peice is at the end of your card, but there is no doubt that you could just remove it.
> Enjoy your new gtx 690, it's a beast!


Thanks for the welcome! I wasn't sure what to class the "brand" of card as, but I would simply say "Nvidia". The bracket actually shares the weight of the card so it doesn't all go onto the PCI-E slot, which is rather a neat feature I think, and it doesn't get in the way of any cables. It fixes onto the back of the card with 3 screws; I know other cards have the screw holes. Like I said, I use the HAF-X, and what I can't understand is how Nvidia knew the EXACT height it needed to be to sit on the HDD cage. Here's a flash shot of how it looks in practice. I may use the hot swap bays of the HAF-X for my two conventional hard drives shortly, in which case it might interfere with the cables, but for now it is doing a good job. Also note it has a slight curve at the inside end to make it easier to manoeuvre; they thought of everything! I guess that's what happens when you buy a Ferrari instead of a Skoda


----------



## Arizonian

Quote:


> Originally Posted by *PinzaC55*
> 
> Hi everybody, this is my first post on Overclockers. I took delivery of my GTX 690 on Tuesday. It is an OEM card obtained from a dealer on ebay for a considerable saving on a consumer card. One thing I am curious about is that it came with a stabilising bracket fixed on the back - I haven't seen this actually used on any photo's of 690's? Initially I thought it wouldn't fit in my HAF-X case but it fits perfectly and just sits on top of the HDD cage. The card itself is running as sweet as a nut and one of my most impressive results was when I ran Unigine Heaven benchmark - my "old" card (XFX Radeon 7950) got a score of 35 FPS with all setting on maximum but the 690 gets 86!! Nearly 2.5 times the score.
> Photo's of my build here on my flickr pages http://www.flickr.com/photos/pinzac55/
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> And fresh out of the box, showing the bracket-
> [


As said welcome aboard both OCN and the 690 Club. Congrats.


----------



## jassilamba

Quote:


> Originally Posted by *PinzaC55*
> 
> Thanks for the welcome! I wasn't sure what to class the "brand" of card as but I would simply say "Nvidia". The bracket actually shares the weight of the card so it doesn't all go onto the PCI-E slot - rather a neat feature I think and it doesn't get in the way of any cables. It fixes onto the back of the card by 3 screws - I know other cards have the screw holes. Like I said I use the HAF-X and what I can't understand is how Nvidia knew the EXACT height it needed to be to sit on the HDD cage. Here's a flash shot of how it looks in practice. I may use the hot swap bays of the HAF-X for my two conventional hard drives shortly in which case it might interfere with the cables but for it is doing a good job.Also note it has a slight curve at the inside end to make it easier to manoeuvre - they thought of everything! I guess that's what happens when you buy a Ferrari instead of a Skoda
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


I'm doing something similar in my rig, but I'm using the bracket that holds my Bitspower res as a rest to support the card. Since I have a Heatkiller block, those suckers are heavy.


----------



## PinzaC55

Here's the thing though: when I got the card I searched the web for "GTX 690 Support Bracket" and the only thing I could find was the EVGA backplate. No mention of this as a separate item, yet the screw holes show it was obviously intended to be there.


----------



## chabze

I'm a bit disappointed with my GTX 690 :-(
Installed the new beta drivers last night and still can't play BF3 smoothly maxed out on ultra settings at 5760 x 1080.
Yet my single HD 7970 can with no problems.
Some people say it's down to the amount of VRAM on the card; I'm thinking it's down to the poor design of Nvidia Surround using too much VRAM.
Eyefinity uses way less.
Any ideas? A lot of money spent for less performance just doesn't add up ;-(


----------



## Alex132

Quote:


> Originally Posted by *chabze*
> 
> I'm a bit disappointed with my gtx 690 :-(
> Installed the new beta drivers last night and still can't play bf3 smoothly maxed out on ultra settings on 5760 x 1080
> Yet my single HD7970 can with no problems.
> Some people say its down to the amount of VRAM on the card, I'm thinking it's down to the poor design of nvidia surround using to much VRAM over surround.
> Eyefinity uses way less.
> Any ideas? A lot of money spent for less performance just don't add up ;-(


Uh.

It totally makes sense. I use all 1GB of the VRAM on my 5870 when playing on high/medium at 1920x1080.
Now you're using 3 times that horizontal resolution but only double the VRAM.
1920x1080 = 2,073,600 pixels
5760x1080 = 6,220,800 pixels

So 3 times as many pixels, with only 2 times the VRAM. The 7970 has 3GB of VRAM.

Run MSI Afterburner or some other utility (maybe GPU-Z) that lets you monitor VRAM usage. I'm not sure what else does (as I don't have a 690 yet), but I know on my 5870 I can monitor VRAM usage with GPU-Z.

I would still say the 690 isn't really meant for triple monitor gaming, rather 1440p while powering other monitors.
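For anyone double-checking the pixel math in the post above, it works out like this (a quick Python sketch; resolutions taken straight from the post):

```python
# Pixel counts for a single 1080p screen vs. a 3x1080p Surround setup.
single = 1920 * 1080      # one monitor
surround = 5760 * 1080    # three monitors side by side

print(single)             # 2073600
print(surround)           # 6220800
print(surround / single)  # 3.0 - triple the pixels, but the 690 only
                          # offers double the usable VRAM of a 1GB card
```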


----------



## Arm3nian

Quote:


> Originally Posted by *chabze*
> 
> I'm a bit disappointed with my gtx 690 :-(
> Installed the new beta drivers last night and still can't play bf3 smoothly maxed out on ultra settings on 5760 x 1080
> Yet my single HD7970 can with no problems.
> Some people say its down to the amount of VRAM on the card, I'm thinking it's down to the poor design of nvidia surround using to much VRAM over surround.
> Eyefinity uses way less.
> Any ideas? A lot of money spent for less performance just don't add up ;-(


You should have read the posts in here saying the GTX 690 is for 2560x1600 or lower. It destroys any game at that res; the VRAM is the ONLY thing limiting it at higher resolutions, and that is due to Nvidia's poor VRAM decision on the 690, not the way Surround works.


----------



## TechAnalyst

Looking for opinions on 690s in SLI.

I've had a pair for a few months, but didn't want to go through taking apart my quad 7970s that are waterblocked.

Do 690s in SLI work well with Surround?


----------



## Qu1ckset

Quote:


> Originally Posted by *TechAnalyst*
> 
> Looking for opinions on the 690s in SLI
> Ive had a pair for a few months but didn't want to go through taking my 7970 quads that are WB'd apart
> Does the 690s in SLI work well with surround?


SLI 690s are a waste of time for Surround... your 7970s are a way better setup; 2GB of VRAM is not enough for Surround.


----------



## emett

Quote:


> Originally Posted by *chabze*
> 
> Installed the new beta drivers last night and still can't play bf3 smoothly maxed out on ultra settings on 5760 x 1080
> Yet my single HD7970 can with no problems.


I'm calling bull**** on this one. Sure, a CrossFire 7970 setup will beat a 690, but a single 7970 playing BF3 at 5760x1080 on ultra smoothly is just **** talk. At a guess you'd average 30 to 40 fps max.
Not so good for an online multiplayer game.


----------



## TechAnalyst

Quote:


> Originally Posted by *Qu1ckset*
> 
> sli 690s is a waste of time for surround... your 7970s is a way better setup, 2gb vram is not enough for surround


Yeah, my 7970s in quadfire were pretty much a waste of time. I've easily lost months of game time because of AMD and their s**t drivers. AMD has hit me twice in a row: I had a pair of 6990s which were junk for Surround, and my quads only just started working with the Performance Beta Driver, but other benchies make no sense:

1.) Heaven: the score is no different in Tri or Quad
2.) The score hits about 16k on average in 3DMark11

I have so much frustration with this setup that I dumped over 20k on


----------



## chabze

@emett

I never said I was getting super fps. All I'm saying is that my single 7970 performs better than my GTX 690 over 3 monitors at 5760 x 1080.
BF3 is smooth enough to be playable on ultra, whereas the GTX 690 ain't playable at that res on ultra.


----------



## Qu1ckset

Quote:


> Originally Posted by *TechAnalyst*
> 
> Yeah my 7970s in quadfire were pretty much a waste of time, Ive easily lost months of game time because of AMD and their s**t drivers, AMD has hit me twice in a row, I had a pair of 6990s which were junk for surround and my quads that just started working with the Performance Beta Driver, but other benchies make no sense:
> 1.) Heaven: score is no different in Tri or Quad
> 2.) Score hits about 16k on average in 3dmark11
> I have so much frustration with me setup that I dumped over 20k on


If you look at benches, 4-way SLI/CFX isn't worth the performance boost for the price of the fourth card, because most times it only adds like 11-20% on top of 3-way SLI/CFX.

And yeah, I had an HD 6990 with 3x 1080p and I had BSODs all the time in FPS games. I've now switched to a single 1440p monitor and my GTX 690, and I'm more than happy.

But yeah, I'd sell your fourth card while prices on used 7970s are decent, and just save your money or use it to upgrade something else on your rig lol


----------



## Qu1ckset

Have any of you played Far Cry 3 at 1440p with your GTX 690? How many fps are you getting? My sig rig ain't built yet, so I have to wait a few weeks before I can try for myself; just curious.


----------



## TechAnalyst

Quote:


> Originally Posted by *Qu1ckset*
> 
> If you look at benches 4way sli/cfx isn't worth the performance boost/price of the forth card because most times it only adds like 11-20% on top of 3way sli/cfx
> And ya I had a hd6990 with 3x 1080p and i had bsod all the time in fps, I've now switched to a single 1440p monitor and my gtx690 and I'm more then happy.
> But ya I'd sell your forth card well prices on used 7970s are decent and just save your money or use it to upgrade some thing else on your rig lol


I've already had my 690s for months, I just didn't want to tear everything apart.







Money isn't exactly a concern, and my old 7970s will be given away; I don't like selling things either.

My question is: do 690s in quad SLI work awesome for Surround gaming now, all screens at 120Hz etc.?


----------



## ceteris

Quote:


> Originally Posted by *TechAnalyst*
> 
> Ive already had my 690s for months just didn't want to tear it everything apart
> 
> 
> 
> 
> 
> 
> 
> money isn't exactly a concern, and my old 7970s will be given away, I don't like selling things either
> My question is does the 690s in quad work awesome for surround gaming now, all screens at 120hz etc


A couple of people posted here a while ago saying no, and went with dual SLI 670s or 680 4GBs. You'd have to go back a ways to read up on the specifics.
Quote:


> Originally Posted by *Qu1ckset*
> 
> Have any of your played farcry 3 on 1440p with your gtx690, how many fps are you getting? my sig rig aint built yet so i have to wait a few weeks before i can try for myself, just curious


I still need to pick up that game!! I'm still recovering from the Steam Autumn Sale and have been playing XCOM, Shogun 2, occasional BO2 multiplayer and a couple of others. I haven't even booted up Assassin's Creed III


----------



## PinzaC55

Quote:


> Originally Posted by *Qu1ckset*
> 
> Have any of your played farcry 3 on 1440p with your gtx690, how many fps are you getting? my sig rig aint built yet so i have to wait a few weeks before i can try for myself, just curious


Just bought Far Cry 3 from an eBay dealer (it is sold out on Amazon UK) and I will report back as soon as I start playing it. It is supposed to be very demanding, and TimeToLiveCustoms on YouTube says he intends to use it as his future benchmark.


----------



## Shogon

So how far are some of you pushing your 690s? I'm surprised I haven't crashed or hard locked yet with +400 on the memory and +125 on the core. Loving winter though, 30-32C load in BF3 or Shogun 2


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Shogon*
> 
> So how far are some of you pushing your 690s? I'm surprised I haven't crashed or hard locked yet with +400 on the memory and +125 on the core. Loving winter though, 30-32C load in BF3 or Shogun 2


+150(1180MHz boost) and +700 memory are my 24/7 clocks.


----------



## Geneside

Hi guys

New to the forums and whatnot, so I'm not sure about posting regulations. Anyhow, I wonder if any of you can help me. I built a new gaming rig about 2 months ago with the GTX 690, of course, but this morning, mid-gaming, the PC just crashed. I've spent hours trying to find the problem and finally realized it's the 690. I read somewhere that there is a problem with some 690s, a bad batch or something; any truth to this as far as you guys know?

Or do you have any idea why the PC works fine for my animation and video editing, but as soon as I launch a game it freezes and gives me a blank screen?


----------



## iARDAs

First things first

Are you OCing, and which drivers are you using currently?


----------



## Geneside

Sorry, still a noob.

OCing? And I'm running the latest drivers, version 306.97.


----------



## iARDAs

I also used to be a noob; still one in some areas.

I meant did you Overclock your card?

Also give the 310.70 beta drivers a shot. Sometimes drivers can create such issues.


----------



## TechAnalyst

Let me rephrase my question.









I already have quadfire 7970s, all waterblocked, but due to the issues I've had over the months I also bought a pair of GTX 690s. My question is: which set would give me a better experience? So please don't tell me to go with 670s etc.







I already own the cards

http://steamcommunity.com/id/techanalyst <-- system specs


----------



## KaRLiToS

Quote:


> Originally Posted by *TechAnalyst*
> 
> Rephrase my question
> 
> 
> 
> 
> 
> 
> 
> 
> I already have Quadfire 7970s all Waterblocked, but due to the issues I've had over the months I also bought a pair of 690 GTXs, my question is which set would give me a better experience? So please don't tell me to go with 670s etc
> 
> 
> 
> 
> 
> 
> 
> I already own the cards
> http://steamcommunity.com/id/techanalyst <-- system specs


What issues did you have?


----------



## Geneside

OK iARDAs,

I tried that and it still didn't work. I think the card might be faulty, since it worked fine this morning and then suddenly just started giving problems.

And no, I did no overclocking whatsoever.


----------



## Kaapstad

Quote:


> Originally Posted by *TechAnalyst*
> 
> Rephrase my question
> 
> 
> 
> 
> 
> 
> 
> 
> I already have Quadfire 7970s all Waterblocked, but due to the issues I've had over the months I also bought a pair of 690 GTXs, my question is which set would give me a better experience? So please don't tell me to go with 670s etc
> 
> 
> 
> 
> 
> 
> 
> I already own the cards
> http://steamcommunity.com/id/techanalyst <-- system specs


If you use a setup up to 1600p, use the GTX 690s.

Over 1600p, use the HD 7970s.

I just checked your signature: you are using 3 monitors, so use the HD 7970s.

The GTX 690s suffer from a lack of VRAM at those resolutions.


----------



## Qu1ckset

Quote:


> Originally Posted by *Kaapstad*
> 
> If you use a setup up to 1600p use the GTX 690s
> Over 1600p use the HD 7970s
> Just checked your signature you are using 3 monitors so use the HD 7970s.
> The GTX 690s suffer from lack of vram at those resolutions.


This


----------



## Geneside

One question: I opened my PC today to take a look at what happens physically when it crashes (since I can't really run software to check, as I always need to do a hard boot when it does crash).


----------



## Arm3nian

18c idle









My delta is less than 0.01C; I'm amazed at how well I applied the TIM.


----------



## benjaminlev09

Anyone else having issues with the latest betas? I'm running an OC'd 690 at +135 +130 +400, and with 310.54, 310.61 and 310.64 I experienced an event 4101 "display driver stopped working and successfully recovered" while just browsing on the desktop. Now with 310.70 that problem is gone, but it happens in games instead: after 2 hours playing BL2, event 4101 again.

Anyone else experiencing it?


----------



## max883

Go down to +135 +90 +200 and it should be OK. It's the new drivers!


----------



## Stay Puft

Quote:


> Originally Posted by *benjaminlev09*
> 
> anyone else having issues with the latest BETAs? im runing an OC 690 +135 +130 +400 , and with 310.54 ,310.61 and 310.64 i experienced an event 4101 "display drivers stopped working and successfully recovered" while just browsing in desktop work, and now with 310.70 this problem gone but now it happens in games, happend after 2 hours playing BL2 event 4101 again,
> anyone else experiencing it?


Last 3 betas have been horrible. I'm still on 306's


----------



## Arizonian

Quote:


> Originally Posted by *benjaminlev09*
> 
> anyone else having issues with the latest BETAs? im runing an OC 690 +135 +130 +400 , and with 310.54 ,310.61 and 310.64 i experienced an event 4101 "display drivers stopped working and successfully recovered" while just browsing in desktop work, and now with 310.70 this problem gone but now it happens in games, happend after 2 hours playing BL2 event 4101 again,
> anyone else experiencing it?


I've been running 310.70 since release date. I had "display drivers stopped working and successfully recovered" happen during internet browsing on desktop while using 310.70 WHQL 'Candidate'.

I assumed that since I had just re-applied a memory OC the day prior, it may have been too high, even though that memory OC had never had issues before.

Since the drivers stopped responding while I was running a newly applied memory OC, and my core OC had been stable for months, I've decided to revert to stock memory but keep my core OC. I'm giving this a test to see if it happens again. I'm leaning toward it being nothing other than a driver issue to be ironed out before the final WHQL release.

I reapplied the 310.70 driver, and since the re-install a few days ago it hasn't happened again, but it's too soon to tell.

So 310.70 failed in-game for you, and during normal desktop browsing for me, both with overclocks. Curious whether anyone has had the drivers stop responding at stock clocks using 310.70?

As for everything else, this driver has given me a gaming performance increase in the games I play and fixed my GPU not being able to drop to 2D clocks after gaming. So I'm sticking with it; the display driver failing once is not enough for me to switch back to the previous WHQL.

So, benjaminlev09, congrats on your GTX 690.







Please join us, we'd love to see a pic of your 690 in your rig.


----------



## D4rkThanatoS

Hello to all members of this prestigious forum. I'm from Lima, Peru. This time I present my EVGA GTX 690 setup; soon I'll buy another card for SLI.

Greetings.


----------



## Arizonian

Quote:


> Originally Posted by *D4rkThanatoS*
> 
> Hello to all members of this prestigious forum. I'm from Lima, Peru. This time I present my EVGA GTX 690 setup; soon I'll buy another card for SLI.
> Greetings.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats on your GTX 690.







Welcome aboard D4rkThanatoS to OCN and the club.







Very nice rig.










*"How to put your Rig in your Sig"*


----------



## ceteris

I've been fine with the latest beta other than AC3 not matching clock speeds. I've been primarily on stock settings, though. I usually only OC when I run benchmarks or when a game is spec-intensive. I haven't gotten it yet, but I read that FC3 might require me to load up some OC profiles when I start playing it soon.


----------



## PinzaC55

Quote:


> Originally Posted by *D4rkThanatoS*
> 
> Hello to all members of this prestigious forum. I'm from Lima, Peru. This time I present my EVGA GTX 690 setup; soon I'll buy another card for SLI.
> Greetings.


Superb looking rig there!


----------



## Arm3nian

BF3 is just a broken game overall. I can't even start the game with any overclock, only stock. My normal overclock passes Heaven every time, which is more demanding than BF3. BTW, what do you guys get in Heaven? I always seem to be under 60fps with Heaven maxed out, running 1440p.

This is with the new beta drivers.


----------



## tsm106

Heaven is not that demanding. BF3 is a tougher stability benchmark by far.


----------



## Arm3nian

Quote:


> Originally Posted by *tsm106*
> 
> Heaven is not that demanding. BF3 is a tougher stability benchmark by far.


BF3 is a terrible game to stress test with. It doesn't even get my card up to 70% load. Not only that, but it doesn't actually work half the time.


----------



## ceteris

Quote:


> Originally Posted by *Arm3nian*
> 
> Bf3 is a terrible game to stress test. It doesn't even get my card up to 70% load. Not only that but it doesn't actually work half the time.


That was my problem too, until I updated my sound card driver. Hard to believe it was that, but I can pretty much do anything with the game now. Except find other players who aren't idiots to help me finish the last two co-op missions.


----------



## tsm106

Quote:


> Originally Posted by *Arm3nian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Heaven is not that demanding. BF3 is a tougher stability benchmark by far.
> 
> 
> 
> Bf3 is a terrible game to stress test. It doesn't even get my card up to 70% load. Not only that but it doesn't actually work half the time.
Click to expand...

It's not unusual to have BF3 issues with the latest beta driver, you know? People are complaining of BSODs. That said, I'm not surprised you have scaling issues. I've read about it and been PM'd by other 690 owners. Nvidia has some work left to do on the scaling front.


----------



## Arm3nian

Quote:


> Originally Posted by *tsm106*
> 
> It's not unusual to have BF3 issues with the latest beta driver, you know? People are complaining of BSODs. That said, I'm not surprised you have scaling issues. I've read about it and been PM'd by other 690 owners. Nvidia has some work left to do on the scaling front.


Obviously it's the beta driver. But even with working drivers, BF3 has many issues. And the scaling issue is because I get 60fps; the card doesn't need all of its power to run BF3. When I had quad SLI, BF3 would work better with a single 690 because BF3 can't utilize all four GPUs very well; Heaven, on the other hand, can. I'm not saying Heaven is the best benchmark, but it does stress cards more than BF3 does.

Quote:


> Originally Posted by *ceteris*
> 
> That was my problem too, until I updated my sound card driver. Hard to believe it was that, but I can pretty much do anything with the game now. Except find other players who aren't idiots to help me finish the last 2 Co-Ops's.


Lol, yeah, I have the latest drivers for my sound card. Anyway, I like MOH: Warfighter online a bit better than BF3 from a team-deathmatch point of view, although BF3 does look a bit better, of course has vehicles, and is overall more polished.


----------



## jassilamba

Quote:


> Originally Posted by *Arm3nian*
> 
> Bf3 is a terrible game to stress test. It doesn't even get my card up to 70% load. Not only that but it doesn't actually work half the time.


With Metro 2033 and everything maxed out at 1440p, my single 690 has both GPUs at almost 99% load.


Spoiler: Warning: Spoiler!


----------



## ceteris

Metro 2033 prefers AMD GPUs. Same with my current favorite game, Shogun 2.


----------



## Qu1ckset

How is Shogun 2, and what's the learning curve like?


----------



## jassilamba

Quote:


> Originally Posted by *ceteris*
> 
> Metro 2033 prefers AMD GPU's. Same with my current favorite game, Shogun 2


Sad, given that it says "Nvidia" when you start the game.


----------



## ceteris

Quote:


> Originally Posted by *Qu1ckset*
> 
> How is shogun 2 and how is the learning curve for the game?


It's OK. It's just roshambo with the units plus terrain advantages (uphill advantages, ambushing from forests, etc.). I love the Total War series and other turn-based war sims. The only thing I don't like is the loading of battles and saved games. I'm hoping that when I RAID 0 whatever two SSDs I upgrade to, it will cut the loading time down substantially. If it does, I hope it helps on Total War: Rome 2 when that comes out as well.


----------



## Arm3nian

Quote:


> Originally Posted by *jassilamba*
> 
> Metro 2033 on everything maxed out at 1440P, my single 690 has both GPUs at almost 99%..
> 
> 
> Spoiler: Warning: Spoiler!


Metro is demanding and looks great, but there haven't been any driver optimizations because it's so old.


----------



## Buzzkill

Quote:


> Originally Posted by *Arm3nian*
> 
> 
> 18c idle
> 
> 
> 
> 
> 
> 
> 
> 
> My delta is less than .01c, i'm amazed how well I applied TIM.


What TIM did you use? Did you put a dot in the center or spread a thin layer? What is your room temp? Thanks.


----------



## PinzaC55

I know I should read the whole of this thread, but it's a little daunting. I have some questions about the EVGA backplate: first, is it difficult to put on; second, does it adversely affect temperatures; and third, is it still possible to fit the thermal probes from my fan controller underneath it?


----------



## Arm3nian

Quote:


> Originally Posted by *Buzzkill*
> 
> What TIM did you use? Did you put a dot in the center or spread a thin layer? What is your room temp? Thanks.


My room was around 65°F, I think. The TIM was CL Pro. The dot method does not work for that; you need a very thin layer. You can get CL Ultra if you don't want to lap your block when applying new TIM, but I lapped mine anyway for a smooth surface. My OC is mild, but I set max voltage just to test temps: 30°C max after 30 minutes at 100% load. I could get better temps and a much higher OC with tweaking.


----------



## Kongwiuff

Hey bros,

I haven't read all 294 pages, but I need some help with my GTX 690.

In Battlefield 3 I have a little problem: I only get **** fps, like 60-160 on Metro, with HUUUGE fps drops.

Normally it's around 100 fps on Low (only mesh is on High; everything else is on Low).

I have gone into the driver and turned on maximum performance mode, and in the Windows power settings too.

I run at 1920x1080 with the newest beta driver for the card.

My rig is:

3930K @ 4.9GHz
GTX 690
16GB RAM @ 2133MHz
3x 120GB OCZ Agility 3 in RAID (for the game)
1200W power supply

When I go into MSI Afterburner I can see both GPUs are only using around 45-51% of their capacity. How do I get the card fully utilized so I can get some frames?

http://www.overclock.net/t/1337485/gtx690-fps-problem-gpu-usage#post_18812125

Hope you guys can help.


----------



## ceteris

Quote:


> Originally Posted by *PinzaC55*
> 
> I know I should read the whole of this thread but its a little daunting. I have some questions re the EVGA backplate - first, is it difficult to put on, second does it adversely affect the temperatures and third is it still possible to fit thermal probes underneath it from my fan controller?


1. It's pretty easy. Just make sure you have a T6 Torx screwdriver to remove most of the stock screws and a small Phillips to remove the rest and add the new ones.

2. It does not add any noticeable temp drops. It's mostly for aesthetics and maybe to reinforce the PCB...

3. It will be a tight fit. You might have to sandwich the line between the PCB and the heatsink block. There might be some clearance next to the 8-pin connectors, but I'm too lazy to drain the loop, take out my card, and fit the stock air sink to check.








Quote:


> Originally Posted by *Arm3nian*
> 
> My room was around 65°F, I think. The TIM was CL Pro. The dot method does not work for that; you need a very thin layer. You can get CL Ultra if you don't want to lap your block when applying new TIM, but I lapped mine anyway for a smooth surface. My OC is mild, but I set max voltage just to test temps: 30°C max after 30 minutes at 100% load. I could get better temps and a much higher OC with tweaking.


That's hardcore, man, lol. I used the dot method (REALLY small, like half a BB) and I'm getting slightly over a 1°C delta at idle.


----------



## PinzaC55

Quote:


> Quote:
> Originally Posted by PinzaC55
> 
> I know I should read the whole of this thread but its a little daunting. I have some questions re the EVGA backplate - first, is it difficult to put on, second does it adversely affect the temperatures and third is it still possible to fit thermal probes underneath it from my fan controller?
> 
> 1. It's pretty easy. Just make sure you have a T6 Torx screwdriver to remove most of the stock screws and a small Phillips to remove the rest and add the new ones.
> 
> 2. It does not add any noticeable temp drops. It's mostly for aesthetics and maybe to reinforce the PCB...
> 
> 3. It will be a tight fit. You might have to sandwich the line between the PCB and the heatsink block. There might be some clearance next to the 8-pin connectors, but I'm too lazy to drain the loop, take out my card, and fit the stock air sink to check.


Ta very much! I thought the backplate would tend to raise temps, but if it has zero effect that'll be OK.


----------



## Methos07

Now running a U3011 + 2x 2007FP setup in PLP. 690 is a champ card, no doubt.


----------



## jassilamba

Quote:


> Originally Posted by *Kongwiuff*
> 
> Hey bros,
> I haven't read all 294 pages, but I need some help with my GTX 690.
> In Battlefield 3 I have a little problem: I only get **** fps, like 60-160 on Metro, with HUUUGE fps drops.
> Normally it's around 100 fps on Low (only mesh is on High; everything else is on Low).
> I have gone into the driver and turned on maximum performance mode, and in the Windows power settings too.
> I run at 1920x1080 with the newest beta driver for the card.
> My rig is:
> 3930K @ 4.9GHz
> GTX 690
> 16GB RAM @ 2133MHz
> 3x 120GB OCZ Agility 3 in RAID (for the game)
> 1200W power supply
> When I go into MSI Afterburner I can see both GPUs are only using around 45-51% of their capacity. How do I get the card fully utilized so I can get some frames?
> http://www.overclock.net/t/1337485/gtx690-fps-problem-gpu-usage#post_18812125
> Hope you guys can help.


I have heard of people having issues with the latest beta. Also, in the Nvidia Control Panel make sure that multi-GPU is enabled and "Maximize 3D performance" is selected.
I'm on version 306.97. Try uninstalling the GPU driver and installing it back. See if that helps.


----------



## V3teran

Hi guys, long time since I've been in this thread, since I got my 690.
What I want to know is: is it worth modding them?
What about Dina's ROM, is it worth using? I see he has the core at 1400 on it?
I don't want to read over 100 pages if I can help it, guys.
Cheers


----------



## V3teran

OK, I flashed my 690 and it works; the boost on the core when overclocked is up by about 15MHz on each GPU.
Does anybody know of a nice BIOS that was posted in here?


----------



## DinaAngel

Quote:


> Originally Posted by *V3teran*
> 
> OK, I flashed my 690 and it works; the boost on the core when overclocked is up by about 15MHz on each GPU.
> Does anybody know of a nice BIOS that was posted in here?


Hi. The BIOS I uploaded works fine at 1400MHz, though you also have to be somewhat lucky with your card. I've noticed that ripple or frequency noise seems to cause instability when both GPUs are OC'd, but the BIOS I uploaded months ago makes your 690 draw about 90% from the motherboard and the 8-pins, mostly for overclock stability. Normally a 690 draws almost entirely from the 8-pins.

Remember that I'm using an Asus IV Extreme, and I have the extra 6-pin PCIe power connector, which leaves the 24-pin less stressed, so the motherboard lives longer and it improves overclock stability.

As I've seen, some don't really agree with me on this, but it's what I believe is right.


----------



## max883

DinaAngel: can I have the GTX 690 BIOS from you?


----------



## max883

DinaAngel: are you on water cooling, and how are your temps and clocks?


----------



## max883

I tried your BIOS; no go.







+150/+130/+500 is the max I get. Voltage maxes out at 1.175V. Need more voltage!!


----------



## pilla99

What are you guys getting for Far Cry 3 performance? I just beat the game (the best single-player game I think I have ever experienced) and ran at 1440p ultra settings. No MSAA, but I had about 60 frames the entire game.

Is that about right?


----------



## Arm3nian

Quote:


> Originally Posted by *pilla99*
> 
> What are you guys getting for Far Cry 3 performance? I just beat the game (best single player game I think I have ever experienced) and ran at 1440p ultra settings. No MSAA but I had about 60 frames the entire game.
> That about right?


You should be getting 60fps with msaa on max.


----------



## DinaAngel

Quote:


> Originally Posted by *max883*
> 
> i tried your bios, no go
> 
> 
> 
> 
> 
> 
> 
> +150/+130/+500 is the max I get. Voltage maxes out at 1.175V. Need more voltage!!


http://www.techpowerup.com/gpuz/kevfy/


----------



## DinaAngel

Quote:


> Originally Posted by *pilla99*
> 
> What are you guys getting for Far Cry 3 performance? I just beat the game (best single player game I think I have ever experienced) and ran at 1440p ultra settings. No MSAA but I had about 60 frames the entire game.
> That about right?


I played through it; it was okay, but not my type of game. I feel it lacked so much, the co-op was so linear, and the multiplayer is like Call of Duty. So yeah. But I do play PlanetSide 2.

Performance-wise, I played with everything maxed at 1080p with no tearing or lag.
Same goes for PS2; I've tweaked PS2 to use Ultra, which usually isn't possible.


----------



## max883

Hahaha, DinaAngel, I can do that too in GPU-Z!!! But you cannot run a game or 3DMark like that. All GTX 690 owners need more voltage if they want over 1200MHz on the GPU!!









Update! Now I have a good BIOS and voltage for the GTX 690.







+150 +150 +500














1215MHz core and 7000MHz mem















http://www.xtremesystems.org/forums/showthread.php?284014-KGB-Kepler-BIOS-Editor-Unlocker


----------



## V3teran

Hey Max, which lines did you alter in the KGB cfg?
What did you set them to?
Either that, or can I have your BIOS to test, please?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *DinaAngel*
> 
> http://www.techpowerup.com/gpuz/kevfy/


This means nothing. Your sig Vantage graphics score is lower than mine in my sig. My GTX 690 was clocked at about 1200MHz for that run, and you claim 1400MHz core on yours.

I don't mind someone posting false information here and there, but when I see it getting members in this thread worked up that their GTX 690 can't get near 1400MHz, feeling ripped off, I get irritated.

So prove me wrong and I'll be happy to apologize. Post some 1400MHz 3DMark 11 and Vantage runs with your GTX 690. Thanks.


----------



## pilla99

Quote:


> Originally Posted by *Arm3nian*
> 
> You should be getting 60fps with msaa on max.


At 1440p?

Edit: just saw this for the latest beta drivers:

GeForce GTX 680:
Up to 38% faster in Far Cry 3

Lol, OK, I'll try this again now.

Edit: it didn't actually help much.


----------



## qiplayer

Don't get two GTX 690s; I wasted a lot of money on that!!!
I tried two PCs, an i7 2600K and an i7 3930K, but one card works better than two. Believe me, don't do it! You won't get good scaling: 4 GPUs working at 45%, and the game was unplayable; even at 100fps I had constant slow-down/speed-up of the image.
See?
The companies don't offer powerful enough cards at the moment. If you want to push your resolution, wait for the GTX 780 or play with low fps.
Or overclock like crazy!

Good luck.
Don't go quad SLI!!!


----------



## jassilamba

Quote:


> Originally Posted by *pilla99*
> 
> At 1440p?
> Edit: just saw this for the latest beta drivers:
> GeForce GTX 680:
> Up to 38% faster in Far Cry 3
> Lol, OK, I'll try this again now.
> Edit: it didn't actually help much.


I'm in the same boat as you: 60 with no MSAA, about 30-45 with everything maxed out.

Running at 1440p.


----------



## Kaapstad

Quote:


> Originally Posted by *qiplayer*
> 
> Don't get two GTX 690s; I wasted a lot of money on that!!!
> I tried two PCs, an i7 2600K and an i7 3930K, but one card works better than two. Believe me, don't do it! You won't get good scaling: 4 GPUs working at 45%, and the game was unplayable; even at 100fps I had constant slow-down/speed-up of the image.
> See?
> The companies don't offer powerful enough cards at the moment. If you want to push your resolution, wait for the GTX 780 or play with low fps.
> Or overclock like crazy!
> Good luck.
> Don't go quad SLI!!!


I'm not saying it's the same for all setups or all games, but I often get good GPU usage with quad SLI.

This link is running BF3 the CPU is @4.0ghz

http://imageshack.us/photo/my-images/29/membf3.jpg/

This link is running Far Cry 3 the CPU is @4.0ghz

http://imageshack.us/photo/my-images/687/farcryw.jpg/

Unfortunately I am rubbish at these types of games and get killed very quickly.


----------



## pilla99

Quote:


> Originally Posted by *Kaapstad*
> 
> Im not saying its the same for all setups or all games but I often get good GPU usage with quad sli
> This link is running BF3 the CPU is @4.0ghz
> http://imageshack.us/photo/my-images/29/membf3.jpg/
> This link is running Far Cry 3 the CPU is @4.0ghz
> http://imageshack.us/photo/my-images/687/farcryw.jpg/
> Unfortunately I am rubbish at these type of games and get killed very quickly


I don't understand why people get two 690s. Most of the time you'll be using multiple monitors, in which case you'll run out of VRAM way before the cards actually get pegged. If you're using only one monitor (like you seem to be), what the actual #$%^ are you doing that needs that much horsepower?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *pilla99*
> 
> I don't understand why people get 2 690's? I mean most of the time you are going to be using multiple monitors in which case you are going to run out of vram way before the cards actually gets pegged. If you are using only 1 monitor, (like you seem to be) what the actual #$%^ are you doing that needs that much horsepower?


Some people still don't understand the VRAM thing, and the poor marketing of dual-GPU cards doesn't help, advertising the total amount of VRAM rather than the effective amount.


----------



## V3teran

Flashing BIOS instructions for any Nvidia Kepler GPU.
*YOU DO THIS AT YOUR OWN RISK*

I wrote this guide as I couldn't find one anywhere.

First of all, you need to download these files.

GPU-Z (used to grab the original BIOS from the GPU, whether multi or single)
http://www.techpowerup.com/downloads/2181/mirrors.php

KGB - Kepler BIOS Editor/Unlocker (used to modify the BIOS that you just grabbed from your GPU using GPU-Z)

KGB supports: *GTX690, GTX680, GTX670, GTX660Ti, GTX660OEM and GTX660*
https://www.dropbox.com/s/vrunxuq03vj0m5y/kgb_0.5.zip

NVFLASH (used to reflash or reinstall the new modified BIOS)
http://www.softpedia.com/progDownload/nVFlash-Download-16133.html

Steps 1-9.

1. First, make a USB stick bootable into DOS. To do this, read this very easy-to-follow guide:
http://www.bay-wolf.com/usbmemstick.htm

2. Run GPU-Z and click on the icon that allows you to grab your BIOS from the GPU and save it to your PC.

The pictures below show how.




Also make sure you keep a backup copy of your BIOS!!

3. Once you have grabbed your BIOS, put it into C:\Users\My and rename it 1.rom.

For example, the BIOS from GPU 1 of the GTX 690 is called GK104.rom; I renamed it to 1.rom, and the BIOS from GPU 2 I renamed to 2.rom.

The reason you should put the BIOS in C:\Users\My is that the default path from KGB.exe will be set there, so if you are not familiar with the command line then this is the easiest way of doing it.

4. Take KGB.exe as well as the KGB.cfg file and place both of these in C:\Users\My too.
You should now have the KGB.exe, KGB.cfg, and 1.rom files in C:\Users\My.

If you have more than one GPU, then you will have 1.rom and 2.rom respectively in C:\Users\My.

5. The next thing to do is open the command line (CMD), which I'm sure you all know how to do.
Once you have CMD open, copy and paste in this command:

kgb 1.rom unlock (press Enter)

It will now save the new values (i.e. from kgb.cfg) in your BIOS, then print out the new values in the BIOS.

6. Now all we need to do is put the new modified 1.rom back onto the GPU. To do this we use the DOS-bootable USB stick that we created in step 1.

Extract the nvflash files and place them onto the DOS-bootable USB stick; there should be two files.

Then place the 1.rom BIOS on the stick as well.

7. The stick is now ready to be booted from, so reboot the PC and go into the motherboard BIOS.
Once you are in the BIOS, set the PC to boot from the stick. Once it boots, you will have a flashing cursor telling you that you are in DOS mode.

8. Type in: nvflash --list (press Enter)
This will tell you what GPU(s) you have installed on your motherboard.

9. Type in: nvflash -i1 1.rom (press Enter)

It will say it is flashing the GPU and ask you to press Y to confirm, so press Y and the flashing will commence; it takes a few seconds.

If you have a second GPU with 2.rom:

Type in: nvflash -i2 2.rom (press Enter)

Again it will ask you to press Y to confirm; press Y and the flash will take a few seconds.

Once this is done, reboot the PC, go into the motherboard BIOS, and reset your boot order so it boots from the hard drive with the Windows OS on it.

Once you get into Windows, you can see from MSI Afterburner or EVGA Precision that the power target can now be raised from 130 to 150.

When gaming, the boost clock should be higher, depending on your GPU.

I achieved an increase of around 30MHz on the boost clock alone on both GPUs, which is very good.

It will differ for each person and each GPU. You can also try other people's BIOSes, as long as it's from the same GPU as your own.

You can also modify the KGB.cfg and tweak the power settings for even more headroom, but I wouldn't advise this unless you know what you are doing; I would just leave it at stock for the time being.

*Everything you do is at your own risk*

I hope this guide has helped.

Good luck.

The original creator of KGB.exe can be found here:
http://www.xtremesystems.org/forums/showthread.php?284014-KGB-Kepler-BIOS-Editor-Unlocker/page4
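For quick reference, the whole unlock-and-flash sequence above boils down to a handful of commands. This is just a sketch of the steps, not something to paste blindly: the filenames (1.rom, 2.rom) and the -i indices follow the examples in this guide, so adjust them for your own card(s). The kgb commands run in a Windows command prompt; the nvflash ones run from the DOS-bootable stick.

```bat
REM -- In Windows, after saving each GPU's BIOS with GPU-Z --
REM Unlock the BIOS copies in place (kgb reads its settings from kgb.cfg):
kgb 1.rom unlock
kgb 2.rom unlock

REM -- Then reboot to the DOS USB stick holding nvflash and the .rom files --
REM List the GPUs nvflash can see, to confirm each one's index:
nvflash --list

REM Flash each GPU with its modified BIOS (press Y when asked to confirm):
nvflash -i1 1.rom
nvflash -i2 2.rom
```

Keep the untouched backup copies of GK104.rom somewhere safe; if a flash goes wrong, flashing the backup with the same nvflash -i commands is the way back.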


----------



## iARDAs

Quote:


> Originally Posted by *pilla99*
> 
> I don't understand why people get 2 690's? I mean most of the time you are going to be using multiple monitors in which case you are going to run out of vram way before the cards actually gets pegged. If you are using only 1 monitor, (like you seem to be) what the actual #$%^ are you doing that needs that much horsepower?


The only scenario where 690 quad SLI would be nice is probably 3D Vision, as it doesn't require more VRAM to run games in 3D Vision. The same can be said of running games at a stable 120Hz.

But other than that, I believe some people are marketing victims.


----------



## qiplayer

Quote:


> Originally Posted by *pilla99*
> 
> I don't understand why people get 2 690's? I mean most of the time you are going to be using multiple monitors in which case you are going to run out of vram way before the cards actually gets pegged. If you are using only 1 monitor, (like you seem to be) what the actual #$%^ are you doing that needs that much horsepower?


I'm not running out of VRAM at 6000x1080.
But I disabled Aero on my Win7.


----------



## qiplayer

Quote:


> Originally Posted by *iARDAs*
> 
> only scenario where 690 Quad SLI would be nice is probably 3Dvision as it doesnt require more Vram to run games in 3Dvision. Same can be said running games in 120hz stable
> 
> But other than that I believe some people are marketing victims.


I play a lot of Crysis 2 multiplayer. I never run out of VRAM, but I get lower fps; in fast action (I don't play watching the fps counter) I get about 40-50 fps.
So more GPU power would be helpful.
Anyway, I hope they make the 780 with 4GB of VRAM.


----------



## Stay Puft

Watercooled guys. What are your max overclocks?


----------



## Kaapstad

Quote:


> Originally Posted by *pilla99*
> 
> I don't understand why people get 2 690's? I mean most of the time you are going to be using multiple monitors in which case you are going to run out of vram way before the cards actually gets pegged. If you are using only 1 monitor, (like you seem to be) what the actual #$%^ are you doing that needs that much horsepower?


As you have noticed, I use one monitor @1600p, so VRAM is not going to be a problem. I like being able to totally max out any game I play, and using two GTX 690s may sometimes be overkill, but I would rather have too much horsepower than too little.

I also own a third GTX 690; maybe I should put that in as well and use it as a PhysX card, lol.


----------



## qiplayer

Quote:


> Originally Posted by *Kaapstad*
> 
> Im not saying its the same for all setups or all games but I often get good GPU usage with quad sli
> This link is running BF3 the CPU is @4.0ghz
> http://imageshack.us/photo/my-images/29/membf3.jpg/
> This link is running Far Cry 3 the CPU is @4.0ghz
> http://imageshack.us/photo/my-images/687/farcryw.jpg/
> Unfortunately I am rubbish at these type of games and get killed very quickly


Interesting!
I tried them with the last drivers, the ones that came out in July. Are you using newer drivers?

About the game: I suggest Crysis 2 multiplayer; it's quite a bit faster than BF3, and amazing.


----------



## Buzzkill

http://www.geforce.com/optimize (Optimal Playable Settings)

http://www.geforce.com/optimize/guides (Game Guides)

http://www.geforce.com/drivers (At bottom of page you can search for beta drivers.)

Or

http://www.geforce.com/drivers/beta-legacy


----------



## zalbard

Quote:


> Originally Posted by *Stay Puft*
> 
> Watercooled guys. What are your max overclocks?


Same as on air.


----------



## Buzzkill

Quote:


> Originally Posted by *zalbard*
> 
> Same as on air.


On air my top card would be around 90°C; with blocks it's never above 43°C. The cards do not throttle at all.

43°C is with the 3770K at 90-100°C running Intel Burn Test.


----------



## iARDAs

Quote:


> Originally Posted by *Kaapstad*
> 
> As you have noticed I use one monitor @1600p so vram is not going to be a problem. I like being able to totally max any game I play and using 2 GTX 690s may sometimes be overkill but I would rather have too much horsepower than too little.
> I also own a third GTX 690, maybe I should put that in as well and use it as a physics card lol.


ORRRRRRR

orrrrrrr

Just bear with me on this one.

You can always SEND me the 3rd 690.

Just think about it.

Wouldn't it be something?


----------



## Kaapstad

Quote:


> Originally Posted by *qiplayer*
> 
> Interesting!
> I tried them with the drivers the last the one that came out in july. Are you using newer drivers?
> About the game I suggest you crysis 2 multiplayer, its quite faster than bf3 and amazing


BF3 I think was on 301.42

FC3 was on 310.70


----------



## Billymac10

Hi Everyone,

Love this thread and I'm a gtx 690 owner as of two weeks ago. I'm at work right now so I'll send in my proof of ownership items once I get home. I couldn't be more excited about owning a 690! I upgraded from 1gb 460's in SLI and the difference... well I don't need to tell you guys the difference. Whoa!

My only concerns are that I seem to run a bit high on temps compared to some of you and my overclocking results aren't up to par. I'm running a resolution of 5142x1050 and so far I've only really checked out what the 690 can do with heavily modded Skyrim and Far Cry 3.

With both games I seem to hit 80C or so with the card at stock settings. When I adjust the fan profiles I can get the max temp down to about 72C. Reading through the forum, it seems that these temps are causing the card to throttle. I've got a 200mm fan on the front of my case that I put in exhaust mode to help the "case" side of the 690 exhaust and it seemed to help some. I'm wondering if it's just the higher resolution that's causing the card to work harder and hence, the temps.

I'm not having any FPS problems in Skyrim. It stays at 60 almost all the time, but Far Cry 3 drops to around 40-45 outdoors sometimes. Everything in Far Cry is maxed except MSAA, which isn't on.

Also, I've been using EVGA Precision for monitoring and some overclocking, and I'm not getting nearly the overclocks on air that some have posted. My best stable overclock so far has only been +70 core and +181 mem at the 135% power target, with voltage set to max. I figure the mem could go up more, but I haven't had time to find the sweet spot yet because I've been too concerned with the core clock. With the fan set to 95% I never get over 75°C on the card, so it doesn't seem to be heat-related. I'm getting black-screen crashes after a few minutes with higher overclocks.

Posting my specs below. Does anyone have any ideas for me or did I just get a little bit worse card than the rest?

i7 3770K @3.8 (stock cooling)
Asus Z77 Sabertooth
8GB Viper RAM 1600mhz
EVGA GTX 690
OCZ 60GB SSD
WD 2 TB 7200 HDD
Cooler Master HAF 912

310.70 Nvidia drivers


----------



## Buzzkill

310.70 WHQL was released today.

GeForce 310.70 Driver
Version 310.70 - WHQL
Release Date Mon Dec 17, 2012
Operating System Windows 7 64-bit
Windows 8 64-bit
Windows Vista 64-bit
Language English (US)
File Size 168.55 MB
Release Notes for 310.70 WHQL

You may be disappointed to hear that the WHQL driver is exactly the same as the 310.70 WHQL candidate driver we have now; the only difference is the digital signature from Microsoft.

Beta Drivers

Nvidia has closed the Official GeForce Experience Closed Beta (12/6/2012)
Today (12/6/2012) NVIDIA released the GeForce Experience closed beta, which is limited to 10,000 users. This application basically enables users to get optimal game settings that balance speed and image quality with a single click. For additional information on the application, see the following links:
http://www.geforce.com/drivers/geforce-experience
GeForce Experience System Requirements:
http://www.geforce.com/drivers/geforce-experience/system-requirements
GeForce Experience User guide (pdf)
http://international.download.nvidia.com/GFE/User-Guides/GeForceExperience_UG_v02.pdf
Supported Games:
http://www.geforce.com/drivers/geforce-experience/supported-games


----------



## Alex132

Just ordered my 690, along with 2 Neutron GTX120s!

Can't wait!


----------



## maximus56

Anyone who thinks that "gtx quad 690 sucks", either can't afford it, or doesn't know how to use it, period!


----------



## noob.deagle

Perhaps I can join this club? Got my 690 yesterday because I got frustrated driving all over Adelaide; no one had a reference GTX 680, and all they had were the 3-slot DCUII models that won't fit in SLI :\


----------



## Billymac10

IMAG0529.jpg 1598k .jpg file


----------



## Arizonian

Quote:


> Originally Posted by *Billymac10*
>
> IMAG0529.jpg 1598k .jpg file


Hi Billymac10 - welcome to OCN and congrats on your new GTX 690.


Quote:


> Originally Posted by *Billymac10*
> 
> Hi Everyone,
> Love this thread and I'm a gtx 690 owner as of two weeks ago. I'm at work right now so I'll send in my proof of ownership items once I get home. I couldn't be more excited about owning a 690! I upgraded from 1gb 460's in SLI and the difference... well I don't need to tell you guys the difference. Whoa!
> My only concerns are that I seem to run a bit high on temps compared to some of you and my overclocking results aren't up to par. I'm running a resolution of 5142x1050 and so far I've only really checked out what the 690 can do with heavily modded Skyrim and Far Cry 3.
> With both games I seem to hit 80C or so with the card at stock settings. When I adjust the fan profiles I can get the max temp down to about 72C. Reading through the forum, it seems that these temps are causing the card to throttle. I've got a 200mm fan on the front of my case that I put in exhaust mode to help the "case" side of the 690 exhaust and it seemed to help some. I'm wondering if it's just the higher resolution that's causing the card to work harder and hence, the temps.
> I'm not having any FPS problems in Skyrim. It' stays at 60 almost all the time, but Far Cry 3 drops to around 40-45 in outdoors sometimes. Everything in Far Cry is maxed except MSAA isn't on.
> Also, I've been using EVGA Precision for monitoring and some overclocking and am not getting nearly the overclocks on air that some have posted. My best stable overclock so far has only been +70 core and +181 mem with 135%, voltage set to max. I figure the mem could go up more but haven't had the time to find the sweet spot yet because I've been too concerned with the core clock. With the fan set to 95% I'm never getting over 75C on the card so it doesn't seem to be heat related. I'm getting black screen crashes after a few minutes with higher overclocks.
> Posting my specs below. Does anyone have any ideas for me or did I just get a little bit worse card than the rest?
> i7 3770K @3.8 (stock cooling)
> Asus Z77 Sabertooth
> 8GB Viper RAM 1600mhz
> EVGA GTX 690
> OCZ 60GB SSD
> WD 2 TB 7200 HDD
> Cooler Master HAF 912
> 310.70 Nvidia drivers


The higher your resolution, the harder your card works, which results in higher temps and less overclocking headroom than someone running a lower resolution would see. Overclocking ability is also the luck of the draw, which makes results vary even further.

Black screen crashes mean the CPU, the GPU, or both may be overclocked too high. Keep your CPU at stock clocks if you're only running stock cooling, and find your GPU OC first.


----------



## Billymac10

Thanks Arizonian!

Feels good to be part of the club.

An update on my overclocks:

Last night I downloaded the latest WHQL Nvidia driver and was able to bump it up to +80 / +182 / 135% and play Far Cry for 2 hours without a crash. I thought I already had the latest driver, but I had been using 310.64 previously. Precision X said the card got up to a high of 78C, but for the majority of the play time I was at 75C or less. This is a nice development.

This morning before work I used an alternate connection for the power cord on my case side fan to boost the rpm and then played at +90 +182 135% without crashing (only about 15 min of game time though). The temp settled in at around 72C.

That overclock allowed me to use 2X MSAA at 5160x1050 everything on Ultra with SSAO and stay above 35 FPS avg. BEASTLY!


I'm going to try your suggestion, Arizonian, and take off the CPU overclock, since Far Cry doesn't seem very CPU intensive at all. I'll see if I can push the GPU clock up a bit higher tonight. Progress, people! I'm amazed at what this card can do. Even with the high price, it's been my best component purchase ever.

Oh and at that resolution above with the MSAA I was using about 1950MB of VRAM so I almost hit the limit. Interestingly enough, I tried 4x MSAA and although the framerate took a little hit, about 5-6 FPS, I didn't get the huge dropoff I'd expect after going over the VRAM limit.
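For what it's worth, the raw render-target math is easy to sanity-check yourself. This is only a rough sketch: texture, shadow-map, and driver allocations are ignored, and the 4-bytes-per-sample figure (RGBA8 color, D24S8 depth) is an assumption, not something Precision X reports.

```python
def framebuffer_mb(width, height, msaa=1, bytes_per_sample=4):
    """Rough color + depth render-target size in MB; ignores textures etc."""
    samples = width * height * msaa
    color = samples * bytes_per_sample   # assumed RGBA8 color buffer
    depth = samples * bytes_per_sample   # assumed D24S8 depth/stencil
    return (color + depth) / (1024 ** 2)

# Surround resolution from the post above, with 4x MSAA
print(round(framebuffer_mb(5160, 1050, msaa=4)))  # ~165 MB
```

Which suggests the MSAA render targets themselves are only a small slice of that ~1950MB reading; textures are what actually eat the VRAM.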


----------



## ceteris

Quote:


> Originally Posted by *Buzzkill*
> 
> 310.70 WHQL was released today.


SOFB... I just did a fresh install of Windows too LOL


Quote:


> Originally Posted by *maximus56*
> 
> Anyone who thinks that "gtx quad 690 sucks", either can't afford it, or doesn't know how to use it, period!


Some people here are far from being unable to afford it. It has more to do with the fact that there are better alternatives for pushing surround. You aren't that wrong about people not knowing how to use it, though... it's mostly suited to single-display setups.


----------



## PinzaC55

Just uploaded a short video clip of my new PC to youtube 



 Click "settings" (cog wheel near bottom right of video) to view in HD.


----------



## jcde7ago

Oh man, this thread....got...ridiculously huge.

Arizonian, you have my eternal thanks.


----------



## Jessekin32

I hate to say it guys... But I'm selling my 690


I've picked up two FTW+ 4GB 680's, and I've gotta get rid of this thing asap to make up for the upgrade.

I don't know if I should be removed from the list, or what, but I just thought I'd let everyone know it's going to be gone soon


----------



## pilla99

Quote:


> Originally Posted by *maximus56*
> 
> Anyone who thinks that "gtx quad 690 sucks", either can't afford it, or doesn't know how to use it, period!


Quad 690 is only for a single monitor, and even then you still only have 2GB of VRAM. Games today like Skyrim push past that, and the games of tomorrow, like GTA VI, are guaranteed to as well. Going with two 4GB 680's would be a better choice for about the same cost. I'm waiting till April, and the second the 6GB 8970 is out I'm grabbing two.


----------



## Arizonian

Quote:


> Originally Posted by *jcde7ago*
> 
> Oh man, this thread....got...ridiculously huge.
> Arizonian, you have my eternal thanks.


Hey jcde7ago glad to see you back and hope all is well on your side of the fence.


I see you're still rocking your GTX 690.

Personally, between my main and second rigs, I'll have this card for about four years. I took out the extended warranty when I bought it from EVGA, since it's built like a tank and would be hard to get rid of as long as it's doing its job gaming. The kids will get to play with the 690 the moment a single GPU hits the market that can at least perform as well.


This club has made for a good thread, not just for 690 owners but for those interested who've come here to find out about the card. We've learned quite a bit since release about the card's limits and capabilities with different setups. Again, thanks for starting it, along with the great home page of info.


----------



## saifbukhari

AWESOME! !!!!

Quote:


> Originally Posted by *PinzaC55*
> 
> Just uploaded a short video clip of my new PC to youtube
> 
> 
> 
> Click "settings" (cog wheel near bottom right of video) to view in HD.


Sent from my GT-N7100 using Tapatalk 2


----------



## PinzaC55

Hey thank you!


A gay friend said it looked like the kind of nightclub he would love to hang out in !


New photo http://www.flickr.com/photos/pinzac55/8283586705/in/photostream


----------



## maximus56

I am assuming that your "matter of fact" observations are based on your own personal experience, and not a synthesis from everyone else who also doesn't currently seem to have a quad 690 setup, yet has reached similar "matter of fact" conclusions?


----------



## Qu1ckset

Quote:


> Originally Posted by *maximus56*
> 
> Anyone who thinks that "gtx quad 690 sucks", either can't afford it, or doesn't know how to use it, period!


Buddy, you have no idea what you're talking about; quad SLI 690s are such a waste for both single-monitor and surround.
It sucks for single because in most cases a single GTX 690 dominates, and if you need more horsepower than a single 690, 3-way SLI 4GB 680s or 3-way HD 7970 3GBs are a much better buy!

And quad SLI 690s for surround suck due to the limited VRAM...

And I could easily buy a second and even a third GTX 690, but it's such a waste when one does the job perfectly for 1440p gaming!


----------



## maximus56

Quote:


> Originally Posted by *Qu1ckset*
> 
> Buddy you have no idea what your talking about, quad sli 690s is such a waste for both single and surround.
> Sucks for single because most cases a single gtx690 rapes, and if you need more horse power then a single 690 3way sli 4gb 680s or 3way hd7970 3gbs are such a better buy!
> and quad sli 690s for surround suck due to limited vram...
> And i can easily buy a second and even third gtx690 but such a waste when one does the job perfect on 1440p gaming1


Right... as I thought, another post from someone not running quad GTX 690s.


----------



## Alex132

Quote:


> Originally Posted by *maximus56*
> 
> Right..as I thought, another posting from someone not running a quad gtx 690


someone is sore about something


----------



## maximus56

Lol... just trying to get some data points based on actual owner experience with quad 690s, not noise.


----------



## pilla99

Quote:


> Originally Posted by *maximus56*
> 
> Right..as I thought, another posting from someone not running a quad gtx 690


Ok I can't resist.

It turns out that facts don't change depending on your hardware configuration! While you comprehend that, understand something: two 690's are a waste of money. You are stuck with 2GB of VRAM. You clearly don't understand what that means, because you keep ducking the subject.
Here is some recommended reading for you:

http://www.overclock.net/t/1276950/what-happens-when-youre-out-of-vram

Your card could be an absolute monster in processing power, but when that VRAM runs out it doesn't matter; FPS comes to a grinding halt. Again, some games today at 1080p can push past 2GB, and be assured that games of tomorrow will if you hope to run them maxed. At 1440p even more so. With multi-monitor, absolutely.

So when you see so few people with two 690's, it's not because they can't afford it; it's because they did their homework and got four 4GB 680's instead. Just check out the 680 club. In many cases they spent more on their systems to do so. I should have done more homework too; getting my 690 was, in retrospect, a mistake. I can live with it for now because nothing I like to play uses 2GB, but as soon as possible I'm selling this and getting new cards.

Again, I recommend you do some reading to keep yourself from looking like a fool. It's people like you who help spread misinformation, especially if some young kid gets on here and reads your posts.


----------



## Qu1ckset

Quote:


> Originally Posted by *pilla99*
> 
> Ok I can't resist.
>
> It turns out actually that facts don't change depending on your hardware configuration! Now while you comprehend that, understand something: two 690's is a waste of money. You are stuck with 2GB of vram. You clearly don't understand what that is because you keep ducking the subject.
> Here is some recommended reading for you:
> http://www.overclock.net/t/1276950/what-happens-when-youre-out-of-vram
> Your card could be an absolute monster in processing power, but when that vram runs out it doesn't matter, fps comes to a crushing halt. Again, some games today at 1080p can push 2GB and be assured that games of tomorrow will if you hope to run on max. 1440p even more so. Multi-monitor absolutely.
> So when you see so few people with 2 690's it's not because they can't afford it, it's because they did their homework and got 4, 4GB 680's instead. Just check out the 680 club. And in many cases they spent more on their system to do so. I should have done more homework too, getting my 690 in retrospect was a mistake. I can live with it for now because nothing I like to play uses 2GB, but as soon as possible I'm selling this and getting new cards.
> Again I recommend you do some reading so as to further prevent you from looking like a fool. It's persons like you that help spread misinformation, especially if some young kid gets on here and reads your posts.


+1 Rep


----------



## maximus56

You have a lot to say for someone who neither has a GTX 690 quad nor a GTX 680 Quad, a mistake indeed!

photo.JPG 815k .JPG file


photo (1).JPG 2153k .JPG file


http://www.3dmark.com/hall-of-fame/3dmark-11-top-extreme-preset/

http://cdn.overclock.net/c/c5/c5c9ea93_00019.jpeg

System Specs:
Motherboard: ASUS Rampage IV Extreme
CPU: Intel i7-3930K @4.6GHz
GPU: EVGA Nvidia GTX690 (Quad SLI)
Memory: Corsair Dominator GT DDR3-2133 16GB (4x4GB)
Case: CaseLabs STH10
SSD: Samsung Series 250GB SSD x2 and Seagate Barracuda 3TB SATA 6.0Gb/s x1
Optical Drive: LG Electronics 12X Internal SATA Blu-Ray Rewriter
Sound Card: Creative Sound Blaster Recon3D
Power Supply: Corsair AX1200 and Seasonic 850W
Cooling: Koolance CPU, GPU and MOBO WB | Black Ice SR1 560, Alphacool NexXxoS Monsta,
XSPC RX480, Koolance 3x120mm | Swiftech MCP655 x2| Scythe Gentle
Typhoon 120MM 1850 RPM x15, Noiseblocker NB-BlackSilentPro 140 MM 1700 RPM
x4, BitFenix Spectre Pro 120mm RED LED Fans x4
Monitor: Asus VG278H 27-Inch 120Hz 3D Monitor x3


----------



## Qu1ckset

Quote:


> Originally Posted by *maximus56*
> 
> You have a lot to say for someone who neither has a GTX 690 quad nor a GTX 680 Quad, a mistake indeed!
>
> photo.JPG 815k .JPG file
> 
> 
> photo (1).JPG 2153k .JPG file
> 
> http://www.3dmark.com/hall-of-fame/3dmark-11-top-extreme-preset/
> http://cdn.overclock.net/c/c5/c5c9ea93_00019.jpeg
> System Specs:
> Mother Board: ASUS Rampage IV Extreme
> CPU: Intel i7-3930K @4.6GHZ
> GPU: EVGA Nvidia GTX690 (Quad SLI)
> Memory: Corsair Dominator GT DDR3-2133 16GB (4x4GB
> Case: CaseLabs STH10
> SSD: Samsung Series 250GB SSD x2 and Segate Barracuda 3TB SATA 6.0Gb/s x1
> Optical Drive: LG Electronics 12X Internal SATA Blu-Ray Rewriter
> Sound Card: Creative Sound Blaster Recon3D
> Power Supply: Corsair AX1200 and Seasonic 850W
> Cooling: Koolance CPU, GPU and MOBO WB | Black Ice SR1 560, Alphacool NexXxoS Monsta,
> XSPC RX480, Koolance 3x120mm | Swiftech MCP655 x2| Scythe Gentle
> Typhoon 120MM 1850 RPM x15, Noiseblocker NB-BlackSilentPro 140 MM 1700 RPM
> x4, BitFenix Spectre Pro 120mm RED LED Fans x4
> Monitor: Asus VG278H 27-Inch 120Hz 3D Monitor


Do you have a hatred for tubing? lol

P.S. Much easier if you fill out the rig builder.


----------



## Alex132

Quote:


> Originally Posted by *maximus56*
> 
> You have a lot to say for someone who neither has a GTX 690 quad nor a GTX 680 Quad, a mistake indeed!
> 
> photo.JPG 815k .JPG file
> 
> 
> photo (1).JPG 2153k .JPG file
> 
> 
> http://www.3dmark.com/hall-of-fame/3dmark-11-top-extreme-preset/
> 
> http://cdn.overclock.net/c/c5/c5c9ea93_00019.jpeg
> 
> System Specs:
> Mother Board: ASUS Rampage IV Extreme
> CPU: Intel i7-3930K @4.6GHZ
> GPU: EVGA Nvidia GTX690 (Quad SLI)
> Memory: Corsair Dominator GT DDR3-2133 16GB (4x4GB
> Case: CaseLabs STH10
> SSD: Samsung Series 250GB SSD x2 and Segate Barracuda 3TB SATA 6.0Gb/s x1
> Optical Drive: LG Electronics 12X Internal SATA Blu-Ray Rewriter
> Sound Card: Creative Sound Blaster Recon3D
> Power Supply: Corsair AX1200 and Seasonic 850W
> Cooling: Koolance CPU, GPU and MOBO WB | Black Ice SR1 560, Alphacool NexXxoS Monsta,
> XSPC RX480, Koolance 3x120mm | Swiftech MCP655 x2| Scythe Gentle
> Typhoon 120MM 1850 RPM x15, Noiseblocker NB-BlackSilentPro 140 MM 1700 RPM
> x4, BitFenix Spectre Pro 120mm RED LED Fans x4
> Monitor: Asus VG278H 27-Inch 120Hz 3D Monitor x3


So what you're trying to say is that because you now own two 690s, they must be much better than what people are saying.
It's pretty obvious that two 690s are a waste, because you will not have enough VRAM to run games at very high resolutions, especially in the future.

Benchmarks don't use a lot of VRAM, so there is no point in trying to prove your point with them. But if you put two 690s against four 4GB 680s in a game like BF3 or heavily modded Skyrim on three 1080p monitors, you would see a massive difference between them because of the extra VRAM the 680s have.


----------



## pilla99

Quote:


> Originally Posted by *Alex132*
> 
> So what youre trying to say is that now because you own 2 690s that they have to be much better than what people are saying.
> It's pretty darn obvious that 2 690s are a waste because you will not have enough VRAM to run games at a high, high resolution especially in the future.
> Benchmarks don't use a lot of VRAM, so there is no point in trying to prove your point with that - but if you had 2 690s vs 4gb 4 680s in a game like BF3 or heavily modded skyrim with 3 1080p monitors you would see a massive difference between them because of the extra VRAM that the 680s have.


All he's trying to say is he has more money than sense.


----------



## Arm3nian

690 quad SLI is the biggest waste of money ever. I've had it, so I would know. At 2560x1600 you hit the 2GB VRAM limit; any higher resolution and you get 0.4fps. A single 690 can max every game out there at 1080p/60fps, and for 120Hz monitors it will also hit 120fps in most games.

There is also no need to get it for benchmarking; if benchmarking is your goal, go with multiple 7970 GHz Editions, because they beat every 600 series card out there in benchmarks. The only logical option for games is either a 690 or 2x 680 for 2560x1600 (3 will do also); when you go to 4, performance actually drops because the game cannot utilize all 4 GPUs, which gives low fps. A 690 is the sweet spot for a single monitor; 3-4 4GB 680's are for triple-screen gaming.

There is seriously no point to "benchmarking" on 600 series cards unless the benchmark is a game. Also, if you are that serious about benchmarking, you probably have your cards under water anyway so you can overclock them a lot. The 600 series suffers from a locked BIOS, and even when unlocked it is still behind. My GPU hits 30C max under 100% load for 4 hours; I would like to OC much more, but I can't because of the voltage limitations. AMD's side doesn't suffer nearly as much from that.

1920 x 1080/1200 or 2560 x 1440/1600 GAMING or GAMING BENCHMARKS

> GTX 690 or 2-3 GTX 680's (2gb or 4gb)

5760 x 1080/1200 or 7680 x 1440/1600 GAMING or GAMING BENCHMARKS

> 2-4 GTX 680 (4gb)

ANY RESOLUTION "BENCHMARKS"

> 7970's (3gb or 6gb)

Don't be a fool like me and waste $1000 on a 2nd 690 when it will do you no good and then go through trouble reselling it for less than what you paid.


----------



## PowerK

For the best possible image quality (2560x1600 with SGSSAA), I think 3-way SLI, 4-way SLI, and/or Quad-SLI is a must, IMHO... at least with the games I enjoy playing (Crysis with SGSSAA, The Witcher 2 with ubersampling, Sleeping Dogs with Extreme AA, Metro 2033, GTA4 with ENB super-sampling AA, etc.). I apply SGSSAA to almost every game I play and love the visual outcome.

My experience with 690 Quad-SLI since its launch back in May has been very positive. If I was to build a gaming rig again today, I'd still go for 690 Quad-SLI.


----------



## maximus56

Now here is a data point that should have the most relevance, since it is based on the experience of a real user who has been running a quad GTX 690 setup over a long enough period to smooth out any anomalies. Similarly, I have been playing on a quad setup since June, and have played every so-called "demanding game" in surround with max settings, mods, etc. I could not be happier with my experience, and like PowerK, I would make the same choice again.
Would love to hear from more *existing* quad 690 owners about their experience, whether negative or positive.


----------



## maximus56

Quote:


> Originally Posted by *pilla99*
> 
> All he's trying to say is he has more money than sense.


I believe that the latter is the prerequisite for the former. Unfortunately, it appears that you are lacking in both.


----------



## pilla99

Quote:


> Originally Posted by *maximus56*
> 
> I believe that the latter is the prerequisite for the former. Unfortunately, it appears that you are lacking in both.


Say what you like, but unfortunately, at the end of the day, anyone who knows what they're doing (most people on this forum) understands a simple concept like VRAM, and hence why your decision was a poor one. That's not an opinion; that's a fact. 2GB of VRAM is not going to cut it in many games today (Skyrim with mods, GTA with mods, NS2 with mods, etc.) and certainly not tomorrow.

I can understand though, a 2k drop on GPU's that you then come to realize was a poor decision would rustle my jimmies too.


----------



## Arizonian

Ok gentlemen, let's just agree to disagree. At the very least keep personal comments out of your posts. Much appreciated.


----------



## Jessekin32

This is silly. You must not be maxing out the games you are playing on your three screens, because I was hitting the VRAM cap in BF3, Metro 2033, AND very, very heavily modded Skyrim. (My friend is a nut when it comes to making mods, and I guinea pig them all.)

I wish I could have loved my 690 enough to keep it, but I simply couldn't max out my games because of the 2GB VRAM limit per GPU. In some games I'm getting better performance on just one of my FTW+ 680's (4GB) alone than I was with my 690, purely because I was going over 2GB in some games (mostly Skyrim).

It's a shame, really... Because as games become more VRAM demanding, I will more frequently come across this issue.

*NOTE:* I'm using a resolution of 6050x1080 (bezel corrected).


----------



## Kaapstad

GTX 690 quad sli no good for benching compared to HD 7970s ?

Look what is at the top of this list

http://www.3dmark.com/hall-of-fame/

I don't know what he's using to cool them, but I want two.


----------



## pilla99

Quote:


> Originally Posted by *Kaapstad*
> 
> GTX 690 quad sli no good for benching compared to HD 7970s ?
> Look what is at the top of this list
> http://www.3dmark.com/hall-of-fame/
> I don't know what hes using to cool them but I want two.


Benches don't take vram into account. Hence they can be misleading.


----------



## maximus56

Quote:


> Originally Posted by *Kaapstad*
> 
> GTX 690 quad sli no good for benching compared to HD 7970s ?
> Look what is at the top of this list
> http://www.3dmark.com/hall-of-fame/
> I don't know what hes using to cool them but I want two.


Touché!..lol


----------



## Kaapstad

Quote:


> Originally Posted by *pilla99*
> 
> Say what you like but unfortunately at the end of the day anyone that knows what they're doing (most people on this forum) understand a simple concept like vram, and hence why your decision was a poor one. That's not an opinion, that's a fact. 2GB of vram is not going to cut it in many games today(Skrim with mods, GTA with mods, NS2 with mods etc etc etc) and certainly not tomorrow.
> I can understand though, a 2k drop on GPU's that you then come to realize was a poor decision would rustle my jimmies too.


Your argument seems to rely on games with mods, which is an unsound argument. If you mod any game enough, it will not run on any available GPU. If you base your argument on off-the-shelf games, the bottom line is: up to 1600p the GTX 690 is fine, 2x GTX 690 is even better, and beyond 1600p it's time to break out the HD 7970s.

I'm also an HD 7970 owner (3x Asus Matrix Platinums, to be exact), so I'm not biased one way or the other.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *pilla99*
> 
> Say what you like but unfortunately at the end of the day anyone that knows what they're doing (most people on this forum) understand a simple concept like vram, and hence why your decision was a poor one. That's not an opinion, that's a fact. 2GB of vram is not going to cut it in many games today(Skrim with mods, GTA with mods, NS2 with mods etc etc etc) and certainly not tomorrow.
> I can understand though, a 2k drop on GPU's that you then come to realize was a poor decision would rustle my jimmies too.


Yep, I agree here.


----------



## Billymac10

I can't speak for 690 Quad SLI, but I'd just like to put in my two cents with my single 690 surround setup.

I've found it more than adequate for my resolution of 5160x1050. Granted, I've only been testing it with Far Cry 3 so far, but as demanding games go, it's a pretty good litmus test. With a mild overclock of +80 / +182 / 135%, I'm getting 50-70 FPS at that resolution all the time. Checking VRAM usage with Precision X, I've never been higher than 1400MB. When I add in 2x MSAA I'm up around 1700MB at 40-50 FPS, and with 4x it's around 1900-1950MB. At 4x my FPS drops to 25-35, with occasional spikes to 40+. Just for jollies I turned on 8x MSAA and it crawled. You could say that's because of the VRAM limit, but even if it weren't, it wasn't going to be playable given the FPS drop-off from 2x to 4x.

My point is that I'm actually running out of horsepower before hitting the VRAM limit.

So let's say I turn my resolution up to 5760x1080. I'd expect to take about a 15% performance hit. That would make 4x MSAA unplayable but 2X would still work and I'd still be under the VRAM limit even with the extra pixel count.

My conclusion is that if you want to run Far Cry 3 at 5760x1080 with 4x or 8x MSAA, you'd need more than either a single 690 or 2GB/4GB SLI 680's, because you're going to run out of processing power before VRAM. If you've got the funds for 3-way SLI 4GB 680's, that would probably be the best bet, but that's an extra $500 over a single 690. Personally, I don't care enough about high MSAA to justify the extra cost.

You can say "but in the future..." all you want but the fact is that unless the next gen consoles come with more than 2GB VRAM you're going to be fine playing about any game on PC with 2GB for the foreseeable future given the port nature of today's video game climate. There of course, will be exceptions to every rule though.
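As a back-of-the-envelope check on that 15% figure, here's the pixel count alone (assuming shading cost scales roughly linearly with pixels, which is a simplification):

```python
# Pixel counts for the two surround resolutions discussed above
current = 5160 * 1050   # 5,418,000 pixels
target = 5760 * 1080    # 6,220,800 pixels
increase = target / current - 1
print(f"{increase:.1%} more pixels to shade")  # ~14.8%
```

So the ~15% estimate lines up with the raw pixel increase.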


----------



## Kaapstad

I ran Far Cry 3 with everything maxed, including 8x MSAA, @1600p with 2x GTX 690s in quad SLI. I used the full 2GB of VRAM in the process, but I don't think I needed any more. The major problem was GPU horsepower: I was getting a playable 45 fps, and the game looked great. The GPUs were all scaling well, so I hate to think what it would have been like on one or two GPUs.

Anyone who says 2 GTX 690s are a waste should try Far Cry 3 with everything maxed, because you are going to run out of GPU muscle with anything less.


----------



## Alex132

Quote:


> Originally Posted by *pilla99*
> 
> Quote:
> 
> 
> 
> Originally Posted by *maximus56*
> 
> I believe that the latter is the prerequisite for the former. Unfortunately, it appears that you are lacking in both.
> 
> 
> 
> Say what you like but unfortunately at the end of the day anyone that knows what they're doing (most people on this forum) understand a simple concept like vram, and hence why your decision was a poor one. That's not an opinion, that's a fact. 2GB of vram is not going to cut it in many games today(Skrim with mods, GTA with mods, NS2 with mods etc etc etc) and certainly not tomorrow.
> 
> I can understand though, a 2k drop on GPU's that you then come to realize was a poor decision would rustle my jimmies too.

Yep
Quote:


> Originally Posted by *Kaapstad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pilla99*
> 
> Say what you like but unfortunately at the end of the day anyone that knows what they're doing (most people on this forum) understand a simple concept like vram, and hence why your decision was a poor one. That's not an opinion, that's a fact. 2GB of vram is not going to cut it in many games today(Skrim with mods, GTA with mods, NS2 with mods etc etc etc) and certainly not tomorrow.
> I can understand though, a 2k drop on GPU's that you then come to realize was a poor decision would rustle my jimmies too.
> 
> 
> 
> Your argument seems to rely on games with mods, this is an unsound argument. If you mod any game enough it will not run on any available GPU. If you base your argument on off the shelf games the bottom line is - up to 1600p the GTX 690 is fine, 2 x GTX 690 is even better, beyond 1600p its time to break out the HD 7970s.
> 
> Im also a HD 7970 owner - 3 x Asus matrix platinums to be exact so im not biased one way or another.

Yeah, and obviously you can't go for 5760x1080 with only 2GB of VRAM


----------



## Kaapstad

Quote:


> Originally Posted by *Alex132*
> 
> Yep
> Yeah, and obviously you can't go for 5760x1080 with only 2GB of VRAM


Not likely

If I did want to run that resolution I would use my HD 7970s.

2gb of vram is only good up to 1600p - Horses for Courses


----------



## maximus56

Works fine for me! And if I feel that my game enjoyment is being hampered by this "VRAM" issue, I will remedy the problem. Until such time, and based on my experience, VRAM is an academic issue, and it seems to be argued by those who have never actually run into it.


----------



## chabze

Just sold my HD 7970, and my GTX 690 is now on eBay.
Found it useless @ 5880x1080 in BF3.
The HD 7970 was better, most likely down to the VRAM.

So it's an HD 7990, methinks. Looks like a better card.
Yeah, I know I could have bought another 7970 and run them in CrossFire, but I like the space it saves having just the one card (cooling).

See how the hd7990 performs then if all good I can get another for the other 3 screens and get all 6 on the go.
Anyone had any experience with 6 screen eyefinity?


----------



## maximus56

Quote:


> Originally Posted by *Kaapstad*
> 
> I ran Far Cry 3 with everything maxed including 8x MSAA @1600 with 2 x GTX 690s quad sli. I used the full 2gb of vram in the process but I don't think that I needed any more. The major problem was GPU horsepower, I was getting a playable 45 fps and the game looked great. The GPUs were all scaling well so I hate to think what it would have been like on one or two GPUs.
> Anyone who says 2 GTX 690s are a waste should try Far Cry 3 with everything maxed, because you are going to run out of GPU muscle with anything less.


I find it quite remarkable that, in spite of the positive experiences being shared by the users of Quad 690s throughout this forum, some people continue to shamelessly recycle canned visceral responses against this configuration; as if in an attempt to convince themselves, rather than the actual owners of quad 690s...quite pitiful, really.

Glad to hear that you are enjoying your Quad 690s, as I am!


----------



## Rei86

Well, gonna RMA another 690. This one died on me while folding for the first time, after only 15 minutes.









Gonna see if I can send it back in and, with some extra cash, pick up two FTW+ 4GBs or two Classifieds. If not I'll just get another RMAed 690, sell it on the forums/eBay, and pick up the two 680 FTW+s or Classifieds.


----------



## iARDAs

How can a GPU die while folding?

I don't get it.

I wonder if the quality control is not as great on some GPUs. Then again, the 690 is the flagship GPU, so that shouldn't be an issue either.


----------



## Buzzkill

Quote:


> Originally Posted by *maximus56*
> 
> I find it quite remarkable that, in spite of the positive experiences being shared by the users of Quad 690s throughout this forum, some people continue to shamelessly recycle canned visceral responses against this configuration; as if in an attempt to convince themselves, rather than the actual owners of quad 690s...quite pitiful, really.
> Glad to hear that you are enjoying your Quad 690s, as I am!


I agree with you. You don't see this on other forums (Nvidia, EVGA & AsusROG). Bleeding-edge tech always has quirks or issues out of the gate, but driver support has been pretty good. It's funny how someone can repeat what someone else says when they have no experience with quad.


----------



## Rei86

Quote:


> Originally Posted by *iARDAs*
> 
> How can a GPU die while folding?
> 
> I don't get it.
> 
> I wonder if the quality control is not as great on some GPUs. Then again, the 690 is the flagship GPU, so that shouldn't be an issue either.


310.70 was the driver I was running.

Downloaded and ran Folding@home today.

The monitor went blank, but the CPU was running and I could hear the fans ramping up.

Tried the different DVI ports and it still wouldn't work.

I also have a 650 Ti in the system; connected the monitor to it and everything showed up fine.

Took the 690 out of this system, put it in the HTPC, and nothing.


----------



## maximus56

Quote:


> Originally Posted by *Buzzkill*
> 
> I agree with you. You don't see this on other forums (Nvidia, Evga & Asus). Bleeding-edge tech always has quirks or issues out of the gate, but driver support has been pretty good. It's funny how someone can repeat what someone else says when they have no experience with quad.


Right on... the above observation was precisely the reason I made my initial post, as I was perplexed by the disconnect between my user experience and what I was reading on this forum.
So, this is what I can delineate, based on the sample data of responses on this forum, with statistical significance:

1. 98% of quad GTX 690 users are very pleased with their configuration, provided that their period of ownership/use exceeds 30 days;
2. 95% of the negative comments regarding the quad 690 configuration are contributed by members who have a) no first hand experience of the quad 690 setup, and, b) have little to 0 experience with any quad configurations at all.

I guess that we can chalk this behavior up to ignorance at best, and maliciousness at worst.


----------



## armando666

Quote:


> Originally Posted by *maximus56*
> 
> Right on..the above noted observation was precisely the reason why I made my initial post, as I was perplexed by the disconnect between my user experience and what I was reading on this forum.
> So, this is what I can delineate, based on the sample data of responses on this forum, with statistical significance:
> 1. 98% of quad GTX 690 users are very pleased with their configuration, provided that their period of ownership/use exceeds 30 days;
> 2. 95% of the negative comments regarding the quad 690 configuration are contributed by members who have a) no first hand experience of the quad 690 setup, and, b) have little to 0 experience with any quad configurations at all.
> I guess that we can chalk this behavior up to ignorance at best, and maliciousness at worst.


So, I guess it's OK to play Minecraft in surround with this setup?


----------



## pilla99

Quote:


> Originally Posted by *maximus56*
> 
> Right on..the above noted observation was precisely the reason why I made my initial post, as I was perplexed by the disconnect between my user experience and what I was reading on this forum.
> So, this is what I can delineate, based on the sample data of responses on this forum, with statistical significance:
> 1. 98% of quad GTX 690 users are very pleased with their configuration, provided that their period of ownership/use exceeds 30 days;
> 2. 95% of the negative comments regarding the quad 690 configuration are contributed by members who have a) no first hand experience of the quad 690 setup, and, b) have little to 0 experience with any quad configurations at all.
> I guess that we can chalk this behavior up to ignorance at best, and maliciousness at worst.


I'm not sure why you are adopting a victim complex here; I am simply pointing out (as others have) that even a single 680 4GB can perform better than a quad 690 setup once VRAM usage exceeds 2GB. Again, it doesn't matter who posts these facts or what configuration they are running; that's just the fact of the matter. So spending $2k on GPUs that have a ceiling like that, especially with high resolutions and multi-monitor setups, might be considered a mistake, considering that for around the same price four 4GB 680s could be had, fixing the VRAM ceiling.

But sure, if you're a casual gamer who never gets into mods and things like that, for now 2GB should be just fine for you, even at higher than 1600p in some cases.

Edit: At this point I'm done with the argument; the only thing you've been able to come up with are off-the-cuff ad hominem attacks that do nothing to back up your point and make you look like you can't argue. If we were in court or a debate you would lose, and badly. I have been trying to get you to understand by using things like "logic" and "facts." You seem to ignore these. Whatever it is you do for a living, Jesus, I hope you aren't a lawyer.


----------



## Alex132

Quote:


> Originally Posted by *maximus56*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Buzzkill*
> 
> I agree with you. You don't see this on other forums (Nvidia, Evga & Asus). Bleeding-edge tech always has quirks or issues out of the gate. But driver support has been pretty good. It's funny how someone can repeat what someone else says when they have no experience with quad.
> 
> 
> 
> Right on..the above noted observation was precisely the reason why I made my initial post, as I was perplexed by the disconnect between my user experience and what I was reading on this forum.
> So, this is what I can delineate, based on the sample data of responses on this forum, with statistical significance:
> 
> 1. 98% of quad GTX 690 users are very pleased with their configuration, provided that their period of ownership/use exceeds 30 days;
> 2. 95% of the negative comments regarding the quad 690 configuration are contributed by members who have a) no first hand experience of the quad 690 setup, and, b) have little to 0 experience with any quad configurations at all.
> 
> I guess that we can chalk this behavior up to ignorance at best, and maliciousness at worst.
Click to expand...

I love how someone insulted your intelligence, and now you're trying to use more sophisticated words in sentences. It really comes off badly; I suggest you just stick to normal syntax, please.


----------



## Kaapstad

Quote:


> Originally Posted by *pilla99*
> 
> I'm not sure why you are adopting a victim complex here, I am simply pointing out (as others have) that even a single 680 4GB can perform better than a quad 690 setup when the vram exceeds to 2GB.


You really do need to do your homework.

Yes, above 1600p, 2GB of VRAM is not enough.

But to say a single GTX 680 4GB will outperform a quad 690 setup is unwise. The single 680 does not have enough GPU grunt to get the FPS up high enough to use that 4GB of VRAM. If you are going to benefit from that VRAM, you really need 3 or, better still, 4 GPUs to get the FPS high enough to start really using the extra VRAM.

The other thing that comes into the equation is the 256-bit memory bus on a GTX 680, which is considered poor for high resolutions whether the card has 2 or 4GB of VRAM. Although most HD 7970s only have 3GB of VRAM, they do come with a 384-bit memory bus, which helps a lot at high resolutions. This is one of the reasons HD 7970s often do well with multi-monitor setups.

Having said all that, the GTX 690 is not suitable for above 1600p with maxed settings.

But that still leaves a lot of options @1600p or below.
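The bus-width point above can be sketched with some back-of-the-envelope numbers (the specs here are quoted from memory and only illustrative): peak memory bandwidth is (bus width / 8) × effective data rate, so the 7970's wider bus wins even at a lower memory clock.

```python
# Back-of-the-envelope peak memory bandwidth (illustrative; specs approximate):
# bandwidth (GB/s) = (bus width in bits / 8) * effective data rate (GT/s)

def bandwidth_gbps(bus_bits, data_rate_gtps):
    """Peak theoretical memory bandwidth in GB/s."""
    return (bus_bits / 8) * data_rate_gtps

# GTX 680 (and each GPU on a 690): 256-bit bus, ~6.0 GT/s effective
gtx_680 = bandwidth_gbps(256, 6.0)
# HD 7970: 384-bit bus, ~5.5 GT/s effective
hd_7970 = bandwidth_gbps(384, 5.5)

print(f"GTX 680: {gtx_680:.0f} GB/s")   # ~192 GB/s
print(f"HD 7970: {hd_7970:.0f} GB/s")   # ~264 GB/s
```

Roughly a third more raw bandwidth for the 7970, which is in line with why it tends to hold up better at surround resolutions.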


----------



## pilla99

Quote:


> Originally Posted by *Kaapstad*
> 
> You really do need to do your homework
> Yes above 1600p 2gb of vram is not enough
> But to say a single gtx 680 4gb will outperform a quad 690 setup is unwise. The single 680 does not have enough GPU grunt to get the FPS up high enough to use that 4gb of vram. If you are going to benefit from that vram you really need 3 or even better still 4 GPUs to get the FPS up to a high enough level to start really using the extra vram.
> The other thing that comes into the equation is the 256 bit memory bus on a GTX 680, which for high resolutions is considered poor whether the card has 2 or 4 gb of vram. Although most HD 7970s only have 3gb of vram, they do come with a 384 bit memory bus which helps a lot at high resolutions. This is one of the reasons HD 7970s often do well with multi monitor setups.
> Having said all that the GTX 690 is not suitable for above 1600p with maxed settings.
> But that still leaves a lot of options @1600p or below.


I play NS2 competitively. When I'm messing around in modded servers (TLG), r_stats can be used to show VRAM usage. At max settings at 1440p I hit 2.3GB and my FPS comes to a grinding halt. Again, mainstream games like CoD or Far Cry built for consoles will be fine in most cases, but that doesn't mean more adventurous gamers who get off the beaten path won't have issues. Issues that I personally wouldn't have if I had just been smart and grabbed two 680s. This is a problem for me and I'm only at 1440p in 2012. 1600p + multi-monitor in 2013 is going to be a bad setup for 2GB, I think.


----------



## Kaapstad

I think 3GB of VRAM is going to be the bare minimum for a top GPU in 2013, and 4GB will be very common.

I have not pointed it out before, but even when I bought my 690s back in May I thought NVIDIA had made a bad mistake not giving them more VRAM. For the price they charge for the cards, they could have done this easily.


----------



## HBoisvert

Seasons Greetings One & All.

I am a proud owner of an EVGA GeForce GTX690 Signature 4096MB. Can anyone tell me how to join this distinguished owners club?

Thanks


----------



## Alex132

Quote:


> Originally Posted by *HBoisvert*
> 
> Seasons Greetings One & All.
> 
> I am a proud owner of an EVGA GeForce GTX690 Signature 4096MB. Can anyone tell me how to join this distinguished owners club?
> 
> Thanks


Picture and brand; it's in the OP.

Also: why aren't they (EVGA) selling the 690 backplate anymore? Something wrong with it... or what?


----------



## Buzzkill

Quote:


> Originally Posted by *Alex132*
> 
> picture and brand, its in the OP.
> Also; why aren't they selling the 690 backplate anymore (EVGA), something wrong with it.... or what?


Backplates are out of stock everywhere but eBay. EVGA has removed its listing for 690 backplates. You can sign up to get an email when they're back in stock at FrozenCPU, Performance-PCs & Sidewinder:

- eBay.com - EVGA GTX 690 Backplate - US $32.99
- FrozenCpu.com - EVGA GeForce GTX 690 Backplate - $24.99 (Out of Stock)
- Performance-Pcs.com - EVGA GeForce GTX 690 Backplate - $24.99 (Out of Stock)
- Sidewinder.com - EVGA GTX 690 Backplate - $24.99 (Out of Stock)


----------



## Rei86

Quote:


> Originally Posted by *Alex132*
> 
> picture and brand, its in the OP.
> Also; why aren't they selling the 690 backplate anymore (EVGA), something wrong with it.... or what?


Because the life of the 600 series is done. EVGA has put out all of their versions of the 600 series, and NVIDIA is now done with the GT 640/620/650/650 Ti releases.

I'll have my GTX690 backplate up for sale soon.


----------



## qiplayer

ABOUT VRAM

I had 2x GTX 690, and now have 2x GTX 680.
With neither setup have I hit a VRAM bottleneck.
And my res is 6000x1080,
so more than just a 1600p monitor.
Maybe you don't know that you can disable Aero in Win7 and save 400-500MB of VRAM.
So I can max out Crysis 2; the bottleneck isn't the VRAM.
In BF3 I don't remember if there was even one thing that made me reach the 2GB of VRAM. But with one 690, or two 680s, the bottleneck was the power of these cards and not the VRAM.

Then, about quad SLI on the 690...
I'm the third one here who tried that setup, and I had other problems beyond any VRAM issue. With 1200MB of VRAM in use and 120 fps, the game wasn't fluid.
I must say I wasn't satisfied with the EVGA support, nor with the NVIDIA support, so I sold the cards while they were still hard to find.

Good for you if this setup works fine
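For scale, the raw pixel counts behind the surround-vs-1600p comparisons in this thread (plain arithmetic, nothing card-specific; 5760x1080 is the triple-1080p surround res several posters run):

```python
# Pixel counts for the resolutions under discussion (plain arithmetic)
surround = 5760 * 1080      # triple-1080p surround
wqxga    = 2560 * 1600      # single 1600p monitor
wqhd     = 2560 * 1440      # single 1440p monitor

print(f"5760x1080: {surround/1e6:.2f} MP")  # 6.22 MP
print(f"2560x1600: {wqxga/1e6:.2f} MP")     # 4.10 MP
print(f"2560x1440: {wqhd/1e6:.2f} MP")      # 3.69 MP

# Surround pushes ~1.5x the pixels of a 1600p panel, so both GPU grunt
# and framebuffer VRAM scale up before HD textures even enter the picture.
print(f"ratio vs 1600p: {surround / wqxga:.2f}x")  # 1.52x
```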


----------



## Jessekin32

Quote:


> Originally Posted by *Rei86*
> 
> Well, gonna RMA another 690. This one died on me when it was folding for the 1st for only 15mins
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna see if I can send it back in and for some cash pick up a FTW + 4GB x2 or Classified x2. If not I'll just get another RMAed 690 and sell it on the forums/Ebay and pick up the 680x2 FTW+ or Classified.


I'm currently trying to sell my 690, while picking up 680 FTW+ cards for Dual SLI.

IMHO, go with the FTW+; you won't regret it. You're more likely to get a higher-clocking card with the FTW+ than with the Classifieds. The FTW+ are hand-picked cards, while the Classifieds are all manufactured and shipped out; they aren't cherry-picked like the FTW+s for stability.

Also, the Classifieds recycle more heat inside the case than the FTW+s because of the split exhaust. If you are like me and want lower temps inside your case for watercooling/CPU temps, you'd go for the FTW+. Even with my case, which has ample cooling, it would still heat up fast. Especially with two.

*EDIT:* I'm not trying, nor do I think I am derailing the forum saying this. It still relates to 690's, in that people are looking for alternatives, as I did. If people are leaving the Owners Club with good reason, I feel it's our job as a community to point them in a well educated direction.


----------



## Rei86

Hmm, first time hearing that.

The EVGA Classifieds are their cherry-picked "best of the best" cards.

Either way, I've already purchased two off of Amazon.


----------



## maximus56

Quote:


> Originally Posted by *pilla99*
> 
> I'm not sure why you are adopting a victim complex here, I am simply pointing out (as others have) that even a single 680 4GB can perform better than a quad 690 setup when the vram exceeds to 2GB. Again it doesn't matter who posts these facts or what configuration they are running, that's just the fact of the matter. So spending 2k on GPU's that have a ceiling like that, especially with high resolutions and multi-monitor setups might be considered a mistake. Considering that for around the same price 4 4GB 680's could be had fixing the vram ceiling.
> But sure if you're a casual gamer that never gets into mods and things like that for now 2GB should be just fine for you even at higher than 1600p in some cases.
> Edit: At this point I'm done with the argument, the only thing you've been able to come with with are off the cuff ad hominem attacks that do nothing to back up your point and make you look like you can't argue. If we were in court or a debate you would lose, and badly. I have been trying to get you to understand by using thngs like "logic" and "facts." You seem to ignore these. Whatever it is you do for a living, Jesus I hope you aren't a lawyer.


Didn't know whether to laugh or cry once I finally got around to reading your post. You can't understand what I am saying, because you don't know. And THAT IS THE POINT!!

I appreciate your concern for wanting to know what I do for a living, so in the same spirit, here is my suggestion: have a backup plan if this school thing does not work out for you, or better yet, stay in school a bit longer and maybe you will learn a thing or two.


----------



## maximus56

Quote:


> Originally Posted by *Alex132*
> 
> I love how someone insulted your intelligence, and now you're trying to use more sophisticated words in sentences. It really comes off badly, I suggest you just stick to normal syntax, please.


I have no idea what you said here, as your IQ meter seems to be dropping in lockstep with pilla99's. Do you even have a 690, or are you just here for nuisance value? I hope this is "normal enough syntax" for you.


----------



## maximus56

Quote:


> Originally Posted by *qiplayer*
> 
> ABOUT VRAM
> I had 2x GTX 690, and now have 2x GTX 680.
> With neither setup have I hit a VRAM bottleneck.
> And my res is 6000x1080,
> so more than just a 1600p monitor.
> Maybe you don't know that you can disable Aero in Win7 and save 400-500MB of VRAM.
> So I can max out Crysis 2; the bottleneck isn't the VRAM.
> In BF3 I don't remember if there was even one thing that made me reach the 2GB of VRAM. But with one 690, or two 680s, the bottleneck was the power of these cards and not the VRAM.
> Then, about quad SLI on the 690...
> I'm the third one here who tried that setup, and I had other problems beyond any VRAM issue. With 1200MB of VRAM in use and 120 fps, the game wasn't fluid.
> I must say I wasn't satisfied with the EVGA support, nor with the NVIDIA support, so I sold the cards while they were still hard to find.
> Good for you if this setup works fine


Appreciate you sharing your experience with this set up


----------



## Arizonian

Quote:


> Originally Posted by *maximus56*
> 
> photo.JPG 815k .JPG file
> 
> 
> photo (1).JPG 2153k .JPG file
> 
> http://www.3dmark.com/hall-of-fame/3dmark-11-top-extreme-preset/
> http://cdn.overclock.net/c/c5/c5c9ea93_00019.jpeg
> System Specs:
> Mother Board: ASUS Rampage IV Extreme
> CPU: Intel i7-3930K @4.6GHZ
> GPU: EVGA Nvidia GTX690 (Quad SLI)
> Memory: Corsair Dominator GT DDR3-2133 16GB (4x4GB
> Case: CaseLabs STH10
> SSD: Samsung Series 250GB SSD x2 and Segate Barracuda 3TB SATA 6.0Gb/s x1
> Optical Drive: LG Electronics 12X Internal SATA Blu-Ray Rewriter
> Sound Card: Creative Sound Blaster Recon3D
> Power Supply: Corsair AX1200 and Seasonic 850W
> Cooling: Koolance CPU, GPU and MOBO WB | Black Ice SR1 560, Alphacool NexXxoS Monsta,
> XSPC RX480, Koolance 3x120mm | Swiftech MCP655 x2| Scythe Gentle
> Typhoon 120MM 1850 RPM x15, Noiseblocker NB-BlackSilentPro 140 MM 1700 RPM
> x4, BitFenix Spectre Pro 120mm RED LED Fans x4
> Monitor: Asus VG278H 27-Inch 120Hz 3D Monitor x3


Hi maximus56, I think jcde7ago missed you as he's taking over the reins again and there was some confusion. I got you added. My apologies; we almost missed it.


----------



## Rei86

I missed some of the conversation that's been taking place here since I'm a bit miffed about my own 690 taking a dump, but this comment
Quote:


> Originally Posted by *pilla99*
> 
> I am simply pointing out (as others have) that even a single 680 4GB can perform better than a quad 690 setup when the vram exceeds to 2GB.


is pretty twisted.

Yes, once the 690 hits the VRAM ceiling it won't be performing, but stating that a SINGLE 4GB 680 can perform better is a flat-out lie.

Whatever game is able to gum up a quad-GK104 setup can definitely slow down a single 680.


----------



## maximus56

Quote:


> Originally Posted by *Arizonian*
> 
> Hi maximus56, I think jcde7ago missed you as he's taking over the reins again and there was some confusion. I got you added. My apologies; we almost missed it.


Many Thanks


----------



## maximus56

Quote:


> Originally Posted by *Qu1ckset*
> 
> Do you have a hate for tubing? lol
> p.s much easier if your fill out the rig builder


"Hate for tubing?" ..lol. You are right Qu1ckset, something like that..just didn't like the look of it in my rig


----------



## jcde7ago

Quote:


> Originally Posted by *HBoisvert*
> 
> Seasons Greetings One & All.
> I am a proud owner of an EVGA GeForce GTX690 Signature 4096MB. Can anyone tell me how to join this distinguished owners club?
> Thanks


Post a pic of your card with your name and I'll get you added ASAP!


----------



## Victor_Mizer

Is it a good idea to get a 690 with 1440p res? Or would getting x2 4GB cards be the better option?


----------



## Qu1ckset

Quote:


> Originally Posted by *Victor_Mizer*
> 
> Is it a good idea to get a 690 with 1440p res? Or would getting x2 4GB cards be the better option?


I have a single GTX 690 with my 1440p monitor and it maxes everything out with ease.








(4gb 680s is better option tho)


----------



## Victor_Mizer

Quote:


> Originally Posted by *Qu1ckset*
> 
> I have a single GTX 690 with my 1440p monitor and it maxes everything out with ease
> 
> 
> 
> 
> 
> 
> 
> 
> (4gb 680s is better option tho)


Alright thanks


----------



## Rei86

Quote:


> Originally Posted by *Victor_Mizer*
> 
> Is it a good idea to get a 690 with 1440p res? Or would getting x2 4GB cards be the better option?


Single GTX690 with a 1440p monitor and I'm doing alright. Steady 60fps with everything maxed out in games that I play.


----------



## maximus56

Quote:


> Originally Posted by *Kaapstad*
> 
> Not likely
> If I did want to run that resolution I would use my HD 7970s.
> 2gb of vram is only good up to 1600p - Horses for Courses


As an FYI, I have never run out of VRAM with ultra settings in BF3 and Skyrim (45 mods). I have not played FC3 yet, but will try it once I get a chance. I am running 5760 x 1080.
Here is someone doing a 2gb benchmark, albeit with 680s:
http://forums.overclockers.co.uk/showthread.php?t=18444958

Just thought that I should share my experience using quad 690 and 3 x VG278H.


----------



## Victor_Mizer

Quote:


> Originally Posted by *maximus56*
> 
> As an FYI, I have never run out of VRAM with ultra settings for BF3 and Skyrim (45 mods). I have not played FC3 yet, but will try once I get a chance. I am running 5760 x 1080.
> Here is someone doing a 2gb benchmark, albeit with 680s:
> http://forums.overclockers.co.uk/showthread.php?t=18444958
> Just thought that I should share my experience using quad 690 and 3 x VG278H.


I was all ready to order the 690, but went with the 4GB 680 instead. I figure with the new 7xx due Q2 next year, I could put that $500 towards that if they're good. Also, all the talk about VRAM is just confusing at times: people saying they have no problems and others saying they go well over the 2GB of VRAM... That site is a pretty nice little benchmark.


----------



## jcde7ago

Quote:


> Originally Posted by *Victor_Mizer*
> 
> I was all ready to order the 690, but went 4GB 680 instead. I figure with the new 7xx due q2 next year I could put that 500$ towards that if they are good. Also all the talk about vram is just confusing at times, people saying they have no problems and others saying they go well over the 2gb vram... that site is pretty nice little benchmark.


Not a bad choice, though I'd imagine the extra horsepower of essentially another 680 in the 690 would be more beneficial to you than the additional 2GB of VRAM... but I can see you deciding to hold out for the 700 series. Enjoy that 4GB 680!


----------



## maximus56

Quote:


> Originally Posted by *Victor_Mizer*
> 
> I was all ready to order the 690, but went 4GB 680 instead. I figure with the new 7xx due q2 next year I could put that 500$ towards that if they are good. Also all the talk about vram is just confusing at times, people saying they have no problems and others saying they go well over the 2gb vram... that site is pretty nice little benchmark.


I hope that the 7xx comes out next year, but I read somewhere that NVIDIA is in no hurry to introduce another card and will most likely run down its existing Kepler lineup until the end of next year. Hoping this turns out to be untrue... either way, I am sure you will be happy with your choice, as all the Kepler-series cards are good.


----------



## Rei86

Quote:


> Originally Posted by *Victor_Mizer*
> 
> I was all ready to order the 690, but went 4GB 680 instead. I figure with the new 7xx due q2 next year I could put that 500$ towards that if they are good. Also all the talk about vram is just confusing at times, people saying they have no problems and others saying they go well over the 2gb vram... that site is pretty nice little benchmark.


If you're heavy into "HD Textures" and mods ala Skyrim, with a high enough resolution you will run into a VRAM issue.
Quote:


> Originally Posted by *maximus56*
> 
> I hope that the 7xx comes out next year, but I read somewhere that NVIDIA is in no hurry to introduce another card and will most likely run down its existing Kepler lineup until the end of next year. Hoping this turns out to be untrue... either way, I am sure you will be happy with your choice, as all the Kepler-series cards are good.


The 700 series will be Kepler. The GTX 780 will most likely be the GK114, with probably around 2XXX CUDA cores.


----------



## maximus56

Quote:


> Originally Posted by *Rei86*
> 
> If you're heavy into "HD Textures" and mods ala Skyrim, with a high enough resolution you will run into a VRAM issue.
> The 700 series will be Keplers. The GTX780 will most likely be the GK114 with probably around 2XXX cuda cores.


You'll most likely run into a "GPU grunt" issue before running into a VRAM issue. And Skyrim is well optimized for NVIDIA, so unless you're on a multi-monitor setup (to your point), play away with mods; just make sure to read the load-order descriptions and check for any conflicts. Yes, GK114 is Kepler, and that's why I said *existing* lineup of Kepler cards. Anyway, I would like NVIDIA to introduce this card sooner rather than later; it would be an excuse for another build.. lol


----------



## Rei86

Quote:


> Originally Posted by *maximus56*
> 
> Most likely run into more "gpu grunt" issue before running into a vram issue. And, Skyrim is well optimized for Nvidia, so unless multimonitor (to your point) set up, play away with mods; just make sure to read the load order description and any conflicts. Yes, GK114 Kepler, and that's why I said *existing* line up of Kepler cards. Anyways, I would like for Nvidia to introduce this card earlier rather than later, would be an excuse for another build..lol



Won't happen.

We'll see the HD8000 before we get a GTX700


----------



## maximus56

You could be right, as I don't know anything about the HD 8000. All I am sharing is what I read in some rag about the "technology predictions for 2013" for the 700s. So please take what I said purely for its speculative value..


----------



## Jessekin32

Quote:


> Originally Posted by *Rei86*
> 
> hmm 1st time hearing that.
> The EVGA Classifieds are their Cherry Picked "best of the best" cards.
> Either ways already purchased two off of amazon.


Well, I should clarify; I did a terrible job of explaining it. The Classified cards have much higher potential to be better overclockers than the FTW cards, for obvious reasons like cooling, better power phases, more control, etc.

But the FTW cards are tiered, and the Classifieds are not. By tiered, I mean the FTW cards come in different flavors of overclocks because of different levels of stability. You could get an FTW+ card that goes FAR beyond the stock clocks of a Classified because of it being a stable card, while the standard FTW and FTW LE cards are less stable.

The Classifieds aren't cherry-picked. All the Classifieds are made and sold as-is at their stock clock speeds; there aren't any tiers to the Classifieds. You could get a dud that barely reaches 1200MHz, or one that reaches 1500MHz. Their range of variation is greater than that of an FTW+.

CPUs are made and sold in this same fashion. Some are "born" stronger than others, so they're labeled differently and sold for more.









In short, your odds of getting a Classified card that won't OC as well as an FTW+ are low, but I didn't wanna risk it. That wasn't the main reason I didn't get Classifieds, though: I didn't want hot air from my GPUs circulating in my case because of the split-exhaust design on the Classifieds. It's the reason I like blower-style cards; nearly all the air is immediately kicked outside the case, and the Classifieds ruin that for me.









*Also,* this should be the last time I mention 680s in a 690 forum.


----------



## pilla99

Just installed an H100; this thing is like running ice on the CPU, holy crap. It also keeps the 690 cooler because the overall case temp is lowered.
Feelsgoodman, idling at like 35C.

Edit: this CPU has always run way too hot, like something is wrong with it. Before, at this same OC, it idled at around 50C and gaming took it to ~75C.
Prime95 now can't take it over 60C, which for me is amazing.

Edit 2: and now hovering at around 40C with a 4.5GHz OC.


----------



## tonyjones

H100 is amazing


----------



## V3teran

I don't think a single 780, when it arrives, will beat a 690 anyhow.


----------



## fatlardo

OK, need some of your opinions, guys. I currently have Gigabyte 660 Ti OC versions in SLI. I love them and have no problems at all; they do everything I want and need. But I was offered a brand new, sealed GTX 690 for half price. Any reason why I should jump on this? How will the warranty work, as he does not have a receipt? How much of a difference will I see and feel? Waste of money for me? Or buy and resell? Good lord, what should I do? I do want to mention that not having the best makes me always want to upgrade from my 660 Tis; I'm constantly looking for a cheap upgrade.


----------



## egotrippin

Quote:


> Originally Posted by *fatlardo*
> 
> OK, need some of your opinions guys. I currently have gigabyte 660 Ti OC version in SLI. I love them and have no problems at all. It does everything I want and need. But I was offered a brand new sealed gtx 690 for half price. Any reason why I should jump on this? How will warranty work as he does not have receipt. How much of a difference will I see and feel? Waste of money for me? Or buy and resale? Good lord, what should I do? I do want to mention, not having the best does make me always want me to upgrade from my 660 Tis. I'm constantly looking for a cheap upgrade.


If it works, buy it. The 690 is more powerful than two 660 Tis and only occupies one PCIe slot, uses less electricity, creates less heat, and uses less CPU and PCIe bandwidth. I have a GTX 690 and I can play every game at max settings at 2560x1440 and maintain 60 fps. If that's overkill for your needs, then resell it. I'm tempted to buy it if you don't (and I already have one).


----------



## fatlardo

Quote:


> Originally Posted by *egotrippin*
> 
> If it works, buy it. 690 is more powerful than two 660s and only occupies one PCIE slot, uses less electricity, creates less heat, uses less CPU or PCIE bandwidth. I have a GTX 690 and I can play every game at max settings at 2560x1440 and maintain 60 fps. If that's overkill for your needs, then resell it. I'm tempted to buy it if you don't (and I already have one).


If I do get it, I will have the 660 Tis sitting around. Do you know how the warranty will work?


----------



## Kyouki

Quote:


> Originally Posted by *fatlardo*
> 
> OK, I need some of your opinions, guys. I currently have Gigabyte 660 Ti OC cards in SLI. I love them and have no problems at all; they do everything I want and need. But I was offered a brand new, sealed GTX 690 at half price. Any reason why I should jump on this? How will the warranty work, since he doesn't have the receipt? How much of a difference will I see and feel? Waste of money for me? Or buy and resell? Good lord, what should I do? I do want to mention that not having the best always makes me want to upgrade from my 660 Tis. I'm constantly looking for a cheap upgrade.


Well, first, you would be getting better performance than the two 660 Tis in SLI, because the 690 is essentially two 680s in SLI! And second, HELL yeah: if you're getting it at half price there's no reason not to. You could always resell it at that price and make your money back, no problem. You could even sell your 660s and, if you play your cards right, end up with the GTX 690 at little to no cost. If not, I would be interested in this deal, haha. Even though people say not to go quad, at half price I would do it for ***** and giggles.


----------



## fatlardo

Quote:


> Originally Posted by *Kyouki*
> 
> Well, first, you would be getting better performance than the two 660 Tis in SLI, because the 690 is essentially two 680s in SLI! And second, HELL yeah: if you're getting it at half price there's no reason not to. You could always resell it at that price and make your money back, no problem. You could even sell your 660s and, if you play your cards right, end up with the GTX 690 at little to no cost. If not, I would be interested in this deal, haha. Even though people say not to go quad, at half price I would do it for ***** and giggles.


How does warranty work? He does not have receipt.


----------



## GTX 690 SLI

Hi guys, I'm new here. I have two of them and am now going for EPIC ambient water cooling. :thumb:
Can I join the club?


----------



## Kyouki

Quote:


> Originally Posted by *fatlardo*
> 
> How does warranty work? He does not have receipt.


Good question. Most of them are bought online, in which case there would be an email or digital receipt. But if there is no receipt at all, I would assume EVGA (or whoever made it) can track it by serial number. I'd just call them and see how they can help you once you provide that information.


----------



## jcde7ago

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Hi guys, I'm new here. I have two of them and am now going for EPIC ambient water cooling. :thumb:
> Can I join the club?


Added, welcome to the club! Please do post a picture of your cards once you do get them!









Also, Arizonian will have to show me how to add his giant "Updated" pic for new members...


----------



## Arm3nian

Quote:


> Originally Posted by *egotrippin*
> 
> If it works, buy it. 690 is more powerful than two 660s and only occupies one PCIE slot, uses less electricity, creates less heat, uses less CPU or PCIE bandwidth. I have a GTX 690 and I can play every game at max settings at 2560x1440 and maintain 60 fps. If that's overkill for your needs, then resell it. I'm tempted to buy it if you don't (and I already have one).


Except Far Cry 3; I refuse to play that game. I can't get 60fps with 8x MSAA. I know that level of AA is tough, especially at 1440p, but I think a driver update should do it.


----------



## teichu

Quote:


> Originally Posted by *Arm3nian*
> 
> Except Far Cry 3; I refuse to play that game. I can't get 60fps with 8x MSAA. I know that level of AA is tough, especially at 1440p, but I think a driver update should do it.


Lol, you could never play with 8x MSAA on a single GTX 690. I just turn it off; runs smoother.


----------



## Arm3nian

Quote:


> Originally Posted by *teichu*
> 
> Lol, you could never play with 8x MSAA on a single GTX 690. I just turn it off; runs smoother.


I play Skyrim with mods at 8x, and get 150fps in BF3 with 4x, so 8x is easy. I had a second 690, and really the only point of it was to run The Witcher 2 with ubersampling, and Metro, which is terribly optimized, like Far Cry 3 currently.


----------



## xHoLy

Hey,

The latest games, especially Far Cry 3, really heat up my GTX 690: both GPUs sit around 90°C at almost 100% usage. GPU1 is usually 2-3°C hotter for some reason; the max temperature I have seen so far was 94°C on GPU1.

I'm curious to know if that's bad, and what I could possibly do to lower it a little?


----------



## teichu

Quote:


> Originally Posted by *Arm3nian*
> 
> I play Skyrim with mods at 8x, and get 150fps in BF3 with 4x, so 8x is easy. I had a second 690, and really the only point of it was to run The Witcher 2 with ubersampling, and Metro, which is terribly optimized, like Far Cry 3 currently.


Well, I can play BF3 flawlessly without any issues. The only game I've had a little trouble running is Far Cry 3. Metro I haven't tested yet, although I have it on my Steam list; as far as I know Metro might need PhysX for better performance. I used to run Metro on 580 SLI, and I remember I got pretty smooth play.


----------



## Rei86

Quote:


> Originally Posted by *xHoLy*
> 
> Hey,
> The latest games, especially Far Cry 3, really heat up my GTX 690: both GPUs sit around 90°C at almost 100% usage. GPU1 is usually 2-3°C hotter for some reason; the max temperature I have seen so far was 94°C on GPU1.
> I'm curious to know if that's bad, and what I could possibly do to lower it a little?


GPU1 will always be hotter than GPU2. That's just the nature of having two GPUs on one PCB.

Clean the dust out of your GTX 690 and everything else in your computer. Then take apart your GTX 690, clean the GPU/vapor chamber, and re-TIM it. That should drop the temps. I've never had my 690 reach over 69°C on GPU1 with a fan profile and a re-TIM with IC Diamond. And my 2nd GTX 690 with an AC Turbo never hit over 52°C on GPU1.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Rei86*
> 
> GPU1 will always be hotter than GPU2. That's just the nature of having two GPUs on one PCB.
> Clean the dust out of your GTX 690 and everything else in your computer. Then take apart your GTX 690, clean the GPU/vapor chamber, and re-TIM it. That should drop the temps. I've never had my 690 reach over 69°C on GPU1 with a fan profile and a re-TIM with IC Diamond. And my 2nd GTX 690 with an AC Turbo never hit over 52°C on GPU1.


One GPU hotter than the other was the case with, say, the ATI 5970, since the fan is on one side of that cooler. But the fan is in the middle of the GTX 690 cooler, and both GPUs have the same heatsink on either side.

One core hotter than the other is just a case of more/less TIM, an uneven heatsink surface, or one GPU die being more uneven than the other.

Both of my GPUs idle and load at the same temps on my waterblock as well.


----------



## Rei86

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> One GPU hotter than the other was the case with, say, the ATI 5970, since the fan is on one side of that cooler. But the fan is in the middle of the GTX 690 cooler, and both GPUs have the same heatsink on either side.
> One core hotter than the other is just a case of more/less TIM, an uneven heatsink surface, or one GPU die being more uneven than the other.
> Both of my GPUs idle and load at the same temps on my waterblock as well.


That's on a waterblock; we're talking about air.

Both 690s I've owned have always had GPU1 at a higher temp at idle and load.

Same on other forums for 690 owners. The load is usually heavier on GPU1 anyway, since that's usually the GPU driving your video output.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Rei86*
> 
> That's on a waterblock; we're talking about air.
> Both 690s I've owned have always had GPU1 at a higher temp at idle and load.
> Same on other forums for 690 owners. The load is usually heavier on GPU1 anyway, since that's usually the GPU driving your video output.


I'm talking about air, then I said it's the same with my waterblock too.


----------



## Rei86

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> I'm talking about air, then I said it's the same with my waterblock too.


Don't know what to say, as both of my 690s have always had GPU1 hotter than GPU2.

Same for other forum members I've talked to with 690s.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Rei86*
> 
> Don't know what to say, as both of my 690s have always had GPU1 hotter than GPU2.
> Same for other forum members I've talked to with 690s.


If this was the case, then my waterblock wouldn't be able to keep both GPUs at the same temp. I'm talking about GPU1 supposedly being hotter because the DVI is plugged into it, as you say.


----------



## Rei86

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> If this was the case, then my waterblock wouldn't be able to keep both GPUs at the same temp. I'm talking about GPU1 supposedly being hotter because the DVI is plugged into it, as you say.


But you're under water now. The numbers I'm talking about have been on air. I monitored my temps, core speed, memory speed etc. when I benched with them, and GPU1 has always been hotter on air. Same on the 2nd 690 I received. Another member on the EVGA forums that I've talked to says the same for his, and for others as well. I haven't run into many watercooled 690 owners, so that's an unknown to me.


----------



## MrTOOSHORT

But don't you think that the DVI giving the load to GPU1 would give higher temps, as you said, even with a waterblock?

I took the GTX 690 cooler apart, and everything is even on both GPUs, and the fan is in the middle pushing air both ways.


----------



## Rei86

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> But don't you think that the DVI giving the load to GPU1 would give higher temps, as you said, even with a waterblock?
> I took the GTX 690 cooler apart, and everything is even on both GPUs, and the fan is in the middle pushing air both ways.


Yes, it would, but your cooling is far more efficient than the stock air cooler the 690 shipped with.


----------



## Stray_Bullet

Hey Rei, it's Panzer from EVGA. You RMA your card yet? Any problems with the procedure?


----------



## Rei86

Quote:


> Originally Posted by *Stray_Bullet*
> 
> Hey Rei, it's Panzer from EVGA. You RMA your card yet? Any problems with the procedure?


EVGA's RMA/Warehouse is closed till the 3rd of January so I'm shipping my card tomorrow.

Never had any issue with EVGA's RMA service or their customer service so I'm not worried.


----------



## fatlardo

Quote:


> Originally Posted by *Rei86*
> 
> EVGA's RMA/Warehouse is closed till the 3rd of July so I'm shipping my card tomorrow.
> Never had any issue with EVGA's RMA service or their customer service so I'm not worried.


You will be dead by then... July!


----------



## egotrippin

Quote:


> Originally Posted by *Arm3nian*
> 
> Except Far Cry 3; I refuse to play that game. I can't get 60fps with 8x MSAA. I know that level of AA is tough, especially at 1440p, but I think a driver update should do it.


I'm considering two 680 4GB cards for that reason, especially with upcoming games like Crysis 3, which I hope will make full use of my equipment.
Quote:


> Originally Posted by *xHoLy*
> 
> hey
> the latest games especially far cry 3 really heats up my gtx690, both gpus around 90°C and almost 100% gpu usage on both. gpu1 is usually 2-3°C hotter for some reason, the max temperature i have seen so far was 94°C on gpu1
> i am curious to know if its bad, and what i could posibbly do to lower it a little?


That's a bit on the warm side but within spec. If the fans aren't at 100%, you can manually control them using EVGA Precision X or one of the other similar programs. Or you could watercool your 690, which is what I do; my temps usually stay in the low-40°C range under 100% load running [email protected]


----------



## Rei86

Quote:


> Originally Posted by *fatlardo*
> 
> You will die then July!


I was like, "WTH is this guy talking about?"

But oops, I meant to say January 3rd.


----------



## xHoLy

Quote:


> Originally Posted by *egotrippin*
> 
> That's a bit on the warm side but within spec. If the fans aren't at 100%, you can manually control them using EVGA Precision X or one of the other similar programs. Or you could watercool your 690, which is what I do; my temps usually stay in the low-40°C range under 100% load running [email protected]


I tried to see what max temperature my GPUs rise to, so I played Far Cry 3 for hours at 1920x1080, everything on ultra, even 8x MSAA, and it just hammers my GTX 690: GPU1 was constantly at 96°C with GPU2 at 94°C.
I even managed to get it to 97°C/95°C for shorter periods.

I checked nvidia.com and it states the max is 98°C for this card, so why doesn't it shut down or something to prevent damage? 97°C is pretty close, lol.

Are there any risks to running the fans at 100%?


----------



## NotReadyYet

Quote:


> Originally Posted by *Rei86*
> 
> EVGA's RMA/Warehouse is *closed till the 3rd of July* so I'm shipping my card tomorrow.
> Never had any issue with EVGA's RMA service or their customer service so I'm not worried.


They are going to be closed for over 7 months?!


----------



## Kyouki

LOL, no, he meant to say 01-03-2013.

I HOPE!!!


----------



## Rei86

Quote:


> Originally Posted by *NotReadyYet*
> 
> They are going to be closed for over 7 months?!


Quote:


> Originally Posted by *Kyouki*
> 
> LOL, no, he meant to say 01-03-2013.
> I HOPE!!!



Yup


----------



## Alex132

690 and SSDs just arrived


----------



## Arizonian

Quote:


> Originally Posted by *Alex132*
> 
> 690 and SSDs just arrived


At the same time? You might want to prepare yourself when you spark them up together, because your head might explode from the performance you're about to witness. Great upgrade......congrats!


----------



## Alex132

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> 690 and SSDs just arrived
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At the same time? You might want to prepare yourself when you spark them up together, because your head might explode from the performance you're about to witness. Great upgrade......congrats!

Sadly 20 days away from my rig D:
Gonna have to wait, 20, long, days


----------



## BigJoeGrizzly

Curious to know if a Seasonic X-650 could handle this card along with a 3570K OC'ed to ~4.4 GHz?


----------



## y2kcamaross

Quote:


> Originally Posted by *BigJoeGrizzly*
> 
> Curious to know if a Seasonic X-650 could handle this card along with a 3570K OC'ed to ~4.4 GHz?


My old OCZ ZT 750W handled my two 680s and my [email protected], along with 6 hard drives and numerous fans, just fine. The most I ever saw pulled from the wall with my Kill A Watt was 647W; at roughly 82 percent efficiency that's barely 530W from the power supply. Your 690 and 3570K both eat less power as well.

TL;DR: you'll be more than fine.
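That wall-to-DC arithmetic is easy to sanity-check yourself. A rough sketch, using the 647W wall reading and ~82% efficiency quoted above (the 750W capacity is the unit's rating; real efficiency varies with load):

```python
# Rough PSU load estimate from a wall-meter reading (figures from the post above).
def dc_load_watts(wall_watts: float, efficiency: float) -> float:
    """DC power delivered = AC power drawn at the wall x PSU efficiency."""
    return wall_watts * efficiency

load = dc_load_watts(647, 0.82)   # roughly 530 W of actual DC load
headroom = 750 - load             # what's left on a 750 W unit
print(f"load {load:.0f} W, headroom {headroom:.0f} W")
```

Same math works for any PSU: if the number coming out of the wall times efficiency is well under the unit's rating, you're fine.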


----------



## ceteris

Quote:


> Originally Posted by *BigJoeGrizzly*
> 
> Curious to know if a Seasonic X-650 could handle this card along with a 3570K OC'ed to ~4.4 GHz?


It can, but I would turn off the hybrid feature on your SeaSonic.


----------



## BigJoeGrizzly

Quote:


> Originally Posted by *y2kcamaross*
> 
> My old OCZ ZT 750W handled my two 680s and my [email protected], along with 6 hard drives and numerous fans, just fine. The most I ever saw pulled from the wall with my Kill A Watt was 647W; at roughly 82 percent efficiency that's barely 530W from the power supply. Your 690 and 3570K both eat less power as well.
> TL;DR: you'll be more than fine.


You know what you just did? You just pumped me up!


----------



## pilla99

Quote:


> Originally Posted by *BigJoeGrizzly*
> 
> Curious to know if a Seasonic X-650 could handle this card along with a 3570K OC'ed to ~4.4 GHz?


You can probably run two 690's on that.. http://www.overclock.net/t/1290091/gtx-690-true-power-measurement


----------



## jcde7ago

Quote:


> Originally Posted by *pilla99*
> 
> You can probably run two 690's on that.. http://www.overclock.net/t/1290091/gtx-690-true-power-measurement


While you probably can, it is neither worth the strain on that PSU, nor should one ever risk $2K worth of GPUs on a PSU with very little headroom.

1 690 should be fine, even with an OC on his CPU, but there is no way in heck I would ever risk running 2 on anything less than an 850w PSU, and even that's pushing it. Even at 80% or 90% efficiency, no one should be taxing their PSUs to the limit. If you're gonna drop serious cash on GPUs, spend a little bit extra for a PSU with ample headroom.


----------



## tonyjones

man how do you guys afford this card, do you guys all make 6 figures a year or some ****?


----------



## BigJoeGrizzly

How's the microstuttering on this guys? I'll be using a single monitor, and the SLI gtx 580's I had weren't TOO bad with microstutter, it didn't bother me that much.


----------



## pilla99

Quote:


> Originally Posted by *tonyjones*
> 
> man how do you guys afford this card, do you guys all make 6 figures a year or some ****?


Lol, broke as #### junior in college that works all summer and Christmas break (and part time the rest of the year) pouring every dime he has into this or my 25 year old car.


----------



## Rei86

Quote:


> Originally Posted by *tonyjones*
> 
> man how do you guys afford this card, do you guys all make 6 figures a year or some ****?


I have a job, and these things are not as expensive as people think. Though I wish I'd gotten more for my money.
Quote:


> Originally Posted by *BigJoeGrizzly*
> 
> Curious to know if a Seasonic X-650 could handle this card along with a 3570K OC'ed to ~4.4 GHz?


The X-650 supplies 54A on its +12V rail, and the 690 needs 38A.
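For anyone repeating that check on their own unit, the amp ratings convert to watts at 12V. A simplified sketch using the 54A and 38A figures above (it treats the CPU, drives and fans as sharing the same rail, which is roughly true on modern single-rail units):

```python
# +12V rail check: amps -> watts at 12 V, using the figures quoted above.
RAIL_VOLTS = 12.0
supply_watts = 54 * RAIL_VOLTS    # X-650's rated +12V capacity: 648 W
card_watts = 38 * RAIL_VOLTS      # GTX 690 worst case: 456 W
print(supply_watts - card_watts)  # -> 192.0 (watts left for CPU and the rest)
```

An overclocked 3570K draws well under that remainder, which is why the X-650 passes this check.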


----------



## Qu1ckset

Quote:


> Originally Posted by *tonyjones*
> 
> man how do you guys afford this card, do you guys all make 6 figures a year or some ****?


Working and saving and slowly upgrading one part at a time. I didn't build my sig rig in one shot; it took about a year and a bit, and I'm always upgrading hahaha. Sold my 2500K and Z68 ROG Gene-Z mobo and am now upgrading to the 3930K and X79 ROG Gene; should get the mobo tomorrow.


----------



## iARDAs

Quote:


> Originally Posted by *Qu1ckset*
> 
> Working and saving and slowly upgrading one part at a time. I didn't build my sig rig in one shot; it took about a year and a bit, and I'm always upgrading hahaha. Sold my 2500K and Z68 ROG Gene-Z mobo and am now upgrading to the 3930K and X79 ROG Gene; should get the mobo tomorrow.


I bought my sig rig over a few months, and I swear I will never follow this route again.

1-2 components per year from now on

Though in 1 year I owned

MSI Twin Frozr III 570

Zotac 590

Gigabyte 670 Windforce

Zotac 670 4GB

and now Powercolor 7970 Vortex II.

Lol, talk about a waste of money.

MUST RESIST THE URGE TO UPGRADE.

OH also cases in 1 year

Haf 922 Red

Aerocool Xpredator Evil Black

Corsair 800D

Corsair 500R


----------



## Alex132

Well, any chance to get me on the OP list?


----------



## Buzzkill

Quote:


> Originally Posted by *BigJoeGrizzly*
> 
> Curious to know if a Seasonic X-650 could handle this card along with a 3570K OC'ed to ~4.4 GHz?




Two 690s and a 3770K at 4.6, idle power draw. I upgraded my HX1000 to an AX1200i, and I have not seen my system over 900 watts, if the Corsair Link software is correct.


----------



## ceteris

Quote:


> Originally Posted by *tonyjones*
> 
> man how do you guys afford this card, do you guys all make 6 figures a year or some ****?


You just don't think about it. Before and after you buy the hardware lol. God knows what I could've gotten instead.


----------



## PinzaC55

Quote:


> Originally Posted by *Qu1ckset*
> 
> Working and saving and slowly upgrading one part at a time. I didn't build my sig rig in one shot; it took about a year and a bit, and I'm always upgrading hahaha. Sold my 2500K and Z68 ROG Gene-Z mobo and am now upgrading to the 3930K and X79 ROG Gene; should get the mobo tomorrow.


Always the best way. I bought an i7 2600K a year and a half ago for £200 then sold it for £155 a few weeks ago. So it cost me about 70 pence a week!


----------



## mpatel93

Hey guys, after having the Sapphire 7970 OC I don't want it anymore, as I need a minimum of 60fps at 2560x1440 with everything on ultra (16x and 8x etc.). Which card should I get, and which is best for OC: Asus? EVGA? MSI?

Thanks


----------



## MaxxOmega

Quote:


> Originally Posted by *tonyjones*
> 
> man how do you guys afford this card, do you guys all make 6 figures a year or some ****?


I make pretty good bucks. But in June I turned 55 and some of my investments matured. I cashed some in and spent $14K on a whack of new hardware. Built 3 new gaming rigs, each with a 30-inch Dell monitor at 2560x1600. EVE Online looks incredible...









Also got 2 ASUS ROG Gaming Laptops, one for the wife who luckily for me is a big time gamer too...









I'm looking at putting three 30-inch monitors on one rig, but I'll probably need more than the 2 x 580s I'm running for that. I could have bought 690s at the time, but they were hard to find then, and when you buy 6 x 580s all at once it's like $3K for video cards alone.


----------



## tonyjones

I'm going to sell my 5870 and 5970 radeon and save up to get this GTX 690!


----------



## jcde7ago

Quote:


> Originally Posted by *tonyjones*
> 
> man how do you guys afford this card, do you guys all make 6 figures a year or some ****?


I don't make six figures, but I'm fortunate enough to make enough money to live rather comfortably....studied hard in college, and landed a great job pretty much right when I got out.
Quote:


> Originally Posted by *Alex132*
> 
> Well, any chance to get me on the OP list?


Did you already post a pic as proof in the thread that I or Arizonian missed earlier? If not, just post it now and I'll get you added ASAP.


----------



## Alex132

Quote:


> Originally Posted by *jcde7ago*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tonyjones*
> 
> man how do you guys afford this card, do you guys all make 6 figures a year or some ****?
> 
> 
> 
> I don't make six-figures, but am fortunate enough to make enough money to live rather comfortably....studied hard in college, and landed a great job pretty much right when I got out.
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Well, any chance to get me on the OP list?
> 
> 
> Did you already post a pic as proof in the thread that I or Arizonian missed earlier? If not, just post it now and I'll get you added ASAP.

Quote:


> Originally Posted by *Alex132*
> 
> 690 and SSDs just arrived


----------



## jcde7ago

Quote:


> Originally Posted by *Alex132*


Awesome...added!


----------



## Alex132

Quote:


> Originally Posted by *jcde7ago*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Awesome...added!

Thanks ^^!


----------



## AllGamer

Just dropping by to say Hi!









pics coming in a bit


----------



## jcde7ago

Quote:


> Originally Posted by *mpatel93*
> 
> Hey guys, after having the Sapphire 7970 OC I don't want it anymore, as I need a minimum of 60fps at 2560x1440 with everything on ultra (16x and 8x etc.). Which card should I get, and which is best for OC: Asus? EVGA? MSI?
> Thanks


They should all be Nvidia reference cards, so any brand will work.


----------



## jcde7ago

Quote:


> Originally Posted by *AllGamer*
> 
> Just dropping by to say Hi!
> 
> 
> 
> 
> 
> 
> 
> 
> pics coming in a bit *snipped pics*


I just added you...and great setup! But dat cable management...lol.


----------



## egotrippin

Quote:


> Originally Posted by *xHoLy*
> 
> I tried to see what max temperature my GPUs rise to, so I played Far Cry 3 for hours at 1920x1080, everything on ultra, even 8x MSAA, and it just hammers my GTX 690: GPU1 was constantly at 96°C with GPU2 at 94°C. I even managed to get it to 97°C/95°C for shorter periods.
> I checked nvidia.com and it states the max is 98°C for this card, so why doesn't it shut down or something to prevent damage? 97°C is pretty close, lol.
> Are there any risks to running the fans at 100%?


If you have a warranty, there's no risk at all. They are built to be played, so game on! If you're worried, you can always watercool your card, and you'd probably be able to overclock it +125MHz or more. I'm guessing that at those temps there's probably some throttling going on, which means there's performance still left on the table.


----------



## PinzaC55

Just wondering... I was thinking about getting the EK backplate for my GTX 690, but I found this http://www.watercoolinguk.co.uk/p/Watercool-HEATKILLER-®-GPU-GTX-690-Backplate_33429.html after watching a video on watercooling a 690 on YouTube. I dig the chrome look, and it is reasonably priced. Trouble is, I am a little pushed for space between the 690 and the northbridge, so I was wondering if anyone here has one installed, and how far above the 690 does it rise? It would only be for aesthetic purposes.


----------



## Kaapstad

Quote:


> Originally Posted by *xHoLy*
> 
> I tried to see what max temperature my GPUs rise to, so I played Far Cry 3 for hours at 1920x1080, everything on ultra, even 8x MSAA, and it just hammers my GTX 690: GPU1 was constantly at 96°C with GPU2 at 94°C. I even managed to get it to 97°C/95°C for shorter periods.
> I checked nvidia.com and it states the max is 98°C for this card, so why doesn't it shut down or something to prevent damage? 97°C is pretty close, lol.
> Are there any risks to running the fans at 100%?


I think you should look at your system as a whole. I have never seen temps that high; the highest I ever managed was 91°C on one GPU overclocked, and my setup has 2 x 690s and a power-eating 3960X that sometimes gets clocked to 5GHz.

Have a look at the airflow and cable management, and consider adding more fans.

Also, as someone above said, look at the fan profile on your 690, and check that the cooler is not clogged with dust.


----------



## PinzaC55

Quote:


> Originally Posted by *Kaapstad*
> 
> I think you should look at your system as a whole. I have never seen temps that high; the highest I ever managed was 91°C on one GPU overclocked, and my setup has 2 x 690s and a power-eating 3960X that sometimes gets clocked to 5GHz.
> Have a look at the airflow and cable management, and consider adding more fans.
> Also, as someone above said, look at the fan profile on your 690, and check that the cooler is not clogged with dust.


Far Cry 3 on max settings only gets my GTX 690 up to 63 degrees with stock cooling.


----------



## ceteris

Quote:


> Originally Posted by *PinzaC55*
> 
> Just wondering...I was thinking about getting the EK Backplate for my GTX 690 but I found this http://www.watercoolinguk.co.uk/p/Watercool-HEATKILLER-®-GPU-GTX-690-Backplate_33429.html after watching a video on watercooling a 690 on Youtube and I dig the chrome look, plus it is reasonably priced. Trouble is I am a little pushed for space between the 690 and the northbridge so I was wondering if anyone here has one installed and how far above the 690 it rises? It would only be for aesthetic purposes.


I didn't get one because I saw a build log where a couple of the screws interfered with the bottom RAM latch on the RIVE board when using the first PCIe x16 slot. I was too lazy to figure out how and where to get replacement screws to take their place, though, so I stuck with the EVGA one, since I already had extra screws that thread into the Heatkiller 690 block.

I suppose it would work for you if you have a different board, or if you use a different PCIe slot.


----------



## RandomHer0

Heard a couple of people say that the EK waterblock isn't great at all. Can anyone confirm/deny this? Perhaps also give me a reason why if it is true? Thanks


----------



## UNOE

Quote:


> Originally Posted by *RandomHer0*
> 
> Heard a couple of people say that the EK waterblock isn't great at all. Can anyone confirm/deny this? Perhaps also give me a reason why if it is true? Thanks


I would also like to know what the best water blocks are for the 690.


----------



## Alex132

I heard the XSPC one was a good waterblock


----------



## Buzzkill

I have the Heatkiller and it's a very nice block:
HEATKILLER® GPU-X³ GTX 690 "Hole Edition" Ni-Bl


Aquacomputer AquagraFX GTX 690 G1/4


----------



## ceteris

Just pick any block based on whatever look appeals to your taste. Given enough headroom, your block shouldn't see temps anywhere near 40 degrees on load.

Your only problem is the silicon lottery. I keep my GPU at +200 24/7, but my memory won't stomach much past stock.


----------



## UNOE

http://www.frozencpu.com/products/16083/ex-blc-1115/Koolance_VID-NX690_GeForce_VGA_Liquid_Cooling_Block_No_Fittings.html#blank


----------



## Buzzkill

Koolance VID-NX690 GeForce VGA Liquid Cooling Block - 2.25 pounds (*680g*)

Aquacomputer AquagraFX GTX 690 G1/4 - Over (*1000g+*) could not find exact weight

HEATKILLER® GPU-X³ GTX 690 "Hole Edition" Reference Design Full Coverage Water Block - (*1200g*)

XSPC Razor nVidia GTX 690 Full Coverage VGA Block - 1.35kg (*1350g*)


----------



## egotrippin

I've been happy with my Koolance waterblock. I love the look of the Heatkiller but the spacing of the ports on the Koolance block looks perfect in my case.


----------



## ceteris

Quote:


> Originally Posted by *egotrippin*
> 
> I've been happy with my Koolance waterblock. I love the look of the Heatkiller but the spacing of the ports on the Koolance block looks perfect in my case.


Yeah I couldn't see you using anything else there. That's like perfect


----------



## RandomHer0

Quote:


> Originally Posted by *ceteris*
> 
> Just pick any block based on whatever look appeals to your taste. Given enough headroom, your block shouldn't see temps anywhere near 40 degrees on load.
> Your only problem is the silicon lottery. I keep my GPU at +200 24/7, but my memory won't stomach much past stock.


I forgot to specify that it was the nickel EK block that a couple of people said to "avoid at all costs." They didn't think it necessary to explain why, however. I prefer the aesthetics of the EK block over the others, with the XSPC Razor close behind (the only issue for me is the lack of a backplate).


----------



## egotrippin

Quote:


> Originally Posted by *RandomHer0*
> 
> I forgot to specify that it was the nickel EK block that a couple of people said to "avoid at all costs." They didn't think it necessary to explain why, however. I prefer the aesthetics of the EK block over the others, with the XSPC Razor close behind (the only issue for me is the lack of a backplate).


I think EK had some issues with galvanic corrosion on their nickel-plated blocks, but that was a year ago, before the GTX 690, so I imagine they've fixed it.


----------



## ceteris

Quote:


> Originally Posted by *RandomHer0*
> 
> I forgot to specify that it was the nickel EK block that a couple of people said to "avoid at all costs." They didn't think it necessary to explain why, however. I prefer the aesthetics of the EK block over the others, with the XSPC Razor close behind (the only issue for me is the lack of a backplate).


Yeah, that was the flaking-nickel fiasco back then. I'm sort of neutral on them myself, but I don't like the "CSQ" design they changed to. If you like circles, then go for it, I guess.


----------



## PinzaC55

Quote:


> Originally Posted by *ceteris*
> 
> I didn't get one because I saw a build log where a couple of the screws interfered with the bottom RAM latch on the RIVE board when using the first PCIe x16 slot. I was too lazy to figure out how and where to get replacement screws to take their place, though, so I stuck with the EVGA one, since I already had extra screws that thread into the Heatkiller 690 block.
> I suppose it would work for you if you have a different board, or if you use a different PCIe slot.


Thanks for that. On my mobo (MSI Big Bang XPower II) I seem to be OK for the RAM, but the northbridge heatsink only has about 7 mm of clearance to the GTX 690, which is in slot 1. I hadn't considered the screws, which in DazMode's video do look rather large. I guess I will forget the Heatkiller.


----------



## RandomHer0

Quote:


> Originally Posted by *egotrippin*
> 
> I think EK had some issues with galvanic corrosion on their nickle plated blocks but that was a year ago before the GTX 690 so I imagine they've fixed it.


Watched a couple of videos, and it seems like they looked into and addressed that issue.

Quote:


> Originally Posted by *ceteris*
> 
> Yeah that was the flaking nickel fiasco back then. I'm sort of neutral about them myself, but I don't like the "CSQ" design they changed to. If you like circles, then go for it I guess


Well, I'm going to have the Supremacy CPU block as well (thinking plexi atm), so I believe they will go nicely together.


----------



## rcfc89

Just received mine.....


----------



## jcde7ago

Quote:


> Originally Posted by *rcfc89*
> 
> Just recieved mine.....


Grats....added!


----------



## PinzaC55

Quote:


> Originally Posted by *rcfc89*
> 
> Just recieved mine.....


Nice way to see out 2012! Welcome to the Club!


----------



## Arizonian

Quote:


> Originally Posted by *PinzaC55*
> 
> Nice way to see out 2012! Welcome to the Club!


rcfc89 makes the 101st member with the 110th card in the club! Congrats on the year-end purchase.









:whee: Happy New Year! :whee:


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> rcfc89 makes 101st member with 110th card to the club! Congrats on end year purchase.
> 
> 
> 
> 
> 
> 
> 
> 
> :whee: Happy New Year! :whee:


I'm sure Nvidia feels great knowing that there's $110,000 worth of GeForce cards in this thread alone....
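(For the curious, that figure checks out as a back-of-envelope estimate, assuming all 110 cards in the club were bought at the $999 MSRP; the rounding is mine:)

```python
# Back-of-envelope: club value assuming all 110 cards at the $999 MSRP.
cards = 110
msrp = 999
total = cards * msrp
print(f"${total:,}")  # $109,890 -- roughly $110,000
```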


----------



## Alex132

Quote:


> Originally Posted by *jcde7ago*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Arizonian*
> 
> rcfc89 makes 101st member with 110th card to the club! Congrats on end year purchase.
> 
> 
> 
> 
> 
> 
> 
> 
> :whee: Happy New Year! :whee:
> 
> 
> 
> I'm sure Nvidia feels great knowing that there's $110,000 worth of GeForce cards in this thread alone....

Holy crap, valuable club.
Wow.


----------



## PinzaC55

It would be interesting to know how many 690s Nvidia has actually sold.


----------



## juanP

It would also be interesting to know how many people still have their 690s from the main list on the 1st page.


----------



## Rei86

Quote:


> Originally Posted by *juanP*
> 
> it will also be interesting to know how many still have their 690's from the main list on 1st page


Not on the list, but I've had a 690 since August of this year; however, I'll be unloading my 690 ASAP.


----------



## UNOE

Quote:


> Originally Posted by *Rei86*
> 
> Not on the list but I've had a 690 since August of this year, however I'll be unloading my 690 asap.


Why, might I ask? I'm thinking of trading for a 690. How do you guys like the GPU, and how are the drivers?


----------



## PhantomTaco

Hey guys, got a quick question for ya. I'm planning to switch back to stock air cooling (my liquid-cooled setup has proven to be a hassle in transportation, and I move around a lot), and was wondering if there is a guide or general instructions on remounting the stock fan shroud for the 690? Do I need to buy thermal pads for it or anything? Thanks


----------



## MrTOOSHORT

Quote:


> Originally Posted by *PhantomTaco*
> 
> Hey guys got a quick question for ya. I'm planning to switch back to stock air cooling (my liquid cooled setup has proven to be a hassle in transportation and i move around a lot), and was wondering if there was a guide or general instructions on remounting the stock fan shroud for the 690? Do I need to buy thermal pads for it or anything? Thanks


Use your waterblock pads or the old air cooler pads, and TIM for the GPUs and PLX chip.


----------



## PhantomTaco

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Use your waterblock pads or the old aircooler pads. And TIM for the gpus and PLX chip.


Thanks for the quick reply. But let's say I wanted to get new thermal pads. What thickness would they need to be? I know, for example, that the EK block uses 0.5mm and 1mm pads. I don't think that's the same for the stock shroud, though.


----------



## MrTOOSHORT

Memory chips need 0.5mm pads.

That's all you need.

Edit: not sure about the VRMs though. I did just have my waterblock off recently and can't recall.


----------



## Rei86

Quote:


> Originally Posted by *UNOE*
> 
> Why might I ask ? I'm thinking of trading for a 690 ? Had do you guys like the GPU, how are the drivers ?


Picked up two Classifieds.

Nothing is wrong with it but the VRAM. For such a premium product, I wish Nvidia would've at least gone with 3GB per GPU.
Driver-wise, it seems to be different for each individual. For me it feels like with every driver update they're slowly killing the 600 series: not as optimized as the previous version and on a steady decline in performance, but that might just be in my head.


----------



## PhantomTaco

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Memory chips need 0.5mm pads.
> That's all you need.
> Edit. Not sure about the vrms though. I did just have my waterblock off recently and can't recall.


Thank you sir, REP. One last question for you: with replacing the thermal paste, any recommendations on what is best to use?


----------



## MrTOOSHORT

I just looked at my GTX 690 shroud I had in its box on my shelf. The pads look 1mm, and the VRMs had thick paste, no pads. I'd imagine you'd use 0.5mm pads for those though.

For paste, I use MX-4.


----------



## Rei86

Quote:


> Originally Posted by *PhantomTaco*
> 
> Thank you sir, REP. One last question for you, with replacing the thermal paste, any reccomendations on what is best to use?


MX-4
IC Diamond
PK-2
PK-3
Quote:


> Originally Posted by *MrTOOSHORT*
> 
> I just looked at my gtx 690 shroud I had in it's box on my shelf. The pads look 1mm and the vrms had thick paste, no pads. I'd imagine you'd use 0.5mm pads for those though.
> For paste, I use MX4


Yup. The VRAM used 0.5mm pads and the VRMs had thick compound paste, but you can replace it with 0.5mm pads if you like.


----------



## MrTOOSHORT

Here's the pic of the shroud with the thermal pads and TIM smudge marks.



Rei86,

My EK waterblock uses 0.5mm pads for the memory chips, but the GTX 690 shroud uses 1mm.


----------



## Arizonian

Quote:


> Originally Posted by *Rei86*
> 
> Picked up two Classifieds.
> Nothing is wrong with it but the VRAM. For such a premium product wish nvidia would've at least gone with 3GB per GPU.
> Driver wise seems to be different for each individual. For me it feels like with every driver update they're slowly killing the 600 series. Not as optimized as the previous version and on a steady decline in performance, but that might just be in my head.


I'd have to agree about being VRAM-short if you're sporting the ability to run three monitors. I'd have paid more for a 6GB GTX 690 if I was doing multiples; it would have made Nvidia more money as well as made more sense. I chalk it up to this being Nvidia's first year supporting three monitors from a single GeForce, as a learning curve.

I myself can't complain, as for my setup I'm killing FPS on a single 120 Hz monitor with Nvidia LightBoost activated in 2D gaming, which is amazing.









If there's one thing I hope everyone takes away from the 690, it's that enthusiasts love a well-built card. This is the first card built at this caliber to date, IMO. I hope next year we see more shrouds and PCBs like this on other cards.

Having said all that, I'm handing the 690 down to my kids, also with a 120 Hz monitor, at the release of a new single card that can push the same FPS my 690 can, even if it's a side grade. I just like playing with new GPUs.









Whoops, edited to respond on performance. It's not in your head, it's just the drivers. I haven't seen performance degrade; I've seen AMD's performance surpass. With the latest 310 drivers I feel we're back on par.


----------



## Rei86

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Here's the pic of the shroud with the thermal pads and TIM smudge marks.
> 
> Rei86,
> My EK waterblock uses 0.5mm pads for the memory chips, but the gtx690 shroud uses 1mm.


You sure it's not 0.5mm for the VRAM? I remember asking on EVGA and was told I could use 0.5mm all around.
Quote:


> Originally Posted by *Arizonian*
> 
> I'd have to agree with being VRAM short if your sporting ability to carry three monitors. I'd have paid more for a 6GB GTX 690 if I was doing multiples and would have made more as well as made more sense for Nvidia. I chalk it up to this year being Nvidia's first year being able to support three monitors with one GeForce, as a learning curve.
> I myself can't complain as for my set up I'm killing FPS with single 120 Hz monitor with Nvidia Lightboost activated in 2D gaming which is amazing.
> 
> 
> 
> 
> 
> 
> 
> 
> If one thing I hope everyone takes away from the 690 is enthusiast love a well built card. This is the first card build at this caliber to date IMO. I hope next year we see more shrouds and pcb boards like this on other cards.
> Having said all that, I'm handing the 690 down to my kids also with 120 Hz monitor at the release of a new single card that can push the same FPS my 690 can even if it's a side grade. I just like playing with new GPU's.
> 
> 
> 
> 
> 
> 
> 
> 
> Whoops Edited to respond to performance. It's not in your head, it's just the drivers. I haven't seen performance degrade. I've seen AMD's performance surpass. With latest 310 drivers I feel we're back on par.


Yes, aesthetically I have never seen a piece of computer hardware I wanted more than the GTX 690. Sure, the 6990 looked beastly, but the 690 looks like a brute and yet elegant. But I have to say Nvidia has made compromises with the Keplers, and I hope this isn't the trend to follow.
Ugh, the 310s IMO are awful. My cards would randomly boost out of nowhere with them.

As for me, the Kepler refresh is all that the 700 series will be, so I won't be shelling out money again till we get the Maxwells.


----------



## MrTOOSHORT

Yep, I'm sure it's 1mm pads for the memory chips. For clarification, that's my pic of the shroud I put up, so I've seen the thickness first hand.


----------



## ceteris

Out of curiosity, who is considering upgrading to the next dual-GPU GeForce card? This was my first, and I have been relatively happy with it. However, it seems that the sweet spot in multi-GPU configurations is Tri-SLI, and it's not like I can slap a GTX 680 alongside this card to achieve it. Quad-SLI doesn't seem to get much support and improvement for a while; until then you are just wagging your E-Peen around on 3DMark benchmarks.

Not to mention that it took some preferred waterblocks ages to get to US stores...


----------



## Rei86

Sweet spot IMO looks like dual and Tri-SLI.

Quad-SLI is if you love the benchmark game, and I don't know anyone who actually likes to play the benchmark game for any longer than they actually have to.

As for me, I've already said it: the Kepler refresh won't be worth my dollars.


----------



## maximus56

By "just wagging your E-Peen around", I guess you meant this:
http://www.overclock.net/g/i/867156/3dm11452012936-jpg/;
http://www.overclock.net/g/i/867854/unigine-20120509-1723-gtx680slistock-jpg/

Lol. Happy New Year!


----------



## ceteris

Quote:


> Originally Posted by *maximus56*
> 
> By "just wagging your E-Peen around", I guess you meant this:
> http://www.overclock.net/g/i/867156/3dm11452012936-jpg/;
> http://www.overclock.net/g/i/867854/unigine-20120509-1723-gtx680slistock-jpg/
> Lol. Happy New Year!


Touché~ Those are just stock runs on air, though. Might have been impressive at launch, but others can easily beat those scores. Looking at your sig, you probably have more than a couple inches over mine


----------



## noob.deagle

Quote:


> Originally Posted by *Arizonian*
> 
> rcfc89 makes 101st member with 110th card to the club! Congrats on end year purchase.
> 
> 
> 
> 
> 
> 
> 
> 
> :whee: Happy New Year! :whee:


I'm sure there are more than that here in the thread; mine got missed and I'm still not on the list.









Can I get on the list?

ASUS GTX 690, paid way too much at $1300 (stupid Australian prices, and my crappy city of Adelaide had no 680s and I wanted something that day







)


----------



## maximus56

Quote:


> Originally Posted by *ceteris*
> 
> Touche~ Those are just stock runs on air though. Might have been impressive at launch but others can easily beat those scores. Looking at your sig, you probably have more than a couple inches over mine


Lol... I am glad that it's all in good spirits!








Hoping to find some time to start enjoying some of the games that I got recently (such as FC3, etc.). But, alas, I have my priorities in the wrong order: work first and play later


----------



## Arizonian

Quote:


> Originally Posted by *noob.deagle*
> 
> im sure there is more than that here in the thread mine got missed im still not on the list
> 
> 
> 
> 
> 
> 
> 
> 
> can i into list ?
> ASUS GTX690 paid way too much $1300 (stupid Australian prices and my crappy city of Adelaide had no 680s and i wanted something on that day
> 
> 
> 
> 
> 
> 
> 
> )
> 
> 


Wow..... so you're the one.







I have to apologize. When I was updating the list for jcde7ago during his absence, I thought I had missed someone.









A while back I posted a query to club members asking if I had missed someone; it was bugging me. I remember it was a busy week and I saw a new member's add post. By the time I got back to the club thread about a week later, your post was buried.

Now that jcde7ago is back, he'll be adding you to the list.


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> Wow.....so your the one.
> 
> 
> 
> 
> 
> 
> 
> I have to apologize. When I was updating the list for jcde7ago during his absence I knew I thought I had missed someone.
> 
> 
> 
> 
> 
> 
> 
> 
> A while back I had posted with a query to club members if I had missed someone it was bugging me. I remember it was a busy week and I saw a new member to add post. By the time I got back to the club thread about a week later your post was buried.
> Now that jcde7ago is back he'll be adding you to the list.


He's been added!


----------



## Seanimus

Hello, can I get on the list?
Still haven't finished the build (build log on OCN), but here's a shot:
http://www.flickr.com/photos/[email protected]/8343474439/


----------



## jcde7ago

Quote:


> Originally Posted by *Seanimus*
> 
> Hello can I get on list...
> still haven't finished build..(build log in ocn)..but here's a shot :
> http://www.flickr.com/photos/[email protected]/8343474439/


Congrats, very nice WIP you've got going on there....and of course, added!


----------



## Alex132

The 690 has 8+2 PWM power phases per GPU, right?


----------



## PinzaC55

I hope I don't get my knuckles rapped for posting this here, but somebody in the UK has put this PC up for sale on eBay with INSANE specs: http://www.ebay.co.uk/itm/Watercooled-Quad-690GTX-SLI-3960x-CPU-Gaming-Machine-Dell-UltraSharp-U2713H-/160949466938?pt=UK_Computing_DesktopPCs&hash=item257955f73a
I'm quite proud of my own rig, but honestly.....


----------



## Tech09

Hi









Here is a picture of my setup:


I'm waiting for my water cooling blocks for the cards. When I get them I'll hide the cables more and make it look better.









Hope I can get on the list.









Regards


----------



## pilla99

Can I get on this list? I don't think I ever actually was added.


----------



## jcde7ago

Quote:


> Originally Posted by *Tech09*
> 
> Hi
> 
> 
> 
> 
> 
> 
> 
> 
> Her is a picture of my setup:
> 
> I'm waiting for my water cooling blocks for the cards. Wen I get them I'll hide the cables more and make it look better
> 
> 
> 
> 
> 
> 
> 
> 
> Hope I can get on the list
> 
> 
> 
> 
> 
> 
> 
> 
> Regards


Quote:


> Originally Posted by *pilla99*
> 
> Can I get on this list? I don't think I ever actually was added.


Added both of you, but there were no specifics on the brand of card (even though they are all reference), so let me know what the actual brands are and I'll fix them (unless I got them right)! You both have nice setups!


----------



## pilla99

Quote:


> Originally Posted by *jcde7ago*
> 
> Added both of you, but no specifics on brand of card (even though they are all reference) so let me know what the actual brands are and i'll fix them (unless I got them right)! You both have nice setups!


Ah, sorry, it's an ASUS.


----------



## jcde7ago

Quote:


> Originally Posted by *pilla99*
> 
> Ah sorry it's an Asus


Fixed!


----------



## Tech09

Quote:


> Originally Posted by *jcde7ago*
> 
> Added both of you, but no specifics on brand of card (even though they are all reference) so let me know what the actual brands are and i'll fix them (unless I got them right)! You both have nice setups!


Mine are 2 Gigabyte GeForce GTX 690 4GB PhysX CUDA


----------



## jcde7ago

Quote:


> Originally Posted by *Tech09*
> 
> Mine are 2 Gigabyte GeForce GTX 690 4GB PhysX CUDA


Fixed!


----------



## Scorpion49

Hey guys, I just bought an EVGA 690 today. Super excited to get it in, choosing a water block right now. I think I'm going to go with the heatkiller Ni-Bl and backplate.


----------



## Arizonian

Quote:


> Originally Posted by *Scorpion49*
> 
> Hey guys, I just bought an EVGA 690 today. Super excited to get it in, choosing a water block right now. I think I'm going to go with the heatkiller Ni-Bl and backplate.


Hey congrats Scorpion49. Please post pics when you get that settled in.


----------



## Scorpion49

Quote:


> Originally Posted by *Arizonian*
> 
> Hey congrats Scorpion49. Please post pics when you get that settled in.


Will do. I think I just scored a Dell U3007 for $300 locally as well, gonna go pick it up tomorrow. Need to flex that 690 muscle!


----------



## PhantomTaco

Hey guys, got another question for you. Does anyone know the screw type used on the shroud for the 690?


----------



## Buzzkill

Quote:


> Originally Posted by *PhantomTaco*
> 
> Hey guys, got another question for you. Does anyone know the screw type used on the shroud for the 690?


T5 Torx


----------



## PhantomTaco

Quote:


> Originally Posted by *Buzzkill*
> 
> T5 Torx


Did some reading around online. Where can I find the length and threading of the screws? Or are they a standard size (I can't seem to find any online)? Thanks for the help.

EDIT: I'm also seeing T6 Torx mentioned in another thread, though that is for a backplate:
http://www.overclock.net/t/1291618/evga-690-backplate-stuck-screws


----------



## JoshMck

Thought I would add that I am a GTX 690 owner.







I have dual cards waiting to be watercooled.

http://img259.imageshack.us/img259/2083/img20130102183548.jpg Here is one card with the watercooling block on.

Edit: Cards are ASUS GTX 690s.


----------



## Buzzkill

Quote:


> Originally Posted by *PhantomTaco*
> 
> Did some reading around online, where can you get the length of the screw and the threading? Or is it a standard size (I can't seem to find any online)? Thanks for the help.
> EDIT: Also I'm seeing on another thread T6 Torx, though it is for a backplate:
> http://www.overclock.net/t/1291618/evga-690-backplate-stuck-screws


It's T5 or T6, not both.
I think they are M2.5 (metric) (it's been months). I put the stock EVGA backplate on with the Heatkiller block. I went to Ace Hardware and got button-head Allen and Phillips countersink screws to use under the backplate where the holes didn't match and the Heatkiller Allen screws would not fit. The length was 8mm with the block, and the Heatkiller backplate used 10mm with plastic washers. I used plastic washers, but thin ones where the backplate needed them. I looked up what hardware came with the block and backplates in the instruction manuals.

Watercool GPU Install.pdf 549k .pdf file


----------



## jcde7ago

Quote:


> Originally Posted by *JoshMck*
> 
> Thought I would add that I am a gtx 690 owner
> 
> 
> 
> 
> 
> 
> 
> I have dual cards waiting to be watercooled.
> http://img259.imageshack.us/img259/2083/img20130102183548.jpg Here is one card with the watercooling block on.
> Edit: Cards are asus gtx 690's


Added!


----------



## max883

Added some MX-4 to my GTX 690 on the mem and core.







A little better:







Before: 51°C. Now: 45°C.







But it's scary to mess with a $1200 card!!!

My i7-2700K at 5 GHz at just 1.35V, max temp: 62°C.







So I have a nice gaming rig.







RAM is now Corsair Platinum 2200 MHz.


----------



## Rei86

Quote:


> Originally Posted by *max883*
> 
> added som mx-4 to my GTX 690 on the mem and core
> 
> 
> 
> 
> 
> 
> 
> a litle better
> 
> 
> 
> 
> 
> 
> 
> before 51c. now 45.c
> 
> 
> 
> 
> 
> 
> 
> but its sceary to mess with a 1200 dolar card!!!
> My i7 2700K 5.GHz at just 1.35v max temp: 62c.
> 
> 
> 
> 
> 
> 
> 
> so i have a nice gaming rig
> 
> 
> 
> 
> 
> 
> 
> ram is now Corsair platinum 2200.mhz


Very jealous of the clean setup, man.


----------



## JoshMck

Quote:


> Originally Posted by *jcde7ago*
> 
> Added!


Thanks JC.









The computer looks awesome. Black theme is sick


----------



## Scorpion49

So, it's here! Also, my $300







Dell U3007 local steal.

Sorry for the crappy cellphone pics:


----------



## Arizonian

Quote:


> Originally Posted by *Scorpion49*
> 
> So, its here! Also, my $300
> 
> 
> 
> 
> 
> 
> 
> Dell U3007 local steal.
> Sorry for the crappy cellphone pics:


Nice buy on the Dell U3007, the color looks brilliant. Now to flex the 690's muscles and have some game time on it.


----------



## Scorpion49

Quote:


> Originally Posted by *Arizonian*
> 
> Nice buy on the Dell U3007, the color looks brilliant. Now to flex the 690's muscles and have some game time on it.


Yeah, I'm going to hold off, as I'm using an adapter for the second 8-pin (my 550W PSU only has one 6-pin and one 8-pin); gotta get a new one ASAP. I didn't expect the card to arrive so soon. I did a quick 3DMark run and it's pulling ~450W at the wall. Honestly, I'm more impressed with the monitor right now. Absolutely perfect. I hated the U2711 I had because of the sparkle coating they put on it. I also dislike 16:9 screens. This thing is so nice I can actually sit back in my seat when I'm using it and not strain to see what I'm doing.

http://www.3dmark.com/3dm11/5494293


----------



## AllGamer

Quote:


> Originally Posted by *pilla99*
> 
> Can I get on this list? I don't think I ever actually was added.


I can't find any of those Chinese off-brand LED RAM sticks anywhere around town in Toronto.

Even online, they seem to be available from the UK only.


----------



## MrTOOSHORT

$300 for that screen? Wow, I'm jelly. And congrats on your shiny new GTX 690!

Sounds like you have a nice collection of gpus.


----------



## Scorpion49

Quote:


> Originally Posted by *AllGamer*
> 
> I can't find any of those chinese off brand LED RAM sticks any where around town in Toronto
> even online, they seem to be only available from the UK only


Those are Avexir Core series; they are easily available on the US Newegg site. I can't tell you about the Canadian Newegg, though.


----------



## FtW 420

Newegg Canada used to carry Avexir memory, but looking now absolutely nothing Avexir. Not sure what happened there...


----------



## Alex132

Quote:


> Originally Posted by *FtW 420*
> 
> Newegg Canada used to carry Avexir memory, but looking now absolutely nothing Avexir. Not sure what happened there...


newegg.ca is eugh, not that good.

I'd rather just order from a US site and pay the slight tax (USPS, never UPS), or from a site like NCIX or TigerDirect.


----------



## PinzaC55

Quote:


> So, its here! Also, my $300 Dell U3007 local steal.


Ah the moment when you get the GTX 690 out of its box and revel in its sheer awesomeness eh?


----------



## max883

LOW score, scorpion49! Here is mine: http://3dmark.com/3dm11/3720840

Will try to overclock my GTX 690 and see if I can get 18,000P.


----------



## Scorpion49

Quote:


> Originally Posted by *max883*
> 
> LOW score scorpion49! Here is mine: http://3dmark.com/3dm11/3720840
> wil try to overcklock my gtx690 and se if i can get 18.000P


Seriously? That's a stock score with a non-HT CPU at 4.4.


----------



## Rei86

Quote:


> Originally Posted by *max883*
> 
> LOW score scorpion49! Here is mine: http://3dmark.com/3dm11/3720840
> wil try to overcklock my gtx690 and se if i can get 18.000P


Scorpion's score looks a little below average, but yours, on the other hand, looks abnormal for a stock GTX 690 with a 2600K.


----------



## jcde7ago

Quote:


> Originally Posted by *Scorpion49*
> 
> So, its here! Also, my $300
> 
> 
> 
> 
> 
> 
> 
> Dell U3007 local steal.
> Sorry for the crappy cellphone pics:


Awesome...grats on the 690 and the great steal with that monitor! And added...although i am going to assume it's an EVGA card for now, since you didn't specify or have a sig.


----------



## DarwinAce

I present to you... a cheap man's GTX 690 system...

It's finally running decently. At first, while I figured out an airflow pattern for my $15 USD case, my first attempts got the GTX burning at 95 degrees in a matter of just a couple of minutes..

Regular airflow, front to back, just didn't do the trick. I tried front intake with back and side exhaust with no luck; the video board kept going up to 86 and steadily rising until it started throttling down to keep it near 90... you might have noticed, the case isn't really built with cooling in mind. Sad, I know...

Finally, a front, back and slot exhaust, plus a side intake, did the trick; now it runs steadily at 67 and goes down to 62 on a cool night.

I'm also lucky to have one of those boards where the two GPUs differ by 1 or 2 degrees tops, and it has low noise when running... I had to RMA 2 before this one. The first one had a horrible high-pitched sound that could be heard through the fans and the loud music... it was like killing bugs with a magnifying glass... horrible. The second one had an unusual almost 10-degree difference between GPU 1 and GPU 2, and it also gave me checkered screens from time to time... but this one seems perfect.

I'm not into lights and stuff; the computer sits behind my TV, and having it glowing only distracts you when watching movies or playing video games, so I try to keep it subtle... that's why there's some electrical tape over that green glowing GTX logo, and also on some of the fans that have LEDs and stuff...

Specs:

Some random H61 board whose model I'm too ashamed to remember...
Another random i5 I found somewhere, at 3.2 GHz, I think
8GB of random Kingston ValueRAM
3x OCZ 30GB Vertex SSD drives (1st gen) (sadly, the crappy H61 board does not have RAID







)
1x 3TB WD Caviar Green
Corsair AX650 PSU
Zotac Nvidia GTX 690

This is what happens when you spend all the money on the video card and the TV and don't leave some for the rest of the system... you have to get creative, and steal some equipment from the office

... For the case, I had to hack off the hard drive bays to accommodate the video card; it was too big XD. It also had some kind of high-turbulence holes for the fan exhausts and intakes, and it sounded like a jumbo jet taking off... a hair dryer was too quiet to compare... so I had to cut clean holes to alleviate some of the noise, and to my surprise, it now sounds like a normal computer.

I would love to get a motherboard and CPU as my next step... but the case won't let me, so I think the next upgrade will be the case... or maybe a chair so I can actually play in front of the TV :\... lying on the floor isn't too comfortable at the moment, and the TV stand is too high for the floor... damn, now I'll have to get the chair first.

Congratulations on reading my book; I didn't notice how much I wrote... anyway, I'm glad to have gotten my new video card, and I hope I can learn stuff from all you guys around here.


----------



## rcfc89

^Congrats.........


----------



## Scorpion49

Quote:


> Originally Posted by *DarwinAce*
> 
> congratulations on reading my book, didn't noticed how much i wrote... anyways, i'm glad to have gotten my new video card, i hope i can learn stuff from all you guys around.


Wow, that's quite a build. Where are you located?


----------



## AllGamer

Quite the contrary...

If you can put a GTX 690 in any gaming rig, it automatically gets bumped to rich status.









Not every common folk can afford a GTX 690.









And the rest of the system is very decent for any game nowadays. Especially take note: games nowadays are more GPU-dependent than CPU-dependent. As long as your system is fast enough not to bottleneck the GPU, you could be running a milk can if you must, and it'll still be a decent system.
Quote:


> Originally Posted by *DarwinAce*
> 
> 
> I present to you... a cheap's man GTX 690 system...
> 
> Some Random H61 board that i'm too ashamed to remember the model...
> Another random i5 i found somewhere, at 3.2ghz, i think
> 8gb of Random Kingston Value Ram
> 3x OCZ 30Gb Vertex SSD drives (1st gens) (sadly, the crappy h61 board does not have raid
> 
> 
> 
> 
> 
> 
> 
> )
> 1x 3Tb WD Caviar Green
> Corsair AX650 PSU
> Zotac Nvidia GTX 690


----------



## Nyt Ryda

Anyone here getting a problem called the Red Screen of Death on their GTX 690s?
I'm thinking of getting an ASUS GTX 690 and want to check if this problem has been solved in the new drivers.


----------



## Rei86

Quote:


> Originally Posted by *Nyt Ryda*
> 
> Anyone here getting a problem called the Red Screen of Death on their GTX690s ?
> I'm thinking of getting an ASUS GTX690 and want to check if this problems has been solved in new drivers


Nah, we have a cold-boot issue now.


----------



## JoshMck

I got a red screen today, but it hasn't come back. I was making some extreme changes.

BTW, with GPU Tweak, what will changing the min voltage do for the 690? Aren't they voltage-locked?


----------



## pilla99

Quote:


> Originally Posted by *DarwinAce*
> 
> snip


This inspired me to try and construct the least expensive 690-toting beast possible.
"Budget" 690 components:

The 690 itself, $1000 give or take (in the US): http://www.newegg.com/Product/Product.aspx?Item=N82E16814130781
A tower (don't need anything bigger than a mid), $18: http://www.newegg.com/Product/Product.aspx?Item=N82E16811353006
HDD (don't need an SSD to rip through games), $20: http://www.newegg.com/Product/Product.aspx?Item=N82E16822144545
500W PSU, $30: http://www.newegg.com/Product/Product.aspx?Item=N82E16817159124

Before anyone jumps on me saying that a 500W 80 Plus certified PSU isn't enough for a 690: http://www.overclock.net/t/1290091/gtx-690-true-power-measurement

CPU that won't bottleneck (i5-2500), $219: http://www.newegg.com/Product/Product.aspx?Item=N82E16819115072
Motherboard with one PCI-E 2.0 x16 lane, $54: http://www.newegg.com/Product/Product.aspx?Item=N82E16813186225
RAM, 4GB 1333, $20: http://www.newegg.com/Product/Product.aspx?Item=N82E16820231341

Add in a cheap 1080p monitor (Daily Deal 24" ASUS), $129: http://www.newegg.com/Product/Product.aspx?Item=N82E16824009440&cm_sp=DailyDeal-_-24-009-440-_-Homepage

And a mouse/keyboard combo, $15: http://www.newegg.com/Product/Product.aspx?Item=N82E16823201050

You can build a gaming MONSTER for only: $1505
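For the skeptics, a quick sanity check that the itemized prices above really do come to $1505 (prices exactly as listed; the part labels are just my shorthand):

```python
# "Budget" GTX 690 build, itemized prices as listed above (USD).
parts = {
    "GTX 690": 1000,
    "mid tower case": 18,
    "hard drive": 20,
    "500W PSU": 30,
    "i5-2500": 219,
    "motherboard": 54,
    "4GB RAM": 20,
    '24" monitor': 129,
    "mouse/keyboard combo": 15,
}

total = sum(parts.values())
print(f"Total: ${total}")  # Total: $1505
```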


----------



## rcfc89

My case pic.


----------



## AllGamer

Did you notice the HDD is not SATA?!

Unless your motherboard still accepts ATA-133, you won't be able to use it.

Also, that will become a bottleneck for the GTX 690. Bad move.

Quote:


> Originally Posted by *pilla99*
> 
> This inspired me to try to construct the least expensive 690-toting beast possible.
> "Budget" 690 components:
> 
> The 690 itself, $1000 give or take (in the US): http://www.newegg.com/Product/Product.aspx?Item=N82E16814130781
> A tower; don't need anything bigger than a mid-tower, $18: http://www.newegg.com/Product/Product.aspx?Item=N82E16811353006
> HDD; don't need an SSD to rip through games, $20: http://www.newegg.com/Product/Product.aspx?Item=N82E16822144545
> 500W PSU, $30: http://www.newegg.com/Product/Product.aspx?Item=N82E16817159124
> 
> Before anyone jumps on me saying that a 500W 80 Plus certified unit isn't enough for a 690: http://www.overclock.net/t/1290091/gtx-690-true-power-measurement
> 
> CPU that won't bottleneck (i5-2500), $219: http://www.newegg.com/Product/Product.aspx?Item=N82E16819115072
> Motherboard with one x16 PCIe 2.0 slot, $54: http://www.newegg.com/Product/Product.aspx?Item=N82E16813186225
> RAM, 4GB DDR3-1333, $20: http://www.newegg.com/Product/Product.aspx?Item=N82E16820231341
> 
> Add in a cheap 1080p monitor, Daily Deal 24" Asus, $129: http://www.newegg.com/Product/Product.aspx?Item=N82E16824009440&cm_sp=DailyDeal-_-24-009-440-_-Homepage
> 
> And a mouse/keyboard combo, $15: http://www.newegg.com/Product/Product.aspx?Item=N82E16823201050
> 
> You can build a gaming MONSTER for only $1505.


----------



## Qu1ckset

Quote:


> Originally Posted by *rcfc89*
> 
> My case pic.


What is that on your 690!?


----------



## MasterVampire

Guys, I'm going to buy a GTX 690!

Can someone tell me which is the best brand to buy?

ASUS?
Gigabyte?
or EVGA?

And also, which is the best waterblock?

XSPC Razor GTX 690 Full Coverage Waterblock?
or
Heatkiller GPU-X³ GTX 690 Hole Edition Water Block Ni-Bl ?

Thanks


----------



## Rei86

Quote:


> Originally Posted by *MasterVampire*
> 
> Guys, I'm going to buy a GTX 690!
> 
> Can someone tell me which is the best brand to buy?
> 
> ASUS?
> Gigabyte?
> or EVGA?
> 
> And also, which is the best waterblock?
> 
> XSPC Razor GTX 690 Full Coverage Waterblock?
> or
> Heatkiller GPU-X³ GTX 690 Hole Edition Water Block Ni-Bl ?
> 
> Thanks


You do know that all 690s are reference designs, so it doesn't matter which one you get.

Also, if you live in the USA, the only official board partners that can sell a 690 are ASUS and EVGA. In that case I would get EVGA for the better, faster customer service. Not only that, ASUS is still listed at $1,049, while EVGA is no longer selling the Signature edition and only sells the normal 690 for $999.


----------



## MasterVampire

Ah ok then.

What about the waterblock?

XSPC Razor GTX 690 Full Coverage Waterblock?
or
Heatkiller GPU-X³ GTX 690 Hole Edition Water Block Ni-Bl ?


----------



## Rei86

Quote:


> Originally Posted by *MasterVampire*
> 
> Ah ok then.
> 
> What about the waterblock?
> 
> XSPC Razor GTX 690 Full Coverage Waterblock?
> or
> Heatkiller GPU-X³ GTX 690 Hole Edition Water Block Ni-Bl ?


I can't answer that, since most of the major review sites don't review waterblocks. Sorry.









However, from what owners report about their temperatures, blocks from the various brands are all within about 5~10°C of each other. So I guess it comes down to which one appeals to you aesthetically.

IMO the black Heatkiller Hole Edition is nice looking, but it isn't really a full-coverage block, and that kind of puts me off compared to the full-coverage designs. But again, that's just my opinion on the looks.


----------



## JoshMck

I bought the Heatkiller, and currently my 690s at full load have not broken 36°C.

Minimum of 20°C.







hope that helps!


----------



## MasterVampire

Actually, maybe I shouldn't buy a 690?

I mainly want it because my 590 is hitting its VRAM limit in Skyrim.
I play Skyrim at 1920x1080 with lots and lots of HD texture mods + ENB, and it stutters a bit outdoors when loading and looking around.

The 690 only has 2GB per GPU, where the 590 has 1.5GB per GPU, so it's not that much of an improvement.

Would it just be better for me to get a 4GB 680?
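For a rough sense of why heavy texture mods chew through VRAM, here's an illustrative sketch. The texture sizes and compression factors are assumptions for illustration, not Skyrim measurements:

```python
# Rough VRAM cost of textures (illustrative figures only).
# Real games store most textures DXT-compressed, which cuts this 4-8x,
# but texture mods often ship much larger or uncompressed assets.
def texture_mb(size_px, bytes_per_px=4, mipmaps=True):
    """Memory for one square RGBA texture, plus ~1/3 extra for its mipmap chain."""
    base = size_px * size_px * bytes_per_px
    total = base * 4 // 3 if mipmaps else base
    return total / 2**20

per_texture = texture_mb(2048)           # one uncompressed 2K RGBA texture
budget_gb = 2                            # per-GPU VRAM on a GTX 690
fits = int(budget_gb * 1024 / per_texture)

print(f"One 2K RGBA texture: {per_texture:.1f} MB")
print(f"~{fits} such textures fill {budget_gb} GB (before framebuffer, geometry, etc.)")
```

The point being: a few hundred modded 2K/4K textures plus the framebuffer closes in on 2GB fast, which is why the 4GB 680 question is worth asking.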


----------



## PinzaC55

Quote:


> Originally Posted by *MasterVampire*
> 
> Actually, maybe I shouldn't buy a 690?
> 
> I mainly want it because my 590 is hitting its VRAM limit in Skyrim.
> I play Skyrim at 1920x1080 with lots and lots of HD texture mods + ENB, and it stutters a bit outdoors when loading and looking around.
> 
> The 690 only has 2GB per GPU, where the 590 has 1.5GB per GPU, so it's not that much of an improvement.
> 
> Would it just be better for me to get a 4GB 680?







Note the title "Maxed Out".


----------



## ceteris

Was able to OC +200 core / +500 memory running 24/7 on a Heatkiller block. Delta-T is 10-11°C over idle temps.


----------



## rcfc89

Quote:


> Originally Posted by *Qu1ckset*
> 
> What is that on you 690!?


I basically buy all my hardware and hire a company to build and overclock my computers. It might seem stupid to some, but the extensive testing they do, combined with their quality of work, makes it worth every penny to me. The GTX 690 is quite heavy, so they put an aftermarket bracket on it to give it extra support.


----------



## zer0sum

Alright... I think I might be about to join this club, as I managed to grab an EVGA 690 off eBay for $700 shipped.









Now to wait and see if it arrives as described...


----------



## pilla99

Quote:


> Originally Posted by *rcfc89*
> 
> I basically buy all my hardware and hire a company to build and overclock my computers. It might seem stupid to some, but the extensive testing they do, combined with their quality of work, makes it worth every penny to me. The GTX 690 is quite heavy, so they put an aftermarket bracket on it to give it extra support.


How much do you pay them to do this? Just curious.
Quote:


> Originally Posted by *zer0sum*
> 
> Alright...I think I might be about to join this club as I managed to grab an evga 690 off ebay for $700 shipped
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now to wait and see if it arrives as described...


For $700, a 690? Was it described as being cut in half? Because that is far and away the best price I have ever heard of for one. I wouldn't go a dollar below $900 if I were selling mine. I just looked at eBay earlier and they are still grabbing $1k+.


----------



## rcfc89

Quote:


> Originally Posted by *pilla99*
> 
> How much do you pay them to do this? Just curious.
> For $700, a 690? Was it described as being cut in half? Because that is far and away the best price I have ever heard of for one. I wouldn't go a dollar below $900 if I were selling mine. I just looked at eBay earlier and they are still grabbing $1k+.


pm'd


----------



## Qu1ckset

I love my EVGA Hydro Copper waterblock. It's a true full-cover block, my GPUs don't go over 35°C when I'm stressing them, and the EVGA logo lights up! Lol


----------



## PinzaC55

Quote:


> Originally Posted by *pilla99*
> 
> How much do you pay them to do this? Just curious.
> For $700, a 690? Was it described as being cut in half? Because that is far and away the best price I have ever heard of for one. I wouldn't go a dollar below $900 if I were selling mine. I just looked at eBay earlier and they are still grabbing $1k+.


There was a 690 for sale in Florida on eBay a few weeks ago, about the same time I bought mine (UK), and it went for about £430, which I think is slightly less than $700. People are often reluctant to bid on high-end PCs and components on eBay, so if you have the balls you can get a bargain.


----------



## Nyt Ryda

What drivers are you 690 guys running ?


----------



## MrTOOSHORT

310.70.

With 310.90, the driver sometimes doesn't load when the computer is turned on or restarted.


----------



## Scorpion49

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> 310.70.
> 
> With 310.90, the driver sometimes doesn't load when the computer is turned on or restarted.


I'm running 310.90 with no issues.


----------



## max883

Same here, MrTOOSHORT.

With 310.90, the driver sometimes doesn't load when the computer is turned on or restarted.


----------



## Rei86

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> 310.70.
> 
> With 310.90, the driver sometimes doesn't load when the computer is turned on or restarted.


310.70 has a security hole in it...

On 310.90, owners with X58, Z77, Z75, P67, and X79 mobos in SLI setups are reporting a cold-boot issue where the drivers are not loaded; however, if you restart, they will reload.
So even if I were facing the cold-boot issue, I would stick with 310.90, man, or an earlier version of the drivers, NOT 310.70.


----------



## AllGamer

The latest 310.90 doubled or tripled my FPS in some games.

Games that used to get roughly 30 FPS now push 120+ FPS (eye candy set to HIGH).

That's MechWarrior Online and many other CryEngine games.


----------



## tsm106

Quote:


> Originally Posted by *Rei86*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MrTOOSHORT*
> 
> 310.70.
> 
> With 310.90, the driver sometimes doesn't load when the computer is turned on or restarted.
> 
> 
> 
> *310.70 has a security hole in it...*
> 
> On 310.90, owners with X58, Z77, Z75, P67, and X79 mobos in SLI setups are reporting a cold-boot issue where the drivers are not loaded; however, if you restart, they will reload.
> So even if I were facing the cold-boot issue, I would stick with 310.90, man, or an earlier version of the drivers, NOT 310.70.
Click to expand...

Is it just .70? Dammit, I have to check both my Kepler rigs. The 310.xx drivers have been annoying.


----------



## Kaapstad

I'm using 310.90 for games,

301.42 for benching 2 cards

306.23 for benching a single card


----------



## Rei86

Quote:


> Originally Posted by *tsm106*
> 
> Is it just .70? Dammit, I have to check both my Kepler rigs. The 310.xx drivers have been annoying.


For benchmarking purposes, each driver update has been a step backwards (especially in 3DMark 11). As for real games, it's been plus or minus a few FPS on that front.

But yes, it's only 310.70; roll to 310.90 or anything else but 310.70.


----------



## Stay Puft

The .90s have killed my core overclocking. I can get +85 core stable, but that's about it. Memory overclocking is still good at +400.


----------



## KaRLiToS

Quote:


> Originally Posted by *rcfc89*
> 
> I basically buy all my hardware and hire a company to build and overclock my computers. It might seem stupid to some but the amount of extensive testing they do added with their quality of work makes it worth every penny to me. The gtx690 is quite heavy so they basically put a aftermarket bracket on it to give it extra support.


I'm pretty sure some other members here and I are FAR better than those companies.

"If you want something done right, you have to do it yourself! "


----------



## pilla99

Quote:


> Originally Posted by *KaRLiToS*
> 
> I'm pretty sure me and some other members here are FAR better than those companies.
> 
> "If you want something done right, you have to do it yourself! "


You know, I'm going to take that quote under advisement. I was going to take my car in to get the engine replaced, but I think I'll do it myself because it needs to be done right. And that triple bypass next month? No worries, I'll handle it too.


----------



## Rei86

Quote:


> Originally Posted by *KaRLiToS*
> 
> I'm pretty sure me and some other members here are FAR better than those companies.
> 
> "If you want something done right, you have to do it yourself! "


hmmm

But that person would probably charge about as much as those companies and won't back the build with a warranty after they're done.


----------



## KaRLiToS

Quote:


> Originally Posted by *pilla99*
> 
> You know, I'm going to take that quote under advisement. I was going to take my car in to get the engine replaced, but I think I'll do it myself because it needs to be done right. And that triple bypass next month? No worries, I'll handle it too.


Did I say anything about cars? You can OC everything without any special tools; I don't get your point.

Is OCN a forum about building cars? Wrong forum, maybe.









But pilla99, would you rather overclock your system yourself, or leave it to a company that will pump the vcore and charge you for it? Where is the fun if you don't OC your *own* rig?


----------



## rcfc89

Quote:


> Originally Posted by *KaRLiToS*
> 
> Did I say anything about cars? You can OC everything without any special tools; I don't get your point.
> 
> Is OCN a forum about building cars? Wrong forum maybe.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But pilla99, would you rather overclock your system yourself, or leave it to a company that will pump the vcore and charge you for it? Where is the fun if you don't OC your *own* rig?


If I were working with 800 dollars' worth of hardware, I might give it a shot. But with over 3k worth of hardware, I'll put it in the hands of professionals who do it for a living.


----------



## KaRLiToS

Quote:


> Originally Posted by *rcfc89*
> 
> If I was working with 800 dollars worth of hardware I might give it a shot. But with over 3k worth of hardware I'll put it into the hands of professionals that do it for a living.










But you can be a professional.


----------



## maximus56

Quote:


> Originally Posted by *KaRLiToS*
> 
> 
> 
> 
> 
> 
> 
> 
> But you can be a professional.


A discussion about outsourcing overclocking, on the Overclock.net forum? Lol... and here I thought this was a DIY forum for the overclocking community. Now we can look forward to more OEMs of pre-built systems as the featured sponsors!


----------



## Rei86

Quote:


> Originally Posted by *rcfc89*
> 
> If I was working with 800 dollars worth of hardware I might give it a shot. But with over 3k worth of hardware I'll put it into the hands of professionals that do it for a living.


That's 3k in hardware?

Either way, building a PC is pretty easy, basically plug and play. The only issue many run into is cable management.


----------



## tsm106

Quote:


> Originally Posted by *Rei86*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rcfc89*
> 
> If I was working with 800 dollars worth of hardware I might give it a shot. But with over 3k worth of hardware I'll put it into the hands of professionals that do it for a living.
> 
> 
> 
> That's 3k in hardware?
> 
> Either way, building a PC is pretty easy, basically plug and play. The only issue many run into is cable management.
Click to expand...

My loop costs more than 3k and I do it myself. I don't think I could afford to pay someone to do it as well as I did it myself! Rofl.


----------



## ceteris

OC'ing your GPU is relatively easy these days compared to when I first started out 4 years back. Same procedure; just make sure you have the cooling (or crank the fan speed to 100%), read what others got on their cards, and tweak until you hit your spot.
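That "tweak until you hit your spot" procedure is basically a step-up search. A toy sketch of it below; the stability check and all the numbers are hypothetical stand-ins for whatever stress test you actually run, not any real tool's API:

```python
def find_max_stable_offset(is_stable, start=0, step=25, limit=400):
    """Walk the core-clock offset up in small steps until the card stops
    passing the stress test, then settle on the last passing offset.
    `is_stable` stands in for a real stress-test run (e.g. an hour of
    your benchmark of choice at that offset)."""
    offset = start
    while offset + step <= limit and is_stable(offset + step):
        offset += step
    return offset

# Hypothetical card that starts artifacting above a +150 MHz core offset:
result = find_max_stable_offset(lambda mhz: mhz <= 150)
print(f"Max stable core offset: +{result} MHz")
```

In practice you'd also back off a notch from whatever "barely passes" for a 24/7 margin.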


----------



## JoshMck

Quote:


> Originally Posted by *rcfc89*
> 
> If I was working with 800 dollars worth of hardware I might give it a shot. But with over 3k worth of hardware I'll put it into the hands of professionals that do it for a living.


My system cost a whole lot more than 3k







And I spend every chance I get overclocking every last bit of it.

Your choice, but you are on the wrong forum to justify outsourcing overclocking.

Edit: My GPUs alone cost nearly 3k, haha. With waterblocks, well above.


----------



## rcfc89

Quote:


> Originally Posted by *JoshMck*
> 
> My system cost a whole lot more than 3k
> 
> 
> 
> 
> 
> 
> 
> And I spend every chance I get overclocking every last bit of it.
> 
> Your choice, but you are on the wrong forum to justify outsourcing overclocking.
> 
> Edit: My GPUs alone cost nearly 3k, haha. With waterblocks, well above.


Good for you. I have the funds to have it done, so I do. My gaming rig is just one of my many toys, and the cheapest of my hobbies by far. I was never trying to justify anything.


----------



## FtW 420

Quote:


> Originally Posted by *rcfc89*
> 
> If I was working with 800 dollars worth of hardware I might give it a shot. But with over 3k worth of hardware I'll put it into the hands of professionals that do it for a living.


I'm just the opposite: not so bad letting some random person (a professional? like Geek Squad?) play with an $800 rig, but with expensive HW I'm doing it myself...

Just start slow & read up; it isn't very dangerous.


----------



## rcfc89

Quote:


> Originally Posted by *FtW 420*
> 
> I'm just the opposite: not so bad letting some random person (a professional? like Geek Squad?) play with an $800 rig, but with expensive HW I'm doing it myself...
> 
> Just start slow & read up; it isn't very dangerous.


I would not hand my computer over to a bunch of kids like "Geek Squad." Puget Systems is who built my PC. Very reputable company.


----------



## armando666

Quote:


> Originally Posted by *rcfc89*
> 
> Good for you. I have the funds to have it done so I do so. My gaming rig is just one of my many toys. And the cheapest of my hobbies by far. Was never trying to justify anything.


Your postings are a bit contradictory. On one hand, you seem to be implying that a system over 3k is out of your comfort zone, so you would rather have a professional assemble it; on the other hand, you are saying that money is no object, as you have the funds to get someone to build the system, and it is the cheapest of your hobbies.

Putting money aside, I would just say that you are missing out on The Pleasure of Finding Things Out, as Feynman famously put it.


----------



## JoshMck

Quote:


> Originally Posted by *rcfc89*
> 
> Good for you. I have the funds to have it done so I do so. My gaming rig is just one of my many toys. And the cheapest of my hobbies by far. Was never trying to justify anything.


I will give you a tip: this is my cheapest hobby....

But I would not trust someone else to do this, nor would I lose the fun of doing it myself. I may fry something, but I will just replace it.

I also have the funds to have it done, but I would never get it done. This forum is called OCN, not "buy a computer and get someone else to overclock it."









Quote:


> Originally Posted by *armando666*
> 
> Your postings are a bit contradictory. On one hand, you seem to be implying that a system over 3k is out of your comfort zone, so you would rather have a professional assemble it; on the other hand, you are saying that money is no object, as you have the funds to get someone to build the system, and it is the cheapest of your hobbies.
> 
> Putting money aside, I would just say that you are missing out on The Pleasure of Finding Things Out, as Feynman famously put it.


I definitely agree, although sometimes watching my computer crash can be scary. Hoping I didn't burn out my CPU! Haha.

Anyway, stop acting so serious! We are entitled to think you should do it yourself, as you are entitled to think you should get someone else to. If you can't handle an overclocking forum's criticism, don't post about it.


----------



## JoshMck

Also, I looked at Puget Systems. They look professional, but they also hold back on the parts I would want to use. XD Only Kingston RAM? Also, 4.3GHz for a liquid-cooled OC is quite low, so I wouldn't assume that's the fastest stable clock for a 3960X Extreme processor.


----------



## Rei86

Let's not get rude, and let's get off the topic of rcfc89's system (it's a very nice-looking system though, very clean, and quality work was put into it).


----------



## JoshMck

Wasn't trying to degrade his system; I was trying to make the point that they may do a job that works, but personally, overclocking to get the most you can out of a PC can push the systems they set up much further than they take them, stably and safely.

Once again, each person is entitled to their opinion, but I love getting into my BIOS.


----------



## grunion

What's this all about?
From Nov 2012 and it's just now trying to update it?
Any issues with this WDDM?


----------



## ceteris

Quote:


> Originally Posted by *rcfc89*
> 
> I would not hand my computer over to a bunch of kids like "Geek Squad." Puget Systems is who built my PC. Very reputable company.


Quote:


> Originally Posted by *armando666*
> 
> Your postings are a bit contradictory. On one hand, you seem to be implying that a system over 3k is out of your comfort zone, so you would rather have a professional assemble it; on the other hand, you are saying that money is no object, as you have the funds to get someone to build the system, and it is the cheapest of your hobbies.
> 
> Putting money aside, I would just say that you are missing out on The Pleasure of Finding Things Out, as Feynman famously put it.


Quote:


> Originally Posted by *JoshMck*
> 
> I will give you a tip: this is my cheapest hobby....
> 
> But I would not trust someone else to do this, nor would I lose the fun of doing it myself. I may fry something, but I will just replace it.
> 
> I also have the funds to have it done, but I would never get it done. This forum is called OCN, not "buy a computer and get someone else to overclock it."
> 
> I definitely agree, although sometimes watching my computer crash can be scary. Hoping I didn't burn out my CPU! Haha.
> 
> Anyway, stop acting so serious! We are entitled to think you should do it yourself, as you are entitled to think you should get someone else to. If you can't handle an overclocking forum's criticism, don't post about it.


To each his/her own, guys. I consider myself a PC enthusiast and a car enthusiast. I wouldn't mind digging into my hardware, but when it comes to dual cat exhausts, car stereos/speakers, or even oil changes, I'd rather have professionals handle it. The official excuse I pass around is that I live in an apartment and don't have my own garage to experiment in. But to tell you the truth, I feel more comfortable when a professional with more experience deals with it.


----------



## MasterClocker

My Christmas gift "gaming machine":

Case: Cooler Master Storm Trooper
Processor: Intel i7-3960X Extreme CPU
Asus Rampage IV Extreme mobo
32GB G.Skill Trident X quad-channel 2400MHz RAM (another 32GB on the way)
*2x Asus GTX 690 dual-GPU graphics cards in quad SLI*
Thermaltake Water 2.0 Extreme water cooling system
2x Kingston HyperX 240GB SATA 6Gb/s SSDs in RAID-0 (Windows 8 Pro x64)
2x OCZ Vertex 4 128GB SATA 6Gb/s (games and data)
LG Blu-ray burner, 14x rewriter
LG DVD burner, 24x
OCZ ZX 1250W Gold power supply
Logitech Z-5300e 560-watt home theater speakers
Toshiba 32" LED 1080p HDMI TV
Razer mouse, keyboard, and pad


----------



## Arizonian

Quote:


> Originally Posted by *MasterClocker*
> 
> My Christmas gift "gaming machine":
> 
> Case: Cooler Master Storm Trooper
> Processor: Intel i7-3960X Extreme CPU
> Asus Rampage IV Extreme mobo
> 32GB G.Skill Trident X quad-channel 2400MHz RAM (another 32GB on the way)
> *2x Asus GTX 690 dual-GPU graphics cards in quad SLI*
> Thermaltake Water 2.0 Extreme water cooling system
> 2x Kingston HyperX 240GB SATA 6Gb/s SSDs in RAID-0 (Windows 8 Pro x64)
> 2x OCZ Vertex 4 128GB SATA 6Gb/s (games and data)
> LG Blu-ray burner, 14x rewriter
> LG DVD burner, 24x
> OCZ ZX 1250W Gold power supply
> Logitech Z-5300e 560-watt home theater speakers
> Toshiba 32" LED 1080p HDMI TV
> Razer mouse, keyboard, and pad


Very nice. Congrats.







Welcome with your first post to OCN as well.









"How to put your Rig in your Sig"


----------



## JoshMck

Great computer to start with







congratulations! Quad GPU is fun


----------



## armando666

Quote:


> Originally Posted by *ceteris*
> 
> To each his/her own, guys. I consider myself a PC enthusiast and a car enthusiast. I wouldn't mind digging into my hardware, but when it comes to dual cat exhausts, car stereos/speakers, or even oil changes, I'd rather have professionals handle it. The official excuse I pass around is that I live in an apartment and don't have my own garage to experiment in. But to tell you the truth, I feel more comfortable when a professional with more experience deals with it.


You're right that it is about comfort level, and this is why people come to OCN: to learn and get comfortable. And if they are not comfortable, they go to guys like CyberPower, iBUYPOWER, etc. I agree, to each their own... but I guess OCN'ers are proud system builders, and they take exception when someone refers to these pre-builders or assemblers as professionals... lol... I guess "professional" is just a gauge relative to someone's own skill set, or lack thereof.


----------



## zer0sum

Quote:


> Originally Posted by *MasterClocker*
> 
> My Christmas gift "gaming machine":
> 
> Case: Cooler Master Storm Trooper
> Processor: Intel i7-3960X Extreme CPU
> Asus Rampage IV Extreme mobo
> 32GB G.Skill Trident X quad-channel 2400MHz RAM (another 32GB on the way)
> *2x Asus GTX 690 dual-GPU graphics cards in quad SLI*
> Thermaltake Water 2.0 Extreme water cooling system
> 2x Kingston HyperX 240GB SATA 6Gb/s SSDs in RAID-0 (Windows 8 Pro x64)
> 2x OCZ Vertex 4 128GB SATA 6Gb/s (games and data)
> LG Blu-ray burner, 14x rewriter
> LG DVD burner, 24x
> OCZ ZX 1250W Gold power supply
> Logitech Z-5300e 560-watt home theater speakers
> Toshiba 32" LED 1080p HDMI TV
> Razer mouse, keyboard, and pad


Are you sticking with that TV/monitor?
Seems like you should be rocking a nice 27" or 30", or even going surround with 3 monitors.


----------



## jcde7ago

Guys, let's try to stay on topic before I sic Arizonian on you guys.









I want to make sure I don't miss anyone who might need to be added to the list this week. I'm moving to a new place, so I'll only be hopping on OCN here and there throughout the week as I get settled in, and I may not have internet outside of my iPhone. That said, if anyone wants to get added this week without me missing it, PM'ing me might be the way to go.


----------



## MasterClocker

Hi zer0sum, I'm looking at 3x 24" or 27" monitors to make sure my rig is at its max potential! I'm also thinking about an OCZ RevoDrive 3 X2 480GB, but I have to choose between the RevoDrive and the monitors!

Thanks for your comments!


----------



## Kaapstad

Go for the monitors.

I use a RevoDrive and they are nice (if a bit slow to boot up), but a top-quality monitor will improve your user experience a lot more.


----------



## shremi

I am interested in getting this card....

Does anyone here know if the EVGA backplate is compatible with waterblocks other than EVGA's?


----------



## TeamBlue

Hi all,
I'm getting pretty excited to move into my new mini-ITX build in a Prodigy; I'm going to be putting an EVGA 690 under an EK waterblock. Any quirks/things I should immediately do? This will be going on a fresh install, so hopefully no driver issues...


----------



## Sujeto 1

Some months ago I asked here whether it was worth buying one GTX 690 instead of two GTX 680s. Some good fellows from this club told me I'd be better off with GTX 680 SLI, not just because it's cheaper (60 bucks is not a big deal anyway), but because they said a GTX 690 gets hotter and louder than SLI GTX 680s, total FPS will be higher in SLI, and drivers are better for SLI than for dual-GPU cards (they mentioned Skyrim performance wasn't what they expected). The pros are that you save space, and of course the GTX 690 is gorgeous. Today I finally have some time and I'm ready to pull the trigger on two GTX 680s or one GTX 690, and I won't be upgrading again for many years, so I want to recheck this information with you GTX 690 owners. Thank you in advance.


----------



## Alex132

2x 4GB 680s IMO


----------



## Sujeto 1

Quote:


> Originally Posted by *Alex132*
> 
> 2x 4GB 680s IMO


any good reason for this?


----------



## ceteris

Quote:


> Originally Posted by *shremi*
> 
> I am interested in getting this card....
> 
> Does anyone here know if the EVGA backplate is compatible with waterblocks other than EVGA's?


The only thing that's possibly incompatible is the screws: you won't get the right length to thread them in all the way for a secure fit. Koolance was the only exception in my experience, except that you might be left with some of the screw holes unpopulated. Someone correct me if I'm wrong; I'm just basing this on the 480 and 580 blocks I worked with before.

I luckily had some extra small screws the right size to work with the Heatkiller block. They came off something ages ago, and I couldn't tell you where to get them, though.


----------



## rcfc89

Quote:


> Originally Posted by *Sujeto 1*
> 
> Some months ago I asked here whether it was worth buying one GTX 690 instead of two GTX 680s. Some good fellows from this club told me I'd be better off with GTX 680 SLI, not just because it's cheaper (60 bucks is not a big deal anyway), but because they said a GTX 690 gets hotter and louder than SLI GTX 680s, total FPS will be higher in SLI, and drivers are better for SLI than for dual-GPU cards (they mentioned Skyrim performance wasn't what they expected). The pros are that you save space, and of course the GTX 690 is gorgeous. Today I finally have some time and I'm ready to pull the trigger on two GTX 680s or one GTX 690, and I won't be upgrading again for many years, so I want to recheck this information with you GTX 690 owners. Thank you in advance.


They told you two 680s combined would run cooler and produce less noise than a 690. What a crock of shi*. I prefer the 690 for the very reason that it produces much less heat and noise than a pair of 680s, and it's much more power efficient. Of course, if you plan to overclock the hell out of the 690, it's going to be less efficient and produce more heat. That's why I prefer not to push GPUs: for the little performance gain, it's not worth it IMO. I buy new GPUs every year, so my opinion might not suit your needs, but here it is anyway. Wait a few months and pick up a pair of 8970s or 780s. They will outperform anything in the current generation and most likely have more VRAM, as well as run much cooler and take less power to operate. With my current rig I'm able to max out almost all games, and my stock 690 runs nice and cool with the fans rarely breaking 50%, so it's quiet as well. But will that be the case 6 months from now, as games continue to push for more GPU power and VRAM? Probably not. IMO you can firmly future-proof your system for a few years with a strong CPU/RAM/mobo, but when it comes to GPUs, you're going to have to upgrade quite often to keep up with the demands of games if you want to run them at their max potential.


----------



## Arizonian

Quote:


> Originally Posted by *Sujeto 1*
> 
> Some months ago I asked here whether it was worth buying one GTX 690 instead of two GTX 680s. Some good fellows from this club told me I'd be better off with GTX 680 SLI, not just because it's cheaper (60 bucks is not a big deal anyway), but because they said a GTX 690 gets hotter and louder than SLI GTX 680s, total FPS will be higher in SLI, and drivers are better for SLI than for dual-GPU cards (they mentioned Skyrim performance wasn't what they expected). The pros are that you save space, and of course the GTX 690 is gorgeous. Today I finally have some time and I'm ready to pull the trigger on two GTX 680s or one GTX 690, and I won't be upgrading again for many years, so I want to recheck this information with you GTX 690 owners. Thank you in advance.


Well Sujeto 1 your going to have to weigh your preferences with some CONS and PROS and make the decision that's best for you as club members give their opinions.

I went with a dual GPU because I never had one before and only way to learn sometimes is hands on. What I do like about the dual is I've got the extra PCIe slot that I can either put a deticated sound card or TV turner. I'm looking into turning this into a TV too so I can watch TV in one corner of screen while I work, play, etc. If I do, I'm stuck and won't be changing my GTX 690 until I can replace it with another single card that is as powerful or I won't be upgrading until one is as capable.

Now, if you're not thinking of doing anything else with that other PCIe slot, or you have room alongside SLI to add a tuner or sound card, then the above reasoning is moot for you.

One advantage of SLI is that if one card dies, the other can run things while the second is being RMA'd. This too is a moot point if your motherboard has an Intel CPU with integrated graphics, as that will suffice for the system (without gaming) until the replacement arrives.

The key words, though, were that you're not upgrading for many years. In that case my advice, with future hindsight in mind, is to go with two GTX 680s and, as mentioned above, with 4GB VRAM. You'll have performance that won't be limited by possible VRAM bottlenecks for longer than you will with the 2GB per GPU the GTX 690 actually offers.

Though you did say you won't upgrade for many years, the nice thing about high-end cards is that they resell well, and making up the cost difference makes it less expensive to upgrade again later. It's also easier to sell two single cards split apart than one dual-GPU card, which will fetch more per sale but will probably take a little longer to sell.

There's my two cents.


----------



## zer0sum

Quote:


> Originally Posted by *pilla99*
> 
> How much do you pay them to do this? Just curious.
> For $700 a 690? Was it described as being cut in half? Because that is far and away the best price I have ever heard of for one. I wouldn't go a dollar below $900 if I was selling mine. I just looked at eBay earlier and they are still grabbing $1k+.


Think I got lucky because the title of the Ebay auction was misspelled and I ended up being the only bidder.
But you are right $700 shipped is crazy. Cheaper than 2 x 670's









Card just turned up today and it looks good. In its original box with cables etc.
Now I just have to wait for a new PSU so I can make sure its all ok. Then I can order the XSPC block and get it under water


----------



## VulgarDisplay88

Quote:


> Originally Posted by *zer0sum*
> 
> Think I got lucky because the title of the Ebay auction was misspelled and I ended up being the only bidder.
> But you are right $700 shipped is crazy. Cheaper than 2 x 670's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Card just turned up today and it looks good. In its original box with cables etc.
> Now I just have to wait for a new PSU so I can make sure its all ok. Then I can order the XSPC block and get it under water


I won an auction for a GTX 690 on the 20th December that was misspelled (got it for £355) but it still hasn't turned up yet. Seller told me that it is in my country but it's unable to be tracked so I've opened a case to see if they can help me out. I'm thinking that he is angry that he got such a low price for it and has just said that it's been sent when it hasn't.


----------



## PinzaC55

Quote:


> Originally Posted by *zer0sum*
> 
> Think I got lucky because the title of the Ebay auction was misspelled and I ended up being the only bidder.
> But you are right $700 shipped is crazy. Cheaper than 2 x 670's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Card just turned up today and it looks good. In its original box with cables etc.
> Now I just have to wait for a new PSU so I can make sure its all ok. Then I can order the XSPC block and get it under water


"Card just turned up today and it looks good. In its original box with cables etc."

Are you sure it's a real GTX 690?

When mine turned up I was like this


----------



## armando666

Quote:


> Originally Posted by *Sujeto 1*
> 
> any good reason for this?


Although it's not listed as one of your preferred options, and with all due respect to all the members here, have you considered a pair of these Sapphire Radeon Vapor-X HD 7970 GHz OC 6GB cards, especially if VRAM is a major consideration for you?
Personally, I think you will be happy with either of your choices.


----------



## Sujeto 1

Quote:


> Originally Posted by *armando666*
> 
> Although it's not listed as one of your preferred options, and with all due respect to all the members here, have you considered a pair of these Sapphire Radeon Vapor-X HD 7970 GHz OC 6GB cards, especially if VRAM is a major consideration for you?
> Personally, I think you will be happy with either of your choices.


I have considered this, yes; prices are very tempting right now. But losing PhysX, I think I'm going to miss some features in games, and I don't even like the idea of adding an additional Nvidia card just for that; it's ugly. Although I know the 7970 GHz kicks the GTX 680's ass all the way.


----------



## maximus56

Quote:


> Originally Posted by *ceteris*
> 
> The only thing that's possibly incompatible are the screws. You won't get the right length to thread them in all the way for a secure fit. Koolance was the only exception in my experience, except that you might be left with some of the screw holes unpopulated. Someone correct me if I'm wrong, but I'm just basing it on the 480 and 580 block I worked with before.
> 
> I luckily had some extra small screws the right size to work with the heatkiller block. They came off something ages ago and I couldn't tell you where to get them though


That is correct. Koolance would work, and you will get a pretty good fit even with some screw holes left unused.


----------



## Rei86

And my GTX 690 is back with me. Odd, as I've posted more than a few pictures in this thread and I'm still not a member of the 690 owners club.


Quote:


> Originally Posted by *Sujeto 1*
> 
> I have considered this, yes; prices are very tempting right now. But losing PhysX, I think I'm going to miss some features in games, and I don't even like the idea of adding an additional Nvidia card just for that; it's ugly. Although I know the 7970 GHz kicks the GTX 680's ass all the way.


Depends on the game, and the 7970 non-GHz is the real winner when it comes to price-to-performance ratio.

Major pros of the 690 vs. anything else:

- Aesthetically it's the best-looking card out there
- It only uses up one PCIe slot
- It only uses two rear I/O slots
- It only needs two 8-pin PCIe power connectors
- It's pretty quiet at idle

Major cons:
- You'll be limited to 2GB of VRAM (though this depends on the resolution you're playing at; it only matters if performance is actually affected once the game has filled all the VRAM. If not, don't worry about it)
- Dual GPUs on a single PCB have their limitations, and if you're into overclocking you'll run out of headroom faster than with a normal two-card setup
- Price

After all that, if you don't want a 690, a good alternative is the 7970 non-GHz model in CrossFire; the 7950 is a great value proposition, consistently below $300 and probably the best bang for the buck on the market ATM for gamers; the GTX 680 in any flavor is great; and the GTX 670 is another great card too.
If you're looking at a 690 as a purchase you're not really hurting for money, but the 660 Ti is another great choice. And of course I'm talking about all of these cards on the assumption that you can purchase more than one of them, i.e. to SLI/CF.


----------



## TeamBlue

The 690 is inarguably the best 2-slot solution period. For my upcoming bitfenix prodigy pushing a 1440p monitor, this is what sold me.


----------



## PinzaC55

OK, I have a couple of questions relating to issues on this page. One or both questions may be dumb but I think they are worth asking.
1) The only real criticism anyone seems to have of the GTX 690 is that it "only" has 2GB of VRAM per GPU.
So, imagine you are Nvidia; you have a high-tech factory, expert designers, skilled workers. Nvidia must have anticipated this criticism, so why didn't they put more VRAM in there? Is it too expensive, or is there a technical reason, i.e. not enough space on the card, higher power requirements, etc.?
2) Re Sujeto 1's post: "I have considered this, yes; prices are very tempting right now. But losing PhysX, I think I'm going to miss some features in games, and I don't even like the idea of adding an additional Nvidia card just for that; it's ugly. Although I know the 7970 GHz kicks the GTX 680's ass all the way."
I was going to ask: why does nobody put in a second card to use for PhysX? Does it simply not add enough performance to make it worthwhile, or is there another reason?


----------



## maximus56

Quote:


> Originally Posted by *Sujeto 1*
> 
> I have considered this, yes; prices are very tempting right now. But losing PhysX, I think I'm going to miss some features in games, and I don't even like the idea of adding an additional Nvidia card just for that; it's ugly. Although I know the 7970 GHz kicks the GTX 680's ass all the way.


I can't speak for everyone else, but I have been fairly happy with the 690 and have no regrets, now with the benefit of hindsight. I have never run into a VRAM issue. I run all games with max settings at 5760x1080 (in the interest of full disclosure, I am running a quad configuration). The only fps problems I have experienced have been associated with drivers or with games that are not well optimized. I will not trade out my 690 for any other card or cards at this point in time. If a new GPU introduced by Nvidia or AMD absolutely eclipses the performance of a single 690, I will change out my build.

If you check out the forums for other card owners, everyone seems to have their own set of issues, and I am not entirely certain how many people directly attribute their problems to VRAM. I have seen some people switch their cards due to the lack of VRAM, still not like what they get, and ultimately settle on what gives them the best user experience.

The best advice I can give you is to thoroughly read the 690, 680 and 7950/7970/7990 owners' threads and come to your own conclusion, based on the user experiences of the various card owners and your own set of needs and requirements. If you ask anyone for an opinion, they will naturally have a biased opinion colored by their own preferences, and that includes yours truly.


----------



## Scorpion49

Quote:


> Originally Posted by *PinzaC55*
> 
> OK, I have a couple of questions relating to issues on this page. One or both questions may be dumb but I think they are worth asking.
> 1) The only real criticism anyone seems to have of the GTX 690 is that it "only" has 2GB of VRAM per GPU.
> So, imagine you are Nvidia; you have a high tech factory, expert designers, skilled workers. Nvidia must have anticipated this criticism so why didn't they put more VRAM in there? Is it too expensive or is there a technical reason, ie not enough space on the card, higher power requirement etc?
> 
> Because it doesn't really need it. The people who complain about this split into two groups: the 99% who are whiners playing at 1080p and believe that because MSI Afterburner says the game wants to buffer X amount of RAM, they actually hit a VRAM limit; and the 1% who game at ridiculously high res and actually need it (I sure didn't at 6040x1200).
> 
> 2) Re Sujeto 1's post "I have considered this, yes, prices are very tempting rigth now, but losing Physix i think im gonna miss some feautres in games and dont even like thinking to add an aditional Nvidia card aditional for this matter thats uggly. al though i know 7970 GHZ kick as to GTX 680 all the way."
> I was going to ask why does nobody put in a second card to use for Physx? Does it simply not add enough performance to make it worthwhile or is there another reason?
> 
> Because it's a pain in the butt. I've run hybrid PhysX; it's hard to set up, it breaks easily, and Nvidia's newest PhysX driver kills it completely if there is AMD software present in the system, because they don't like us doing it. It took me about a week to get PhysX working in Borderlands 2 with a 7970 and a GTS 450.


Nice rig by the way.


----------



## armando666

+1 rep Scorpion 49 for calling it the way it is


----------



## Rei86

Quote:


> Originally Posted by *PinzaC55*
> 
> OK, I have a couple of questions relating to issues on this page. One or both questions may be dumb but I think they are worth asking.
> 1) The only real criticism anyone seems to have of the GTX 690 is that it "only" has 2GB of VRAM per GPU.
> So, imagine you are Nvidia; you have a high tech factory, expert designers, skilled workers. Nvidia must have anticipated this criticism so why didn't they put more VRAM in there? Is it too expensive or is there a technical reason, ie not enough space on the card, higher power requirement etc?


Because they could have gone with 3GB of VRAM. Would it have cost more? We don't know. But it was probably because Nvidia was just following its own footsteps in design: GTX 480 1.5GB > GTX 580 1.5GB > GTX 680 2GB.
Quote:


> Originally Posted by *PinzaC55*
> 
> 2) Re Sujeto 1's post "I have considered this, yes, prices are very tempting rigth now, but losing Physix i think im gonna miss some feautres in games and dont even like thinking to add an aditional Nvidia card aditional for this matter thats uggly. al though i know 7970 GHZ kick as to GTX 680 all the way."
> I was going to ask why does nobody put in a second card to use for Physx? Does it simply not add enough performance to make it worthwhile or is there another reason?


There really aren't enough games on the market to justify getting an Nvidia card just for PhysX. I've never done an AMD setup with an extra Nvidia card for PhysX, and I'm sure you'll run into more issues than it's worth with a hybrid setup like that.


----------



## ceteris

I used to care about PhysX too, but now I don't really. Most games I like to play either don't have it (or don't take full advantage of it, for that matter).

I think there was some dude a couple of months back who posted in this thread showing quad SLI with two 690s and a 680 for PhysX. When you think about it, you are spending almost $500 to see SPARKLES pop on your screen. I'm not one to skimp on my rig, but that just seems silly to me.


----------



## JoshMck

I love my 690 and running quad GPU I haven't hit my vram limit on 5760x1080.

That said, if I didn't want my case and motherboard I would have gone with quad 4GB 680s.


----------



## JoshMck

The only problem I've come across so far came from a driver update to 310.90. Did anyone else have an issue where frames went to absolute ****? 30 fps? Games were unplayable. Currently rolling back.

Any ideas?


----------



## zer0sum

Quote:


> Originally Posted by *Rei86*
> 
> Because they could have gone with 3GB of VRAM. Would it have cost more? We don't know. But it was probably because Nvidia was just following its own footsteps in design: GTX 480 1.5GB > GTX 580 1.5GB > GTX 680 2GB.


Ummm, I think they may have slightly more of a handle on things than that


----------



## rcfc89

Quote:


> Originally Posted by *JoshMck*
> 
> Only problem I've come across so far came from a driver update to 310.90, did anyone have an issue where frames went to absolute ****? 30 fps? Games were unplayable. Currently rolling back.
> 
> Any ideas?


The .90 driver had lots of issues with the 690. They just released a revised version yesterday morning fixing the 690 problems. I was running .70 up until yesterday because of it. Downloaded .90 last night and everything is running great now for me.


----------



## JoshMck

Quote:


> Originally Posted by *rcfc89*
> 
> The .90 driver had lots of issues with the 690. They just released a revised version yesterday morning fixing the 690 problems. I was running .70 up until yesterday because of it. Downloaded .90 last night and everything is running great now for me.


Maybe I had an older link, because today I resumed an old download that had been cancelled.

So I will check again tomorrow.









but from 200 fps to 30 fps I was a little worried! haha


----------



## Rei86

Quote:


> Originally Posted by *zer0sum*
> 
> Ummm, I think they may have slightly more of a handle on things than that


Handle on what?

The GTX 590 was 1.5GB also.

It's just the way they do things. They've only increased VRAM by 1GB total since the GTX 280.

EDIT: Besides the cold boot issues the 690s also suffered with 310.90, they had performance issues too. Good luck getting the right 310.90, though, as they secretly patched it and you have a chance of getting the old 310.90 vs. the new 310.90...


----------



## rcfc89

Quote:


> Originally Posted by *Rei86*
> 
> Handle on what?
> 
> The GTX 590 was 1.5GB also.
> 
> It's just the way they do things. They've only increased VRAM by 1GB total since the GTX 280.
> 
> EDIT: Besides the cold boot issues the 690s also suffered with 310.90, they had performance issues too. Good luck getting the right 310.90, though, as they secretly patched it and you have a chance of getting the old 310.90 vs. the new 310.90...


This one worked for me as of yesterday. The one I tried last week would not boot properly at all.
http://www.nvidia.com/object/win8-win7-winvista-64bit-310.90-whql-driver.html


----------



## jcde7ago

@Rei86: I'm adding you now!


----------



## armando666

Quote:


> Originally Posted by *JoshMck*
> 
> Maybe I had an older link, because I downloaded today from an old download that was cancelled.
> 
> So I will check again tomorrow
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but from 200 fps to 30 fps I was a little worried! haha


Try the link from this post over on evga forums

http://us.download.nvidia...international-whql.exe


----------



## Rei86

Quote:


> Originally Posted by *jcde7ago*
> 
> @Rei86: I'm adding you now!


/hug

If you wait until probably the end of this week you can get the patched 310.90 that has most issues fixed.


----------



## PinzaC55

Quote:


> Originally Posted by *VulgarDisplay88*
> 
> I won an auction for a GTX 690 on the 20th December that was misspelled (got it for £355) but it still hasn't turned up yet. Seller told me that it is in my country but it's unable to be tracked so I've opened a case to see if they can help me out. I'm thinking that he is angry that he got such a low price for it and has just said that it's been sent when it hasn't.


Sorry dude but you will never see that card. The only good side is that you will get your money back from Paypal or Ebay.


----------



## zer0sum

Just curious if anyone has tried plugging one of these in with an 8-pin and then just a 6-pin PCIe power cable?

Getting bored waiting for my new PSU to arrive


----------



## Rei86

Quote:


> Originally Posted by *zer0sum*
> 
> Just curious if anyone has tried plugging one of these in with an 8-pin and then just a 6-pin PCIe power cable?
> 
> Getting bored waiting for my new PSU to arrive


All 690s come with an 8-pin to two 4-pin Molex adapter. I'd advise against what you're thinking of doing, and everyone here would probably agree.


----------



## Scorpion49

Quote:


> Originally Posted by *zer0sum*
> 
> Just curious if anyone has tried plugging one of these in with an 8-pin and then just a 6-pin PCIe power cable?
> 
> Getting bored waiting for my new PSU to arrive


It doesn't work. Some cards with 8-pin connectors will power up with a 6-pin, but this one will not. I'm using one 8-pin, one 6-pin, and one Molex-to-6-pin adapter to power mine right now. If you don't have them all plugged in, it will give you a warning at POST and not continue.


----------



## qiplayer

Nvidia puts 2GB of VRAM on because users will change the card sooner. Why spend more and reduce your future income potential?
In my opinion it's purely marketing.
About VRAM: nowadays we are used to HD displays. Apple is the first with Retina on computer displays; others will follow. I believe in two years the market will have resolutions double the current ones, most of all for 25-28" displays.

And still there will be no bezel-less monitor for those who want Surround or Eyefinity.
















I think it's absurd that a gamer can't buy the GPU power needed for surround.
At the moment, tri- or quad-SLI often means a lot of work and all kinds of issues.

As said in another thread, let's say Apple starts to develop a console. It will blow away everything on the market and in planning. Consoles are important for now because they set a reference point; users don't have to worry or spend extra time and cash to make things work.
But in my opinion a console coming out in the next year should be able to handle 3D at least in HD. That would mean a higher baseline for all PC CPUs, GPUs and game development. Prices would drop and consumers would get more for less.


----------



## maximus56

Quote:


> Originally Posted by *JoshMck*
> 
> Only problem I've come across so far came from a driver update to 310.90, did anyone have an issue where frames went to absolute ****? 30 fps? Games were unplayable. Currently rolling back.
> 
> Any ideas?


I had issues with game fps when booting in to Windows 8, but did not have any problems on Windows 7. I have not tried the new patch that apparently came out, as I have been fairly busy with my work travels.
If someone has successfully tested the new patch, please do share your experience for the benefit of people like me who have not had an opportunity to do so. Thanks.


----------



## Rei86

Quote:


> Originally Posted by *qiplayer*
> 
> Nvidia puts 2gb of vram
> _[snip]_


That's because most people probably think a surround setup is stupid vs. having a single large monitor, i.e. the group of people who own 50+ inch screens at 1920x1080 and think it's the same thing.

Also the screen cartel pretty much controls and manipulates the market, which is the main reason the screen market is so stagnant.


----------



## Alex132

Just got my 690 back from Canada and installed in my rig.

Holy crap. This thing is stupidly fast. Coming from a 5870, I can actually play DX11 games now


----------



## PinzaC55

So in other words, possibly Nvidia has given the GTX 690 "built-in redundancy" so that it doesn't overshadow later single-GPU cards?


----------



## Alex132

EVGA Geforce GTX logo brightness adjustment

Seems to only work on EVGA BIOS 690's.

Awesome tool though!


----------



## Scorpion49

Anyone using the Accelero cooler with this card? Trying to see if there are any sagging issues and if the backplate is compatible.


----------



## Rei86

Quote:


> Originally Posted by *Scorpion49*
> 
> Anyone using the Accelero cooler with this card? Trying to see if there are any sagging issues and if the backplate is compatible.


http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/2700_100#post_18538363

http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/2800_100#post_18738869


----------



## Scorpion49

Quote:


> Originally Posted by *Rei86*
> 
> http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/2700_100#post_18538363
> 
> http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/2800_100#post_18738869


Oh nice! Were you able to keep it under 70*C? How about overclock?


----------



## Rei86

Quote:


> Originally Posted by *Scorpion49*
> 
> Oh nice! Were you able to keep it under 70*C? How about overclock?


Installation - Easy, though the paper instructions they give you kind of suck. You get everything you need to install it right away.

Sagging - Yes, you'll get sagging. The cooler is about as heavy as the whole stock GTX 690. However, as in my post, they give you foam sticks to support it. If they're too short you can go the ghetto way and use some paper-towel cardboard tubes.

Temps - Under full load (stock everything, BTW) the card hit 75 on GPU 1 and 71 on GPU 2. With the Turbo max on, GPU 1 was around 50-60, and GPU 2 was always right behind. HOWEVER, the silence was golden.

Noise - Which brings us to this one. Every review of the 690, IMHO, was a lie: at idle, yes, it's silent, but under load the fans are loud. This setup was silent at idle and under load.


----------



## Qu1ckset

Quote:


> Originally Posted by *Scorpion49*
> 
> Oh nice! Were you able to keep it under 70*C? How about overclock?


For that kind of money ($169.99) I'd rather watercool it. My GTX 690 doesn't go over 35 degrees, whereas with the stock cooler I was hitting 80 degrees.


----------



## Rei86

Quote:


> Originally Posted by *Qu1ckset*
> 
> For that kind of money ($169.99) I'd rather watercool it. My GTX 690 doesn't go over 35 degrees, whereas with the stock cooler I was hitting 80 degrees.


I'm watercooling my CPU and GPU, however.

$169.99 for the Turbo 690.

WC setup, in round numbers:

$1.99 for the water
$6.99 for a kill coil
$59.99 for a radiator
$15.99 for tubing
$79.99 for a pump/res combo
$129.99 for the GPU block (cheapest)
$294.94 total before shipping
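For anyone pricing out a similar loop, the parts list above sums as quoted; a trivial sanity check (the prices are just the poster's round numbers, not current market prices):

```python
# Watercooling parts list from the post above; prices are the poster's examples.
parts = {
    "water": 1.99,
    "kill coil": 6.99,
    "radiator": 59.99,
    "tubing": 15.99,
    "pump/res combo": 79.99,
    "GPU block (cheapest)": 129.99,
}

total = round(sum(parts.values()), 2)
print(total)  # 294.94, before shipping
```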


----------



## Scorpion49

Quote:


> Originally Posted by *Rei86*
> 
> I'm WC My CPU and GPU however
> 
> 169.99 for Turbo 690
> 
> WC setup round numbers
> 
> 1.99 For the Water
> 6.99 For kill coil
> 59.99 for Radiator
> 15.99 for Tubing
> 79.99 for Pump/Res combo
> 129.99 for GPU Block (Cheapest)
> 294.94 Total before shipping


It's $99 now, which is why I was considering it. I might give it a shot; I want to try to fit the EVGA backplate as well.


----------



## Rei86

Quote:


> Originally Posted by *Scorpion49*
> 
> Its $99 now, which is why I was considering it. I might give it a shot, I want to try and fit the EVGA backplate as well.


Damn, that's a huge dip. Good buy at that price.


----------



## Scorpion49

Hmm, so now I just re-installed the 310.90 drivers and my card is no longer downclocking at idle. Weird.


----------



## Rei86

Quote:


> Originally Posted by *Scorpion49*
> 
> Hmm, so now I just re-installed the 310.90 drivers and my card is no longer downclocking at idle. Weird.


310.90 with the GTX690 should've had its "secret" patch

http://www.evga.com/forums/fb.ashx?m=1837622

might wanna post in this thread
http://www.evga.com/forums/tm.aspx?&m=1830605&mpage=4

Or go to the nvidia forums and check in on this thread
https://forums.geforce.com/default/topic/527753/geforce-drivers/nvidia-310-90-geforce-sli-multi-gpu-cold-boot-gpu-issue/


----------



## Scorpion49

Quote:


> Originally Posted by *Rei86*
> 
> 310.90 with the GTX690 should've had its "secret" patch
> 
> http://www.evga.com/forums/fb.ashx?m=1837622
> 
> might wanna post in this thread
> http://www.evga.com/forums/tm.aspx?&m=1830605&mpage=4
> 
> Or go to the nvidia forums and check in on this thread
> https://forums.geforce.com/default/topic/527753/geforce-drivers/nvidia-310-90-geforce-sli-multi-gpu-cold-boot-gpu-issue/


I figured it out, had to delete and do a clean install as I just moved from Z77 to X79 as well, something got out of whack.


----------



## zer0sum

Seems I got lucky and my $700 eBay GTX 690 is legit:




Now I just need to buy a water block to make it run cool and quiet, and get a bit of an overclock on this thing.


----------



## Methos07

The cold boot issue is driving me nuts, lol. Nothing like seeing a default vga resolution on a dell u3011...


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Methos07*
> 
> The cold boot issue is driving me nuts, lol. Nothing like seeing a default vga resolution on a dell u3011...


oh roll back buddy, not worth the hassle.


----------



## Scorpion49

New personal best score with the new rig, 3930k at 4.5ghz and GTX 690 at +100/+300

P17090


----------



## Rei86

Quote:


> Originally Posted by *Scorpion49*
> 
> New personal best score with the new rig, 3930k at 4.5ghz and GTX 690 at +100/+300
> 
> P17090


Damn, that 3930K is really helping you out.

What's your core MHz?


----------



## Scorpion49

Quote:


> Originally Posted by *Rei86*
> 
> Damn that 3930K is really helping you out.


Yeah, 3k just from the Physics score







Gotta love MOAR THREADZ. Card tops out at 1163mhz but usually 1150.


----------



## Rei86

Quote:


> Originally Posted by *Scorpion49*
> 
> Yeah, 3k just from the Physics score
> 
> 
> 
> 
> 
> 
> 
> Gotta love MOAR THREADZ. Card tops out at 1163mhz but usually 1150.


Damn... that 3930K is _really_ helping you out.

I was going to go with X79, but the price turned me off... oh well. Next round it's either the 1090 (or whatever AMD calls it) or I'm going to pony up for the Intel X89 (I guess that's what the name might be).


----------



## bobbavet

*ADD ME*

For your viewing pleasure.


----------



## Arizonian

Quote:


> Originally Posted by *bobbavet*
> 
> For your viewing pleasure.


One hell of a mod!







Great job creating something unique.


----------



## mrod

I just ordered the GTX 690







It will be replacing two 6970s in CrossFire. Is it safe to say that the 690 will smoke the two 6970s?


----------



## Scorpion49

Quote:


> Originally Posted by *mrod*
> 
> I just ordered the GTX 690
> 
> 
> 
> 
> 
> 
> 
> it will be replacing two 6970's in xfire . Is it safe to say that the 690 will smoke the two 6970's ?


Yes. You would need tri-fire 6970's at least to compare them.


----------



## pilla99

Quote:


> Originally Posted by *mrod*
> 
> I just ordered the GTX 690
> 
> 
> 
> 
> 
> 
> 
> it will be replacing two 6970's in xfire . Is it safe to say that the 690 will smoke the two 6970's ?


A lot of this depends on the game and what it is optimized for.
E.g. in Crysis you will see huge gains, but in Metro 2033 you might actually do the same or worse.


----------



## Scorpion49

Which GPU on the 690 is considered #1 and 2? My #1 GPU is reaching 90*C+ in games while #2 is only around 70*C.


----------



## Rei86

Quote:


> Originally Posted by *Scorpion49*
> 
> Which GPU on the 690 is considered #1 and 2? My #1 GPU is reaching 90*C+ in games while #2 is only around 70*C.


90c on the stock cooler man? That's a bit dangerous... Case airflow fine? Did you take your card apart? If so did you put it back on fine?

GPU1 is the left, GPU2 on the right


----------



## Scorpion49

Quote:


> Originally Posted by *Rei86*
> 
> 90c on the stock cooler man? That's a bit dangerous... Case airflow fine? Did you take your card apart? If so did you put it back on fine?
> 
> GPU1 is the left, GPU2 on the right


Yeah I figured out the problem, my front fans are blowing the hot CPU air straight into the card (off of my Kraken X60 front mounted). I've made a patented Cardboard Deflector and screwed it to the back of the card, we'll see what happens now. My accelero is on order so hopefully it will be good when I change it out.


----------



## JoshMck

On stock air I never got above 85 degrees.

BTW, you can get that 3930K above 3.8 GHz as a stable gaming clock. Mine sits stably at 3.95 GHz at 1.49v. No restriction on my 690s.









Congrats on the 690 mrod!


----------



## grunion

The hottest mine has been over the last 3 days, extended BF3 and FC3 sessions.


----------



## JoshMck

Quote:


> Originally Posted by *grunion*
> 
> The hottest mine has been over the last 3 days, extended BF3 and FC3 sessions.


Those are some really good temps. Is that on the stock cooler?

I'm watercooling, so now I never get above 35 degrees.


----------



## Alex132

I only get about max 78'c on GPU1, and 76'c on GPU2.

This is with a ~35'c ambient temperature


----------



## Scorpion49

Quote:


> Originally Posted by *grunion*
> 
> The hottest mine has been over the last 3 days, extended BF3 and FC3 sessions.


I just removed the cooler and re-applied some Shin-etsu G751 and new thermal pads on the VRM's and the stock paste-ish stuff was bone dry and crumbling.

Quote:


> Originally Posted by *JoshMck*
> 
> On stock air I never got above 85 degrees. I was seeing a max of 91*C on GPU1 and 85*C on GPU 2 before, even with a VERY aggressive and loud fan profile.
> 
> BTW you can get that 3930k above 3.8 GHz as a stable gaming clock. Mine sits at 3.95ghz stably 1.49 v. No restriction on my 690s.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats on the 690 mrod!


If that was directed at me, I run my 3930k at 4.5ghz 1.225v. My chip is not a good clocker (won't even post above 4.7ghz) but it will run 4.5 and lower at very low voltage.


----------



## Alex132

Quote:


> Originally Posted by *Scorpion49*
> 
> I just removed the cooler and re-applied some Shin-etsu G751 and new thermal pads on the VRM's and the stock paste-ish stuff was bone dry and crumbling.


I've been wanting to do this for a while, but is the 690 any harder to disassemble than a normal GPU?


----------



## Scorpion49

Quote:


> Originally Posted by *Alex132*
> 
> I've been wanting to do this for a while, but is the 690 any harder to disassemble than a normal GPU?


Yes and no. I took the whole thing apart, and the shroud is like a jigsaw puzzle; but if you just take all the back screws out and flip it over, the cooler will come off as one unit. My temps improved dramatically, the stock paste was very dry. My #1 is now maxing at 77°C and #2 at 70°C; dropped 15°C on both.


----------



## V3teran

Have you people used the new KGB mod released by CrazyNuts?
Version 0.6.1 was released a few days ago and now has adjustable boost clocks.
https://www.dropbox.com/s/d6dimgnx8cxvlca/kgb_0.6.1.zip

Here is a chart of the boost clocks that can be configured in the .cfg file.

These have been coded in by CrazyNuts... choose one.
I'm thinking around 1254, seeing as some reviewers had cards that boosted around that mark when the card was released.

771.0
784.0
797.0
810.5
823.5
836.5
849.5
862.5
875.5
888.5
901.5
915.0
928.0
941.0
954.0
967.0
980.0
993.0
1006.0
1019.5
1032.5
1045.5
1058.5
1071.5
1084.5
1097.5
1110.5
1124.0
1137.0
1150.0
1163.0
1176.0
1189.0
1202.0
1215.0
1228.5
1241.5
1254.5
1267.5
1280.5
1293.5
1306.5
1319.5
1333.0
1346.0
1359.0
1372.0
1385.0
1398.0
1411.0
1424.0
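If you want to sanity-check which chart entry a target clock maps to, a tiny Python sketch works (the list below is just the chart above copied verbatim; `nearest_bin` is a hypothetical helper for this post, not part of the KGB tool):

```python
# Nearest-bin lookup over the KGB 0.6.1 boost-clock chart posted above.
# The list is copied verbatim; nearest_bin is a hypothetical helper.
KGB_BOOST_BINS = [
    771.0, 784.0, 797.0, 810.5, 823.5, 836.5, 849.5, 862.5, 875.5,
    888.5, 901.5, 915.0, 928.0, 941.0, 954.0, 967.0, 980.0, 993.0,
    1006.0, 1019.5, 1032.5, 1045.5, 1058.5, 1071.5, 1084.5, 1097.5,
    1110.5, 1124.0, 1137.0, 1150.0, 1163.0, 1176.0, 1189.0, 1202.0,
    1215.0, 1228.5, 1241.5, 1254.5, 1267.5, 1280.5, 1293.5, 1306.5,
    1319.5, 1333.0, 1346.0, 1359.0, 1372.0, 1385.0, 1398.0, 1411.0,
    1424.0,
]

def nearest_bin(target_mhz):
    """Return the chart entry closest to the requested clock."""
    return min(KGB_BOOST_BINS, key=lambda b: abs(b - target_mhz))

print(nearest_bin(1250))  # -> 1254.5
```

So a "1250-ish" target lands on the 1254.5 entry, matching the value V3teran mentions.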


----------



## Alex132

Just took this picture quickly, thought it turned out pretty well









Wish I'd remembered to turn the GeForce GTX logo on though!


----------



## noob.deagle

Is it worth raising the power limit on the GTX 690 even if you're not overclocking? I notice that on mine both cores seem to have different max boosts, which is kinda weird.


----------



## Alex132

Quote:


> Originally Posted by *noob.deagle*
> 
> Is it worth raising the power limit on the GTX 690 even if you're not overclocking? I notice that on mine both cores seem to have different max boosts, which is kinda weird.


Yes, raising it to something like 135% means the max boost clock will either be higher, or held more consistently.
E.g. if your boost clock peaks at 1058MHz on GPU1 and 1071MHz on GPU2, but only for a brief second, the raised power limit may let the GPUs sit at those boost speeds constantly rather than just touching them.

Easiest way to 'overclock' the GPUs.
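The effect described above can be sketched as a toy model (every number here, bin size, baseline power, watts per bin, is invented for illustration; this is not NVIDIA's actual boost algorithm):

```python
# Toy model of GPU Boost vs. the power limit. Assumption: boost moves in
# ~13 MHz bins, and holding each extra bin costs a fixed slice of board
# power. The power figures below are made up purely for illustration.
BIN_MHZ = 13.0
BASE_MHZ = 915.0        # GTX 690 base clock
BASE_POWER_W = 263.0    # illustrative baseline board power
WATTS_PER_BIN = 4.0     # invented cost of sustaining one extra bin

def sustained_boost(power_limit_pct):
    """Highest clock the toy model can hold within the power budget."""
    headroom = BASE_POWER_W * power_limit_pct / 100.0 - BASE_POWER_W
    bins = max(0.0, headroom // WATTS_PER_BIN)
    return BASE_MHZ + bins * BIN_MHZ

print(sustained_boost(100), sustained_boost(135))  # -> 915.0 1214.0
```

The point is only the shape of the behavior: at 100% there is no headroom to hold any boost bin, while a higher limit lets the card park near its top bin instead of touching it for a second and backing off.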


----------



## Rei86

Quote:


> Originally Posted by *Scorpion49*
> 
> Yes and no. I took the whole thing apart, the shroud is like a jigsaw puzzle. But if you just take all the back screws out and flip it over it will come off as one unit. My temps improved dramatically, the stock paste was very dry. My #1 is now maxing at 77*C and #2 at 70*C. Dropped 15*C on both.


I knew the Prodigy has **** airflow, since my AMD A8 reached 40+ on the stock box cooler in mine.

TBH man, 77 and 70 is really hot. Keplers throttle above 70°C, so....

The 690 in my case never went over 69°C on GPU1 after I just replaced the paste, so I don't know what more you can do for yours...


----------



## Alex132

Quote:


> Originally Posted by *grunion*
> 
> The hottest mine has been over the last 3 days, extended BF3 and FC3 sessions.


Wait, how are you at 140% GPU Power?


----------



## FiShBuRn

Modded BIOS, I guess...

Here's mine, also with a modded BIOS, but 135% or 150% is the same; my max overclock without crashing is 1215.



This is after playing a 64-player Battlefield 3 map.


----------



## RandomHer0

Add me please









EVGA


----------



## Alex132

Quote:


> Originally Posted by *FiShBuRn*
> 
> Modded BIOS, I guess...
> 
> Here's mine, also with a modded BIOS, but 135% or 150% is the same; my max overclock without crashing is 1215.
> 
> 
> 
> This is after playing a 64-player Battlefield 3 map.


Only 59°C on both GPUs? What is your ambient?
Also, +560MHz on memory?! What is the normal 690 memory overclock?!


----------



## FiShBuRn

Dunno for sure, but maybe 15-20°C.

I think +500 memory OC is normal; you don't need a modded BIOS for that...


----------



## Alex132

Quote:


> Originally Posted by *FiShBuRn*
> 
> Dunno for sure, but maybe 15-20°C.
> 
> I think +500 memory OC is normal; you don't need a modded BIOS for that...


You noticed any improvement with the memory overclock?


----------



## grunion

Quote:


> Originally Posted by *JoshMck*
> 
> That's some really good temps. Is that on the stock cooler?
> 
> Mine is water cooled now, so I never get above 35 degrees.


Air, cracked window, north breeze









Quote:


> Originally Posted by *Alex132*
> 
> I only get about max 78'c on GPU1, and 76'c on GPU2.
> 
> This is with a ~35'c ambient temperature


I find that very hard to believe with a 35°C ambient.

Quote:


> Originally Posted by *Alex132*
> 
> Wait, how are you at 140% GPU Power?


Correct, modded BIOS.

Quote:


> Originally Posted by *FiShBuRn*
> 
> Modded BIOS, I guess...
> 
> Here's mine, also with a modded BIOS, but 135% or 150% is the same; my max overclock without crashing is 1215.
> 
> 
> 
> This is after playing a 64-player Battlefield 3 map.


Same here; modded or not, 1215 is my max stable boost.

My usage during BF3 is almost always closer to 99%.


----------



## FiShBuRn

Quote:


> Originally Posted by *Alex132*
> 
> You noticed any improvement with the memory overclock?


Only in 3DMark; this is my latest score: http://www.3dmark.com/3dm11/5639035

grunion, what resolution do you use?


----------



## grunion

1200p


----------



## FiShBuRn

Hmmm... and you got 99% usage every time? In every map?


----------



## grunion

Quote:


> Originally Posted by *FiShBuRn*
> 
> Hmmm... and you got 99% usage every time? In every map?


Nope, I was wrong.

95-99 in Caspian

75-85 in Armored Kill

Definitely not every map.

Here is my best 3DMark11 score to date: a 21348 GPU score.


----------



## MrTOOSHORT

Nice score grunion!

You just need 6 core SB-E to kick my butt in overall.


----------



## grunion

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Nice score grunion!
> 
> You just need 6 core SB-E to kick my butt in overall.


I might take it to work and pop it in my RIVE, see what happens with that.


----------



## Scorpion49

Quote:


> Originally Posted by *grunion*
> 
> I might take it to work and pop it in my RIVE, see what happens with that.


You'll gain about 3k overall; that's what I got switching from a 3570k to a 3930k.

In other news, I have to undervolt and underclock my card to keep it under 90°C again. I thought I had fixed it, but remember that I turned the FC3 settings down to medium so the fan wouldn't be at 75%+ all the time. This card is really awful as far as heat goes; my 590 was nowhere near this bad even OC'd as far as it would go.


----------



## Rei86

Quote:


> Originally Posted by *grunion*
> 
> I might take it to work and pop it in my RIVE, see what happens with that.


With a stock 3770K and nothing else touched, all of the 690s I've owned have hit around 14XXX~15XXX. Between the 3930K and 3770K, the CPU score is anywhere from 1~2k lower, and the combined score is what really hurts the overall 3DMark11 result for the 3770K vs the 3930K.

However, in some gaming tests run between another 690 owner and myself (3930K vs 3770K), there is only a sliver of difference.


----------



## Kaapstad

Quote:


> Originally Posted by *grunion*
> 
> I might take it to work and pop it in my RIVE, see what happens with that.


I have noticed using the same GTX 690 in my RIVE/3960x and then in my RIIIE/975, the graphics scores are about 300 points higher in the RIIIE/975 than they are in the RIVE/3960x.

When it comes to total score there is no contest, the RIVE/3960x wins easy.


----------



## grunion

Quote:


> Originally Posted by *Kaapstad*
> 
> I have noticed using the same GTX 690 in my RIVE/3960x and then in my RIIIE/975, the graphics scores are about 300 points higher in the RIIIE/975 than they are in the RIVE/3960x.
> 
> When it comes to total score there is no contest, the RIVE/3960x wins easy.


I noticed the same thing with a RIVE/7970 combo









Example


----------



## Arizonian

Quote:


> Originally Posted by *grunion*
> 
> Nope I was wrong,,
> 
> 95-99 in Caspian
> 
> 75-85 in Armored Kill
> 
> Definitely not every map.
> 
> Here is my best 3DMark11 score to date, 21348 gpu score.


Wait did I miss something? You got a GTX 690? Congrats.









Your score was very impressive; must be some northern wind.


----------



## Scorpion49

I finally broke 20k GPU score, +125/500.

P17308 http://www.3dmark.com/3dm11/5643217


----------



## Cheesemaster

here is my new score with .90 drivers....


----------



## Alex132

Any clues as to why I am getting such a low score?
Especially the combined score!


----------



## Alex132

Set PhysX to CPU-only in the NVIDIA panel and my (combined) score increased massively:









Wat


----------



## Rei86

3DMark11's PhysX isn't the same Ageia PhysX used in video games, and it benefits more from CPU calculations than from the GPU.

EDIT: Nvidia rep video games


----------



## Alex132

Quote:


> Originally Posted by *Rei86*
> 
> 3DMark11's PhysX isn't the same Ageia PhysX used in video games, and it benefits more from CPU calculations than from the GPU.


Yeah, it uses the Havok engine or something along those lines. I just heard that 3DMark11 can have trouble properly recognizing SLI; wondering if that's what fixed it.


----------



## Kaapstad

Quote:


> Originally Posted by *Cheesemaster*
> 
> here is my new score with .90 drivers....


The best drivers I've found for this card and quad SLI are 301.42.


----------



## Tech09

Quote:


> Originally Posted by *rcfc89*
> 
> If I was working with 800 dollars worth of hardware I might give it a shot. But with over 3k worth of hardware I'll put it into the hands of professionals that do it for a living.


I still don't get why you don't just do it yourself... My rig is over 8K and I play around with it myself.







I'm just starting to master CPU overclocking before I move on to my GPUs.


----------



## pilla99

Quote:


> Originally Posted by *Tech09*
> 
> I still don't get why you don't just do it yourself... My rig is over 8K and I play around with it myself.
> 
> 
> 
> 
> 
> 
> 
> I'm just starting to master CPU overclocking before I move on to my GPUs.


Looking at your sig, I have no idea how your rig would cost 8k:
2 690's
a 1k CPU
... erm

Unless it's gold plated, I'm trying to figure out where the extra 3k would come from.


----------



## FiShBuRn

Quote:


> Originally Posted by *Cheesemaster*
> 
> here is my new score with .90 drivers....


With only one 690? That's impressive...

Edit: nvm, I saw your pic; it's quad SLI.


----------



## Rei86

Quote:


> Originally Posted by *pilla99*
> 
> Looking at your sig, I have no idea how your rig would cost 8k:
> 2 690's
> a 1k CPU
> ... erm
> 
> Unless it's gold plated, I'm trying to figure out where the extra 3k would come from.


Does it matter? I'm sure the owner has probably rotated out a lot of items and is keeping tabs on those as well. Not only that, his system is water-cooled, and water cooling adds up super fast.


----------



## ceteris

My scores went up slightly with the new drivers

3930K @ 4.8GHz, GTX 690 @ +200/500

P18760 on 310.70: http://www.3dmark.com/3dm11/5442770

P18860 on 310.90: http://www.3dmark.com/3dm11/5653540
Quote:


> Originally Posted by *pilla99*
> 
> Looking at your sig, I have no idea how your rig would cost 8k:
> 2 690's
> a 1k CPU
> ... erm
> 
> Unless it's gold plated, I'm trying to figure out where the extra 3k would come from.


He's from Europe (Norway), so I assume he's paying a bit more for his parts.


----------



## grunion

Quote:


> Originally Posted by *Arizonian*
> 
> Wait did I miss something? You got a GTX 690? Congrats.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your score was very impressive must be some northern wind.


Probably not for much longer; I think I'm going to trade it for a couple of Tahitis.

Quote:


> Originally Posted by *ceteris*
> 
> My scores went up slightly with the new drivers
> 
> 3930K @ 4.8GHz, GTX 690 @ +200/500
> 
> P18760 on 310.70: http://www.3dmark.com/3dm11/5442770
> 
> P18860 on 310.90: http://www.3dmark.com/3dm11/5653540
> He's from Europe (Norway) so I assume he's paying a bit higher for his parts.


Nice..

What does +200 give you for max boost?

Side note for all owners: do you guys actually see core throttling above 70°C?


----------



## ceteris

Quote:


> Originally Posted by *grunion*
> 
> What does +200 give you for max boost.
> 
> Side note for all owners, do you guys actually see core throttling >70°c?


1228 on GPU1 @ 9.5C deltaT and 1241 on GPU2 @ 12C deltaT

When I was on air, throttling was pretty bad when I tried OC'ing it. I couldn't get anywhere near half the OC I have now w/o blasting up the AC to help with the ambient. Then I get my wife complaining about how cold it is.

At first I thought I just got stuck with one of those weak binned GPU's but putting her under water made a huge difference. Currently at 27.5C ambient, temps barely tickle under 40C at full load and performs very happily.


----------



## grunion

Quote:


> Originally Posted by *ceteris*
> 
> 1228 on GPU1 @ 9.5C deltaT and 1241 on GPU2 @ 12C deltaT
> 
> When I was on air, throttling was pretty bad when I tried OC'ing it. I couldn't get anywhere near half the OC I have now w/o blasting up the AC to help with the ambient. Then I get my wife complaining about how cold it is.
> 
> At first I thought I just got stuck with one of those weak binned GPU's but putting her under water made a huge difference. Currently at 27.5C ambient, temps barely tickle under 40C at full load and performs very happily.


Awesome...

BTW mine never throttles









Side note, check this out.


----------



## Tech09

Quote:


> Originally Posted by *pilla99*
> 
> Looking at your sig, I have no idea how your rig would cost 8k:
> 2 690's
> a 1k CPU
> ... erm
> 
> Unless it's gold plated, I'm trying to figure out where the extra 3k would come from.


Simple dude







Here in Norway things are a little more expensive than in the US or other countries. The GTX 690, for example, costs $999 in the US; here it costs over $1,250...


----------



## Tech09

Quote:


> Originally Posted by *Rei86*
> 
> Does it matter? I'm sure the owner has probably rotated out a lot of items and is keeping tabs on those as well. Not only that, his system is water-cooled, and water cooling adds up super fast.


Quote:


> Originally Posted by *ceteris*
> 
> My scores went up slightly with the new drivers
> 
> 3930K @ 4.8GHz, GTX 690 @ +200/500
> 
> P18760 on 310.70: http://www.3dmark.com/3dm11/5442770
> 
> P18860 on 310.90: http://www.3dmark.com/3dm11/5653540
> He's from Europe (Norway) so I assume he's paying a bit higher for his parts.


That's right... PC parts are more expensive here.


----------



## CryptiK

Same here: EVGA GTX 690 = $1399.

And the AUD is worth over 1 USD, lol; we get ripped off so badly.


----------



## ceteris

Quote:


> Originally Posted by *CryptiK*
> 
> Same here: EVGA GTX 690 = $1399.
> 
> And the AUD is worth over 1 USD, lol; we get ripped off so badly.


NP! You Aussies are loaded with money! There are quite a few Australian companies here in LA, Westfield being the most recognizable; they own most of the real estate in the county I live in, lol.


----------



## PinzaC55

If anybody is interested, there are two GTX 690's up for auction on eBay UK, though one is from a seller with zero feedback.








One ends in 7 hours, the other in 5 days.


----------



## VulgarDisplay88

I bought a Xonar DGX from the seller with zero feedback (prazkan8516). It turned up today, but I haven't tested it, so I haven't left feedback yet. A good eBayer though: sent it quickly and communication was good.

EDIT:

Oh, and I opened a case with eBay about that GTX 690 that I bought from that Danish guy; they're giving him until Thursday to respond.

Really annoyed that I won't be getting a GTX 690


----------



## Alex132

I seem to only be able to get +120Mhz on my GPU








Odd how I can get like +135Mhz 'stable' in 3DMark11 but in Far Cry 3 that crashes almost instantly, so I have just stuck with +120.

I am guessing it is part to do with the temps, as I get ~80-82'c load at that. (Mid 30'c ambient = eugh)
Was wondering if I should replace the stock TIM?


----------



## PinzaC55

Quote:


> Originally Posted by *VulgarDisplay88*
> 
> I bought a Xonar DGX from the seller with zero feedback (prazkan8516) It turned up today but I haven't tested it so I haven't left feedback yet. Good ebayer though, sent it quickly and communication was good.
> 
> EDIT:
> 
> Oh and I opened a case with Ebay about that GTX 690 that I bought from that Danish guy and they're giving him until Thursday to respond.
> 
> Really annoyed that I won't be getting a GTX 690


Oh, I couldn't remember who posted about the GTX 690 that didn't turn up, which is what I was thinking of when I noted the zero feedback. If he has posted the item, I guess he must be good to go. I always buy expensive items through an eBay shop if possible, because I got scammed twice trying to buy a Samsung Galaxy Note 2 last year, and if it hadn't been for Buyer Protection I would have been ruined.


----------



## thedaman

Quote:


> Originally Posted by *VulgarDisplay88*
> 
> I bought a Xonar DGX from the seller with zero feedback (prazkan8516) It turned up today but I haven't tested it so I haven't left feedback yet. Good ebayer though, sent it quickly and communication was good.
> 
> EDIT:
> 
> Oh and I opened a case with Ebay about that GTX 690 that I bought from that Danish guy and they're giving him until Thursday to respond.
> 
> Really annoyed that I won't be getting a GTX 690


Hey,
Thanks for the PM; I can only send 2 a day, so yeah...
So I guess he's a trusted seller, haha! Thanks for reassuring me.


----------



## VulgarDisplay88

Quote:


> Originally Posted by *thedaman*
> 
> Hey,
> Thank's for the PM, I can only send 2 a day so yeah...
> So I guess he's a trusted seller haha! Thanks for assuring me.


Yeah, prazkan8516 was a good seller, unlike the Danish guy who ripped me off.


----------



## AllGamer

I never buy expensive stuff on eBay because of that.

I don't mind buying cheap cell phone or car accessories.

The good thing about eBay & PayPal is that 90% of the time you'll get your money back, after much trouble, unfortunately.


----------



## PinzaC55

Quote:


> Originally Posted by *AllGamer*
> 
> i never buy expensive stuff over ebay because of that
> 
> i don't mind buying cheap cell phone or car accessories
> 
> the good thing about ebay & paypal is that you'll 90% of the time get your money back, after much trouble, unfortunately


I have always got my money back without a quibble. Generally there are a few golden rules to follow, one of the most important being not to assume that somebody with a few hundred feedback is kosher. You need to check that the feedback is recent, i.e. that they didn't stop selling about 8 months ago and suddenly start selling radically different things; that is usually a sign of a "hijacked" account. Plus, if something seems too cheap to be true, it probably isn't true.


----------



## Cheesemaster

With the latest NVIDIA drivers I overclocked with Precision X and got 140MHz core / 550MHz mem stable. It runs all games, and I am running quad SLI. Best overclocking drivers since the CD that came with the cards... Which drivers are you using?


----------



## JoshMck

Quote:


> Originally Posted by *Tech09*
> 
> Simple dude
> 
> 
> 
> 
> 
> 
> 
> Here in Norway things are a little more expensive than in the US or other countries. The GTX 690, for example, costs $999 in the US; here it costs over $1,250...


Yeah, in Australia prices are stupid: about $1400 for each of my 690's, so two of those plus water blocks is a bit ridiculous. They were so worth it though, seeing as I wanted a quad-GPU configuration in an mATX case, haha.


----------



## over-x16

Hello,

here is my EVGA gtx 690 with my current build on RV02


----------



## AllGamer

is that with, or without OC ?
Quote:


> Originally Posted by *Cheesemaster*
> 
> here is my new score with .90 drivers....


----------



## Cheesemaster

That is with the overclock: 140MHz core, 550MHz mem. I'm using SB-E @ 5.0GHz with Corsair 2400MHz memory at CAS 9, command rate 1.


----------



## DamnVicious

Hi guys! I want to join the club! Here is my GTX 690 watercooled with a Heatkiller Water block!

My first computer build and loop, which I put together with my girlfriend.


----------



## PinzaC55

DamnVicious, killer looking rig there


----------



## Chili195

Hi folks. Just wondering if someone could do me a favour. Since ASUS has no plans to update its GTX 690 BIOS for UEFI/GOP support, I was wondering if it would be possible to flash my card with the BIOS from the reference EVGA GTX 690, which EVGA is handing out upon request.

If anyone has one of these BIOSes on their card, could you please dump it and upload it? I would be eternally grateful! Thanks.


----------



## VulgarDisplay88

Looks like I won't be joining the club.

Received an email from Ebay saying "We didn't receive tracking information from the seller. We're sorry you have a problem with your purchase, and we're issuing you with a refund in this case"

I'm glad that I have been refunded but I'm gutted that I haven't got a GTX 690.

I'll most likely buy another Asus GTX 670 for SLI or save up for a GTX 690.


----------



## PinzaC55

Sorry to hear that, but at least you got your money back. There's still another one on eBay, currently at £560, from a guy with 49 feedback, all positive.


----------



## Rei86

I was going to ask how much you paid for the 690 on eBay, but you're located in the UK.

Going to put my EVGA GTX 690 Signature up for sale soon, but I can't on this website; I have it up on the EVGA forums and it's going to go up on US eBay.


----------



## zer0sum

Quote:


> Originally Posted by *Rei86*
> 
> I was going to ask how much you paid for the 690 on eBay, but you're located in the UK.
> 
> Going to put my EVGA GTX 690 Signature up for sale soon, but I can't on this website; I have it up on the EVGA forums and it's going to go up on US eBay.


I paid $700 including shipping and insurance, but I definitely got lucky, as the listing was misspelled and I was the only bidder.

They typically go for more like $800-900+, but you can check the most recent completed listings really easily.


----------



## Rei86

Quote:


> Originally Posted by *zer0sum*
> 
> I paid $700 including shipping and insurance, but I definitely got lucky, as the listing was misspelled and I was the only bidder.
> 
> They typically go for more like $800-900+, but you can check the most recent completed listings really easily.


Yeah, I'm looking at $850 shipped for mine, with the EVGA backplate + AC Turbo 690.


----------



## shremi

Is there a place that has the backplate in stock?


----------



## TeamBlue

I will be editing this post with pics... I will be joining this club in a couple of hours! Needless to say I'm pretty excited.


----------



## qiplayer

Quote:


> Originally Posted by *VulgarDisplay88*
> 
> Looks like I won't be joining the club.
> 
> Received an email from Ebay saying "We didn't receive tracking information from the seller. We're sorry you have a problem with your purchase, and we're issuing you with a refund in this case"
> 
> I'm glad that I have been refunded but I'm gutted that I haven't got a GTX 690.
> 
> I'll most likely buy another Asus GTX 670 for SLI or save up for a GTX 690.


Wait a few months for the GTX 780; it will be like two 670s but with 6GB of VRAM.


----------



## jhager8783

I recently purchased this rig as a barebones build. After getting everything ordered and installed, I replaced my GTX 670 FTW SLI setup with a GTX 690 and kept one of the 670 FTWs as a dedicated PhysX card. I'd like to join the ranks of enthusiasts by joining this GTX 690 owners club.


----------



## PinzaC55

Quote:


> Originally Posted by *shremi*
> 
> Is there a place that has the backplate in stock ????


Heatkiller http://www.frozencpu.com/products/17554/ex-blc-1364/HEATKILLER_Geforce_GTX_690_Reference_Design_GPU_Backplate_16005.html

EK http://www.overclockers.co.uk/showproduct.php?prodid=WC-221-EK


----------



## shremi

Quote:


> Originally Posted by *PinzaC55*
> 
> Heatkiller http://www.frozencpu.com/products/17554/ex-blc-1364/HEATKILLER_Geforce_GTX_690_Reference_Design_GPU_Backplate_16005.html
> 
> EK http://www.overclockers.co.uk/showproduct.php?prodid=WC-221-EK


I meant the EVGA one, so I could use it with the Koolance block.

I am trying to overclock this card, but it seems impossible; even at +100 I get crashes... is there something I am missing?


----------



## jcde7ago

Member list has been updated with our newest additions below!

- over-x16
- DamnVicious
- jhager8783
- Sweeper101

Welcome to the club guys!


----------



## ozrek

A.) 

B.) EVGA

i7 3930K
DX79SR
AX1200i

90mm SLI bridge en route


----------



## jcde7ago

Quote:


> Originally Posted by *ozrek*
> 
> A.)
> 
> B.) EVGA
> 
> i7 3930K
> DX79SR
> AX1200i
> 
> 90 mm SLI bridge enroute


Awesome, added!

Also, *RandomHer0* was added!


----------



## DamnVicious

Quote:


> Originally Posted by *jcde7ago*
> 
> Member list has been updated with our newest additions below!
> 
> - over-x16
> - DamnVicious
> - jhager8783
> - Sweeper101
> 
> Welcome to the club guys!


Thanks for adding me! Just wanted to note that my GTX 690 is from Inno3D; I wasn't able to mention it in my post. Thanks!


----------



## Alex132

I am getting kind of annoyed at how loud this thing can get.

Don't get me wrong, it's probably one of the quietest stock cooling designs I have heard, let alone the quietest dual-GPU stock cooler. It's just that the rest of my rig is dead silent; I have set up the fans to not even turn on unless things get too hot, and for the most part the fans on my RX360 rad don't spin unless I render/run IBT/play a game for a while.

I have heard reviewers state that 75% fan speed is totally 'quiet' and livable, but yeah, not for me.









Basically it boils down to me wanting to water cool this card. Just a few questions: will the X20 750lph XSPC pump be enough for the CPU + GPU, will my RX360 rad be enough, and what kind of benefits (apart from noise) will I see? Extra overclocking headroom?


----------



## PhantomTaco

Quote:


> Originally Posted by *Alex132*
> 
> I am kinda getting annoyed how loud this thing can get.
> 
> Don't take me wrong, probably one of the quietest stock cooling designs I have heard, let alone the quietest dual-card stock cooling. It's just that most of my rig is dead silent. And I mean I have set up the fans to not even turn on unless it gets too hot, for the most part my fans on my RX360 rad don't spin unless I render/run IBT/play a game for awhile.
> 
> I have heard reviewers state before that 75% fan speed is totally 'quiet' and able to live with, but yeah not for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Basically it boils down to me wanting to watercool this card. Just a few questions, will the X20 750lph XSPC pump be enough for the CPU + GPU, will my RX360 rad be enough, and what kind of benefits (apart from noise) will I see? Extra overclocking?


I'm not quite sure that an RX360 (slim-profile) rad will be enough for your CPU and the 690 together. The 690 is around 300W; add another 100W for the 2500k, then a good bit more for the OC you have, and you're looking at well over 400W. You might want to add another 120mm rad or something to keep your temps reasonable. Also, in the case of the 690, from my experience water cooling may improve your OC a bit, but you're more limited by the voltage regulation than by thermals.
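The back-of-envelope numbers above can be written out explicitly. The per-120mm figure is a common enthusiast rule of thumb for moderate fan speeds, not a measured spec, and a thick rad like the RX360 does somewhat better than this:

```python
# Rough heat-load check for a GTX 690 + overclocked CPU on one RX360.
# The per-120mm capacity is enthusiast folklore, not a measured spec.
GPU_W = 300                    # approximate GTX 690 board power
CPU_W = 150                    # overclocked quad-core, generous guess
CAPACITY_PER_120MM_W = 100     # rule-of-thumb rad capacity per fan

heat_load = GPU_W + CPU_W                   # total heat into the loop
rx360_capacity = 3 * CAPACITY_PER_120MM_W   # triple-fan rad

print(heat_load, rx360_capacity, heat_load > rx360_capacity)  # -> 450 300 True
```

Even granting the RX360 extra credit for its thickness, the margin is thin, which is why an additional rad keeps coming up in this thread.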


----------



## Methos07

Going to mention to the 690 enthusiasts here that I am now selling my card and can be removed from the club. Decided to part out my rig and follow other pursuits, thanks for this great thread with loads of information.

If anyone happens to be interested in having another 690, check my for sale thread. Take care.


----------



## Alex132

Quote:


> Originally Posted by *PhantomTaco*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> I am kinda getting annoyed how loud this thing can get.
> 
> Don't take me wrong, probably one of the quietest stock cooling designs I have heard, let alone the quietest dual-card stock cooling. It's just that most of my rig is dead silent. And I mean I have set up the fans to not even turn on unless it gets too hot, for the most part my fans on my RX360 rad don't spin unless I render/run IBT/play a game for awhile.
> 
> I have heard reviewers state before that 75% fan speed is totally 'quiet' and able to live with, but yeah not for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Basically it boils down to me wanting to watercool this card. Just a few questions, will the X20 750lph XSPC pump be enough for the CPU + GPU, will my RX360 rad be enough, and what kind of benefits (apart from noise) will I see? Extra overclocking?
> 
> 
> 
> I'm not quite sure that an RX360 (slim profile) rad will be enough for your CPU and the 690 together. The 690 is around 300W, and add another 100W for the 2500k and then a good bit more for the OC you have and you've looking at well over 400W. You might want to add another 120mm rad or something to keep your temps reasonable. Also, in the case of the 690 from my experience, watercooling may improve your OC a bit, but you're more limited by the voltage regulation than you are by thermals.

The RX360 is the thick radiator:










http://www.xs-pc.com/products/radiators/rx-series/rx360-triple-fan-radiator/


----------



## ozrek

Quote:


> Originally Posted by *jcde7ago*
> 
> Awesome, added!
> 
> Also, *RandomHer0* was added!


Sweet, thanks - excited to be here!


----------



## Cheesemaster

My new heaven score


----------



## PinzaC55

God almighty, I haven't looked at your specs, but that must be 2x GTX 690's?


----------



## Cheesemaster

Quote:


> Originally Posted by *PinzaC55*
> 
> God almighty I haven't looked at your specs but that must be 2 X GTX 690's?


Yes sir, it is... Here is my new P score with the new betas. Note the 39000 graphics score; these drivers COME WITH EXTRA CHEESE SAUCE!!


----------



## Arizonian

Well, I'm not too excited to download the new beta drivers yet, but check out this performance upgrade for the GTX 690 in Crysis 3, which isn't even out yet.
Quote:


> New in GeForce R313 Drivers
> 
> Performance Boost - Increases performance for GeForce 400/500/600 series GPUs in several PC games vs. GeForce 310.90 WHQL-certified drivers. Results will vary depending on your GPU and system configuration:
> 
> *GeForce GTX 690: Up to 65% in Crysis 3*


Curious what the performance was prior. I'm happy to see the dev team giving proper attention to the upcoming Crysis 3; that's a good sign for a game I'm highly anticipating.


----------



## JoshMck

Quote:


> Originally Posted by *Alex132*
> 
> The RX360 is the thick radiator:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.xs-pc.com/products/radiators/rx-series/rx360-triple-fan-radiator/


I would say it still isn't enough.

To properly cool an overclocked processor and GPU under heavy loads, you will need 2x 360 rads to be safe.

I run quad SLI with a water-cooled, overclocked 3930k and have 1x480 plus 1x360, and I still think that is a bit under what I should have, because I would like to see much colder temps on my CPU at the overclock I run daily (4.9GHz). Graphics temps are fine, but without the extra rad the processor gets hot.


----------



## JoshMck

Quote:


> Originally Posted by *Cheesemaster*
> 
> Yes sir it is... Here is my new P score with new betas.. Note the 39000 graphics score, these drivers; COME WITH EXTRA CHEESE SAUCE!!


My 3D mark won't do a full pass for some reason anymore







and when it does, my CPU score is destroyed, down to about 7 fps for some reason. I am blaming my dead air conditioner and the 30+ degrees in my office, haha.

Great score btw!


----------



## MrTOOSHORT

Quote:


> Originally Posted by *JoshMck*
> 
> My 3D mark won't do a full pass for some reason anymore
> 
> 
> 
> 
> 
> 
> 
> and when it does my cpu score is destroyed down to about 7 fps for some reason. I am blaming my dead air conditioner and 30+ degrees in my office haha.
> 
> Great score btw!


Where does it freeze in 3DMark 11? If it's at the last combined CPU/graphics test, drop your RAM down to 2133MHz (if it isn't already) to get 3DMark 11 to complete.


----------



## drek

Is a 500w power supply enough for one gtx 690?


----------



## pilla99

Quote:


> Originally Posted by *drek*
> 
> Is a 500w power supply enough for one gtx 690?


Yes: http://www.overclock.net/t/1290091/gtx-690-true-power-measurement


----------



## drek

Quote:


> Originally Posted by *pilla99*
> 
> Yes: http://www.overclock.net/t/1290091/gtx-690-true-power-measurement


Really, eh? That's good news... just bought a ROG Tytan CG8480 and I'm thinking of pulling the GTX 660 it came with and replacing it with a GTX 690. Although it only has a standard Asus 500W PSU.

Thanks for the info!


----------



## pilla99

Quote:


> Originally Posted by *drek*
> 
> Really eh, thats good news...just bought a rog tytan cg8480 computer and thinking im pulling the gtx 660 it came with and replacing it with a gtx 690. Althought it only has a standard asus 500w psu.
> 
> Thanks for the info!


I would only recommend that if the PSU is 80 Plus certified.


----------



## drek

Also, the current power plugs running my 660 will work on the 690, correct?


----------



## drek

Quote:


> Originally Posted by *pilla99*
> 
> I would only recommend that if the PSU is 80 Plus certified.


How do i check that?


----------



## RandomHer0

Quote:


> Originally Posted by *drek*
> 
> How do i check that?


It will be on the packaging/product overview of any PSU that has it, since it's a big selling point for a PSU.


----------



## drek

Quote:


> Originally Posted by *drek*
> 
> Also, the current power plugs running my 660 will work on the 690, correct?


http://www.bestbuy.ca/en-CA/product/asus-asus-republic-of-gamers-tytan-gaming-pc-intel-core-i7-3770k-128gb-ssd-3tb-hdd-16gb-ram-windows-8-cg8480-ca001s/10227758.aspx?path=eeab732bac83cae30a8f0120e93b185fen02

will be going in that computer, how do i tell what size power supply will fit in that computer if i do need to change it?


----------



## drek

Quote:


> Originally Posted by *RandomHer0*
> 
> Will be on packaging/product overview of any PSU that has it as it is a big selling point for a PSU


It's not 80 Plus then, so do I need to change my PSU?


----------



## RandomHer0

Chancing your PSU could very well mean chancing all your components as well. A good PSU should be a priority for any high end build. I'd recommend upgrading, not too expensive to get a decent one.


----------



## pilla99

Quote:


> Originally Posted by *drek*
> 
> It's not 80 Plus then, so do I need to change my PSU?


It's simple math here.

From the thread I showed you, the computer will likely draw around 400W max when gaming. That was with my system, but it should be a safe ballpark either way.

So: figure on roughly 400W to be safe, and don't overclock anything in your system (including the card when you get it) to help stay in that range.

Next you need to look at quality and efficiency. To be clear, the 80 Plus rating describes how much power a PSU wastes at the wall; the 500W label already refers to DC output. The real issue is that a generic, uncertified 500W unit may not reliably sustain its label under load, and a ~400W draw leaves almost no headroom on it anyway.

So never mind, I don't think this is actually going to work for you.
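To put rough numbers on the sizing question, here's a quick sketch (illustrative figures, not measurements; note that the 80 Plus rating describes wall-side efficiency, while a PSU's wattage label already refers to deliverable DC output):

```python
# Rough PSU sizing sketch with assumed numbers.

def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """Watts pulled from the outlet to deliver dc_load_w of DC power."""
    return dc_load_w / efficiency

system_load = 400.0   # ~max gaming draw reported in the measurement thread
psu_rating = 500.0    # the label refers to DC output capacity

print(f"Wall draw at 80% efficiency: {wall_draw(system_load, 0.80):.0f} W")
print(f"Headroom on the PSU: {psu_rating - system_load:.0f} W")
```

With only ~100 W of headroom, any overclock or a PSU that can't sustain its label leaves no safety margin, which is the concern here.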


----------



## drek

Quote:


> Originally Posted by *pilla99*
> 
> It's simple math here.
> 
> From the thread I showed you, the computer will likely draw around 400W max when gaming. That was with my system, but it should be a safe ballpark either way.
> 
> So: figure on roughly 400W to be safe, and don't overclock anything in your system (including the card when you get it) to help stay in that range.
> 
> Next you need to look at quality and efficiency. To be clear, the 80 Plus rating describes how much power a PSU wastes at the wall; the 500W label already refers to DC output. The real issue is that a generic, uncertified 500W unit may not reliably sustain its label under load, and a ~400W draw leaves almost no headroom on it anyway.
> 
> So never mind, I don't think this is actually going to work for you.


Can you point me to a power supply that would fit in my PC? It's an Asus CG8480, and I'm not sure what power supply I would have to buy to replace the stock one.


----------



## drek

Quote:


> Originally Posted by *RandomHer0*
> 
> Chancing your PSU could very well mean chancing all your components as well. A good PSU should be a priority for any high end build. I'd recommend upgrading, not too expensive to get a decent one.


How do i know if i have to change something else?


----------



## drek

http://www.newegg.ca/Product/Product.aspx?Item=N82E16817159082

Would that psu fit in my computer?


----------



## pilla99

Quote:


> Originally Posted by *drek*
> 
> Can you point me to a power supply that would fit in my pc? is a Asus CG8480 and im not sure what power supply I would have to buy to replace it.


This is all you should need:
http://www.newegg.com/Product/Product.aspx?Item=N82E16817152041

This is the bare minimum, though.


----------



## Cheesemaster

I got an EVGA SuperNOVA 1500W; check out one of their PSUs. They have a great warranty program and support system.


----------



## drek

Is my standard Asus power supply most likely the same form factor as a standard ATX unit?


----------



## Rei86

Quote:


> Originally Posted by *RandomHer0*
> 
> Chancing your PSU could very well mean chancing all your components as well. A good PSU should be a priority for any high end build. I'd recommend upgrading, not too expensive to get a decent one.


This.
Quote:


> Originally Posted by *drek*
> 
> How do i check that?


On the side of the PSU, look for the spec label. All PSUs come with a sticker on the side listing the total wattage and the amperage on each rail.

Remember also that NVIDIA gives us a nebulous number, because when they quote a wattage for their test bench we don't know how many HDDs they have, how many fans they are running, what mobo, what CPU, what overclock they might have on it, etc.

If you go here
http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-690/specifications
NVIDIA states the GPU is 300W and recommends a minimum 650W PSU. However, from reviews and even pilla99's thread, you don't need a 650W PSU to run the GTX 690; his total system didn't pull more than 390W at full load. But since NVIDIA is a large corporation and needs to cover its ass when consumers **** up their cards, they recommend a 650W PSU.

So what do you do? Find a PSU that can supply at least 38A on the 12V rail if you want to be safe and not end up with a burnt-out GPU, CPU, mobo, etc.


----------



## RandomHer0

Quote:


> Originally Posted by *drek*
> 
> How do i know if i have to change something else?


Doesn't mean you need to change anything else; just that an underpowered/low-quality PSU can be harmful to a system, so you should make sure you get a good-quality one. Anything from Corsair/Silverstone etc. will be fine (if it is of sufficient wattage).


----------



## RandomHer0

Quote:


> Originally Posted by *drek*
> 
> Can you point me to a power supply that would fit in my pc? is a Asus CG8480 and im not sure what power supply I would have to buy to replace it.


Is this your PC from behind? The side panels look riveted down. I've never had experience with a pre-built "black box" PC, but I'm not sure how easy it will be to access the internals unless you are comfortable doing more than simply unscrewing things. Even if you do manage to get inside, the cable management of a pre-built might be an absolute ***** when removing the old power cables and routing new ones; I honestly have no idea what it will be like on the inside. Something to consider, though: replacing the PSU in your PC will not be anywhere near as straightforward as replacing the PSU of a custom-built PC. Sorry.


----------



## drek

Here is my current PSU. I took a few panels off already to prepare for a new one. Doesn't look too bad, actually.

http://imgur.com/lDkazzo,NX68hcP,OqQteVg


----------



## drek

How much variance can I have from the old voltages to the new, or must they match?


----------



## drek

Going to buy this power supply.

http://www.newegg.ca/Product/Product.aspx?Item=N82E16817139005

Perfect voltage match to my old one:

http://imgur.com/lDkazzo,NX68hcP,OqQteVg

Thanks for all your help. GTX 690 inbound!


----------



## shremi

OK, so I just installed my card and fired up Far Cry 3, and after 20-30 min tops the temps are 94-95 on GPU 1 and 90 on GPU 2.

Is this normal? It's very chilly where I live.


----------



## Rei86

Quote:


> Originally Posted by *shremi*
> 
> Ok so i just installed my card fired up farcry 3 and after 20 - 30 min tops the temps are 94-95 on the gpu 1 and 90 on the gpu 2
> 
> Is this normal ???? Its very chilly where i live


Replace the TIM.

Seriously, the last few new owners have been reporting high temps, and in a lot of the photos people post I've noticed crusty, dried-up TIM.


----------



## pilla99

Quote:


> Originally Posted by *shremi*
> 
> Ok so i just installed my card fired up farcry 3 and after 20 - 30 min tops the temps are 94-95 on the gpu 1 and 90 on the gpu 2
> 
> Is this normal ???? Its very chilly where i live


I'm guessing you aren't OC'd.
Either you have really bad cooling or case setup, or something isn't right if you are on stock clocks.


----------



## shremi

I tried re-TIMing the card, opened up my side window, set the fan to the highest possible speed, and ran Heaven for three rounds, and the temps are still high 80s/low 90s (the rest of my system is watercooled). This card was also supposed to go under water, but I was trying to figure out whether the card worked properly first.

What do you guys think, RMA? This card is brand new; I got it last week.


----------



## PinzaC55

Quote:


> Originally Posted by *pilla99*
> 
> I'm guessing you aren't OC'd.
> Either you have really bad cooling // case setup or something isn't right if you are on stock clocks.


Definitely the card. When I run Far Cry 3 on Extreme settings mine never goes above 60 degrees in a HAF-X. I have recently replaced the side panel + fan with a HAF-X window panel (no fan) and there has been virtually no difference in temperature.


----------



## JoshMck

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Where does it freeze at in 3dmark11? If it's at the last cpu/graphics test, then drop your ram down to 2133Mhz if it isn't already to get 3dmarkk11 to go.


It is on the last test. So I will try that.


----------



## Alex132

Argh, the 690 screws strip so easily.

Any tips on removing a stripped screw?


----------



## DamnVicious

Quote:


> Originally Posted by *Alex132*
> 
> Argh, the 690 screws are so easy to thread.
> 
> Any tips on removing a threaded screw?


Dremel a small straight slot into the screw head (just be careful not to hit the PCB) and remove it with a standard flat screwdriver. That is how I removed the stripped Torx screws from my GTX 690.


----------



## DamnVicious

similar to this:


----------



## Alex132

Quote:


> Originally Posted by *DamnVicious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Argh, the 690 screws are so easy to thread.
> 
> Any tips on removing a threaded screw?
> 
> 
> 
> Dremel a small straight line on the screw itself (just be careful not to hit the pcb) and remove it with a standard flat screw driver. That is how I removed my threaded torx screws off my gtx 690.
Click to expand...

Ugh, yeah. Gonna have to find a place to rent one, or a friend who has one of those.


----------



## Rei86

Quote:


> Originally Posted by *shremi*
> 
> I tried to re tim the card opened up my side window set the fan to the highest possible speed ran heaven for a 3 rounds and the temps are still high 80s low 90s .... (the rest of my system is watercooled) this card was also supposed to go underwater but i was trying rto figure out if the card worked properly or not....
> 
> What do you guys think RMA ???? this card is brand new i got it last week


Time to RMA my friend.


----------



## shremi

Quote:


> Originally Posted by *Rei86*
> 
> Time to RMA my friend.


I did already, thanks for the suggestions though.

And the saddest part is that since I am not from the US and am only going to be in Vegas for the weekend, I'll have to cross-ship the card, paying $1000 upfront.


----------



## Rei86

Quote:


> Originally Posted by *shremi*
> 
> I did already thanks for the suggestions tho
> 
> And the saddest part is that since i am not from the us and only going to be in vegas for the weekend ill have to cross ship the card and paying $1000 upfront


EVGA? At least they allow such an option vs others, and they'll refund you once they get their hands on your card.


----------



## pilla99

Just played the Crysis 3 demo with the new 313.96 drivers. On high settings at 2560x1440 I can hold 60fps almost always.
Game is absolutely terrible, another cod clone appealing to casual gamers but it looks pretty good I guess.


----------



## qiplayer

It is normal that the temps are high. If I saw it correctly, your case has the PSU on the top, and it pushes hot air in instead of out.
Make sure you have fans taking air in at the bottom and out at the top. If not, the card warms up the whole case.
I wouldn't recommend such a GPU for that PC. You saved on the PC and got the most expensive card?
Maybe I'm wrong.

Cards I suggest are, for example, 2x GTX 670, but it all depends on resolution. If you have a 1080p screen, one GTX 680 should be enough.
If you can manage enough airflow, the Gigabyte Windforce is the best overall; if you have bad air circulation, get a reference blower that exhausts out of the case.
Just my opinion.


----------



## drek

Quote:


> Originally Posted by *pilla99*
> 
> Just played the Crysis 3 demo with the new 313.96 drivers. On high settings at 2560x1440 I can hold 60fps almost always.
> Game is absolutely terrible, another cod clone appealing to casual gamers but it looks pretty good I guess.


Agreed, game looks somewhat cool, graphics are nice. Game itself sucks.


----------



## TheMadHerbalist

A) 
B) EVGA

pic of my old config, before I rebuilt my system.

Here they are now:


----------



## OmniScience

Quote:


> Originally Posted by *TheMadHerbalist*
> 
> A)
> B) EVGA
> 
> pick of the my old config before i rebuild my system.
> 
> here they are now


Good stuff. I'll be running Razor blocks on my two 690s soon enough. Can't wait to finally drop everything in water!


----------



## AllGamer

i like the blue and white theme









Now it's making me re-evaluate my rig, and I may spend some $ to swap out all the blue fans for orange/yellow fans to match the rest of the case theme.









Quote:


> Originally Posted by *TheMadHerbalist*
> 
> A)
> B) EVGA
> 
> pick of the my old config before i rebuild my system.
> 
> here they are now


----------



## AllGamer

Quote:


> Originally Posted by *jcde7ago*
> 
> I just added you...and great setup! But dat cable management...lol.


Finally got some spare time to manage the cables.









updated pics










Not much I can do with the side panel fan wires; there's no place to hide them.


----------



## thunderdom77

Totally agree with you man. I think all the haters are gonna hate against people with bad ass rigs. Until you actually have built one "right" and play on one they should ****!


----------



## thunderdom77

Why do you keep referring to the GTX 690 as only having 2GB of VRAM when it has 4GB? Is it 2GB of VRAM per GPU?


----------



## Qu1ckset

Quote:


> Originally Posted by *thunderdom77*
> 
> why do you keep referring to the gtx690 as only having 2gb vram when it has 4gb vram. 2gb vram per GPU???


Yeah, it only has 2GB of usable VRAM, because the 4GB is split between the two GPUs.


----------



## Rei86

Quote:


> Originally Posted by *thunderdom77*
> 
> why do you keep referring to the gtx690 as only having 2gb vram when it has 4gb vram. 2gb vram per GPU???


Because SLI memory isn't additive.

When you have GPU 1 with 3GB and GPU 2 with 3GB, you do not get 6GB total. Each GPU mirrors the same data in parallel, so you effectively have 3GB.
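A toy illustration of why the pools don't add up (a hypothetical helper for this thread, not an NVIDIA API):

```python
# In SLI/CrossFire each GPU mirrors the same frame data, so the usable
# pool is the smallest card's VRAM, not the sum across cards.

def usable_vram_gb(per_gpu_vram_gb):
    return min(per_gpu_vram_gb)

print(usable_vram_gb([2, 2]))  # GTX 690: 2, not 4
print(usable_vram_gb([4, 2]))  # mismatched cards fall to the smaller pool: 2
```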


----------



## pilla99

Quote:


> Originally Posted by *Rei86*
> 
> Because SLI isn't additive.
> 
> When you have GPU 1 with 3GB and a GPU2 with 3GB you do not get 6GB total. It stays parallel and you have only 3GB.


Even worse is if you have, for example, one card at 4GB and one at 2GB: the 4GB card drops down to 2GB so that the pools match. I made that mistake with my old computer when I had a 2GB 6870 and then CrossFired it with a 1GB version.


----------



## MrHamm

Hey guys,

Does the GTX 690 work properly in windowed mode?

I used to have CrossFired 5850s and it was a nightmare; they could only be used in fullscreen mode.

I usually play windowed so I can mouse over to my left and right flanking monitors.


----------



## Sugi

Could someone please point me in the right direction for a working mini-DisplayPort-to-HDMI adapter? I've bought some already and none of them work. They're both white, and I strongly dislike the white color, but I couldn't find a black one, or one that works at all. I already have 3 DVI connections plugged into my 690 right now. Please help me out.

Mini DisplayPort to HDMI Adapter Cable, 6 feet
http://www.amazon.com/gp/product/B003OC6LWM/ref=oh_details_o04_s00_i00

eForCity Mini Display Port to DVI Male / Female Adapter
http://www.amazon.com/gp/product/B002DHSIWK/ref=oh_details_o04_s00_i01
[These do not work.]

MrHamm,
I haven't had any issues with windowed mode or fullscreen-windowed mode on my 690. I sometimes force fullscreen-windowed with GameCompanion. It's a nice application for correcting gamma, in-game hotkey access, and of course fake-fullscreen windowed mode.
http://oblivion.nexusmods.com/mods/39550
[You can use it with any game though, not just oblivion.]


----------



## MrHamm

Thanks Sugi,

You guys really think it's worth the $1000.00 price tag for 2 gigs of VRAM?


----------



## Arizonian

Quote:


> Originally Posted by *MrHamm*
> 
> Thanks Sugi,
> 
> You guys really think its worth the $1000.00 price tag for 2gigs of ram?


Depends on your set up. 2GB VRAM for my single 120 Hz monitor is enough. No different than two 680 2GB in SLI for $1000, until pricing comes down on the 680's anyway.


----------



## Sugi

Quote:


> Originally Posted by *MrHamm*
> 
> Thanks Sugi,
> 
> You guys really think its worth the $1000.00 price tag for 2gigs of ram?


It depends on the games you want to play. Have I run into an issue with VRAM alone? No, not at all. Some games simply play better on one vendor's hardware than the other's.
http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-7.html


----------



## Viewer3

I know this completely off-topic here, but does anyone else here use the HDMI audio out on the 690? I've been posting around different forums (including this one) looking for a solution as to why my 690's HDMI audio pops and clicks while the GPU is under any sort of load. Before I send it back I think seeing even one person say that they use their HDMI out with no issue would make me feel better about RMA'ing it, lol.


----------



## Qu1ckset

Quote:


> Originally Posted by *MrHamm*
> 
> Thanks Sugi,
> 
> You guys really think its worth the $1000.00 price tag for 2gigs of ram?


For my single 1440p display, it works perfectly!


----------



## qiplayer

Quote:


> Originally Posted by *MrHamm*
> 
> Hey guys,
> 
> Does the GTX690 work in full windows mode?
> 
> I used to have xfire 5850's and it was a nightmare. Could only be used in Full Screen mode.
> 
> I usually play in windows mode so I can "mouse-over" to my left and right flanking monitors.


I think you just need to disable an option on the executable you want to run, for example crysis.exe:
go to Properties and disable the option that scales the display for high DPI values.
It's a silly Windows feature; I did two reinstalls before finding out this option was messing up the view.


----------



## MrHamm

It's been almost a year since the GTX 600 series has been out. Would it be wise to wait for the next series?

I just feel that 2 gigs of VRAM for a $1000.00 price tag is kinda low...


----------



## Rei86

Quote:


> Originally Posted by *MrHamm*
> 
> It's been almost a year since the GTX 600 series has been out. Would it be wise to wait for the next series?
> 
> I just feel that playing a $1000.00 for 2gigs or Vram is kinda low.....


Whats your screen resolution?


----------



## Sugi

Quote:


> Originally Posted by *Rei86*
> 
> Whats your screen resolution?


I would guess 3x1440p from his signature.


----------



## Rei86

Quote:


> Originally Posted by *Sugi*
> 
> I would guess 3x1440p from his signature.


Ugh, I keep forgetting about this forum's rig builder.

And he's got two 1920x1080 screens and one 2560x1440 screen listed.

That's an odd mix if s/he really runs them in a surround setup. Either way, if it's just the 2560x1440, the GTX 690 does fine.


----------



## Sugi

Haha! Oops, I thought they were all the same resolution. I guess I should take my own advice and actually read the rig builder. XDD


----------



## Shaitan

Has anyone else had a bad experience with their 690? I am now about to receive my 4th device from EVGA. I think I have spent more time waiting on RMA's and using my 680 than I have actually using my 690 since I bought it in October. First one had gpu clocks that would constantly fluctuate all over the place, second one would not do surround, third one had a bad fan upon receiving it yesterday. I am really trying to decide if I wanna keep the 4th one when it comes in, assuming that it works properly at all.


----------



## Alex132

Quote:


> Originally Posted by *Shaitan*
> 
> Has anyone else had a bad experience with their 690? I am now about to receive my 4th device from EVGA. I think I have spent more time waiting on RMA's and using my 680 than I have actually using my 690 since I bought it in October. First one had gpu clocks that would constantly fluctuate all over the place, second one would not do surround, third one had a bad fan upon receiving it yesterday. I am really trying to decide if I wanna keep the 4th one when it comes in, assuming that it works properly at all.


1) The GPU clocks are meant to change; that's GPU Boost.







(Also, at 70°C it downclocks by 13MHz, at 80°C by 26MHz, and at 90°C by 39MHz, undervolting as it goes, I believe.) That is normal Kepler behavior.

2) That sounds like drivers more than anything, providing you were using the correct ports etc.

3) Well, that's rare lol.

Sounds like terrible luck for you!








Luckily, I've had good luck with my 690; I keep thinking of this thing as a single card. Realistically, for a dual-GPU card that can overclock this well while staying this silent and cool, it just amazes me. It also looks damn fine.
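The temperature step-downs described in point 1 can be sketched roughly like this (a simplified assumption for illustration: real GPU Boost also weighs power target and voltage, not just temperature):

```python
# Approximate Kepler temperature bins as reported above (assumed behavior).

def boost_offset_mhz(temp_c: float) -> int:
    """Approximate clock reduction at a given core temperature."""
    if temp_c >= 90:
        return -39
    if temp_c >= 80:
        return -26
    if temp_c >= 70:
        return -13
    return 0

print(boost_offset_mhz(65))  # 0: below 70 C, no temperature-based step-down
print(boost_offset_mhz(75))  # -13
```

So clocks bouncing around under load is expected behavior, not a fault.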


----------



## AllGamer

Quote:


> Originally Posted by *Alex132*
> 
> 1) The GPU clocks are meant to change, that's GPU boost
> 
> 
> 
> 
> 
> 
> 
> (Also at 70'c it downclocks by 13mhz, 80'c by 26mhz and 90'c by 39mhz, while undervolting I believe). That is a normal Kepler thing.
> 
> 2) That sounds like drivers more than anything, providing you were using the correct ports etc.
> 
> 3) Well, that's rare lol.
> 
> Sounds like terrible luck for you!
> 
> 
> 
> 
> 
> 
> 
> 
> Luckily I have have good luck with my 690, I keep thinking of this thing as a single card. But realistically for a dual card that can overclock this well and keep this silent / cool just amazes me. Also looks damn fine.


Quote:


> Originally Posted by *Shaitan*
> 
> Has anyone else had a bad experience with their 690? I am now about to receive my 4th device from EVGA. I think I have spent more time waiting on RMA's and using my 680 than I have actually using my 690 since I bought it in October. First one had gpu clocks that would constantly fluctuate all over the place, second one would not do surround, third one had a bad fan upon receiving it yesterday. I am really trying to decide if I wanna keep the 4th one when it comes in, assuming that it works properly at all.


In addition to what Alex already mentioned,

I just wanted to chime in so you won't get discouraged by those minor setbacks.

I'm also using two EVGA GTX 690s.

Been using them since late last year; everything worked fine right out of the box. Purchased them retail while on sale at the Christmas promo prices.


----------



## Sugi

Allgamer, how cheap was it during the christmas sale?


----------



## Shaitan

Quote:


> Originally Posted by *Alex132*
> 
> 1) The GPU clocks are meant to change, that's GPU boost
> 
> 
> 
> 
> 
> 
> 
> (Also at 70'c it downclocks by 13mhz, 80'c by 26mhz and 90'c by 39mhz, while undervolting I believe). That is a normal Kepler thing.
> 
> 2) That sounds like drivers more than anything, providing you were using the correct ports etc.
> 
> 3) Well, that's rare lol.
> 
> Sounds like terrible luck for you!
> 
> 
> 
> 
> 
> 
> 
> 
> Luckily I have have good luck with my 690, I keep thinking of this thing as a single card. But realistically for a dual card that can overclock this well and keep this silent / cool just amazes me. Also looks damn fine.


1. Yeah I expected it to operate like my 680, but when the clocks at sub 70 degree temps are constantly fluctuating up to 100 Mhz, it doesn't seem right. Especially when it does it multiple times per second.

2. That's what I thought at first as well, but after trying multiple driver versions while using driver fusion to clean the old ones off I began to suspect otherwise. It was always monitor 3 no matter how I had my connections however, so that does point more toward drivers.

3. This last one really threw me for a loop. The fan would run at normal speed and I could not adjust it via precision x at all. It would report that the fan speed would raise, but it would not actually spin up any faster.

I love the looks of the card, although it will go under water once I get a working one and I do like the fact that I get the performance of two cards while only taking up two slots instead of four.


----------



## AllGamer

Quote:


> Originally Posted by *Sugi*
> 
> Allgamer, how cheap was it during the christmas sale?


High $800s to $900s.

Although some guys here said they got theirs in the high $700s.

Anything less than the $999 MSRP is good.


----------



## AllGamer

Just posted this over in the surround view thread, but since it concerns the GTX 690, it might be a good read for you guys as well:
Quote:


> finally had time to install the accessory monitor, just picked a nice BenQ LED 27" on sale
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but it was a PITA to get it recognized on dual GTX 690s. A quick Google search returned a bunch of useless info, and I jumped from site to site (Tom's Hardware, the EVGA forum, NVIDIA, here on OCN) to no avail, until I came across this http://itslick.com/blogs/index.php/ITslick/news/#item_69
> 
> 
> 
> 
> 
> 
> 
> 
> 
> so the info from the System Requirements page is correct; it's just the drivers not doing what they're supposed to.
> 
> Got it working with DVI as indicated on the GeForce site, without needing the mini-DisplayPort-to-DVI adapter suggested on Tom's Hardware and other sites.
> 
> The instructions from NVIDIA here http://www.geforce.com/optimize/guides/how-to-correctly-configure-geforce-gtx-680-surround#1 are actually wrong as well.


----------



## jcde7ago

*TheMadHerbalist* has been added!

Also, *Methos07* has departed the club! Best of luck on your next GPU venture!


----------



## Sugi

AllGamer, I was able to get 6 monitors working, 4 from the 690 and 2 from the motherboard on a Z68 chipset. However, I am currently using the 690 + a 430 for the 6-monitor display. I prefer this over the motherboard outputs because I can change gamma, color, etc. per monitor in the NVIDIA Control Panel.

Setting up the fourth monitor wasn't too bad, though I did buy a couple of cables, e.g. mini-DisplayPort-to-X adapters. MDP-to-HDMI and another cable that escapes my memory right now would not work correctly, but an MDP-to-DVI (female) did it for me. It did take me a while to troubleshoot, but the cost was minimal: about 10 dollars for all the cabling plus an HDMI cable.

As for drivers, lately it hasn't been bad at all. Before, though, setting up Surround Vision would make most people cry; at least it did for me.

jcde7ago, please consider my entry to this club, with this photo as proof.


----------



## Mezza1989

Hi Guys,

Only bought my GTX 690 yesterday, but I have a question. When running the MSI Kombustor stress test, as you can see from the screenshot below, I am only getting 4-5% load on GPU 1, and the power on GPU 2 (on the right) frequently goes above 100%.

Is this normal?

Thanks

Mezza1989


----------



## Alex132

Quote:


> Originally Posted by *Mezza1989*
> 
> Hi Guys,
> 
> Only bought my GTX 690 yesterday but i have a question. When running the MSI Combustor stress test, as you can see from the screenshot below i am only get 4%-5% on GPU1 and also the power on GPU2 on the right frequently goes above 100%.
> 
> Is this normal?
> 
> Thanks
> 
> Mezza1989


Are you in full-screen?

Windowed apps only run with 1 GPU.


----------



## Mezza1989

I feel stupid now; it was in windowed mode. What about the power reading: is it OK to be above 100%, when my other GPU rarely goes above 10-15%?

Thanks for your help.

Mezza1989


----------



## Sugi

Quote:


> Originally Posted by *Alex132*
> 
> Are you in full-screen?
> 
> Windowed apps only run with 1 GPU.


I have never heard of that before. Can you provide links?


----------



## Alex132

Quote:


> Originally Posted by *Mezza1989*
> 
> I feel stupid now, it was in windowed mode. What about the power, is it ok to be above 100% as my other GPU rarely goes above 10-15%
> 
> Thanks for your help.
> 
> Mezza1989


Because you're only stressing one GPU.

Each GPU can range from 1% to 135% of its TDP depending on how much work it is doing.
More work = more power to the GPU.
Quote:


> Originally Posted by *Sugi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Are you in full-screen?
> 
> Windowed apps only run with 1 GPU.
> 
> 
> 
> I have never heard of that before. Can you provide links?
Click to expand...

Links? Links to what? I thought it was common knowledge.


----------



## Sugi

Quote:


> Originally Posted by *Alex132*
> 
> Because you're only stressing 1 GPU?
> 
> Each GPU can go from 1% to 135% TDP depending on how much work it is doing.
> More work = more power to the GPU.
> Links? Links to what? I thought it was a commonly known fact


Nope, I have never heard of anything like that before. Do you have proof?


----------



## Alex132

Quote:


> Originally Posted by *Sugi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Because you're only stressing 1 GPU?
> 
> Each GPU can go from 1% to 135% TDP depending on how much work it is doing.
> More work = more power to the GPU.
> Links? Links to what? I thought it was a commonly known fact
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nope, I have never heard of anything like that before. Do you have proof?
Click to expand...

Well, CrossFire only works in fullscreen.
SLI used to work only in fullscreen too, but they've changed that.

Most programs can utilize SLI in windowed mode, though some can't (Kombustor, etc.), and even when a program does use SLI in windowed mode there is a very noticeable decrease in performance.

source


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Sugi*
> 
> Nope, I have never heard of anything like that before. Do you have proof?


It's true, there is your proof.


----------



## Sugi

Quote:


> Originally Posted by *Alex132*
> 
> Are you in full-screen?
> 
> *Windowed apps only run with 1 GPU*.


Quote:


> Originally Posted by *Alex132*
> 
> Well crossfire only works in fullscreen.
> SLI used to only work in fullscreen too, but they have changed it.
> 
> *Most programs can utilize SLI with windowed mode*, however some can't (Kombustor etc), even when programs do utilize SLI in windowed mode there is a very noticeable decrease in performance.
> 
> source


How does that make any sense? First you say "*they [as in applications] can only run 1 GPU*" at a time, and then you say "*most programs can utilize SLI with windowed mode*". Which is it? Your source is just lmgtfy, and that isn't helpful at all. If it is common knowledge, then show me. I was always under the impression that rendering the desktop and the game at the same time takes more work, which is why games run slower in windowed mode than in fullscreen mode. You stated something different, which is why I am interested.


----------



## Alex132

Quote:


> Originally Posted by *Sugi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Are you in full-screen?
> 
> *Windowed apps only run with 1 GPU*.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Well crossfire only works in fullscreen.
> SLI used to only work in fullscreen too, but they have changed it.
> 
> *Most programs can utilize SLI with windowed mode*, however some can't (Kombustor etc), even when programs do utilize SLI in windowed mode there is a very noticeable decrease in performance.
> 
> source
> 
> 
> How does that make any sense? First you say "*they [as in application] can only run 1 gpu*" at a time, and then you say "*most programs can utilize sli with windowed mode*". Which is it? Your source is just lmgtfu, and that isn't helpful at all. If it is common knowledge, then show me. I was always under the impression that it takes more to render the desktop and the game at the same time that is why it runs slower in windows mode then fullscreen mode. You stated something different, that is why I am interested.

I didn't know that Nvidia had patched SLI to work with most programs in windowed mode.
I have used ATI for most of my life (this is my first desktop Nvidia GPU), so I didn't follow Nvidia patch/driver changes.
I was basing my knowledge on CrossFireX and on what I knew of SLI. I didn't know SLI had changed.

There is no one real source for things like this; it isn't necessary for someone to provide a source for everything they say. You don't inquire about a source's source, now do you? And if you say that's because they did their own testing, how do you know I didn't? (I did, and found that Heaven 3.5 used about 50-60% of each GPU, with very sporadic usage, in windowed mode, while Kombustor used 80-90% of one GPU and 0% of the other.)

On top of that, how hard can it be to Google the problem? The internet has tons of data on almost anything and everything, so why not use it rather than only asking on a slow-to-respond forum?


----------



## MrHamm

Quote:


> Originally Posted by *Sugi*
> 
> I would guess 3x1440p from his signature.


Hey guys,

Ya my main gaming monitor is 2560x1440.

The left and right monitors are in portrait mode and only used for 2D applications (web browsing, Windows Media player, temp programs etc).

I play in windowed mode so I can "mouse over" to the other monitors if I need to look something up while playing.

I'm thinking of selling this and getting the rumored Nvidia Titan......


----------



## samoth777

Hi guys. I have a friend who wants to build a dream machine using 2 690s in quad SLI configuration. He has a spare Gigabyte GA-Z77X-UD5H-WB WIFI http://www.gigabyte.com/products/product-page.aspx?pid=4439#ov lying around which he wants to use for the build. He also has the i7-3770K which he plans on using for it. My question is, will this board be fine for a 2x 690 quad SLI setup? Might get the h100i in push pull for some good overclocking on the 3770k. Please advise! Thanks guys!


----------



## qiplayer

Quote:


> Originally Posted by *Shaitan*
> 
> Has anyone else had a bad experience with their 690? I am now about to receive my 4th device from EVGA. I think I have spent more time waiting on RMA's and using my 680 than I have actually using my 690 since I bought it in October. First one had gpu clocks that would constantly fluctuate all over the place, second one would not do surround, third one had a bad fan upon receiving it yesterday. I am really trying to decide if I wanna keep the 4th one when it comes in, assuming that it works properly at all.


I wasted $2000 and more on 2 GTX 690s.
The 2 680s always worked better. I changed mobo and CPU to get 2 PCIe lanes at full speed, but nothing changed; lastly I upgraded the PSU from 850W to much more.

With the 690s, Crysis 2 and BF3 were unplayable.
EVGA closed my support request while I was away on holiday for a week, even though I had asked them not to.
So I gave up on dual-GPU cards.

The problems I encountered were the image slowing down drastically even at 90fps, and a lot of stuttering. It was the same on 2 different mobos, the Asus P8P67 Deluxe and the Gigabyte X79-UD3.


----------



## qiplayer

Quote:


> Originally Posted by *Alex132*
> 
> Well crossfire only works in fullscreen.
> SLI used to only work in fullscreen too, but they have changed it.
> 
> Most programs can utilize SLI with windowed mode, however some can't (Kombustor etc), even when programs do utilize SLI in windowed mode there is a very noticeable decrease in performance.
> 
> source


I think you guys should disable "resize windows on high DPI values" in the properties of the executable file you're running.


----------



## PinzaC55

Quote:


> Originally Posted by *samoth777*
> 
> Hi guys. I have a friend who wants to build a dream machine using 2 690s in quad SLI configuration. He has a spare Gigabyte GA-Z77X-UD5H-WB WIFI http://www.gigabyte.com/products/product-page.aspx?pid=4439#ov lying around which he wants to use for the build. He also has the i7-3770K which he plans on using for it. My question is, will this board be fine for a 2x 690 quad SLI setup? Might get the h100i in push pull for some good overclocking on the 3770k. Please advise! Thanks guys!


I looked at a couple of reviews of that board and it is pretty good, but nothing spectacular. If your friend has the cash to splash on 2 of the best graphics cards you can buy (like $2000 or £1400), I would have thought he would have gone for a top mobo like an ASUS Rampage IV Extreme or an MSI Big Bang XPower II.


----------



## noob.deagle

Does anyone else's 690 make a squealing sound when it's outputting very high FPS? I can place it under high load and it doesn't do it, but run a demo at 1000+ fps and it makes a weird noise. You can test this with the new 3DMark: running the low-end test I got almost a constant 1500fps.

Also, if anyone is interested, here are my DX11 demo results on a 690.
Fire Strike results - http://www.3dmark.com/fs/17508


----------



## Alex132

Quote:


> Originally Posted by *noob.deagle*
> 
> does anyone elses 690 make a squealing sound when its outputting very high FPS, i can place it under high load and it doesnt do it but run a demo and get 1000+ fps and it makes a weird noise. you can test this with the new 3dmark running the low end test got 1500fps almost constantly.
> 
> also if anyone is interested this is the DX11 demo results on a 690.
> Fire Strike results - http://www.3dmark.com/fs/17508


no coil whine on my side


----------



## Rei86

Quote:


> Originally Posted by *Alex132*
> 
> no coil whine on my side


Have you tried stressing it with Folding@home? Usually you'll hear no coil whine even in intensive games, but once you start BOINC, Folding@home etc. the cards will make your ears bleed.


----------



## Alex132

Quote:


> Originally Posted by *Rei86*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> no coil whine on my side
> 
> 
> 
> Have you tried stressing it with Folding@home? Usually you'll hear no coil whine even in intensive games, but once you start BOINC, Folding@home etc. the cards will make your ears bleed.

No way will I fold with this card:

1) not enough PPD for Kepler
2) too much stress

and I wouldn't hear the coil whine over THE EXTREMELY LOUD FAN anyway.


----------



## Shogon

Quote:


> Originally Posted by *Rei86*
> 
> Have you tried stressing it with Folding@home? Usually you'll hear no coil whine even in intensive games, but *once you start BOINC, Folding@home* etc. the cards will make your ears bleed.


Glad I'm not the only one with this issue. That's why I never fold on my 690; as much as I want to, the whine (even with a waterblock) is enough that you notice and stop right away. Even after my 1st 690 died and was RMA'd for a new one in no time, the new one still whined. It doesn't feel right; no other card I've had ever made this whine on Folding@home.

Besides that, this card is still kickin': 135%/+130/+400, max of 34C. The best card I have experienced so far; out of SLI 280s, a single 480, and SLI 580s, this has been the least painful for driver-related issues.


----------



## Alex132

Quote:


> Originally Posted by *Shogon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Rei86*
> 
> Have you tried stressing it with Folding@home? Usually you'll hear no coil whine even in intensive games, but *once you start BOINC, Folding@home* etc. the cards will make your ears bleed.
> 
> 
> 
> Glad I'm not the only one with this issue. That's why I never fold on my 690; as much as I want to, the whine (even with a waterblock) is enough that you notice and stop right away. Even after my 1st 690 died and was RMA'd for a new one in no time, the new one still whined. It doesn't feel right; no other card I've had ever made this whine on Folding@home.
> 
> Besides that, this card is still kickin', 135%/+130/+400, max of 34C. The best card I have experienced so far, out of SLI 280s, single 480, SLI 580s, this has been the least painful with driver related issues for me.

Coil whine is mostly fine.

I haven't had any coil whine on my 690, with the new 3DMark.


----------



## Rei86

Quote:


> Originally Posted by *Shogon*
> 
> Glad I'm not the only one with this issue. That's why I never fold on my 690; as much as I want to, the whine (even with a waterblock) is enough that you notice and stop right away. Even after my 1st 690 died and was RMA'd for a new one in no time, the new one still whined. It doesn't feel right; no other card I've had ever made this whine on Folding@home.
> 
> Besides that, this card is still kickin', 135%/+130/+400, max of 34C. The best card I have experienced so far, out of SLI 280s, single 480, SLI 580s, this has been the least painful with driver related issues for me.


Under normal load none of my cards have coil whine: GTX 690, GTX 680 x2, GTX 650 Ti and GTX 570. However, under heavy load the only card I own that does have it is the GTX 650 Ti. EVGA, however, allows me to return it for an RMA if it bothers me that much. What brand do you have?


----------



## Sugi

Can anyone recommend good OC'ing guides for the 690 using the standard air cooling? I have only seen OC'ing guides for the 600 series, but I am looking for something geared towards the 690.


----------



## Rei86

Quote:


> Originally Posted by *Sugi*
> 
> Can anyone recommend good OC'ng guides for the 690 using the standard air cooling. I have only seen OC'ng guides for the 600 series, but I am looking for something geared towards the 690.


All 600-series cards from the 660 to the 690 OC the same, unless you want to rewrite your BIOS/flash a new BIOS/use an EVBot.


----------



## Shogon

Quote:


> Originally Posted by *Rei86*
> 
> Under normal load none of my cards have a Coil Whine. GTX690, GTX680x2, GTX650Ti and GTX570. However under heavy load the only card I own that does have it is the GTX650Ti. EVGA however allows me to return it for an RMA if it bothers me so much. What brand do you have?


I also have EVGA.

Playing games is not an issue; I don't hear a thing. It's just in 3DMark or Folding@home that I notice it.
Quote:


> Originally Posted by *Sugi*
> 
> Can anyone recommend good OC'ng guides for the 690 using the standard air cooling. I have only seen OC'ng guides for the 600 series, but I am looking for something geared towards the 690.


Just as Rei said, there really aren't any guides for overclocking the 690 specifically. You have to find the balance between temps and overclock. I would try +75 on the core and +150 on the memory as a start and work up from there; you can also try the 135% power target, though when I was on air I reached thermal max and my PC would shut down in no time.


----------



## Sugi

Thanks Shogon for the tip. I can't wait to get started on my OC'ing.


----------



## Arizonian

Quote:


> Originally Posted by *Sugi*
> 
> Can anyone recommend good OC'ng guides for the 690 using the standard air cooling. I have only seen OC'ng guides for the 600 series, but I am looking for something geared towards the 690.


Kepler overclocks the same way through the entire series. The differences are the starting base clocks and, for the 690, the dual GPUs. Same 13 MHz straps: at 70C it throttles back and dynamically downclocks by 13 MHz, and again by another 13 MHz at 80C. The key is finding a good OC while keeping temps under 70C at full load.

I like to start with the Core overclock first: determine the highest stable OC, then work on the Memory.
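That throttle behavior can be sketched in a few lines. This is only an illustrative model: the 13 MHz strap size and the 70C/80C thresholds come from the description above, while the 1163 MHz boost clock is just an example value.

```python
STRAP_MHZ = 13  # Kepler adjusts clocks in fixed 13 MHz steps ("straps")

def effective_clock(boost_mhz, temp_c):
    """Clock after thermal throttling, per the 70C/80C rule above."""
    if temp_c >= 80:
        return boost_mhz - 2 * STRAP_MHZ  # one strap dropped at 70C, another at 80C
    if temp_c >= 70:
        return boost_mhz - STRAP_MHZ
    return boost_mhz

print(effective_clock(1163, 65))  # 1163 - under 70C, full boost
print(effective_clock(1163, 72))  # 1150 - one strap down
print(effective_clock(1163, 81))  # 1137 - two straps down
```

The real GPU Boost logic is more granular than this, but it shows why keeping the card under 70C at full load preserves the whole overclock.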


----------



## noob.deagle

Quote:


> Originally Posted by *Rei86*
> 
> Under normal load none of my cards have a Coil Whine. GTX690, GTX680x2, GTX650Ti and GTX570. However under heavy load the only card I own that does have it is the GTX650Ti. EVGA however allows me to return it for an RMA if it bothers me so much. What brand do you have?


Mine is an Asus. The sound has got me kind of worried, because as game loads get higher the noise may become more prevalent. I have not contacted Asus about it yet because the card does not really have any issues, and I don't feel like waiting for a new one since it would leave me with no GPU in the meantime, and there is so much coming out in Feb and March.


----------



## Alex132

Quote:


> Originally Posted by *noob.deagle*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Rei86*
> 
> Under normal load none of my cards have a Coil Whine. GTX690, GTX680x2, GTX650Ti and GTX570. However under heavy load the only card I own that does have it is the GTX650Ti. EVGA however allows me to return it for an RMA if it bothers me so much. What brand do you have?
> 
> 
> 
> Mine is an Asus the sound has got me kinda worried because as game loads get higher the noise may become more prevalent. i have not contacted Asus about it as of yet because the card does not really have any issues and i dont feel like waiting for a new one cause it will leave me with no GPU to use in the mean time and there is so much coming out in Feb and march

Make doesn't matter; all are built to Nvidia reference spec, so the parts should be 100% identical.

Coil whine isn't really an indication of a major flaw. It is damn annoying though. My EVGA card has no coil whine so far.
If it really bothers you, contact ASUS about it. If it doesn't, then don't sweat it. A card with coil whine isn't any more likely to die than one without.
(Might shorten the lifespan from like 25 years to 23, I dunno; it's really not as bad as it sounds, literally.)
Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sugi*
> 
> Can anyone recommend good OC'ng guides for the 690 using the standard air cooling. I have only seen OC'ng guides for the 600 series, but I am looking for something geared towards the 690.
> 
> 
> 
> Kepler over clocks the same way through the entire series. The differences are the starting base clocks and the 690 dual GPU for the most part. Same 13 MHz straps. At 70C throttles back and dynamically down clocks by 13 MHz and again down another 13 MHz at 80C. Key is finding a good OC while maintaining temps under 70C when at full load.
> 
> I like to start at the Core over clocks first. Determine highest OC stability and then work on Memory.

It's near impossible to stay under 70C on air!


----------



## Arizonian

Quote:


> Originally Posted by *Alex132*
> 
> It's near impossible to stay under 70'c on air!


I know what you mean.







I've managed to keep it at about 78-79C at full load and end up with a max 1163 MHz core overclock. After many months with this 24/7 OC, it's safe to say I'm maxed out and 1163 MHz is the ceiling, temps taken into consideration. Too lazy to go under water, and I don't keep GPUs long enough to invest that money.


----------



## Alex132

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> It's near impossible to stay under 70'c on air!
> 
> 
> 
> I know what you mean.
> 
> 
> 
> 
> 
> 
> 
> I've managed to keep it about 78C-79C full load and end up with max 1163 MHz Core over clock. After many months with this 24/7 OC it's safe to say I'm maxed and 1163 Mhz is the ceiling temps taken into consideration. Too lazy to go under water and I don't keep GPU's long enough to invest that money.

I did look at an Accelero-type cooler for this card, but there's no way I could slap one of those ugly things on my GPU!


----------



## shremi

So I got my RMA card.









Just tested the temps and they were much lower than on the one I sent in, so I went straight to watercooling this baby.









Temps are somewhere near 42-43C, overclocked at 1189 core and 3300 memory, after a nice 1-hour gaming session. Are my temps ok? I think I am a bit short on rads since I only have a single RX360.

Now, what do I need to do to get into the club?

BTW, is my overclock decent? Anything higher and Heaven would crash...


----------



## Rei86

Quote:


> Originally Posted by *Alex132*
> 
> Make doesn't matter, all are nvidia-reference spec. so the parts should be 100% identical.
> 
> Coil-whine isn't really an indication of a major-flaw. It is damn annoying though. My EVGA card has no coil-whine so far.
> If it really bothers you, contact ASUS about it. If it doesn't, then don't sweat it. It doesn't mean your card will die more than a card with coil-whine.
> (Might shorten the lifespan from like 25 years to like 23, iono, it's really not as bad as it sounds, literally.)
> It's near impossible to stay under 70'c on air!


No it doesn't. I was just mentioning EVGA since it's been discussed, and they are willing to take back a coil-whining card with no issue. I know some electronics companies that won't take coil-whining equipment at all, since technically it's not faulty.

Also, after I re-TIMed my 1st 690 with IC Diamond, I never hit over 69C on GPU1. It's possible. And when it had the AC Turbo 690 cooler on it, it never went over 60C I believe. I wish I had kept pictures and records of all this, but it is possible on air. After doing all that, though, getting a waterblock is the better way to go.
Quote:


> Originally Posted by *Shogon*
> 
> Just as Rei said, there really aren't any guides to overclocking. You have to figure out what is a balance between temps, and overclock. I would try and go for +75 on the core and +150 on the memory as a start and work up more later on, you can also try 135% power target, though when I was on air I reached thermal max and my PC would shut down in no time.


I would start at max power %, and start on the Core GPU OC only. I would OC in 10-20 MHz increments and then retest, again and again.

To Sugi:
e.g. +135/+20/+0 -- test -- +135/+40/+0 -- test -- +135/+60/+0 -- test -- +135/+80/+0 -- test
etc.

Once you hit instability, bring it down in steps of around 5 to find the highest stable point.

e.g. instability at +135/+140/+0, so try +135/+135/+0, and so on.
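That stepping procedure is just a coarse-then-fine search, which can be sketched like this. It's only a sketch: the step sizes follow the post, and `is_stable` is a placeholder standing in for an actual benchmark run (Heaven, 3DMark, etc.).

```python
def find_max_offset(is_stable, coarse_step=20, fine_step=5, limit=300):
    """Climb the core offset in coarse steps until a run fails,
    then creep up in fine steps below the failing point."""
    offset = 0
    # coarse pass: +20 MHz at a time while the benchmark still passes
    while offset + coarse_step <= limit and is_stable(offset + coarse_step):
        offset += coarse_step
    # fine pass: +5 MHz at a time to find the last stable point
    while is_stable(offset + fine_step):
        offset += fine_step
    return offset

# hypothetical card that becomes unstable past a +137 MHz offset
best = find_max_offset(lambda off: off <= 137)
print(best)  # 135
```

Each `is_stable` call in practice means a full benchmark loop, so the coarse pass exists to keep the number of long test runs down.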


----------



## grunion

Anyone having problems running the new 3DMark?


----------



## jhager8783

So I just upgraded to a second EVGA GTX 690 and a 1300W Rosewill Lightning PSU, and rearranged the fans in the case (HAF-X) to suck out the extra heat, but things don't seem to be as cool as I want them to be. GPU temps logged in The Witcher 2 at max settings with ubersampling, and the rest at 6066x1080, show GPU2 and GPU4 running at 82 degrees. Benchmarks like 3DMark run about the same.

I have included pics of my airflow set up, but was wondering if anybody has a recommendation as to how to maximize heat exhaust from these bad boys. I had thought about making a kind of exhaust as shown in yellow. Oh, and the 690 with the backing plate is now on the bottom...not that it made that much difference. Thanks for the input.


----------



## jhager8783

I get 7.3 FPS in Fire Strike with GTX 690 Quad SLI. I've tried a single card and every driver from 301.24 up to the beta 313.96, and nothing changes. None of the drivers are FM-approved, or so they say. I'm just resorting to running 3DMark 11.


----------



## Rei86

Quote:


> Originally Posted by *jhager8783*
> 
> I get 7.3 FPS on fire strike with 690 GTX Quad SLI. I've tried a single card, every driver from 301.24 up to the beta 313.96 and nothing changes. None of the drivers are FM approved or so they say. I'm just resorting to running 3DMark 11.


313.96 are FM Approved. Run a Dual setup and not a Quad.


----------



## jhager8783

Quote:


> Originally Posted by *Rei86*
> 
> 313.96 are FM Approved. Run a Dual setup and not a Quad.



I guess I'll try. Any idea what kind of FPS I should expect from that setup?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *grunion*
> 
> Anyone having problems running the new 3DMark?


I'm on 313.95 betas and 3dmark is working great for me. What driver are you running?


----------



## Rei86

Quote:


> Originally Posted by *jhager8783*
> 
> I guess I'll try. Any idea what kind of FPS I should expect from that setup?


Dunno, my 690 is sitting in its box.


----------



## jhager8783

Quote:


> Originally Posted by *Rei86*
> 
> Dunno, my 690 is sitting in their box.


Not surprised; I guess it'll be a while before a lot of people really start to use them. I'm just running one of my two. Speaking of which, with the 313.96 driver and one card, Fire Strike got about 19 FPS minimum and 42 max. My 3770K OC'd to 4.6 is probably where most of the hurt is. Thanks again for the suggestion about getting it to work.


----------



## Alex132

Quote:


> Originally Posted by *jhager8783*
> 
> So I just upgraded to a second EVGA 690 GTX and a 1300watt Rosewill Lightning psu and rearranged the fans in the case (HAF-X) to suck out the extra heat, but things don't seem to be as cool as I want them to be. GPU temps logged on The Witcher 2 Max settings with uber-sampling and the rest at 6066x1080 show GPU 2 and GPU4 running at 82 degrees. Benchmarks like 3DMark run about the same.
> 
> I have included pics of my airflow set up, but was wondering if anybody has a recommendation as to how to maximize heat exhaust from these bad boys. I had thought about making a kind of exhaust as shown in yellow. Oh, and the 690 with the backing plate is now on the bottom...not that it made that much difference. Thanks for the input.


Flip your PSU around. It's sucking air in, not blowing it out.

Also change your bottom 200mm front to an exhaust.
Rear 120mm to an intake, and reverse the flow of your CPU cooler fans, then make the front top an exhaust.
Lastly change all your top fans to intakes.
Keep your side 200mm as an intake.


----------



## maximus56

Quote:


> Originally Posted by *jhager8783*
> 
> I get 7.3 FPS on fire strike with 690 GTX Quad SLI. I've tried a single card, every driver from 301.24 up to the beta 313.96 and nothing changes. None of the drivers are FM approved or so they say. I'm just resorting to running 3DMark 11.


If it makes you feel better (misery loves company, lol), I am not having much luck either with the quads when running the Fire Strike benchmark. I am currently using the 310.96 drivers, but have tried the older drivers with no luck. This benchmark is horrible for me on Windows 8, and only half decent on Windows 7. I suspect this is due to a combination of software and driver issues. I guess this is one of the reasons why we have not seen many quad 690 scores posted yet over on the Top 30 3DMark Fire Strike thread. It would be interesting to find out if the majority of quad 690 owners are experiencing the same issues, or if it is just you and me who are the outliers.


----------



## jcde7ago

Added *Sugi* to the list...welcome!


----------



## Sugi

Quote:


> Originally Posted by *jcde7ago*
> 
> Added *Sugi* to the list...welcome!


Thanks for adding me, I am glad to be here.


----------



## Rei86

Quote:


> Originally Posted by *jhager8783*
> 
> Not surprised, guess it'll be a while before a lot of people really start to use them. I'm just running one of my two. speaking of, with the 313.96 drive and one card, Fire Strike got about 19FPS minimum and 42 max. My 3770k OC'd to 4.6 is probably where most of the hurt is. Thanks again for the suggestion about getting it to work.


Tri and Quad SLI/CFX aren't working at the moment in 3DMark's Fire Strike, so we'll have to wait for a driver update from both Nvidia and AMD. I think AMD has already sent out another beta driver, but I'm not sure.

And the only reason I'm not using my GTX 690 is because I upgraded to two EVGA GTX 680 Classifieds.


----------



## grunion

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> I'm on 313.95 betas and 3dmark is working great for me. What driver are you running?


The latest...

Finally found the culprit: a post on the FM forums suggested disabling HW monitoring, and that fixed it.

With that said, scores are pitiful compared to CFX.


----------



## Alex132

Quote:


> Originally Posted by *grunion*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MrTOOSHORT*
> 
> I'm on 313.95 betas and 3dmark is working great for me. What driver are you running?
> 
> 
> 
> The latest...
> 
> Finally found the culprit, there was a post on the FM forums that suggested disabling hw monitoring, that fixed it.
> 
> With that said, scores are pitiful compared to CFX.

CPU.

My 2500K crawls in the new 3DMark; I am guessing your 3570K does too.


----------



## qiplayer

I say only one word:
TITAN!

It's coming.


----------



## iARDAs

If Titan performs like a 690 on a single GPU with 6GB of RAM, either it will be priced insanely high or the prices of 690s will drop, which would also be nice.

I heard somewhere that Titan will be $900, and I highly doubt that.

If so, then a 690 should sell for like $700, which would actually be nice.


----------



## AllGamer

My hunch tells me Titan = vaporware.

Even if it does exist, it won't be ready for retail in 2013.


----------



## PCModderMike

Hey guys....I'm a little late to the party, but I had a great deal come through and I'm now a proud owner of an EVGA GTX 690....love it!


----------



## MrTOOSHORT

Oh my! What about the Lightning?


----------



## PCModderMike

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Oh my! What about the Lightning?


It went towards getting the 690...couldn't pass it up. Basically my Lightning with the EK block got me this 690, and just a little cash on top.


----------



## ceteris

Just benched on the new 3DMark and got 11885 on HP setting.

http://www.3dmark.com/fs/119160

Seems like AMD is getting tons of love on the leaderboards though.


----------



## icecpu

What's everybody's best stable overclock for Far Cry 3?
I can benchmark and game all day @ +130 on the core, but only +100 to play Far Cry 3.


----------



## shremi

Quote:


> Originally Posted by *icecpu*
> 
> What's everybody best stable overclock for far Cry 3 ?
> I can benchmark and game all day @ +130 for the core, but only +100 to play Far Cry 3


Mine for FC3 is 1167 core / 3330 mem @ 1440p, everything maxed out... Since my card is fairly new, that's the only game I've played on it. I can bench @ 1200 core though.


----------



## Alex132

Quote:


> Originally Posted by *icecpu*
> 
> What's everybody best stable overclock for far Cry 3 ?
> I can benchmark and game all day @ +130 for the core, but only +100 to play Far Cry 3


This too.

FC3 = most demanding thing it seems. Lol


----------



## PCModderMike

FC3 gives my 690 a workout...gets that thing *hot*









Piecing together some blocks and parts to get this thing under water....couldn't stay away from water forever









EDIT:

So one of the blocks I'm interested in is from XSPC....any owners in here?
I'm wondering if the EVGA backplate can still be used when the block is mounted?


----------



## Fullmetalaj0

I just got my tax return and I'm about to pull the trigger on this thing, but something concerns me. I also have a 1600p monitor on the way, so I was wondering if this thing is going to get VRAM-starved playing at that res?


----------



## PCModderMike

Quote:


> Originally Posted by *Fullmetalaj0*
> 
> I just got my tax return and Im bout to pull the trigger on this thing, but something concerns me. I also have a 1600p monitor on the way so i was wondering if this thing is going to get Vram starved playing at that rez?


I play at 1440p...and so far haven't even come close to running out of memory. 2560x1600 is not much more than 2560x1440....so I would think you're safe. Unless you play some heavily modded Skyrim...I've heard that can eat up a crazy amount of memory, but other than that most common games will be fine.
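The arithmetic backs that up: 2560x1600 is only about 11% more pixels than 2560x1440, so the VRAM pressure from the framebuffer should be similar.

```python
# Quick pixel-count comparison behind the claim above.
w = 2560
pixels_1440 = w * 1440
pixels_1600 = w * 1600
increase = pixels_1600 / pixels_1440 - 1  # fractional increase in pixels

print(pixels_1440, pixels_1600)    # 3686400 4096000
print(round(increase * 100, 1))    # 11.1
```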


----------



## TheMadHerbalist

3dMark isn't playing nice with my cards.

Scored 7922 with my system's 24/7 stable OC, and the combined score is killing me. 7? ***








http://www.3dmark.com/fs/138939


----------



## MrTOOSHORT

Quote:


> Originally Posted by *TheMadHerbalist*
> 
> 3dMark isn't playing nice with my cards.
> 
> 
> Scored a 7922 with my systems 24/7 stable oc, and the combined score is killing me. 7? ***
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/138939


3DMark doesn't work right with more than two GPUs (i.e. more than one GTX 690) in one system. Have to wait for a fix.


----------



## PCModderMike

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> 3dmark doesn't work right with more than two gpus( AKA one gtx690 ) going in one system. Have to wait for a fix.


Unrelated.
But, what kind of block are you using for your 690 MrTOOSHORT?


----------



## MrTOOSHORT

EK with EK backplate.

Oh and with the circles!


----------



## PCModderMike

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> EK with EK backplate.
> 
> Oh and with the circles!


Cool, nothing wrong with that.








Had 'em on my Lightning block so they don't bother me.


----------



## Elite GunnerX

EVGA....In it to win it!!!


----------



## drek

Anyone know what wattage is needed to run dual GTX 690s?

i7 3820
Asus Rampage IV Extreme
Corsair Vengeance 32GB DDR3
Corsair HX 850W


----------



## drek

Quote:


> Originally Posted by *drek*
> 
> Anyone know what wattage is needed to run dual gtx 690's.
> 
> I7 3820
> Asus rampage iv extreme
> Corsair vengence 32gb ddr3
> Corsair hx 850w


Meant Corsair HX 750W.


----------



## drek

Quote:


> Originally Posted by *drek*
> 
> Meant corsair hx 750w


Anyone?


----------



## iARDAs

Quote:


> Originally Posted by *drek*
> 
> Anyone?


Check out this site

http://www.coolermaster.outervision.com/

You might need a 1000W PSU for GTX 690 quad SLI.


----------



## Alex132

I would go with anything Gold rated and 900W+.

HALE90 900W Gold, AX1200, CM 1000W, etc.


----------



## qiplayer

Quote:


> Originally Posted by *drek*
> 
> Meant corsair hx 750w


Hi!
I tested 2 690s with a slightly overclocked 3930K.
PSU: XFX 850 watt.
Running FurMark at 6000x1080 I could hear the fan spinning faster, but not much else.
I checked in a review that my PSU has a safety function that turns it off at 940 watts.

To reach the PSU limit I had to run several FurMark tests at the same time; since the CPU wasn't working at more than 30%, I also ran a CPU test that brought all cores to max.
So I had 4 GPUs at max and the CPU too.

Then the fan began spinning at maximum, and after a few minutes the PC turned off.
Even if it works under normal stressed conditions, I would go for a higher-wattage PSU, like 1000-1200W. If you wanna OC the CPU, 850 is definitely not enough.

If you live in central Europe: this evening I met a guy who has a 1200 and a 1500 PSU to sell.
By the way, I just sold my two GTX 680s, made a good sale; now I'm waiting for the Titan card.
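That shutdown test lines up with a simple back-of-envelope budget. A minimal sketch, assuming NVIDIA's 300W board-power spec per GTX 690, a stock 130W TDP for the 3930K, and a guessed ~100W for board/drives/fans:

```python
# Back-of-envelope power budget for 2x GTX 690 (quad SLI).
# Assumed figures: 300 W board power per GTX 690 (NVIDIA spec),
# 130 W TDP for a stock i7-3930K, ~100 W board/drive/fan overhead.
def system_load_watts(card_tdp=300, num_cards=2, cpu_tdp=130, overhead=100):
    """Worst-case sustained DC load in watts."""
    return card_tdp * num_cards + cpu_tdp + overhead

print(system_load_watts())  # 830 -> an 850 W unit has almost no headroom
```

At stock that already sits right at an 850W unit's limit, which is why stress-loading CPU and GPUs together tripped the 940W protection; 1000-1200W leaves room for overclocking.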


----------



## TheMadHerbalist

Anyone have good results from modding their BIOS? If I could get a little more power I could push my best card over a 13k score in 3DMark.


Both GPUs are clocked to 1228; GPU 1 is at +186 and GPU 2 at +196, with the mem offset set at +560. I might still be able to squeeze a little more out of them, since I did a rush job of an OC, but having that extra headroom would be nice.


----------



## jhager8783

Quote:


> Originally Posted by *Alex132*
> 
> Flip your PSU around. It's sucking air in, not blowing it out.
> 
> Also change your bottom 200mm front to an exhaust.
> Rear 120mm to an intake, and reverse the flow of your CPU cooler fans, then make the front top an exhaust.
> Lastly change all your top fans to intakes.
> Keep your side 200mm as an intake.


I will try this setup, but playing with other options I have managed to get GPU temps down to 69 degrees. Thanks for the suggestion.


----------



## jhager8783

Quote:


> Originally Posted by *maximus56*
> 
> If it makes you feel better (as misery loves company..lol) I am not having much luck either with the quads when running the Fire Strike benchmark. I am currently using the 310.96 drivers, but have tried the older drivers with no luck.This benchmark is horrible for me on Windows 8, and only half decent on Windows 7. I suspect this is due to a combination of software and driver issues. I guess this is one of the reasons why we have not seen many quad 690 scores posted yet over on the Top 30 3D Mark Fire Strike thread. It would be interesting to find out if majority of the quad 690 owners are experiencing the same issues, or is it just you and me who are the outliers


Yeah, I've got my cousin working on a solution. He's a programmer, so maybe he can get things running a little smoother. Meanwhile, I'm trying all the new drivers and whatever tweaks I can think of. Wondering whether the upgrade to Windows 8 will be worth looking into.


----------



## jhager8783

Here's the latest on my build: added a second 80 Plus Gold non-modular PSU, a Bitspower X-Station, a Bitspower X-Station EXT II, and used those fancy red PCI-E adapters. I used existing holes to mount the PSU up top, and am making a custom engraved metal plate to go over the whole top assembly so that you won't see the PSU. It'll be backlit with green Polyplex.


----------



## jhager8783

Quote:


> Originally Posted by *jhager8783*
> 
> I will try this setup, but in playing with other options, I have managed to get GPU temps down to 69 degrees. Thanks for the suggest.


Alex132, your suggestion was good. However, I didn't know what you meant by turning the PSU around; if anything I'd break it open and turn the fan around to exhaust air to the outside (it does get pretty hot at load). My GPU temps in FurMark 1.82 were reduced by an average of 4 degrees, yet in Prime95 (large FFTs) my CPU heat went up an average of 3 degrees. Taking my overclock down from 4.7GHz to 4.5GHz helped, but I guess there's always a trade-off. Thanks again, this was a simple way to maximize heat exhaust. Anybody here who uses the HAF-X case should consider this setup.


----------



## Alex132

Quote:


> Originally Posted by *jhager8783*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jhager8783*
> 
> I will try this setup, but in playing with other options, I have managed to get GPU temps down to 69 degrees. Thanks for the suggest.
> 
> 
> 
> Alex132, your suggestion was good. However I didn't know what you meant by turning the PSU around, if anything I'd break it open and turn the fan around to exhaust air to the outside (it does get pretty hot at load). My GPU temps in Furmark 1.82 where reduced an average of 4 degrees, yet in Prime 95 (large FFT's) my CPU heat went up an average of 3 degrees. Taking my overclock down from 4.7Ghz to 4.5 helped, but I guess there's always a trade-off. Thanks again, this was a simple way to maximize heat exhaust. Anybody who uses the HAF-X case here should consider this setup.

You can physically turn your PSU over.
The fan sucks air into the PSU; it doesn't exhaust air out of it.

So if you make the PSU fan face the bottom of your case, it will not be fighting with your 690's fan for air.

EG:

PSU facing up:










PSU facing down:










You want to do the latter!


----------



## PCModderMike

Most would say the orientation of the PSU, whether the fan is up or down, is all about personal preference. The PSU is really the last thing in a build that could affect temps. TTL is a big believer in this.


----------



## qiplayer

Quote:


> Originally Posted by *PCModderMike*
> 
> Most would say the orientation of the PSU, whether the fan is up or down, is all about personal preference. The PSU is really the last thing in a build that could effect temps. TTL is a big believer in this.


If the PSU efficiency is 85% and it draws 800 watts, that's about 120 watts of heat that the PSU releases.
Whether you dump that heat inside or outside the case makes, in my opinion, a difference.
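For anyone wanting to redo that arithmetic, a minimal sketch (assuming the 800 watts is wall draw; the heat is simply what the PSU draws but doesn't deliver):

```python
# Heat a PSU dumps into (or out of) the case: the difference between
# what it draws at the wall and what it delivers to the components.
def psu_heat_watts(wall_draw_w, efficiency):
    """Watts converted to heat inside the PSU itself."""
    return wall_draw_w * (1.0 - efficiency)

print(round(psu_heat_watts(800, 0.85)))  # 120
```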


----------



## PCModderMike

Quote:


> Originally Posted by *qiplayer*
> 
> If the PSU efficiency is 85% and it draws 800 watts, that's about 120 watts of heat that the PSU releases.
> Whether you dump that heat inside or outside the case makes, in my opinion, a difference.


But again, whether the fan is up or down, it's still sucking air into the PSU and that air is being exhausted out the back of the PSU. If you look at a fan on a PSU, you'll see the fan is pushing air into the PSU. It's not an exhaust fan. So even if you decide to go fan up, that fan is not blowing air into the case.


----------



## Alex132

Quote:


> Originally Posted by *PCModderMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *qiplayer*
> 
> If the PSU efficiency is 85% and it draws 800 watts, that's about 120 watts of heat that the PSU releases.
> Whether you dump that heat inside or outside the case makes, in my opinion, a difference.
> 
> 
> 
> But again, whether the fan is up or down, it's still sucking air into the PSU and that air is being exhausted out the back of the PSU. If you look at a fan on a PSU, you'll see the fan is pushing air into the PSU. It's not an exhaust fan. So even if you decide to go fan up, that fan is not blowing air into the case.

It's still fighting with the 690 fan, so it's a bad idea.


----------



## drek

http://www.3dmark.com/3dm11/5937350

Anyone know why my RAM is showing 667MHz?


----------



## Alex132

Quote:


> Originally Posted by *drek*
> 
> http://www.3dmark.com/3dm11/5937350
> 
> Anyone know why my ram is showing 667mhz?


667MHz is the base clock; DDR3 transfers on both clock edges, so 667MHz = 1333MHz effective. Your RAM is fine.
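A one-liner to sanity-check reported memory clocks (the x2 is the DDR double data rate; the 667 figure is just the one from the post):

```python
# Monitoring tools report DDR memory at its base (I/O) clock; the
# advertised rate counts both clock edges, hence the factor of 2.
def effective_mhz(base_mhz, transfers_per_cycle=2):
    return base_mhz * transfers_per_cycle

print(effective_mhz(667))  # 1334 -> sold as "1333 MHz" DDR3-1333
```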


----------



## PinzaC55

Just curious, does anyone know how many GTX 690's have been manufactured so far?


----------



## RimeHD

Hello all, I'm having some problems with my 690 and BF3 MP.

Specs:
i7 3820 @ 4.3GHz
16GB RAM @ 1600MHz
GTX 690
875w PSU

BF3 Graphic Settings:

Everything is set to the ultra preset except MSAA is turned OFF, antialiasing post is set to MEDIUM (the best is HIGH), and Ambient Occlusion is set to SSAO (which is lower than the highest setting).

KEEP IN MIND THIS ALL APPLIES TO MP BF3 NOT SP UNLESS SAID OTHERWISE.

My problem is that my GPU usage (both gpus) is normally around 50-60% on ALL MAPS, only sometimes spiking up just a tad bit higher. This makes for horrid minimum fps (40) and terrible average fps (sometimes 70), especially on the BTK and AM maps which are very hard on one's computer. DO NOT TELL ME THAT I CAN'T GET HIGHER USAGE BECAUSE THE GAME CAN'T UTILIZE THE 690. I have seen several people get 70-80% GPU usage average with spikes up to 90% with the 690 on this game's MP, making for very high fps all the time, the minimum being 70-80.

I'm nearly positive that this isn't a hardware problem, or a lack-of-hardware problem. I have run extreme benchmarks and scored very well with 98-100% GPU usage the whole way through; there isn't a bottleneck anywhere that I know of.

I'm using the 313.96 BETA driver and am positive that it CAN work well with the 690. I have a friend with a 690 that gets 70-80% GPU usage using this driver.

In the NVIDIA Control Panel I have set Performance Mode to MAXIMUM and have turned vertical sync off globally. Multi-GPU is enabled and I have turned on Single Display Performance.

Any Ideas? Thanks.


----------



## Alex132

Quote:


> Originally Posted by *RimeHD*
> 
> Hello all, I'm having some problems with my 690 and BF3 MP.
> 
> Specs:
> i7 3820 @ 4.3GHz
> 16GB RAM @ 1600MHz
> GTX 690
> 875w PSU
> 
> BF3 Graphic Settings:
> 
> Everything is set to the ultra preset except MSAA is turned OFF, antialiasing post is set to MEDIUM (the best is HIGH), and Ambient Occlusion is set to SSAO (which is lower than the highest setting).
> 
> KEEP IN MIND THIS ALL APPLIES TO MP BF3 NOT SP UNLESS SAID OTHERWISE.
> 
> My problem is that my GPU usage (both gpus) is normally around 50-60% on ALL MAPS, only sometimes spiking up just a tad bit higher. This makes for horrid minimum fps (40) and terrible average fps (sometimes 70), especially on the BTK and AM maps which are very hard on one's computer. DO NOT TELL ME THAT I CAN'T GET HIGHER USAGE BECAUSE THE GAME CAN'T UTILIZE THE 690. I have seen several people get 70-80% GPU usage average with spikes up to 90% with the 690 on this game's MP, making for very high fps all the time, the minimum being 70-80.
> 
> I'm nearly positive that this isn't a hardware problem, or a lack-of-hardware problem. I have ran extreme benchmarks and scored very well with 98-100% GPU usage the whole way through, there isn't a bottleneck anywhere that I know of.
> 
> I'm using the 313.96 BETA driver and am positive that it CAN work well with the 690. I have a friend with a 690 that gets 70-80% GPU usage using this driver.
> 
> In the Nividia Control Panel I have set Performance Mode to MAXIMUM and have turned vertical sync off globally. Multi-GPU is enabled and I have turned on Single Display Performance.
> 
> Any Ideas? Thanks.


What PSU exactly?


----------



## RimeHD

Quote:


> Originally Posted by *Alex132*
> 
> What PSU exactly?


It's actually 900W, sorry, don't know why I said that. It's a SPARKLE Magna. But like I said before, I ran a Unigine Heaven benchmark and scored very well with 98-100% GPU usage, so the power is GREAT. It's just that I need someone to help me make this card run well with the game!


----------



## Alex132

Quote:


> Originally Posted by *RimeHD*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> What PSU exactly?
> 
> 
> 
> It's actually 900w sorry, don't know why I said that. It's a SPARKLE Magna. But like I said before, I ran a Unigine Heaven benchmark and scored very well with 98-100% GPU usage the power is GREAT. It's just that I need someone to help me make this card run well with the game!

What drivers are you running? Try downloading the latest beta drivers.


----------



## RimeHD

Quote:


> Originally Posted by *Alex132*
> 
> What drivers are you running? Try downloading the latest beta drivers.


I said what drivers I was using in my original post. They are the latest beta drivers.


----------



## grunion

Quote:


> Originally Posted by *RimeHD*
> 
> I said what drivers I was using in my original post. They are the latest beta drivers.


I don't see you list the resolution.


----------



## noob.deagle

Hey guys, I have bad news. I've been playing Crysis 3 and... well, max settings on scenes like this

drop the FPS to 30 or less depending on your AA mode. It looks absolutely amazing though, but I think I need to buy a 2nd GTX 690.


----------



## jhager8783

Quote:


> Originally Posted by *Alex132*
> 
> It's still fighting with the 690 fan, so it's a bad idea.


It's not like it's not worth trying; I'll probably have to get wheels or bigger feet for my HAF-X though. I guess the best way to tell a temp difference here is with an infrared thermometer, taking readings just below the bottom 690, or maybe I should only focus on comparing temps in my GPU log. Whatever I find, I'll let you know if it affects temps.


----------



## ceteris

Quote:


> Originally Posted by *noob.deagle*
> 
> hey guys i have bad news ive been playing crysis 3 and ........well on max scenes like this
> 
> drop the fps to 30 or less depending on your AA mode. looks absolutely amazing tho but i think i need to buy a 2nd GTX690


LOL yeah, Crysis 3 really punishes the 690. But I doubt quad SLI will help much; anything above dual SLI usually gets the least priority in driver updates, and this game is still rather new. I just hope another driver update after this 314 can improve things. Otherwise I might consider dual Titans, although I'd prefer to wait for the next generation from AMD or NVIDIA.


----------



## Alex132

Just use SSAA or w/e it's called.


----------



## max883

GTX 680 SLI = GTX 690


----------



## noob.deagle

Quote:


> Originally Posted by *Alex132*
> 
> Just use SSAA or w/e it's called.


SMAA? Even SMAA causes huge frame drops on High.

TXAA High, though, while basically unplayable, does an amazing job of giving it a film-like look; I'm not a huge fan of it because I prefer crisp AA solutions, but damn, sometimes I thought it was a movie with TXAA on.


----------



## jayvo

So I'm new here guys. I just bought a GTX 690 from Amazon last night, and it should be delivered tomorrow. However, I just saw the GTX Titan announced and I'm wondering if I made the right choice. I can always send the 690 back if reviews suggest that the Titan is better. It is cheaper than the 690, however, so what do you guys think, those of you who already have the 690?


----------



## Alex132

Quote:


> Originally Posted by *jayvo*
> 
> So I'm new here guys, I just bought a GTX 690 last night from Amazon. It should be delivered to me tomorrow from Amazon. However, I just saw the GTX Titan announced and I'm wondering if I made the right choice? I can always have the 690 sent back if reviews suggest that the Titan is better. It is cheaper than the 690 however, so what do you guys think who already have the 690?


Personally I woulda waited till Titan was released, seen the reviews it got - and then decided.


----------



## Arizonian

Quote:


> Originally Posted by *jayvo*
> 
> So I'm new here guys, I just bought a GTX 690 last night from Amazon. It should be delivered to me tomorrow from Amazon. However, I just saw the GTX Titan announced and I'm wondering if I made the right choice? I can always have the 690 sent back if reviews suggest that the Titan is better. It is cheaper than the 690 however, so what do you guys think who already have the 690?


I'm not sure what you're running, but if it's multiple monitors then I'd wait for the Titan. Even if it's not as powerful as a GTX 690, you're going to have 6GB of VRAM on the Titan, which will run multiple monitors nicely.

Price is going to be almost as much as the GTX 690. We'll have to wait and see when real benching/tests are done. Up to you, but I'd take previous advice and wait.

Also - Welcome to OCN with your first post. *"How to put your Rig in your Sig"*


----------



## jayvo

Quote:


> Originally Posted by *Alex132*
> 
> Personally I woulda waited till Titan was released, seen the reviews it got - and then decided.


Quote:


> Originally Posted by *Arizonian*
> 
> I'm not sure what your running but if it's multiple monitors then I'd wait for the Titan. Even if it's not as powerful as a GTX 690 your going to have 6GB VRAM on the Titan and will run multiple monitors nicely.
> 
> Price is going to be almost as much as the GTX 690. We'll have to wait and see when real benching/tests are done. Up to you, but I'd take previous advice and wait.
> 
> Also - Welcome to OCN with your first post. *"How to put your Rig in your Sig"*


Thanks guys......only running one monitor right now, but might consider multiple monitors in the future.


----------



## jcde7ago

PCModderMike has been added...welcome!!!

To anyone else that would like to get added to the list that may have been overlooked, please remember to post a picture/get your rig added to your sig with the brand of your 690...and also, PM'ing me for an add-request will expedite the process of getting you added to the list!


----------



## PCModderMike

Quote:


> Originally Posted by *jcde7ago*
> 
> PCModderMike has been added...welcome!!!
> 
> To anyone else that would like to get added to the list that may have been overlooked, please remember to post a picture/get your rig added to your sig with the brand of your 690...and also, PM'ing me for an add-request will expedite the process of getting you added to the list!


Thanks for the add!
Now that Titan has released, kinda feels silly being added just now....but at almost half the price of retail I couldn't pass it up.


----------



## Arizonian

Quote:


> Originally Posted by *PCModderMike*
> 
> Thanks for the add!
> Now that Titan has released, kinda feels silly being added just now....but at almost half the price of retail I couldn't pass it up.


Taking into consideration that it's being speculated we won't see new GTX 700 series or Radeon 8000 series cards until Dec 2013 / Jan 2014. On your single monitor your GTX 690 will be very relevant until the refresh Kepler / Sea Islands at the very least. If you paid anything less than a Titan price for more performance I'd say you scored.


----------



## PCModderMike

Quote:


> Originally Posted by *Arizonian*
> 
> Taking into consideration that it's being speculated we won't see new GTX 700 series or Radeon 8000 series cards until Dec 2013 / Jan 2014. On your single monitor your GTX 690 will be very relevant until the refresh Kepler / Sea Islands at the very least. If you paid anything less than a Titan price for more performance I'd say you scored.


Well thanks for the reassurance.







I don't see myself going for a surround 1440p setup anytime soon, so this 690 is rocking it just fine.


----------



## pilla99

If anyone is feeling bad about their 690: "By contrast, a single GeForce GTX Titan is a little shy with 2,688 CUDA cores at 837MHz, and an Nvidia rep suggested that if you've got a beefy gaming PC with a single monitor attached, the GTX 690 would deliver better performance."


----------



## Arizonian

Quote:


> Originally Posted by *pilla99*
> 
> If anyone is feeling bad about their 690: "By contrast, a single GeForce GTX Titan is a little shy with 2,688 CUDA cores at 837MHz, and an Nvidia rep suggested that if you've got a beefy gaming PC with a single monitor attached, the GTX 690 would deliver better performance."


Exactly. Take the same single monitor, 1440p or 120Hz, and let's speculate about 2014 when the 780 comes out. If its performance isn't greater than a single 690's, then you'd be holding onto it until the new releases in 2015.


----------



## DinaAngel

it seems like i might get myself 2 GTX Titan next month or so, anyone else going to?
i would do video again


----------



## RimeHD

Quote:


> Originally Posted by *grunion*
> 
> I don't see you list the resolution.


It's 1920x1080. PLEASE don't tell me that I cannot utilize the GPU more on this resolution. I know that's not true and you won't be helping me with my problem whatsoever.


----------



## TheMadHerbalist

Quote:


> Originally Posted by *DinaAngel*
> 
> it seems like i might get myself 2 GTX Titan next month or so, anyone else going to?
> i would do video again


I'm tempted to, but I'm going to wait to see what they can do. I'm more interested in Ivy Bridge-E, so I'll save my money for a new build with 3x Titans.


----------



## benjaminlev09

Hey y'all,

Did anyone upgrade from the 313.96 beta to the new 314.07 WHQL drivers and notice any differences?
I'm currently using 313.96 with no issues at all, so I wondered if it's worth the trouble.


----------



## jhager8783

Quote:


> Originally Posted by *DinaAngel*
> 
> it seems like i might get myself 2 GTX Titan next month or so, anyone else going to?
> i would do video again


Nope, gonna stick with my two 690's. Two 690's are at over 11 teraFLOPS (5,562 gigaFLOPS each) of processing power, whereas two Titans are at 9,000 gigaFLOPS (4,500 each), and my rig can handle Crysis 3 on Very High @ 6066x1080. Like MadHerbalist, I'm going for the i7 3970X rather than my i7 3770K, plus a better mobo, because I'm currently running 8x8 in SLI.
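In rough numbers the comparison works out like this (per-card GFLOPS figures are the ones quoted in the post, treated as approximations, not official specs):

```python
# Aggregate single-precision throughput across cards. Per-card GFLOPS
# figures come from the post above and are approximations.
def total_tflops(gflops_per_card, num_cards):
    return gflops_per_card * num_cards / 1000.0

print(total_tflops(5562, 2))  # two GTX 690s: ~11.1 TFLOPS
print(total_tflops(4500, 2))  # two Titans: 9.0 TFLOPS
```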


----------



## jhager8783

So I flipped the PSU whilst chowing on Oreos and benchmarked for heat while I wished for some milk. As it turns out, I had less than I expected... of heat, and milk. I'm showing an ever-so-slight difference in the cards' average temps, but a bigger difference in ambient case temperature. So needless to say, the ten minutes spent was worth the time. Thanks all for the input.


----------



## Qu1ckset

Quote:


> Originally Posted by *DinaAngel*
> 
> it seems like i might get myself 2 GTX Titan next month or so, anyone else going to?
> i would do video again


Quote:


> Originally Posted by *Arizonian*
> 
> Exactly. Take same single monitor 1440 or 120 Hz and lets speculate in 2014 when the 780 comes out. If the performance isn't greater than a single 690, then you'd be holding onto it until new releases in 2015.


Quote:


> Originally Posted by *pilla99*
> 
> If anyone is feeling bad about their 690: "By contrast, a single GeForce GTX Titan is a little shy with 2,688 CUDA cores at 837MHz, and an Nvidia rep suggested that if you've got a beefy gaming PC with a single monitor attached, the GTX 690 would deliver better performance."


When I started hearing the rumor that a single GeForce Titan would beat a GTX 690, I was kinda down on my purchase, seeing how the Titan has 6GB of VRAM... but after finding out the GTX 690 still beats it, I'm happy with my purchase and will probably try my best to hold out till Maxwell. I'm not too interested in the Kepler refresh, but it sucks that the 700 series is pushed back, making me wait longer for Maxwell's release.

My GTX 690 is handling my 1440p monitor with no hiccups anyway.


----------



## ceteris

Quote:


> Originally Posted by *jhager8783*
> 
> Nope, gonna stick with my two 690's. Two 690's are at over 11 teraFLOPS (5,562 gigaFLOPS each) of processing power, whereas two Titans are at 9,000 gigaFLOPS (4,500 each), and my rig can handle Crysis 3 on Very High @ 6066x1080. Like MadHerbalist, I'm going for the i7 3970X rather than my i7 3770K, plus a better mobo, because I'm currently running 8x8 in SLI.


If you are planning on getting an RIVE for that 3970X, you'd better act fast. Another member in the RIVE Owners club just got e-mail confirmation that they are out of production, and a lot of e-tailers have been OOS on them.


----------



## PowerK

I sold my 690s the other day for Titans.


----------



## DinaAngel

Alright, gonna save up then for next year.
Anyone know any more info about Ivy Bridge-E?


----------



## ceteris

Quote:


> Originally Posted by *DinaAngel*
> 
> allright gonna save up then for next year,
> anyone know any more info about the ivy bridge E?


Nothing more than 3rd Quarter 2013. Don't expect to see anything til July+


----------



## DinaAngel

Alright, that sounds good. I might buy myself a sound card for a change meanwhile :3


----------



## PinzaC55

Quick question - are the screws which secure the gtx 690 pcb to the case M2 or M2.5?


----------



## jhager8783

Quote:


> Originally Posted by *ceteris*
> 
> If you are planning on getting an RIVE for that 3970X, you better act fast. Another member on the RIV Owners club just got e-mail confirmation that they are out of production and alot of e-tailers have been OOS on them.


Will do, thanks for the info.


----------



## jhager8783

http://promotions.newegg.com/ACC/13-0180/index.html

The new Arctic Accelero Twin Turbo: has anyone looked into these? I would be interested to see how they work with an SLI setup.


----------



## rossb

I installed these on two 690s - at different times, not in SLI. Both failed and have been RMAd. I haven't been game to try them again on the replacement. I am not sure that it was the cooler that bricked the cards, since both worked fine for a while and then failed under heavy load. I can tell you that while they worked they were fantastic - kept the cards very cool and absolutely silent even at full load. Of course, they are ugly compared to the stock cooler, and take up 3 slots (something to bear in mind with SLI) but I am more concerned about noise than appearance. If I can work up the courage I may try again with the AC cooler on the 690. I also have AC coolers on my 7970 and 680 and these have worked without incident for nearly a year.


----------



## TheMadHerbalist

Quote:


> Originally Posted by *ceteris*
> 
> If you are planning on getting an RIVE for that 3970X, you better act fast. Another member on the RIV Owners club just got e-mail confirmation that they are out of production and alot of e-tailers have been OOS on them.


Damn shame, really liked that board.


----------



## jhager8783

Quote:


> Originally Posted by *rossb*
> 
> I installed these on two 690s - at different times, not in SLI. Both failed and have been RMAd. I haven't been game to try them again on the replacement. I am not sure that it was the cooler that bricked the cards, since both worked fine for a while and then failed under heavy load. I can tell you that while they worked they were fantastic - kept the cards very cool and absolutely silent even at full load. Of course, they are ugly compared to the stock cooler, and take up 3 slots (something to bear in mind with SLI) but I am more concerned about noise than appearance. If I can work up the courage I may try again with the AC cooler on the 690. I also have AC coolers on my 7970 and 680 and these have worked without incident for nearly a year.


Well, now it's definitely something to consider. I'm not 100% worried about noise, but I wonder if the 26% drop in temps is worth the $200. I've never dealt with third-party GPU coolers, so forgive me for asking, but how are the fans powered? Do they use three-pin fan connections, or do they tie into the board's power connector for the stock fan?


----------



## jhager8783

I was reading about the new Titan's ability to overclock monitors' refresh rates. I was wondering if it would work with a multiple monitor setup. I am using these 32" 1080p LED TVs with a 60Hz refresh rate (I know, but they were dirt cheap).



When I flipped my PSU, it didn't sit flush with the fan guard on it, so I removed it and put it on the first 140mm front exhaust fan. I think it's good to use what you have and recycle what you can. You can find a use for anything it seems.


----------



## rossb

Quote:


> Originally Posted by *jhager8783*
> 
> Well, now it's definitely something to consider. I'm not 100% worried about noise, but I wonder if the 26% drops in temps are worth the $200. I've never dealt with third-part GPU coolers so forgive me for asking, but how are the fans powered? Do they use three pin fan connections, or do they tie into the boards power connector for the stock fan?


The fans are PWM fans powered from the fan header on the GPU. But if you're not concerned about noise, I don't see why you would bother with an after-market cooler, unless you're trying for massive overclocks, in which case liquid cooling is probably a better option.


----------



## 86JR

So I am now officially in this club

2500K at 5GHz
ASUS GTX 690
Corsair 850W non-modular

Hoping the PSU is good enough... any ideas what I can overclock to?

I am on 1080p so I can max detail without any fps loss


----------



## MrTOOSHORT

Quote:


> Originally Posted by *86JR*
> 
> So I am now officially in this club
> 
> 2500k at 5ghz
> 690 gtx ASUS
> Corsiar 850W non modular
> 
> Hoping the PSU is good enough... any ideas what I can overclock to?
> 
> I am on 1080p so I can max detail without any fps loss


850W is overkill for your setup. My GTX 690 3DMark11 benchmark was done on my AX650 PSU. I'd try a +150 offset on the cores and test from there in regards to overclocking.


----------



## Cheesemaster

One of my 690's died; I'm going through the RMA process with EVGA... I'm seriously considering just getting three Titans and calling it a day. I'll run just one of my monitors until I get the Titans. I can't run triple monitors with everything cranked on quad 690's, so it was kind of a waste; wish I knew about the VRAM limitations in 3D Surround. They rock real hard, but for $2k I should be able to crank up all settings and not have to compromise.


----------



## max883

The GeForce Titan is the same price as the GeForce 690!!


----------



## iARDAs

Quote:


> Originally Posted by *max883*
> 
> 
> 
> Geforce titan is same prise as Geforce 690!!


The Titan shows its power at higher resolutions. Also, it can be OCed very well.

Let's assume that an OCed Titan might be close to a GTX 690. Then people can decide on what to buy.

But for enthusiasts, having 2 Titans will be better than having 2 690s; 4 GPUs is not always optimal.

The Titan really talks to a different audience.

For example, in this thread some owners will always be happier with a 690, but some might find it beneficial to upgrade to a Titan. It all depends.


----------



## SuprUsrStan

Hey, add me to the club. I picked up my GTX 690 on release day but never got around to joining this club.

I was actually sifting through the posts looking for any indication of people trying to unlock overvolting on the 690s. I know the other 600-series cards can take a flashed BIOS and have their voltages unlocked, but I tried that with my 690 and, while I did get the power target raised to 150% for both cores, the voltages still stayed firmly at 1.175V. I know voltages are locked, but are they HARDWARE locked on the GTX 690, making them different from SLI 680s?

On water, my overclock maxes out at a +165MHz offset. I feel like cranking the voltage is the only way to boost my performance any further.


----------



## PinzaC55

5 X GTX Titans listed in the UK http://www.aria.co.uk/Products?search=GTX+Titan&x=10&y=15


----------



## ceteris

Quote:


> Originally Posted by *Syan48306*
> 
> Hey, add me to the club. I've picked up my GTX 690 on release day but I never got around to joining this club.
> 
> I was actually sifting through the posts looking for any indication of people trying to unlock overvolting of the 690's. I know the other 600 series cards can take a flashed bios and have their voltages unlocked but I tried that with my 690 and while I did get the power target raised to 150% for both cores, The voltages still stayed firmly at 1.175v. I know voltages are locked but are they HARDWARE locked on the gtx 690, making them different from sli 680's?
> 
> On water, my overclock maxes out at a +165 mhz offset. I feel like cranking the voltage is the only to boost my performance any further


What rad do you have cooling it? What's your ambient and load temps?


----------



## 86JR

So I installed it and had a play on Arma 2... if I max out all the settings it hits the VRAM limit and does 3.4 FPS. According to MSI Afterburner it hits 2.5 GB of VRAM usage with texture 3D set to 200%.
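A quick sanity check on readings like that: each GPU on a 690 only has 2 GB of framebuffer to itself, so anything reported near or above that per core means the card is spilling. A minimal sketch of the bookkeeping; the sample readings below are made up, and the `nvidia-smi` query shown in the comment assumes a driver that supports it:

```python
# Sketch: flag GPUs whose reported VRAM usage is near the 690's
# 2 GB-per-GPU framebuffer. The sample readings below are hypothetical;
# on a real system you could feed in the output of, e.g.:
#   nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits
PER_GPU_LIMIT_MIB = 2048  # each GK104 on a GTX 690 has its own 2 GB

def vram_usage_mib(csv_text: str) -> list:
    """Parse one 'memory.used' value per line into MiB integers."""
    return [int(line.split()[0]) for line in csv_text.strip().splitlines()]

def near_limit(csv_text: str) -> list:
    """True for each GPU sitting within ~10% of its framebuffer limit."""
    return [used > 0.9 * PER_GPU_LIMIT_MIB for used in vram_usage_mib(csv_text)]

sample = "1980\n2011\n"    # hypothetical readings for the two GPUs
print(near_limit(sample))  # [True, True] -- both cores near the 2 GB wall
```

Afterburner's combined number can be misleading here; it's the per-GPU figure that matters, since in SLI each core mirrors the same textures.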


----------



## TheMadHerbalist

Quote:


> Originally Posted by *Syan48306*
> 
> Hey, add me to the club. I've picked up my GTX 690 on release day but I never got around to joining this club.
> 
> I was actually sifting through the posts looking for any indication of people trying to unlock overvolting of the 690's. I know the other 600 series cards can take a flashed bios and have their voltages unlocked but I tried that with my 690 and while I did get the power target raised to 150% for both cores, The voltages still stayed firmly at 1.175v. I know voltages are locked but are they HARDWARE locked on the gtx 690, making them different from sli 680's?
> 
> On water, my overclock maxes out at a +165 mhz offset. I feel like cranking the voltage is the only to boost my performance any further


I tried the BIOS as well, with no luck either. One of my cards gets +196 core & 620 mem, with an actual of 1228 on each GPU, while my other card sits at +144 core with an actual of 1188. Temps sit at around 30-35°C under high load, so I want to add more power to them. I'm at the point of going with a hard mod for more volts, since it seems no one has been able to get the BIOS mod working.


----------



## Buzzkill

Quote:


> Originally Posted by *PinzaC55*
> 
> 5 X GTX Titans listed in the UK http://www.aria.co.uk/Products?search=GTX+Titan&x=10&y=15


EVGA has 5 Titan models listed.

Titan 06G-P4-2790-KR

Titan SC 06G-P4-2791-KR

Titan SC Signature 06G-P4-2793-KR

Titan SC Hydro Copper 06G-P4-2794-KR

Titan SC Hydro Copper Signature 06G-P4-2795-KR

http://www.evga.com/articles/00729/#TitanHCSignature


----------



## noob.deagle

Quote:


> Originally Posted by *jhager8783*
> 
> 
> 
> I was reading about the new Titan's ability to overclock monitors' refresh rates. I was wondering if it would work with a multiple monitor setup. I am using these 32" 1080p LED TVs with a 60Hz refresh rate (I know, but they were dirt cheap).


Monitor overclocking can already be done on your current cards; it's nothing new. Yes, you can also do it on multi-monitor setups, but there is a chance that not all monitors will reach the same refresh rate. I have mine running at 66Hz; anything higher and the image becomes distorted.

So far the Titan seems promising, mainly due to the higher VRAM; considering that the 'next gen' PS4 has a unified 8GB of RAM, it's possible newer games may use over 2GB even on single displays @ 1080p.
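For anyone curious where that distortion wall comes from: the pixel clock the monitor's electronics must handle scales linearly with refresh rate. A minimal sketch, assuming standard CEA-861 1080p timings (2200 x 1125 total including blanking); real panels, especially ones using reduced-blanking timings, will differ:

```python
# Rough sketch: estimate the pixel clock a refresh-rate overclock demands.
# Assumes standard CEA-861 1080p timings (2200 x 1125 including blanking);
# panels using reduced-blanking modes have lower totals.

H_TOTAL = 2200  # horizontal total (1920 active + blanking)
V_TOTAL = 1125  # vertical total (1080 active + blanking)

def pixel_clock_mhz(refresh_hz: float) -> float:
    """Pixel clock in MHz for a given refresh rate."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

print(pixel_clock_mhz(60))  # 148.5 MHz, the standard 1080p60 pixel clock
print(pixel_clock_mhz(66))  # 163.35 MHz -- the extra ~15 MHz is what the
                            # panel's scaler has to tolerate
```

When the scaler can't keep up with the higher pixel clock, you get exactly the distortion described above, which is why each panel tops out at a different refresh.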


----------



## Qu1ckset

Quote:


> Originally Posted by *noob.deagle*
> 
> monitor overclocking can be done already on your current cards its not anything new. yes you can also do it on multi-monitor setups however there is a chance that not all monitors will reach the same refresh. i have mine running at 66hz anything higher and image becomes distorted.
> 
> so far titan seems promising mainly due to the higher Vram which considering 'next gen' on ps4 has a unified 8gb ram its possible newer games may use over 2gb on single displays @1080p.


I'm hoping I can milk my GTX 690 till Maxwell, and will definitely play the waiting game. If I didn't already have a 690 I'd buy the Titan with its 6GB of VRAM, but I don't see VRAM usage going up that fast.


----------



## Cheesemaster

So, I just purchased three 680 Classifieds. I am going to receive my new 690 and I still have my old 690. I wanted to really enjoy my NVIDIA Surround setup with the 690s, but it fell short on VRAM. They are great cards, but I was ignorant of the demands of high-res gaming... I hope the Classys offer more.


----------



## jhager8783

Quote:


> Originally Posted by *Cheesemaster*
> 
> One of my 690s died and I am going through an RMA process with EVGA... I am seriously considering just getting three Titans and calling it a day. I will run just one of my monitors until I get the Titans. I can't run triple monitors with everything cranked on quad 690s; it was kind of a waste. Wish I knew about the VRAM limitations in 3D Surround... they rock real hard. But for $2k I should be able to crank up all settings and not have to compromise.


I agree completely; I may just go for Titans myself due to the very same issue. I've been kind of wishy-washy on the decision, but then I think, if I get Titans, I'll have to leave all of you wonderful people lol. In all honesty I'm looking to sell my two 690s, my two 2GB 670 FTWs, and my old Rosewill 850x 80 Plus PSU and go for two Titans, then add a third after I get a better mobo. Cheesemaster, guys like us will never be happy; we are addicts to this way of life. AA (anti-aliasing) Anonymous, here we come.


----------



## jhager8783

Does anyone know if it's possible to update the members list photos? The pic still shows my fugly 670 FTW. I'd like to get it updated to my 2x 690s. If not, it's all good; just a question.


----------



## PCModderMike

Quote:


> Originally Posted by *jhager8783*
> 
> Does anyone know if it's possible to update the members list photos. The pic still shows my fugly 670 FTW. I'd like to maybe get it updated to my 2x 690's. If not it's all good, just a question.


Just PM the OP, he responds fast and is very helpful.


----------



## drek

Doesn't look like the Titan is better than the 690 to me, or am I wrong?


----------



## Shiftstealth

Quote:


> Originally Posted by *drek*
> 
> Doesnt look like the titan is better than the 690 to me or am i wrong?


You'll receive the same response as red camp vs. green camp.

Its FPS is, on average... lower.

Some people think it will be faster because there will be no frame latency.

Many have looked into it and think this won't be the case, but based on the reviews they recommended 1 690 over 1 Titan.

Although 2 Titans over 2 690s.

There you go.


----------



## zer0sum

Quote:


> Originally Posted by *iARDAs*
> 
> The Titan shows its power at higher resolutions. Also it can be OCed very well.
> 
> Let us assume that an OCed Titan might be close to a GTX 690. Than people can decide on what to buy.
> 
> But for enthusiast's having 2 Titans will be better than having 2 690s. 4 core is not always optimal.
> 
> Titan really talks to a different audience.
> 
> For example in this thread, some owners will always be happier with a 690, but some owners might feel beneficial upgrading to a Titan. It all depends.


I am definitely happier with my $700 fleabay 690


----------



## drek

Quote:


> Originally Posted by *Shiftstealth*
> 
> You'll receive the same response as red camp vs green camp.
> 
> Its FPS is on average....lower.
> 
> Some people think it will be faster because there will be no frame latency.
> 
> Many have looked into it and think this won't be the case, but based on the reviews they recommended 1 690 over 1 titan.
> 
> Although 2 titans over 2 690's.
> 
> There you go.


Thanks for the info, appreciated!


----------



## TheMadHerbalist

Been looking around for a few days now and haven't found a tutorial for a 690 hard mod. Would it be almost identical to a GTX 680 hard mod? I know I can get some more volts and stay within 60°C. It feels so frustrating knowing the amount of untapped potential these cards have. Take a look at my highest OC: http://www.3dmark.com/fs/190804. 1241 core / 3564 mem on stock volts. Need more volts to do the same on my other 690! Lol


----------



## PinzaC55

Quote:


> Originally Posted by *drek*
> 
> Doesnt look like the titan is better than the 690 to me or am i wrong?


The Titan appears to be the best single-GPU card, but not the best graphics card overall?


----------



## maximus56

Quote:


> Originally Posted by *Shiftstealth*
> 
> You'll receive the same response as red camp vs green camp.
> 
> Its FPS is on average....lower.
> 
> Some people think it will be faster because there will be no frame latency.
> 
> Many have looked into it and think this won't be the case, but based on the reviews they recommended 1 690 over 1 titan.
> 
> Although 2 titans over 2 690's.
> 
> There you go.


I don't buy the frame latency argument for a minute, as I have never experienced microstutter with my 690. I have a couple of Titans on the way, and will be doing some benchmarking and sharing the results.


----------



## Cheesemaster

Well, my three Classifieds arrived today; here is something I could not do with quad 690s...


----------



## gladiator7

Quote:


> Originally Posted by *Cheesemaster*
> 
> well my three classifieds have arrived today here is something I could not do with quad 690...


Nice parting shot to your ex-comrades, Cheesy







Wonder what kind of welcome you will get on the 680 owners thread for this, as you could "ditch" them too and decide to go with 3 7970s.
I guess you won't be getting anywhere close to the quad 690 benchmarks in 3DMark 11.


----------



## mtbiker033

hi guys!

I just ordered an EVGA 690 last night!









Quick question: how does the 690 do in BF3?


----------



## SuprUsrStan

Quote:


> Originally Posted by *TheMadHerbalist*
> 
> I tried the bios as well with no luck either. One of my cards gets a + 196 core & 620 mem, but the actual is 1228 on each GPU,while my other card sits at +144 core with an actual of 1188. Temps sit at around 30-35 under high load, so I want to add more power to them. I'm at the point of going with a hard mod for more volts since it seems noone been able to get the bios mod working.


I guess I'm in a slightly better situation, with the minimum of the cores at 1215. I'm probably going to sell my GTX 690 when the 780 comes out and just pick up two of those. The 680s are able to be voltage unlocked, but it looks like the 690 is hardware voltage locked. :/ Besides, as cool as the GTX 690 is, it's not that special with the original cooler removed and a water block attached. Two GPUs with water blocks look much better, IMO.


----------



## Roikyou

Part of me would like to go to a single card vs. the two 680s, and I would probably go with the 690 for that beefy 690 performance. I'm running a 3014 monitor, so one 690 would be more than enough: less power than the two 680s, a little less performance of course. I would like to lighten up my tower, as the CPU and GPU are water cooled with RX360 and RX240 radiators and a dual-bay pump. One video card, water cooled, stick with just the RX360, remove the RX240: a lighter but still beefy computer... if my justification makes sense (I want to try something different), and I'd have the ability to move my case a little more easily when I'd like to.


----------



## SuprUsrStan

Quote:


> Originally Posted by *PinzaC55*
> 
> The Titan appears to be the best single GPU card but not the best graphics card overall?


Comparing it to the 690 is a little unfair, because it's like saying a GTX Titan is about 15% slower than 2 680s in SLI.

IMO, the Titan is an amazing card, but it's too little too late: 10 months after the GTX 680s, same series, based on the same technology. With the 780s another 8 or 9 months out, it might be better to just sit tight for another handful of months and pick up the 780s when they come out. My guess is that the 780s will be similar in performance to the Titan (maybe 10% slower. MAYBE) but only cost $500. Get 780s in SLI >>> a Titan right now.


----------



## TheMadHerbalist

If it is the case that 780s in SLI are slightly less powerful than a single Titan, wouldn't someone who is a performance freak (assuming deep pockets) be better off running 3 Titans in SLI vs. quad 780s? I'm also going to guess that the Titan can be overvolted a little (only guessing, due to EVGA updating Precision X 4.0 for the Titan release to include an overvolt button).

Edit: nvm, misread last post


----------



## PCModderMike

My 690 block along with some other goodies came in today...this thing is massive!


----------



## MrTOOSHORT

Oh nice. I do love my block too.

Congrats on the gtx690 and block again Mike!


----------



## PCModderMike

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Oh nice. I do love my block too.
> 
> Congrats on the gtx690 and block again Mike!


Thank you sir.







Hoping to push the clocks like yours once it's under water.


----------



## maximus56

Quote:


> Originally Posted by *Roikyou*
> 
> Part of me would like to go to a single card vs the two 680's but would probably go with the 690 for that beefy 690 performance. Running 3014 monitor, so one 690 would be more than enough, less power than the two 680's, little less performance of course. Would like to lighten up my tower as cpu and gpu are water cooled with RX360 and RX240 radiators, dual bay pump. One video card, water cooled, stick with just the RX360, remove the RX240, lighter but beefy computer....if my justification makes sense. (to want to try something different) and have the ability to move my case a little easier when I would like to....


You will be fine with the 690. I have been running 3 120Hz monitors in surround and, except for sub-optimized games like FC3 and Crysis 3, I have never had any issues. I still crank out a minimum average of 55 fps with ultra settings on FC3 in surround. We will see what all the fuss with the Titan is about soon








And one thing is for sure: 680s in tri-SLI don't come anywhere near the quad 690s.


----------



## wermad

Acquired a couple of nice preowned 690s to replace my old quad Fermi setup


----------



## gladiator7

Quote:


> Originally Posted by *wermad*
> 
> Acquired a couple of nice preowned 690s to replace my old quad Fermi setup










If you don't mind me asking, how much did you pay for the pre-owned 690s? Thanks.


----------



## wermad

Quote:


> Originally Posted by *gladiator7*
> 
> 
> 
> 
> 
> 
> 
> 
> . If you don't mind me asking, how much did you pay for the pre owned 690s? Thanks.


Great price. That's all I really want to say


----------



## gladiator7

Quote:


> Originally Posted by *wermad*
> 
> Great price. That's all I really would and want to say


Nice







Enjoy, mate; it seems like you will have a lot of fun, especially once you have picked up your 1200 IPS monitors for surround


----------



## jhager8783

PC Crysis?

I was playing Crysis 3 an hour ago when I had an odd occurrence. While playing, the frame rates were around 45-52 fps; then I remembered that I didn't have EVGA Precision X on in the background. I alt-tabbed and set it to 120% power, 1.38v, and 75MHz over with stock memory clock (temps maxed at 71c with the settings mentioned below), alt-tabbed back into the game, and my fps had dropped by half. So I exited out of the game, restarted Precision X, went back to the game, and the fps still sucked: 12-17 fps. Then I exited everything, tried stock clocks, went for another go, and the fps sucked again.

Restarting the rig and running without Precision X seemed to be the only solution. I'm playing at 55-65 fps without it. So is Precision X the problem? I was playing at all the same settings: 6036x1080, texture and graphics on high with FXAA and v-sync on.

I checked everything I know to check. My guess is that Precision X somehow interferes with the latest 314.xx drivers...? I don't remember having this problem with the 310.xx - 313.xx drivers; then again, I rarely forget to open Precision X before I game. Am I right to assume a driver conflict? If not, then *** is going on?

On a side (and probably unrelated) note, sometimes I notice that when I overclock more than usual, but within safe temperature parameters, my performance drops about 10-20 fps. I think it might be too much draw on the PSU to overclock both the cards and the CPU. The PSU doesn't exactly run hot while gaming, but it's a little warmer than I think it ought to be.

Thanks in advance for any and all input.

*For those that don't look at the rig builder:

Gigabyte Z77X-UD5H - non-WIFI

Quad SLI 690

i7 3770K OC'd to 4.65GHz

1000 watt 80 Plus Gold PSU

16GB HyperX @ 1866MHz

Samsung 128GB SSD

HAF X*


----------



## Qu1ckset

I was playing Crysis 3 @ 1440p, and with 2x SMAA and 16x AA I was getting 55-63 fps with constant dips to 30 fps. I turned the settings down to 4x AA and I was getting 80+ fps with dips down to 65 fps, but not very often... and that's at stock clocks...


----------



## Shiftstealth

Quote:


> Originally Posted by *Qu1ckset*
> 
> I was playing crysis 3 @1440p and with 2x smaa and 16x AA I was getting 55-63 fps with constant dips to 30fps, I turned down the settings to 4x AA and I was getting 80+fps with dips down to 65fps buy not very often.. and that's at stock clocks...


I've noticed the FPS dips are when Psycho's face is on the screen in cutscenes.


----------



## Qu1ckset

Quote:


> Originally Posted by *Shiftstealth*
> 
> I've noticed the FPS dips are when Psycho's face is on the screen in cutscenes.


Ya, that's when it does the big dips for me as well; other than that, my system seems to run it easily


----------



## jcde7ago

Quote:


> Originally Posted by *PCModderMike*
> 
> Thanks for the add!
> Now that Titan has released, kinda feels silly being added just now....but at almost half the price of retail I couldn't pass it up.


Heh, Titan or no Titan, the 690 is still quite a prestigious club, good sir... I've no doubt that if 690 owners wanted a Titan, spending the money wouldn't be the issue








Quote:


> Originally Posted by *PowerK*
> 
> I sold my 690s the other day for Titans.


Good luck with the move, I'm sure you'll enjoy the Titans! Though you are much braver than I, since I would never be able to trade what is essentially 4x 680s for 2x Titans.








Quote:


> Originally Posted by *86JR*
> 
> So I am now officially in this club
> 
> 2500k at 5ghz
> 690 gtx ASUS
> Corsiar 850W non modular
> 
> Hoping the PSU is good enough... any ideas what I can overclock to?
> 
> I am on 1080p so I can max detail without any fps loss


Snap a pic so I can get you added to the list! thumb:
Quote:


> Originally Posted by *mtbiker033*
> 
> hi guys!
> 
> I just ordered an evga 690 last night!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> quick question, how does the 690 do in BF3?


Congrats, be sure to post a pic so I can get you added to the list! And as for BF3...a 690 will absolutely destroy it.








Quote:


> Originally Posted by *Qu1ckset*
> 
> I was playing crysis 3 @1440p and with 2x smaa and 16x AA I was getting 55-63 fps with constant dips to 30fps, I turned down the settings to 4x AA and I was getting 80+fps with dips down to 65fps buy not very often.. and that's at stock clocks...


Lol, the fact that you're averaging essentially 45 FPS with 2x SMAA and 16x AA is insane... but yeah, using vsync, I can max Crysis 3 out at 1440p using 4x MSAA and maintain a constant 60 FPS. 8x MSAA was really, really good as well, but that is where the frame dips were happening for me, due to the VRAM limit; though I would say it was definitely still averaging in the 50s, and Crysis 3 is the only game so far that I can't run at a constant 60 FPS using at least 8x MSAA.

Basically, I have no reason to switch to 2x Titans yet, or maybe even ever... especially not on a SINGLE 1440p monitor.









EDIT: Added *Paz1911* to the list.


----------



## Shiftstealth

Quote:


> Originally Posted by *jcde7ago*
> 
> Heh, Titan or no Titan, the 690 is still quite a prestigious club, good sir....i've no doubt that if 690 owners wanted a Titan, spending the money wouldn't be the issue
> 
> 
> 
> 
> 
> 
> 
> 
> Good luck with the move, i'm sure you'll enjoy the Titans! Though you are much braver than I, since I would never be able to trade what is essentially 4x 680s for 2x Titans.
> 
> 
> 
> 
> 
> 
> 
> 
> Snap a pic so I can get you added to the list! thumb:
> Congrats, be sure to post a pic so I can get you added to the list! And as for BF3...a 690 will absolutely destroy it.
> 
> 
> 
> 
> 
> 
> 
> 
> Lol, the fact that you're averaging essentially 45FPS with 2xSMAA and 16xAA is insane...but yeah, using vsync, i can max Crysis 3 out at 1440p using 4xMSAA and maintain a 60FPS constant - 8xMSAA was really, really good as well, but that is where the frame dips were happening for me, due to the VRAM limit - though I would say it was definitely still averaging in the 50s, and Crysis 3 is the only game so far that I can't run at a 60FPS constant using at least 8xMSAA.
> 
> Basically, I have no reason to switch to 2x Titans yet, or maybe even ever...especially not on a SINGLE 1440p monitor.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Added *Paz1911* to the list.


I turn AA off entirely and it still isn't smooth for me.


----------



## jcde7ago

Quote:


> Originally Posted by *Shiftstealth*
> 
> I turn AA off entirely and it still isnt smooth for me.


Something else must be going on then... other than that, your 3570K *IS* running about 800MHz slower than my 3930K. Not to mention, Crysis 3 is optimized for multi-threaded performance, where the 3930K will definitely beat a 3570K handily...


----------



## maximus56

Quote:


> Originally Posted by *wermad*
> 
> Acquired a couple of nice preowned 690s to replace my old quad Fermi setup


Congrats Wermad! Hope you enjoy your quad 690s as much as I have thoroughly enjoyed mine


----------



## Qu1ckset

Quote:


> Originally Posted by *jcde7ago*
> 
> Lol, the fact that you're averaging essentially 45FPS with 2xSMAA and 16xAA is insane...but yeah, using vsync, i can max Crysis 3 out at 1440p using 4xMSAA and maintain a 60FPS constant - 8xMSAA was really, really good as well, but that is where the frame dips were happening for me, due to the VRAM limit - though I would say it was definitely still averaging in the 50s, and Crysis 3 is the only game so far that I can't run at a 60FPS constant using at least 8xMSAA.
> 
> Basically, I have no reason to switch to 2x Titans yet, or maybe even ever...especially not on a SINGLE 1440p monitor.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Added *Paz1911* to the list.


Ya, I've been more than happy with my purchase. I was going to get a Titan if one performed better than a GTX 690, but that didn't happen, and I'm not about to spend $2000 on two Titans when Maxwell will hopefully be out by late 2014 and make the Titan look stupid in benchmarks..


----------



## fat_italian_stallion

Sad to leave, but it's time to take my name off the list. Sold my cards to a fellow forum member to fund my Titans.


----------



## jcde7ago

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> Sad to leave, but it's time to remove my name off the list. Sold my cards to a fellow forum member to fund my Titans.


Done.


----------



## wermad

Quote:


> Originally Posted by *maximus56*
> 
> Congrats Wermad! Hope you enjoy your quad 690s as much as I have thoroughly enjoyed mine


Thank you








Quote:


> Originally Posted by *fat_italian_stallion*
> 
> Sad to leave, but it's time to remove my name off the list. Sold my cards to a fellow forum member to fund my Titans.


Enjoy em beasty Titans


----------



## maximus56

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> Sad to leave, but it's time to remove my name off the list. Sold my cards to a fellow forum member to fund my Titans.


Good riddance...lol. Just kidding. Enjoy your Titans.
I have 4 Titans on the way for my next build, but I am in no hurry to part company with my beloved 690s.


----------



## PCModderMike

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> Sad to leave, but it's time to remove my name off the list. Sold my cards to a fellow forum member to fund my Titans.


Awesome. Farewell.









Quote:


> Originally Posted by *maximus56*
> 
> Good riddance...lol. Just kidding. Enjoy your Titans.
> I have 4 Titans on the way for my next build, but I am in no hurry to part company with my beloved 690s.


4 Titans.....wow, all for one rig?
Would love to see some pics of that!


----------



## maximus56

Quote:


> Originally Posted by *PCModderMike*
> 
> Awesome. Farewell.
> 
> 
> 
> 
> 
> 
> 
> 
> 4 Titans.....wow, all for one rig?
> Would love to see some pics of that!


You got it


----------



## wermad

FYI, 690 owners: long story short, I ended up with an extra GTX 690 block. If anyone is interested, please send me a PM


----------



## justanoldman

This may be asking a lot, but I am trying to get a reference and can't find a good one online. I am testing out an EVGA 690, and using Heaven 3.0 with max settings (8xAA, 16xAF, extreme tess.) at 1920x1080.

Most scores posted are with a max oc on the card, but they all oc differently. Can anyone give me a Heaven 3.0 result with max settings at 1080p, with no overclock on your 690, just the power target raised?


----------



## shremi

Quote:


> Originally Posted by *justanoldman*
> 
> This may be asking a lot, but I am trying to get a reference and can't find a good one online. I am testing out an EVGA 690, and using Heaven 3.0 with max settings (8xAA, 16xAF, extreme tess.) at 1920x1080.
> 
> Most scores posted are with a max oc on the card, but they all oc differently. Can anyone give me a Heaven 3.0 result with max settings at 1080p, with no overclock on your 690, just the power target raised?


I am not at home, but if I recall right it should be around 85 fps... Let me get home and I'll run a bench for you


----------



## TeamBlue

I had an EVGA 690 lined up before but it fell through; now it's a waiting game with the UPS man. My ASUS 690 is on its way!!!!!


----------



## Rei86

Sold my 690, so jcde7ago you can take my name off.

It was nice being part of the top tier group


----------



## PinzaC55

Quote:


> Originally Posted by *justanoldman*
> 
> This may be asking a lot, but I am trying to get a reference and can't find a good one online. I am testing out an EVGA 690, and using Heaven 3.0 with max settings (8xAA, 16xAF, extreme tess.) at 1920x1080.
> 
> Most scores posted are with a max oc on the card, but they all oc differently. Can anyone give me a Heaven 3.0 result with max settings at 1080p, with no overclock on your 690, just the power target raised?


Ran Unigine Heaven 3 about 3 times on those settings and got 86, 85, and 81 FPS. Top temperature on the 690 was about 60 degrees.

I just bought Unigine Valley Advanced, and on the Extreme 1080 HD preset I got 64 FPS, but the GPUs hit a whopping 80°C!


----------



## gladiator7

Quote:


> Originally Posted by *PinzaC55*
> 
> Ran Unigine Heaven 3 about 3 times on those settings and got 86 , 85, 81 FPS. Top temperature on the 690 was about 60 degrees.
> 
> I just bought Unigine Valley Advanced and on the Extreme 1080 HD Preset I got 64 FPS but the GPU's hit a whopping 80 degrees c!


This seems low. Is this on stock settings only? Temp throttle kicking in, for sure.

Edit: Actually, the first 2 seem fine. Also on Extreme HD settings?


----------



## justanoldman

Quote:


> Originally Posted by *PinzaC55*
> 
> Ran Unigine Heaven 3 about 3 times on those settings and got 86 , 85, 81 FPS. Top temperature on the 690 was about 60 degrees.
> 
> I just bought Unigine Valley Advanced and on the Extreme 1080 HD Preset I got 64 FPS but the GPU's hit a whopping 80 degrees c!


Thanks very much for doing that. Your Heaven 3.0 score seems normal in the 85 fps range, but your Valley 1.0 score seems a little different from what I have seen; I think it should be closer to the high 70s in fps. These cards throttle at 70°C, right? Maybe that affected your score.


----------



## Shiftstealth

Quote:


> Originally Posted by *justanoldman*
> 
> Thanks very much for doing that. Your Heaven 3.0 score seems normal in the 85 fps range, but your Valley 1.0 score seems a little different than what I have seen, I think it should be closer to the high 70s in fps. These cards throttle at 70c right? Maybe that affected your score.


They throttle their boost clocks, like 13-26 MHz lower.
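To put numbers on that: GPU Boost on these cards moves clocks in discrete bins of roughly 13 MHz, so "13-26 MHz lower" is one or two bins dropped. A trivial sketch; the ~13 MHz step and the 690's 1019 MHz rated boost are the commonly cited figures, and the exact bin size varies card to card:

```python
# Sketch: GPU Boost drops clocks in discrete ~13 MHz bins when the card
# crosses its thermal threshold, rather than scaling smoothly.
BIN_MHZ = 13  # approximate bin size; varies slightly by card

def throttled_clock(boost_mhz: int, bins_dropped: int) -> int:
    """Effective clock after dropping a number of boost bins."""
    return boost_mhz - bins_dropped * BIN_MHZ

print(throttled_clock(1019, 1))  # 1006 -- one bin below the 690's rated boost
print(throttled_clock(1019, 2))  # 993  -- two bins, the "26 MHz lower" case
```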


----------



## TheMadHerbalist

Quote:


> Originally Posted by *PinzaC55*
> 
> Ran Unigine Heaven 3 about 3 times on those settings and got 86 , 85, 81 FPS. Top temperature on the 690 was about 60 degrees.
> 
> I just bought Unigine Valley Advanced and on the Extreme 1080 HD Preset I got 64 FPS but the GPU's hit a whopping 80 degrees c!


Were you running stock on the Unigine Valley? That score strikes me as a tad low.


----------



## PinzaC55

Quote:


> Originally Posted by *TheMadHerbalist*
> 
> where you running stock on the Unigine Valley? That score strikes me as a tad low.


Everything stock on both Heaven and Valley. The results on Heaven seem fine for a single GTX 690.


----------



## TheMadHerbalist

I'll have to do a stock run, since all I remember is that my best run with a single was 97.1.
Maybe the CPU plays a larger role?


----------



## gladiator7

Quote:


> Originally Posted by *TheMadHerbalist*
> 
> I'll have to do a stock run, since I only remember my best run with a single was a 97.1.
> Maybe CPU plays a larger role?


The CPU might play a part with quads, but I'm not sure how much for a single card if you are going from 4.5 to, say, 5 GHz in the Valley benchmark.
Yours looks really good... are you using K-Boost to keep constant boost clocks throughout the bench run?


----------



## justanoldman

Posted some of this in the Valley thread but here is what I got:

It is interesting that the CPU overclock makes much more of a difference in Valley 1.0 than it does in Heaven. In Valley 1.0 I get the following with a 3770k at 3.9 and 5.0, and a GTX 690:
CPU stock, GPU stock = 78.1
CPU o.c., GPU stock = 80.2
CPU stock, GPU o.c. = 86.7
CPU o.c., GPU o.c. = 91.1

In Heaven 3.0, max settings at 1920x1080 I got:
CPU stock, GPU stock = 88.3
CPU o.c., GPU stock = 89.5
CPU stock, GPU o.c. = 99.1
CPU o.c., GPU o.c. = 100.6
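A quick way to read those numbers is as percentage gains over the stock/stock baseline (a trivial sketch using the scores above):

```python
# Percentage gains implied by the Valley/Heaven runs listed above.
def gain_pct(base: float, oc: float) -> float:
    """Percent improvement of `oc` over `base`, to one decimal place."""
    return round((oc / base - 1) * 100, 1)

valley = {"cpu_oc": gain_pct(78.1, 80.2),   # CPU OC alone
          "gpu_oc": gain_pct(78.1, 86.7),   # GPU OC alone
          "both":   gain_pct(78.1, 91.1)}   # both overclocked
heaven = {"cpu_oc": gain_pct(88.3, 89.5),
          "gpu_oc": gain_pct(88.3, 99.1),
          "both":   gain_pct(88.3, 100.6)}

print(valley)  # {'cpu_oc': 2.7, 'gpu_oc': 11.0, 'both': 16.6}
print(heaven)  # {'cpu_oc': 1.4, 'gpu_oc': 12.2, 'both': 13.9}
```

The CPU overclock is worth roughly 2.7% in Valley vs. 1.4% in Heaven, which backs up the observation that Valley leans on the CPU more.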


----------



## gladiator7

Quote:


> Originally Posted by *justanoldman*
> 
> Posted some of this in the Valley thread but here is what I got:
> 
> It is interesting that the CPU overclock makes much more of a difference in Valley 1.0 than it does in Heaven. In Valley 1.0 I get the following with a 3770k at 3.9 and 5.0, and a GTX 690:
> CPU stock, GPU stock = 78.1
> CPU o.c., GPU stock = 80.2
> CPU stock, GPU o.c. = 86.7
> CPU o.c., GPU o.c. = 91.1
> 
> In Heaven 3.0, max settings at 1920x1080 I got:
> CPU stock, GPU stock = 88.3
> CPU o.c., GPU stock = 89.5
> CPU stock, GPU o.c. = 99.1
> CPU o.c., GPU o.c. = 100.6


Yeah, Heaven 3.0 is less demanding...what about Heaven 4.0?


----------



## Sujeto 1

The problem with the GTX 690 is the VRAM: since it can't use its full 4GB, what sense is there in buying one right now, when next year's games will demand even more RAM? In this case I'd rather buy two 4GB GTX 680s than 1 GTX 690, hoping not to see so much microstuttering. Or 1 Titan, hoping it can get its performance near the GTX 690 through OC.


----------



## Shiftstealth

Quote:


> Originally Posted by *Sujeto 1*
> 
> Problem with GTX 690 is the thing about VRAM, since it cant use his 4GB what sense is to buy one rigthnow when next year games will demand even more RAM. on this case i rather buy two gtx 680 4GB than 1 GTX 690 hoping to dont see so much microstutering. Or 1 Titan hoping it can increase his perfomance trough OC near to GTX 690.


The only game I have any issues with is Crysis 3, and even then it's only in SP.


----------



## gladiator7

Quote:


> Originally Posted by *Sujeto 1*
> 
> Problem with GTX 690 is the thing about VRAM, since it cant use his 4GB what sense is to buy one rigthnow when next year games will demand even more RAM. on this case i rather buy two gtx 680 4GB than 1 GTX 690 hoping to dont see so much microstutering. Or 1 Titan hoping it can increase his perfomance trough OC near to GTX 690.


Sujeto - either you are schizophrenic or a troublemaker... you know all the arguments for and against, as you have been arguing all over the forums in favor of a 690 and disagreeing with all this VRAM nonsense... come on now... don't kid the kidder


----------



## justanoldman

Quote:


> Originally Posted by *gladiator7*
> 
> Yeah, Heaven 3.0 is less demanding...what about Heaven 4.0?


Haven't really used 4.0 much since I was comparing results with other setups, but I just ran it at max settings, full screen, 1920x1080, with the CPU overclocked. I got 67.8 with the GPU at stock and 76.3 with the GPU overclocked.


----------



## Sujeto 1

Quote:


> Originally Posted by *gladiator7*
> 
> Sujeto - either you're schizophrenic or a troublemaker... you know all the arguments for and against, as you've been making them all over the forums in favor of a 690 and dismissing all this VRAM nonsense... c'mon now, don't kid a kidder.


Gladiator, stop it man, what's wrong with you? Leave me alone and ignore my posts. Peace.
I had to preorder a Titan because I really have no option with this GTX 690 and its VRAM issues; otherwise I would gladly buy one. When I say I can't buy another PC for YEARS, I'm serious. I can expand it or add cards, but not build a whole new PC.

Btw, all my back-and-forth about the GTX 690 was because I wasn't sure; after reading many posts I changed my mind for the reasons stated. You can't have 2 or 3 GTX 690s, but you can have 2 or 3 GTX TITANs without risk of issues.

The problem is that in the USA people are used to switching their cars, their video cards, and their women every year. I can't; I've spent about three years saving for a new PC, and there won't be another one for years. I'm screwed.


----------



## TheMadHerbalist

Quote:


> Originally Posted by *gladiator7*
> 
> CPU might play a part with quads, but I'm not sure how much for a single card if you are going from 4.5 to, say, 5GHz in the Valley benchmark.
> Yours look really good... are you using KBoost to keep constant boost clocks throughout the bench run?


Yeah, I like it when I'm benching; it lets me see when the card is reaching its peak. The worst GPU clocks at about 1188; the best one went to 1248 and might go higher, but the CPU got too hot running at 5.0GHz, so I stopped. Lol, in 3DMark 11 I can push each GPU a little harder, but the Unigine stuff seems not to like my FutureMark clock speeds. What I find strange is that my core clock hits a wall, but I can still pump 700+ on the memory. Maybe I can set the memory down from stock to get an extra 13MHz.


----------



## wermad

Quote:


> Originally Posted by *Sujeto 1*
> 
> Gladiator, stop it man, what's wrong with you? Leave me alone and ignore my posts. Peace.
> I had to preorder a Titan because I really have no option with this GTX 690 and its VRAM issues; otherwise I would gladly buy one. When I say I can't buy another PC for YEARS, I'm serious. I can expand it or add cards, but not build a whole new PC.
> 
> Btw, all my back-and-forth about the GTX 690 was because I wasn't sure; after reading many posts I changed my mind for the reasons stated. You can't have 2 or 3 GTX 690s, but you can have 2 or 3 GTX TITANs without risk of issues.
> 
> The problem is that in the USA people are used to switching their cars, their video cards, and their women every year. I can't; I've spent about three years saving for a new PC, and there won't be another one for years. I'm screwed.


What platform are you planning to run your Titan on? Just curious


----------



## worms14

Will using MX-4 on a Gigabyte GTX 690 give me better temps than the stock compound?


----------



## Sujeto 1

Quote:


> Originally Posted by *wermad*
> 
> What platform are you planning to run your Titan on? Just curious


Obviously not my current sig. It will be a 3930K + 32GB of 2133MHz RAM + a Corsair 1200 Platinum; it should be enough for several years.


----------



## PinzaC55

Quote:


> Originally Posted by *TheMadHerbalist*
> 
> I'll have to do a stock run, since I only remember my best run with a single was a 97.1.
> Maybe CPU plays a larger role?


I love a puzzle and this has me stumped. I know it cannot be the fault of my hardware so it has to be either in my Settings in Nvidia Control Panel or the Valley benchmark although it could possibly be drivers. I plan to work systematically through it altering settings tomorrow till I find the right combo.


----------



## mtbiker033

I just got an evga 690 today, got it installed with 314.07 and did a stock run of 3dmark11:

http://www.3dmark.com/3dm11/6037651

P13,908

does that score seem right for stock?

thought I would add some pics


----------



## gladiator7

Quote:


> Originally Posted by *TheMadHerbalist*
> 
> Yeah, I like it when I'm benching; it lets me see when the card is reaching its peak. The worst GPU clocks at about 1188; the best one went to 1248 and might go higher, but the CPU got too hot running at 5.0GHz, so I stopped. Lol, in 3DMark 11 I can push each GPU a little harder, but the Unigine stuff seems not to like my FutureMark clock speeds. What I find strange is that my core clock hits a wall, but I can still pump 700+ on the memory. Maybe I can set the memory down from stock to get an extra 13MHz.


Yeah, FM is more forgiving on core overclock than Heaven. I can only do a boost clock of 135 in Precision, and I have to scale the memory overclock back to around 350 there. But I get more mileage in Heaven by setting the memory boost to 750 and lowering my core overclock to 129.
However, KBoost is broken for me: whenever I enable it, the stupid thing forcibly downclocks the core on one GPU while maintaining max constant boost on the other, where it's supposed to be. No matter what I tried, nothing fixed it. It seems something got hung up in a system DLL when I once installed a driver over KBoost while it was open and running. I miss that extra juice in benches from KBoost, so I might end up reinstalling my OS to fix this stupid issue.

To all the kids out there, here's a lesson: always close Precision, then install your drivers, then reboot once to the desktop before running Precision again, especially when KBoost is enabled.


----------



## gladiator7

Quote:


> Originally Posted by *worms14*
> 
> Will using MX-4 on a Gigabyte GTX 690 give me better temps than the stock compound?


Not sure how much repasting the TIM would do, but it can't hurt. Putting them under water is the sure way to keep your temps down; I never go over 35°C, maybe 40°C, on mine no matter how hard I push it.


----------



## TheMadHerbalist

Quote:


> Originally Posted by *PinzaC55*
> 
> I love a puzzle and this has me stumped. I know it cannot be the fault of my hardware so it has to be either in my Settings in Nvidia Control Panel or the Valley benchmark although it could possibly be drivers. I plan to work systematically through it altering settings tomorrow till I find the right combo.


that was with a 1202 core oc and pushing my 3930 to 5.0 GHz I think. Will do a stock run when I get home from work.

Quote:


> Originally Posted by *gladiator7*
> 
> Yeah, FM is more forgiving on core overclock than Heaven. I can only do a boost clock of 135 in Precision, and I have to scale the memory overclock back to around 350 there. But I get more mileage in Heaven by setting the memory boost to 750 and lowering my core overclock to 129.
> However, KBoost is broken for me: whenever I enable it, the stupid thing forcibly downclocks the core on one GPU while maintaining max constant boost on the other, where it's supposed to be. No matter what I tried, nothing fixed it. It seems something got hung up in a system DLL when I once installed a driver over KBoost while it was open and running. I miss that extra juice in benches from KBoost, so I might end up reinstalling my OS to fix this stupid issue.
> 
> To all the kids out there, here's a lesson: always close Precision, then install your drivers, then reboot once to the desktop before running Precision again, especially when KBoost is enabled.


Disable SLI and try it again. Once KBoost is off on both GPUs, try enabling it again. I had the same issue once; it turned out I had enabled it without turning off SLI. Hope it helps.


----------



## gladiator7

Quote:


> Originally Posted by *TheMadHerbalist*
> 
> that was with a 1202 core oc and pushing my 3930 to 5.0 GHz I think. Will do a stock run when I get home from work.
> Disable SLI and try it again. Once KBoost is off on both GPUs, try enabling it again. I had the same issue once; it turned out I had enabled it without turning off SLI. Hope it helps.


Thanks, but I have tried everything under the sun, trust me, lol. I have had my card since July of last year, and this started happening when I messed up my driver install, or maybe it was running AB and Precision together, which I don't do anymore. Who knows...


----------



## PCModderMike

Quote:


> Originally Posted by *mtbiker033*
> 
> I just got an evga 690 today, got it installed with 314.07 and did a stock run of 3dmark11:
> 
> http://www.3dmark.com/3dm11/6037651
> 
> P13,908
> 
> does that score seem right for stock?
> 
> thought I would add some pics


Sexy


----------



## TheMadHerbalist

Heaven 4.0 at extreme everything and 1080p I got 63.1 at stock.
Valley 1.0 at extreme I got 75.2.
Heaven 4.0 is way tougher than 3.0.


----------



## gladiator7

Quote:


> Originally Posted by *TheMadHerbalist*
> 
> Heaven 4.0 at extreme everything and 1080p I got 63.1 at stock.
> Valley 1.0 at extreme I got 75.2.
> Heaven 4.0 is way tougher than 3.0.


Agreed. 4.0 is tough, but do you think it is scaling well right now with the quads though, given the drivers and all?


----------



## KrynnTech

I would like to join!




Let me know if you need any more information... Thanks! Krynn


----------



## jhager8783

Hooked up my two 670 FTWs for old time's sake and, lo and behold, Crysis 3 plays much, much better on a single-monitor setup than on my 690. I am getting 60fps on the 670s with all settings maxed, that is TXAA 4x and v-sync on with everything at Very High at 1920x1080. Neither of my 690s, in quad or solo, could even pull 30fps at those settings.

Multi-monitor is a different story altogether: Very High settings with FXAA at 6066x1080 on one 690 gives me around 60fps in game and 28-30fps in cut-scenes. Two 690s are as bad as the 670s with the same settings, around 38fps in game and 17fps in cut-scenes.

Far Cry 3 is the same situation: the 670s outperform both 690 quad and 690 SLI on a single monitor. Is there something wrong with my 690s, or is single-monitor gaming just not their strong suit?


----------



## jhager8783

Forgot to mention that the 670s were run at stock clocks, showing 65-70% GPU usage and never getting hotter than 50 degrees Celsius, whereas the 690(s) were overclocked and ran up to 90% GPU usage at 74 degrees.


----------



## jcde7ago

Quote:


> Originally Posted by *jhager8783*
> 
> Hooked up my two 670 FTWs for old time's sake and, lo and behold, Crysis 3 plays much, much better on a single-monitor setup than on my 690. I am getting 60fps on the 670s with all settings maxed, that is TXAA 4x and v-sync on with everything at Very High at 1920x1080. Neither of my 690s, in quad or solo, could even pull 30fps at those settings.
> 
> Multi-monitor is a different story altogether: Very High settings with FXAA at 6066x1080 on one 690 gives me around 60fps in game and 28-30fps in cut-scenes. Two 690s are as bad as the 670s with the same settings, around 38fps in game and 17fps in cut-scenes.
> 
> Far Cry 3 is the same situation: the 670s outperform both 690 quad and 690 SLI on a single monitor. Is there something wrong with my 690s, or is single-monitor gaming just not their strong suit?


Something is not adding up, because I max out Crysis 3 at 60FPS on my single 2560x1440 monitor - Very High, vsync, and 4xMSAA - no problem. If neither of your 690s can max out a single 1920x1080 monitor (which a single 680 can actually come pretty close to doing), then either they're both bad or, like I said, something is up with your setup...

In fact, the GTX 690 is hands-down the single best-performing card, even when pitted against the Titan, for single-monitor setups from 1080p to 1600p - just look at all the benches. The Titan shines in multi-monitor/surround/3D setups, and on those rare 1440p monitors that can reach 120Hz.
Quote:


> Originally Posted by *KrynnTech*
> 
> I would like to Join!
> 
> 
> 
> 
> Let me know if you need any more information... Thanks! Krynn


Welcome, and grats on the quad SLI set up there! I will get you added right away.


----------



## GTX 690 SLI

Hi guys,
Can someone please tell me why, during benchmarking, one card runs at high load while the other sits at minimum or doesn't work at all? I'm new to all this, so please explain if you can.
One more question: why does the green line in settings not stretch over both cards, only one?
Thanks.


----------



## jhager8783

Quote:


> Originally Posted by *jcde7ago*
> 
> Something is not adding up, because I max out Crysis 3 at 60FPS on my single 2560x1440 monitor - Very High, vsync, and 4xMSAA - no problem. If neither of your 690s can max out a single 1920x1080 monitor (which a single 680 can actually come pretty close to doing), then either they're both bad or, like I said, something is up with your setup...
> 
> In fact, the GTX 690 is hands-down the single best-performing card, even when pitted against the Titan, for single-monitor setups from 1080p to 1600p - just look at all the benches. The Titan shines in multi-monitor/surround/3D setups, and on those rare 1440p monitors that can reach 120Hz.


Well, it must be something wrong with my setup. I don't see how, though. I'm not an idiot when it comes to rigs, but I can concede that I may have f***ed something up somewhere along the line; after all, I'm only human.

I always use Driver Sweeper to clean, then install the latest Nvidia drivers and load my quad or SLI profiles. As far as hardware is concerned, my PSU is testing fine and should be just enough to OC a quad setup without problems. I don't have the latest and greatest mobo and CPU, but they should be sufficient. My CPU isn't exactly a sloth, being a 3770k at 4.65GHz, and with the Gigabyte Z77X-UD5H, one 690 runs at 16x on my mobo and two at 8x, same as the 670s obviously. Quad SLI sucks at most games as we all know, especially at 8x/8x, but one 690 should blow away my 670s. I am completely aware of this, but it is not happening. One 690 is louder, hotter, and slower than my two 670s; two just multiplies the problem. They must both be bad, unless anybody can think of a different reason.


----------



## jhager8783

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Hi guys,
> Can someone please tell me why, during benchmarking, one card runs at high load while the other sits at minimum or doesn't work at all? I'm new to all this, so please explain if you can.
> One more question: why does the green line in settings not stretch over both cards, only one?
> Thanks.


Sometimes I've had to click the "update sli profile" if I've made recent changes, or "disable SLI" then re-enable in order to get quad sli to work. Also, in order to get quad to work, I've had to flip my SLI cable(basically putting it on upside down). If these don't work, try to use driver sweeper to uninstall the drivers and re-install the latest 314.xx driver from Nvidia.


----------



## GTX 690 SLI

Could someone please post a screenshot where I can see this green line stretching over both cards, or can't it be done?


----------



## jhager8783

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Could someone please post a screenshot where I can see this green line stretching over both cards, or can't it be done?


I don't know why you think it'll help, but here you go.


----------



## jhager8783

So I tweaked the SLI profile for a solo card and got it working better; one card is now performing better than my 670 FTWs. As far as quad SLI is concerned, it's the weirdest thing: I swapped PCI-E slots and everything is grand. Very odd indeed. I think my mobo might be jacked up.


----------



## GTX 690 SLI

Thanks for the pics, but do I need to plug in one more monitor to have them connected or not? Please help if you can, or maybe my SLI cable is out of order.


----------



## PinzaC55

Quote:


> Originally Posted by *jhager8783*
> 
> So I tweaked the SLI profile for a solo card and got it working better; one card is now performing better than my 670 FTWs. As far as quad SLI is concerned, it's the weirdest thing: I swapped PCI-E slots and everything is grand. Very odd indeed. I think my mobo might be jacked up.


When I got my mobo and assembled my rig it refused to boot up, just getting as far as the Windows logo then freezing. By a process of trial and error I found that my LAN card could only live in slot 6 and my sound card could only live in slot 7! Reversing them caused the problem. I guess computer technology is so complex things like this just happen for no apparent reason.


----------



## gladiator7

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Hi guys,
> Can someone please tell me why, during benchmarking, one card runs at high load while the other sits at minimum or doesn't work at all? I'm new to all this, so please explain if you can.
> One more question: why does the green line in settings not stretch over both cards, only one?
> Thanks.


If it's a single monitor, plug it into the 2nd DVI port (2nd from the left) on the top card with a DVI cable. If it's a 120Hz monitor, you need a dual-link DVI cable. Do this while the computer is shut down, then reboot. Do all the troubleshooting with only one monitor connected.
Turn off all monitoring programs and only run Precision; you should only be running one monitoring program at a time.
If KBoost is activated, disable it and follow all on-screen instructions. Do a driver reinstall and pick the custom clean install option. Precision should be closed during the install. Reboot, open Precision again, and keep KBoost off. Tell us if this works.
One card should be in the first slot and the other in the 4th slot (this is an example for the RIVE). Make sure the SLI bridge is connected properly and the mobo BIOS is up to date. Under PCIe, choose Gen 2 for all and reboot, unless you're on an Ivy board. Also, do not use the PCIe 3 patch with the new drivers; if you have it, uninstall it immediately unless you are on an Ivy mobo that supports PCIe 3 (in which case you don't need the patch anyway).
Double check that both 8-pin power connectors are firmly secured on each card and connected to your PSU. Lastly, you may need to try another PSU if all else fails.
Once done, if using 2 monitors, the 2nd one goes into the first connector (from the left) on the top card. If using 3 monitors, the third one goes into the 2nd connector (from the left) on the bottom card. Again, use the proper DVI or dual-link DVI cables supplied with your monitor, especially if dual-link.
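If you want to sanity-check that both GPUs in the 690 are actually loaded during a run, one option is to parse the utilization output of `nvidia-smi` (installed with the driver). This is just a rough sketch; the sample output below is made up for illustration, not captured from a real card:

```python
# Sketch: flag GPUs that sit idle while others are loaded, by parsing the
# CSV output of:
#   nvidia-smi --query-gpu=index,utilization.gpu --format=csv,noheader,nounits
def idle_gpus(csv_output, threshold=10):
    """Return indices of GPUs whose utilization is below `threshold` percent."""
    idle = []
    for line in csv_output.strip().splitlines():
        index, util = (field.strip() for field in line.split(","))
        if int(util) < threshold:
            idle.append(int(index))
    return idle

# Hypothetical quad-SLI run where two GPUs are barely loaded:
sample = """0, 98
1, 97
2, 2
3, 1"""
print(idle_gpus(sample))  # -> [2, 3]
```

In a healthy quad-SLI benchmark run, all four GPUs should show high utilization; any index this flags is a card (or SLI profile) worth investigating.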


----------



## Shiftstealth

Quote:


> Originally Posted by *Sujeto 1*
> 
> Gladiator, stop it man, what's wrong with you? Leave me alone and ignore my posts. Peace.
> I had to preorder a Titan because I really have no option with this GTX 690 and its VRAM issues; otherwise I would gladly buy one. When I say I can't buy another PC for YEARS, I'm serious. I can expand it or add cards, but not build a whole new PC.
> 
> Btw, all my back-and-forth about the GTX 690 was because I wasn't sure; after reading many posts I changed my mind for the reasons stated. You can't have 2 or 3 GTX 690s, but you can have 2 or 3 GTX TITANs without risk of issues.
> 
> The problem is that in the USA people are used to switching their cars, their video cards, and their women every year. I can't; I've spent about three years saving for a new PC, and there won't be another one for years. I'm screwed.


You are posting in a GTX 690 thread about buying a Titan. You should expect this.


----------



## GTX 690 SLI

Quote:


> Originally Posted by *gladiator7*
> 
> If it's a single monitor, plug it into the 2nd DVI port (2nd from the left) on the top card with a DVI cable. If it's a 120Hz monitor, you need a dual-link DVI cable. Do this while the computer is shut down, then reboot. Do all the troubleshooting with only one monitor connected.
> Turn off all monitoring programs and only run Precision; you should only be running one monitoring program at a time.
> If KBoost is activated, disable it and follow all on-screen instructions. Do a driver reinstall and pick the custom clean install option. Precision should be closed during the install. Reboot, open Precision again, and keep KBoost off. Tell us if this works.
> One card should be in the first slot and the other in the 4th slot (this is an example for the RIVE). Make sure the SLI bridge is connected properly and the mobo BIOS is up to date. Under PCIe, choose Gen 2 for all and reboot, unless you're on an Ivy board. Also, do not use the PCIe 3 patch with the new drivers; if you have it, uninstall it immediately unless you are on an Ivy mobo that supports PCIe 3 (in which case you don't need the patch anyway).
> Double check that both 8-pin power connectors are firmly secured on each card and connected to your PSU. Lastly, you may need to try another PSU if all else fails.
> Once done, if using 2 monitors, the 2nd one goes into the first connector (from the left) on the top card. If using 3 monitors, the third one goes into the 2nd connector (from the left) on the bottom card. Again, use the proper DVI or dual-link DVI cables supplied with your monitor, especially if dual-link.


Thanks for these valuable instructions.
I managed to reconnect the monitor, and I even bought a new SLI cable thinking maybe that was the problem, but it's still the same.
The steps starting from "Turn off all monitoring programs and only run Precision..." - I don't understand them, sorry.
Please explain those steps if you can; I am really a beginner, with a big heart.
Anyway, BIG thanks and respect for your explanation, and I will keep working to put these beasts into 100% mode.


----------



## gladiator7

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Thanks for these valuable instructions.
> I managed to reconnect the monitor, and I even bought a new SLI cable thinking maybe that was the problem, but it's still the same.
> The steps starting from "Turn off all monitoring programs and only run Precision..." - I don't understand them, sorry.
> Please explain those steps if you can; I am really a beginner, with a big heart.
> Anyway, BIG thanks and respect for your explanation, and I will keep working to put these beasts into 100% mode.


Meaning stop Fraps, AfterBurner, Nvidia Inspector, etc.
No problem... we were all beginners once.

Great setup with the quads, by the way.


----------



## GTX 690 SLI

my PC


----------



## gladiator7

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Thanks for these valuable instructions.
> I managed to reconnect the monitor, and I even bought a new SLI cable thinking maybe that was the problem, but it's still the same.
> The steps starting from "Turn off all monitoring programs and only run Precision..." - I don't understand them, sorry.
> Please explain those steps if you can; I am really a beginner, with a big heart.
> Anyway, BIG thanks and respect for your explanation, and I will keep working to put these beasts into 100% mode.


By the way, do you see all 4 GPUs in Precision? I think you should be able to. If you do, we can go on to the next steps of turning off KBoost, reinstalling the driver, etc.


----------



## GTX 690 SLI

YES, I can see all the GPUs.
I have already reinstalled the drivers.


----------



## gladiator7

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> YES, I can see all the GPUs.
> I have already reinstalled the drivers.


At idle there will always be dynamic boost clocking, provided KBoost is disabled. Run a Valley 1.0 benchmark, keep Precision closed, and see if all GPUs boost in sync. Speaking of which, make sure sync is checked in Precision. If all GPUs boost together in the Valley benchmark, you are good to go. Different boost clocks at idle are normal.


----------



## GTX 690 SLI

I synced in Precision, closed Precision, and then started the Heaven benchmark, and I get this!!


----------



## Buzzkill

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Could someone please post a screenshot where I can see this green line stretching over both cards, or can't it be done?




I have the 314.07 and Quad SLi with all showing in Precision 4.


----------



## gladiator7

Quote:


> Originally Posted by *Buzzkill*
> 
> 
> 
> I have the 314.07 and Quad SLi with all showing in Precision 4.


Yes, this is what it should look like.
GTX 690 SLI - please triple-check all your power connectors and report back.


----------



## gladiator7

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> I synced in Precision, closed Precision, and then started the Heaven benchmark, and I get this!!


This sometimes happens with KBoost and installing a driver while Precision is open. Do a clean install of the drivers, and just to be safe, set Precision to defaults and uninstall it prior to the clean driver uninstall and reinstall.
Edit: Also, are you using a socket 2011 mobo? If so, change Gen 3 to Gen 2 in your mobo's UEFI (hit Del for the RIVE, or something similar for your mobo, to get into the BIOS at startup) and see what happens. It could also be an improper monitor connection, as Buzzkill showed above, but then you should see a blacked-out GPU or two in NVCP. Still, double-check your monitor connectors to make sure they are in the right connections. HDMI occasionally has issues with the adapter and getting recognized on the 690, but this is rare... probably only happened to me, lol.
Last but not least, check each card separately to ensure you don't have a faulty card. I doubt that's the case, however.


----------



## wholeeo

Most likely will join the 690 club by the end of this week. My only question is Asus or EVGA?


----------



## PCModderMike

EVGA, anything goes wrong, they're much easier to deal with in regards to an RMA.


----------



## Shiftstealth

Quote:


> Originally Posted by *wholeeo*
> 
> Most likely will join the 690 club by the end of this week. My only question is Asus or EVGA?


EVGA was sold out last night on the Egg, and now they are back in stock. I always buy EVGA when available. How can you not?


----------



## GTX 690 SLI

These are my settings.

OK, now I will uninstall EVGA Precision, delete the drivers, and reinstall them.


----------



## gladiator7

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> These are my settings.


OK. You should not have any PCIe 3 issues; we can eliminate that from the troubleshooting steps.
Overclock that 3770k to at least 4.5GHz... lol (not related to the troubleshooting).


----------



## gladiator7

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> These are my settings.
> 
> OK, now I will uninstall EVGA Precision, delete the drivers, and reinstall them.


After doing all of the steps above, give us an F12 screenshot of your Valley benchmark. Again: set your boost clocks in Precision, don't enable KBoost, keep sync mode on, choose start with Windows, reboot, and don't have any OSD monitoring enabled in Precision.
And then run Valley.


----------



## GTX 690 SLI

Same as before, ufff!!??

Only one card is running.


----------



## gladiator7

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Same as before, ufff!!??
> 
> Only one card is running.


This is not a quad score. Please test each 690 separately. Also, check your power connectors, and please make sure you are using the correct PCIe slots as per your mobo manual.


----------



## GTX 690 SLI

This is weird. I changed the monitor and shifted to the other card, and that one works OK, but in the benchmark it's like the second and third processors are running and the first and last are doing nothing???


----------



## gladiator7

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> This is weird. I changed the monitor and shifted to the other card, and that one works OK, but in the benchmark it's like the second and third processors are running and the first and last are doing nothing???


The green bar should be across all 4 GPUs; once you get that, you will be back to normal.
Also, please tell us which monitor connector on the card you are using, and what type of connector it is.


----------



## GTX 690 SLI

I'm using the DVI port. OK, I will try to reinstall the OS, but do I need to unplug the cards? I hope not, because of the water cooling, uff!!


----------



## Shiftstealth

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Hi guys,
> Can someone please tell me why, during benchmarking, one card runs at high load while the other sits at minimum or doesn't work at all? I'm new to all this, so please explain if you can.
> One more question: why does the green line in settings not stretch over both cards, only one?
> Thanks.


Are you using an SLi cable on the 2 cards?


----------



## GTX 690 SLI

Yes, I am using an SLI cable. At first I thought the SLI cable was the problem, so I tried twisting it and even bought a new one, but nothing. So I think I need to reinstall the OS, and maybe that will fix everything. The night is long and I can stay awake, ufff.


----------



## Shiftstealth

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Yes, I am using an SLI cable. At first I thought the SLI cable was the problem, so I tried twisting it and even bought a new one, but nothing. So I think I need to reinstall the OS, and maybe that will fix everything. The night is long and I can stay awake, ufff.


Perhaps. It's almost like you have 2 SLI profiles running, because each card runs 2 chips. Have you tried Driver Sweeper?


----------



## gladiator7

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Yes, I am using an SLI cable. At first I thought the SLI cable was the problem, so I tried twisting it and even bought a new one, but nothing. So I think I need to reinstall the OS, and maybe that will fix everything. The night is long and I can stay awake, ufff.


Try that. I know it's a pain in the rear, but it may be necessary at this point. Good luck.


----------



## Shiftstealth

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> These are my settings.
> 
> OK, now I will uninstall EVGA Precision, delete the drivers, and reinstall them.


Wait, what motherboard do you have?


----------



## Buzzkill

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Yes, I am using an SLI cable. At first I thought the SLI cable was the problem, so I tried twisting it and even bought a new one, but nothing. So I think I need to reinstall the OS, and maybe that will fix everything. The night is long and I can stay awake, ufff.


I would reinstall driver 314.07 and check the clean install box. First, stop Precision, or uninstall and reinstall Precision after the clean install of the driver. Also, unplug all monitors and use only one monitor at first to get quad SLI enabled, then plug in the rest of the monitors and change to multi-monitor mode.

GeForce how to set up SLI http://www.geforce.com/hardware/technology/sli/system-requirements

Pictures of Quad & Tri SLI.


----------



## Shiftstealth

Quote:


> Originally Posted by *Buzzkill*
> 
> I would reinstall driver 314.07 and check the clean install box. First, stop Precision, or uninstall and reinstall Precision after the clean install of the driver. Also, unplug all monitors and use only one monitor at first to get quad SLI enabled, then plug in the rest of the monitors and change to multi-monitor mode.


Meh, I'm not sure; I think he has a mobo that doesn't support SLI, only CrossFire.
My thoughts, anyway.


----------



## Buzzkill

Quote:


> Originally Posted by *Shiftstealth*
> 
> Meh, I'm not sure; I think he has a mobo that doesn't support SLI, only CrossFire.
> My thoughts, anyway.


http://www.geforce.com/hardware/technology/sli/motherboards


----------



## Buzzkill

Here is a thread about the same problem. Maybe this would help you GTX 690 SLI.
Problem with Quad Gtx 690 SLI !! Experts help me !


----------



## maximus56

Quote:


> Originally Posted by *Buzzkill*
> 
> I would reinstall driver 314.07 and check the clean install box. First, stop Precision, or uninstall and reinstall Precision after the clean install of the driver. Also, unplug all monitors and use only one monitor at first to get quad SLI enabled, then plug in the rest of the monitors and change to multi-monitor mode.
> 
> GeForce how to set up SLI http://www.geforce.com/hardware/technology/sli/system-requirements
> 
> Pictures of Quad & Tri SLI.


Buzzkill, the NVIDIA connection layout from their website never worked for me for Surround. Maybe it was just me. This is what worked:
Card 1 (top slot): monitor 1, 2nd DVI connector, top row
Card 1 (top slot): monitor 2, 1st DVI connector, top row
Card 2 (bottom slot): monitor 3, 2nd DVI connector, top row


----------



## Buzzkill

Quote:


> Originally Posted by *maximus56*
> 
> Buzzkill, the NVIDIA connection layout from their website never worked for me for Surround. Maybe it was just me. This is what worked:
> Card 1 (top slot): monitor 1, 2nd DVI connector, top row
> Card 1 (top slot): monitor 2, 1st DVI connector, top row
> Card 2 (bottom slot): monitor 3, 2nd DVI connector, top row


Others have said the same: the connector picture is not the only arrangement that works for Quad SLI. For example, if you have two DVI-D monitors, you need to use a DVI-D port on each card. I had to plug the monitor into several different DVI ports to get the quad setup working.


----------



## maximus56

Quote:


> Originally Posted by *Buzzkill*
> 
> Others have said the same: the connector picture is not the only arrangement that works for Quad SLI. For example, if you have two DVI-D monitors, you need to use a DVI-D port on each card. I had to plug the monitor into several different DVI ports to get the quad setup working.


Lol...that's exactly how I figured out my dual dvi connections for the surround set up.


----------



## GTX 690 SLI

My motherboard supports SLI and CrossFire:
http://www.gigabyte.com/products/product-page.aspx?pid=4440#ov


----------



## grunion

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> My motherboard supports SLI and CrossFire:
> http://www.gigabyte.com/products/product-page.aspx?pid=4440#ov


Quote:


> 2-way SLI™ and 2-way CrossFireX™ multi-GPU support


I do not see quad support listed; you might want to verify this with Gigabyte.

In the meantime, try disabling HPET in the BIOS; sometimes it does not play well with SLI.


----------



## Buzzkill

Quote:


> Originally Posted by *grunion*
> 
> I do not see quad support listed; you might want to verify this with Gigabyte.
> 
> In the meantime, try disabling HPET in the BIOS; sometimes it does not play well with SLI.


You are only using two video cards.

Quad SLI with 690s uses only two PCIe slots; 4-way SLI with four 680s uses four PCIe slots. There is a PLX chip on each 690.


----------



## GTX 690 SLI

Quote:


> Originally Posted by *Buzzkill*
> 
> You are only using two video cards.
> 
> Quad SLI with 690s uses only two PCIe slots; 4-way SLI with four 680s uses four PCIe slots. There is a PLX chip on each 690.


So by your statement, is it possible for me to use 690 SLI with this motherboard?


----------



## PinzaC55

I have banged away at Unigine Valley benchmark tonight till I was sick at heart. I changed Nvidia settings back and forth with truly miserable results before deleting EVGA Precision X and disabling all but absolute core programmes on Startup. This had the effect of raising my FPS to 65 on Extreme HD Ultra, which it seems is the best my PC is physically capable of without overclocking so it will have to do.


----------



## grunion

Quote:


> Originally Posted by *Buzzkill*
> 
> You are only using two video cards.
> 
> Quad SLI with 690s uses only two PCIe slots; 4-way SLI with four 680s uses four PCIe slots. There is a PLX chip on each 690.


I get this..
But you never know.

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> So by your statement, is it possible for me to use 690 SLI with this motherboard?


Disable the timer and contact Gigabyte for confirmation.
Make sure the BIOS is current, and try forcing PCIe 2.0 in the BIOS.


----------



## justanoldman

Quote:


> Originally Posted by *PinzaC55*
> 
> I have banged away at Unigine Valley benchmark tonight till I was sick at heart. I changed Nvidia settings back and forth with truly miserable results before deleting EVGA Precision X and disabling all but absolute core programmes on Startup. This had the effect of raising my FPS to 65 on Extreme HD Ultra, which it seems is the best my PC is physically capable of without overclocking so it will have to do.


You have vsync off in Nvidia control panel, all unnecessary background tasks shut off, suspend your virus scanning if that might be getting in the way, uninstalled and reinstalled the drivers and precision x? Don't know if that will help, but just a thought.

What cpu and overclock are you running?


----------



## wermad

Got a spare 690 waterblock if anyone needs one


----------



## PinzaC55

Quote:


> Originally Posted by *justanoldman*
> 
> You have vsync off in Nvidia control panel, all unnecessary background tasks shut off, suspend your virus scanning if that might be getting in the way, uninstalled and reinstalled the drivers and precision x? Don't know if that will help, but just a thought.
> 
> What cpu and overclock are you running?


Vsync etc. all off, but I didn't turn Malwarebytes off. CPU is a 3930K, no overclock. According to this chart I should be getting 78 FPS: http://www.overclockersclub.com/reviews/nvidia_geforce_gtx_titan_gaming/12.htm
BTW, in the Control Panel I have selected "Let the 3D Application Decide".


----------



## maximus56

Quote:


> Originally Posted by *wermad*
> 
> Got a spare 690 waterblock if anyone needs one


These water blocks are awesome.








They have really helped with my water cooling; my temps never go above 35°C, even when benching!


----------



## wermad

Quote:


> Originally Posted by *maximus56*
> 
> These water blocks are awesome.
> 
> 
> 
> 
> 
> 
> 
> 
> They have really helped with my water cooling; my temps never go above 35°C, even when benching!


It looks the part too. My incoming cards came bundled with blocks, so that leaves me with this spare one I just got in today. Selling it cheap since I really just want it gone.


----------



## justanoldman

Quote:


> Originally Posted by *PinzaC55*
> 
> Vsync etc. all off, but I didn't turn Malwarebytes off. CPU is a 3930K, no overclock. According to this chart I should be getting 78 FPS: http://www.overclockersclub.com/reviews/nvidia_geforce_gtx_titan_gaming/12.htm
> BTW, in the Control Panel I have selected "Let the 3D Application Decide".


Try turning off everything you can, switch to a basic Windows theme, and turn off vsync in the Nvidia control panel's global settings. See if that helps; you could try reinstalling everything too. I got 78 at stock GPU and CPU as well, so I think that is a reasonable number.


----------



## jhager8783

Quote:


> Originally Posted by *grunion*
> 
> I get this..
> But you never know.
> Disable the timer and contact GB for confirmation.
> Make sure bios is current, try forcing pci-2.0 in bios.


I have the same mobo, the Gigabyte Z77X-UD5H. When I switched from 670 SLI to quad 690s, I found things smoothed out once I uninstalled all video drivers and related software, then flashed the BIOS to the latest version and redid all my overclocks.


----------



## PinzaC55

Quote:


> Originally Posted by *justanoldman*
> 
> Try turning off everything you can, switch to a basic Windows theme, and turn off vsync in the Nvidia control panel's global settings. See if that helps; you could try reinstalling everything too. I got 78 at stock GPU and CPU as well, so I think that is a reasonable number.


Maybe tomorrow! I have been down that freakin' valley so many times now that I know every pile of sheep poop. Thanks for the advice, though.


----------



## jhager8783

Was really bored today, so I decided to do something really asinine to kill the time: five GPUs in one rig.

Yeah, it's physically possible, but what about the software? Well, it's a b*tch. I fiddled with the 314 driver enough to get quad to run with the 670 dedicated to PhysX.

EVGA Precision X already recognized the 5th card. But I'm still trying to get everything running smoothly as far as benchmarks and gaming, and will let you guys know if I make any headway on this project.

Remember, this is completely asinine; there is no point whatsoever to this, so don't try to justify why it shouldn't be done. I've beaten all my games, the heat broke in my shop, and it's too cold to ride any of my bikes. Like I said, I was bored... end of story.


----------



## gladiator7

Quote:


> Originally Posted by *jhager8783*
> 
> Was really bored today, so I decided to do something really asinine to kill the time, 5 GPU's in one rig.
> 
> 
> 
> Yeah, it's physically possible, but what about the software? Well, it's a b*tch. I fiddled with the 314 driver enough to get quad to run with the 670 dedicated to PhysX.
> 
> 
> 
> EVGA Precision X already recognized the 5th card. But I'm still trying to get everything running smoothly as far as benchmarks and gaming, and will let you guys know if I make any headway on this project.
> 
> 
> 
> Remember, this is completely asinine, there is no point what-so-ever to this, so don't try and justify why it shouldn't be done. I've beaten all my games, the heat broke in my shop and it's too cold to ride any of my bikes. Like I said, I was bored... end of story.


You are nuts...lol


----------



## Buzzkill

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> So by your statement, is it possible for me to use 690 SLI with this motherboard?


Yes, it should work. I had to mess around to get Quad SLI enabled in the Nvidia Control Panel. When I upgraded to two 690s, I installed one card, then the driver, then installed the second 690, installed the driver again, and enabled Quad SLI. With Windows 8 I had both cards installed from the start: I installed the operating system, ran Windows Update, and then I could enable Quad SLI in the Nvidia Control Panel (I still installed NVIDIA's current driver afterwards). I have run quads on an EVGA Z77 FTW and now a Maximus V Extreme.


----------



## jhager8783

Quote:


> Originally Posted by *gladiator7*
> 
> You are nuts...lol


Certifiably!

However, if you think that's nuts, wait till I'm done with my GeForce GTX chopper.

The tank is gonna be modeled after the Titan: the fan assembly will be the gas cap, and a plexiglass window into the "vapor chamber" will reveal a baffled gas tank that looks like heat-sink fins. Of course, the tank and fenders will be clear-coated brushed aluminum and the frame will be black. The rims are gonna be black magnesium, CNC'd to look like the 690's fan. The white sidewalls will be patterned by Coker tires with an "infinite" print of the PCB. There will be a bit of airbrushing to match the smooth and sexy design flow of the 690 and Titan.

Milled GTX rotors are already on the way, and I'm using a Cycle X frame I bought last year. I have most of the parts I need lying around, so I'm just waiting until I can find a cheap block of aluminum to make into a GTX valve cover for the KZ1100 motor. Meanwhile I have a LOT of projects going on, so it'll be some time before she looks like anything.


----------



## wholeeo

Well, just joined the club,









Forgot to ask Micro Center for the $150 In Game Coupon thing.







I want to put this baby under water but I really like how the logo lights up in my case.


----------



## Lukas026

Hey guys

Add me to the club, please.









After some time, I finally got my hands on that beauty. I don't have a camera atm, so I can't take a picture of my rig; I hope a screenshot will be enough.

I also put an Accelero Twin Turbo 690 on it, and I totally love it!

Oh, btw, it is from MSI. I can't find EVGA in stock in my country, but that's no problem with these cards, I guess...



See ya

PS: This is my stable overclock. I can push it some more, but at about +750 MHz on the memory I started to see ECC kicking in and some artifacts. On the core I can manage about +175 MHz, but only +125 MHz is stable through 12 hours of Unigine Heaven 3.0.


----------



## wermad

Quote:


> Originally Posted by *jhager8783*
> 
> Was really bored today, so I decided to do something really asinine to kill the time, 5 GPU's in one rig.
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Yeah, it's physically possible, but what about the software? Well, it's a b*tch. I fiddled with the 314 driver enough to get quad to run with the 670 dedicated to PhysX.
> 
> 
> 
> EVGA Precision X already recognized the 5th card. But I'm still trying to get everything running smoothly as far as benchmarks and gaming, and will let you guys know if I make any headway on this project.
> 
> 
> 
> 
> 
> Remember, this is completely asinine, there is no point what-so-ever to this, so don't try and justify why it shouldn't be done. I've beaten all my games, the heat broke in my shop and it's too cold to ride any of my bikes. Like I said, I was bored... end of story.


I have a 4gb 670 waiting to be sold. Tempting to try this when my 690s arrive on Friday


----------



## worms14

Hello.
Since yesterday I have been the owner of a GTX 690 (coming from a GTX 670); I went all in, and I do not know how well I did.
The temperature and the noise are annoying.
I wish I had an unlocked BIOS with the voltage raised to 1.212 V, because that is roughly what the GTX 670 had and it is still relatively safe, and with the power limit raised to 150.
Can anyone retrieve such a BIOS for me?
I would flash the BIOSes with Zotac FireStorm; I hope that is the right program to use?
I edited the original BIOS with Kepler BIOS Tweaker 1.24, changing only the voltage, but I'm not entirely sure I did it well.
The second thing: do I need to upload two BIOSes, because there are two GPUs, or just one?
http://imageshack.us/photo/my-images/189/img8623mae2.jpg/


----------



## jhager8783

Quote:


> Originally Posted by *wholeeo*
> 
> Well, just joined the club,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Forgot to ask Micro Center for the $150 In Game Coupon thing.
> 
> 
> 
> 
> 
> 
> 
> I want to put this baby under water but I really like how the logo lights up in my case.


If you're concerned about looks, the XSPC Razor block for the 690 is damned attractive when the lights are out... come to think of it, so is a Vietnamese hooker.


----------



## jhager8783

Quote:


> Originally Posted by *wermad*
> 
> I have a 4gb 670 waiting to be sold. Tempting to try this when my 690s arrive on Friday


Let me know if you do; I've hit a couple of snags getting action out of every GPU. Good luck, and keep me informed!


----------



## worms14

Hi guys. I recently bought a GTX 690 and tried to flash it: I extracted the BIOS with GPU-Z and edited it with Kepler BIOS Tweaker. I changed the voltage (1.212 V) and the power limit (150).
Do I have to flash the two GPUs separately? (I suppose I do, but I'd like to confirm.) I want to flash it with Zotac FireStorm. What do you think about that?

I also attached my original 690 BIOS. If somebody could mod it for me, I'd be 100% sure that I have a good modded BIOS. Thanks.
http://www.sendspace.com/file/8fxyge
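For what it's worth, flashing each GPU's BIOS separately by index is the usual approach on a dual-GPU card. A dry-run sketch that only prints the commands rather than executing them — the filenames are placeholders, the `-i<index>` and `--save` options are assumptions about the nvflash build in use, and the 0/1 indices should be confirmed with `nvflash --list` first:

```shell
# Dry run only: prints the flash commands for a dual-GPU card instead of
# executing them. BIOS0/BIOS1 are placeholder filenames for the two edited
# BIOS images; the -i0/-i1 indices assume "nvflash --list" shows the 690's
# two GPUs at 0 and 1.
BIOS0="gpu0_modded.rom"
BIOS1="gpu1_modded.rom"

echo "nvflash -i0 --save gpu0_backup.rom"   # back up each original BIOS first
echo "nvflash -i1 --save gpu1_backup.rom"
echo "nvflash -i0 $BIOS0"                   # then flash each GPU separately
echo "nvflash -i1 $BIOS1"
```

Only run the real commands once the indices match your `nvflash --list` output, and keep the two backup files somewhere safe.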


----------



## jassilamba

Question for those running 690s in SLI. I have a single 690 in use at present and can get another one (new) for a great deal. I also have two Titans waiting to be shipped. From reading all the numbers, it looks like 690 SLI will be the better option and will be cheaper than the Titans.

Should I add another GPU (I use them for gaming at 1440p)? Are there any issues with 690s in SLI? Anyone playing Crysis 3 on their SLI setup?

I like to see a solid 60 fps while playing. Metro, Far Cry 3, and Crysis 3 are the ones where I don't get those numbers.









What do you guys suggest? If I keep the Titans, how much should I sell my 680 with a Heatkiller block and backplate for?

Thanks


----------



## TheMadHerbalist

Quote:


> Originally Posted by *worms14*
> 
> Hi guys. I recently bought a GTX 690 and tried to flash it: I extracted the BIOS with GPU-Z and edited it with Kepler BIOS Tweaker. I changed the voltage (1.212 V) and the power limit (150).
> Do I have to flash the two GPUs separately? (I suppose I do, but I'd like to confirm.) I want to flash it with Zotac FireStorm. What do you think about that?
> 
> I also attached my original 690 BIOS. If somebody could mod it for me, I'd be 100% sure that I have a good modded BIOS. Thanks.
> http://www.sendspace.com/file/8fxyge


So far no one's been able to get the BIOS flash to work. If I could overvolt to 1.2 V, I might be able to hit 1300 MHz on one card. Lol.


----------



## worms14

Quote:


> Originally Posted by *DinaAngel*
> 
> I've figured out the overclocking and power target now, and why people don't get stable clocks so easily, and why the Hydro Copper is clocked higher.
> 
> I've flashed my 690 with the power target unlocked to 150%, so the board's power limit is at its max.
> I've seen that 66°C is the temperature limit before the GPUs downvolt because of heat.
> So if you cool the card down a lot, it doesn't downvolt and you can get stable at or near 1300 MHz, depending on your GPUs. I've gotten to 1180 MHz stable, but it downvolts because of temperature.
> 
> The amps seem to go higher unlocked; the voltage doesn't go up, but the amps do.
> 
> http://www.techpowerup.com/downloads/2165/NVFlash_5.127.html
> works with the 690 perfectly.
> The commands to use depend on your "nvflash --list" output.
> If it shows
> 0 690
> 1 690
> then the commands are
> nvflash -i0 "nameofbios0"
> nvflash -i1 "nameofbios1"
> without the quotes.
> But if something else occupies index 1 (or another index), then the two 690 entries will differ from 0 and 1.
> 
> Remember:
> GPU 0 is on SP8 ("main GPU, BIOS 0")
> SP16 is GPU 1 ("BIOS 1")
> Look here to see which is the main BIOS:
> http://i48.tinypic.com/orqmqb.jpg
> 
> I hope this helps!
> 
> Modded BIOSes, 690, 110k .zip file


But someone did, in post 2432 on page 244. What can I do so the boost clocks are rock solid? Because I jump from 915 MHz to 1150 MHz.


----------



## TheMadHerbalist

Quote:


> Originally Posted by *worms14*
> 
> But someone did, in post 2432 on page 244. What can I do so the boost clocks are rock solid? Because I jump from 915 MHz to 1150 MHz.


Tried that BIOS already; it didn't work. It just sets your boost to 1202 MHz and tries to set your voltage max to 1.3, which can't be done, since the cards are hard-locked at around 1.2 V.


----------



## gladiator7

Quote:


> Originally Posted by *worms14*
> 
> But someone did, in post 2432 on page 244. What can I do so the boost clocks are rock solid? Because I jump from 915 MHz to 1150 MHz.


Modded BIOSes for the GTX 690 don't fix the voltage issue in real-life use.


----------



## PinzaC55

Hi guys, just been doing a little bit of modding.
Although my GPU isn't watercooled, I can't stand exposed PCBs, so I took a chance and bought the EK backplate, which is advertised as "not a stand-alone item". Lo and behold, when I got it, the screws supplied were M3 x 7, whereas the stock GTX 690 screws are M2.5 x 4! I had a hell of a job finding long M2.5 screws, but somebody had a little bag of six on eBay, so I bought those, shortened them, and it all went together with no problem. The result is exactly what I wanted: a huge tidy-up.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *worms14*
> 
> Hello.
> Since yesterday I have been the owner of a GTX 690 (coming from a GTX 670); I went all in, and I do not know how well I did.
> The temperature and the noise are annoying.
> I wish I had an unlocked BIOS with the voltage raised to 1.212 V, because that is roughly what the GTX 670 had and it is still relatively safe, and with the power limit raised to 150.
> Can anyone retrieve such a BIOS for me?
> I would flash the BIOSes with Zotac FireStorm; I hope that is the right program to use?
> I edited the original BIOS with Kepler BIOS Tweaker 1.24, changing only the voltage, but I'm not entirely sure I did it well.
> The second thing: do I need to upload two BIOSes, because there are two GPUs, or just one?


The voltage BIOS mod doesn't work on the GTX 690. The best BIOS modders couldn't figure it out, or it's simply not possible on the GTX 690.

Quote:


> Originally Posted by *jassilamba*
> 
> Question for those running 690s in SLI. I have a single 690 in use at present and can get another one (new) for a great deal. I also have two Titans waiting to be shipped. From reading all the numbers, it looks like 690 SLI will be the better option and will be cheaper than the Titans.
> 
> Should I add another GPU (I use em for gaming at 1440p). Are there any issues with the 690s in SLI. Any one playing crysis 3 on their SLI setup?
> 
> I like to see a solid 60fps while playing. Metro, far cry 3 and crysis 3 are the ones where I don't get those numbers
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What do you guys suggest. If I keep the titans how much should I sell my 680 with a heatkiller block and backplate?
> 
> Thanks


GTX 690 SLI is a waste of time, purely because of the 2 GB VRAM limitation: too much GPU muscle for that amount of VRAM.

Quote:


> Originally Posted by *PinzaC55*
> 
> Hi guys, just been doing a little bit of modding.
> Although my GPU isn't watercooled, I can't stand exposed PCBs, so I took a chance and bought the EK backplate, which is advertised as "not a stand-alone item". Lo and behold, when I got it, the screws supplied were M3 x 7, whereas the stock GTX 690 screws are M2.5 x 4! I had a hell of a job finding long M2.5 screws, but somebody had a little bag of six on eBay, so I bought those, shortened them, and it all went together with no problem. The result is exactly what I wanted: a huge tidy-up.


Nice job PinzaC55, looks great!


----------



## PCModderMike

Quote:


> Originally Posted by *PinzaC55*
> 
> Hi guys, just been doing a little bit of modding.
> Although my GPU isn't watercooled, I can't stand exposed PCBs, so I took a chance and bought the EK backplate, which is advertised as "not a stand-alone item". Lo and behold, when I got it, the screws supplied were M3 x 7, whereas the stock GTX 690 screws are M2.5 x 4! I had a hell of a job finding long M2.5 screws, but somebody had a little bag of six on eBay, so I bought those, shortened them, and it all went together with no problem. The result is exactly what I wanted: a huge tidy-up.


Very cool








Eventually you'll cave and get the waterblock to go with it haha.


----------



## PinzaC55

Quote:


> Originally Posted by *PCModderMike*
> 
> Very cool
> 
> 
> 
> 
> 
> 
> 
> 
> Eventually you'll cave and get the waterblock to go with it haha.


Who knows? Damn Nvidia for making their cards so pretty!

This is what I don't understand: if EK made this backplate fully compatible with non-watercooled GTX 690s "out of the box", they would presumably sell more of them, so why don't they?


----------



## mtbiker033

Quote:


> Originally Posted by *PinzaC55*
> 
> Who knows? Damn Nvidia for making their cards so pretty!
> 
> This is what I don't understand - if EK made this backplate fully compatible with non-watercooled GTX 690's "out of the box" they would presumably sell more of them, so why don't they?


Makes no sense, does it? It should come with screws in both lengths.


----------



## worms14

PinzaC55, Very Cool


----------



## PinzaC55

Thank you kindly


----------



## jhager8783

An idea for air cooled overclockers?



The fan is an old Archer 242 that puts out a lot of torque and screams its ever-lovin' head off. I can't find any documentation on this fan because it's so old, but it pushes a pretty forceful stream of air. The improvised mounting bracket is the factory VGA bracket that came with the HAF X case.


----------



## wermad

Quote:


> Originally Posted by *PinzaC55*
> 
> Hi guys, just been doing a little bit of modding.
> Although my GPU isn't watercooled, I can't stand exposed PCBs, so I took a chance and bought the EK backplate, which is advertised as "not a stand-alone item". Lo and behold, when I got it, the screws supplied were M3 x 7, whereas the stock GTX 690 screws are M2.5 x 4! I had a hell of a job finding long M2.5 screws, but somebody had a little bag of six on eBay, so I bought those, shortened them, and it all went together with no problem. The result is exactly what I wanted: a huge tidy-up.


Did you use the stock screws? I'm thinking of adding a couple of backplates.


----------



## jhager8783

So now that I've got my quad working well, I put it through a couple of benchmarks. Is this a decent score for my setup?

http://www.3dmark.com/3dm11/6060035


----------



## wermad

^^^Wow! The best I could muster was P18.1K (quad 480s with a 3820).

Can't wait to see what my quads can do.

Btw, the 670 sold, so no quad + PhysX testing.


----------



## maximus56

Quote:


> Originally Posted by *jassilamba*
> 
> Question for those running 690s in SLI. I have a single 690 in use at present and can get another one (new) for a great deal. I also have two Titans waiting to be shipped. From reading all the numbers, it looks like 690 SLI will be the better option and will be cheaper than the Titans.
> 
> Should I add another GPU (I use em for gaming at 1440p). Are there any issues with the 690s in SLI. Any one playing crysis 3 on their SLI setup?
> 
> I like to see a solid 60fps while playing. Metro, far cry 3 and crysis 3 are the ones where I don't get those numbers
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What do you guys suggest. If I keep the titans how much should I sell my 680 with a heatkiller block and backplate?
> 
> Thanks


I can give you some impartial advice, as I have two 690s and four Titans arriving next week. I have really enjoyed quad 690s, so much so that I may just keep them for an HTPC build. All the games you mentioned run very well on my Surround setup, provided I keep the AA settings low. Two 690s beat two Titans for your screen and for my Surround. I would never trade my two 690s' GPU grunt for 6 GB of VRAM that I have not missed yet. So why am I getting four Titans? Because it will be fun to benchmark, and for the heck of it I will turn all the AA settings up to max, just to see what the big deal is about max AA in Surround.
I will, however, take three Titans over two 690s, but I'm not sure that makes sense for your res.
My two cents' worth...


----------



## Star Forge

Quote:


> Originally Posted by *mtbiker033*
> 
> makes no sense does it, it should come with both length screws


Better yet, are they M2.5 x 10 mm screws? I am planning on getting a GTX 690, but with EVGA no longer making backplates for it, I am tempted to get one from EK.


----------



## PinzaC55

Wermad said
Quote:


> Did you use the stock screws? I'm thinking of adding a couple of backplates.


If you are using waterblocks, you would need screws appropriate to the blocks. In my case I was screwing it to the GTX 690's fan housing, so the screws had to have an M2.5 thread. BTW, you only remove six of the stock GTX 690 screws.

http://www.ekwb.com/shop/EK-IM/EK-IM-3831109856567.pdf

Quote:


> Originally Posted by *Star Forge*
> 
> Better yet, are they M2.5x10mm screws? I am planning on getting a GTX 690, but with eVGA not making backplates anymore for it, I am tempted of getting one from EK.


The supplied screws are M3 x 7 mm. There's a bit of extra depth to the screw holes, so I cut my screws down to M2.5 x 8 mm.
The only downside is that I used to have the sensors from my fan controller taped to the GPUs, and there isn't space for them now. Not a bad thing, because they looked untidy.


----------



## jhager8783

Quote:


> Originally Posted by *wermad*
> 
> ^^^Wow! Best I could muster (p18.1k quad 480s w/ a 3820).
> 
> Can't wait to see what my quads can do.
> 
> Btw, 670 sold so no quad + phyx testing


I'm sure your quad setup will kick a**. When I put my CPU under water and OC the hell out of it, I hope I'll hit at least 24,000.

I decided to wait until I get my ASUS P9X79 mobo and i7 3960 to continue down the pointless 5-GPU path. I figure anything worth doing is worth doing all the way, right?

Btw, the results I got were at 8x8 with K-Boost on, 135% power with 1.67v, 135 MHz over the core clock, and a "super loud, ultra powerful, crazy stupid little fan" cooling them. Any higher on the clocks and I get a driver crash.

Have fun with your beasties, and post your scores as soon as you're able!


----------



## jahcson

Quote:


> Originally Posted by *V3teran*
> 
> Icehancer is not the only ENB, and it works differently for each person. I had much better results with L3VO; Icehancer for me looks oversaturated. I also used many other mods that affect the roads, textures, and lighting.
> 
> The sli issue works much better than it did before by using commandline.txt
> 
> Also put this in your commandline.txt
> -norestrictions
> -nomemrestrict
> -percentvidmem 100...............This enables 100 percent use of gpu
> -memrestrict 681574400
> -memrestrict 629145600
> -availablevidmem 4.0..............This enables 100 percent use of gpu ram
> -noprecache
> -novblank
> 
> And also Custom Anti-Aliasing flags in Nvidia Inspector .....nVIDIA's original SLI compatibility bits for GTA4 is "0x03400405". However, new SLI bits "0x42500045" seems to improve SLI scaling nicely
> http://www.forum-3dcenter.org/vbulletin/showpost.php?p=8828733&postcount=26


So help me out here, would you please, V3teran!








How did you finally get your GTA IV red RESOURCE USAGE WARNING to go away and turn green?

Your GTA IV video here shows the Resource Usage in red, at 12 seconds: 



Your GTA IV video here shows the Resource Usage in Green at 10 seconds: 




I applied the new SLI bits in Nvidia Inspector. It was kind of buggy when I double-checked the results in the .txt file I exported after applying them, so even though I also tried the NVIDIA GeForce SLI Profile Tool, I ended up applying the new bits with a simple copy-paste through Notepad instead.

I put a commandline.txt file in the main game folder as you instructed.
And what did you mean about the custom anti-aliasing flags in Nvidia Inspector?

Does it have to do with disabling multi-GPU mode in the Nvidia Control Panel?

I'm using 1.0.7.0 with Ultimate Textures, Vulgadrop 2.1c, TrafficLoad, the Ultimate HD Car Pack, and a few tiny odds and ends more.

My GTX 690 is still not being used to its max VRAM potential in the game; I'm only seeing 1957 MB (green) without -availablevidmem 4.0.
I've been trying to solve this for a week, and my Resource Usage still shows up red in the game's graphics settings with your commandline.txt as described above.









You're using the L3EVO ENB, which is for 1.0.4.0, right?
Maybe that version would see my full VRAM, as it does for you, whereas 1.0.7.0 cannot?

Could you please answer some of the points above and maybe enlighten me as to how to obtain results similar to yours?

Thanks for reading, and for anything else you throw my way.
I know I may eventually end up reinstalling the game and going to patch 1.0.4.0 somewhere down the road; hopefully it won't come to that.


----------



## GTX 690 SLI

Guys, please, any help or idea is greatly welcome.
Since I connected my GTX 690s in SLI on my GA-Z77X-UD5H motherboard, I cannot manage them in an SLI configuration; the green line is not stretching over both cards, only one.
I have even tried buying a new SLI cable, reseating the cable, and reinstalling the drivers (after first removing everything related to the cards), and in the end I reinstalled the OS by formatting C:.
Now I have hit the wall and don't know what to do.
Maybe it is time to call BIG BROTHER, I don't know?


----------



## mtbiker033

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Guys, please, any help or idea is greatly welcome.
> Since I connected my GTX 690s in SLI on my GA-Z77X-UD5H motherboard, I cannot manage them in an SLI configuration; the green line is not stretching over both cards, only one.
> I have even tried buying a new SLI cable, reseating the cable, and reinstalling the drivers (after first removing everything related to the cards), and in the end I reinstalled the OS by formatting C:.
> Now I have hit the wall and don't know what to do.
> Maybe it is time to call BIG BROTHER, I don't know?


what happens if you put the physx to CPU?


----------



## Shiftstealth

Quote:


> Originally Posted by *mtbiker033*
> 
> what happens if you put the physx to CPU?


CPUs usually struggle with PhysX.


----------



## wermad

Can I join you guys????


----------



## fat_italian_stallion

Quote:


> Originally Posted by *wermad*
> 
> Can I join you guys????


Glad to see you got them


----------



## Rei86

Quote:


> Originally Posted by *Shiftstealth*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mtbiker033*
> 
> what happens if you put the physx to CPU?
> 
> 
> 
> CPUs usually struggle with PhysX.

The CPU usually does a chunk of the physics work, but NVIDIA's PhysX is proprietary code that only runs on CUDA cards.

Also, I'm officially out of the club, as I shipped the 690 out on Wednesday :(

Sent from my Galaxy Nexus using Tapatalk 2


----------



## TeamBlue

My turn to join...


----------



## wermad

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> Glad to see you got them


I'll be putting them through some benchmarks. No oc'ing, just wanting to make sure all is well. Still looking forward to more Titan madness in your build









Thanks and I'll keep them busy with Surround


----------



## Kaapstad

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Guys, please, any help or idea is greatly welcome.
> Since I connected my GTX 690s in SLI on my GA-Z77X-UD5H motherboard I cannot get them into an SLI configuration; the green line is not stretching over both cards, only one.
> I have tried buying a new SLI cable, turning the cable around, and reinstalling the drivers (before that I removed everything related to the cards), and in the end I reinstalled the OS by formatting C.
> Now I have hit the wall and don't know what to do.
> Maybe it is time to call BIG BROTHER, I don't know?


Make sure you have the monitor connected to the correct output.


----------



## GTX 690 SLI

Hi guys, I have found the cause of my problem. Unfortunately, when I was buying parts for the PC my knowledge was obviously insufficient: I did not understand the difference between 2-way and 3-way/4-way (quad) SLI, and my motherboard only covers 2-way, not quad, so that is where the problem is.
I need to replace the motherboard with one that supports a quad setup.
Sorry for sharing this funny problem with you.
Thanks for everything.


----------



## Paz911

Hi guys
Had my GTX 690 for around 1 month now. I used to play only console games and on an average laptop, and I can say there is no way I will ever go back to console again. Using a S27A950 (such a better image compared to my older laptop).

I got an EVGA 690, used Fractal fans and a Fractal R4 case, and tried to make a silent rig. Any recommended additions that you think would make it quieter? At the moment the loudest thing is the H80, but I had to send that off as it shorted out, so I'm using the stock CPU fan. It's still pretty quiet though.

Also, how loud are the water-cooled rigs? Do they make much noise, and do they really make that much more of a difference in temperature?
Thanks
Thanks


----------



## maximus56

Quote:


> Originally Posted by *wermad*
> 
> I'll be putting them through some benchmarks. No oc'ing, just wanting to make sure all is well. Still looking forward to more Titan madness in your build
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks and I'll keep them busy with Surround


Congrats Wermad! This is a rather timid bunch here in this club...lol. Some folks (not the majority) are remorseful about their own 690's 2GB of VRAM, and even more so about someone else's two 690s at 2GB...lol. Don't let this spoil the fun of your new setup. Enjoy your quads, as I enjoy mine.


----------



## wermad

Quote:


> Originally Posted by *maximus56*
> 
> Congrats Wermad! This is a rather timid bunch here in this club...lol. Some folks (not the majority) are remorseful about their own 690's 2GB of VRAM, and even more so about someone else's two 690s at 2GB...lol. Don't let this spoil the fun of your new setup. Enjoy your quads, as I enjoy mine.


The way I see it, if the 6970/6990 were very good contenders with Eyefinity, why can't Kepler be? I'm keeping it at 1200 Surround. Had I gone 1440/1600, then 7970s would have been in order







Most VRAM I pushed in Surround 1080 was 1.8GB w/ quad 580 3GB.

I have yet to get them up to steam since my build is pending a few more changes.


----------



## PinzaC55

This sold a couple of hours ago on eBay: http://www.ebay.co.uk/itm/Asus-Nvidia-GeForce-GTX-690-Graphics-Card-FAULTY-A7-P-015-DB-/271163359188?_trksid=p2047675.l2557&clk_rvr_id=455270821783&ssPageName=STRK%3AMEWAX%3AIT&nma=true&si=HgNiT6%252BhIPTLeS%252Bsy1HULOCKHwU%253D&orig_cvip=true&rt=nc I had some early bids on it, up to £102, because I was willing to take the chance that it would either a) work, or b) be possible to RMA to ASUS and have them repair it. I looked at their RMA conditions and they say nothing about the second-hand status of cards, only whether the card is still under warranty. If the buyer is on this forum, it would be interesting to know what comes of it.


----------



## jhager8783

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> Hi guys, I have found the cause of my problem. Unfortunately, when I was buying parts for the PC my knowledge was obviously insufficient: I did not understand the difference between 2-way and 3-way/4-way (quad) SLI, and my motherboard only covers 2-way, not quad, so that is where the problem is.
> I need to replace the motherboard with one that supports a quad setup.
> Sorry for sharing this funny problem with you.
> Thanks for everything.


That is odd since I have the same mobo as you and mine worked after messing with install order. I'm glad you got it figured out though. Cheers!


----------



## Buzzkill

Quote:


> Originally Posted by *jhager8783*
> 
> That is odd since I have the same mobo as you and mine worked after messing with install order. I'm glad you got it figured out though. Cheers!


If jhager8783 has the same motherboard and he got it to work, it has to be user error or just not getting the quad setup right.


----------



## GTX 690 SLI

OK, now I have a nice and solvable question.
Please can you all suggest which motherboard with quad SLI support I should buy?
I really need some ideas, and serious ones please.


----------



## PinzaC55

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> OK, now I have a nice and solvable question.
> Please can you all suggest which motherboard with quad SLI support I should buy?
> I really need some ideas, and serious ones please.


I have the MSI Big Bang Xpower II and it supports quad SLI and practically everything else you can think of, including 128GB of RAM if you so wish.


----------



## rossb

Quote:


> Originally Posted by *Paz911*
> 
> Hi guys
> Had my GTX 690 for around 1 month now. I used to play only console games and on an average laptop, and I can say there is no way I will ever go back to console again. Using a S27A950 (such a better image compared to my older laptop).
> 
> I got an EVGA 690, used Fractal fans and a Fractal R4 case, and tried to make a silent rig. Any recommended additions that you think would make it quieter? At the moment the loudest thing is the H80, but I had to send that off as it shorted out, so I'm using the stock CPU fan. It's still pretty quiet though.
> 
> Also, how loud are the water-cooled rigs? Do they make much noise, and do they really make that much more of a difference in temperature?
> Thanks


I suggest replacing the H80 with a Silver Arrow or D14, replacing the stock fans with Noiseblocker PK-2s in every fan position, and using a fan controller to undervolt them. Set a fan profile in PrecisionX so the 690 fan maxes at 50-60% up to 70 or 75 degrees, and your rig will be silent at idle and quiet while gaming. If you want it completely silent while gaming, buy the Arctic Cooling heatsink for the 690. Much cheaper, easier, and quieter than water cooling.


----------



## martinhal

Anyone folding with a 690 ? What is the PPD ?


----------



## mtbiker033

I just wanted to say I have had my 690 for a few days now and this thing is amazing:

quiet, powerful, efficient, amazing looks.

And wow, does it play BF3 well.


----------



## egotrippin

You guys know we are all to blame for the ridiculously high Titan pricing. All of us enthusiastic $1,000 GPU adopters have pushed prices up. $1k GPUs are here to stay.


----------



## PCModderMike

Quote:


> Originally Posted by *martinhal*
> 
> Anyone folding with a 690 ? What is the PPD ?


I did once, just to give it a try. Pulled about 70K, can't remember the exact project # it was running though.


----------



## wermad

Honestly, Titan is the GTX 685. I'm sure Nvidia will launch a flagship single-GPU card (GTX 780) late this year or early next. My guess is that one would be ~$599-799 and should be nipping at the heels of Titan (but I doubt it would blow it to next week). Something in single-GPU guise that surpasses Titan may not come until the GTX 880 (do expect pricing closer to Titan). In all, Titan is Nvidia's answer to all of us who complained the 680 was half-baked and wanted the real deal. Well, here it is, and it's damn nice and damn expensive.


----------



## egotrippin

Quote:


> Originally Posted by *wermad*
> 
> Honestly, Titan is the GTX 685. I'm sure Nvidia will launch a flagship single-GPU card (GTX 780) late this year or early next. My guess is that one would be ~$599-799 and should be nipping at the heels of Titan (but I doubt it would blow it to next week). Something in single-GPU guise that surpasses Titan may not come until the GTX 880 (do expect pricing closer to Titan). In all, Titan is Nvidia's answer to all of us who complained the 680 was half-baked and wanted the real deal. Well, here it is, and it's damn nice and damn expensive.


I guess I wasn't paying attention and had no idea the Titan was coming. When I browsed to AnandTech.com and saw the Titan review, my heart momentarily hit my stomach as I feared my 690 had been trumped by something new. After reading the reviews and benchmarks I see the 690 still has the performance edge, and I'm only gaming at 2560x1440, which the 690 so far handles with ease. There's the VRAM limit, but I don't hit it unless I crank up the MSAA, and I don't notice MSAA at that resolution anyway.

The AnandTech review said the Titan would deliver more "consistent performance" than a GTX 690, albeit with fewer frames per second. Can anybody here confirm this? How is it measured?


----------



## justanoldman

Quote:


> Originally Posted by *rossb*
> 
> I suggest replacing the H80 with a Silver Arrow or D14, replacing the stock fans with Noiseblocker PK-2s in every fan position, and using a fan controller to undervolt them. Set a fan profile in PrecisionX so the 690 fan maxes at 50-60% up to 70 or 75 degrees, and your rig will be silent at idle and quiet while gaming. If you want it completely silent while gaming, buy the Arctic Cooling heatsink for the 690. Much cheaper, easier, and quieter than water cooling.


70-75c? The card throttles at 70c, correct?


----------



## Rei86

Quote:


> Originally Posted by *justanoldman*
> 
> 70-75c? The card throttles at 70c, correct?


Yes, you lose 14-15MHz at each temperature step above 70C.

E.g. 1100 core clock at 70C and below,
1085 core clock at 70-80C,
1070 core clock at 80-85C, etc.
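That stepping can be written as a toy model (a minimal sketch: the ~15MHz step and the 70/80/85C edges are just the observations in this post, not an official NVIDIA spec):

```python
def effective_clock(base_clock_mhz, temp_c, bin_step=15, bin_edges=(70, 80, 85)):
    """Estimate the boost clock after thermal throttling.

    Toy model of the behavior described above: the card drops roughly
    one ~15 MHz boost bin each time the temperature crosses an edge.
    """
    bins_lost = sum(1 for edge in bin_edges if temp_c > edge)
    return base_clock_mhz - bins_lost * bin_step

# A card boosting to 1100 MHz:
print(effective_clock(1100, 65))  # 1100 (no throttling at or below 70C)
print(effective_clock(1100, 75))  # 1085 (one bin down, 70-80C range)
print(effective_clock(1100, 82))  # 1070 (two bins down, 80-85C range)
```

Which is why keeping the card under 70C (good fan profile or a water block) holds the full boost clock.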


----------



## TeamBlue

Holy Crap. I have posted in here two or three times that I was about to join, and finally I can... Here goes.

It's an ASUS.





You can't see the rest of it till later.


----------



## qiplayer

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> my motherboard is supporting sli and crossfire
> http://www.gigabyte.com/products/product-page.aspx?pid=4440#ov


I had problems too with a Gigabyte UD3,
but also with an Asus P8P67.
I'm talking about quad SLI with 690s.


----------



## martinhal

Quote:


> Originally Posted by *qiplayer*
> 
> I had problems too with a gb ud3
> But also with an asus p8p67
> I talk about quad sli with 690's


Quad 690s? That would be insane even by OCN standards.


----------



## wermad

Quad SLI 690 = two 690s in SLI.

4-way SLI 690 = 4x 690s. I've seen those setups for "folding farms"; one had *nine* HD5970s. 4-way SLI with 690s (eight total cores) is useless for gaming.


----------



## PinzaC55

Quote:


> Originally Posted by *wermad*
> 
> Quad SLI 690 = two 690s in SLI.
> 
> 4-way SLI 690 = 4x 690s. I've seen those setups for "folding farms"; one had *nine* HD5970s. 4-way SLI with 690s (eight total cores) is useless for gaming.


Great for bragging rights though and it pulls the chicks


----------



## wermad

Lol

Octo-sli ftw!


----------



## wholeeo

Here's mine,



Waiting on water block.


----------



## justanoldman

Speaking of water blocks: if I care more about performance than appearance, is there a certain brand that is better for a full-coverage 690 block and backplate? I have never water-cooled a GPU before. I see EK, Heatkiller, and XSPC on Frozen's website, but don't know anything about them. Or do they perform similarly, and it is just a matter of personal preference?


----------



## TeamBlue

They will all do well. You will run out of voltage before you run out of cooling from what I have seen.


----------



## TeamBlue

Double post. Do I need to do anything different to get added to the OP?


----------



## jassilamba

Quote:


> Originally Posted by *justanoldman*
> 
> Speaking of water blocks: if I care more about performance than appearance, is there a certain brand that is better for a full-coverage 690 block and backplate? I have never water-cooled a GPU before. I see EK, Heatkiller, and XSPC on Frozen's website, but don't know anything about them. Or do they perform similarly, and it is just a matter of personal preference?


The max that I hit with the Heatkiller on my 690 under full load or benching is 34 on GPU 1 and 36 on GPU 2. I think what really matters is the TIM, the quality of the mount, and the radiator. I have a 480 Monsta rad dedicated to my 690.

So go with what you think will look good in your build.


----------



## PCModderMike

Quote:


> Originally Posted by *TeamBlue*
> 
> Double post. Do I need to do anything different to get added to the OP?


If OP hasn't caught your post yet, you can PM him and he responds fairly fast.


----------



## jhager8783

Quote:


> Originally Posted by *martinhal*
> 
> Anyone folding with a 690 ? What is the PPD ?


I have gotten 160,000 PPD on my quad 690 setup.


----------



## wermad

Koolance, HK, and AC tend to perform the best. EK falls in the middle; XSPC is not too far behind. Keep in mind there's only a few degrees of separation (2-3C). I love the Koolance blocks that came with my 690s; I just sold a block to a 690 owner. In the end, you can't go wrong with any block, so just pick the one you like that fits your budget.


----------



## jhager8783

Quote:


> Originally Posted by *GTX 690 SLI*
> 
> OK, now I have a nice and solvable question.
> Please can you all suggest which motherboard with quad SLI support I should buy?
> I really need some ideas, and serious ones please.


As I stated earlier, when I first installed my quad setup with the same motherboard as you, I had trouble too. The only way I could get mine to work was to disable Windows automatic driver install, uninstall the drivers with Driver Sweeper, swap the cards to opposite PCI-E slots, and reboot without the SLI cable attached. Then, once the drivers were installed, I shut down the computer, added the SLI cable, and rebooted. I hope this works for you; if not, you might want to get your motherboard tested for failure.

I really don't think it's a different motherboard you need, but you may need to think about RMAing the one you have.

As far as motherboard suggestions go, anything with 2-way to 4-way SLI support is going to work. 2-, 3-, or 4-way SLI refers to the number of physical PCI-E slots available, not how many GPUs can be read by your system. I've had 5 GPUs registering in my system on the Gigabyte Z77X-UD5H in 2-way + dedicated PhysX @ 4x4x4.

In many cases you have fewer problems with 2-way SLI boards than with 3- or 4-way. Some 3- and 4-way boards require a more involved setup process, and it seems that this is where you are struggling.

If you are not an advanced user and are not tech savvy, keep it simple and go with 2-way SLI, as it will suffice for your needs.

Otherwise, upgrade to an X79 board and a better processor to take full advantage of the memory bandwidth and GPU power.


----------



## jhager8783

Quote:


> Originally Posted by *wholeeo*
> 
> Here's mine,
> 
> 
> 
> Waiting on water block.


Nice, love the Lambo in the mix, just like the advert. Good one!

If I were you though, I'd duck those 8-pin cables under the 690 chassis instead of behind it; it may allow more room for exhaust from the rear of the card. Just a thought...


----------



## egotrippin

Quote:


> Originally Posted by *jhager8783*
> 
> I have gotten 160,000 PPD on my quad 690 setup.


That sounds about right. I did the math and my rig costs $35/month in electricity when folding. Your rig must cost around $50/month. Does anybody know if the electricity usage is tax deductible?
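The back-of-the-envelope math is simple enough to sketch (the 400W draw and $0.12/kWh rate below are assumed numbers for illustration, not anyone's actual bill):

```python
def monthly_cost_usd(watts, rate_per_kwh, hours_per_day=24, days=30):
    """Electricity cost of a rig folding around the clock for a month."""
    kwh_used = watts / 1000 * hours_per_day * days
    return kwh_used * rate_per_kwh

# e.g. a ~400 W system at $0.12/kWh, folding 24/7
print(round(monthly_cost_usd(400, 0.12), 2))  # 34.56
```

Plug in your own Kill-A-Watt reading and local rate to see what your quad setup is really costing.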


----------



## jhager8783

I should probably already know the answer to this by now, but is it okay to run two 690s on a 1000-watt, 2-way SLI, single-rail PSU?

The 8-pin adapters the 690 comes with would apparently suggest that a quad-SLI-rated PSU would be best (as it requires eight 8-pin connectors). I have been running mine on a 2-way without problems...so far.

I know there is limited current capacity in the wire gauge that PSUs use; I just wonder if I'm straining for power, especially while overclocking.

Any thoughts?


----------



## wermad

Should be good.

Hexus and hardware.uk pulled 650-750W total system (probably at the wall). Do the efficiency conversion and that's ~600-700W DC for the total system.
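That conversion is just the wall reading times PSU efficiency (the 90% figure below is an assumed Gold-class number, not a measured one):

```python
def dc_load_watts(wall_watts, efficiency=0.90):
    """Convert a wall-meter (AC) reading to the approximate DC load
    the PSU actually delivers to the components."""
    return wall_watts * efficiency

# 650-750W at the wall, assuming ~90% PSU efficiency
print(round(dc_load_watts(650)))  # 585
print(round(dc_load_watts(750)))  # 675
```

So a 1000W unit still has real headroom over that worst-case DC load.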


----------



## egotrippin

Quote:


> Originally Posted by *jhager8783*
> 
> I should probably already know the answer to this by now, but is it okay to run two 690s on a 1000-watt, 2-way SLI, single-rail PSU?
> 
> The 8-pin adapters the 690 comes with would apparently suggest that a quad-SLI-rated PSU would be best (as it requires eight 8-pin connectors). I have been running mine on a 2-way without problems...so far.
> 
> I know there is limited current capacity in the wire gauge that PSUs use; I just wonder if I'm straining for power, especially while overclocking.
> 
> Any thoughts?


I used a wall meter to measure my total power consumption. I think I was using <500 watts with one GTX 690, so I'd imagine you would have a couple hundred watts to spare.


----------



## wermad

I'll see if I can get my cpu up and running and hit 3d11 for some Kill-A-Watt readings


----------



## MrTOOSHORT

Just want to share some pics of my GTX 690!


----------



## Arizonian

^^^Very NICE indeed MrTOOSHORT^^^


----------



## PCModderMike

Very very nice! I love that backplate.


----------



## justanoldman

Thanks for the water block info TeamBlue, Jassilamba, and Wermad.

Mr. TooShort,
That is nice, very nice. With a setup like that, I won't feel bad when you beat me on the Valley 1.0 rankings again.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Arizonian*
> 
> ^^^Very NICE indeed MrTOOSHORT^^^


Thanks Arizonian!









Quote:


> Originally Posted by *PCModderMike*
> 
> Very very nice! I love that backplate.


Thanks PCModderMike!









Quote:


> Originally Posted by *justanoldman*
> 
> Thanks for the water block info TeamBlue, Jassilamba, and Wermad.
> 
> Mr. TooShort,
> That is nice, very nice. With a setup like that, I won't feel bad when you beat me on the Valley 1.0 rankings again.


Thanks justanoldman!

After you beat my score, I was trying for a 4000 score in that bench, but came up short:


----------



## justanoldman

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> After you beat my score, I was trying for a 4000 score in that bench, but came up short:


Dang, 94.8, you need to post that over there, show those Titan boys a thing or two. I have to wait to get this thing underwater before I can even try for over 94.


----------



## Murlocke

Quote:


> Originally Posted by *justanoldman*
> 
> Dang, 94.8, you need to post that over there, show those Titan boys a thing or two. I have to wait to get this thing underwater before I can even try for over 94.


I'll take 71 on a single GPU over 94 on a multi GPU any day of the week, ty. To each his own!


----------



## justanoldman

Quote:


> Originally Posted by *Murlocke*
> 
> I'll take 71 on a single GPU over 94 on a multi GPU any day of the week, ty. To each his own!


Here we go; let's count how many posts have argued back and forth on this across many different threads. Since you posted in the owners' thread, I guess I can say you are only 33.52% short of MrTOOSHORT.

But seriously, I think the Titan is a great card, and I assume they will get this throttling thing under control and make it even better. Two Titans is a great setup and I won't argue against it. With a single monitor and 2D gaming, a single 690 beats a single Titan by a wide margin. So it becomes a point of personal preference, as you have pointed out, whether the single-GPU benefits outweigh the lower performance numbers. Again, that is arguable either way.


----------



## TheSurroundGamr

Quote:


> Originally Posted by *Murlocke*
> 
> I'll take 71 on a single GPU over 94 on a multi GPU any day of the week, ty. To each his own!


I'd just water cool the SLI setup. But, I mean, hey, that's just me.


----------



## Arizonian

Quote:


> Originally Posted by *Murlocke*
> 
> I'll take 71 on a single GPU over 94 on a multi GPU any day of the week, ty. To each his own!


I never see those temps on air, but then again I'm not benching any longer. For example, I run 68C in Crysis 3 multiplayer with my profile set @ 1137MHz core, getting 93-103 FPS on HIGH settings with SMAAx2. A 222MHz core overclock is mucho good enough for me if I'm over 100 FPS average.









Quick shot I did two nights ago playing Crysis 3 multi-player.


Spoiler: Precision GTX690 Crysis 3 Monitoring


----------



## justanoldman

The 71 and 94 are FPS scores in Valley 1.0, not GPU temps, but since we weren't labeling our numbers I can see how they look like temps.

By the way, just curious, but why so conservative on the memory offset? The 690s I tested were able to get over +500.


----------



## zloba

Just got a Titan at Malaysian launch, only 1 for now







)) Please add me to the Club







)


----------



## PCModderMike

Quote:


> Originally Posted by *zloba*
> 
> Just got a Titan at Malaysian launch, only 1 for now
> 
> 
> 
> 
> 
> 
> 
> )) Please add me to the Club
> 
> 
> 
> 
> 
> 
> 
> )
> 
> 
> Spoiler: Warning: Spoiler!










Wrong club

Here's where you wanna go - http://www.overclock.net/t/1363440/nvidia-geforce-gtx-titan-owners-club


----------



## zer0sum

I think I can now be officially added to the club as I squeezed my 690 into my TJ08











I need to find some time to put a block on the 690 and sell the 670, but for now the 670 is dedicated to PhysX


----------



## PinzaC55

Quote:


> Originally Posted by *zer0sum*
> 
> I think I can now be officially added to the club as I squeezed my 690 into my TJ08
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I need to find some time to put a block on the 690 and sell the 670, but for now the 670 is dedicated to PhysX


You have a 670 dedicated to PhysX? What sort of scores do you get in Unigine Heaven 3.0? I asked here a long time ago whether a PhysX card was worth having but never really got a definite answer. Welcome to the club, BTW.


----------



## FiShBuRn

Is it just me, or does Crysis 3 SP @ 1080p with everything maxed out need more than 2GB of VRAM?


----------



## Arizonian

Quote:


> Originally Posted by *FiShBuRn*
> 
> Is it just me, or does Crysis 3 SP @ 1080p with everything maxed out need more than 2GB of VRAM?


Currently in campaign & multiplayer in Crysis 3 on HIGH settings, AAx16 with SMAAx2, my multiplayer usage shows 1.42GB VRAM - a few posts back I showed that Precision monitor.

Here is Crysis 3 on *VERY HIGH* settings, SMAAx2, with Advanced Graphics turned up to very high and AAx16. In the campaign my usage shows 1.809GB VRAM. Just ran it and took a snip. Here I'm getting 63-73 FPS.



Spoiler: Precision - Crysis 3 Campaign


----------



## FiShBuRn

So it's normal with 8xMSAA to hit 2GB VRAM... well, we can't say anymore that we can max out every game.


----------



## mtbiker033

Quote:


> Originally Posted by *Arizonian*
> 
> ^^^Very NICE indeed MrTOOSHORT^^^


absolutely awesome!

Anyone have some good settings for the 690 for Crysis 3?


----------



## TheSurroundGamr

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Just want to share some pics of my GTX 690!


Wow, I'm really not a fan of backplates because all they do is make things look pretty, but I'd buy the heck out of this one if I had a GTX 690 or GTX Titan!


----------



## TheMadHerbalist

So very close to placing above that trifire 7970 in the Unigine Valley thread. Anyone have tips to further increase the score, besides raising my CPU or core overclock? I might be able to get more on the mem, but my cards will throttle, which makes me wonder, as when it happens the highest temp is only 41-42C.
I recorded this run on my cell phone, so I know it's going to be crappy quality, but maybe someone will spot something that might give further insight into a way to get a few more FPS.


Spoiler: Warning: Spoiler!



i7 3930k @5.0 GHz
Dual 690 Sli @ 1202 MHz core / 3729 Mhz Mem.
Nvidia GeForce 314.07 Driver









So far I'm not regretting my dual GTX 690s over the Titans, since I have managed to stay ahead of most Titans in SLI; I only wish they would have given them more than 2GB of memory.


----------



## wermad

I'm still searching for the EVGA backplates. Might have to settle with EK black csq ones.


----------



## wholeeo

Here's my baby under water,



Replaced the stock blue LEDs with what were supposed to be yellow; turns out they are orange, so whenever I get the chance I'll be resoldering some white LEDs.


----------



## wermad

Quote:


> Originally Posted by *wholeeo*
> 
> Heres my baby under water,
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Replaced the stock blue leds with what were supposed to be yellow, turns out they are orange so whenever I get the chance I'll be resoldering some white leds.


Still looks pretty good.


----------



## justanoldman

Quote:


> Originally Posted by *TheMadHerbalist*
> 
> So very close to placing above that trifire 7970 in the Unigine Valley thread. Anyone have tips to further increase the score, besides raising my CPU or core overclock? I might be able to get more on the mem, but my cards will throttle, which makes me wonder, as when it happens the highest temp is only 41-42C.


You have yours OCed more than mine can take, so you probably already know all this: make sure nothing is running in the background, close out any processes you can, and switch to a Windows Basic theme. If you have more than one monitor, disconnect all but one. Hit Enter to cycle through quickly before benching.


----------



## TheMadHerbalist

Quote:


> Originally Posted by *justanoldman*
> 
> You have yours oced more than mine can take, so you probably already know all this: Make sure nothing is running in the background, close out any running processes you can, switch to a windows basic theme. If you have more than one monitor, disconnect all but one. Hit enter to cycle through quick before benching


Pretty much doing that already, thanks though. I created a Windows install just for benching. I wonder if Win 8 is better than Win 7 for benching. Unless I do a hard voltmod, it seems I'm stuck at this core clock. Anyone know any tweaks in the NVIDIA control panel that could increase FPS? Hmm, I'll try swapping the cards' PCI-E lanes and see if that nets me some more points, as it seems my 2nd card can be OCed a bit more than the other one.


----------



## PinzaC55

Quote:


> Originally Posted by *wermad*
> 
> I'm still searching for the EVGA backplates. Might have to settle with EK black csq ones.


Do you mean for the GTX 690?

http://eu.evga.com/products/moreInfo.asp?pn=M020-00-000243&family=Accessories%20-%20Hardware&uc=EUR


----------



## konigsberg7

*QUESTION ABOUT PHYSX AND MULTI-GPU*

I have 2 pictures below. One is obviously the control panel with the DEFAULT settings for Maximized 3D Performance, aka enabling the 2nd GPU.

The second is where I plugged in my two monitors. I plugged them in that way (DVI A1 & A2) because that was the only way I was able to clone my two monitors. I didn't want to extend, just clone. This is a basic picture of what the back of the computer looks like when looking at the GTX 690.

Anyway, with the default settings, PhysX is assigned to GPU B, while both monitors are on GPU A. Is this optimal? I feel like it should be set to GPU A, or maybe the CPU?

I notice that with Multi-GPU (Maximize 3D Performance) enabled, I crash a lot in CS:GO when Alt-Tabbing, and I crash completely in Hitman Absolution (in Hitman I always have to disable Multi-GPU mode or I'll crash to a black screen and have to restart). Also, the cutscenes are choppy no matter what (played via Steam, and I have disabled the overlay, etc.).

*Anyway,* how should *PhysX* be set up in this scenario? Is auto-select (where it lands on GPU B) best or not?

Note: I switched it to A and I didn't crash from Alt-Tabbing in CS:GO.


----------



## wermad

Quote:


> Originally Posted by *PinzaC55*
> 
> Do you mean for the GTX 690?
> 
> http://eu.evga.com/products/moreInfo.asp?pn=M020-00-000243&family=Accessories%20-%20Hardware&uc=EUR


Yes, but the USA shop is out. Most US stores are also out or back-ordered. HK is backordered in some places too. EK is still very available. Might have to settle on these tbh







I really don't like the naked pcb look on my cards.


----------



## Arizonian

Quote:


> Originally Posted by *FiShBuRn*
> 
> So it's normal with 8xMSAA to hit 2GB VRAM... well, we can't say anymore that we can max out every game.


I decided to see for myself with everything maxed MSAAx8 Crysis 3 campaign. Ended up using 1.995 GB VRAM.



Spoiler: Crysis 3 Maxed Precision Monitor


----------



## Arm3nian

Quote:


> Originally Posted by *konigsberg7*
> 
> *QUESTION ABOUT PHYSX AND MULTI-GPU*
> 
> I have 2 pictures below. One is obviously the control panel with the DEFAULT settings for Maximized 3D Performance aka enabling the 2nd GPU.
> 
> The second is where I plugged in my two monitors. I plugged it in that way (DVI A1 & A2) because that was the only way I was able to clone my two monitors. I didn't want to extend, just clone. This is a basic picture of what the back of the computer looks like when looking at the GTX 690.
> 
> Anyway, with the default settings, PhysX is focused at GPU B, while both monitors are at GPU A. Is this optimized? I feel like it should be set at GPU A or maybe the CPU?
> 
> I notice with Multi-GPU (Maximize 3D Performance) enabled, I crash a lot in CS GO when ALT-TABing and I crash completely on Hitman Absolution (on Hitman I just always have to disable Multi-GPU mode or I'll crash and have to restart (just a black screen)). Also, the cutscenes are choppy, no matter what (played via Steam and have diabled overlay, etc).
> 
> *Anyway,* how should *PhysX* be setup in this scenario? Is the auto-select (where it focuses is at GPU B), the best or not?
> 
> note: I switched it to A and I didn't crash from ALT-TABing in CS GO.


The reason the control panel sets it to B in auto is that GPU A is going to be used first, then GPU B. In a low-demand game only A gets used; once GPU A gets into the ~70% load range the work splits, so roughly 35% load on each. For your monitors, it doesn't matter whether they are connected to A or B; it will give the same results because the GPUs are identical and work together.

Does CS:GO even support PhysX? Don't set PhysX to the CPU; you have a card which supports PhysX and does it much better than the CPU. Mine is set to B, but in your case, if it is giving you problems then do what works; there will be no performance difference.
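The scheduling rule of thumb described above could be sketched like this (the 70% threshold and the even split are just this post's observation of how the driver behaves, not documented NVIDIA behavior):

```python
def split_load(demand_pct):
    """Toy model: GPU A carries the whole load until demand nears ~70%,
    after which the work is split evenly across both GPUs on the 690."""
    if demand_pct < 70:
        return {"A": demand_pct, "B": 0}
    return {"A": demand_pct / 2, "B": demand_pct / 2}

print(split_load(40))  # {'A': 40, 'B': 0}
print(split_load(70))  # {'A': 35.0, 'B': 35.0}
```

Either way, both GPUs end up under the same total load, which is why the monitor connection and the PhysX assignment make no real performance difference here.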


----------



## jhager8783

Thanks wermad and egotrippin, sorry for the late response I was on the road.

I pulled 832 watts at the outlet, so everything seems good. I thought it was fine, but it never hurts to ask.
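For anyone sanity-checking numbers like this: the meter reads wall draw, which includes PSU conversion loss, so the components actually see somewhat less. A minimal sketch, assuming a ballpark 88% efficiency (the exact figure depends on the PSU's 80 PLUS rating and load):

```python
def dc_load(wall_watts: float, efficiency: float = 0.88) -> float:
    """Estimate the DC power the components draw, given wall-socket draw."""
    return wall_watts * efficiency

print(round(dc_load(832)))  # 832 W at the outlet is roughly 732 W of DC load
```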


----------



## konigsberg7

Thanks for the response. I can't find anything saying CS:GO uses PhysX. I actually put PhysX back on GPU B and was able to ALT-TAB fine. I don't know; maybe I was just having a couple of unlucky days.

Perhaps it only happens when I shut down one of my monitors so I can play on the other by itself at 120 Hz (monitor 1, the ASUS). Then I change the refresh rate to 120 Hz in the control panel.

Still, I wonder why Hitman Absolution crashes unless Multi-GPU is disabled. It's not an instant crash; it happens after some time (not a fixed interval, anywhere from ten minutes to an hour), when the screen just goes black and I have to restart my computer.


----------



## Arm3nian

Quote:


> Originally Posted by *konigsberg7*
> 
> Thanks for the response. I can't find anything saying CS:GO uses PhysX. I actually put PhysX back on GPU B and was able to ALT-TAB fine. I don't know; maybe I was just having a couple of unlucky days.
> 
> Perhaps it only happens when I shut down one of my monitors so I can play on the other by itself at 120 Hz (monitor 1, the ASUS). Then I change the refresh rate to 120 Hz in the control panel.
> 
> Still, I wonder why Hitman Absolution crashes unless Multi-GPU is disabled. It's not an instant crash; it happens after some time (not a fixed interval, anywhere from ten minutes to an hour), when the screen just goes black and I have to restart my computer.


Well, if other games run fine with your current setup, it is probably just Hitman having problems with your drivers. Are you using the newest beta driver? Try that; if it doesn't work, go back. If you are already on the beta, try the latest stable one.

You can run 120 Hz on one monitor and 60 Hz on the other with both on at the same time; I've done that before and it gives me no trouble.


----------



## konigsberg7

For Hitman Absolution (played via Steam), I am using the latest drivers. I've never tried the beta.


----------



## Arm3nian

Quote:


> Originally Posted by *konigsberg7*
> 
> For Hitman Absolution (Played via Steam), I am using the latest drivers. Never tried the beta.


Hitman Absolution is a relatively new game, and I've seen it mentioned in almost every driver update.

Try the new beta drivers: just Google "nvidia beta drivers" and get the latest one. See if that fixes your problem with your desired configuration. If it doesn't, I would try again when the next drivers are released. If you don't feel comfortable with beta drivers you can always go back, but I prefer the newest ones.

If the problem isn't fixed after a few driver releases, then it is most likely your config, or some kind of mod. But really, which ports your displays are connected to and which GPU runs PhysX doesn't affect performance. Your cards will always adjust to give optimal performance.


----------



## konigsberg7

What do you mean by my config?

I ran Crysis 3 today; it works beautifully. Also beat Max Payne 3 in 3D.

I figured out that CS:GO crashes when I ALT-TAB while I have Maple open, and maybe Chrome/Firefox too. I posted a thread on another forum for CS:GO and some people still get crashes. I'm sure it's just some universal bug that hits when certain software is open.

Something weird happened. I was messing around with things and disabled multi-GPU mode, and my screen went black, so I had to restart. Then I found that Device Manager listed both GPUs as disabled. I re-enabled them and everything is fine, but it was weird.

I finally beat Hitman. I don't actually have to disable multi-GPU mode; I can just go to 3D settings and set the rendering to Single GPU.

The weird thing is I get this error a lot of people get (I don't have to restart my computer, but the game just crashes). I think it's Steam-related. Here's a post about it:

http://forums.eidosgames.com/showthread.php?t=130923

If you have any insight on what may be causing it, I'd be interested. It only happens after I disable multi-GPU or set the 3D-settings rendering to Single GPU.

Bear in mind, the solution that guy posts is an official Steam procedure: https://support.steampowered.com/kb_article.php?ref=3134-TIAL-4638

Except he doesn't delete the userdata folder. I don't know what that's for, per se; all game configs are under steamapps.


----------



## Arm3nian

Quote:


> Originally Posted by *konigsberg7*
> 
> What do you mean by my config?
> 
> I ran Crysis 3 today; it works beautifully. Also beat Max Payne 3 in 3D.
> 
> I figured out that CS:GO crashes when I ALT-TAB while I have Maple open, and maybe Chrome/Firefox too. I posted a thread on another forum for CS:GO and some people still get crashes. I'm sure it's just some universal bug that hits when certain software is open.
> 
> Something weird happened. I was messing around with things and disabled multi-GPU mode, and my screen went black, so I had to restart. Then I found that Device Manager listed both GPUs as disabled. I re-enabled them and everything is fine, but it was weird.
> 
> I finally beat Hitman. I don't actually have to disable multi-GPU mode; I can just go to 3D settings and set the rendering to Single GPU.
> 
> The weird thing is I get this error a lot of people get (I don't have to restart my computer, but the game just crashes). I think it's Steam-related. Here's a post about it:
> 
> http://forums.eidosgames.com/showthread.php?t=130923
> 
> If you have any insight on what may be causing it, I'd be interested. It only happens after I disable multi-GPU or set the 3D-settings rendering to Single GPU.
> 
> Bear in mind, the solution that guy posts is an official Steam procedure: https://support.steampowered.com/kb_article.php?ref=3134-TIAL-4638
> 
> Except he doesn't delete the userdata folder. I don't know what that's for, per se; all game configs are under steamapps.


By your config I mean where you have the DVI connectors plugged in and what you set your PhysX to, not your actual computer build. Crysis 3 ran well? It runs horribly for me: everything maxed, shadows on medium, and SMAA 1x low, it still lags. 8x MSAA is just unplayable.

For your screen going black: it happened to me yesterday. Your main display has to be on the correct GPU, because the second is reliant on the first. The correct one for me is B, I think, which makes no sense considering GPU A is the first one. That DX error could be an unstable GPU; did you overclock at all?

Also, why are you disabling multi-GPU?


----------



## konigsberg7

I disabled it just to play around; I was benchmarking.

Anyway, yeah, Crysis 3 all maxed runs nicely. I use the latest non-beta drivers. It looks very nice with everything on "Very High" and 8x MSAA.

I didn't overclock at all. I think it was an isolated incident.


----------



## mutunekk

Hi, I have a problem with my ASUS GTX 690. It was working amazingly, then all of a sudden it stopped sending a signal to any of my screens... I tried all the ports, but none of them works. I tried flashing the BIOS with the standard ROM, but that didn't help. Has anyone experienced this and knows how to fix it?

btw... my config is:
Asus Maximus V Gene
i7 3770
16GB Corsair Vengeance
Corsair AX1200i

Thanks


----------



## konigsberg7

I have good news. I downloaded the beta drivers because I just bought Tomb Raider, and I was having the same problems as with Hitman since Tomb Raider uses the same engine. The beta drivers seem to fix it all.

They seem to be on it, although I don't know why they weren't on it since Hitman's release, as both games use the same engine:

http://www.pcgamer.com/2013/03/07/tomb-raiders-geforce-performance-issues-being-looked-at-by-nvidia-and-crystal-dynamics/?ns_campaign=article-feed&ns_mchannel=ref&ns_source=steam&ns_linkname=0&ns_fee=0

Brilliant: I played Crysis 3 all the way to the end, and the last cutscene, the very end, doesn't load. Other people seem to have it too, but no solution: http://www.mycrysis.com/forums/viewtopic.php?f=65&t=61626&view=next

I found the ending on YouTube, and a guy with a GTX 670 had it working. I have no idea; maybe he was on Windows 8. This bug is just so disappointing. I want to watch it on my big-screen TV.


----------



## JimmyD

Where do I post proof to become a member? I have dual GTX 690s.








thanks


----------



## wermad

Quote:


> Originally Posted by *JimmyD*
> 
> where do i post proof to become a member? i have dual gtx690's
> 
> 
> 
> 
> 
> 
> 
> 
> thanks


Quote:


> To be added to the members list, please post the following:
> 
> A) Proof of card ownership (picture, etc.)
> B) Brand of card


Looks like they haven't updated it in a while







I'm still not on the list.


----------



## wholeeo

So I changed the orange LEDs to white, ended up not liking the look, and had to resolder the orange LEDs. They complement the copper more.


----------



## JimmyD

Quote:


> Originally Posted by *wermad*
> 
> Looks like they haven't updated it in a while
> 
> 
> 
> 
> 
> 
> 
> I'm still not on the list.


Really? That's a bummer:









Well, here's proof... the fact that the cards are both 2x 8-pin should set them apart from looking like another card, I think. They're both NVIDIA. :thumb:


----------



## Arizonian

Quote:


> Originally Posted by *JimmyD*
> 
> where do i post proof to become a member? i have dual gtx690's
> 
> 
> 
> 
> 
> 
> 
> 
> thanks


Quote:


> Originally Posted by *wermad*
> 
> Looks like they haven't updated it in a while
> 
> 
> 
> 
> 
> 
> 
> I'm still not on the list.


Try PMing *jcde7ago* - he will add you.









Congrats to both of you.


----------



## Lukas026

Quote:


> Originally Posted by *mutunekk*
> 
> Hi, I have a problem with my ASUS GTX 690. It was working amazingly, then all of a sudden it stopped sending a signal to any of my screens... I tried all the ports, but none of them works. I tried flashing the BIOS with the standard ROM, but that didn't help. Has anyone experienced this and knows how to fix it?
> 
> btw... my config is:
> Asus Maximus V Gene
> i7 3770
> 16GB Corsair Vengeance
> Corsair AX1200i
> 
> Thanks


Happened to me today too, and I don't know what to do. RMA?


----------



## FiShBuRn

I'm thinking of putting my 690 under water (just the card for now; maybe in the future I'll add the CPU to the loop). What do you think of this hardware:

EK-FC690 GTX - Acetal + Nickel
EK-PSC Fitting 13mm - G1/4 Black Nickel
EK-CoolStream RAD XTX (240)
EK-DCP 4.0 (12V DC Pump)
EK-BAY SPIN Reservoir - Plexi CSQ

Is the 13mm fitting OK, or is it too large? Any other suggestions?


----------



## wermad

EVGA 690 backplates are back in stock at evga.com:

http://www.evga.com/Products/Product.aspx?pn=M020-00-000243

Paid Cali sales tax, but I should get them quicker.









FrozenCPU.com has them on backorder (NY sales tax), but the trip and the wait aren't worth the $5 difference versus evga.com for me.









http://www.frozencpu.com/products/16751/ex-vga-19/EVGA_GeForce_GTX_690_Backplate_M020-00-000243.html?id=rK2hhkNs&mv_pc=91


----------



## iARDAs

Sold my Zotac 670. Will pick up a 690 tomorrow if finances allow. Zotac or ASUS are my only options.


----------



## wholeeo

Quote:


> Originally Posted by *wermad*
> 
> EVGA 690 backplate are back in stock at evga.com:
> 
> http://www.evga.com/Products/Product.aspx?pn=M020-00-000243
> 
> Paid cali sales tax but I should get them quicker
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Frozencpu.com has them in backorder, NY sales tax, but the trip and the wait are not worth the $5 versus evga.com for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.frozencpu.com/products/16751/ex-vga-19/EVGA_GeForce_GTX_690_Backplate_M020-00-000243.html?id=rK2hhkNs&mv_pc=91


Thank you sir.

edit:

Hope it's compatible with the XSPC block.


----------



## TeamBlue

Anybody here playing Hawken on the 690? I've had a couple of crashes (only in that game), and I'm wondering if it's 690-specific.


----------



## Blackmill

I have crashed often in MechWarrior Online.
Also, I'm wondering if I should look into trading my 690 for a Titan.


----------



## PCModderMike

Quote:


> Originally Posted by *wermad*
> 
> EVGA 690 backplate are back in stock at evga.com:
> 
> http://www.evga.com/Products/Product.aspx?pn=M020-00-000243
> 
> Paid cali sales tax but I should get them quicker
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Frozencpu.com has them in backorder, NY sales tax, but the trip and the wait are not worth the $5 versus evga.com for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.frozencpu.com/products/16751/ex-vga-19/EVGA_GeForce_GTX_690_Backplate_M020-00-000243.html?id=rK2hhkNs&mv_pc=91


Cool


----------



## Tomalak

A quick question here, instead of making a new thread:

I'm going to get a 1440p monitor soon, in addition to the two U2311s I already have, and a new GPU as well.

It's important for me to be able to run games in borderless window mode so I can easily switch to the other two monitors. I know CFX can't do this (it works only in fullscreen). What about SLI: *can I do this with a GTX 690?*


----------



## mutunekk

Quote:


> Originally Posted by *Lukas026*
> 
> happened to me today too and I dont know what to do ? RMA ?


I finally got it to work... I tried a bunch of BIOSes and also updated the motherboard. Finally an EVGA BIOS got the card to work, but Windows wouldn't launch, so I had to do a complete reinstall... I think the issue is with the new drivers, because all these problems started when I installed them.


----------



## Lukas026

Quote:


> Originally Posted by *mutunekk*
> 
> I finally got it to work... I tried a bunch of BIOSes and also updated the motherboard. Finally an EVGA BIOS got the card to work, but Windows wouldn't launch, so I had to do a complete reinstall... I think the issue is with the new drivers, because all these problems started when I installed them.


Oh, congrats!

But I have a bunch of questions: how did you make it work? I mean, I get no signal when I plug my monitor into the GTX 690 (no matter which output). Also, how did you flash it, and where did you get the EVGA BIOS?

Thank you for the replies.


----------



## mtbiker033

Quote:


> Originally Posted by *FiShBuRn*
> 
> Im thinking to put my 690 (only, maybe in the future i will add cpu to the loop) underwater, what do you think about this hardware:
> 
> EK-FC690 GTX - Acetal + Nickel
> EK-PSC Fitting 13mm - G1/4 Black Nickel
> EK-CoolStream RAD XTX (240)
> EK-DCP 4.0 (12V DC Pump)
> EK-BAY SPIN Reservoir - Plexi CSQ
> 
> Is the 13mm Fitting ok? Or its to "large"? Any other sugestion?


I had the same thoughts recently, thanks for posting a parts list


----------



## justanoldman

So now that I have my Swiftech H220 installed I need to order a block for my EVGA 690 to put it in the loop. You guys have said the major brands are pretty close to each other. The EK seems to have good instructions online, and since I have never done this before I figured I would go with them.

If I use EK there is only one choice for a plain black backplate on Frozen's site, but there are four choices for the block.

Acetal+Nickel; Acetal; Acrylic; Nickel

No idea which is better. Can anyone point out which is the best for performance? The colors of my Noctua fans sort of prevent me from caring too much about appearance.


----------



## konigsberg7

Where's a good place to get news on drivers for our graphics card? Specifically, release dates for new drivers, predictions, etc.?


----------



## wholeeo

Has anyone ever tried to install a backplate without removing the card from the PC? I really don't feel like draining my loop once my backplate arrives.


----------



## jcde7ago

Sorry guys, I've been backed up with work and also preparing to move to my second apartment in less than 60 days. Really, really stressful times!









Anyway, quite a few new 690 owners in the last couple of weeks! *Check out the list, and please PM me if I missed adding you to the club.* Congrats, all!









New members added as of 3/10/13:

- Lukas026
- worms14
- wermad
- TeamBlue
- wholeeo
- zer0sum
- JimmyD

You guys all have some pretty sweet setups, by the way...


----------



## MrTOOSHORT

Congrats to the new GTX 690 owners

- Lukas026
- worms14
- wermad
- TeamBlue
- wholeeo
- zer0sum
- JimmyD

for having the fastest gaming Nvidia card.


----------



## Lukas026

Thank you for adding me to the club, but it seems you can cross me off for some time: I can't get my card to work properly again, and it seems like an RMA is the only way.

I tried to flash it with my monitor plugged into the IGP on my Z77 MPower, but with no luck. When I run nvflash in DOS mode and type --list, it says no NVIDIA adapters can be found. I also tried the card in other PCIe slots; again, no luck. I also tried it in another computer to rule out the PSU: no signal at all again.

If there is anything more I can do, let me know. In fact, I'm really desperate at the moment because it was such a nice card. With the Twin Turbo my 690 was quiet and clocked nicely to +125 / +600. Damn it.









thank you for your answers


----------



## wermad

Hate paying Cali tax, but love the fast Cali shipping!


----------



## mtbiker033

Has anyone used the KGB BIOS editor to unlock their 690? Is it worth doing?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *wermad*
> 
> Hate paying Cali tax, but love the fast Cali shipping!
> 
> *snip*


Awesome, glad to see you've finally found some.

Quote:


> Originally Posted by *mtbiker033*
> 
> has anyone used the KGB bios editor to unlock their 690? is it worth doing?


I've tried so many BIOSes, plus the KGB and V3dt BIOS editors, and nothing does a thing. Voltage is locked no matter what. I do use an EVGA Hydro GTX 690 BIOS, though; it doesn't do much more than up the clocks a tad without overclocking.

On a side note...LoL


----------



## mtbiker033

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Awesome, glad to see you've finally found some.
> I've tried so many bios', KGB and V3dt bios editors and nothing does a thing. Voltage is locked no matter what. I do use an EVGA Hydro GTX 690 bios though, doesn't really do much than up the clocks a tad without overclocking.
> 
> On a side note...LoL


thanks for the reply!

epic 690th rep!!


----------



## Forrester

Hi guys, anyone else here play Borderlands 2? I'm having performance issues even at 1920x1080: low GPU usage but low framerates (on a 690, of course, haha).


----------



## JimmyD

Quote:


> Originally Posted by *Lukas026*
> 
> thank you for adding me to the club but it seems for now you can cross me off for some time. i cant get my card to work again properly and it seems like RMA is only way.
> 
> I tried to flash it when I have my monitor plugged to my IGD on z77 mpower but with no luck. when i run nvflash in DOS mode and type --list it says no nvdidia adapters can be found. I also tried it with the card in other pcie slots. again no luck. I also tried it in another computer to see if it isnt just PSU - no signal at all again.
> 
> if there is something which i can do more let me know. in fact i am realy desperate atm becouse it was so nice card. with my twin turbo 690 it was quiet and clock nicely to +125 / + 600. damn it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> thank you for your answers


If you tried it in another PC (motherboard and all) and still got nothing, I think it's almost 100% the card. Sorry, bro.


----------



## wholeeo

Quote:


> Originally Posted by *wermad*
> 
> Hate paying Cali tax, but love the fast Cali shipping!


I have to wait till Friday.


----------



## JimmyD

Quote:


> Originally Posted by *Forrester*
> 
> hi guys, anyone else here play borderlands 2? i'm having issues with performance even at 1920x1080 w/ low gpu usage but low framerates (690 of course haha)


Check the NVIDIA control panel to make sure multi-GPU is enabled. I have some games that seem to untrigger it from time to time. Also, I'm sure you have, but try a different driver; the 314.14 beta is out.

I'm experiencing trouble with a couple of games that never used to give me any: BF3 and Batman: AA.

Was Borderlands 2 always giving you trouble?


----------



## wermad

Ugh, I had to drill out a couple of standoffs since two of the Koolance screws didn't reach. Luckily, all the GPU mounting screws passed through.


----------



## Romin

Anyone with a 690 online right now? I need a favor ASAP!


----------



## Romin

I need a 1080p FurMark benchmark with 0x MSAA. I'd appreciate it if someone could run one!


----------



## wermad

Quote:


> Originally Posted by *Romin*
> 
> I need a 1080P Furmark benchmark with 0 MSAA. I'll appreciate if someone could !


My rig is down, but I wouldn't recommend running FurMark. It's been noted to be too harsh on a GPU, which is why you don't see FurMark in recent reviews. Risking an expensive 690 is not worth FurMark, IMHO. Why not ask for 3DMark 11 or Heaven/Valley?


----------



## Romin

Quote:


> Originally Posted by *wermad*
> 
> My rig is down, but i wouldn't recommend to run Furmark. Its been noted to be too harsh on a gpu and that's why you don't see Furmark in recent benchmarks. Risking an expensive 690 is not worth Furmark imho. Why not ask for 3d11 or Heaven/Valley?


I just want to see the FPS Fraps shows! Not even OCed!


----------



## wermad

There's a reason reviewers have stopped using FurMark. I don't condone using it to gauge a GPU's performance.


----------



## Romin

I don't think running it for 20-30 seconds would do any damage!


----------



## wermad

Quote:


> Originally Posted by *Romin*
> 
> I don't think running it for 20-30 sec would do any damage !


Bust out $1,000 for a 690 and find out for yourself if you're so sure.


----------



## Romin

Quote:


> Originally Posted by *wermad*
> 
> bust out $1000 for a 690 and find out yourself if you're so sure


You can't be serious!! Just 10 seconds!
FYI, I had a 690 once!


----------



## wermad

Isn't SLI Titan still better? Two single-GPU cards have always had an edge over their dual-GPU counterparts. With how limited this thing's run will be, I'm sure premiums will hit $2100-2500. Asus Mars III at ~$3000, anyone?









Still happy with my 690s


----------



## ahnafakeef

Hi everyone! Need some help here!

What are the major issues with gaming at 1080p on a 690, and how bad are they? Microstuttering, VRAM limitations, what else?

Also, what is the highest OC the 690 can achieve, and what is the average? Can it at least reach stock 680 clocks?

I'm confused between a 690 and a Titan. I want to know what I'm getting into if I get a 690. Please provide the necessary information so I can make the right decision. Thanks a lot!


----------



## PinzaC55

In case anyone in the UK is interested, somebody on eBay has 3 EK GTX 690 backplates advertised for £14 each with free postage: http://www.ebay.co.uk/itm/EK-FC690-GTX-Backplate-Black-/261183839570?pt=UK_Computing_Water_Cooling&hash=item3ccfc51d52 Note: I am not the seller, nor do I know him; just trying to help other GTX 690 owners.







I have my phone set up to alert me of any GTX 690 items on the site.


----------



## justanoldman

Quote:


> Originally Posted by *ahnafakeef*
> 
> Hi everyone! Need some help here!
> 
> What are the major issues with gaming on 1080p with a 690? and how bad are they?
> Microstuttering, VRAM limitations? and what else?
> 
> Also, what is the highest OC that the 690 can achieve? and what is the average? can it at least achieve stock 680 clocks?
> 
> Confused between a 690 or a Titan. I want to know what I'm getting into if I get a 690. Please provide necessary information so I can make the right decision. Thanks a lot!


Haven't had mine long, but I have not had any "issues". For single-monitor 1080p gaming I would think a single 680 or a good 670 could take care of it; the 690 or Titan seems like overkill. Two 670s for $700 would more than handle it. Look in the Valley 1.0 thread and you can see the scores by setup. I think it would be rather hard to run out of VRAM on just one 1080p monitor.

I would guess the average OC is in the 1150-1200 range on the core and 1750-1850 on the memory. From everything I have seen: two 670 < one Titan < one 690 < two 680. After that you are talking about $2k, so if you are going to spend that much it is up to you how to spend it.


----------



## maximus56

Quote:


> Originally Posted by *ahnafakeef*
> 
> Hi everyone! Need some help here!
> 
> What are the major issues with gaming on 1080p with a 690? and how bad are they?
> Microstuttering, VRAM limitations? and what else?
> 
> Also, what is the highest OC that the 690 can achieve? and what is the average? can it at least achieve stock 680 clocks?
> 
> Confused between a 690 or a Titan. I want to know what I'm getting into if I get a 690. Please provide necessary information so I can make the right decision. Thanks a lot!


No issues... it OCs better than a single Titan. I have quad 690s and quad Titans.


----------



## PinzaC55

Quote:


> Originally Posted by *maximus56*
> 
> No issues ...OCs better than a single Titan. I have a quad 690 and a quad Titan.


I see from your specs you have a Mad Catz Strike 7 keyboard? If you were me and thinking about getting one, would you?


----------



## mtbiker033

Quote:


> Originally Posted by *ahnafakeef*
> 
> Hi everyone! Need some help here!
> 
> What are the major issues with gaming on 1080p with a 690? and how bad are they?
> Microstuttering, VRAM limitations? and what else?
> 
> Also, what is the highest OC that the 690 can achieve? and what is the average? can it at least achieve stock 680 clocks?
> 
> Confused between a 690 or a Titan. I want to know what I'm getting into if I get a 690. Please provide necessary information so I can make the right decision. Thanks a lot!


I have only had mine a few weeks and haven't tweaked it much but the best OC I did running valley was +125 core and +500 memory. I only have a single 1080p monitor atm but it works absolutely fine on it for Bf3 and Crysis3 at stock speeds.


----------



## maximus56

Quote:


> Originally Posted by *PinzaC55*
> 
> I see from your specs you have a Mad Catz Strike 7 keyboard? If you were me and thinking about getting one, would you?


I like it, but it is a bit of a novelty, and I find the software buggy and lacking support.
The software also conflicts with other apps sometimes, e.g. networking.


----------



## PinzaC55

Quote:


> Originally Posted by *maximus56*
> 
> I like it, but it is bit of a novelty, and I find the software to be buggy and lacking support.
> The software also conflicts with other apps sometimes, ie networking, etc.


Hmmmm... I looked at the reviews on Amazon and some said it was great, some not. Thanks for the info.


----------



## Romin

Quote:


> Originally Posted by *Romin*
> 
> I need a 1080P Furmark benchmark with 0 MSAA. I'll appreciate if someone could !


No one wants to help me out here?! It's really important to me!


----------



## JimmyD

Quote:


> Originally Posted by *Romin*
> 
> No one wants to help me out here ?! It's really important for me !


http://www.google.ca/url?sa=t&rct=j&q=&esrc=s&source=web&cd=10&ved=0CF8QFjAJ&url=http%3A%2F%2Fwww.overclock.net%2Ft%2F1360263%2Fcan-furmark-kill-your-gpu-actually-yes&ei=SQBAUeuCKqHj0gGzloHgDg&usg=AFQjCNEBJX_ykU8rKgl-z5ISrohVDjjXkQ&bvm=bv.43287494,d.dmg&cad=rja

Sorry bro, look up results on Google or YouTube. There seem to be enough people refusing that it may be for good reason. After doing a little research myself, I'm not about to take a chance with my 690s. Plus you probably need it with just one card, not two, and I really don't want to pull a card out. Sorry, no hard feelings.


----------



## Romin

Quote:


> Originally Posted by *JimmyD*
> 
> http://www.google.ca/url?sa=t&rct=j&q=&esrc=s&source=web&cd=10&ved=0CF8QFjAJ&url=http%3A%2F%2Fwww.overclock.net%2Ft%2F1360263%2Fcan-furmark-kill-your-gpu-actually-yes&ei=SQBAUeuCKqHj0gGzloHgDg&usg=AFQjCNEBJX_ykU8rKgl-z5ISrohVDjjXkQ&bvm=bv.43287494,d.dmg&cad=rja
> 
> sorry bro, look up the results on google or youtube. There seems to be enough people refusing, it may be for good reason. After doing a little research myself, I'm not about to take a chance with my 690's. plus you probably just need it with 1 card not 2 and i really don't wanna pull a card out sorry. no hard feelings


I just want a screenshot of Fraps showing the FPS! There is no need to run the benchmark for more than 10 seconds; that way the GPU temp won't even get high. I think it's the temperature that everyone is scared of; am I wrong?


----------



## Lukas026

OK, so today I sent it in for RMA. Guess I'm out of the club for some time...

Does anyone here have experience with MSI's RMA process? I'm from the Czech Republic, btw (Europe).


----------



## ahnafakeef

Quote:


> Originally Posted by *mtbiker033*
> 
> I have only had mine a few weeks and haven't tweaked it much but the best OC I did running valley was +125 core and +500 memory. I only have a single 1080p monitor atm but it works absolutely fine on it for Bf3 and Crysis3 at stock speeds.


What is the highest stable OC in games? I won't be benchmarking, so I'm not concerned about that.
Quote:


> Originally Posted by *justanoldman*
> 
> Haven't had mine long, but have not had any "issues". At 1080p single monitor 2d gaming I would think a single 680 or good 670 could take care of that. The 690 or Titan would be overkill it seems. Two 670s for $700 would more than take care of it. Look in the Valley 1.0 thread and you can see the scores of the setups. I think it would be rather hard to run out of VRAM on just one 1080p.
> 
> I would guess average OC would be in the 1150 to 1200 range on the core, and 1750 to 1850 on the memory. From everything I have seen I get this:
> two 670 < one Titan < one 690 < two 680, after that you are talking about $2k so if you are going to spend that much it is up to you how to spend it.


Going overkill is what I plan to do. It's between the Titan and a 690; I don't want the hassle of two cards.
So for a budget of $1K, you would suggest a 690 over a Titan?
Quote:


> Originally Posted by *maximus56*
> 
> No issues ...OCs better than a single Titan. I have a quad 690 and a quad Titan.


So, a 690 would be better than a Titan even with the VRAM limitation of the 690?

Thanks a lot to all of you!


----------



## justanoldman

Quote:


> Originally Posted by *ahnafakeef*
> 
> Going overkill is what I plan to do. Its between either the Titan or a 690. Dont want to go through the hassle of two cards.
> So for a budget of $1K, you would suggest a 690 over a Titan?
> So, a 690 would be better than a Titan even with the VRAM limitation of the 690?
> 
> Thanks a lot to all of you!


Unfortunately there is no right answer to your question. In the 690 owner thread many will say that card, in the Titan owner thread they will call you stupid not to buy one.

If you plan to expand later where you would add a second 690 or Titan, then I would lean toward Titan because two way sli is easier to deal with than four way.

If you plan on sticking with only one card ever, then the 690 beats the Titan in fps by a decent margin. The 690 is ahead by 20+ fps in the Valley thread, and every review puts it ahead at every resolution. The downside is you have SLI vs. one GPU. Some say you will get microstuttering and some games don't like SLI. Others say they have no microstuttering with the 690 and that almost all current games work with SLI. There are those who claim the Titan is "smoother" than the 690, but you can't measure or quantify that, so take it for what it is worth.

If you are looking for member support then the Titan is the way to go. Since it is new and a good number of people are buying it there is a ton of activity. I mean no offense at all, but this owner's thread is relatively inactive and a number of questions get no response.

If you are ok with the fact that you have sli in one card, a single 690 beats a Titan. If you will add another card later Titan is probably a better choice. I have not heard of anyone having vram issues on single 1080p with 2gb.


----------



## Buzzkill

Quote:


> Originally Posted by *Romin*
> 
> I just want a screenshot of Fraps that shows the FPS! There is no need to run the benchmark for more than 10 secs; that way the GPU temp won't even go high! I think it's a temp problem everyone is scared of, am I wrong?


I run 3D Mark 11, but I have quad 690's. I have not had problems lately. At first, 3D Mark 11 had problems with Windows 8.


----------



## DinaAngel

I'm much happier with this setup of mine than the 690!
http://valid.canardpc.com/2728858


----------



## ahnafakeef

Quote:


> Originally Posted by *justanoldman*
> 
> Unfortunately there is no right answer to your question. In the 690 owner thread many will say that card, in the Titan owner thread they will call you stupid not to buy one.
> 
> If you plan to expand later where you would add a second 690 or Titan, then I would lean toward Titan because two way sli is easier to deal with than four way.
> 
> If you plan on sticking with only one card ever, then the 690 beats the Titan in fps by a decent margin. The 690 is ahead by 20+ fps in the Valley thread, and every review puts it ahead at every resolution. The downside is you have sli vs. one gpu. Some say you will get micro stuttering and some games don't like sli. Others will say they have no micro stuttering with the 690 and almost all current games work with sli. There are those who claim the Titan is "smoother" than the 690, but you can't measure or quantify that, so take it for what it is worth.
> 
> If you are looking for member support then the Titan is the way to go. Since it is new and a good number of people are buying it there is a ton of activity. I mean no offense at all, but this owner's thread is relatively inactive and a number of questions get no response.
> 
> If you are ok with the fact that you have sli in one card, a single 690 beats a Titan. If you will add another card later Titan is probably a better choice. I have not heard of anyone having vram issues on single 1080p with 2gb.


So both options will have their own demerits. I'll just have to deal with it, I guess. Probably not going to risk it with SLI issues (I won't be able to RMA), so I'm going to get a Titan like I originally planned. Thanks!


----------



## jcde7ago

Quote:


> Originally Posted by *justanoldman*
> 
> Unfortunately there is no right answer to your question. In the 690 owner thread many will say that card, in the Titan owner thread they will call you stupid not to buy one.
> 
> If you plan to expand later where you would add a second 690 or Titan, then I would lean toward Titan because two way sli is easier to deal with than four way.
> 
> If you plan on sticking with only one card ever, then the 690 beats the Titan in fps by a decent margin. The 690 is ahead by 20+ fps in the Valley thread, and every review puts it ahead at every resolution. The downside is you have sli vs. one gpu. Some say you will get micro stuttering and some games don't like sli. Others will say they have no micro stuttering with the 690 and almost all current games work with sli. There are those who claim the Titan is "smoother" than the 690, but you can't measure or quantify that, so take it for what it is worth.
> 
> If you are looking for member support then the Titan is the way to go. Since it is new and a good number of people are buying it there is a ton of activity. *I mean no offense at all, but this owner's thread is relatively inactive and a number of questions get no response.*
> 
> If you are ok with the fact that you have sli in one card, a single 690 beats a Titan. If you will add another card later Titan is probably a better choice. I have not heard of anyone having vram issues on single 1080p with 2gb.


I think you meant, "this thread's owner is relatively inactive." And no offense taken.









That is true to a good extent. But for what it's worth, it's a big change in scenery for me from a year ago, when the GTX 690 first released, my job wasn't nearly as stressful as it has been, and I had all the time in the world to comment on each and every question and observation, run benchmarks until dawn, etc. etc. It's more difficult to keep up with an owners' club, or anything else for that matter, when you're working all day, get home, and have ~2 hours tops to fit in eating, exercising, and a little bit of gaming before you're in bed and at work again. But I digress.









A big reason why I don't pop in and respond to questions is that after a year of the 690 being available, unless it is for very specific games, a lot of the information people are looking for is readily available; it just takes a quick bit of searching to find it. Also, a lot of the current members are extremely active and seem to have more time than me to respond...so, I let the community run the thread. If there is a great need for my advice at this point, more so than just "managing" the club and using the information provided by other members, then let me know, and I'll make some time to review whatever comments and questions people feel they need more concrete answers to.

Just know that this is the natural order of things when something else that's big gets released right next door - the Titan.









Speaking of which, here are my thoughts on the whole, GTX Titan vs. 680 SLI/690 deal:

I've said it before and I'll say it again - as the owner of this thread and an owner of a 690 who has benched with a Titan, *for single monitor gaming at 1080/1440/1600p, the GTX 690 is, in my opinion, the absolute, superior choice. Period.*

*But what about SLI issues?* Tell me what game in recent memory had issues with SLI support. Oh, and Titan owners bashing SLI? A good 9/10 of them are probably looking to go the SLI route down the road anyways, so I guess there is a bit of irony in that, no?









*But what about frametimes and smoothness?* http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-TITAN-Performance-Review-and-Frame-Rating-Update/Crysis-3- (*hint: The 690 is still superior*).

*But what about multi-monitor gaming and the 2GB VRAM limitation?* This is where the Titan(s) is a clear choice. Titans in SLI, packed with 6GB of VRAM, running surround, etc., is an absolute game changer. It really is. Unfortunately, surround gaming is far from being the norm, and really, the Titan is kind of stuck in this weird place where it packs the VRAM to tackle such a high resolution on a multi-monitor setup, but will eventually run out of pure muscle unless it's paired with another, or 2 or 3 other Titans. Again, it goes back to my single-monitor-gaming argument.

*Bottom line is, if you're a 60FPS, vsync ON gamer who has to max everything out with 2-4XMSAA at a solid 60FPS (yes, including Crysis 3), at up to a single 1600p monitor, then a 690 is the way to go.* Don't let these talks of imperceptible frametimes or "SLI issues" fool you. I could drop a single paycheck on 3 Titans and think nothing of it, but at this point...why would I downgrade?


----------



## justanoldman

^ Thanks for the response and helpful info. +rep

I agree with your view, and that is why I got a 690 for my single-monitor 2560x1600 gaming. I tried gaming on my 3x24" setup, and while it was fun for a while, my old eyes didn't appreciate it over time.


----------



## PinzaC55

Worth watching


----------



## Arizonian

Quote:


> Originally Posted by *PinzaC55*
> 
> Worth watching


Wow my 690 runs through that much more smoothly.









Quote:


> Originally Posted by *jcde7ago*
> 
> *SNIP*
> 
> Speaking of which, here are my thoughts on the whole, GTX Titan vs. 680 SLI/690 deal:
> 
> I've said it before and i'll say it again - as the owner of this thread and an owner of a 690 who has benched with a Titan, *for single monitor gaming at 1080/1440/1600p, the GTX 690 is, in my opinion, the absolute, superior choice. Period.*
> 
> *But what about SLI issues?* Tell me what game in recent memory had issues with SLI support. Oh, and Titan owners bashing SLI? A good 9/10 of them are probably looking to go the SLI route down the road anyways, so I guess there is a bit of irony in that, no?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *But what about frametimes and smoothness?* http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-TITAN-Performance-Review-and-Frame-Rating-Update/Crysis-3- (*hint: The 690 is still superior*).
> 
> *But what about multi-monitor gaming and the 2GB VRAM limitation?* This is where the Titan(s) is a clear choice. Titans in SLI, packed with 6GB of VRAM, running surround, etc., is an absolute game changer. It really is. Unfortunately, surround gaming is far from being the norm, and really, the Titan is kind of stuck in this weird place where it packs the VRAM to tackle such a high resolution on a multi-monitor setup, but will eventually run out of pure muscle unless it's paired with another, or 2 or 3 other Titans. Again, it goes back to my single-monitor-gaming argument.
> 
> *Bottom line is, if you're a 60FPS, vsync ON gamer who has to max everything out with 2-4XMSAA at a solid 60FPS (yes, including Crysis 3), at up to a single 1600p monitor, then a 690 is the way to go.* Don't let these talks of imperceptible frametimes or "SLI issues" fool you. I could drop a single paycheck on 3 Titans and think nothing of it, but at this point...why would I downgrade?


Great explanation.









The 690, being over 9 months old, is still very relevant for single monitors, just not multiple monitors. In fact, it's the better option for single monitors IMO, but that's a personal decision: whether to want a GTX 690 or to go with SLI or CrossFire for the most performance.

I also see the SLI vs. single GPU question as debatable. I find most OCN users love to add a second card regardless.


----------



## jcde7ago

Quote:


> Originally Posted by *Arizonian*
> 
> Wow my 690 runs through that much more smoothly.


As does mine...that was pretty disappointing, even at 1080p...
Quote:


> The 690, being over 9 months old, is still very relevant for single monitors, just not multiple monitors. In fact, it's the better option for single monitors IMO, but that's a personal decision: whether to want a GTX 690 or to go with SLI or CrossFire for the most performance.
> 
> I also see the SLI vs single GPU option debatable. I find most OCN users love to add a second card regardless.


Yup. Most people talk about the "fastest single GPU solution" etc. etc., bashing SLI along the way, as if every single major game from the last 3-4 years doesn't get an SLI profile or driver from Nvidia BEFORE the game launches, then go and pick up another of said single-card solution anyways...and then how quickly SLI becomes a non-issue for them!









Also, as I've explained many times, this is 2004 no more. We're not dealing with 8800s, GTX 2xx, GTX 4xx...microstuttering has improved significantly to the point of being a non-issue, ESPECIALLY with hardware-based frame metering in the GTX 600 series (most prominently in the 690), which makes microstuttering as good as non-existent, if not better than even 680s in SLI (can't find those benchmarks atm, I'm at work).

This whole frametimes fluff is exactly what it is....fluff. People saying "pure FPS doesn't matter" are delusional, it's as simple as that. The higher the FPS, especially when vsynced at 60 FPS where frametimes matter less because it's not spiking from 120-150+ down to a much lower number, the more fluid the gameplay is going to seem/appear, period. And that's what matters - what we actually perceive.

When I say that I'm maxing out a game at the highest settings with high MSAA levels and experiencing no microstuttering at all with a constant 60 FPS because of VSYNC, I mean exactly that. There isn't some magical stuttering and slowdown that I'm "choosing" to ignore with my $1,000 graphics card. Believe me when I say that I'd have turned around and shipped this 690 back the first hour I had it if I saw anything that even resembled "microstuttering" or gameplay that wasn't smooth due to "frame times." When people are bickering over 1-2 ms of frametime differences, you know Nvidia is sitting back and laughing at their newfound cash cow...now, in order to have ALL THE FPS and ALL THE FRAMETIMES in the world, you gotta buy 2x Titans! How quickly the "FPS is king" script has flipped at OCN over such an imperceptible number as a mere 1 or 2 milliseconds.
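For anyone wondering how big "1-2 ms" actually is: FPS and frametime are just reciprocals of each other, so the math is trivial. A quick sketch (plain Python, nothing 690-specific; the sample numbers are illustrative, not benchmark data):

```python
def frametime_ms(fps: float) -> float:
    """Average time per frame, in milliseconds, at a given framerate."""
    return 1000.0 / fps

def fps_from_frametime(ms: float) -> float:
    """Framerate implied by an average frametime in milliseconds."""
    return 1000.0 / ms

# A vsynced 60 FPS means roughly 16.7 ms per frame.
print(round(frametime_ms(60), 1))                # 16.7
# Adding a whole extra millisecond per frame only drops you to ~56.5 FPS.
print(round(fps_from_frametime(16.7 + 1.0), 1))  # 56.5
```

In other words, at 60 FPS a 1 ms swing is a difference of about 3-4 frames per second, which is the scale of difference being argued over.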


----------



## ahnafakeef

Quote:


> Originally Posted by *jcde7ago*
> 
> As does mine...that was pretty disappointing, even at 1080p...
> Yup. Most people talk about the "fastest single GPU solution" etc. etc., bashing SLI along the way, as if every single major game from the last 3-4 years doesn't get an SLI profile or driver from Nvidia BEFORE the game launches, then go and pick up another of said single-card solution anyways...and then how quickly SLI becomes a non-issue for them!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, as i've explained many times, this is 2004 no more. We're not dealing with 8800s, GTX 2xx, GTX 4xx...microstuttering has improved significantly to the point of being a non-issue, ESPECIALLY with hardware-based frame metering in the GTX 600 series (most prominently in the 690) which makes the microstuttering as good/non-existent if not better than even 680s in SLI (can't find those benchmarks atm, i'm at work).
> 
> This whole frametimes fluff is exactly what it is....fluff. People saying "pure FPS doesn't matter" are delusional, it's as simple as that. The higher the FPS, especially when vsynced at 60 FPS, the more fluid the gameplay is going to seem/appear, period. And that's what matters - what we actually perceive. When people are bickering over 1-2 ms of frametime differences, you know Nvidia is sitting back and laughing at their newfound cash cow...now, in order to have ALL THE FPS and ALL THE FRAMETIMES in the world, you gotta buy 2x Titans! How quickly the "FPS is king" script has flipped at OCN over such an imperceptible number as a mere 1 or 2 milliseconds.


Have you OCed your 690? How high could you get it up to?

How much OC can I expect if I lose the silicon lottery very badly and the chip isn't good?

Thanks for taking the time to reply to my post from your hectic schedule. I really appreciate it.


----------



## jcde7ago

Quote:


> Originally Posted by *ahnafakeef*
> 
> Have you OCed your 690? How high could you get it up to?
> 
> How much OC can I expect if I lose the silicon lottery very badly and the chip isn't good?
> 
> Thanks for taking the time to reply to my post from your hectic schedule. I really appreciate it.


I can get another +125mhz on the core, and +700 on the mem before I get driver crashes. In Crysis 3, I have to drop the core to +100mhz and the mem to +500 to keep the drivers from crashing. I'd say this is on the higher end though, especially without a custom BIOS. I'd say that +75-100mhz is easily attainable on a 690, +300-400 on mem at the least.
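For anyone new to Kepler offsets, those numbers are added on top of whatever clock the card already reaches via GPU Boost. A rough sketch of the arithmetic (the 915/1019 MHz figures are the GTX 690's reference base and typical boost clocks; actual boost varies per card, and power/thermal throttling can lower the result):

```python
# GTX 690 reference clocks: 915 MHz base, ~1019 MHz typical boost.
# An overclocking "offset" shifts the whole clock curve upward.
BASE_MHZ = 915
TYPICAL_BOOST_MHZ = 1019

def effective_clock(offset_mhz: int, boost_mhz: int = TYPICAL_BOOST_MHZ) -> int:
    """Approximate core clock after applying an offset.

    Ignores GPU Boost throttling from power and temperature limits,
    which can pull the real clock below this estimate under load.
    """
    return boost_mhz + offset_mhz

print(effective_clock(125))  # 1144 -> roughly where a +125 offset lands
print(effective_clock(100))  # 1119 -> the Crysis 3-stable setting above
```

This is why two cards with the same offset can still run at different real clocks: the offset is relative to each chip's own boost behavior.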


----------



## Arizonian

Quote:


> Originally Posted by *ahnafakeef*
> 
> Have you OCed your 690? How high could you get it up to?
> 
> How much OC can I expect if I lose the silicon lottery very badly and the chip isn't good?
> 
> *SNIP*


My memory OC is horrible. I did not get lucky with memory overclocking one bit. I'm seeing others get crazy high memory overclocks; nothing over a +100 offset for me or it crashes.

As for core clocks, I get up to 1176 MHz. The second GPU does up to 1202 MHz, but it doesn't matter if GPU #1 is only good for 1176 MHz. A healthy 261 MHz core overclock, stable and maxed.

My scores / specs / stats - second post *HERE* if you want closer look.

Never lucky enough to own a golden chip.









So, in answer to your question of how bad the worst OC can be.... ANY card which can only run @ STOCK clocks.


----------



## maximus56

Quote:


> Originally Posted by *jcde7ago*
> 
> I think you meant, "this thread's owner is relatively inactive." And no offense taken.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That is true to a good extent. But for what it's worth, it's a big change in scenery for me from a year ago, when the GTX 690 first released and my job wasn't nearly as stressful as it has been, and i had all the time in the world to comment on each and every question and observation, run benchmarks until dawn, etc. etc. It's more difficult to keep up with an owner's club, or anything else for that matter, when you're working all day, get home, and have 2~ hours tops to fit in eating, exercising, a little bit of gaming, etc before you're in bed + at work again. But I digress.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A big reason why I don't pop in and generally respond to questions is after a year of the 690 being available, unless it is for very specific games, a lot of the information people are looking for are readily available, it just takes a quick bit of searching, etc. to find it. Also, a lot of the current members are extremely active and seem to have more time than me to respond...so, I let the community run the thread. If there is a great need for my advice at this point moreso than just "managing" the club, and using the information provided by other members, then let me know, and i'll make some time to review whatever comments and questions people feel they need more concrete answers to.
> 
> Just know that this is the natural order of things when something else that's big gets released right next door - the Titan.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Speaking of which, here are my thoughts on the whole, GTX Titan vs. 680 SLI/690 deal:
> 
> I've said it before and i'll say it again - as the owner of this thread and an owner of a 690 who has benched with a Titan, *for single monitor gaming at 1080/1440/1600p, the GTX 690 is, in my opinion, the absolute, superior choice. Period.*
> 
> *But what about SLI issues?* Tell me what game in recent memory had issues with SLI support. Oh, and Titan owners bashing SLI? A good 9/10 of them are probably looking to go the SLI route down the road anyways, so I guess there is a bit of irony in that, no?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *But what about frametimes and smoothness?* http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-GeForce-GTX-TITAN-Performance-Review-and-Frame-Rating-Update/Crysis-3- (*hint: The 690 is still superior*).
> 
> *But what about multi-monitor gaming and the 2GB VRAM limitation?* This is where the Titan(s) is a clear choice. Titans in SLI, packed with 6GB of VRAM, running surround, etc., is an absolute game changer. It really is. Unfortunately, surround gaming is far from being the norm, and really, the Ttian is kind of stuck in this weird place where it packs the VRAM to tackle such a high resolution on a multi-monitor setup, but will eventually run out of pure muscle unless it's paired with another, or 2 or 3 other Titans. Again, it goes back to my single-monitor-gaming argument.
> 
> *Bottom line is, if you're a 60FPS, vsync ON gamer who has to max everything out with 2-4XMSAA at a solid 60FPS (yes, including Crysis 3), at up to a single 1600p monitor, then a 690 is the way to go.* Don't let these talks of imperceptible frametimes or "SLI issues" fool you. I could drop a single paycheck on 3 Titans and think nothing of it, but at this point...why would I downgrade?


This! Without maxing AA a single 690 can handle this res at 120 fps.
I have 4-way SLI Titans, and I am still waiting to be wowed by them at 5760 x 1080. Perhaps part of it is the driver support and throttling issues, but I can tell you that my gaming enjoyment with quad 690s at the same res was no different than having max AA in all games with quad Titans. Yes, I can run Skyrim with 90 mods, an ENB, and 3D, but I am also paying quite a premium for this privilege.
My OC on core is a 129 offset, 700 on the memory, on both 690s, watercooled.


----------



## wermad

I understand and I'm very happy for Titan owners but there's a club for you to discuss all the wonders of your new gpu.

I think we should leave this thread open to 690 discussion. Comparing your new Titan to your 690, or whatever, doesn't make sense here, because neither the Titan nor Tahiti (nor any other GPU) will help a 690 owner, like me, better understand and use his/her *GTX 690*.

I'm getting ready to put my 690s through their paces, and I prefer that my posts and others' posts about the 690 not get lost in the confusion of Titan excitement.

Not hating, but bottom line, it's getting a bit too much, tbh.

Just my two cents.









edit: there's lots of welcome Titan discussion in multiple other threads, like the Surround club, FYI.


----------



## maximus56

Quote:


> Originally Posted by *wermad*
> 
> I understand and I'm very happy for Titan owners but there's a club for you to discuss all the wonders of your new gpu.
> 
> I think we should leave this thread open to 690 discussion. Comparing your new Titan to your 690, or whatever, doesn't make sense here, because neither the Titan nor Tahiti (nor any other GPU) will help a 690 owner, like me, better understand and use his/her *GTX 690*.
> 
> I'm getting ready to put my 690s through their paces and I prefer my posts and other's post about 690 not to get lost in the confusion of Titan excitement.
> 
> Not hating, but bottom line, it's getting a bit too much, tbh.
> 
> Just my two cents.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: there's lots of welcomed Titan discussion in other multiple threads, like the Surround club, fyi


As an owner of both GPUs, I thought providing my 2 cents was in line with someone's question here...lol


----------



## wermad

If someone has questions about the Titan, it's better to ask in the Titan club, nah? Lol









edit: if someone has a question on your setup or your experience, do so via PM if it doesn't relate to the thread's topic. I contacted several SLI 690 owners via PM to get their take on this setup. It's just courtesy to avoid going off topic, tbh.


----------



## maximus56

Quote:


> Originally Posted by *wermad*
> 
> If someone has questions about titan, its better to ask in the titan club, nah? Lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: if someone has a question on your setup or your experience, do so via pm if it doesn't relate to the thread's topic. I contacted several sli 690 owners via pm to get their take on this setup. Its just courtesy to avoid going off topic tbh.


Good suggestion. And, as always, I give two thumbs up to a quad 690!!
Edit: By the way, if I had made similar postings on the Titan owners' forum, I would never hear the end of it from the Titan fanatics! Lol


----------



## PinzaC55

Just a random thought....many people baulk at the cost of the GTX 690 or "the other one", but look at it this way: I bought mine new OEM in December for £630 when it popped up on eBay, and I knew that if I hesitated an hour it would be gone. Let's say I keep it for a year, then sell it when its value may be £300. £630 - £300 is £330, so it will have cost me about 90 pence a day - that's hardly a bad deal, is it?
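The cost-per-day arithmetic works out; as a quick sketch (the £300 resale figure is of course the poster's guess, not a market price):

```python
# Depreciation spread over the period of ownership, per the post above.
purchase_gbp = 630.0   # paid new OEM
resale_gbp = 300.0     # assumed resale value after a year
days_owned = 365

cost_per_day_gbp = (purchase_gbp - resale_gbp) / days_owned
print(f"{cost_per_day_gbp * 100:.0f}p per day")  # 90p per day
```

The same framing applies to any card: what matters for cost of ownership is the purchase-minus-resale gap, not the sticker price alone.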


----------



## pilla99

Quote:


> Originally Posted by *PinzaC55*
> 
> Just a random thought....many people baulk at the cost of the GTX 690 or "the other one", but look at it this way: I bought mine new OEM in December for £630 when it popped up on eBay, and I knew that if I hesitated an hour it would be gone. Let's say I keep it for a year, then sell it when its value may be £300. £630 - £300 is £330, so it will have cost me about 90 pence a day - that's hardly a bad deal, is it?


I don't know that currency, so it makes no sense to me. I will never know if this was in fact a good deal.


----------



## jhager8783

Just gave my 690's some more breathing room by getting rid of the huge Scythe Ninja 2 and installing a Corsair H100i push/pull. Also bought a RAM cooler for good measure.


----------



## tinmann

Mine finally showed up today






I need 2 more DVI cables to make surround work. With the SLI 480's I used one DVI and two HDMI cables. And wouldn't you know it, hardly anyone in the area carries them except Radio Shack and Staples, and they are crazy expensive at both places. I ordered two online; they should be here in a few days. My EVGA backplate will be here tomorrow.


----------



## JimmyD

Quote:


> Originally Posted by *tinmann*
> 
> Mine finally showed up today
> 
> 
> 
> 
> 
> 
> I need 2 more DVI cables to make surround work. With the SLI 480's I used one DVI and two HDMI cables. And wouldn't you know it, hardly anyone in the area carries them except Radio Shack and Staples, and they are crazy expensive at both places. I ordered two online; they should be here in a few days. My EVGA backplate will be here tomorrow.


sweeeeeeeeeeeeeeeeet


----------



## jhager8783

Walmart, buddy! They are $8.99 each in VA.


----------



## jhager8783

I've noticed my 690's.... well.... sagging a little, or drooping per se. Does anybody know of a VGA support bracket that could mount from behind the card, or something that isn't insanely gaudy?

Oh, and does anybody else have the habit of removing heavier graphics cards when they go mobile with their rig? I'm wondering if I'm just being overprotective, but if I were to drop the case by accident, I imagine the GTX's PCB would suffer.


----------



## Arizonian

Quote:


> Originally Posted by *jhager8783*
> 
> I've noticed my 690's.... well.... sagging a little, or drooping per se. Does anybody know of a VGA support bracket that could mount from behind the card, or something that isn't insanely gaudy?
> 
> Oh, and does anybody else have the habit of removing heavier graphics cards when they go mobile with their rig? I'm wondering if I'm just being overprotective, but if I were to drop the case by accident, I imagine the GTX's PCB would suffer.


Get a backplate, which will give it the protection you need. Too bad for $1000 it wasn't part of the card. But I digress.

The only GPU support I've seen comes with the HIS 7970; it mounts to the bottom of the case, holding up the GPU's weight.

Edit: You're the first person to say this is happening. The 690 has an extra-thick PCB too. Mine has had no sag since I installed it 9 months ago. A bit concerning if more people start to experience this.


----------



## wermad

Quote:


> Originally Posted by *jhager8783*
> 
> I've noticed my 690's.... well.... sagging a little, or drooping per se. Does anybody know of a VGA support bracket that could mount from behind the card, or something that isn't insanely gaudy?
> 
> Oh, and does anybody else have the habit of removing heavier graphics cards when they go mobile with their rig? I'm wondering if I'm just being overprotective, but if I were to drop the case by accident, I imagine the GTX's PCB would suffer.


Your HAF-X should have come with a vga support bracket:



For LAN mobility, get a HAF-XB; the horizontal mb layout will not put the same strain on the card as a traditional ATX setup.

If I decide to sell my giant MM case, the HAF-XB would be the next case for me


----------



## jhager8783

Quote:


> Originally Posted by *tinmann*
> 
> Mine finally showed up today
> 
> 
> 
> 
> 
> 
> I need 2 more DVI cables to make surround work. With the SLI 480's I used one DVI and two HDMI cables. And wouldn't you know it, hardly anyone in the area carries them except Radio Shack and Staples, and they are crazy expensive at both places. I ordered two online; they should be here in a few days. My EVGA backplate will be here tomorrow.


Nice rig! I like that you are practicing good cable management. That'll make for great airflow inside your case with these things running at their hardest. Speaking of which, I'm sure you're aware that these cards exhaust both inside and outside the case; if you can, find a way to maximize airflow to compensate. Here's what I did:



Two 140mm front exhaust fans in the drive bay with a tunnel ram in between, a 230mm front exhaust, and the rest of the case fans as intakes.


----------



## wermad

I need to use my new SG3 and take better pics of my quads


----------



## jhager8783

Quote:


> Originally Posted by *wermad*
> 
> Your HAF-X should have come with a vga support bracket:
> 
> 
> 
> For lan mobility, get a HAF-XB, the "Horizon" mb layout will not put the same strain as a traditional atx setup.
> 
> If I decide to sell my giant MM case, the HAF-XB would be the next case for me


I tried using the support bracket before, but the 690 is too thick by 3/8 of an inch for the little legs to fit in between. I don't take the rig mobile very much, so taking the cards out isn't a huge bother. What I'm most concerned about is the sagging. Am I stuck with the factory VGA bracket? Is there nothing else that'll work?


----------



## jhager8783

Quote:


> Originally Posted by *wermad*
> 
> I need to use my new SG3 and take better pics of my quads


The blurrier the better I say, that-a-way nobody else sees all the dust in your case, and/or scratches in the side panel or fingerprints on your cards.


----------



## PinzaC55

Quote:


> Originally Posted by *jhager8783*
> 
> I've noticed my 690's.... well.... sagging a little, or drooping per se. Does anybody know of a VGA support bracket that could mount from behind the card, or something that isn't insanely gaudy?
> 
> Oh, and does anybody else have the habit of removing heavier graphics cards when they go mobile with their rig? I'm wondering if I'm just being overprotective, but if I were to drop the case by accident, I imagine the GTX's PCB would suffer.


My 690 came with an extender bracket which fixes to the rear by 3 screws like this




I suppose you may be able to get them from Nvidia?


----------



## ahnafakeef

Quote:


> Originally Posted by *jcde7ago*
> 
> I can get another +125mhz on the core, and +700 on the mem before I get driver crashes. In Crysis 3, I have to drop the core to +100mhz and the mem to +500 to keep the drivers from crashing. I'd say this is on the higher end though, especially without a custom BIOS. I'd say that +75-100mhz is easily attainable on a 690, +300-400 on mem at the least.


I'm no expert and I mean no offense, but that doesn't sound like too great an OC. Or is it normal for 690s?
Quote:


> Originally Posted by *Arizonian*
> 
> My memory OC is horrible. I did not get lucky with memory overclocking one bit. I'm seeing others get crazy high memory overclocks; nothing over a +100 offset for me or it crashes.
> 
> As for core clocks, I get up to 1176 MHz. The second GPU does up to 1202 MHz, but it doesn't matter if GPU #1 is only good for 1176 MHz. A healthy 261 MHz core overclock, stable and maxed.
> 
> My scores / specs / stats - second post *HERE* if you want closer look.
> 
> Never lucky enough to own a golden chip.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So in answer to your question how bad is the worst OC.... ANY card which can only run @ STOCK clocks.


Memory OCs shouldn't really matter though, especially with a dual-GPU card, right? Doesn't the core clock affect performance more?
Do the two GPUs in a 690 have to be OC'd separately? I had no idea!
I guess my 660 Ti is one of the worst ones then. It crashes even at +50 core, although +300 on the memory ran fine in two games. Haven't tried any lower than +50.

Thanks a lot to both of you!


----------



## jcde7ago

Quote:


> Originally Posted by *maximus56*
> 
> This! Without maxing AA a single 690 can handle this res at 120 fps.
> I have 4 way sli Titan, and I am still waiting to be wowed by it on 5760 x 1080. Perhaps part of it is the driver support, and throttling issues, but I can tell you that my gaming enjoyment with quad 690 on the same res was no different than having max AA in all games with quad titans. Yes, I can run skyrim with 90 mods, enb, and 3d but I am also paying quite a premium for this privilege.
> My OC on core is 129 offset, 700 on the memory on both 690s, watercooled.


Thanks, appreciate the comparison info.









Quote:


> Originally Posted by *tinmann*
> 
> Mine finally showed up today
> 
> 
> 
> 
> 
> 
> I need 2 more DVI cables to make Surround work. With SLI 480's I used one DVI and two HDMI cables. And wouldn't you know it, hardly anyone in the area carries them except Radio Shack and Staples, and they are crazy expensive at both places. I ordered two online; they should be here in a few days. My EVGA backplate will be here tomorrow.


Congrats....added!








Quote:


> Originally Posted by *ahnafakeef*
> 
> I'm no expert and I mean no offense, but that doesn't seem like too great an OC. Or is it normal for 690s?


It's a fairly decent OC, nothing too special. Again though, a lot of the high OCs are with a modded BIOS, etc., which I'm not running.


----------



## TeamBlue

Quote:


> Originally Posted by *PinzaC55*
> 
> My 690 came with an extender bracket which fixes to the rear by 3 screws like this
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I suppose you may be able to get them from Nvidia?


Those are DELL parts, yeah?


----------



## PinzaC55

Quote:


> Originally Posted by *TeamBlue*
> 
> Those are DELL parts, yeah?


Not as far as I'm aware. It simply came OEM with only Nvidia logos and the extender bracket attached. AFAIK all 690s have the screws, and I found a reference to it somewhere on the Nvidia site. In my HAF X case it doesn't attach to anything; it just sits on top of the drive cage, as per the second photo.


----------



## RobotDevil666

Hey guys ...
I got my 690 two weeks ago and I'm very happy with it. I can do a conservative +100 on core and +100 on mem; I tried +150 but Crysis 3 crashed







so for now I'll stick with what I have.
I have a question though: what temps are you guys getting on air, and how high does the fan go? Mine tops out at 80-82C with the fan around 60%, and to be honest it's not the quietest card I've had. Then again, maybe I got spoiled by DirectCU II cards.


----------



## wholeeo

Crysis 3 crashes on me randomly at stock settings. It worked out perfectly for me last night, though, since it was getting late and it crashed after 2 hours of gameplay, right after I reached a checkpoint.







Not sure if my card doesn't have enough voltage set for default clocks or Crysis 3 just has some bugs. Perhaps Prime95 24-hour stable isn't really stable; not sure what's going on.


----------



## wermad

What is this Crysis 3 ppl are talking about??????







Ugh, I really can't see myself paying $60 for a new game. Waiting for the sales









Finally did some gaming and it was an OK experience. I did have significant slowdown in a mildly intensive game, but I'm suspecting it's a driver issue. I do love the super low temps: high 20s with all four cores at ~45% usage (hardly pushing them







).


----------



## jhager8783

Quote:


> Originally Posted by *TeamBlue*
> 
> Those are DELL parts, yeah?


Regardless, the screws in the rear of the card give me an idea to make a support bracket that mounts from behind the motherboard. I'll get to work on something and let you know how it goes.


----------



## Qu1ckset

Quote:


> Originally Posted by *wermad*
> 
> What is this Crysis 3 ppl are talking about??????
> 
> 
> 
> 
> 
> 
> 
> Ugh, I really can't see myself paying $60 for a new game. Waiting for the sales
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Finally did some gaming and it was an OK experience. I did have significant slowdown in a mildly intensive game, but I'm suspecting it's a driver issue. I do love the super low temps: high 20s with all four cores at ~45% usage (hardly pushing them
> 
> 
> 
> 
> 
> 
> 
> ).


I play at 1440p (or did, until my monitor died two days ago) and my single GTX 690 demolished any game I threw at it, getting 80-110fps at max settings, until I installed Crysis 3. That game brings my 690 to its knees in the in-game cutscenes, going from 60-80fps down to 35fps lol

The eye candy is amazing though, best graphics to date!


----------



## justanoldman

Quote:


> Originally Posted by *RobotDevil666*
> 
> Hey guys ...
> I got my 690 two weeks ago and I'm very happy with it , i can do conservative +100 on core and +100 on mem , tried +150 but Crysis 3 crashed
> 
> 
> 
> 
> 
> 
> 
> so for now i stick with what i have.
> I have a question though , what are the temps you guys are getting on air ? and how high the fan goes , mine tops at 80-82C with fan around 60% and to be honest it's not the quietest card i had , than again maybe i got spoiled by DirectCU II cards.


Have you tried raising the memory while keeping the core at +100? I ask because many of these cards can't do +150 on the core but can do at least +500 on the memory.

As for keeping it cool, I'm not sure there is much you can do to keep it below the 70C throttle point on air. I'm looking to add it to my new H220 loop at some point; the fan noise while maxed is not exactly quiet (not as bad as some others), so I look forward to water cooling it.


----------



## PinzaC55

Quote:


> Originally Posted by *RobotDevil666*
> 
> Hey guys ...
> I got my 690 two weeks ago and I'm very happy with it , i can do conservative +100 on core and +100 on mem , tried +150 but Crysis 3 crashed
> 
> 
> 
> 
> 
> 
> 
> so for now i stick with what i have.
> I have a question though , what are the temps you guys are getting on air ? and how high the fan goes , mine tops at 80-82C with fan around 60% and to be honest it's not the quietest card i had , than again maybe i got spoiled by DirectCU II cards.


Running Unigine Valley benchmark on Extreme settings my card hit 80 degrees C but cooled down pretty quickly. The fan is very quiet on mine.


----------



## wermad

Quote:


> Originally Posted by *Qu1ckset*
> 
> I play at 1440p (or did, until my monitor died two days ago) and my single GTX 690 demolished any game I threw at it, getting 80-110fps at max settings, until I installed Crysis 3. That game brings my 690 to its knees in the in-game cutscenes, going from 60-80fps down to 35fps lol
> 
> The eye candy is amazing tho, best graphics to date!


I really can't po' the wife any more with more expenses. I dumped a lot of cash on this new build and three Dell 1200s (they're phenomenal!). Thought about getting three Korean 1440s, but it's just too sketchy and I hear too many stories of them dying pretty quickly.


----------



## maximus56

Quote:


> Originally Posted by *RobotDevil666*
> 
> Hey guys ...
> I got my 690 two weeks ago and I'm very happy with it , i can do conservative +100 on core and +100 on mem , tried +150 but Crysis 3 crashed
> 
> 
> 
> 
> 
> 
> 
> so for now i stick with what i have.
> I have a question though , what are the temps you guys are getting on air ? and how high the fan goes , mine tops at 80-82C with fan around 60% and to be honest it's not the quietest card i had , than again maybe i got spoiled by DirectCU II cards.


Lol... don't feel too bad. C3 gives me 35-40 fps with 16x AA at 5760x1080 on quad Titans, and is a stuttering mess.
I think C3 is either not well optimized or the drivers have not fully caught up to it yet.


----------



## Qu1ckset

Quote:


> Originally Posted by *wermad*
> 
> I really can't po' the wife any more with more expenses. I dumped a lot of cash on this new build and three Dell 1200s (they're phenomenal!). Thought about getting three Korean 1440s, but it's just too sketchy and I hear too many stories of them dying pretty quickly.


Well, I got a new PCB for my monitor shipping out today; hopefully that fixes the problem. 1440p and 1600p look so amazing, I can't bring myself to a lower res.

I'm waiting for some smaller-bezel 1440p monitors to be released for some 3x 1440 portrait gaming! Landscape sucked IMO.


----------



## wholeeo

Quote:


> Originally Posted by *wermad*
> 
> What is this Crysis 3 ppl are talking about??????
> 
> 
> 
> 
> 
> 
> 
> Ugh, I really can't see myself paying $60 for a new game. Waiting for the sales
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Finally did some gaming and it was an ok experience. I did have significant slow down in a mildly intensive game but I'm suspecting its a driver issue. I do love the super low temps. High 20s with all fou cores ~45% usage (hardly pushing them
> 
> 
> 
> 
> 
> 
> 
> ).


I purchased my copy (Origin key) on eBay off someone selling the AMD Never Settle Bundle. I rarely pay $60 for a game.


----------



## wermad

Quote:


> Originally Posted by *Qu1ckset*
> 
> Well I got a new pcb for my monitor shipping out today, hopefully that fixes the problem.. 1440p and 1600p looks so amazing, I can't bring myself to getting a lower rez.
> 
> I'm waiting for some smaller bezel'd 1440p monitors to be released for some 3x 1440 portrait gaming! , landscaped sucked Imo


I'm a portrait guy too. I just don't see the appeal of landscape. Wish Dell 1440s (or any other "major" brand) would offer them at a cheaper price. You still gotta cough up ~$500 for a guaranteed-perfect Korean 1440; still better than the major brands, but there are the uncertainties. It's not a major step from 1080 to 1200, but the extra space going from 21.5" to 24" is sweet, especially in portrait.
Quote:


> Originally Posted by *wholeeo*
> 
> I purchased my copy (Origin key) on eBay off someone selling the AMD Never Settle Bundle. I rarely pay $60 for a game.


I'm hunting for something like that; I just don't want to buy a dead key. Just bought MP3 for $10 since it was cheap and it's a game used in a lot of reviews/testing. I still have my older games that I love and don't mind playing (Metro, Crysis 2, CoD: WaW, etc.); they can keep me busy until newer titles come down in price.


----------



## RobotDevil666

Quote:


> Originally Posted by *wermad*
> 
> I'm hunting for something like that; I just don't want to buy a dead key. Just bought MP3 for $10 since it was cheap and it's a game used in a lot of reviews/testing. I still have my older games that I love and don't mind playing (Metro, Crysis 2, CoD: WaW, etc.); they can keep me busy until newer titles come down in price.


Why don't you buy it from one of the CD key selling sites? That's what I usually do; I got my Crysis 3 key for £27 on launch day, while it was 39.99 on Origin.
Since Crysis 3 has been out a few weeks, I expect the prices to be a bit lower.


----------



## Maximilium

You can add me:

2 SLI EVGA GTX 690 HydroCopper


----------



## Arizonian

^^^Nice clean black n white rig^^^


----------



## Maximilium

Thanks.


----------



## TeamBlue

Is that a 360 and a 200? I'm putting a 3570k and 690 on a 240 and a 200, curious about your temps.
Quote:


> Originally Posted by *Maximilium*
> 
> Thanks.


----------



## Qu1ckset

Quote:


> Originally Posted by *TeamBlue*
> 
> Is that a 360 and a 200? I'm putting a 3570k and 690 on a 240 and a 200, curious about your temps.


I've got a 3930K and a single GTX 690 on a HWLabs Stealth 240 and a HWLabs GTX 240, with the CPU @ 4.0 and GPU @ stock. While gaming the CPU is around 41C and the GPUs 29-32C, and 36C in Crysis 3 lol


----------



## jhager8783

Hey guys, I've got a strange problem: my 690 SLI setup all of a sudden only wants to run in 3-way SLI. I've tried everything I know to do: uninstall drivers with Driver Sweeper, reinstall, switching the cards in their slots, disabling PhysX... Sometimes it tells me that my SLI connector is missing, so I shut down, flip it over and the message goes away, but to the same effect. 3-way SLI is fine for me, but I'd like to know whether one of my cards is messed up. Anyone ever had a problem like this?

On a side note, Microsoft Silverlight somehow managed to install itself on my rig without my permission. The same thing happened to my dad's computer, which stopped booting after it installed itself, and my mom's computer, which ran at 100% CPU at all times until I uninstalled it. Uninstalling it made no difference to my problem. The only hardware change I've made recently is adding the H100i. If someone has an idea or solution, I would be grateful.


----------



## wermad

Quote:


> Originally Posted by *jhager8783*
> 
> Hey guys, I've got a strange problem: my 690 SLI setup all of a sudden only wants to run in 3-way SLI. I've tried everything I know to do: uninstall drivers with Driver Sweeper, reinstall, switching the cards in their slots, disabling PhysX... Sometimes it tells me that my SLI connector is missing, so I shut down, flip it over and the message goes away, but to the same effect. 3-way SLI is fine for me, but I'd like to know whether one of my cards is messed up. Anyone ever had a problem like this?
> 
> On a side note, Microsoft Silverlight somehow managed to install itself on my rig without my permission. The same thing happened to my dad's computer, which stopped booting after it installed itself, and my mom's computer, which ran at 100% CPU at all times until I uninstalled it. Uninstalling it made no difference to my problem. The only hardware change I've made recently is adding the H100i. If someone has an idea or solution, I would be grateful.


Could be a bad core. Do all four show up in GPU-Z? If they do, it could be one card's PLX bridge not working. Try going to a single card and test it. Make sure you enable SLI to bridge both GPUs. Also, double-check your motherboard BIOS version; make sure it's up to date. I'm still trying to sort out the bugs on mine, so I really don't have much input atm. I'm running 314.xx WHQL right now, btw.

Try restoring to a point earlier than the Silverlight installation.
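If you'd rather script the "do all the GPUs show up" check than eyeball GPU-Z, here's a rough Python sketch that counts GPU entries from `nvidia-smi -L` style output. The tool name, its `-L` flag, and the sample lines below are assumptions about your driver install, not something confirmed in this thread:

```python
import subprocess

def count_gpus(listing: str) -> int:
    """Count 'GPU n: ...' entries in `nvidia-smi -L` style output."""
    return sum(1 for line in listing.splitlines() if line.strip().startswith("GPU "))

# Made-up sample output for a quad-SLI (2x GTX 690) rig -- four entries expected.
sample = """\
GPU 0: GeForce GTX 690 (UUID: GPU-aaaa)
GPU 1: GeForce GTX 690 (UUID: GPU-bbbb)
GPU 2: GeForce GTX 690 (UUID: GPU-cccc)
GPU 3: GeForce GTX 690 (UUID: GPU-dddd)
"""
print(count_gpus(sample))  # 4

# On a live system (requires the NVIDIA driver's nvidia-smi on the PATH):
# out = subprocess.check_output(["nvidia-smi", "-L"], text=True)
# print(count_gpus(out))
```

If fewer entries than expected show up, pull the cards and test each one alone, as above.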


----------



## jcde7ago

Quote:


> Originally Posted by *wermad*
> 
> I really can't po' the wife any more with more expenses. I dumped a lot of cash on this new build and three dell 1200s (they're phenomenal!). Thought about getting three Korean 1440s but its just too sketchy and i hear too many stories of them dieing pretty quick.


My Achieva Shimian 2560x1440 IPS that I've had since January of last year (going on 15 months now) has been absolutely flawless. No light bleed, no dead/stuck pixels, etc. Best $290 I've ever spent on computer hardware, and I took a huge gamble, as I think I was one of the first 7-8 people to buy them in the initial "Korean IPS monitor" thread that started the craze.








Quote:


> Originally Posted by *Maximilium*
> 
> You can add me:
> 
> 2 SLI EVGA GTX 690 HydroCopper


Congrats, added! And that is one CLEAN setup you've got there.


----------



## jhager8783

I switched to the third DVI port and it shows quad now; it's just unsettling that it was working fine off the first, but now it's not. Strange...


----------



## zer0sum

Quote:


> Originally Posted by *TeamBlue*
> 
> Is that a 360 and a 200? I'm putting a 3570k and 690 on a 240 and a 200, curious about your temps.


I have a 3570K @ 4.7GHz and a 670 @ 1300/7000 on a 180mm and a 120mm rad in my TJ08, and the temps are low.
Just switching over to a 690 and should have some feedback on temps once I get some time to mess with it.


----------



## wermad

Quote:


> Originally Posted by *jcde7ago*
> 
> My Achieva Shimian 2560x1440p IPS i've had since January of last year (going on 15 months now) has been absolutely flawless. No light bleed, no dead/stuck pixels, etc. Best $290 i've ever spent on computer hardware - and I took a huge gamble, as I think I was one of the first 7-8 people to buy them in the initial, "Korean IPS monitor" thread that started the craze.


I heard all those who got their screens very early got lucky with nice ones; now they sell crap in that price range. I can't justify ~$1500 on monitors and then have one or more of them crap out on me. I'm pretty happy with my new Dells; a much-needed upgrade from my old 22" ones.
Quote:


> Originally Posted by *jhager8783*
> 
> I switch to the third dvi port and it shows quad now, it just unsettling that it was working fine off the first but now it's not. Strange...


I know the primary DVI is the one on its own; the secondary ones are the stacked ones, from what I can figure from the Nvidia CP. Either way, that shouldn't have prevented SLI from enabling on all cores.


----------



## TeamBlue

Quote:


> Originally Posted by *wermad*
> 
> I heard all those who got their screens very early got lucky with nice ones; now they sell crap in that price range. I can't justify ~$1500 on monitors and then have one or more of them crap out on me. I'm pretty happy with my new Dells; a much-needed upgrade from my old 22" ones.
> I know the primary DVI is the one on its own; the secondary ones are the stacked ones, from what I can figure from the Nvidia CP. Either way, that shouldn't have prevented SLI from enabling on all cores.


I went the pixel-perfect/SquareTrade warranty route. Cost me $375 total, but it comes with the assurance that if it dies, I get a prepaid shipping label and a refund for the full purchase amount!


----------



## wholeeo

Just installed the EVGA backplate without removing the waterblock.


----------



## wermad

Quote:


> Originally Posted by *wholeeo*
> 
> Just installed the EVGA backplate without removing the waterblock.


Nice! Same here, but I had to drill out a couple of the standoffs since the Koolance screws didn't pass all the way through.


----------



## wholeeo

Quote:


> Originally Posted by *wermad*
> 
> Nice! Same here but i had to drill out a couple of the standoffs since the Koolance screws didn't pass all the way through


Don't you have quad SLI? How did you manage to do that without removing the blocks?

I used a Children's Mucinex bottle with some painters tape on its top to hold up the block. It was the perfect size.


----------



## wermad

Quote:


> Originally Posted by *wholeeo*
> 
> Don't you have quad SLI? How did you manage to do that without removing the blocks?
> 
> I used a Children's Mucinex bottle with some painters tape on its top to hold up the block. It was the perfect size.


I have a good drainage system, so it's easy to pull them out. I didn't have to remove the block from the PCB since the backplate didn't need a bunch of screws to attach. And honestly, it's not gonna provide a lot of structural support.

edit: my mb has ideal 16x/16x slot spacing.


----------



## Stateless

Hey guys. I have moved onto the Titan GPU and as I was doing some cleaning and organizing, I realized I had a brand new, still in box Koolance Waterblock for the GTX690. I had bought 2 when I bought 2 690's and then realized that I did not need two 690's. I had set the extra block in a box and just forgot about it.

If anyone is interested in it for purchase, send me an offer via PM.

Thanks!


----------



## worms14

Hello again.
My Gigabyte GTX 690 died, and I now have an EVGA GTX 690.
I would like to improve its air-cooling temperatures.
Has anyone changed the paste on the EVGA? Is it worth doing?
Would reseating the heatsink help with temps?
Maybe someone who has done this before can share some information?


----------



## Nocturnal Link

Quote:


> Originally Posted by *egotrippin*


（φ　ﾟ Дﾟ）φ

O.M.G.

What monitors are those and where can I get them?!


----------



## MrTOOSHORT

Quote:


> Originally Posted by *worms14*
> 
> Hello again.
> My Gigabyte GTX 690 died, and I now have an EVGA GTX 690.
> I would like to improve its air-cooling temperatures.
> Has anyone changed the paste on the EVGA? Is it worth doing?
> Would reseating the heatsink help with temps?
> Maybe someone who has done this before can share some information?


It's almost always worth changing the paste on the gpu and is very easy to do with the GTX 690.


----------



## maximus56

Quote:


> Originally Posted by *jhager8783*
> 
> Hey guys, I've got a strange problem, my 690 sli setup all of the sudden only wants to run in 3 way sli. I've tried everyhing I know to do , uninstall drivers with driver sweeper, reinstall... switching the cards in their slots, disabling physx... sometimes it tells me that my sli connector is missing, so I shut down, flip it over and the message goes away, but to the same effect. 3 way sli is fine for me, but I'd like to know whether ome of my cards is messed up. Anyone ever had a problem like this?
> 
> On a side note, microsoft silverlight somehow managed to install itself on my rig without my permission. The same thing happened to my dad's computer which ceased booting after it installed itself, and my mom's computer which ran at % 100 cpu at all times until I uninstalled it. Uninstalling it made no difference with my problem. The only hardware changes I've made recently is adding the H100i. If someone has an idea or solution, I would be gratefull.


It would happen to me on quad 690 if KBoost got corrupted somehow. Try disabling KBoost and reinstalling the drivers in safe mode. Keep KBoost disabled after rebooting into Windows normally, and see if that fixes the problem. If not, then you may have to restore to a previous point when this was working properly, or do a fresh OS install.


----------



## benjaminlev09

Hey all,

anyone tested those new BETA 314.21 drivers yet on the 690?

I'm currently using 313.96 with no issues and I want to know if there are any differences.


----------



## justanoldman

Quote:


> Originally Posted by *benjaminlev09*
> 
> Hey all,
> 
> anyone tested those new BETA 314.21 drivers yet on the 690?
> 
> I'm currently using 313.96 with no issues and I want to know if there are any differences.


I just beat MrTooShort in Valley, so for my quick experiment the driver is a little better. Of course, when he uses the driver, I will lose again.


----------



## mtbiker033

Quote:


> Originally Posted by *benjaminlev09*
> 
> Hey all,
> 
> anyone tested those new BETA 314.21 drivers yet on the 690?
> 
> I'm currently using 313.96 with no issues and I want to know if there are any differences.


I installed 314.21 last night; my Valley score was about the same, and I played some Crysis 3 and it was fine. Not sure this driver is any big deal unless you play Tomb Raider, which I think is the point of this release.


----------



## JimmyD

Quote:


> Originally Posted by *mtbiker033*
> 
> I installed 314.21 last night; my Valley score was about the same, and I played some Crysis 3 and it was fine. Not sure this driver is any big deal unless you play Tomb Raider, which I think is the point of this release.


Exactly, it's clearly just for Tomb Raider. Nvidia claims up to a 65% increase.
Everything maxed... mine hasn't dropped below 60fps... I was getting 44fps.
Note: I'm locked to 60 with a 60Hz screen & vsync on. I imagine I'm getting somewhere around 90, but I haven't run a benchmark to see. Good improvements either way.


----------



## Maximilium

Quote:


> Originally Posted by *TeamBlue*
> 
> Is that a 360 and a 200? I'm putting a 3570k and 690 on a 240 and a 200, curious about your temps.


On Idle without overclocking:

Motherboard = 30°
CPU = 32°
Video-card #1 GPU's = 29° & 29°
Video-card #2 GPU's = 30° & 30°
DRAM = 28°


----------



## itskerby

Can someone with a single 690 and 3+ monitors tell me what MHz your card idles at on the desktop? (Or which P-state, if you know it.)


----------



## PCModderMike

Been kinda quiet in here....








I've been busy getting my 690 under water.


----------



## wermad

Going to post some benchmarks once I sort out this mb. I'm stable at 4.8, but I know my CPU can do more. This mb is a bit confusing because the terminology is a bit foreign to me (I'm used to Asus).

I can tell you I get a solid (and pegged) 90fps in CoD WaW in Surround (3600x1920)







. Crysis 2 almost maxed, ~50-60fps with vram pegged at 2GB. In comparison, my 580s pushed 30-40fps, but interestingly enough vram was at 1.8GB (quad 580 3GB).

Just downloaded BF3 and Crysis 3, since I've had a few requests to run some numbers. Gonna purchase the full-featured Fraps to get some graphs going for you guys
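Once the Fraps frametime logs are in hand, turning them into min/avg/max FPS is a few lines. A minimal sketch, assuming a plain list of milliseconds per frame (Fraps' real CSV has a header row you'd strip first; the sample numbers are made up):

```python
def fps_stats(frame_times_ms):
    """Convert per-frame render times (ms) into (min, avg, max) FPS."""
    fps = [1000.0 / t for t in frame_times_ms if t > 0]
    return min(fps), sum(fps) / len(fps), max(fps)

# Hypothetical log: mostly ~60fps frames with one ~30fps hitch.
sample = [16.7, 16.7, 33.3, 16.7]
lo, avg, hi = fps_stats(sample)
print(f"min {lo:.0f} / avg {avg:.0f} / max {hi:.0f} fps")  # min 30 / avg 52 / max 60 fps
```

Average FPS alone hides the hitches, which is why the min (or frametime percentiles) is worth graphing too.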


----------



## Arizonian

^^^^^^ Coming along nicely there, PCModderMike. ^^^^^^

With the Titan's release we'll see fewer new 690 owners. On top of that, the 690 runs really smoothly, so no issues from owners; all the bumps in the road have been ironed out very well over the last 9+ months.

Been running with no issues.


----------



## maximus56

Quote:


> Originally Posted by *PCModderMike*
> 
> Been kinda quiet in here....
> 
> 
> 
> 
> 
> 
> 
> 
> I've been busy getting my 690 under water.


Edit: Double Post


----------



## maximus56

Quote:


> Originally Posted by *PCModderMike*
> 
> Been kinda quiet in here....
> 
> 
> 
> 
> 
> 
> 
> 
> I've been busy getting my 690 under water.


Nice build








Ok, you are right, it has been kind of quiet.








I really like these cards, and don't want to sell them to the bottom feeders. Everyone assumes that 690s should be dropping in price since the Titan has been introduced; I beg to differ. For me these cards have been virtually trouble-free since the beginning. I have had to tweak my Titans quite a bit, but the 690s run on autopilot without any throttling issues.









So, as originally planned, I have decided to hold on to my 690s for the next build







..will post pictures once I am done. Not sure how long, but it will be done


----------



## maximus56

Quote:


> Originally Posted by *wermad*
> 
> Going to post some benchmarks once I sort out this mb. I'm stable at 4.8, but I know my CPU can do more. This mb is a bit confusing because the terminology is a bit foreign to me (I'm used to Asus).
> 
> I can tell you I get a solid (and pegged) 90fps in CoD WaW in Surround (3600x1920)
> 
> 
> 
> 
> 
> 
> 
> . Crysis 2 almost maxed, ~50-60fps with vram pegged at 2GB. In comparison, my 580s pushed 30-40fps, but interestingly enough vram was at 1.8GB (quad 580 3GB).
> 
> Just downloaded BF3 and Crysis 3, since I've had a few requests to run some numbers. Gonna purchase the full-featured Fraps to get some graphs going for you guys


Wermad, looking forward to your work on the graphs and surround data points








Hopefully you can accomplish what I could not: prove that quad 690s are well worth the investment








I never got around to doing anything like what you are planning, due to my work schedule. But these rock


----------



## troublez

Hey guys,

I have been a long time lurker and I thought it would be a good time to post.

I purchased an EVGA GTX 690 3 days ago from newegg! It is in the mail and I am excited to replace my cross-fired HD 5830s.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *troublez*
> 
> Hey guys,
> 
> I have been a long time lurker and I thought it would be a good time to post.
> 
> I purchased an EVGA GTX 690 3 days ago from newegg! It is in the mail and I am excited to replace my cross-fired HD 5830s.


You're in for a treat, your eyes won't believe it!









And welcome to OCN, enjoy that GTX 690!


----------



## PCModderMike

Quote:


> Originally Posted by *Arizonian*
> 
> ^^^^^^ Coming along nicely there, PCModderMike. ^^^^^^
> 
> With the Titan's release we'll see fewer new 690 owners. On top of that, the 690 runs really smoothly, so no issues from owners; all the bumps in the road have been ironed out very well over the last 9+ months.
> 
> Been running with no issues.


Thanks Arizonian. I am also very happy with my 690, and have had zero issues with it so far. The Titan was not available when I bought my 690, but even if it had been, I still would have chosen the 690.










Quote:


> Originally Posted by *maximus56*
> 
> Nice build
> 
> 
> 
> 
> 
> 
> 
> 
> Ok, you are right, it has been kind of quiet.
> 
> 
> 
> 
> 
> 
> 
> 
> I really like these cards, and don't want to sell them to the bottom feeders. *Everyone assumes that 690s should be dropping in price since the Titan has been introduced; I beg to differ. For me these cards have been virtually trouble-free since the beginning.* I have had to tweak my Titans quite a bit, but the 690s run on autopilot without any throttling issues.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So, as originally planned, I have decided to hold on to my 690s for the next build
> 
> 
> 
> 
> 
> 
> 
> ..will post pictures once I am done. Not sure how long, but it will be done


Thank you maximus56.








Just wanted to quote you because I 100% agree with that statement. Also looking forward to the pics; good luck with your build.


----------



## TeamBlue

Quote:


> Originally Posted by *PCModderMike*
> 
> Been kinda quiet in here....
> 
> 
> 
> 
> 
> 
> 
> 
> I've been busy getting my 690 under water.


You too huh? Waiting for my replacement alpha 350 lens, so you get a galaxy s3 pic for now.


----------



## Barth-Scout

Hi guys. I just got my new PSU for the GTX 690.
The PSU is a Corsair AX760i. In the box was a PCIe cable that splits into 2x 8-pin.
Can I use this for the GTX 690, or do I have to use 2 cables? The reason I ask is that it's a mini-ITX case.
Fewer cables = happier me.


----------



## Jurgennoppe

Hi, I just bought this system and would like to OC the 690. What should I do, and what is it capable of without risking much?

1 x ASUS Maximus V Formula
1 x GAINWARD GeForce GTX 690 (915 mhz, 4096 mb, DDR5, 6008 mhz, 512 bit, 11, fan)
2 x LG ELECTRONICS DVD RW (24x, sata, black) 39,00 €
1 x MICROSOFT Windows 7 Home Premium OEM (64, OEM, NL, SP1)
1 x TOSHIBA DT01ACA200 (2000 gb, sata/600, 7200 rpm, 64 mb, 3.5")
1 x INTEL Core i7 3770K (3.5 ghz, 8 mb, s1155, 77 watt, boxed)
1 x SCYTHE Mugen 3 PCGH (s775, sAM2, s1366, sAM3, s1156, s1155, sFM1)
1 x G.SKILL 16GTX (16384 mb, 1600 mhz, CL7, 1.5 v, non-ecc, unbuffered, kit of 2)
1 x SAMSUNG 840 PRO Series (256 gb, sata/600, 530 mbps, 520 mbps)
1 x CORSAIR HX Gold (1050 watt, atx, sli ready, modular)
1 x NZXT Phantom 410 (red, no psu, atx)
1 x XILENCE Aktieve hardeschijf koeler


----------



## wermad

Quote:


> Originally Posted by *Barth-Scout*
> 
> Hi guys. I just got my new PSU for the GTX 690.
> The PSU is a Corsair AX760i. In the box was a PCIe cable that splits into 2x 8-pin.
> Can I use this for the GTX 690, or do I have to use 2 cables? The reason I ask is that it's a mini-ITX case.
> Fewer cables = happier me.


I'm using a single cable for each of my cards. Should be fine.


----------



## Buzzkill

Quote:


> Originally Posted by *Barth-Scout*
> 
> Hi guys. I just got my new PSU for the GTX 690.
> The PSU is a Corsair AX760i. In the box was a PCIe cable that splits into 2x 8-pin.
> Can I use this for the GTX 690, or do I have to use 2 cables? The reason I ask is that it's a mini-ITX case.
> Fewer cables = happier me.


The 2x 4-pin is for EPS to the motherboard's 4- or 8-pin. The 6+2-pin is for GPUs.
*EPS*
*PCI-E*


----------



## propeldragon

Does anyone have a support/retention bracket for their GTX 690? I have been looking everywhere for one. I really want to buy one if someone is willing to sell theirs. I don't even know where to begin looking lol.

thanks


----------



## PCModderMike

Quote:


> Originally Posted by *TeamBlue*
> 
> You too huh? Waiting for my replacement alpha 350 lens, so you get a galaxy s3 pic for now.


Nice!








Quote:


> Originally Posted by *propeldragon*
> 
> Does anyone have a support/retention bracket for their GTX 690? I have been looking everywhere for one. I really want to buy one if someone is willing to sell theirs. I don't even know where to begin looking lol.
> 
> thanks


Looking for a backplate? Hard to find one lately; out of stock everywhere, it seems... but werm recently bought two. Directly from EVGA?


----------



## wermad

Quote:


> Originally Posted by *propeldragon*
> 
> Does anyone have a support/retention bracket for their GTX 690? I have been looking everywhere for one. I really want to buy one if someone is willing to sell theirs. I don't even know where to begin looking lol.
> 
> thanks


Are you looking for something to strengthen the back (ie backplate) or something to support/lift the card so it doesn't sag?


----------



## Buzzkill

Quote:


> Originally Posted by *PCModderMike*
> 
> Nice!
> 
> 
> 
> 
> 
> 
> 
> 
> Looking for a backplate? Hard to find one lately. Out of stock everywhere it seems....but werm recently bought two. Directly from EVGA?


eBay has one listing with 10+ available.
EVGA GTX 690 Backplate $29.99 Free Shipping
http://www.ebay.com/itm/EVGA-GTX-690-Backplate-M020-00-000243-/350745847267?pt=PCC_Video_TV_Cards&hash=item51aa14e5e3


----------



## PCModderMike

Quote:


> Originally Posted by *Buzzkill*
> 
> Ebay has one listing with 10+ available.
> EVGA GTX 690 Backplate $29.99
> http://www.ebay.com/itm/EVGA-GTX-690-Backplate-M020-00-000243-/350745847267?pt=PCC_Video_TV_Cards&hash=item51aa14e5e3


That's cool. You should quote @propeldragon with that....he's the one looking.


----------



## justanoldman

Quote:


> Originally Posted by *Jurgennoppe*
> 
> Hi, I just bought this system and would like to OC the 690. What should I do, and what is it capable of without risking much?
> 
> 1 x ASUS Maximus V Formula
> 1 x GAINWARD GeForce GTX 690 (915 mhz, 4096 mb, DDR5, 6008 mhz, 512 bit, 11, fan)
> 2 x LG ELECTRONICS DVD RW (24x, sata, black) 39,00 €
> 1 x MICROSOFT Windows 7 Home Premium OEM (64, OEM, NL, SP1)
> 1 x TOSHIBA DT01ACA200 (2000 gb, sata/600, 7200 rpm, 64 mb, 3.5")
> 1 x INTEL Core i7 3770K (3.5 ghz, 8 mb, s1155, 77 watt, boxed)
> 1 x SCYTHE Mugen 3 PCGH (s775, sAM2, s1366, sAM3, s1156, s1155, sFM1)
> 1 x G.SKILL 16GTX (16384 mb, 1600 mhz, CL7, 1.5 v, non-ecc, unbuffered, kit of 2)
> 1 x SAMSUNG 840 PRO Series (256 gb, sata/600, 530 mbps, 520 mbps)
> 1 x CORSAIR HX Gold (1050 watt, atx, sli ready, modular)
> 1 x NZXT Phantom 410 (red, no psu, atx)
> 1 x XILENCE active hard-drive cooler


Overclocking the 690 is not a big deal and can be done in a day. Overclocking your 3770K will take much longer, and temps will keep you from going too far with an Ivy Bridge chip on that cooler.

OC your chip, follow this guide, it has everything you need:
http://www.overclock.net/t/1291703/ivy-bridge-overclocking-guide-asus-motherboards

This is a good guide for GPU OC; it is written for the 670 but it still applies:
http://www.overclock.net/t/1265110/the-gtx-670-overclocking-master-guide


----------



## propeldragon

Quote:


> Originally Posted by *PCModderMike*
> 
> Nice!
> 
> 
> 
> 
> 
> 
> 
> 
> Looking for a backplate? Hard find one lately. Out of stock everywhere it seems....but werm recently bought two. Directly from EVGA?


No, not a backplate: a support/retention bracket. It goes on the front of the card, three screw holes.


----------



## PCModderMike

Quote:


> Originally Posted by *propeldragon*
> 
> No, not a backplate: a support/retention bracket. It goes on the front of the card, three screw holes.


Link to said bracket?


----------



## TeamBlue

Quote:


> Originally Posted by *propeldragon*
> 
> No, not a backplate: a support/retention bracket. It goes on the front of the card, three screw holes.


This one?



http://cdn.overclock.net/3/3b/500x1000px-LL-3b4b8ecc_NvidiaGTX690GPU14.12.12.jpeg

Or option 2... Probably not.


----------



## PCModderMike

Quote:


> Originally Posted by *TeamBlue*
> 
> This one?
> 
> 
> 
> http://cdn.overclock.net/3/3b/500x1000px-LL-3b4b8ecc_NvidiaGTX690GPU14.12.12.jpeg
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Or option 2... Probably not.


Ooooh, now that makes sense. That looks like something that would come in an OEM-built PC.


----------



## wermad

There's a support rod, but people say it's too flimsy. You can always make one.

Then there's the Cooler Master HAF-X VGA support bracket:


----------



## propeldragon

Quote:


> Originally Posted by *TeamBlue*
> 
> This one?
> 
> 
> 
> http://cdn.overclock.net/3/3b/500x1000px-LL-3b4b8ecc_NvidiaGTX690GPU14.12.12.jpeg
> 
> Or option 2... Probably not.


option 1 lol


----------



## propeldragon

I actually just bought one from a guy on eBay for $10 shipped. I was so lucky.


----------



## PCModderMike

Now that my 690 is under water, decided to push it a little.








My 2700K was left stock. Very pleased with how it responded.


----------



## maximus56

Quote:


> Originally Posted by *PCModderMike*
> 
> Now that my 690 is under water, decided to push it a little.
> 
> 
> 
> 
> 
> 
> 
> 
> My 2700K was left stock. Very pleased with how it responded.


Nicely done!
By the way, Charleston is a pretty neat spot to hang your hat. I love vacationing in Kiawah, and driving into Charleston for some fabulous food!


----------



## PCModderMike

Quote:


> Originally Posted by *maximus56*
> 
> Nicely done!
> By the way, Charleston is a pretty neat spot to hang your hat. I love vacationing in Kiawah, and driving into Charleston for some fabulous food!


Hey now, definitely sounds like you know what you're talking about there! I've only been down here about 3 years now, love it! Work and live right outside downtown. I used to work on Kiawah and Seabrook....but haven't been out there since I left that position. Great area though!


----------



## justanoldman

Quote:


> Originally Posted by *PCModderMike*
> 
> Now that my 690 is under water, decided to push it a little.
> 
> 
> 
> 
> 
> 
> 
> 
> My 2700K was left stock. Very pleased with how it responded.


That is a nice core clock, have you tried to get the mem offset up?
Did you gain any MHz going from air to water?


----------



## PCModderMike

Quote:


> Originally Posted by *justanoldman*
> 
> That is a nice core clock, have you tried to get the mem offset up?
> Did you gain any MHz going from air to water?


Thanks. Haven't touched the memory yet, played with the core only last night. Now that I found the max stable core clock, I can do the same for the memory. Honestly, I never overclocked the 690 while it was on air, so can't say what kind of gain I got.


----------



## mtbiker033

I am using the 690 with my P67 Sandy Bridge setup and a 2500K, which should only be PCIe 2.0, though my board does have PCIe 3.0 capability with the appropriate BIOS update and CPU. The weird thing is, both HWiNFO and GPU-Z show it running at PCIe 3.0:



Now, I am on an older BIOS, so I haven't even done the update for Ivy; I'm pretty sure this is just a glitch.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *mtbiker033*
> 
> I am using the 690 with my P67 Sandybridge set-up with 2500k which should only be pcie 2.0 though my board does have pcie 3.0 capability with the appropriate bios update and cpu. the weird thing is both HWinFO and GPUz show it running at pcie 3.0:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> now I am on an older bios so I haven't even done the update for Ivy, I''m pretty sure this is just a glitch


GPU-Z shows the PCIe 3.0 link of the PLX chip, so that part is running at 3.0, but your PCIe bus to the CPU is 2.0.
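For reference, the gap between the two generations comes down to transfer rate and encoding: PCIe 2.0 runs at 5 GT/s with 8b/10b encoding, PCIe 3.0 at 8 GT/s with 128b/130b. A quick back-of-envelope in Python (the figures are the spec's per-lane theoretical numbers, not measurements):

```python
# Usable per-lane, per-direction bandwidth for PCIe 2.0 vs 3.0.
def lane_bandwidth_gb_s(transfer_gts, encoded_bits, payload_bits):
    # GT/s * payload fraction / 8 bits-per-byte -> GB/s
    return transfer_gts * payload_bits / encoded_bits / 8

pcie2 = lane_bandwidth_gb_s(5.0, 10, 8)     # 8b/10b    -> 0.5 GB/s per lane
pcie3 = lane_bandwidth_gb_s(8.0, 130, 128)  # 128b/130b -> ~0.985 GB/s per lane

print(f"x16: {16 * pcie2:.2f} GB/s (2.0) vs {16 * pcie3:.2f} GB/s (3.0)")
```

So an x16 slot roughly doubles from ~8 GB/s to ~15.75 GB/s, which is why only multi-GPU setups that actually saturate the link tend to notice.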


----------



## mtbiker033

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> GPU-Z shows the PCIe 3.0 link of the PLX chip, so that part is running at 3.0, but your PCIe bus to the CPU is 2.0.


You are right; I looked at the sensor data and saw it's running at 2.0:


----------



## propeldragon

Guys, I have a question. The GTX 690 has basically 2 GB of VRAM. If I overclock the memory, would it use less VRAM? Like, if I was using 1.8 GB and then I overclock the memory to +200 or something, would I only use like 1.7 GB of VRAM? (Numbers are just for reference.)

Thanks


----------



## wermad

Quote:


> Originally Posted by *propeldragon*
> 
> Guys, I have a question. The GTX 690 has basically 2 GB of VRAM. If I overclock the memory, would it use less VRAM? Like, if I was using 1.8 GB and then I overclock the memory to +200 or something, would I only use like 1.7 GB of VRAM? (Numbers are just for reference.)
> 
> Thanks


VRAM is like a gas tank: it's there for whatever your system needs. Most likely the higher clocks made the application use less VRAM since it's working a bit faster. Well, that's my theory
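One way to put the gas-tank analogy in numbers: a memory overclock changes how fast data moves, not how much fits, so the 2 GB per GPU stays fixed regardless of the offset. A rough Python calculation, assuming the 690's per-GPU 256-bit bus and 6008 MT/s effective stock rate (the +400 offset is just an example figure, not a recommendation):

```python
# Memory bandwidth scales with the effective data rate; capacity does not.
def bandwidth_gb_s(effective_mts, bus_width_bits):
    # MT/s * bytes-per-transfer -> MB/s, then /1000 -> GB/s
    return effective_mts * bus_width_bits / 8 / 1000

stock = bandwidth_gb_s(6008, 256)       # ~192 GB/s per GPU at stock
oced = bandwidth_gb_s(6008 + 400, 256)  # hypothetical +400 MT/s effective

print(f"{stock:.1f} -> {oced:.1f} GB/s; VRAM capacity is still 2 GB per GPU")
```

So an overclock is better gas mileage in the sense of faster refills, not a bigger tank.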


----------



## propeldragon

Quote:


> Originally Posted by *wermad*
> 
> VRAM is like a gas tank: it's there for whatever your system needs. Most likely the higher clocks made the application use less VRAM since it's working a bit faster. Well, that's my theory


That's how I see it too, like getting better gas mileage lol


----------



## wermad

Quote:


> Originally Posted by *propeldragon*
> 
> That's how I see it too, like getting better gas mileage lol


----------



## PCModderMike

Took the 2700K up to 4.8...._almost_ breaking 16,500.


----------



## wermad

Opinions on PCIe 2.0 vs 3.0 on the 690? Worth switching my CPU?


----------



## Arizonian

Quote:


> Originally Posted by *wermad*
> 
> Opinions on PCIe 2.0 vs 3.0 on the 690? Worth switching my CPU?


Those with tri- or quad-GPU setups have said there are gains using PCIe 3.0, where it can utilize the bandwidth efficiently. Haven't seen the results if they are out there; just basing it off OCN 690 club members here who have commented deep in these posts.

As for single or SLI GPU setups, they see minimal gains, and IMO it's not worth it.



Source: PCI-E 2.0 vs. PCI-E 3.0 with the GTX 690, dated May 2, 2012.


----------



## MrTOOSHORT

PCI-E 3.0 is not really seeing any gains. Maybe 200 points in 3DMark11, not much. Keep the 2700K unless you're itching for an upgrade again, wermad.


----------



## TeamBlue

Well, I finally tried out some Crysis 3 last night, and I am thoroughly impressed! At 1440p I only had to knock down AA and maxed everything else; no stuttering! I have to admit, I was a little confused about TXAA vs MSAA vs FXAA vs SMAA as far as AA is concerned.


----------



## krug

MSI GeForce GTX 690, Proof!

*3DMark11*
P16022 _win8_ Proof!
P17113 _win7_ Proof!
X6941 _win8_ Proof!

*3DMark*
6273 Fire Strike Extreme _win8_ Proof!
11952 Fire Strike _win8_ Proof! Latest driver was not approved yet.


----------



## wermad

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> PCI-E 3.0 is not really seeing any gains. Maybe 200 points in 3DMark11, not much. Keep the 2700K unless you're itching for an upgrade again, wermad.


Quote:


> Originally Posted by *Arizonian*
> 
> Those with tri- or quad-GPU setups have said there are gains using PCIe 3.0, where it can utilize the bandwidth efficiently. Haven't seen the results if they are out there; just basing it off OCN 690 club members here who have commented deep in these posts.
> 
> As for single or SLI GPU setups, they see minimal gains, and IMO it's not worth it.
> 
> 
> 
> Source: PCI-E 2.0 vs. PCI-E 3.0 with the GTX 690, dated May 2, 2012.


Thanks bud








Quote:


> Originally Posted by *MrTOOSHORT*
> 
> PCI-E 3.0 is not really seeing any gains. Maybe 200 points in 3DMark11, not much. Keep the 2700K unless you're itching for an upgrade again, wermad.


Just curious, as IB is said to be a bit faster clock-for-clock vs SB and I wasn't too sure of the full benefits of PCIe 3.0. I'm sitting comfortably and easily @ 4.8 with the 2700K, but I'd really like to push it to 5.0-5.2. The mobo is great but there are better ones out there, though the Sniper's audio is very good. I do have a PCIe 1x sound card, but I'll lose lanes with Z77. Also thinking of going back to X79. I'll see what I can conjure. IB still runs hotter than SB, and that's something that makes it less appealing to switch.

Anyways, I got a few more opinions from IB owners, so I'm scratching the IB switch. X79 should give me lanes directly from the CPU vs the CPU>PLX route (though they say the PLX has little impact). I have a feeling something is holding back the 690s. I'll have to find some spare time to explore the limits of this setup. Fermi was pretty easy to max out, but Kepler is just too much of an enigma.


----------



## wholeeo

Quote:


> Originally Posted by *wermad*
> 
> Thanks bud
> 
> 
> 
> 
> 
> 
> 
> 
> Just curious, as IB is said to be a bit faster clock-for-clock vs SB and I wasn't too sure of the full benefits of PCIe 3.0. I'm sitting comfortably and easily @ 4.8 with the 2700K, but I'd really like to push it to 5.0-5.2. The mobo is great but there are better ones out there, though the Sniper's audio is very good. I do have a PCIe 1x sound card, but I'll lose lanes with Z77. Also thinking of going back to X79. I'll see what I can conjure. IB still runs hotter than SB, and that's something that makes it less appealing to switch.
> 
> Anyways, I got a few more opinions from IB owners, so I'm scratching the IB switch. X79 should give me lanes directly from the CPU vs the CPU>PLX route (though they say the PLX has little impact). I have a feeling something is holding back the 690s. I'll have to find some spare time to explore the limits of this setup. Fermi was pretty easy to max out, but Kepler is just too much of an enigma.


The problem with IB is that, overclocking like we do, we are just about forced into delidding. I was hitting 90°C on water with my chip before delidding. Not sure if they are still using the bird-poop paste in newer IBs, though.


----------



## propeldragon

Quote:


> Originally Posted by *Arizonian*
> 
> Those with tri- or quad-GPU setups have said there are gains using PCIe 3.0, where it can utilize the bandwidth efficiently. Haven't seen the results if they are out there; just basing it off OCN 690 club members here who have commented deep in these posts.
> 
> As for single or SLI GPU setups, they see minimal gains, and IMO it's not worth it.
> 
> 
> 
> Source: PCI-E 2.0 vs. PCI-E 3.0 with the GTX 690, dated May 2, 2012.


Are there any newer benchmarks for this? Could there have been updates for PCIe 3.0? I feel like there should be a bigger difference. Why would they make this change for only 1 fps?


----------



## TeamBlue

So, um, really awesome temps?! This is with a Phobya 200/Phobya 180 fan and an EK XT240 with a pair of Gelid Wing 12s going full out. Also in this loop there is a Dominator block and a 3570K at 4.2, pretty much stock volts. I might try running Prime in the background to get the loop as hot as it will go, but with the 690 at 99% for over 35 minutes, I think this is pretty accurate.


----------



## maximus56

Quote:


> Originally Posted by *wermad*
> 
> Opinions on pcie 2.0 vs 3.0 on the 690? Worth switching my CPU,?


I see definite gains in benchmarks with quad 690s when running 3.0 vs 2.0. Six cores absolutely help on the 3930K.
I have not compared any game benchmarks, primarily because I don't game as often as I would like to...lol..but it certainly won't hurt.
Just be very careful with KBoost when running 3.0. It messes up 3.0, or maybe it's the other way around. There is a very specific protocol to enabling and disabling KBoost: make sure the 3.0 patch is uninstalled when toggling KBoost on or off; you can reinstall it once you're done with KBoost. I only use KBoost for benchmarks anyway.
Of course, this only applies to X79 boards. If you get an Ivy board, you will have native 3.0 enabled; however, you won't have the six cores of SB-E, obviously.


----------



## King4x4

Which only makes a difference in some benchmarks and folding.

Sent from my GT-I9300 using Tapatalk 2


----------



## maximus56

Yeah..only the benchmarks that actually count. Lol


----------



## wermad

My days of benchmarking for "fun" and "friendly competition" are over. I only use a few benchmarks for stability testing; I'm more interested in gaming in Surround. I can get 4.7 stable on this mobo, but anything above that is a bit tricky, though I'm tweaking it.

I get mixed opinions on whether my CPU @ 4.8 is enough for the quads. Then again, I need to really push this system to find any limitations. So far, I've played a couple of mild games and Crysis 2; all cores were loading ~70-80%. I need to test another game and see if (and that's a big and questionable "if") there's a "hold-up" with my CPU.


----------



## maximus56

Quote:


> Originally Posted by *wermad*
> 
> My days of benchmarking for "fun" and "friendly competition" are over. I only use a few benchmarks for stability testing; I'm more interested in gaming in Surround. I can get 4.7 stable on this mobo, but anything above that is a bit tricky, though I'm tweaking it.
> 
> I get mixed opinions on whether my CPU @ 4.8 is enough for the quads. Then again, I need to really push this system to find any limitations. So far, I've played a couple of mild games and Crysis 2; all cores were loading ~70-80%. I need to test another game and see if (and that's a big and questionable "if") there's a "hold-up" with my CPU.


Based on my rather limited gaming experience (lol), I would say that quad 690s with the current CPUs and 3.0 enabled don't really demonstrate a material fps difference. The single greatest contributor to sub-par gaming in Surround is a high level of AA settings. But my guess is that you already knew that. I typically run my 690s overclocked with +129 on the core offset and +550 on the memory, and see great performance in gaming. 4.8 GHz is plenty, but I'm not sure how much difference a six-core vs a quad-core makes.


----------



## wermad

The CPU and mobo are the same, and I got 99% pegged usage on the old quad 580s I had a couple of months ago. Performance is better on the 690s, but the usage is a bit erratic, and that's very unusual. Could be the dreaded ghost of Christmas past, the "quad-SLI curse"









I'll have some time this weekend to further explore my setup.


----------



## King4x4

On surround?


----------



## wermad

Quote:


> Originally Posted by *King4x4*
> 
> On surround?


Yup. Once you go Surround, you can't go back









The few mild games I've played in Surround seem a bit odd at times, like they don't really like Kepler. Fermi was rock solid on those older (and ported) games. Even my older SLI GTX 590 setup was very good and solid in all my games (sans the VRAM limitation).

Once I get the CPU stable, I should run 3DMark11 to gauge the GPUs.


----------



## Arm3nian

So I have this http://www.monoprice.com/products/product.asp?c_id=104&cp_id=10428&cs_id=1042802&p_id=5311&seq=1&format=2
and have it connected to my 690's Mini DisplayPort, with an HDMI cable running from that to my AV receiver. I remember when I first did it I got audio, but later it stopped working, so I switched to optical from my motherboard; now I want to go back for Blu-ray. Windows shows that it is connected with NVIDIA HD Audio, but there's no sound. I've tried 10000000 restarts on the PC and receiver but nothing. Any suggestions?


----------



## Jurgennoppe

Hi everyone, I just got this new system; this is it.

1 x z77 oc formula
1 x GAINWARD GeForce GTX 690 (915 mhz, 4096 mb, DDR5, 6008 mhz, 512 bit, 11, fan)
2 x LG ELECTRONICS DVD RW (24x, sata, black) 39,00 €
1 x MICROSOFT Windows 7 Home Premium OEM (64, OEM, NL, SP1)
1 x TOSHIBA DT01ACA200 (2000 gb, sata/600, 7200 rpm, 64 mb, 3.5")
1 x INTEL Core i7 3770K (3.5 ghz, 8 mb, s1155, 77 watt, boxed)
1 x SCYTHE Mugen 3 PCGH (s775, sAM2, s1366, sAM3, s1156, s1155, sFM1)
1 x G.SKILL 16GTX (16384 mb, 1600 mhz, CL7, 1.5 v, non-ecc, unbuffered, kit of 2)
1 x SAMSUNG 840 PRO Series (256 gb, sata/600, 530 mbps, 520 mbps)
1 x CORSAIR HX Gold (1050 watt, atx, sli ready, modular)
1 x NZXT Phantom 410 (red, no psu, atx)
1 x XILENCE active hard-drive cooler

As you can see, one 690 is in the system. Now I have some questions.

I have OCed the system to a stable 4.2 GHz. Yet when I just installed Crysis 3, set every single thing maxed out, and the opening on the boat begins, I don't get more than 40-45 fps. Is that normal with this video card?

Also, GPU-Z says SLI is enabled, yet I only have one 690. Can someone please help me out here?

And in Afterburner the core clock and memory clock both show 0 MHz??


----------



## Tomalak

Quote:


> Originally Posted by *Jurgennoppe*
> 
> *Also, GPU-Z says SLI is enabled, yet I only have one 690. Can someone please help me out here?*


Others can answer the rest, but I can tell you that the SLI thing is perfectly all right: the GTX 690 is internal SLI, in the sense of having two GPUs on one board.


----------



## Jurgennoppe

OK, thanks. I'll run Unigine now and post the result too; hopefully someone else can help me out with the other questions.

I ran Unigine on Extreme, but it keeps looping at the end of the benchmark and I never get a score. Why is this?


----------



## wholeeo

You have to press F9 for the benchmark run to start.


----------



## Jurgennoppe

OMG, I know how to start it; the problem is that I don't get a result at the end.


----------



## justanoldman

Quote:


> Originally Posted by *Jurgennoppe*
> 
> OK, thanks. I'll run Unigine now and post the result too; hopefully someone else can help me out with the other questions.
> 
> I ran Unigine on Extreme, but it keeps looping at the end of the benchmark and I never get a score. Why is this?


Have not heard of that before. If it says "Benchmarking" on the screen, it should go through the scenes and count them in the lower-right corner, then pop up the score at the last scene. Is this Valley or something else?
Quote:


> Originally Posted by *wholeeo*
> 
> You have to press F9 for the benchmark run to start.


Speaking of Valley, Wholeeo, I am officially jealous of your 690, that thing is a very nice overclocker, and that is an impressive score you posted.


----------



## wholeeo

Quote:


> Originally Posted by *justanoldman*
> 
> Have not heard of that before. If it says "Benchmarking" on the screen, it should go through the scenes and count them in the lower-right corner, then pop up the score at the last scene. Is this Valley or something else?
> Speaking of Valley, Wholeeo, I am officially jealous of your 690, that thing is a very nice overclocker, and that is an impressive score you posted.


Thanks









Need to purchase some benchmarking RAM. I think doing the runs with 32 GB of regular 1600 RAM is slowing me down a tiny bit. Perhaps a stick of 2600 MHz would get me the one point I need to break 100.







Also need to flash a BIOS that lets me raise the power target to 150%.


----------



## Jurgennoppe

OK, so I downloaded the Unigine Heaven 4.0 benchmark and this is the result....

Is this normal? It seems very low.....

If you think it's on the low side, what could be the problem? Like I said, I have my CPU OCed to 4.5 GHz and haven't OCed my GPU.
I have 16 GB of CL7 RAM at 7-7-8-24.

I ran the newest version, 4.0, and this is my result..... What to think of this??

http://imageshack.us/f/716/naamlooshca.png/

Why does it say 2112 MB x1???

Is something wrong with my new PC??


----------



## wholeeo

Quote:


> Originally Posted by *Jurgennoppe*
> 
> OK, so I downloaded the Unigine Heaven 4.0 benchmark and this is the result....
> 
> Is this normal? It seems very low.....
> 
> If you think it's on the low side, what could be the problem? Like I said, I have my CPU OCed to 4.5 GHz and haven't OCed my GPU.
> I have 16 GB of CL7 RAM at 7-7-8-24.
> 
> I ran the newest version, 4.0, and this is my result..... What to think of this??
> 
> http://imageshack.us/f/716/naamlooshca.png/
> 
> Why does it say 2112 MB x1???
> 
> Is something wrong with my new PC??


Can't help you, as I have yet to run Heaven on my card. Try Valley, as that seems to be what people are moving towards for benchmarking nowadays.


----------



## Jurgennoppe

Thanks for your reply. I do hope that someone can help me out; my score seems really low for a €2400 PC....


----------



## Jurgennoppe

Is this score normal for my PC?

It's OCed at 4.5 GHz.

http://imageshack.us/photo/my-images/22/naamloosau.png/


----------



## justanoldman

Quote:


> Originally Posted by *Jurgennoppe*
> 
> is this score normal for my pc?
> 
> it's oced at 4.5ghz.
> 
> http://imageshack.us/photo/my-images/22/naamloosau.png/


There are a lot of things that go into a 3DMark score, so I can't tell you for sure, but the graph with your score puts it in the average range.

If you want to OC your 690 to the max and run Valley 1.0, I can tell you if your score is OK or not.

I would suggest putting your new rig in your posts, so it is easier for people to answer questions:
http://www.overclock.net/t/1258253/how-to-put-your-rig-in-your-sig


----------



## Jurgennoppe

Thank you, I want to OC it, I just don't know how. BTW, this is my rig, mate!

Any easy and safe way to do this for a noob?

1 x z77 oc formula
1 x GAINWARD GeForce GTX 690 (915 mhz, 4096 mb, DDR5, 6008 mhz, 512 bit, 11, fan)
2 x LG ELECTRONICS DVD RW (24x, sata, black) 39,00 €
1 x MICROSOFT Windows 7 Home Premium OEM (64, OEM, NL, SP1)
1 x TOSHIBA DT01ACA200 (2000 gb, sata/600, 7200 rpm, 64 mb, 3.5")
1 x INTEL Core i7 3770K (3.5 ghz, 8 mb, s1155, 77 watt, boxed)
1 x SCYTHE Mugen 3 PCGH (s775, sAM2, s1366, sAM3, s1156, s1155, sFM1)
1 x G.SKILL 16GTX (16384 mb, 1600 mhz, CL7, 1.5 v, non-ecc, unbuffered, kit of 2)
1 x SAMSUNG 840 PRO Series (256 gb, sata/600, 530 mbps, 520 mbps)
1 x CORSAIR HX Gold (1050 watt, atx, sli ready, modular)
1 x NZXT Phantom 410 (red, no psu, atx)
1 x XILENCE active hard-drive cooler


----------



## justanoldman

Quote:


> Originally Posted by *Jurgennoppe*
> 
> Thank you, I want to OC it, I just don't know how. BTW, this is my rig, mate!
> 
> Any easy and safe way to do this for a noob?


I realize that is your rig, I am saying you don't want to post all that every time, so the best thing to do is follow the link I posted so anyone can click on your rig in your posts.

Overclocking a chip is not easy or quick, but OCing a GPU is not that hard. This guide is written for the 670, but it applies to the whole 600 series. Take the time to read through it and you will be safe to OC your 690:
http://www.overclock.net/t/1265110/the-gtx-670-overclocking-master-guide


----------



## TeamBlue

Debating whether or not to flash my BIOS to unlock the power target to 150%. I can hit +165 on the core no problem currently, but 1 MHz over that and I get driver crashes. Do you guys have any positive experiences doing this? What kind of gains have you seen by unlocking the power target?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *TeamBlue*
> 
> Debating whether or not to flash my BIOS to unlock the power target to 150%. I can hit +165 on the core no problem currently, but 1 MHz over that and I get driver crashes. Do you guys have any positive experiences doing this? What kind of gains have you seen by unlocking the power target?


No gains to be had from modding the BIOS of the GTX 690.

I've tried many modded BIOSes on my GTX 690 and the overclock was always the same.


----------



## Lagpirate

Hey guys, I've been lurking this thread for a while until I got my 690. I've had it for about a week now and honestly I'm not that impressed. I've been getting WAY lower than average framerates. For example, in BF3 I am seeing an average framerate of about 45-55 frames a second; from what I've seen, a 690 should be able to push 80+ easy. Idk, does anyone have any idea of what's going on with my card? I have an AMD FX-8120 clocked at 4.0. Also, it may be worth noting that in BF3 I'm only getting about 45% GPU usage on both GPUs. I have SLI enabled, so I know it's not an issue there.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Lagpirate*
> 
> Hey guys, I've been lurking this thread for a while until I got my 690. I've had it for about a week now and honestly I'm not that impressed. I've been getting WAY lower than average framerates. For example, in BF3 I am seeing an average framerate of about 45-55 frames a second; from what I've seen, a 690 should be able to push 80+ easy. Idk, does anyone have any idea of what's going on with my card? I have an AMD FX-8120 clocked at 4.0. Also, it may be worth noting that in BF3 I'm only getting about 45% GPU usage on both GPUs. I have SLI enabled, so I know it's not an issue there.


You need to overclock that CPU more. Can you get it to 4.5 GHz? A GTX 690 requires lots of CPU power to avoid a bottleneck in certain games.
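The 45% GPU usage reported above is the telltale sign: as a rough heuristic, if GPU utilization stays well below 100% while framerates are low, the GPUs are sitting idle waiting on the CPU. A toy Python version of that check (the 80% threshold is an arbitrary rule of thumb, not an official figure):

```python
# Flag a likely CPU bottleneck from a series of GPU utilization samples (%).
def likely_cpu_bound(gpu_util_samples, threshold=80.0):
    avg = sum(gpu_util_samples) / len(gpu_util_samples)
    return avg < threshold

# Samples like the ~45% usage reported above:
print(likely_cpu_bound([45, 48, 44, 50]))  # True: the GPUs are starved for work
```

The converse also holds: utilization pegged near 99% with low fps points at the GPU itself, not the CPU.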


----------



## TeamBlue

Man, that sucks. I was hoping that being able to bump the power target up would squeeze a few extra MHz out of it.
Quote:


> Originally Posted by *MrTOOSHORT*
> 
> No gains to be had from modding the BIOS of the GTX 690.
> 
> I've tried many modded BIOSes on my GTX 690 and the overclock was always the same.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *TeamBlue*
> 
> man that sucks. I was hoping that being able to bump the power target up would squeeze a few extra mhz out of it


Yep, a GTX 690 is not a fun card to play with overclocking-wise. When I first got the card, I was hoping for the little voltage bump to 1.21 V to squeeze 30-40 MHz more out of it with a modded BIOS.

I have more fun with this GTX 480 because of the voltage tweaking.


----------



## Lagpirate

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> You need to overclock that CPU more. Can you get it to 4.5 GHz? A GTX 690 requires lots of CPU power to avoid a bottleneck in certain games.


I've overclocked from 3.1 up to 4.0 on air, and I've pretty much hit my thermal wall. I've heard of people hitting 4.5 on the 8120, but only under water. Do you really think that is the reason why I am seeing such poor performance?


----------



## wholeeo

Anyone in the 690 club want a free copy of Aliens vs Predator? Purchased it by mistake; I already owned it.

Anyway if interested just PM me.

Edit: Given away.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Lagpirate*
> 
> I've overclocked from 3.1 up to 4.0 on air, and I've pretty much hit my thermal wall. I've heard of people hitting 4.5 on the 8120 but only under water. Do you really think that is the reason why I am seeing such poor performance?


Yeah, your CPU is a problem at that overclock with a GTX 690, sorry to say. Even 4.5 GHz might not help too much.


----------



## Lagpirate

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Yeah, your cpu is a problem at that overclock with a GTX 690, sorry to say. Even 4.5GHz might not help too much.


Wow, that's disappointing. :/ 4.5 is a really high clock for any processor.. So I guess there's not really a processor out there that doesn't bottleneck the 690 in one way or another.. Would you suggest getting a new processor? Or would it even be worth it? Should I get an AIO water cooler maybe?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Lagpirate*
> 
> Wow that's disappointing. :/ 4.5 is a really high clock for any processor.. So I guess there's not really a processor out there that doesn't bottleneck the 690 in one way or another.. Would you suggest getting a new processor? Or would it even be worth it? Should I get a
> AIO water cooler maybe?


Well, I'd suggest an FX-8350 at the very least, or go Intel. A 3570K with a nice Z77 mobo will do the trick.

This thread might help you:

http://www.overclock.net/t/1351739/will-the-amd-fx-8350-bottleneck-any-gpus


----------



## Lagpirate

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Well I'd suggest an FX-8350 at the very least, or go Intel. A 3570k with a nice z77 mobo will do the trick.
> 
> This thread might help you:
> 
> http://www.overclock.net/t/1351739/will-the-amd-fx-8350-bottleneck-any-gpus


Thanks a lot man, you've helped me out quite a bit. I'm thinking that I will just sell my current mobo/chip and make the move to Intel. It's kinda hard not to after dropping 900 bucks on a card, you know? Might as well go all the way! +rep


----------



## Lagpirate

Btw, add me to the club!  It's the EVGA brand.


----------



## Arizonian

Quote:


> Originally Posted by *Lagpirate*
> 
> Btw add me to the club!  it's evga brand


Right on...congrats and welcome.


----------



## TeamBlue

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Yep, a GTX 690 is not a fun card to play with overclocking-wise. When I first got the card, I was hoping for the little voltage bump to 1.21 V to squeeze 30-40 MHz more out of it with a modded BIOS.
> 
> I have more fun with this GTX 480 because of the voltage tweaking.


I just picked one up from a guy on here for 140 bucks. I used to have 3 under water with my X58 machine; tweaking paradise lol. I'm more excited to play with this 670 FTW card; waiting on a new mobo and proc... So sad that everything is locked; my temps under water are 34°C max and I know the card has more in it.


----------



## Jurgennoppe

Hi,

this is my setup

1 x z77 oc formula
1 x GAINWARD GeForce GTX 690 (915 mhz, 4096 mb, DDR5, 6008 mhz, 512 bit, 11, fan)
2 x LG ELECTRONICS DVD RW (24x, sata, black) 39,00 €
1 x MICROSOFT Windows 7 Home Premium OEM (64, OEM, NL, SP1)
1 x TOSHIBA DT01ACA200 (2000 gb, sata/600, 7200 rpm, 64 mb, 3.5")
1 x INTEL Core i7 3770K (3.5 ghz, 8 mb, s1155, 77 watt, boxed)
1 x SCYTHE Mugen 3 PCGH (s775, sAM2, s1366, sAM3, s1156, s1155, sFM1)
1 x G.SKILL 16GTX (16384 mb, 1600 mhz, CL7, 1.5 v, non-ecc, unbuffered, kit of 2)
1 x SAMSUNG 840 PRO Series (256 gb, sata/600, 530 mbps, 520 mbps)
1 x CORSAIR HX Gold (1050 watt, atx, sli ready, modular)
1 x NZXT Phantom 410 (red, no psu, atx)
1 x XILENCE Aktieve hardeschijf koeler

I OCed to 4.5 GHz.

While playing Crysis 3 for like 15 minutes, I had one crash at a checkpoint.

I checked Afterburner and it said max GPU usage was 114% and 108%. Is this normal? I haven't OCed the GPU, so why is it using more than 100%?

http://imageshack.us/photo/my-images/856/naamloosrv.png/

Also, the temps on the 3770K went up to 82°C max. Is that safe?


----------



## danta

Hi, just wanting some help identifying some chips on the standard reference GTX 690. They're labeled Q502 and Q503 on the back edge of the board. I need to replace them, and the PCB repair center I took them to needs to know exactly what they are before they'll do any work.


----------



## justanoldman

Quote:


> Originally Posted by *Jurgennoppe*
> 
> Hi,
> i oced to 4.5 ghz.
> 
> While playing crysis 3 for like 15 minutes, i had one crash at a checkpoint.
> 
> I checked afterburner and it said max gpu usage was 114% and 108%, is this normal? ihavent oced the gpu, so.... why is it using more than 100%
> 
> also, the temps on 3770k went up to 82C max, si that safe?.....


If you got those CPU temps from just playing a game, then I would say they are definitely too high. How did you test your 4.5 OC, and what vCore are you using? If you're at 82°C while gaming, Prime95 or IBT temps would likely be over 100°C, which is not good. I don't use Afterburner, but I don't see where it hit those highs on the graph; it could have been a spike when it crashed.


----------



## Jurgennoppe

Quote:


> Originally Posted by *justanoldman*
> 
> If you got those CPU temps from just playing a game, then I would say they are definitely too high. How did you test your 4.5 oc, what vCore are you using? Prime95 or IBT temps should have been over 100c which is not good, if you have 82c while gaming. I don't use afterburner, but I don't see where it hit those highs on the graph, could have been a spike when it crashed.


Prime was at 85°C max after ten minutes of testing.

My vCore is 1.260; 1.250 would barely boot and crashed 10 seconds into Prime.

This is so weird; I get higher temps on my CPU from playing Crysis 3 than from 100% load in Prime.


----------



## justanoldman

Quote:


> Originally Posted by *Jurgennoppe*
> 
> prime was at 85 max after ten mins of testing...
> 
> my vcore is 1.260 1.250 would barely boot and crashed 10 secs into prime....
> 
> this is so weird i get higher temps on my cpu from playing crysis 3 then 100% load on prime....


Not that I am an expert or anything, but my gaming temps are usually 20°C below Prime95 temps. I have never heard of anyone having a 3°C difference; that does not make sense to me.

Unless you can go several hours of Prime95 with zero problems and no WHEA warnings in the Event Viewer, your OC is not really stable. Some say 6 hours is enough, others use 24 hours of Prime95; that is up to the person to decide.

Until your CPU OC is reasonably stable you can't know what is crashing the computer. Try running Prime95 with Sum Inputs and Round Off error checking on, using a custom torture test, and change the memory to use to 90% of available per the Task Manager Performance tab. What are your temps after an hour, and can you go 12 hours without failing or WHEA warnings?


----------






## wermad

Those of you running SLI, can you post your CPU and clocks? I'm having difficulty getting anything stable above 4.7 on my current setup. I need more info on what a good clock for quads is; there are just mixed opinions (as usual) on what's a suitable clock and CPU for quads. Thanks


----------



## AllGamer

So how many of you are considering switching to a Titan?

According to the TechPowerUp review, it seems like 2x Titan is the way to go; I couldn't believe the quad SLI was performing so much worse than 1 or 2 of those things.

I'm still pondering whether I should switch, or wait for a GTX Titan90 (if they ever make one).

It's a good time to get rid of the GTX 690 now while they are still worth something.


----------



## Arizonian

Quote:


> Originally Posted by *AllGamer*
> 
> So how many of you are considering switching to a Titan ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> according to the TechPowerUp review, seems like 2x Titan is the way to go, i couldn't believe the Quad SLI was performing so much worse than 1 or 2 of those Things.
> 
> I'm still pondering if I should switch, or wait for GTX Titan90 (if they ever make one)


I'm sure those of us on multiple monitors would want to switch to a Titan for the extra VRAM.

However, those of us on single monitors would lose performance switching to a single Titan.

The only way I would move to a Titan is if I were to buy two Titans in SLI, and I'm not about to spend $2000 on GPUs. I spent my money this year on some audiophile gear and a new monitor already. End of year I'm looking at either two new singles, aka 780/8970, or maybe another dual GPU like a GTX 790.

Edit: Benchers are eating the Titans up right now.


----------



## propeldragon

I bought an EVGA backplate for the GTX 690 and it didn't come with screws. Do I reuse the original screws?

thanks


----------



## Arizonian

Quote:


> Originally Posted by *propeldragon*
> 
> i bought a evga backplate for gtx 690 and it didnt come with screws. do i reuse the screws?
> 
> thanks


Just reuse your own screws. I use the diagram in the directions to place each screw I remove from the GTX 690 onto the spot on the diagram where I removed it from. That way, when I put the screws back, I put the exact screw in the exact same hole. OCD.


----------



## wermad

I'm going to chug along with the 690s; I'll sort out the bugs eventually. Titans under water would have been ~$800 more than what I spent on my 690s with water. So far no major complaints in Surround, and it's a better experience in terms of performance compared to my old quad 580s.


----------



## King4x4

I have a 690 and it didn't cut it for surround, so I switched to tri 680 4GBs (3 x $480). With waterblocks (3 x $150) it costs $800 less than SLI Titans with waterblocks, I don't have any issues with VRAM even at 7680x1440, and getting nearly 95% of Titan scaling is just too nice.


----------



## malmental

I posted something in the Titan thread; allow me to show the GTX 690 thread some love too...

Tweaker Turns GeForce GTX 690 Into a Quadro K5000



Quote:


> When you buy the GeForce GTX 690, the most expensive dual-GPU graphics card NVIDIA has on offer, you expect the best performance and every feature enabled. Well, this is actually true; however, it does not support the same technology as its professional counterpart, the Quadro K5000.
> 
> If the GeForce could be modified into a Quadro, you would save $1000. A member of the EEV Forums did just that, because his GeForce GTX 690 would support NVIDIA Surround (multi-monitor technology) in Windows, but it would not support Mosaic in Linux, which is exclusive to the Quadro series. What is worth noting is that the NVIDIA Quadro K5000 is a single-GPU card, clocked lower than the 690.
> 
> In order to understand how this modification works, we have to bring up a few facts. In the past it was actually very easy to modify GeForce cards into Quadros (and the opposite) by changing the hardware straps on the board. These straps control the PCI Device ID, which tells the system what card is connected to the slot. So if you knew the codes for each card, you could manipulate them to emulate a specific model. However, NVIDIA changed the way the ID is controlled; changing it on the GTX 690 requires some modification to the board itself. By changing the analogue values of certain components we can manipulate the ID. The modder gave us a whole list of values corresponding to specific codes:
> 
> 5K = 8
> 10K = 9
> 15K = A
> 20K = B
> 25K = C
> 30K = D
> 35K = E
> 40K = F
> 
> The NVIDIA GeForce GTX 690 has an ID of 0x1188, while the Quadro K5000 has 0x11BA and the Tesla K10 0x118F. Long story short, it comes down to enabling the 20K and 15K resistors to change a 690 into a K5000, whereas it requires the 5K and 40K resistors to transform it into a Tesla K10. And that's it. He posted a few images for verification. The story does not say, however, whether the card is now operating with only one GPU or two. It certainly does trick NVIDIA's X Server, which now reads the code as K5000/K10. Other users suggest the same method can be applied to the GeForce GTX 660 Ti/670 to transform them into a 680, though it would technically just change the name in the NVIDIA Control Panel (not verified). Of course this method will work for the GeForce GTX Titan to Tesla K20 mod as well.



Quote:


> NVIDIA did not comment on this story, instead they deleted the thread regarding this tweak on their official forums (made by the same user). Keep an eye on the thread below.


SOURCE

Cheers..
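Purely as an illustration of the resistor table quoted above, here is a quick sketch of how the two strap nibbles combine into the PCI Device IDs mentioned. The `device_id` helper and its argument order are my own invention for the example, not the modder's actual tooling:

```python
# Resistor value (in kilohms) -> ID nibble, per the table quoted above.
NIBBLE = {5: 0x8, 10: 0x9, 15: 0xA, 20: 0xB, 25: 0xC, 30: 0xD, 35: 0xE, 40: 0xF}

def device_id(r_high_kohm, r_low_kohm):
    """Combine two strap nibbles into the low byte of a 0x11xx PCI Device ID."""
    return 0x1100 | (NIBBLE[r_high_kohm] << 4) | NIBBLE[r_low_kohm]

print(hex(device_id(20, 15)))  # 20K + 15K -> 0x11ba (Quadro K5000)
print(hex(device_id(5, 40)))   # 5K + 40K  -> 0x118f (Tesla K10)
```

This at least confirms the quoted numbers are internally consistent: 20K/15K map to B/A (0x11BA, K5000) and 5K/40K map to 8/F (0x118F, K10).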


----------



## wholeeo

Quote:


> Originally Posted by *wermad*
> 
> Im going to chug along w/ 690s. I'll sort out the bugs eventually. Titans with water would have been ~$800 more then what I spent on my 690s w/ water. So far no major complaints in Surround. Its a better experience in terms of performance compared to my old quad 580s


This is why I keep my 3rd monitor away from my primary two. It keeps me from surround gaming, which would eventually turn into another watercooled 690 or SLI Titans; I'd also then invest in smaller-bezel screens, LightBoost-capable, the whole nine.


----------



## AllGamer

Quote:


> Originally Posted by *Arizonian*
> 
> I'm sure for those of us that are on multiple monitors would want to switch to a Titan for the extra VRAM.
> 
> However those of us on single monitors would lose performance to switch to a signal Titan.
> 
> Only way I would move to a Titan is if I would buy two Titans SLI. I'm not about to spend $2000 on GPU's. Spent my money this year on some audiophile gear and new monitor already. End of year looking at either two new singles aka 780/8970 or maybe another dual GPU like a GTX 790.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Benchers are eating the Titans up right now.


Quote:


> Originally Posted by *King4x4*
> 
> Have a 690 and it didn't cut it for surround so I switched to Tri-680 4GBs (3x$480) Waterblocks (3x150$) costs $800 less then SLI Titans with Waterblocks and I don't have any issues with Vram even at 7680x1440 while getting nearly %95 of titan scaling is just too nice.


That is exactly my problem: I play mostly in Surround, and even 3D Surround for those few games that support it.

And when those games run, the FPS is very close indeed to the benches shown in the TechPowerUp review: http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_SLI/

I really like my dual GTX 690s, but the performance is kind of mediocre considering I paid $2000 to get the top-of-the-line gear last December.

And 2 Titans in SLI seem to be easily eating the 690s for breakfast most of the time.

After checking out that review, I understood better how pointless it was to have quad or tri SLI with the GTX 690, and that for the same money I can get better bang for the buck with just 2 Titans.


----------



## Qu1ckset

Quote:


> Originally Posted by *AllGamer*
> 
> That is exactly my problem, i do play mostly in surround view, and even 3D surround view for those few games that supports it.
> 
> and when those games runs, the FPS is very close indeed to the benches shown in TechPowerUp rewview http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_SLI/
> 
> I really like my dual GTX 690, but the performance is kind of mediocre considering i paid $2000 to get the top of the line gear last December.
> 
> and 2 Titan on SLI seems to be easily eating the 690 for breakfast most of the time.
> 
> after checking out that review, i understood better how pointless was to have a Quad or Tri SLI with the GTX 690
> 
> and for the same money i can get a better bang for the buck with just 2 Titan


I was seriously thinking about SLI Titans, but for the money it would cost me, I think I'm going to wait for Nvidia's 800-series Maxwell cards. Damn, it's looking farther and farther away, seeing how they haven't even launched the 700 series...


----------



## wermad

Pulled a rookie mistake and forgot to plug in the auxiliary power SATA connector on the mobo. Somehow I left it out when redoing my midplate, or I may have tossed out the SATA cable I had for it. Anyway, I read JimmyD's thread on his lack of performance with two 690s, and he found out he was missing the auxiliary power cable too. I did have mine plugged in with the 580s, but I just forgot to plug it back in, or I must have left it out when I was messing around with the 670 and 7970. Anyway: usage is now pegged at 99% on all four cores and VRAM usage has dropped to 1.8-1.9 GB (from pegged). Also, I did get a nice boost with an OC to 1200 using PrecisionX, even without the auxiliary power on there. Going to test if the OC will boost up my frames. Running Very High with FXAA in Crysis 3 and I'm getting a smooth 60-70 fps with vsync off.


----------



## propeldragon

I have the fan speed of my 690 at 75% and it's getting to 90°C. Is there something wrong with my 690?


----------



## AllGamer

Quote:


> Originally Posted by *wermad*
> 
> Pulled a rookie mistake and forgot to plug in the auxiliary power SATA connector on the mobo. Somehow I left it out when redoing my midplate, or I may have tossed out the SATA cable I had for it. Anyway, I read JimmyD's thread on his lack of performance with two 690s, and he found out he was missing the auxiliary power cable too. Usage is now pegged at 99% on all four cores and VRAM usage has dropped to 1.8-1.9 GB (from pegged).


Oh, my AUX 5-pin power cable is connected alright; you can even see it in this photo on my profile.


----------



## AllGamer

Quote:


> Originally Posted by *propeldragon*
> 
> i have my fan speed of my 690 at 75% and its getting to 90c. is there something wrong with my 690?


Are you OCed or running default settings?

That seems rather hot. What is your case ventilation setup? A photo of the inside of your rig might help us help you.


----------



## propeldragon

It's at default settings. I have a Titan VGA cooler with dual fans, a PCI fan in front of the GPU, and 2 fans to the side of it. I've got a lot of fans blowing on this.


----------



## wermad

Quote:


> Originally Posted by *AllGamer*
> 
> oh my AUX 5pin power cable is connected alright, you can even see it from this photo on my profile


I totally lost sight of this; I was in a rush to put my build together (excuses, excuses). JimmyD was in contact with me about his issue and I really didn't help him much, since I was struggling a bit myself. I found his thread and that turned on the light bulb in my head about my setup: sure enough, no auxiliary SATA power cable for the GPU(s). For two 690s it's a must on most, if not all, high-end boards that can run quad SLI. I was almost ready to call it quits on the 690s, but I knew something was wrong; I was even convinced it was my motherboard with a "medium" OC (turns out the CPU OC is perfectly fine).

So in the end, I was helped by a member who had actually come looking for my assistance. It happens to the best of us. I'm quite happy I could conquer Crysis 3.

Still dreaming of going 5x1, so I'll wait for AMD's next GPU series.


----------



## nawaf010

Hi, I recently upgraded from Gigabyte's GTX 680 to EVGA's 690, and I'm having multiple problems, like not seeing a single frame of improvement over the 680; it's also having severe lag when running in SLI.

My specs are:

Intel i7-3770 CPU, 3.4 GHz

16 GB RAM


----------



## KuroiKami

I wonder one thing: how does the boost clock work? Mine is just constant at the stock speed of 914 MHz, and sometimes falls down to 714 MHz.

Do I need to play a game for it to kick in? I tried FurMark to stress test it; at full GPU load it gets to around 67°C. No OC yet, but I could do that later.

I use the 3570K at stock/boost speed for now. Should I OC to 4000+ MHz, or does stock/boost not bottleneck?

Otherwise, I have to say the card looks very good.


----------



## wermad

Quote:


> Originally Posted by *nawaf010*
> 
> hi i recently upgraded from gigabytes gtx 680 to evga's dual 690 im really having multiple problems like seeing not a single frame improve over the 680 its also having severe lag when running on SLI
> 
> my specs are
> 
> intel i7-3770 Cpu 3.4 gh
> 
> 16 gb Ram


Sounds like a CPU bottleneck. You now have four GPU cores to contend with (vs. one on the 680), and I see you have a non-"K" CPU. For quads, the sweet spot seems to be 4.5-5.0 GHz (depending on your CPU architecture). Get a K CPU; it makes life easier. PCIe 3.0 doesn't do much for you, so even a used Sandy Bridge will work.
Quote:


> Originally Posted by *KuroiKami*
> 
> i wonder one thing. how is the boost clock work? mine is just constant using the stock speed 914 mhz. and sometimes fall down 714 mhz.
> 
> Do i need to play game for it to kick in ? i tried furmark to stress test it. full gpu load get around 67c no OC yet but could do it later.
> 
> I use 3570k stock and boost speed now. should i OC to like 4000+mhz or stock/boost doesn't bottleneck.
> 
> Or else i have to say the card look very good


http://www.hardocp.com/article/2012/05/03/nvidia_geforce_gtx_690_dual_gpu_video_card_review/7#.UVKtx1dvRFf

I used this guide
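That guide covers the details, but as a rough mental model (my own simplification, not NVIDIA's actual algorithm): the card raises its clock above the base whenever measured board power sits below the power target, and sits at base (or drops to idle clocks, like the 714 MHz you're seeing) when there's no headroom or no load:

```python
# Toy model of GPU Boost: clock scales up with power headroom.
# The scaling rule and numbers are illustrative only, not NVIDIA's real logic.
def boost_clock(base_mhz, max_boost_mhz, power_pct, power_target_pct=100):
    """Return a clock between base and max boost, based on power headroom."""
    if power_pct >= power_target_pct:
        return base_mhz  # no headroom: sit at the base clock
    headroom = (power_target_pct - power_pct) / power_target_pct
    return min(max_boost_mhz, base_mhz * (1 + headroom))

print(boost_clock(915, 1019, 100))  # at the power target: base clock
print(boost_clock(915, 1019, 60))   # lots of headroom: capped at max boost
```

This is also why raising the power target in PrecisionX/Afterburner tends to let the card boost higher: it widens the headroom term.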


----------



## wholeeo

Question for you guys: do a majority of people keep their cards overclocked 24/7? Not that it's needed with this card yet, but I feel like it's a huge waste having it under water and not keeping it OC'd.

Had the same issue with my 580s; though they were stable at 970 core, I always kept them stock under water.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *wholeeo*
> 
> Question for you guys. Do a majority of people keep their cards overclocked 24/7? Not that its needed with this card yet but I feel like its a huge waste for me having it under water and not keeping it OC'd,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Had the same issue with my 580s, though they were stable at 970 core I always kept them stock under water,


Voltage under load is the same on the GTX 690 whether you overclock or not. Overclocking the frequency at stock voltage will not use much more power than stock frequency and voltage; I believe upping the voltage is the real power hog, and that can't be done on the GTX 690.

I left my GTX 690 overclocked 24/7.
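The voltage-vs-frequency point can be put in numbers with the usual CMOS dynamic-power approximation, P ≈ C·V²·f. The 915 MHz base clock is the card's spec; the ~1.175 V stock voltage and the example OC figures are my own assumptions for illustration:

```python
# Dynamic power scales roughly with frequency * voltage^2 (capacitance fixed).
# 915 MHz is the 690's base clock; 1.175 V stock voltage is an assumed figure.
def relative_power(freq_mhz, volts, base_freq=915.0, base_volts=1.175):
    """Power draw relative to stock, using P proportional to f * V^2."""
    return (freq_mhz / base_freq) * (volts / base_volts) ** 2

# ~10% core OC at stock voltage: roughly 10% more dynamic power.
print(round(relative_power(1006, 1.175), 2))
# The same OC plus a bump to 1.21 V: the V^2 term compounds on top.
print(round(relative_power(1006, 1.21), 2))
```

So a frequency-only OC grows power roughly linearly with the clock, while voltage increases compound quadratically, which matches the "voltage is the real power hog" point.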


----------



## wholeeo

Thanks for that info.

I've been thinking of using the BIOS editor to lock in my max overclocks so that I don't have to run Precision/AB anymore.


----------



## Lukas026

hey again

So after some time MSI accepted my RMA and sent my money back, and now I am just unpacking a whole new ASUS GTX 690. So I guess I am in the club again.

Also installing my new Noctua fans in my new Define R4 system and assembling it. Will post pics for proof of ownership later.

Now I have one question: what are the best drivers atm for playing Tomb Raider and Crysis 3? I saw 314.22 WHQL got some pretty nice improvements, but also a lot of people saying it disables the boost clock or something. Is that true? Also, any experience with overclocking on these new drivers? Help much appreciated...


----------



## wermad

Crysis 3 seems to crash frequently for me (in Surround); every other game is fine. I'm running stock clocks on my GPU(s), but this morning I played for an hour and it crashed three times. I'm not getting a Windows driver crash error, so I know it's not the GPUs. I'm running 314.xx WHQL; I have 310.xx WHQL that I'll try later to see if that helps.


----------



## Cylas

My 24/7 setting with my Gainward GTX 690 is +175 on the GPU and +480 on the memory clock (under water). Temps: 19°C idle and 31°C at 100% load, with a delta-T of 0-1°C.


----------



## PCModderMike

Quote:


> Originally Posted by *wholeeo*
> 
> Question for you guys. Do a majority of people keep their cards overclocked 24/7? Not that its needed with this card yet but I feel like its a huge waste for me having it under water and not keeping it OC'd,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Had the same issue with my 580s, though they were stable at 970 core I always kept them stock under water,


Since I have mine blocked, I figure what the heck and leave mine OCed 24/7.


----------



## anuraj1

Quote:


> Originally Posted by *AllGamer*
> 
> oh my AUX 5pin power cable is connected alright, you can even see it from this photo on my profile


I just bought a second GTX 690 and it will be here tomorrow. Is the 5-pin AUX you are talking about something separate from the four 8-pin AUX power connectors for the two cards?


----------



## wermad

Quote:


> Originally Posted by *anuraj1*
> 
> I just bought a second GTX 690 and it will be here tomorrow. Is the 5pin AUX you are talking about something separate from the 4 8pin AUX power connectors for the two cards??


Some (or most) motherboards that can run high-end GPU setups require the user to plug in what's referred to as the auxiliary GPU power cable. Some motherboards have an extra connection for either SATA, Molex, or 6-pin PCIe power; this supplies extra power to the motherboard to handle multiple GPUs.

Which motherboard do you have? I can find what you need (if anything) to connect for quad SLI. I had to plug in my motherboard's SATA GPU auxiliary to make it work properly.


----------



## Lukas026

just a small update

I am finished and am trying to find my max core clock by looping Unigine Valley 1.0. So far it looks like +145 MHz is my limit; it's been stable for 2 hours now, max temps 78/72°C at 95% fan. I think I will move on to overclocking the memory now.

Btw, any new stress tests for the 690 around here? I have seen someone mention that looping Canyon Flight from 3DMark06 is great. Any input on that?

Thanks


----------



## anuraj1

Quote:


> Originally Posted by *wermad*
> 
> Some (or most) motherboards that can run high end gpu setups will require the user to plug in whats referred to as the auxiliary gpu power cable. Some mothers have an extra connection for either sata, molex, or 6-pin pcie. This supplies extra power to the motherboard to handle multiple gpu(s).
> 
> Which motherboard do you have and i can find what you need (if any) to connect for quad sli. I had to plug in my motherboard's sata gpu auxiliary to make it work properly


Thank you so much for the help. I have an ASUS P8Z77-V Pro w/Thunderbolt:

http://www.newegg.com/Product/Product.aspx?Item=N82E16813131853


----------



## Rage19420

OK, so I'm torn between the Titan and a 690. From all I've seen, the 690 seems a bit better in my main game of choice (BF3) and scales better when adding another for some quad goodness. I'm running a new 120Hz monitor and really trying to reach a consistent 120fps.

Thoughts?


----------



## mtbiker033

Quote:


> Originally Posted by *Rage19420*
> 
> Ok so im torn between the Titan and a 690. From all i seen is that the 690 seems a bit better in my main game of choice (BF3) and scales better when adding another for some quad goodness. Im running a new 120hz monitor and really trying to reach a consistent 120fps.
> 
> Thoughts?


I play a lot of BF3 on the 690 and it handles it great. I'm only on a 60Hz monitor for now, but seeing the fps I get, you should be fine at 120Hz. I don't even overclock it while playing BF; I just set the power target to 135% and use a custom fan curve. Enjoy!

I play at 1080p
all settings ultra
no motion blur
hbao on
2xmsaa
SweetFX 1.4


----------



## wermad

Quote:


> Originally Posted by *anuraj1*
> 
> Thank you so much for the help. I have an ASUS P8Z77-V Pro w/Thunderbolt:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16813131853


Unfortunately, your mobo only does x8/x8, which is going to hold back your quads. I know it's PCIe 3.0, but the system will see it at 2.0 speeds. You need a Z77 board with a PLX chip, like the Sniper3, UP7, Maximus V Extreme, or ASRock Extreme 9/11, or an X79 platform with at least two x16/x16 slots. Well, I'm not too versed in how to enable true 3.0 and whether the 690s will take advantage of it, since 3.0 x8 ~ 2.0 x16. Either way, it will still be a good idea to switch, since you'll definitely need to feed more power into the motherboard and yours doesn't have an auxiliary connector.
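The "3.0 x8 ~ 2.0 x16" rule of thumb follows from the per-lane transfer rates and line encodings defined in the PCIe specs; the comparison itself is just arithmetic:

```python
# Usable PCIe link bandwidth in MB/s: transfer rate minus encoding overhead.
# PCIe 2.0 runs 5 GT/s with 8b/10b encoding; PCIe 3.0 runs 8 GT/s with 128b/130b.
def link_bandwidth_mb_s(gt_per_s, payload_bits, total_bits, lanes):
    per_lane = gt_per_s * 1000 * payload_bits / total_bits / 8  # MB/s per lane
    return per_lane * lanes

print(link_bandwidth_mb_s(5.0, 8, 10, 16))           # 2.0 x16: 8000 MB/s
print(round(link_bandwidth_mb_s(8.0, 128, 130, 8)))  # 3.0 x8: ~7877 MB/s
```

So a 3.0 x8 slot carries almost exactly the same usable bandwidth as a 2.0 x16 slot, which is why the 8x/8x board only matters if the cards actually negotiate the 3.0 rate.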


----------



## anuraj1

Quote:


> Originally Posted by *wermad*
> 
> Unfortunately, your mb only does 8x/8x which is going to hold back your quads. I know its pcie 3.0 but the system will see it at 2.0. You need a z77 mb w/ a plx chip like the Sniper3, UP7, Maximus V Extreme, or Asrock extreme 9/11. Or x79 platform with at least two 16x/16x. Well, I'm not too verse on how to enable true 3.0 and if the 690s will take advantage of it since 3.0 8x ~ 2.0 16x. Either way, it will still be a good idea to switch since you'll definitely need to feed more power into the mb and yours don't have an auxiliary one.


I haven't even opened the motherboard box, so I can definitely return it and get a new one. I like ASUS products so I am leaning towards the Maximus V Extreme, but which one would you recommend?

Thank you again for your insight!

EDIT: The Sniper3 seems pretty awesome, is this the right one: http://www.amazon.com/Gigabyte-CrossFireX-NVIDIA-Motherboard-G1-SNIPER/dp/B007R21KAI/ref=sr_1_4?ie=UTF8&qid=1364591664&sr=8-4&keywords=Sniper3

EDIT2: I guess I'd be willing to move to x79 if z77 can't run both cards at x16 3.0. If you could point me in the right direction for which would be best for the 690 quad sli, that would be great


----------



## wermad

Quote:


> Originally Posted by *anuraj1*
> 
> I haven't even opened the motherboard box, so I can definitely return it and get a new one. I like ASUS products so I am leaning towards the Maximus V Extreme, but which one would you recommend?
> 
> Thank you again for your insight!
> 
> EDIT: The Sniper3 seems pretty awesome, is this the right one: http://www.amazon.com/Gigabyte-CrossFireX-NVIDIA-Motherboard-G1-SNIPER/dp/B007R21KAI/ref=sr_1_4?ie=UTF8&qid=1364591664&sr=8-4&keywords=Sniper3


The retail price of the Sniper is ~$289; since there aren't any more in stock, some sellers are overpricing them. You can find used ones on eBay going for ~$200. It's got great onboard audio, which helps a lot since quads won't let you add a PCIe sound card. The Maximus (or the UP7) is the better overclocker, so if that is your priority I would go with that. The Sniper overclocks OK, but I've had better success with Asus boards.


----------



## anuraj1

Quote:


> Originally Posted by *wermad*
> 
> The retail price of the sniper is ~$289. Since there aren't anymore, some sellers are over pricing them. You can find used ones on ebay going for ~$200. Its got great onboard audio which helps a lot since quads will not let you add a pcie sound card. The Maximus is the better overclocker (or the UP7), so if that is your priority, i would go w/ that. The sniper overclocks ok but I've had better success w/ Asus boards.


So I was looking at the Maximus V and this worried me in the details:

PCI Express 3.0 x16: 5 (x16 or dual x8 or x8/x16/x8 or x8/x16/x8/x8)

Doesn't this mean that having dual cards would only be x8 and x8?


----------



## wermad

Quote:


> Originally Posted by *anuraj1*
> 
> So I was looking at the Maximus V and this worried me in the details:
> 
> PCI Express 3.0 x16: 5 (x16 or dual x8 or x8/x16/x8 or x8/x16/x8/x8)
> 
> Doesn't this mean that having dual cards would only be x8 and x8?


Interesting, I thought it did offer x16/x16; apparently not. Go with the UP7, I hear lots of great things about that one. If you want a good audio system, get a Sniper3.

The ASRock Z77 Extreme 9 will also do x16/x16.

edit: Here's some UP7 pr0n with a waterblock:

double edit: if you want to stick with Asus, the WS Z77 does offer x16/x16 (blue slots):


----------



## anuraj1

Quote:


> Originally Posted by *wermad*
> 
> Interesting. I thought it did offer 16x/16x. apparently not
> 
> 
> 
> 
> 
> 
> 
> . Go w/ the UP7, I hear lots of great things on that one. If you want a good audio system, get a sniper3.
> 
> The Asrock z77 9 will also do 16/x16x
> 
> edit: Here's some up7 pr0n with waterblock (
> 
> 
> 
> 
> 
> 
> 
> ):


Alright, UP7 it is! I really appreciate you taking the time to help me. I might have to come back to ask some questions... Once I've got it set up I'll take some pics so I can be a part of the club!


----------



## wermad

Quote:


> Originally Posted by *anuraj1*
> 
> Alright, UP7 it is! I really appreciate you taking the time to help me. I might have to come back to ask some questions... Once I've got it set up I'll take some pics so I can be a part of the club!


Awesome

I did add the Asus WS Z77 in case you want to stick with Asus, and it's slightly cheaper than the UP7:

http://www.newegg.com/Product/Product.aspx?Item=N82E16813131822


----------



## malmental

Vote for the ASUS Z77 WS...

As an owner of the ASUS P8P67 WS Rev (NF200) B3, all I can say is it's the best board I have ever owned (night rider unit..). I've had chances to move up to the Z77 board, but honestly I can't find a reason to do it; the VRM system is the best on the market IMO.


----------



## wermad

The P8P67 WS is awesome; I owned two of them. It didn't OC as nicely as the Maximus IV Extreme, but it had the best slot spacing and could support 4-way CrossFire. You can find used ones for ~$100. Still, PLX is a far cry better than the good ol' NF200.

For the best overclocking Z77, the UP7 is one of the top picks. Still, the Sniper3 is great if you can find one; there's a used one on eBay going for ~$175 in an auction (a day left).


----------



## anuraj1

Quote:


> Originally Posted by *wermad*
> 
> P8P67 WS is awesome. i owned two of them. It didnt oc as nicely as the Maximus IV Extreme but it did have the best slot spacing and could support 4-way crossfire. You can find used ones for ~$100. Sill, PLX is a far better cry then the good ol' NF200.
> 
> For the best overclocking z77, the UP7 is one of them. Still, sniper3 is great if you can find one. There's a used on ebay going for ~$175 in auction (a day left).


Oops, sorry, I already ordered the UP7 before I left work. Just got home and saw these replies. Now I just hope that the UP7 fits in the Storm Stryker case that just arrived haha.


----------



## wermad

Quote:


> Originally Posted by *anuraj1*
> 
> Oops, sorry, I already ordered the UP7 before I left work. Just got home and saw these replies. Now I just hope that the UP7 fits in the Storm Stryker case that just arrived haha.


The UP7 is still a top-notch choice. It should overlap the grommets a bit in your Stryker, but it should fit.


----------



## anuraj1

Quote:


> Originally Posted by *wermad*
> 
> The UP7 is still a top-notch choice. It should overlap the grommets a bit in your Stryker, but it should fit.


Awesome! FedEx's delivery estimate of today just changed to "N/A," so it looks like I'll have to wait for that as well as the new motherboard before I can get started. I can't wait to get this working!


----------



## kingchris

Well, it's in the dual now. Just waiting till after Easter to order some SLI joiners; I need to connect some fittings. Then I can fire her up again and try it out on a PCIe 3.0 motherboard.


----------



## wermad

Looks awesome Chris!


----------



## Lukas026

Nice screens, Chris!









On my note, I need some help from you guys: what program would you recommend for overclocking the memory on a GTX 690, and how can I be sure it's error-free? I mean, I can bench Unigine Valley 1.0 for hours with +600 on the memory, but then in Tomb Raider or Crysis 3 it starts artifacting. I dropped it down to +500 MHz and it seems okay, but I want to find out my max.









I tried EVGA OC Scanner and OCCT (both 4.4.0 and 3.1.0), but they don't utilize GPUs in SLI very well. Any ideas?

Also, I am now stable at +135 / +500 in all games, and I must say I love this card.









Thank you for the advice.


----------



## MrTOOSHORT

You've found your max right there playing those games. Stick to +500, that's way more than enough of a memory overclock.


----------



## Lukas026

Yeah, I know it's enough, but my point is that when something "looks" messed up in-game, I'm not entirely sure if it's my memory overclock or a glitch in the game itself. Any error-checking program out there for the 690?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Lukas026*
> 
> Yeah, I know it's enough, but my point is that when something "looks" messed up in-game, I'm not entirely sure if it's my memory overclock or a glitch in the game itself. *Any error-checking program* out there for the 690?


No, there isn't. Just visual checks in games or benchies. So if you're having glitches in games and you think it's the memory overclock, then lower your overclock to troubleshoot. I'm just pointing out that +500 is high for a memory overclock, and if that's stable, you should be happy.

There are people in this thread who can only manage +150-300 on the memory on their GTX 690.
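There's no dedicated artifact scanner for the 690, but you can at least log what both GPUs are doing while you loop a game or benchmark. A minimal sketch (assumes `nvidia-smi` ships with your driver; the helper names are just for illustration):

```python
import subprocess

# Real nvidia-smi query field names; everything else here is illustrative.
QUERY = "temperature.gpu,clocks.gr,clocks.mem"

def parse_smi_line(line):
    """Parse one CSV line like '79, 1189 MHz, 3506 MHz' into numbers."""
    temp, core, mem = [field.strip() for field in line.split(",")]
    return {
        "temp_c": int(temp),
        "core_mhz": int(core.split()[0]),
        "mem_mhz": int(mem.split()[0]),
    }

def sample_gpus():
    """One reading per GPU -- the 690 shows up as two entries."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=" + QUERY, "--format=csv,noheader"],
        text=True,
    )
    return [parse_smi_line(line) for line in out.splitlines() if line.strip()]
```

Calling `sample_gpus()` in a loop while the game runs won't catch artifacts, but it tells you what clocks and temps the card was actually holding when they appeared.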


----------



## Lukas026

Okay, fine. One last question:









What software would you recommend for overclocking in general? I am using EVGA Precision X, but maybe there is a better one: Afterburner or GPU Tweak? My card's brand is ASUS, but I guess it doesn't matter at all.

Latest versions I found: MSI Afterburner 2.3.1 (3.0 beta 7), ASUS GPU Tweak 2.3.6.0, and EVGA Precision X 4.0. Ideas?


----------



## MrTOOSHORT

Precision is fine. I use Afterburner myself.


----------



## Arizonian

Quote:


> Originally Posted by *Lukas026*
> 
> Okay, fine. One last question:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What software would you recommend for overclocking in general? I am using EVGA Precision X, but maybe there is a better one: Afterburner or GPU Tweak? My card's brand is ASUS, but I guess it doesn't matter at all.
> 
> Latest versions I found: MSI Afterburner 2.3.1 (3.0 beta 7), ASUS GPU Tweak 2.3.6.0, and EVGA Precision X 4.0. Ideas?


MSI Afterburner or EVGA Precision are great. I personally use Precision, mostly because my 690 is an EVGA card.

I stay away from betas, which can turn out to be a hassle you don't need. Wait till the bugs are ironed out, unless you have some major issue and the beta solves it.

As for your memory overclock: if it's too high you will artifact, as you saw first hand. Not all games or benches benefit from memory speed, and I don't see the gains in FPS as much as I do from an overclocked core. The 690 is so beastly in most games out now that you'll dominate in FPS on your single 1440p monitor for a good while.

Enjoy your new GTX 690.


----------



## DPM78

Hi guys,

I have been in the green world for the last 2 weeks for the first time in my life.
So I own an ASUS GTX 690 dedicated to sim racing (rFactor, rFactor 2, iRacing, Assetto Corsa...) in Surround (3x LG Full HD 32" TVs). And I am really amazed by the smoothness of this card.

I have searched for this information, but couldn't find it:
How can I disable 1 GPU in Surround mode (5760x1080)? (from the Nvidia Control Panel or NV Inspector)

Actually, a lot of the games I am playing don't scale well in SLI; that's why I would like to use only 1 of the 2 GPUs.

Coming from an HD 6970 CrossFire setup, it was really easy to deactivate CrossFire and stay in Eyefinity.
Really hope one of you has the answer to my question.

Thanks,


----------



## nicholasewood

EVGA GTX 690 Signature Series

Add me to the club!


----------



## Lagpirate

Just ordered my EVGA backplate from Amazon. CAN'T WAIT!


----------



## Lukas026

So after many hours of playing, I can say I am stable at +125 core and +500 on memory.

All games run great on this card, I must say. So smooth. I am only gaming at 1200p, so VRAM is not a problem. Tomb Raider maxed out (incl. 4x SSAA), Crysis 3, Far Cry 3, and The Witcher 2 (ubersampling on). Damn, they look great.







love it!

Temps are also quite good with a profiled fan curve (maximum was 85 / 79 C in Far Cry 3, but there is also a lot of CPU heat from my i5 3570K @ 4.6).

Will finally post some pics and bench scores next week.

Just sharing info for you guys.

Thanks for the help in previous posts, btw.


----------



## iARDAs

Can anyone share the stock 690 graphics score in 3DMark 11?

In the Performance preset?

Thanks in advance.


----------



## Arniebomba

Quote:


> Originally Posted by *iARDAs*
> 
> can anyone share the stock 690 Graphics score on 3DMark11?
> 
> In the performance settings?
> 
> Thanks in advance.


About 16K with a 3770K

Regarding some very low 3DMark scores, I came across this thread.
I've read a lot of posts and now I'm stuck with some questions.

I've read that the 690 can only use 2GB of VRAM, making a triple-monitor setup at very high settings almost impossible, even though the 690 has 4GB of VRAM. Does it only use 2GB of it? If so, why?
I am running a 690 SLI setup. Does it only use the VRAM of one card?
Came across this article which says 2GB shouldn't be an issue for 5760x1080: http://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/

Besides that, does anybody else have the same problem with 3DMark (11) and quad SLI?


----------



## wermad

Quote:


> Originally Posted by *Arniebomba*
> 
> I've read that the 690 can only use 2GB of VRAM, making a *triple-monitor setup* at very high settings almost impossible, even though the 690 has 4GB of VRAM. Does it only use 2GB of it? If so, why?
> I am running a 690 SLI setup. Does it only use the VRAM of one card?
> Came across this article which says 2GB shouldn't be an issue for 5760x1080: http://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
> 
> Besides that, does anybody else have the same problem with 3D Mark (11) and quad sli?


I have three screens and the 2GB is not an issue.


----------



## King4x4

Quote:


> Originally Posted by *Arniebomba*
> 
> About 16K with a 3770K
> 
> Regarding some very low 3DMark scores, I came across this thread.
> I've read a lot of posts and now I'm stuck with some questions.
> 
> I've read that the 690 can only use 2GB of VRAM, making a triple-monitor setup at very high settings almost impossible, even though the 690 has 4GB of VRAM. Does it only use 2GB of it? If so, why?
> I am running a 690 SLI setup. Does it only use the VRAM of one card?
> Came across this article which says 2GB shouldn't be an issue for 5760x1080: http://www.pugetsystems.com/labs/articles/Video-Card-Performance-2GB-vs-4GB-Memory-154/
> 
> Besides that, does anybody else have the same problem with 3DMark (11) and quad SLI?


That holds true only for 7680x1440... you are perfectly fine at 5760x1080.

SLI/CrossFire doesn't add up RAM; it mirrors it. So if you have 4 GPUs and three of them have 4GB and the fourth has 2GB, the whole system will only use 2GB, since every card needs to hold the same data in its own RAM.
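The mirroring rule boils down to taking the minimum, not the sum; a toy illustration (nothing here is card-specific beyond the example numbers):

```python
def effective_vram_mb(per_gpu_mb):
    """In AFR SLI/CrossFire every GPU keeps its own full copy of the
    frame data, so the usable pool is the smallest framebuffer."""
    return min(per_gpu_mb)

# A GTX 690: 2x 2048 MB is "4GB" on the box, but 2048 MB per frame.
print(effective_vram_mb([2048, 2048]))  # 2048
# The mixed example above: three 4GB GPUs plus one 2GB GPU.
print(effective_vram_mb([4096, 4096, 4096, 2048]))  # 2048
```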


----------



## propeldragon

2013-04-02 01.33.19.jpg 1344k .jpg file


2013-04-02 01.33.48.jpg 1369k .jpg file


PROOF


----------



## propeldragon

I was just overclocking my 690 and testing it in Crysis 3, and the max core clock I got before it crashed was +135 MHz, which in-game is 1189. Should I use something else to test my overclock? I figured Crysis 3 is the most demanding game out there at the moment. I was hoping to get into the 12xx MHz clocks. What's the average overclock for memory?


----------



## wermad

Quote:


> Originally Posted by *propeldragon*
> 
> I was just overclocking my 690 and testing it in Crysis 3, and the max core clock I got before it crashed was +135 MHz, which in-game is 1189. Should I use something else to test my overclock? I figured Crysis 3 is the most demanding game out there at the moment. I was hoping to get into the 12xx MHz clocks. What's the average overclock for memory?


Try a GPU benchmark like 3DMark 11, Unigine Heaven, or Valley. If it crashes there too, it's unstable. I used the HardOCP guide and only added +100 (brings it to 1600).

http://www.hardocp.com/article/2012/05/03/nvidia_geforce_gtx_690_dual_gpu_video_card_review/7

Crysis 3 runs like crap sometimes, and running in Surround with quads makes it worse. I have yet to have any other game crash (stock clocks for now), but Crysis 3 crashed quite a bit. Even Crysis 2 runs smooth as butter in Surround. I would wait for more mature drivers and possibly a patch from Crytek/EA.


----------



## propeldragon

Crysis 3 is not optimized at all. I can't even run it maxed with 8x MSAA. Total BS right there.


----------



## wermad

Quote:


> Originally Posted by *propeldragon*
> 
> Crysis 3 is not optimized at all. I can't even run it maxed with 8x MSAA. Total BS right there.


I actually ran it on high with FXAA and w/out vsync, since it was so bad in the first couple of levels (the opening at sea and the dam level). I do have Max Payne 3, which I notice is another intensive game used for benchmarking in reviews. I'll fire that one up once I finish Crysis 3. I have to admit I'm a bit disappointed with the story of Crysis 3. It just adds more confusion for the console guys (more butchering, like the first game's story, to get them up to speed).

Well, I have yet to try other games since I corrected a mistake. I did play a bit of Metro. Favorite part is the Dungeon (buy a crapload of ammo for that one). Without DoF, it soars to 100fps. I have it w/ vsync on to keep it smoother @ 50-60fps.


----------



## Arniebomba

Quote:


> Originally Posted by *King4x4*
> 
> That holds true only for 7680x1440... you are perfectly fine at 5760x1080.
> 
> SLI/CrossFire doesn't add up RAM; it mirrors it. So if you have 4 GPUs and three of them have 4GB and the fourth has 2GB, the whole system will only use 2GB, since every card needs to hold the same data in its own RAM.


Why does Nvidia quote 4GB of VRAM when only 2GB is usable? Or am I getting the picture wrong?


----------



## wermad

Quote:


> Originally Posted by *Arniebomba*
> 
> Why does Nvidia quote 4GB of VRAM when only 2GB is usable? Or am I getting the picture wrong?


It's a marketing gimmick. Technically, they are correct: they are selling you a product with a total VRAM count of 4GB. But the effective, or usable, RAM depends entirely on how much VRAM each GPU core has assigned. In the 690's case, it's only 2GB of usable VRAM.


----------



## Arizonian

Quote:


> Originally Posted by *Arniebomba*
> 
> Why does Nvidia quote 4GB of VRAM when only 2GB is usable? Or am I getting the picture wrong?


Yup, both Nvidia and AMD use this tactic on dual GPUs, because technically it is the total VRAM on the one PCB. However, as we all know, the GPUs alternate, handling one frame at a time, so only 2GB of VRAM handles each frame at any given point.

Kinda like the ingredients label on food. It will tell you xxx calories 'per serving' in bold, but if you read the back closer, that container has more than one serving. Sounds better.

Legally, it's the truth. Technically, it's a marketing tool.


----------



## propeldragon

Just did +160 MHz (which is 1215 MHz in-game) on the core clock for BF3 and it was stable. This 690 is doing better than my old two 680s; the highest I got with them in BF3 was 1189 MHz.


----------



## Arizonian

Quote:


> Originally Posted by *propeldragon*
> 
> Just did +160 MHz (which is 1215 MHz in-game) on the core clock for BF3 and it was stable. This 690 is doing better than my old two 680s; the highest I got with them in BF3 was 1189 MHz.


That's a sweet overclock. Congrats on such a good card.









*"How to put your Rig in your Sig"*


----------



## propeldragon

And I didn't even try going higher, LOL. I need to try memory, but what's the point of overclocking memory? Like, if I was using up my 2GB, would I need to overclock my memory?


----------



## DPM78

Quote:


> Originally Posted by *DPM78*
> 
> Hi guys,
> 
> I have been in the green world for the last 2 weeks for the first time in my life.
> So I own an ASUS GTX 690 dedicated to sim racing (rFactor, rFactor 2, iRacing, Assetto Corsa...) in Surround (3x LG Full HD 32" TVs). And I am really amazed by the smoothness of this card.
> 
> I have searched for this information, but couldn't find it:
> How can I disable 1 GPU in Surround mode (5760x1080)? (from the Nvidia Control Panel or NV Inspector)
> 
> Actually, a lot of the games I am playing don't scale well in SLI; that's why I would like to use only 1 of the 2 GPUs.
> 
> Coming from an HD 6970 CrossFire setup, it was really easy to deactivate CrossFire and stay in Eyefinity.
> Really hope one of you has the answer to my question.
> 
> Thanks,


Quote:


> Originally Posted by *wermad*
> 
> I have three screens and the 2gb is not an issue


Sorry to disturb, but does anyone have an answer for me?
PS: sorry for my English


----------



## justanoldman

How prevalent is coil whine in the 690 cards? Is it somewhat common, or rare?


----------



## Arizonian

Quote:


> Originally Posted by *justanoldman*
> 
> How prevalent is coil whine in the 690 cards? Is it somewhat common, or rare?


No coil whine issues with the 690; it's very uncommon. I don't recall any members complaining about coil whine in this club thread.


----------



## justanoldman

Quote:


> Originally Posted by *Arizonian*
> 
> No coil whine issues with the 690; it's very uncommon. I don't recall any members complaining about coil whine in this club thread.


Just put it under water, now I have it I guess. Just exchange it, or is there something to be done?


----------



## Arizonian

Quote:


> Originally Posted by *justanoldman*
> 
> Just put it under water, now I have it I guess. Just exchange it, or is there something to be done?


Sometimes it works itself out under some heavy use. However, there's no confirmed way to remove coil whine from a noisy inductor. It doesn't hurt performance, but once you hear it, it seems that's all you hear afterward.

Up to you if you want to replace it. Sorry to hear it. After all these pages, I can't recall anyone else with coil whine on a 690.


----------



## TeamBlue

Quote:


> Originally Posted by *justanoldman*
> 
> Just put it under water, now I have it I guess. Just exchange it, or is there something to be done?


I would get an exchange, I had it bad on my 7970s, my 690 will whine ever so slightly in some menus.


----------



## Arniebomba

Quote:


> Originally Posted by *wermad*
> 
> It's a marketing gimmick. Technically, they are correct: they are selling you a product with a total VRAM count of 4GB. But the effective, or usable, RAM depends entirely on how much VRAM each GPU core has assigned. In the 690's case, it's only 2GB of usable VRAM.


Quote:


> Originally Posted by *Arizonian*
> 
> Yup, both Nvidia and AMD use this tactic on dual GPUs, because technically it is the total VRAM on the one PCB. However, as we all know, the GPUs alternate, handling one frame at a time, so only 2GB of VRAM handles each frame at any given point.
> 
> Kinda like the ingredients label on food. It will tell you xxx calories 'per serving' in bold, but if you read the back closer, that container has more than one serving. Sounds better.
> 
> Legally, it's the truth. Technically, it's a marketing tool.


Thanks for the info guys! Appreciated


----------



## justanoldman

For any of you 690 owners with the card under water and a really quiet rig: when you do a benchmark run of Valley 1.0 and then exit, it switches to that credits screen for a few seconds. Do you hear any change in noise at all from your card?

I ask because on air you would not hear it over the fan, and I am trying to figure out whether a tiny bit of noise at that credits screen is considered normal, or dead silent is normal.

Thanks a lot, sorry for being a newb with some of this stuff.


----------



## propeldragon

I don't have any coil whine; my 680s and 690 didn't. I have a 660 in my HTPC, and that has coil whine when I get the fan going faster. I don't know if that's even coil whine or just how the fans are on cheaper GPUs.


----------



## wermad

Quote:


> Originally Posted by *DPM78*
> 
> Sorry to disturb, but does anyone have an answer for me?
> PS: sorry for my English


Hmmm, in Surround... never heard of that, since setting up Surround requires SLI'ing all the GPUs in the system. Check the game; some have an "enable multiple GPU support" option (i.e. SLI), and you would want to disable/deactivate this.

It might work w/ the "activate all displays" option. I'll give that a try when I get a chance. Make sure you have all three monitors plugged into the A core (though you'll need a DisplayPort to DVI/HDMI active adapter).
Quote:


> Originally Posted by *justanoldman*
> 
> For any of you 690 owner's with it under water and a really quiet rig. When you do a benchmark run of Valley 1.0, then when you exit it switches to that credits screen for a few seconds, do you hear any change of noise at all from your card?
> 
> I ask because on air you would not hear it over the fan, and I am trying to figure out if a tiny bit of noise at that credit screen is considered normal, or dead silent is normal.
> 
> Thanks a lot, sorry for being a newb with some of this stuff.


Sounds like coil whine. A lot of GPUs have it; I've noticed it's more prevalent w/ AMD cards. On air you rarely hear it, since the fans are blasting away, but once on water you'll notice it if any of your cards have it. I was lucky my two 690s don't have it. If it's bothersome, you can try to RMA it with the manufacturer.


----------



## wholeeo

Has anyone tried Tomb Raider @ 1440p? If so, what settings are you playing on? I had to drop a few things going from 1080p to 1440p to keep a consistent 60fps; RAM doesn't seem like the issue, as it was @ 1500 MB during the time I was playing. At 1080p I played with everything maxed except for AA, which was 2x SSAA. I'd hate to not have enough horsepower to push this resolution; I think I'd rather just go back to 1080p for now if that's the case. On the other hand, for the little bit I played Bioshock Infinite with everything on Ultra, I rarely got under 60, so it may just be a TR issue.


----------



## Arizonian

Quote:


> Originally Posted by *wholeeo*
> 
> Has anyone tried Tomb Raider @ 1440p? If so what settings are you playing on? Had to drop a few things going from 1080p to 1440p to keep a consistent 60fps, ram doesn't seem like the issue as it was @ 1500 mb during the time I was playing. On 1080p I played with everything maxed except for AA which was 2xSSAA. Would hate to not have enough horse power to push this resolution, think I'd rather just go back to 1080p for now if that's the case. On the other hand for the little bit I played Bioshock Infinite with everything on Ultra I rarely got under 60 so it may just be a TR issue.


I don't play Tomb Raider yet. I did just switch from 1080p 120 Hz monitor to a 1440p monitor.

You do not need to have AA on with a 1440p monitor to achieve an even more brilliant gaming experience than 1080p with AA on. You will barely notice AA is off in most games at 1440p. Try it.

I play Crysis 3 on 'high' settings with AA off and maintain 60-90 FPS. I can't play 'very high' settings, since I get between 40-60 FPS. Crysis 3 @ 1440p doesn't need AA on and looks just as vibrant.

I'm finding it might be easier to push a 1440p monitor at 60 FPS than to keep 100+ FPS on a 120 Hz monitor. Love both play styles; each has its pros and cons.


----------



## justanoldman

Thanks for all the help so far guys, EVGA is sending me a replacement 690.

One more question: what GPU temps do you guys get under water for your 690? Do the two temps differ by much? I have a low-end simple loop, and my OC'd temps are about 47C and 50C on the GPUs. Any specific TIM you've found works better on the dies?


----------



## wermad

Quote:


> Originally Posted by *justanoldman*
> 
> Thanks for all the help so far guys, EVGA is sending me a replacement 690.
> 
> One more question, what GPU temps do you guys get under water for your 690? Do the two temps differ by much? I have a low end simple loop and my oced temps are about 47c and 50c on the gpus. Any specific TIM you found works better on the dies?


Great news on the RMA!









I have a large (and very overkill) loop. With my fans at low speed and running Crysis 3 in Surround, they hover in the mid 40s; ambient is ~25C. I'm running stock, so your OC'd temps seem normal.

Most good TIMs out there do a great job. I'm still using whatever is left of my Ceramique. Once it's done, I'll switch to something else.


----------



## anuraj1

The build isn't quite done yet, but please add me to the club!

(Cable management is not my forte...)

EDIT: Both are EVGA


----------



## Whatupdoe1337

Hello all, I'm away from my rig at the moment so I am unable to post pics of my 690, but I have a quick question. I play on a 120 Hz 1080p monitor and was wondering if I would see any actual benefit from overclocking my card at 1080p. If the OC is rather mild, say +120 core and +200 memory, would I see any performance increase? Or would the OC need to be higher?


----------



## justanoldman

Quote:


> Originally Posted by *Whatupdoe1337*
> 
> Hello all, away from rig at the moment so I am unable to post pics of my 690 as of now, but I have a quick question. I play on a 120hz 1080p monitor and was wondering if i would see any actual benefits from overclocking my card at 1080p? If the oc is rather mild, say +120 core and +200 memory, would I see any performance increases? Or would the OC need to be higher?


+120 is not a mild oc, I would say that is more like an average one. Not many cards take over +150 on the core. You will be able to go higher on the memory though. From my tests you can get a 10 to 15% fps increase with an average oc. It is up to you whether that is worth the extra heat and trouble, I would say test it in the games you play at the settings you use.
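To put those offsets in perspective, here's a back-of-the-envelope sketch (the 1019 MHz rated boost clock is the only card-specific number assumed; actual boost varies per sample and per game):

```python
# The GTX 690's rated boost clock; real boost varies per card.
RATED_BOOST_MHZ = 1019

def oc_percent(offset_mhz, boost_mhz=RATED_BOOST_MHZ):
    """Core offset expressed as a percentage of the rated boost clock."""
    return 100 * offset_mhz / boost_mhz

# +120 on the core is roughly a 12% clock bump, which lines up with the
# 10-15% FPS gains mentioned above when a game is GPU-bound.
print(round(oc_percent(120), 1))  # 11.8
print(round(oc_percent(160), 1))  # 15.7
```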


----------



## samoth777

Hi.
Is the 690 hassle-free? I know stuttering is a big issue with CrossFire setups, but how about SLI on one card? I play at 1080p.


----------



## Arizonian

Quote:


> Originally Posted by *samoth777*
> 
> Hi.
> Is the 690 hassle-free? I know stuttering is a big issue with CrossFire setups, but how about SLI on one card? I play at 1080p.


On my single 120 Hz or 1440p monitor it is great. This is my first dual GPU. I was expecting all the headaches I had heard come along with dual GPUs, but this series was done very well: everything from frame metering, temps, and noise to the extra-thick PCB and the design.

The drivers that work with 680 SLI have worked with the 690 with no issues for me, except at the very beginning when they launched, when I had a problem with my card not being able to downclock in 2D.

Since micro stuttering is perceived by some and not others, I can honestly say I have not seen any that I couldn't attribute to even a single card in-game or to a new game release.

If you're planning on going to any higher resolution or multiple monitors, I would suggest a card with more VRAM, as it has been confirmed 2GB of VRAM is not enough in more cases than not.


----------



## samoth777

Quote:


> Originally Posted by *Arizonian*
> 
> On my single 120 Hz or 1440p it is great. First dual GPU for me. I was expecting all the headaches that I heard come along with dual GPUs. However this series was done very well everything from frame metering, temps, noise, extra thick PCB, to the design.
> 
> The drivers that work with the 680 SLI have worked with the 690s with no issues for me. Except for the very beginning when they launched and I had a problem with my card not being able to down clock in 2D.
> 
> Since micro stuttering is perceived by some and not others, I can honestly say I have not seen any that I couldn't attribute to even a single card in game or new game release.
> 
> If you're planning on going on any higher resolution or multiple monitors I would suggest a card with more the VRAM as it has been confirmed 2GB VRAM is not enough in more cases than not.


thanks for that info dude.







So it's going to be a matter of whether I can perceive the microstuttering or not. Hmmm...

Can anyone else share their 690 experience with me? I just worry about hassles like stuttering, driver issues, etc., and whether they exist at all.


----------



## wermad

Quote:


> Originally Posted by *samoth777*
> 
> thanks for that info dude.
> 
> 
> 
> 
> 
> 
> 
> so it's going to be a matter of me being able to perceive the microstuttering or not. hmmm...
> 
> can anyone else share with me their 690 experience? I just worry about hassles like stuttering or driver issues etc. whether they exist or not at all.


With Nvidia, drivers tend to be much more reliable and stable. I have a couple of 690s and they run awesome in 1200p Surround. One should handle a 1080p screen with ease. Don't worry about the 2GB of VRAM, since it will hardly be an issue with a single 1080p screen. The only game that maxes VRAM on my setup is Crysis 3 w/ Surround.


----------



## mtbiker033

Quote:


> Originally Posted by *samoth777*
> 
> thanks for that info dude.
> 
> 
> 
> 
> 
> 
> 
> so it's going to be a matter of me being able to perceive the microstuttering or not. hmmm...
> 
> can anyone else share with me their 690 experience? I just worry about hassles like stuttering or driver issues etc. whether they exist or not at all.


the 690 is the nicest, most powerful, quietest, and most efficient card I have ever had (I've been using SLI for a while):

SLI 8800GT
SLI GTX260 core 216
GTX295
SLI GTX470
SLI GTX570
690

Make a custom fan profile, try to keep temps below 70C, set the power target to 135%, and you are good to go.
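The custom fan profile mtbiker033 mentions is just a temperature-to-fan-speed curve that Precision X or Afterburner interpolates for you. A sketch of the idea (the curve points are made-up examples, not recommended values):

```python
# Hypothetical curve points (temp C, fan %); examples, not tuned values.
CURVE = [(30, 30), (50, 45), (60, 60), (70, 85), (80, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Linearly interpolate between curve points, clamping at the ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(65))  # 72.5
```

The steeper segment approaching 70C is what keeps the card below the throttle point at the cost of noise; the tools let you drag these points around instead of editing numbers.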


----------



## Qu1ckset

Quote:


> Originally Posted by *mtbiker033*
> 
> the 690 is the nicest, most powerful, quietest, and most efficient card I have ever had (been using SLI for awhile)
> 
> SLI 8800GT
> SLI GTX260 core 216
> GTX295
> SLI GTX470
> SLI GTX570
> 690
> 
> make a custom fan profile, try to keep temps below 70, set power target to 135% and you are good to go


I've owned a fair amount of cards myself, and I'm the most pleased with my GTX 690; I haven't had the upgrade bug for a while.

(1920x1080) CFX 2x HD 5770 (quiet, with a lil bit of stuttering)
(1920x1080) CFX 2x HD 5970 (beastly cards, but lots of stuttering, loud, and power hungry)
(2560x1600) SLI 2x GTX 580 (loved this setup; a lil power hungry with my OC'd i7 950; sold the rig, needed money)
(5760x1080) CFX 1x HD 6990 (beast card, tons of Eyefinity issues, very very loud!)
(2560x1440) SLI 1x GTX 690 (beast @ 1440p, really quiet on air, and very power efficient)

I've owned a few cards in between those, but didn't have them long enough to give my full opinion on them.


----------



## wermad

690s have been smooth for me too. Very nice temps on water, both idle and load.

my history:

4870x2 + 4870
2x 470
4x 480
3x 560 Ti 448
3x 6970 Lightnings
2x 590
7970
670 4gb
4x 580 3gb
(now) 2X 690


----------



## Lukas026

greetings again

After some time I decided to put some aftermarket cooling on my ASUS GTX 690. Now to my question: the only air cooler compatible with it is the Accelero Twin Turbo 690. It has some great references on the net, but it is so ugly. I had it before on my last MSI 690, but I returned it, and I don't think I want to buy it again...

Maybe I should try watercooling, but to be honest I am a little scared; I have never used it before on my GPU, CPU, or RAM. What sort of things do I need to be able to use a waterblock on my single 690? And which one would you suggest? Also, is there some guide to help me better understand GPU watercooling?

I know I seem stupid now







but you've always helped me, so I am trying again...

thanks


----------



## mtbiker033

Quote:


> Originally Posted by *Lukas026*
> 
> greetings again
> 
> after some time i decided to use some aftermarket cooling on my asus gtx 690. now to my question. only air cooler compatible with it is accelero twin turbo 690. it has some great references on net but it is so ugly. i had it before on my last msi 690 but i returned it and I dont think i want to buy it again...
> 
> maybe I should try watercooling. but to be honest I am little scared. i never used it before either on my gpu / cpu or rams. what sort of things i need to be able to use waterblock on my single 690 ? and which one would you suggest ? also is there some guide to help me better understand watercooling gpu at all ?
> 
> i know i seem stupid now
> 
> 
> 
> 
> 
> 
> 
> but you always helped me so i am trying again...
> 
> thanks


I would recommend either water cooling or keeping it stock. The stock air cooling solution works well enough in my application, though it depends on each owner's case/airflow/ambients. I use a test bench, so I have no problem with stock air cooling. YMMV!


----------



## justanoldman

Quote:


> Originally Posted by *Lukas026*
> 
> greetings again
> 
> after some time i decided to use some aftermarket cooling on my asus gtx 690. now to my question. only air cooler compatible with it is accelero twin turbo 690. it has some great references on net but it is so ugly. i had it before on my last msi 690 but i returned it and I dont think i want to buy it again...
> 
> maybe I should try watercooling. but to be honest I am little scared. i never used it before either on my gpu / cpu or rams. what sort of things i need to be able to use waterblock on my single 690 ? and which one would you suggest ? also is there some guide to help me better understand watercooling gpu at all ?
> 
> i know i seem stupid now
> 
> 
> 
> 
> 
> 
> 
> but you always helped me so i am trying again...
> 
> thanks


I think you are limited to rad space in your case, and rads are what you need for cooling underwater. I went the cheap route for my first loop by expanding the Swiftech H220 to cool my 690.

Water cooling your cpu is easy with several AIO kits you can buy. Going full custom or expanding something like the H220 is a decent amount of work and time. Delidding two chips with a razor blade was easy compared to setting up a water loop right.

You first have to decide on a budget, you can do it for $400 or you can spend several thousand. Googling water cooling guides will give you many to choose from depending on how much time you want to spend on it. Then you will start wondering what size tubing, which pump and whether to do two or one, which fittings, what reduces flow and what doesn't.

And on and on, then you can read many threads where people argue back and forth was is better and never really know. Which rads, what size rads, mixed metal causing problems, which coolant or additives, etc. There is no right or easy answer I guess, I am still trying to figure it out.

My simple loop is the H220, an extra 320 rad with three fans, a 690 waterblock and backplate, four fittings, tubing, and coolant. I am also looking to add a 140 rad to it, since my temps are OK but could be better. FrozenCPU is a good place to ask questions about specific gear when you are ready, and you get a 5.1% OCN discount.

You can choose not to OC the 690 and use a quiet fan curve, which will let it throttle a little at 70C; the card is not too loud. If you want quiet and an OCed card, then water is great; my simple loop is one of the best improvements I have made. I know all that will leave you with more questions than answers, but hopefully it helps a little.


----------



## Acapella75

Just got my 690 in the mail last night. Got it off eBay for $800. I went for single card sli over 2 cards due to matx case and future itx build. I'll post a pic when I get home. I'm stoked!!


----------



## anuraj1

Quote:


> Originally Posted by *Acapella75*
> 
> Just got my 690 in the mail last night. Got it off eBay for $800. I went for single card sli over 2 cards due to matx case and future itx build. I'll post a pic when I get home. I'm stoked!!


Congratulations!


----------



## TeamBlue

Quote:


> Originally Posted by *Acapella75*
> 
> Just got my 690 in the mail last night. Got it off eBay for $800. I went for single card sli over 2 cards due to matx case and future itx build. I'll post a pic when I get home. I'm stoked!!


Congratulations! Here's a super bad hackjob photoshopped cell phone pic... There's a 690 in there somewhere.



The original for reference.



I had to burn it pretty bad to get the card to show up.


----------



## PCModderMike

Quote:


> Originally Posted by *Acapella75*
> 
> Just got my 690 in the mail last night. Got it off eBay for $800. I went for single card sli over 2 cards due to matx case and future itx build. I'll post a pic when I get home. I'm stoked!!


Awesome price for such a card. That's about the same price I got mine for, and it's the exact same reason. I did an mATX build and it's great having so much power on a single PCB. Congrats.


----------



## iARDAs

I would take GTX 690 anyday over a GTX 680 or a GTX 670 2GB SLI setup.


----------



## PCModderMike

I had a GTX 670 2GB SLI setup not too long ago, and ya I do much prefer the 690. Less heat, less power, and performs a lot better.


----------



## y2jdmbfan

Quote:


> Originally Posted by *iARDAs*
> 
> I would take GTX 690 anyday over a GTX 680 or a GTX 670 2GB SLI setup.


What about versus the Titan?


----------



## maximus56

I would take a single 690 over a single Titan any day.

In fact, the only upgrades I would ever recommend are as follows:

2 Titans over 1 690, and 3 Titans over 2 690s. This is based on my experience with both cards in both configurations. Unfortunately, the Titan tri-SLI (or quad-SLI, for that matter, which may never happen) drivers are not optimized yet, but I am hoping that NVIDIA gets around to it soon, or all my expenditure will have been in vain.


----------



## Arizonian

Quote:


> Originally Posted by *y2jdmbfan*
> 
> What about versus the Titan?


Less performance but better frame times on the Titan vs. the 690; that's normal for single vs. dual GPU. With frame metering on the 690, micro stuttering isn't even worth considering as an issue for me. Some prefer the single-card solution. If this dual-GPU card had issues I'd concur, but after 9 months I cannot say it does. I love mine.

Up to you and your preference, as it's very subjective depending on who you ask. In the end there are no real losers with either GPU, as they are both amazingly powerful for current-gen gaming.

Quick edit: if you're going multiple monitors, go Titan. If staying single monitor, 2GB VRAM is enough right now at 1440p or below.


----------



## y2jdmbfan

Quote:


> Originally Posted by *Arizonian*
> 
> Less performance but better frame times on the Titan vs 690. Normal for dual vs single GPU. With frame metering on the 690, micro stuttering isn't even worth considering for me as an issue. Some prefer the single card solution. If this dual GPU had issues I'd concur but after 9 months I cannot say it does. I love mine.
> 
> 
> Up to you and your preference as it's very subjective depending on who you ask. In the end no real losers with either GPU as they are both amazingly powerful for current gen gaming.
> 
> Quick Edit....if your going multiple monitors - go Titan. If staying single monitor 2GB VRAM is enough right now on 1440p or below.


Single 1440P for me as of now...Just moved up to 1440P from 2 1200P Dells...Don't feel like I really need 2 1440P monitors, and I am definitely not into surround gaming....


----------



## Lagpirate

Just got my EVGA backplate for my 690 in the mail today! I have to go to work really soon, but I will post pics as soon as I can!
If anyone is having problems getting an EVGA backplate, get one off Amazon; they have a few left. I am pretty sure EVGA is still sold out.


----------



## Cylas

Has anyone tried copying the voltage block from a GTX 680 BIOS to the GTX 690 BIOS?


----------



## max883

How do I copy the voltage block from a GTX 680 BIOS to the GTX 690 BIOS? Do you have a BIOS I can test?


----------



## wholeeo

If I were you guys, I wouldn't waste your time. After trying various BIOSes and ways to get my voltage over 1.175V, I don't think it's possible without a hard mod.


----------



## wermad

Hey guys, if anyone needs a water block for their 690, MrTooShort is selling an ek one with a rare nickel plated backplate:

http://www.overclock.net/t/1378965/gtx-690-ek-waterblock-ek-backplate


----------



## justanoldman

Quote:


> Originally Posted by *wermad*
> 
> Hey guys, if anyone needs a water block for their 690, MrTooShort is selling an ek one with a rare nickel plated backplate:
> 
> http://www.overclock.net/t/1378965/gtx-690-ek-waterblock-ek-backplate


Ah man, I would have bought those in a second if I had not bought mine a couple weeks ago.


----------



## wholeeo

Quote:


> Originally Posted by *wermad*
> 
> Hey guys, if anyone needs a water block for their 690, MrTooShort is selling an ek one with a rare nickel plated backplate:
> 
> http://www.overclock.net/t/1378965/gtx-690-ek-waterblock-ek-backplate


I actually got to put up the Koolance block you sold me. My brother decided not to watercool.


----------



## wermad

Quote:


> Originally Posted by *wholeeo*
> 
> I actually got to put up the Koolance block you sold me. My brother decided not to watercool.


Kewl. I love the temps on mine. It looks pretty good with the EVGA backplate, and it's surprisingly light.


----------



## Whatupdoe1337

Hello all, I'm downloading Far Cry 3 now and was wondering what settings I would need to reach an average of 120fps on a single 690 and a 3930K @ 4.5GHz? Any help would be appreciated.


----------



## wermad

Quote:


> Originally Posted by *Whatupdoe1337*
> 
> Hello all, downloading far cry 3 now, and was just wondering what settings I would need to reach an average of 120fps on a single 690 and a 3930k @ 4.5ghz? Any help would be appreciated


What screen size?


----------



## Shaitan

I finally got my build set up the way I want it. It was a long road getting a card that worked normally (this is my fourth card), but here she is in all her glory.


----------



## Whatupdoe1337

Quote:


> Originally Posted by *wermad*
> 
> What screen size?


Whoops. Sorry, I play at 1080p.


----------



## Whatupdoe1337

One other question too; sorry for being so dependent, I'm rather new to PC gaming. I've reached a stable OC of +130 on the core and +400 on the memory. While playing BF3, my clocks locked in at 1176MHz on the core and 3204MHz on the memory; are these normal clock speeds for this kind of OC? I was hoping to reach at least 1200 on the core.


----------



## wermad

Quote:


> Originally Posted by *Whatupdoe1337*
> 
> Woops. Sorry, I play at 1080p


Is it a 120Hz monitor? If not, with high settings I'm sure you can hit 100+. The minimum is 60+, imho, to get a great experience. I ran with the default settings in Surround and got 100-200 fps. Some of the intense fighting scenes do slow it down a bit, though.

Fill out your specs in your profile, btw.


----------



## OmniScience

Hey everyone,

This question is targeted at those with BIOS-modded 690s. I recently modded my two 690s to allow a 150% power target and 1.175V. I'm completely on water, and neither card ever peaks above 45 degrees. My question, however: with the new BIOS, is anyone finding it harder to clock them? I used to be able to run 1.15V at a 130% power target and sync all cores to +150 core and +200 memory in Precision. Now, with the extra voltage and 150% PT, I can barely finish Valley with any clock at all. What is everyone else experiencing? Also, is it in my interest to clock each core separately? I started, and have gotten some really weird results, such as:

GPU1: +153MHz core/+200MHz memory
GPU2: +145MHz core/+500MHz memory... is this normal?!

Have yet to do GPUs 3 and 4.

Not sure if my PSU is limiting me, as I also have a 3930K running at 5.0GHz. My AX1200 is coming Wednesday.

Love to hear what others have experienced. Rocking the following:

EDIT: my 1000W power supply just isn't enough at the clocks, speeds, pumps, etc., that I'm pushing this thing with.


----------



## Lagpirate

Hey guys, I'm having a few issues with my 690. I got it about 3 weeks ago, and I loved it so much I ordered an EVGA backplate for it. The backplate came in on Saturday, and I installed it Sunday night. Upon installation, I fired up my computer and noticed that EVGA Precision X was only recognizing one GPU. I thought, "well, this is strange. Maybe SLI got disabled somehow for some reason..." I opened up NVIDIA Control Panel, and it too was only recognizing one GPU, not even giving me the option to enable SLI. At this point I'm thinking, "oh great, I botched the install and fried a GPU core." So I removed the backplate, and sure as ****, my GPU started working normally again: both GPUs detected, both DVI slots working. At this point I'm like, "cool, I fixed it," but just to make sure I started running benchmarks/stress tests in OC Scanner, trying to see if everything was still holding up as it should be. Zero artifacts, and the temps are the same as before I even put the backplate on. So my question is: is there anything I should be worried about? I already plan on emailing EVGA to inform them of this issue and explain that I'm kind of upset that I can't use my backplate for fear of ruining my $1k card. I'm not even going to attempt to put it back on, because I feel like I've gotten lucky already...

Sorry for the long post! Any input is appreciated.


----------



## TeamBlue

Hey guys, thought you should see this giveaway, there are plenty of 690 owners in this forum, so it's kind of on topic, right?
http://www.monsterpcmods.com/forum/viewtopic.php?f=21&t=18


----------



## wholeeo

Quote:


> Originally Posted by *Lagpirate*
> 
> Hey guys, I'm having a few issues with my 690. I got it about 3 weeks ago, and I loved it so much I ordered an EVGA backplate for it. The backplate came in on Saturday, and I installed it Sunday night. Upon installation, I fired up my computer and noticed that EVGA Precision X was only recognizing one GPU. I thought, "well, this is strange. Maybe SLI got disabled somehow for some reason..." I opened up NVIDIA Control Panel, and it too was only recognizing one GPU, not even giving me the option to enable SLI. At this point I'm thinking, "oh great, I botched the install and fried a GPU core." So I removed the backplate, and sure as ****, my GPU started working normally again: both GPUs detected, both DVI slots working. At this point I'm like, "cool, I fixed it," but just to make sure I started running benchmarks/stress tests in OC Scanner, trying to see if everything was still holding up as it should be. Zero artifacts, and the temps are the same as before I even put the backplate on. So my question is: is there anything I should be worried about? I already plan on emailing EVGA to inform them of this issue and explain that I'm kind of upset that I can't use my backplate for fear of ruining my $1k card. I'm not even going to attempt to put it back on, because I feel like I've gotten lucky already...
> 
> Sorry for the long post! Any input is appreciated.


I actually had the same problem. I thought the backplate may have shorted something, but after half an hour or so of troubleshooting it seemed that the card wasn't completely seated in the PCIe slot. That surprised me, as I had installed the backplate without removing the card from the PC. Seeing your post, though, has me doubting that that was the issue. Maybe it was indeed something to do with the plate.


----------



## maximus56

Quote:


> Originally Posted by *Lagpirate*
> 
> Hey guys, I'm having a few issues with my 690. I got it about 3 weeks ago, and I loved it so much I ordered an EVGA backplate for it. The backplate came in on Saturday, and I installed it Sunday night. Upon installation, I fired up my computer and noticed that EVGA Precision X was only recognizing one GPU. I thought, "well, this is strange. Maybe SLI got disabled somehow for some reason..." I opened up NVIDIA Control Panel, and it too was only recognizing one GPU, not even giving me the option to enable SLI. At this point I'm thinking, "oh great, I botched the install and fried a GPU core." So I removed the backplate, and sure as ****, my GPU started working normally again: both GPUs detected, both DVI slots working. At this point I'm like, "cool, I fixed it," but just to make sure I started running benchmarks/stress tests in OC Scanner, trying to see if everything was still holding up as it should be. Zero artifacts, and the temps are the same as before I even put the backplate on. So my question is: is there anything I should be worried about? I already plan on emailing EVGA to inform them of this issue and explain that I'm kind of upset that I can't use my backplate for fear of ruining my $1k card. I'm not even going to attempt to put it back on, because I feel like I've gotten lucky already...
> 
> Sorry for the long post! Any input is appreciated.


Yes, it does have something to do with the backplate. I had the same issue with it and had to tighten the screws well for it to work. I was using a Koolance block, though. Just blow air over or clean the surface well before installing it, in case something is shorting, but I doubt it.


----------



## Lagpirate

Quote:


> Originally Posted by *wholeeo*
> 
> I actually had the same problem. I had thought the back plate may have shorted something but after half an hour or so of troubleshooting with it seemed to be that the card wasn't completely all the way into the PCIE slot. It was to my surprise as I had installed the backplate without removing the card from the PC. Seeing your post though has me doubting that that was the issue. Maybe it was indeed something to do with the plate,


Yeah, I actually made sure twice that it was securely in the PCIe slot, just to rule out a simple mistake like that. It was securely in place, and both power cables were plugged in all the way.
Quote:


> Originally Posted by *maximus56*
> 
> Yes, it does have something to do with the backplate. It had the same issue with it, and had to tighten the screws well for it to work. I was using a Koolance block though. Just air blow or clean the surface well before installing it, just in case something is shorting it, but I doubt it.


I wonder if the EVGA backplates have the same issue. I really tried not to over-tighten the screws because I have heard horror stories about people ruining their GPUs that way... I even tried loosening the screws a bit when the card didn't fire, fearing that I may have over-tightened them. But it still didn't fire after that. Maybe I should have tried tightening them instead.

Honestly, the whole experience has me spooked. I'm not sure I want to try reinstalling it, but if I don't, I'm stuck with a backplate I can't use.


----------



## lowgun

Hello, 690 club! I've been thinking about moving to two 690s from my current two 670 4GBs. I play with NVIDIA Surround across monitors, and although people scream about 2GB not being enough, from my tests and all kinds of reviews online it doesn't make much of a difference for today's games. What I'm worried about are reports that the second 690 barely adds any performance, if any at all; it would seem that 3/4-way scaling was quite bad. These reviews and reports are from when the 690 came out, though. I'm hoping the drivers have matured since then and people are getting higher GPU usage across all four GPUs now, but I haven't found any sources testing this newer than September of last year. Could anyone shine some light on whether that second 690 has kicked its butt into gear on Surround setups?


----------



## Qu1ckset

Quote:


> Originally Posted by *lowgun*
> 
> Hello 690 club! I've been thinking about moving to 2 690's from my current to 670 4GBs. I play with Nvidia Surround across monitors, and although I know people scream about 2GB not being enough, from my tests and all kinds of reviews online, it doesn't make much of a difference for games today. What I'm worried about is reports from people saying that the second 690 barely adds any performance, if any at all. So it would seem that the 3/4 way scaling was quite bad. Now, these reviews and reports were from when the 690 came out. I'm hoping the drivers have matured since then and people are getting higher GPU usage across all 4 GPUs now, but I haven't been able to find any sources testing this information newer than September of last year. Could anyone shine some light on if that second 690 has kicked its butt in to gear on surround setups?


Honestly, I wouldn't recommend getting 2x GTX 690s for Surround. 4-way CFX/SLI is a waste, because the last card (fourth GPU) doesn't add a lot of performance, and the 2GB VRAM cap will bottleneck the cards more easily. Why not get a third 670? Or even wait it out until the 700-series cards?


----------



## Qu1ckset

I'm seriously considering selling or trading my GTX 690 in for a Titan. Do you think the Titan will ever pass the GTX 690 in performance with driver updates, or do you think this rumored baby Titan in SLI would be a better path?

The extra VRAM is looking tempting, but I have zero use for the compute part of the card.


----------



## lowgun

With a third/fourth 670, they will be sandwiched together and won't have much breathing room. The 700 series is seemingly forever and a day away, and the Titan is stupidly expensive for the performance it gives. I'd like to get a Titan, but sitting between a 680 and a 690, it should have been $700-900 instead of $1000+.


----------



## Lagpirate

Quote:


> Originally Posted by *Qu1ckset*
> 
> I'm seriously considering selling or trading my gtx690 in for a titan, do you think the titan will ever pass the gtx690 in performance with driver updates, or do you think this rumor baby titan in SLI wud be a better path?
> 
> The extra vram is looking tempting but I have zero use for the compute part of the card


Comparing the 690 to a single Titan isn't really fair, because the 690 is a dual-GPU card and the Titan a single-GPU one. At 1080p, the 690 will always beat the Titan. However, once you get into Surround gaming on high-resolution monitors, the Titan starts to become the better choice because of its VRAM and higher memory bandwidth. Honestly, if you plan on going higher than 1440p on a single monitor, or 1080p in Surround, I would personally recommend the Titan. You may not get as high fps, but your overall gaming experience will be much smoother. I hope this helped.

Edit: to fully answer your question, I would probably wait until the Titan LE comes out. You will save a little money going SLI, and your 690 will still fetch a hefty price a couple of months from now.


----------



## lowgun

So what are the supposed specs for this Titan "Light"? Is there any information floating around on it, or a link I could check out?


----------



## Arizonian

Quote:


> Originally Posted by *Qu1ckset*
> 
> I'm seriously considering selling or trading my gtx690 in for a titan, do you think the titan will ever pass the gtx690 in performance with driver updates, or do you think this rumor baby titan in SLI wud be a better path?
> 
> The extra vram is looking tempting but I have zero use for the compute part of the card


Quote:


> Originally Posted by *Lagpirate*
> 
> Comparing the 690 to a single Titan isn't really fair because the 690 is a dual gpu card vs. the Titan being a single gpu. At 1080p, the 690 will always beat out the Titan. However, once you start to get into surround gaming on high resolution moniors, the Titan starts to become the better choice because of the VRAM and higher memory bandwidth. Honestly if you plan on going higher than 1440p on a single monitor or 1080p on surround gaming, I would personally recommend the Titan. You may not get as HIGH as fps, but your overall gamin experience will be much smoother. I hope this helped you.
> 
> 
> 
> Edit: to fully answer your question, I would probably wait until the Titan light comes out. You will save a little bit of money on going sli, and your 690 will still fetch a hefty price a couple months from now.


Comparing two products priced the same is very fair.

I agree that if you're going Surround, go Titan. If you're going to spend $2000 on two cards, single or multiple monitor, go Titan.

Single monitor, single card: you can debate whether you'll get a better gaming experience from the Titan, due to frame times, over a single 690 with frame metering. If I saw an issue with my single 690 I'd concur, but that's not the case. Frame times you can't see while the FPS is pushing through are moot unless you see micro stutter, which I do not on my setup. In this scenario either the Titan or the 690 will do great, with the 690 pushing more FPS. Yes, the Titan is the fastest single GPU, with some of the same perks over a dual-GPU card to a degree (thanks, frame metering), but for the same price it is not the fastest FPS setup.


----------



## Lagpirate

Quote:


> Originally Posted by *lowgun*
> 
> So what is the supposed specs for this Titan "Light"? Is there any information floating around on it or a link I could check out?


There is a thread floating around on here, in the rumors section. I would link it, but I'm on my smartphone right now and it's not being so smart. It's called something like "NVIDIA to release rumored Titan LE." The rumored specs are the same clocks and boost clocks as the Titan, with 5GB of VRAM and slightly lower memory bandwidth. It's almost identical to the Titan, just scaled down about 10-15%. As a result, you can expect the price to be scaled down about 10-15% as well, which would place the Titan LE at a price point of about 850-900 dollars. But again, this is all speculation at this point. Should be interesting to see what NVIDIA does, to say the least.


----------



## Qu1ckset

Quote:


> Originally Posted by *Lagpirate*
> 
> Comparing the 690 to a single Titan isn't really fair because the 690 is a dual gpu card vs. the Titan being a single gpu. At 1080p, the 690 will always beat out the Titan. However, once you start to get into surround gaming on high resolution moniors, the Titan starts to become the better choice because of the VRAM and higher memory bandwidth. Honestly if you plan on going higher than 1440p on a single monitor or 1080p on surround gaming, I would personally recommend the Titan. You may not get as HIGH as fps, but your overall gamin experience will be much smoother. I hope this helped you.
> 
> 
> 
> Edit: to fully answer your question, I would probably wait until the Titan light comes out. You will save a little bit of money on going sli, and your 690 will still fetch a hefty price a couple months from now.


I have a huge itch for an upgrade. Don't get me wrong, my GTX 690 is a beast and runs everything with ease at 1440p; the only game to challenge it to date is Crysis 3. But I'm thinking about the end of this year and early next year, with the next-gen consoles, and I don't want to be bottlenecked by VRAM. In benches the Titan isn't too far behind, so I wouldn't mind switching to it, but if the Titan LE comes out at 600-700 I might grab two and SLI them if they're 15% worse than the Titan. I'd get two Titans, but 2k is way more than I'd like to spend, lol.

Edit: I plan on staying with a single 1440p monitor.


----------



## Lagpirate

Quote:


> Originally Posted by *Arizonian*
> 
> Comparing two products priced the same is very fair.
> 
> I agree with if your going surround go Titan. If your going to spend $2000 for two cards regardless single or multiple monitor, go Titan.
> 
> Single monitor - single card - you can debate will get a better gaming experience from Titan due to frame rates over single 690 with frame metering. IF I saw an issue with my single 690 I'd concur but it's not the case. Frame times you can't see when FPS is pushing through is moot unless you see micro stutter, which I do not on my set up. In this scenario Titan or 690 will do great with 690 pushing more FPS. Yes Titan is fastest single GPU with same perks over a dual gpu to some degree (thanks frame metering) but for the same price not the fastest FPS set up.


While you have a point in saying that both cards have the same price point, I still don't think they belong in the same category, as one is a single-GPU and the other a dual-GPU card. I completely agree with your points, though. If you are going to spend 2k on GPUs, go Titan. Multi-monitor, DEFINITELY Titan. I also agree with your comments on micro stuttering; I have yet to see ANY micro stuttering issues on my 690 either, because it just pushes so many frames a second. So yes, for the same price, the 690 will give you better fps on a single monitor.


----------



## Qu1ckset

Quote:


> Originally Posted by *lowgun*
> 
> With a third/fourth 670 they will be sandwiched together and won't have much breathing room. The 700 series is seemingly forever and a day away, and the Titan is stupid expensive for the performance is gives. I'd like to get a Titan, but being in between a 680 and 690 it should have been $700-900 instead of $1000+


Watercooling solves the sandwich issue.


----------



## lowgun

Quote:


> Originally Posted by *Qu1ckset*
> 
> Watercooling solves the sandwich issue


I keep thinking of adding a full loop, but I just can't justify the huge cost when I could put that money toward something that will actually add performance (such as another video card).


----------



## King4x4

The good thing about watercooling is that it pays for itself in the long run.

A water loop generally lives for seven years with its rig, so that's seven years of pure rig silence, with a lot of procrastination when changing parts. For those who have an upgrade itch every month, it's a huge money saver!


----------



## Lagpirate

Quote:


> Originally Posted by *lowgun*
> 
> I keep thinking of adding a full loop, but I just can't justify the huge cost for it when I could apply that money towards something that will actually add performance (such as another video card).


Also, water cooling essentially does add performance. With a good loop you can overclock your GPUs/CPU much higher than on air. You'd be surprised how good a loop you can get nowadays for a decent price.


----------



## lowgun

Quote:


> Originally Posted by *Lagpirate*
> 
> Also, water cooling does essentially add performance. With a good loop you can overclock your gpu's/CPU much higher than on air. You'd be surprised how good of a loop you can get nowadays for a decent price.


I think this was a very valid reason pre-600 series, but now NVIDIA won't let us push our cards more than they want us to. I have my CPU on an H100, but the GPUs are on air. I might still do a full loop one day, but right now, for the $500+ I'd have to drop on a full loop, I just don't find it worth it. I wish NVIDIA would actually let us use the cards we own the way we want to.


----------



## Whatupdoe1337

Hey, I could really use some help. The second GPU in my 690 is not clocking over the 915MHz base at all. Even without an OC it doesn't move to the boost clock, though its memory clock matches GPU1's. Any ideas?


----------



## Whatupdoe1337

OK, got it to boost, but it's sitting exactly 13MHz lower than GPU1. Is this normal?


----------



## Whatupdoe1337

And now it's doing the first thing again...


----------



## wermad

Quote:


> Originally Posted by *Whatupdoe1337*
> 
> Hey, Could really use some help. My second gpu in my 690 is not clocking over the 915 mhz standard at all...even without OC, it doesnt move to the boost clock at all, though the memory is equal to gpu1. Any ideas?


Hmmm...are you running air or water? Could be a bad core.
Quote:


> Originally Posted by *Whatupdoe1337*
> 
> Ok, got it to boost, but its sitting exactly 13mhz lower than gpu1, is this normal?


It's normal for the cores to be off a bit like that.
Quote:


> Originally Posted by *Whatupdoe1337*
> 
> And now its doing the first thing again...***


If you have a utility like Afterburner or EVGA Precision, try resetting it to default settings there. Also, run a GPU benchmark and check your GPU usage.


----------



## Lagpirate

Quote:


> Originally Posted by *Whatupdoe1337*
> 
> Ok, got it to boost, but its sitting exactly 13mhz lower than gpu1, is this normal?


When a 690 GPU core downclocks, it drops exactly 13MHz at 70C, and another 13MHz for each additional 10C over 70C. How hot is your GPU core getting? It could be thermal downclocking.
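To put numbers on that, here's a quick sketch of the rule. This models only the rule of thumb described above, not NVIDIA's actual GPU Boost tables, and the 1176MHz example clock is just a placeholder:

```python
def boost_after_throttle(boost_mhz: int, temp_c: float) -> int:
    """Estimate a GTX 690 core clock after thermal throttling.

    Rule of thumb from this thread: the core drops one 13MHz bin
    at 70C, plus one more bin for each full 10C above 70C.
    """
    if temp_c < 70:
        return boost_mhz  # below the throttle point: no bins dropped
    bins = 1 + int((temp_c - 70) // 10)  # 70C -> 1 bin, 80C -> 2 bins, ...
    return boost_mhz - 13 * bins

# With a hypothetical 1176MHz boost clock:
# 65C -> 1176 (no throttle), 72C -> 1163 (one bin), 81C -> 1150 (two bins)
```

So if one core sits exactly one 13MHz bin below the other, a temperature hovering right around 70C is the likely culprit.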


----------



## wermad

OC'd: on the primary card, the first core runs ~1200 and the second ~1189; on the second card, the first core is 1197 and the second 1195.


----------



## propeldragon

I have a question: on the 690, which is GPU 1 and which is GPU 2? Is GPU 1 toward the back of the case, where you plug in the DVI cables, and GPU 2 at the front?

Thanks


----------



## endergx

Do you guys know why you can't use an EK backplate as a standalone thing, without the block? I bought one without researching... but this might be an excuse to just go for a fully watercooled system.


----------



## wermad

Quote:


> Originally Posted by *propeldragon*
> 
> i have a question. for the 690 which is gpu 1 and which is gpu 2? is gpu 1 towards the back of the case where you plug in the dvi cable and 2 at the front?
> 
> thanks


I believe the "A" core is the one next to the slots.
Quote:


> Originally Posted by *endergx*
> 
> do you guys know why you can't use an ek backplate as a standalone thing without the block? i bought one without researching... but this might be an excuse to just go for a full watercooled system.


As long as the screws can reach, it should work; same as the EVGA backplate, which can be added to the stock NV cooler.


----------



## PhantomTaco

Just thought I'd drop in and say you should probably remove me from the 690 club; I've been on SLI Titans for a good while now. Still love the 690.


----------



## endergx

Thanks wermad!


----------



## Whatupdoe1337

OK, I fixed it for now; time will tell how permanent the fix is, though. If it persists, I'll contact support and see if I can RMA or something. I am on air, but my temps never rise above 70℃, so I don't think it was that. I had to uninstall and reinstall the drivers as well as Precision X, then re-apply my stable overclock (+120 core and +300 mem), but it's working in benchmarks and Far Cry 3 so far. Thank you very much to wermad and Lagpirate for the help.


----------



## Dragon69

help please! I just got my 690..
playing Crysis and Tomb Raider.
If I enable SLI on my 690 (both cores A and B working), it lags, I mean crazy lagging problems. Not playable: 1-10 fps in Crysis and Tomb Raider.
If I disable SLI, (A) works and (B) is off. It's normal, playable but not that smooth, typical single-680 performance: 20-35 fps in Crysis and Tomb Raider...

Found out that the 2nd (B) core's clock is not moving. 324 MHz is the max it reaches.. weird








Is this a driver problem, or do I need to replace my card? Sucks though, I just got it hehe

Also, 690+680 SLI is impossible.. NVIDIA Control Panel automatically puts the 680 on dedicated PhysX


----------



## wermad

Quote:


> Originally Posted by *Dragon69*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> help please! i just got my 690..
> playing crysis and tomb raider.
> if i enable SLI on my 690, 2 Cards working A and B, it lags, i mean crazy lagging problems. not playable. 1-10fps for crysis and tomb raider
> if i disable SLI, (A) 690 working (B) 690 off. its normal, playable but not that smooth. typical single 680 performance.. 20-35fps for crysis and tomb raider...
> 
> found out that the 2nd (B) 690s clocks is not moving. 324Mhz is the max Mhz it reaches.. weird
> 
> 
> 
> 
> 
> 
> 
> 
> is this a driver problem or i need to replace my card? sucks though i just got it hehe
> 
> also 690+680 sli is impossible.. nvdia control panel automatically puts the 680 to decicated physX


Tomb Raider has been known to be hella buggy with the GTX 6xx series. Some ppl have issues, some don't. It's hit or miss right now. Most likely a driver update will fix this.

Unfortunately, you can't SLI a 690 and a 680. It doesn't work like AMD's CrossFire. It will use the other card as a PhysX card.


----------



## Dragon69

I updated my drivers to 314.22.
In Metro 2033 the same thing happens. Lags a lot if SLI is enabled..

Card B is only running at 324 MHz







pulling Card A down, I guess

thanks for replying wermad


----------



## wermad

Quote:


> Originally Posted by *Dragon69*
> 
> i updated my drives 314.22
> on metro 2033 same thing happen. lags alot if SLI enabled..
> 
> Card B is only running at 324Mhz
> 
> 
> 
> 
> 
> 
> 
> pulling Card A down i guess
> 
> thanks for replying wermad


Go back to the last WHQL driver. If you have a GPU utility, reset it back to default.


----------



## Dragon69

yup.. i have evga precision X.. and its on default








always have been


----------



## wermad

Quote:


> Originally Posted by *Dragon69*
> 
> yup.. i have evga precision X.. and its on default
> 
> 
> 
> 
> 
> 
> 
> 
> always have been


Oh, could be a bad core. Make sure you're using the x16 slot on the mb.


----------



## Dragon69

yup i am using the x16 slot









Just wondering if a BIOS flash would help. I'm not sure.


----------



## wermad

Quote:


> Originally Posted by *Dragon69*
> 
> yup i am using the x16 slot
> 
> 
> 
> 
> 
> 
> 
> 
> 
> just wondering if a bios flash would help. im not sure


Wouldn't hurt


----------



## Kranik

Can I join? : )



Please? : )


----------



## propeldragon

Quote:


> Originally Posted by *Dragon69*
> 
> 
> 
> 
> 
> help please! i just got my 690..
> playing crysis and tomb raider.
> if i enable SLI on my 690, 2 Cards working A and B, it lags, i mean crazy lagging problems. not playable. 1-10fps for crysis and tomb raider
> if i disable SLI, (A) 690 working (B) 690 off. its normal, playable but not that smooth. typical single 680 performance.. 20-35fps for crysis and tomb raider...
> 
> found out that the 2nd (B) 690s clocks is not moving. 324Mhz is the max Mhz it reaches.. weird
> 
> 
> 
> 
> 
> 
> 
> 
> is this a driver problem or i need to replace my card? sucks though i just got it hehe
> 
> also 690+680 sli is impossible.. nvdia control panel automatically puts the 680 to decicated physX


did you reinstall drivers after you switched?


----------



## Zhohner

The beast hard at work:





It's the Gigabyte model.


----------



## Dragon69

Nice rig, Zhohner!


----------



## Dragon69

Quote:


> Originally Posted by *propeldragon*
> 
> did you reinstall drivers after you switched?


yes i did








Used Driver Sweeper,
updated to the beta drivers too, and 314.22.
All the same. The 2nd GPU's clock is 324 MHz only.. bringing the 1st one down with it if I enable SLI mode.


----------



## PinzaC55

Quote:


> Originally Posted by *wermad*
> 
> I believe the "A" core is the one next to the slots.
> as long as the screws can reach, it should work. Same thing as the EVGA backplate which can be added to the stock NV cooler.


I have already used the EK plate as a standalone. You need six M2.5 x 7 screws with black countersunk heads. The EK screws are too big and the GTX 690 screws are too short.


----------



## TeamBlue

Quote:


> Originally Posted by *PhantomTaco*
> 
> Just thought I'd drop in and say prob shoudl remove me from the 690 club, been on sli titans for a good while now
> 
> 
> 
> 
> 
> 
> 
> . Still love the 690


I love your 690 because it's my 690 now! lol.


----------



## wermad

Quote:


> Originally Posted by *Zhohner*
> 
> The beast hard at work:
> 
> 
> 
> 
> 
> It's the Gigabyte model.


Awesome! HAF XB is on my list for my next possible upgrade.


----------



## Whatupdoe1337

Hello all, back again, sadly with another problem. My second core randomly keeps clocking to the max clocks, even while I'm just browsing the internet or on the desktop. Out of nowhere my fans just ramp up and precision X reads full voltage and clocks, but only for the second gpu on my 690. Any thoughts?


----------



## Zhohner

Quote:


> Originally Posted by *wermad*
> 
> Awesome! HAF XB is on my list for my next possible upgrade.


Thanks mate.







out of the cases I've owned this one is definitely my favorite.


----------



## FiShBuRn

I will put my 690 under water this week. If anyone needs an EVGA backplate, contact me, because I will be using the EK backplate.


----------



## Lukas026

hello again

please update my status on first page of the thread. now I am on ASUS GeForce GTX 690










I am stable at +125 MHz core and +500 MHz memory in many games for countless hours, and also in synthetic benchmarks.

Crysis 3, Tomb Raider, Far Cry 3, Metro 2033, all on max settings. But one thing above all: The Witcher 2 with ubersampling on. God, it's so damn good. Whoever hasn't tried it yet should.









also posting some pics and results:

Proof:



Unigine Valley 1.0:



3D Mark 11 - Extreme:


----------



## PinzaC55

Just curious... does anyone here have a second card (not a GTX 690) dedicated to PhysX, and if so, what card, and what effect does it have on performance?


----------



## rent.a.john

I just recently bought an EVGA 690. My buddy is coming over and I'm looking to hook my computer up to my TV to play some Steam Big Picture mode. Only problem is, this card does not have an HDMI port. Will the DVI to HDMI adapter that came with the card output sound through my TV?


----------



## Qu1ckset

Quote:


> Originally Posted by *rent.a.john*
> 
> I just recently bought a EVGA 690, my buddy is coming over and I'm looking to hook my computer up to my TV to play some steam big picture mode. Only problem is, this card does not have a HDMI slot, will the DVI to HDMI adapter that came with the card output sound through my TV?


No, the only difference between DVI and HDMI is that HDMI has sound while DVI does not. If your TV is like mine, it might have a 3.5mm jack to accommodate computers, and then all you would need is a male-to-male 3.5mm cord.

Best solution is to get a Mini DisplayPort to HDMI cord, and then the sound will work.


----------



## wholeeo

Quote:


> Originally Posted by *Qu1ckset*
> 
> No the only difference between dvi and hdmi is hdmi ha sound like dvi does not. If your tv is like mine it might have a 3.5mm jack to accommodate computers, and then all you would need is a male to male 3.5mm cord
> 
> Best solution is to get a mini display port to hdmi cord and then the sound will work


Apologies, but this is wrong and is a common misconception regarding the DVI to HDMI adapter that comes with most GPUs.
Quote:


> Originally Posted by *rent.a.john*
> 
> I just recently bought a EVGA 690, my buddy is coming over and I'm looking to hook my computer up to my TV to play some steam big picture mode. Only problem is, this card does not have a HDMI slot, will the DVI to HDMI adapter that came with the card output sound through my TV?


The DVI to HDMI adapter that came with the card will indeed output sound.


----------



## Qu1ckset

Quote:


> Originally Posted by *wholeeo*
> 
> Apologies but this is wrong and is commonly mistaken in regards to using the DVI to HDMI adapter that comes with most GPU's.
> The DVI to HDMI adapter that came with the card will indeed output sound.


Really, I guess you learn something everyday!


----------



## rent.a.john

Awesome, thanks you two!


----------



## wermad

Hmmmmm, thought the PC's sound would come from the onboard or a sound card???

Quote:


> Originally Posted by *PinzaC55*
> 
> Just curious...does anyone here have a second card (not GTX 690) dedicated to Phyx and if so, what card and what effect does it have on performance?


It shouldn't degrade performance, but it will enhance performance in games that do take advantage of PhysX.


----------



## Qu1ckset

Quote:


> Originally Posted by *wermad*
> 
> Hmmmmm, thought the pc's sound would come from the onboard or a sound card, ???


You can choose which device plays back sound; for example, video card, sound card, or onboard.


----------



## wermad

Quote:


> Originally Posted by *Qu1ckset*
> 
> you can choose which device to playback sound example, videocard, sound card or onboard..


Didn't know that. How do you route it? Would love to hook it up to the flat-screen in the living room.


----------



## Qu1ckset

Quote:


> Originally Posted by *wermad*
> 
> Didn't know that, how do you route it? would love to hook it up to the flat-screen in the living room


On your desktop, right-click the speaker icon on the very right of your taskbar, then click Playback Devices and choose the one you want your sound to play through!

or

Assuming you're on Win7, go Control Panel > Hardware and Sound > Manage Audio Devices, and then choose the playback device of your choice!

Hope that helps


----------



## wermad

Hey guys, I got an offer for a different gpu setup. I'm selling my last 690. Any one interested in going quad (powah!!!!), send me a pm







.


----------



## wholeeo

Titans?


----------



## wermad

Quote:


> Originally Posted by *wholeeo*
> 
> Titans?


Lips are sealed until they're in my hands









7970s, 7990s, 7950s, 680 4gbs, 670 4gbs, ...?????


----------



## konigsberg7

By default, which GPU does NVIDIA set PhysX to, GPU 1 or 2? I don't use GPU B, but GPU B is the normal one to plug your DVI into, so I'm wondering if it sets it to B because it's empty, or if it's just like that even if you have a monitor connected to B.


----------



## BENSON519

I have an interesting question. Currently I only have a wimpy GTX 660, and I am upgrading very soon to a 690. I have an ASUS VG248QE monitor, which is the 144Hz monitor. My question is: will this beast of a card run 5760x1080 at a decent fps? I am sure I cannot run ultra settings until I can afford another card in SLI. I only play BF3, which is why I want the 120-plus Hz. Anybody have a setup like this? I cannot find any info anywhere, so I figured this was a good place to start. I only have enough dough for one card and 2 monitors right now.


----------



## FiShBuRn

My baby is underwater:


----------



## Arizonian

Quote:


> Originally Posted by *FiShBuRn*
> 
> My baby is underwater:
> 
> 
> Spoiler: Warning: Spoiler!


Nice FiShBuRn.







If you over clock it post back how much you were able to squeeze.


----------



## justanoldman

Quote:


> Originally Posted by *BENSON519*
> 
> I have a interesting question. Currently I only have a wimpy gtx 660 and I am upgrading very soon to a 690. I have a asus vg248qe monitor which is the 144hz monitor. My question is will this beast of a card run 5760x1080p at a decent fps? I am sure I cannot run ultra settings until I can afford another card in sli. I only play bf3 which is why I want the 120 plus hz. Anybody have a setup like this? I cannot find any info anywhere so I figured this was a good place to start. I only have enough dough for one card and 2 monitors right now


Triple monitors is tough. 120fps on it is not going to happen with any setup I know of with normal settings. If you look at the Valley 1.0 thread at 5760x1080 four Titans or four 7970 cards only get in the 90s. One 690 beats a Titan in pretty much every way, but for triple monitors I would probably go with two Titans and that should get you over 60fps.

Take a look at these graphs on page 18 and 19 of this review. It shows how a 690 and Titans do at 5760x1080:
http://www.guru3d.com/articles_pages/geforce_gtx_titan_3_way_sli_review,18.html


----------



## BENSON519

Quote:


> Originally Posted by *justanoldman*
> 
> Triple monitors is tough. 120fps on it is not going to happen with any setup I know of with normal settings. If you look at the Valley 1.0 thread at 5760x1080 four Titans or four 7970 cards only get in the 90s. One 690 beats a Titan in pretty much every way, but for triple monitors I would probably go with two Titans and that should get you over 60fps.
> 
> Take a look at these graphs on page 18 and 19 of this review. It shows how a 690 and Titans do at 5760x1080:
> http://www.guru3d.com/articles_pages/geforce_gtx_titan_3_way_sli_review,18.html


I do not plan on playing on ultra for now. I currently game with one EVGA GTX 660 SC, and with the settings on high I can easily get 80 fps; on ultra I can only get 45. This is only one monitor. I found some benchmarks where the 690 on ultra was averaging 70 fps completely maxed out at 5760x1080. What I am asking is: should it be around 90 fps on high settings? I don't think one Titan right now would do well with three ASUS VG248QE monitors. Basically I am asking if I can hook up 3 of these monitors to one GTX 690 and still get an awesome experience.


----------



## BENSON519

Quote:


> Originally Posted by *BENSON519*
> 
> I do not plan on playing on ultra for now. I currently game with one EVGA GTX 660 SC, and with the settings on high I can easily get 80 fps; on ultra I can only get 45. This is only one monitor. I found some benchmarks where the 690 on ultra was averaging 70 fps completely maxed out at 5760x1080. What I am asking is: should it be around 90 fps on high settings? I don't think one Titan right now would do well with three ASUS VG248QE monitors. Basically I am asking if I can hook up 3 of these monitors to one GTX 690 and still get an awesome experience.


These are BF3 benchmarks.


----------



## justanoldman

Quote:


> Originally Posted by *BENSON519*
> 
> I
> do not plan on playing on ultra for now. I currently game with one evga gtx 660 sc and with the settings on high I can easily get 80fps. On ultra I can only get 45. This is only one monitor. I found some benchmarks where the 690 on ultra was averaging at 70fps completely maxed out at 5760x1080. What I am asking is it should be around 90fps on high settings? I don't think one Titan right now would do well with three asus vg248qe monitors. Basically I am asking if I can hook up 3 of these monitors to one gtx 690 and still get an awesome experience


I understand. In my opinion, one 690, one Titan, or two 680s is not going to give what most people would call an awesome experience at 5760x1080; I think that would take two Titans. But if you are willing to turn down all the settings, then a 690 will be able to handle it. I can't tell you exactly what your fps will be, though, since it will depend on the game and how far you turn down each graphics setting. I game on a single monitor at 2560x1600 and the 690 is great for me. I did a quick test with my three work monitors, and while it worked fine, my eyes got a little tired of the bezels and the fisheye effect, and I like being able to up the settings on my single 30 inch.

Someone else here might run triple monitors with a single 690, they would be able to give you more specific info.


----------



## FiShBuRn

Quote:


> Originally Posted by *Arizonian*
> 
> Nice FiShBuRn.
> 
> 
> 
> 
> 
> 
> 
> If you over clock it post back how much you were able to squeeze.


1228/7100 - is the max stable i can get


----------



## BENSON519

Quote:


> Originally Posted by *justanoldman*
> 
> I understand, in my opinion one 690, one Titan, or two 680s is not going to give what most people would call an awesome experience at 5760x1080. I think that would take two Titans. But if you are willing to turn down all the settings then 690 will be able to handle it. I can't tell you what exactly your FPS will be though since it will be dependent on what game and how far your turn down each graphics setting. I game on a single monitor at 2560x1600 and the 690 is great for me. I did quick test with my three work monitors, and while it worked fine, my eyes got a little tired of the bezels and fish eye effect, and I like being able to up the settings on my single 30 inch.
> 
> Someone else here might run triple monitors with a single 690, they would be able to give you more specific info.


Until December 10th, 2012, I was playing battlefield 3 on Xbox so anything on pc is awesome to me. Lol. I cannot believe the difference. If I can get 3 monitors to do the same as one gtx 660 on 1 monitor, I would be as happy as a tornado in a trailer park! Lol


----------



## wermad

Quote:


> Originally Posted by *BENSON519*
> 
> Until December 10th, 2012, I was playing battlefield 3 on Xbox so anything on pc is awesome to me. Lol. I cannot believe the difference. If I can get 3 monitors to do the same as one gtx 660 on 1 monitor, I would be as happy as a tornado in a trailer park! Lol


Two of them (aka quad SLI) are not that great due to the poor scaling. Three-way is still the honey hole for the best investment. 2GB is still pretty solid for Surround; you just might have to tone down the MSAA a bit to help frames. There are guys in the Surround club running three 2560x1440 monitors with two or three Keplers, so a couple of them in 1080/1200 Surround will do fine. So to answer your question, it's a great choice with the 690. But don't rule out the 670 or 680; with the latter you can go triple (albeit your mb will need to support 3-way, but there's a ton of older and cheap mbs out there to fix that), though for three cards, water is highly recommended. I love the 690's cooler; though I never really used it much personally, a lot of folks say it's pretty darn good. So more pros for the 690, especially if you're going with an air cooled setup.








gents and ladies, sold my 690s to new homes. moving on to different pastures


----------



## BENSON519

Quote:


> Originally Posted by *wermad*
> 
> Two of them (aka quad SLI) are not that great due to the poor scaling. Three-way is still the honey hole for the best investment. 2GB is still pretty solid for Surround; you just might have to tone down the MSAA a bit to help frames. There are guys in the Surround club running three 2560x1440 monitors with two or three Keplers, so a couple of them in 1080/1200 Surround will do fine. So to answer your question, it's a great choice with the 690. But don't rule out the 670 or 680; with the latter you can go triple, though for three cards, water is highly recommended.
> 
> gents and ladies, sold my 690s to new homes. moving on to different pastures


Thank you for the advice. Greatly appreciated. Now I have lots of options to look at.


----------



## Tmzasz

EVGA with EK Waterblock


----------



## justanoldman

Quote:


> Originally Posted by *Tmzasz*
> 
> EVGA with EK Waterblock


If you are stable with 4.5 at around 1.2v that is not a bad chip. Good candidate for delidding and taking it up to 4.8 or so if you are so inclined. Have you tried to oc the 690 yet?


----------



## Tmzasz

Well, I've got a mediocre OC on the 690 now, but I might be able to take it further; I just need a good stress test for it.

Power target is at 115%
GPU offset is at +159 MHz
Mem offset is at +465 MHz


----------



## justanoldman

Quote:


> Originally Posted by *Tmzasz*
> 
> well ive got a mediocre oc on the 690 now but might be able to take it further i just need a good stress test for it
> 
> Powertarget is at 115%
> GPU offset is at 159 mhz
> Mem offset is at 465 mhz


+159 is hardly mediocre; that is rather high, actually. Most 690s are in the +100-150 range, but can usually go a little higher on the memory, as in the 400-700 range. You can raise the power target all the way up; it will just allow the card to go as high as it needs.

Try out Valley 1.0 and see what score you can get, a few of us are listed on the chart.
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0-fill-the-form-single-and-multi-monitors
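As a side note on what those offsets mean: they are added on top of the stock clocks. A minimal sketch, assuming the reference 690 boost clock of 1019 MHz (actual boost varies per card, load, and temperature, so treat these as ballpark targets, not guarantees):

```python
# Quick arithmetic for a Precision X-style GPU offset, assuming the
# reference GTX 690 boost clock. Actual sustained clocks depend on
# GPU Boost bins, power target, and temperature.

STOCK_BOOST_MHZ = 1019  # reference GTX 690 boost clock

def boosted_clock(offset_mhz: int) -> int:
    """The offset shifts the whole boost curve up by a fixed amount."""
    return STOCK_BOOST_MHZ + offset_mhz

for offset in (100, 125, 160):
    print(f"+{offset} -> {boosted_clock(offset)} MHz before thermal bins")
```

Which is why the clocks people report in this thread (roughly 1190-1230 MHz) line up with offsets in the +100 to +160 range.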


----------



## Tmzasz

There's a screenie of the results of Valley. After about 15 trial and error runs I managed to get the settings dialed in as follows:

Power Target 135%
GPU Offset +160
MEM Offset +480 (any higher and fps drop, but increasing the GPU offset any higher is unstable)

Max temp during the bench was:
GPU 1: 56°C
GPU 2: 53°C

They both idle at 40-41°C.


----------



## egotrippin

Quote:


> Originally Posted by *Tmzasz*
> 
> 
> Theres a screenie of the results of valley after about 15 tiral and error runs i managed to get the settings dialed in as follows
> 
> Power Target 135%
> GPU Offset +160
> MEM offset +480 ( any higher fps drop but increasing the GPU offset any higher unstable )
> 
> max temp during bench was
> GPU 1 56 Deg C
> GPU 2 53 Deg C
> 
> they both idle at 40-41 Deg C


That's awesome and I'm jealous. I can't get my GPU clock over 125 (thankfully I haven't needed to). Now all you have to do is put all your chips under water and you won't even be able to tell they are running ;-)


----------



## egotrippin

I haven't posted in here in a while, but I have a tidbit of a new perspective after recently switching from water cooling to air cooling (temporarily).

I recently had to take my waterblocks off because I sold my old case + rads without buying a new one first, so I'm running on air for the first time, and I don't like it one bit. Not only is there the added noise, but I noticed the smell around the computer changes. Now that fans are angrily rushing air over parts running in the 50-60°C range, I can detect the char-grilled dust particles floating up my nose.

I never considered that air vs. water cooling would affect the air quality in my room, but it does, and I can't wait to switch back soon. When my blocks are properly drowned I can't hear OR smell or otherwise detect their presence in the room, but on air, it's a nasty little air- and noise-polluting monster that growls at me every time I try to play with it.

That said, errrrrbody get wet!


----------



## Tmzasz

Yea, it's got an EK full cover block on it; that's why the temps are as low as they are. Not sure I can push it any faster, and any advice on how to proceed is welcome. (I come from a 460 SLI setup, and before that an 8800 GTS 320 that even under water could double as a hot plate, so my GPU OC knowledge is a bit rusty.)


----------



## egotrippin

Quote:


> Originally Posted by *Tmzasz*
> 
> Yea it's got an ek full cover block on it that's why the temps are as low as they are not sure I can push it any faster and advice on how to proceed is welcome ( I come from a 460 SLI setup and before that a 8800GTS 320 that even under water could double as a hot plate so my gpu oc knowledge is a bit rusty)


My bad! I don't have any clue whether a higher overclock would be possible if you lowered your temps more. I'm not sure when temp becomes a factor in stability in the general range we're speaking of, but I do know that when benching or gaming my GPU temps are generally 40°C ± 3°C. Hellofalotta good it does me, though, because your overclock is blowing mine away.


----------



## Cimimonsti

I saved every penny.


----------



## justanoldman

Quote:


> Originally Posted by *Tmzasz*
> 
> There's a screenie of the results of Valley. After about 15 trial and error runs I managed to get the settings dialed in as follows: Power Target 135%, GPU Offset +160, MEM Offset +480 (any higher and fps drop, but increasing the GPU offset any higher is unstable). Max temp during the bench was GPU 1 56°C, GPU 2 53°C; they both idle at 40-41°C.


Here are some hints when trying to get a good benchmark score:
Nvidia control panel things to try: multi display to single display performance, power management to max performance, texture filtering quality to high performance, vsync to off. Then switch to the Windows Basic theme and turn off any background things you can. Don't have any monitoring software running, and only have one monitor plugged in if you have more than one.
Quote:


> Originally Posted by *Cimimonsti*
> 
> I saved every penny.


Congrats! It is a nice card, hope you enjoy it.


----------



## PCModderMike

Quote:


> Originally Posted by *wermad*
> 
> 
> 
> 
> 
> 
> 
> 
> gents and ladies, sold my 690s to new homes. moving on to different pastures


That didn't take very long....what's next?


----------



## uio77

Hello guys:

I finally saved enough to gift myself this wonderful card. But before ordering it, I have a question for all the GTX 690 owners. Now that AMD will finally release its "mighty" 7990 for a rumored price of $850, do you guys think NVIDIA will lower the price of the 690 to match the competition? I can wait a couple of weeks to pull the trigger. Any suggestions??


----------



## Lagpirate

I honestly doubt that nv will lower the price on the 690. Nv cards really hold their value, the 580 is still selling brand new at damn near list price. If I were you I would find a nice used 690 and save. I picked up a 2 week old 690 for 800 bucks on ocn. Just keep an eye out


----------



## zalbard

Quote:


> Originally Posted by *uio77*
> 
> Hello Guys :
> 
> I finally saved enough to gift myself this wonderful card. But before order it, I have a question for all the GTX 690 owners. Now that AMD will finally release its "mighty" 7990 for a rumored price of $850; do you guys think, Nvidia will lower the price of the 690 to match competition? I can wait a couple of weeks to pull the trigger. Any suggestion??


They won't.

And with the current driver issues I would stay away from 7990.


----------



## uio77

Thanks for the suggestions. I am eyeing a couple of "like new" ones on Amazon. If there is no answer from Nvidia by Friday, I will go ahead and order one..

I have no plans to get a 7990. Crossfire has way too many problems.

I will post pictures by next week, to see if you let me join the club..


----------



## Tmzasz

Personally, I haven't owned an ATI card since I was given a 9500 Pro; I just haven't ever liked ATI enough to try the newer ones (that, and that 9500 Pro kept overheating playing Last Chaos lol). Now with notebooks I'm weird: I won't touch an NVIDIA notebook chip (had one catch fire due to that defect a few years back, and HP tried to say it was my fault). And with AMD's latest marketing on their FX chips, calling it a native or true 8-core when it's 4 cores doing the same dual threading as Intel, well, AMD won't ever see me give them another penny in the desktop market LOL


----------



## Qu1ckset

Both sides sell good products, and whichever side has the best product I will buy. Both NVIDIA and AMD have come a long way with smoothness, but for now I'm sticking with NVIDIA. I will be selling my GTX 690 next month to grab a Titan, or wait and see if a dual Titan comes out..

Before, I was mostly using AMD cards: 2x 5770s > 2x 5970s > 2x 580s > HD 6990 > current GTX 690


----------



## Tmzasz

Hmmm, a Titan dualie would be sick imo, and I'm seriously partial to green and blue; guess that's another reason I went Intel/NVIDIA LOL


----------



## egotrippin

I'm amazed by all the kind and tolerant comments about AMD in here. You can go onto any forum on the internet and come across heated debates about AMD vs Nvidia and fanboys are foaming at the mouth on both sides, attacking each other and the rival brand with such vehement criticism. The truth is they both have awesome products that fulfill market needs.

If I was a straight value oriented shopper (cheap), looking to mine bitcoins (gold dig), or use certain compute functions (hacking) I'd go with an AMD setup but for gaming (entertainment), folding (saving the world), or aesthetics (is it wrong to love something because it's beautiful?) I like Nvidia. =]


----------



## Tmzasz

OK, I've identified my card's max stable clock settings after numerous tests through Valley and OC Scanner (real-world samples to come soon):

Power Target 135%
GPU Offset +164 (165 crashes 20 seconds in)
MEM Offset +480 (oddly, 479 is hit and miss, and at 481+ it takes a crap on itself...)

Max recorded temp over 1 hr running Valley in loop mode was 57°C on GPU 1 and 54°C on GPU 2.


----------



## PinzaC55

It's interesting to see how the price of GTX 690s is dropping on eBay. A used 690 with no packaging or accessories just sold for £460, and a faulty ASUS GTX 690 sold for £255, whereas a similar card in the same condition from the same seller sold for £411 a couple of months ago.


----------



## wermad

Quote:


> Originally Posted by *PinzaC55*
> 
> Its interesting to see how the price of GTX 690's is dropping on ebay. A used 690 with no packaging or accessories just sold for £460 and a faulty ASUS GTX 690 sold for £255 whereas a similar card in the same condition by the same seller sold for £411 a couple of months ago.


No new gen cards (other than Titan) is the reason. In other words, it's still very desirable. I sold my old 690s for what they're typically going for in the forums and on eBay. Even though Titan is a great piece of hardware, the 690 really still has some stronger points compared to it or SLI 680s, especially in a single screen setup. The 690 rocks any single screen (not sure about the 4K screens, though), and it really is the epitome of single card with single screen setups. I will miss em 690s.

Rumor has it the new gen, including AMD's, will not be out until this fall. Late fall to early winter is when you might see a drop in price for the 6xx series. From what I've gathered over the years, it may be a ~10-20% drop during that time; expect a big drop by the middle of next year. Two years after launch, the Fermi 5xx series cards are selling for about 40-50% of their launch MSRP.
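For the curious, those rough resale figures work out like this. A back-of-the-envelope sketch only, using the $999 launch MSRP; the percentages are this thread's estimates, not market data:

```python
# Back-of-the-envelope resale math from the depreciation figures above:
# ~10-20% off MSRP when the next gen lands, and roughly 40-50% of MSRP
# two years after launch. These percentages are forum estimates.

MSRP = 999  # GTX 690 launch price in USD

def price_after(fraction_of_msrp: float) -> int:
    """Round the remaining fraction of MSRP to whole dollars."""
    return round(MSRP * fraction_of_msrp)

print("next-gen launch, ~15% off:", price_after(0.85))    # midpoint of 10-20% off
print("two years out, ~45% of MSRP:", price_after(0.45))  # midpoint of 40-50%
```

Roughly $850 at the next-gen launch and $450 two years out, which lines up with the used prices being reported in the thread.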


----------



## PinzaC55

Being the owner of a 690, I hope they maintain their price. On the other hand, the falling price of used ones makes it more attractive to consider quad SLI. It's a double-edged sword! BTW, I have my Android phone set up to alert me to eBay listings of 690s, and when someone lists one with a "Buy It Now" price of about £550 it disappears within a couple of hours.


----------



## uio77

I just read two reviews of the AMD 7990, and the GTX 690 is still the king of the dual-GPU cards. This just pushed me to buy it already. Arriving on Friday... I will post pictures...


----------



## PinzaC55

Quote:


> Originally Posted by *uio77*
> 
> I just read two reviews of the AMD 7990, and the GTX 690 is still the king of the dual-GPU cards. This just pushed me to finally buy one. Arriving on Friday... I will post pictures.


Even if they were equal, the GTX 690 would still win because of the gargantuan size of the 7990 and its THREE fans, FFS! It will sound like a hovercraft at full revs: http://www.pcworld.com/article/2032177/amd-provides-a-sneak-peek-at-its-radeon-hd-7990.html


----------



## Qu1ckset

Quote:


> Originally Posted by *PinzaC55*
> 
> Even if they were equal, the GTX 690 would still win because of the gargantuan size of the 7990 and its THREE fans, FFS! It will sound like a hovercraft at full revs: http://www.pcworld.com/article/2032177/amd-provides-a-sneak-peek-at-its-radeon-hd-7990.html


That's what waterblocks are for!


----------



## mightyphoenix

New member to the club


----------



## Viewer3

Quote:


> Originally Posted by *mightyphoenix*
> 
> New member to the club


That's a lot of horsepower you've got there.

On an unrelated note, I just noticed that the EVGA GTX 690 on Newegg has now gone to "DISCONTINUED" status...

The ASUS version is still listed for sale, though. I've seen the 690 go "out of stock" every now and then on Newegg, but I've never seen it given "discontinued" status. Does anyone think this is just a mistake (which could very well be corrected by the time some of you actually read this), or could the GTX 690 really be on its way out in favor of the Titan and/or a dual Titan sometime in the future?


----------



## Arizonian

Quote:


> Originally Posted by *Viewer3*
> 
> That's a lot of horsepower you've got there.
> 
> On an unrelated note, I just noticed that the EVGA GTX 690 on Newegg has now gone to "DISCONTINUED" status...
> 
> The ASUS version is still listed for sale, though. I've seen the 690 go "out of stock" every now and then on Newegg, but I've never seen it given "discontinued" status. Does anyone think this is just a mistake, or could the GTX 690 really be on its way out in favor of the Titan and/or a dual Titan sometime in the future?


"Discontinued" for Newegg just means the product is not in stock and they don't know when it'll be back. When they don't have a date for the next shipment, after a little while it goes to discontinued status. I've seen many items that showed discontinued come back in stock.


----------



## Buzzkill

Quote:


> Originally Posted by *Viewer3*
> 
> That's a lot of horsepower you've got there.
> 
> On an unrelated note, I just noticed that the EVGA GTX 690 on Newegg has now gone to "DISCONTINUED" status...
> 
> The ASUS version is still listed for sale, though. I've seen the 690 go "out of stock" every now and then on Newegg, but I've never seen it given "discontinued" status. Does anyone think this is just a mistake (which could very well be corrected by the time some of you actually read this), or could the GTX 690 really be on its way out in favor of the Titan and/or a dual Titan sometime in the future?


Newegg lists out-of-stock items as discontinued.


----------



## Tmzasz

Yeah, Newegg likes the extremist approach to inventory maintenance, LOL. At least FrozenCPU differentiates between plain out of stock and out of stock but orderable, LOL.


----------



## TheBlindOne

Hope this pic is good enough to join the club. Just love this EVGA 690.


----------



## Cimimonsti

I have that waterblock. Can't say the same for quad SLI, liquid cooled. =(

Awesome setup.


----------



## Cylas

Lars Weinand, Sr. Technical Marketing Manager at Nvidia, said in another forum that Nvidia now allows manufacturers control over the voltage-control features for the 690 series. Is this the answer to the new HD 7990?


----------



## Viewer3

Quote:


> Originally Posted by *Buzzkill*
> 
> Newegg lists out-of-stock items as discontinued.


Not true. In fact, I'm looking at a few "Out of Stock" video cards on Newegg as we speak, and none of them have the red "Discontinued" label. They simply say "OUT OF STOCK" near the top of the page.

Though it looks like the GTX 690 listing was corrected, as I figured it would be. It's now back in stock. =]


----------



## noob.deagle

So I'm not too happy with my GTX 690 temps. Even at stock it easily hits 80°C (stock fan profile), and on an aggressive fan profile I still hit at least 76°C in most games.

What are people's opinions on replacing the GTX 690's TIM? I assume that if you open it and just replace the TIM, you can close it back up with no need to replace the thermal pads? I've never done it on a GPU before, so I'm not sure if it's worth doing.


----------



## PCModderMike

It's definitely an option, but you won't notice a huge difference changing out the stock TIM. Also if you remove the cooler carefully and do not damage the thermal pads, then yes you can still use them when putting the card back together. 80 Celsius is really not that bad for an air cooled 690. Before I water cooled mine, I averaged around 85C in my small FT03 during intense gaming.
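If you want hard numbers before deciding, nvidia-smi can dump per-GPU temperatures as CSV (`nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader,nounits`). Here's a quick, hypothetical Python helper for flagging readings over a chosen ceiling; the function name and sample values are made up for illustration:

```python
# Hypothetical helper: parse the output of
#   nvidia-smi --query-gpu=temperature.gpu --format=csv,noheader,nounits
# which prints one temperature per GPU, one per line, and flag anything
# running at or above a chosen ceiling.
def hot_gpus(csv_output: str, ceiling_c: int = 80) -> list:
    """Return the temperatures (deg C) at or above ceiling_c."""
    temps = [int(line.strip()) for line in csv_output.splitlines() if line.strip()]
    return [t for t in temps if t >= ceiling_c]

# A 690 shows up as two GPUs, so the sample has two lines:
sample = "76\n81\n"
print(hot_gpus(sample))  # -> [81]
```

Log it every few seconds during a gaming session and you'll know whether a TIM swap actually moved the needle.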


----------



## wermad

There are a few 690 blocks for ~$100 or less. Good time to make the jump to water. Mine idled at ~20-25°C and loaded at 35-40°C.









All righty guys, 'preciate all the help and I'm glad to have contributed to this club. I'm officially no more 690, thus...


----------



## PCModderMike

Goodbye werm....enjoy your Titans


----------



## Qu1ckset

You can remove me off the list, just sold my 690, what to get next..









Bye Bye!!


*The tube was there so it didn't leak, as water was still in the block.*


----------



## egotrippin

Quote:


> Originally Posted by *noob.deagle*
> 
> So I'm not too happy with my GTX 690 temps. Even at stock it easily hits 80°C (stock fan profile), and on an aggressive fan profile I still hit at least 76°C in most games.
> 
> What are people's opinions on replacing the GTX 690's TIM? I assume that if you open it and just replace the TIM, you can close it back up with no need to replace the thermal pads? I've never done it on a GPU before, so I'm not sure if it's worth doing.


I replaced the thermal paste, but only because I switched to waterblocks and then back to air... and I'm about to go to water again. I don't know if it made a difference versus the factory TIM, since I never ran the card on the factory TIM before getting a waterblock. I can say that it is very easy to take the card apart and put it back together. Just make sure you have a Torx (star-shaped) screwdriver.


----------



## noob.deagle

Quote:


> Originally Posted by *egotrippin*
> 
> I replaced the thermal paste, but only because I switched to waterblocks and then back to air... and I'm about to go to water again. I don't know if it made a difference versus the factory TIM, since I never ran the card on the factory TIM before getting a waterblock. I can say that it is very easy to take the card apart and put it back together. Just make sure you have a Torx (star-shaped) screwdriver.


I'm very tempted. Based on other places I've read, people have had good results changing it, but paying $1000 for a card and then opening it scares me a little. I'm not sure whether the ASUS warranty still applies if you do something like this, unlike EVGA's; guess I'll look into warranties first.


----------



## maximus56

Quote:


> Originally Posted by *Qu1ckset*
> 
> You can remove me off the list, just sold my 690, what to get next..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bye Bye!!
> 
> 
> *The tube was there so it didn't leak, as water was still in the block.*


Why not jump on the Titan bandwagon like everyone else?


----------



## Qu1ckset

Quote:


> Originally Posted by *maximus56*
> 
> Why not jump on the Titan bandwagon like everyone else?


Thinking about getting a Titan, but if they make the Titan LE into the GTX 780 with 5GB of VRAM, we could see a GTX 790, which would be an über card!!


----------



## maximus56

Quote:


> Originally Posted by *Qu1ckset*
> 
> Thinking about getting a Titan, but if they make the Titan LE into the GTX 780 with 5GB of VRAM, we could see a GTX 790, which would be an über card!!


"Über" is an interesting adjective.. lol


----------



## uio77

Do any of you have any idea where to find a backplate for the EVGA 690?


----------



## wholeeo

Quote:


> Originally Posted by *uio77*
> 
> Do any of you have any idea where to find a backplate for the EVGA 690?


evga.com 

frozencpu also has a few different types.


----------



## konigsberg7

Does anyone ever crash in the way where your screen goes black and your whole computer stops responding, so you have to restart?

This happened to me in Tomb Raider and once in BioShock Infinite. I remember in Tomb Raider (until newer drivers were released) the game would crash a lot for everyone, and I would get the scenario above (where I'd have to restart my computer). Around that time I found out that if I went to "Manage 3D Settings" and set multi-GPU rendering mode to "Single GPU", it would give me a TDR error and just crash to the desktop, and I wouldn't have to restart.

Any opinions on this?

It only happens rarely. I've been able to run Furmark benchmarks and Unigine Heaven benchmarks and it doesn't crash at all.

I don't OC. I have an 850-watt PSU (Corsair 850HX), and just the one GTX 690.
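For what it's worth, the TDR timeout itself is a documented Windows registry setting, and I've seen raising it suggested as a diagnostic (so a slow frame triggers recovery later instead of blacking out the screen). A hypothetical .reg sketch; the 8 is just an example value, and you should back up the registry before touching it:

```
Windows Registry Editor Version 5.00

; Raise the GPU timeout-detection delay from the 2-second default
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"TdrDelay"=dword:00000008
```

If the black screens stop once the delay is raised, that points at a driver/workload hang rather than dead hardware.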


----------



## Paps.pt

I've seen some users with 690s in SLI. Is that really worth it?


----------



## maximus56

Quote:


> Originally Posted by *Paps.pt*
> 
> I've seen some users with 690s in SLI. Is that really worth it?


Some people think it's not. I was pretty happy with it in my main rig. Now I'm in the process of putting these cards into a secondary rig or two.


----------



## uio77

Thanks for the links...









Sadly, EVGA does not have it in stock.

All the plates on FrozenCPU are for waterblocks. I am on air.


----------



## egotrippin

Quote:


> Originally Posted by *noob.deagle*
> 
> I'm very tempted. Based on other places I've read, people have had good results changing it, but paying $1000 for a card and then opening it scares me a little. I'm not sure whether the ASUS warranty still applies if you do something like this, unlike EVGA's; guess I'll look into warranties first.


Try taking apart your $1000 card, hooking up a waterblock, and then putting that card under a loop with a radiator and reservoir, creating about a dozen points of failure that could send your 690 for a swim! If you're going to take the cooler off, don't do it just for TIM. Go aqua-fantastic with liquid cooling, or, if you feel like adopting a half measure that doesn't cool as well as water but still has the added benefit of defacing the beauty of the card, check out this monstrosity: the ARCTIC Accelero Twin Turbo 690 VGA cooler. I hear it works pretty well.


----------



## kzinti1

I was within one click of buying another GTX 690 this morning. Then I remembered the backplate.
I can't find one. Except at EVGA Europe. Oh, and one on Amazon for $60 or so, just for this $20 plate.
Then I figured I'd just buy another Titan, instead. But I'd have to buy another waterblock like the one I'm using now (XSPC) and it only has a 4 slot and a 6 slot water bridge available.
I only need a 3 slot bridge.
No SLi bridge plus having to buy another plate and waterblock for a new Titan.
It looks like the best thing I can do is put these latest cards I already own back in their boxes and mount the pair of GTX 590's I already have.
Or buy another GTX 680 and run tri-SLi.
I'm wanting something really exceptional to mount in my new Corsair 900D.
I can't understand EVGA not having GTX 690 backplates in stock. They knew there would be a run on 690s when the Titan came out.
EVGA has been disappointing quite a few people lately. They need to get their act together.
EVGA should have put backplates on each of their videocards above $500.
When they cheap out on a necessary protection like a backplate, I start worrying about what other cuts they've made during manufacturing, along with the abolition of their lifetime warranty. They've become quite greedy lately, or something else is going on.
Their motherboards haven't been very worthwhile for some time now.
I'm wondering if they're looking at some upcoming hostile takeover and are just making all the money they can for now.


----------



## uio77

Just installed the GTX 690 and it's crashing my games. First the frame rate starts to drop, the image freezes, then black screen. When it recovers (if it does), audio and video are out of sync, and finally the game crashes completely. This has happened in BioShock Infinite, Tomb Raider, and Batman: Arkham City. After a game crashes, Steam stops responding too, and I have to restart Windows to make it work. I tried a movie and it plays fine.
I'm on the 320.00 beta drivers, which I installed after completely uninstalling the previous drivers and seating the card in the mobo.
I ran the stress program from EVGA and it showed no problems.
Any idea what I should do?


----------



## wholeeo

Might sound unrelated but try clearing / updating bios. It's helped me in the past when I've had strange GPU issues.


----------



## uio77

Quote:


> Originally Posted by *wholeeo*
> 
> Might sound unrelated but try clearing / updating bios. It's helped me in the past when I've had strange GPU issues.


Thanks. Let me try cleaning the PCIe slot and the card too. I have the latest BIOS installed.


----------



## Lagpirate

Also, it just might be the beta drivers crashing your stuff. I would try reverting to the previous drivers and seeing if the problem persists.


----------



## uio77

Quote:


> Originally Posted by *Lagpirate*
> 
> Also, it just might be the beta drivers crashing your stuff. I would try reverting to the previous drivers and seeing if the problem persists.


Sounds good. Let me try that too.. I'll be back with the results.


----------



## uio77

Quote:


> Originally Posted by *uio77*
> 
> Sounds good. Let me try that too.. I'll be back with the results.


No luck. Sending the card in for RMA. I hope EVGA processes it fast...


----------



## uio77

Claptrap asks if we can join the club..


----------



## tonyjones

I can't decide between the 690 and The Titan


----------



## Qu1ckset

Quote:


> Originally Posted by *tonyjones*
> 
> I can't decide between the 690 and The Titan


If you're gaming on a single 1080p monitor, I'd go with the Titan: it's a beast, has more VRAM, and you don't have to worry about SLI profiles for new games. I had a GTX 690 with my single 1440p monitor and it was a beast; I only sold it because I'm switching to SLI Titans. But yeah, for 1080p I'd go for the Titan, and for a single 1440p or 1600p monitor I'd go for a GTX 690. Hope that helps.


----------



## pilla99

Quote:


> Originally Posted by *tonyjones*
> 
> I can't decide between the 690 and The Titan


If you're using a single monitor, go 690. Better fps, and you can find them cheaper. Unless you plan on another $1k purchase for SLI; then go Titan.
Rumor is the 780s are dropping soon. I would wait it out till May.


----------



## Qu1ckset

Quote:


> Originally Posted by *pilla99*
> 
> If you're using a single monitor, go 690. Better fps, and you can find them cheaper. Unless you plan on another $1k purchase for SLI; then go Titan.
> Rumor is the 780s are dropping soon. I would wait it out till May.


If buying new and using a 1080p monitor, I'd go Titan: it can get really high fps in any game at that res, plus it has extra VRAM for the future where the 690 does not, and you don't have to worry about SLI. At 1440p or 1600p I agree the GTX 690 is the better bargain, as the 690 gets about 15-20 fps more in most games.

If you don't mind buying used, I'd 100% go for the GTX 690; I'm seeing them as low as $700 USD used.


----------



## kzinti1

I finally decided against a second GTX 690. Too many people report that it doesn't run very well in quad SLI.
I have another Titan on the way.
I did find out that Dead Island Riptide runs much better with the 690.
But the Titan has beaten the 690 in every benchmark I've thrown at it. I haven't a clue as to why.
Any ideas?
690 = 2348 = http://www.3dmark.com/fs/409912
Titan = 5107 = http://www.3dmark.com/fs/332963


----------



## justanoldman

Quote:


> Originally Posted by *kzinti1*
> 
> I finally decided against a second GTX 690. There are too many people reporting it doesn't run very well in quad-SLi.
> I have another Titan on the way.
> I did find out that Dead Island Riptide runs much better with the 690.
> But the Titan has beat the 690 with every benchmark I've thrown at it. I haven't a clue as to why.
> Any ideas?
> 690 = 2348 = http://www.3dmark.com/fs/409912
> Titan = 5107 = http://www.3dmark.com/fs/332963


From what I have seen, Firestrike sometimes gives completely wrong scores. Run Valley 1.0 and you will see that the 690 is about 20% better at either 1080p or 1440p.

Edit: Forgot to mention that for Firestrike to work with some SLI setups you need to uninstall Microsoft update KB2670838.
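In case it saves anyone a search: on Windows 7 the update can be checked for and removed from an elevated command prompt with built-in tools (though, as people note later, it can reinstall itself as part of IE 10). Standard commands, but double-check on your own system before running:

```
:: See whether KB2670838 is installed
wmic qfe where "HotFixID='KB2670838'" get HotFixID,InstalledOn

:: Remove it without forcing an immediate reboot
wusa /uninstall /kb:2670838 /norestart
```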


----------



## kzinti1

Thanks for the info!
Why does FutureMark keep selling software that doesn't work properly?
I just checked the Compare function and it's down again.
Never knew about removing the KB 2670838. Is it new?
Have you read this article about enabling SLI for MSI Kombustor? Interesting read.
http://www.geeks3d.com/20110310/tips-how-to-enable-sli-and-crossfire-support-for-msi-kombustor-2-0/


----------



## kzinti1

I removed the KB2670838 Update. According to an article I just read, this update will automatically install again as it's part of IE 10.
This is exactly why my score was cut in half.
Here's another run I just made with Firestrike Extreme: http://www.3dmark.com/fs/412651
5743 vs. 2348 yesterday.
Many people have complained about that update.
I guess I'll have to wait and see how long it'll be until it reinstalls itself.
Thanks for the help!


----------



## justanoldman

Quote:


> Originally Posted by *kzinti1*
> 
> I removed the KB2670838 Update. According to an article I just read, this update will automatically install again as it's part of IE 10.
> This is exactly why my score was cut in half.
> Here's another run I just made with Firestrike Extreme: http://www.3dmark.com/fs/412651
> 5743 vs. 2348 yesterday.
> Many people have complained about that update.
> I guess I'll have to wait and see how long it'll be until it reinstalls itself.
> Thanks for the help!


They keep saying they will update the software for this Firestrike bug, but nothing yet. Here is the thread on it:
http://community.futuremark.com/forum/showthread.php?172828-680-sli-low-score-3DMark

The Valley 1.0 thread here is a good place to see how cards compare. A single 690 beats a single Titan by around 20% when both are OCed, in most things I have seen. The Titan has a lot more memory, though, if people mod games.

I would take a single 690 over a Titan any day, but if I needed a ton of GPU horsepower I would get two Titans over two 690s, because 4-way SLI has never received proper support. The 700s are almost out, too.


----------



## kzinti1

Apparently, when I removed the KB2670838 update, I also removed IE 10.
When I checked updates, I had to install IE 9 and its security updates.
The thing is, in the articles I read, the KB2670838 update was installed in their IE 9 folder, when it supposedly only applies to IE 10.
This is the first time I've had trouble with IE since before Windows XP arrived.
FutureMark has only ever been fast when it comes to selling software that isn't ready for prime time.
It takes them forever to even acknowledge a bug, much less fix it.
I've been hoping for years that somebody would come out with a benchmarking suite as comprehensive as the FM series but far more reliable.
I'm afraid they'll always have a monopoly on benchmarking programs.


----------



## maximus56

Quote:


> Originally Posted by *pilla99*
> 
> If you're using a single monitor, go 690. Better fps, and you can find them cheaper. Unless you plan on another $1k purchase for SLI; then go Titan.
> Rumor is the 780s are dropping soon. I would wait it out till May.


I would take a 690 over a single Titan at this resolution any day too.
Based on all these other comments, one wonders how stupid Nvidia could be to keep calling the 690 its other flagship GPU, if the 690 is as useless as everyone makes it out to be. I have enjoyed my 690 a lot more than my Titan on any single screen, going by the gaming experience.


----------



## kzinti1

I had to reinstall that damned KB2670838 Update.
I tried to download a new version of Glary Utilities and it wouldn't download.
Tried several other download sites and they wouldn't work.
Tried downloading several different apps at different sites. No go.
But, I could still download Windows Update. Figures.
The only change I made to my computer was removing the KB2670838 Update, so I reinstalled it.
Now I can download again.
I've never really had anything against Microsoft, but this is a real pisser!
I ran Firestrike Extreme and I'm down to 2150 with this GTX 690. Again.
If you haven't installed Internet Explorer 10, then don't!
If you have the option to install KB2670838 Update separately, don't!
I guess I'll be waiting on yet another Futuremark fix. Again!
It's only been a couple of weeks since FM finally fixed the Compare function. Now, this.


----------



## TheMaddaCheeb

Well, this is my first post here on Overclock. Figured I'd start by joining a club that I am proud to be a part of. My 690 is an EVGA one. I added the backplate early on and I am loving it. Works great for surround gaming, in my opinion. Runs cool and quiet in my D-Frame.


----------



## Arizonian

Quote:


> Originally Posted by *TheMaddaCheeb*
> 
> Well, this is my first post here on Overclock. Figured I'd start by joining a club that I am proud to be a part of. My 690 is an EVGA one. I added the backplate early on and I am loving it. Works great for surround gaming, in my opinion. Runs cool and quiet in my D-Frame.


^^^Welcome aboard to both OCN and the 690 Club TheMaddaCheeb. Congrats on the card.


----------



## PinzaC55

Quote:


> Originally Posted by *TheMaddaCheeb*
> 
> Well, this is my first post here on Overclock. Figured I start by getting into the club that I am proud to be a part of. My 690 is a EVGA manufactured one. I added the backplate early on and I am loving it. Works great for surround gaming, in my opinion. Runs cool and quiet in my D-Frame.


Welcome to the club! BTW what is that case? Never seen anything like it before.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *PinzaC55*
> 
> Welcome to the club! *BTW what is that case?* Never seen anything like it before.


I believe it's this one:

http://www.memoryexpress.com/Products/MX44665


----------



## TheMaddaCheeb

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> I believe it's this one:
> 
> http://www.memoryexpress.com/Products/MX44665


Yes, you are correct. It is the D-Frame from InWin, but unlike the pictures on that site the one in my picture is the red & black one.


----------



## PinzaC55

Quote:


> Originally Posted by *TheMaddaCheeb*
> 
> Yes, you are correct. It is the D-Frame from InWin, but unlike the pictures on that site the one in my picture is the red & black one.


Pretty awesome looking "case" there. Must be hell to keep clean unless you have a dedicated computer room.


----------



## TheMaddaCheeb

Quote:


> Originally Posted by *PinzaC55*
> 
> Pretty awesome looking "case" there. Must be hell to keep clean unless you have a dedicated computer room.


We shall see; it hasn't collected dust yet. I am ready with my DataVac, though.


----------



## Lagpirate

That is such a sweet case...can't wait to see the orange one!


----------



## max883

http://www.3dmark.com/3dm/596508?

Is this score OK? Will I get a better score if I turn on Hyper-Threading?


----------



## kzinti1

I found the orange In Win case at the Egg: http://www.newegg.com/Product/Product.aspx?Item=N82E16811108437&IsVirtualParent=1
If I had any room left I'd buy one. My new Corsair 900D, still on backorder, is going to take care of all my spare room.
I have about six cases I'd like to sell, but they aren't worth the shipping hassle.
OT: after uninstalling and then having to reinstall the KB2670838 update, I've somehow corrupted my PhysX software.
PhysX won't uninstall, repair refuses to run, and it won't uninstall and reinstall during a clean install of the latest WHQL or beta drivers.
It appears I'm going to have to reinstall Windows, and leave IE 10 out of it, to get my computer running properly again.
If I could enter Safe Boot I think I could get rid of all traces of PhysX for a clean reinstall.
F8 just takes me to a selection of my HDDs to boot from; no mention of Safe Boot there.
SysConfig takes me to the boot options page, but I can't click on a single thing there, and I have no idea what else to do other than reinstall Windows.
I spent all day Sunday and most of that night trying to get rid of PhysX. I even blew $17 on Treexy (it used to be called Driver Sweeper) and that didn't work either.


----------



## justanoldman

Quote:


> Originally Posted by *max883*
> 
> http://www.3dmark.com/3dm/596508?
> 
> Is this score OK? Will I get a better score if I turn on Hyper-Threading?


That looks like a good score to me. I think HT should help, but you will have to try it and see. Run Valley 1.0 with your max OCs and you will be able to tell how your 690 is doing.


----------



## TheMaddaCheeb

Quote:


> Originally Posted by *kzinti1*
> 
> If I could enter Safe Boot I think I could get rid of all traces of PhysX for a clean reinstall.
> F8 just takes me to a selection of my HDDs to boot from; no mention of Safe Boot there.
> SysConfig takes me to the boot options page, but I can't click on a single thing there, and I have no idea what else to do other than reinstall Windows.
> I spent all day Sunday and most of that night trying to get rid of PhysX. I even blew $17 on Treexy (it used to be called Driver Sweeper) and that didn't work either.


On some motherboards F8 at boot gives you the boot-device selection. When that happens, try selecting your boot drive and then hitting F8 again immediately; that should bring up the Windows F8 screen where you can select Safe Mode and such. Hope that helps you out.


----------



## Anu Daviau

Add me plz


----------



## kzinti1

Thanks for the help with my Safe Boot bug. I'll try it this weekend.
Now, forget the cards I'm using in the following.
They're just to show the effect of the Microsoft KB2670838 Update BUG.
One GTX Titan = 5123 http://www.3dmark.com/fs/423352
Two GTX Titans = 2370 http://www.3dmark.com/fs/423505
Even though this bug has nothing to do with FutureMark, we have to keep on them to get a Firestrike update that works around this blunder by Microsoft!
As I said earlier, when I uninstalled the Microsoft KB2670838 update, I lost the ability to download anything from anywhere.
If you've installed this update, chances are you'll also run across some weird errors, as I did, when you try to uninstall it.
I've searched the problems with this update and there are too many to list.
This bug *will* affect your GTX 690's scores in Firestrike, as it did mine.
All we can do is keep urging FutureMark to come up with some kind of fix.
How has this bug affected anyone running a pair of GTX 690s in SLI?
The same?
I'm still trying to decide if I'm going to buy another GTX 690 or just wait and buy a pair of the upcoming GTX 790's.
I suppose I'll just wait and see how the 790's perform, since I have plenty of cards to play around with for now.


----------



## konigsberg7

*PCI-E SLOT OR GPU OR NORMAL?*

Check out this image:



Even if I run the stress test on the right, where that question mark is, it will start at 3.0 and then just go back to 1.1 (during the stress test itself). Do you think my other GPU is messed up, or is it the PCIe slot? Anyway, I've been having trouble with this card, so I sent it in for an RMA, and I'm just hoping it's not my mobo.

The reason for the RMA is that where people normally get a TDR, my screen goes black instead and I have to restart my whole computer, UNLESS I set multi-GPU rendering mode to "Single GPU"; then it TDRs and crashes to the desktop.

I also noticed that when I had it on Force Alternate Frame Rendering 1, Google Chrome flickered black. However, when I took out the GPU to get the S/N for the RMA and then put it back in (I blew out the slot and the GPU), it seemed to be fixed. Haven't tested the black-screen TDR yet.

*So what's your opinion? I think it's hard to break a PCIe slot. I don't get blue screens or anything. Or perhaps that 1.1 is normal?*


----------



## Masta Squidge

Quote:


> Originally Posted by *konigsberg7*
> 
> *PCI-E SLOT OR GPU OR NORMAL?*
> 
> Check out this image:
> 
> 
> 
> Even if I run the stress test on the right, where that question mark is, it will start at 3.0 and then just go back to 1.1 (during the stress test itself). Do you think my other GPU is messed up, or is it the PCIe slot? Anyway, I've been having trouble with this card, so I sent it in for an RMA, and I'm just hoping it's not my mobo.
> 
> The reason for the RMA is that where people normally get a TDR, my screen goes black instead and I have to restart my whole computer, UNLESS I set multi-GPU rendering mode to "Single GPU"; then it TDRs and crashes to the desktop.
> 
> I also noticed that when I had it on Force Alternate Frame Rendering 1, Google Chrome flickered black. However, when I took out the GPU to get the S/N for the RMA and then put it back in (I blew out the slot and the GPU), it seemed to be fixed. Haven't tested the black-screen TDR yet.
> 
> *So what's your opinion? I think it's hard to break a PCIe slot. I don't get blue screens or anything. Or perhaps that 1.1 is normal?*


I had to update my mobo bios to fix that with my Titan. But mine defaults to 2.0 I believe as a result of having a second card installed.

Prior to the bios update I was seeing all kinds of different listings of 1.1


----------



## konigsberg7

I have the latest everything, updates-wise. I don't know either; you have Titans in SLI, seemingly, and the GTX 690 is 2 GPUs on 1 PCIe slot (so it may be different).

NVM, I was mistaken. If I set it to Force Alternate Frame Rendering 1 under multi-GPU rendering mode in Manage 3D Settings, Google Chrome will still flicker black; I had just forgotten to close Chrome all the way. Also, I have to have one display active, since I put both my monitors on one GPU so I can clone them.


----------



## kzinti1

Quote:


> Originally Posted by *TheMaddaCheeb*
> 
> On some motherboards F8 at boot gives you the boot-device selection. When that happens, try selecting your boot drive and then hitting F8 again immediately; that should bring up the Windows F8 screen where you can select Safe Mode and such. Hope that helps you out.


It worked! I finally got rid of PhysX, reinstalled it, and now it's recognized by GPU-Z.
I wish that I could give you a +10, but OCN says that you'll have to settle for a +1 and my sincere thanks.
It worked so easily it was a little scary and it did all I've been trying to do almost instantly.


----------



## Captain Lolburger

Quote:


> Originally Posted by *konigsberg7*
> 
> *PCI-E SLOT OR GPU OR NORMAL?*
> 
> Checkout this image:
> 
> 
> 
> Even if I do the stress test to the right, where that question mark is, it will start at 3.0 and then it will just go back to 1.1 (in the stress test itself). Do you think my other GPU is messed up or is it the PCI-E slot. Anyway I've been having troubles with this card so I sent in for an RMA and I'm just hoping it's not my MOBO.
> 
> Reason for RMA is that sometimes when people normally TDR, my screen goes black instead and I have to restart my whole computer UNLESS I set it to "Single-GPU" under Multi-GPU Rendering mode -- then it TDRs and crashes to desktop.
> 
> I also noticed when I had it at Force Alternate Frame Rendering 1, my Google Chrome flickered black. However, when I took out the GPU to get the S/N for the RMA and then put it back in (I blew on the slot and the GPU), it seems to be fixed. Haven't tested the black screen TDR yet.
> 
> *So what's your opinion? I think it's hard to break a PCIe slot. I don't get Blue Screens or anything. OR perhaps that 1.1 is normal?*


I know what the problem is here, and there is actually nothing wrong with your 690. I'll take a guess and say that your machine was idling or not running any graphics-intensive applications when you probed your card with GPU-Z, right? The GTX 690 allows bandwidth throttling, so when you don't need maximum performance, it will switch down to PCIe 1.1 bandwidth.

Fire up a game on max settings, then Alt-Tab back to your desktop and open GPU-Z. It should register as 3.0 then.
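If you'd rather script the check than eyeball GPU-Z, here's a rough sketch. It assumes a driver recent enough that `nvidia-smi` exposes the `pcie.link.gen.current` query field; it reports one line per GPU, so a 690 shows up twice.

```python
# Sketch: confirm PCIe link-speed throttling by sampling the current link
# generation at idle vs. under load.
import subprocess

def parse_link_gens(csv_output):
    """Parse `nvidia-smi --query-gpu=pcie.link.gen.current --format=csv,noheader`
    output into one PCIe link generation per GPU."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]

def current_link_gens():
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=pcie.link.gen.current", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_link_gens(out)

# At idle you would expect something like [1, 1]; Alt-Tab out of a game on a
# PCIe 3.0 slot and you should see [3, 3].
```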


----------



## max883

I get 11,540P without Hyper-Threading with my GTX 690! With Hyper-Threading I get 12,360P.

http://www.3dmark.com/3dm/614159?

Is that score ok ??


----------



## justanoldman

Quote:


> Originally Posted by *max883*
> 
> I get 11,540P without Hyper-Threading with my GTX 690! With Hyper-Threading I get 12,360P.
> 
> http://www.3dmark.com/3dm/614159?
> 
> Is that score ok ??


Run Valley 1.0 with both chip and 690 at max overclocks and I can tell you better how your 690 is doing. 3DMark is about the 690, the chip, and the RAM, but based on your 11.4k Firestrike I would say that is a pretty good score.


----------



## max883

Unigine Valley v1.0

FPS: 94
Score: 4024
Min FPS: 40.6
Max FPS: 179.4

http://www.3dmark.com/3dm/618647

I'm thinking of overclocking my GTX 690!


----------



## justanoldman

Quote:


> Originally Posted by *max883*
> 
> Unigine Valley v1.0
> 
> FPS: 94
> Score: 4024
> Min FPS: 40.6
> Max FPS: 179.4
> 
> http://www.3dmark.com/3dm/618647
> 
> I'm thinking of overclocking my GTX 690!


???
You already OCed your 690 to get those scores.
Your Firestrike score is the best ever. What core and memory offset were you using to get that?

But your Valley score is too low compared to your Firestrike score. Did you OC the card more for Firestrike than for Valley?


----------



## max883

Unigine Valley v1.0

FPS: 94
Score: 4024
Min FPS: 40.6
Max FPS: 179.4

I used Extreme HD.

I don't know why it is so low. Maybe if I overclock my card I will get a better score??


----------



## mtbiker033

Quote:


> Originally Posted by *justanoldman*
> 
> That looks like a good score to me. I think ht should help, but you will have to try it and see. Run Valley 1.0 with your max ocs and you will be able to tell how your 690 is doing.


yeah, good score, very close to mine; actually only 1 point difference from mine:

http://www.3dmark.com/fs/386439
Quote:


> Originally Posted by *justanoldman*
> 
> ???
> You already OCed your 690 to get those scores.
> Your Firestrike score is the best ever. What core and memory offset were you using to get that?
> 
> But your Valley score is too low compared to your Firestrike score. Did you OC the card more for Firestrike than for Valley?


dude, awesome score! What were your OC settings? Watercooled?

P.S. I still love my 690!


----------



## justanoldman

Quote:


> Originally Posted by *max883*
> 
> Unigine Valley v1.0
> FPS: 94
> Score: 4024
> Min FPS: 40.6
> Max FPS: 179.4
> I used Extreme HD.
> I don't know why it is so low. Maybe if I overclock my card I will get a better score??


ExtremeHD is what you should be using, but hit F12 when the score screen comes up. Then go into the C drive, Users, your user folder, and you will see a folder called Valley. In it is a screenshot of your score. Post that screenshot here so I can see all the numbers. I don't think it is possible to get 94 FPS without overclocking your 690, but post the screenshot and I can tell better.
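If digging through Users by hand is a pain, here's a quick sketch that grabs the newest image from that Valley folder. The folder name and location are just what's described above; treat them as an assumption and adjust if your Valley version saves elsewhere.

```python
# Sketch: find the most recent F12 screenshot in the user's Valley folder.
from pathlib import Path

IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".bmp", ".tga"}

def newest_screenshot(folder):
    """Return the most recently modified image file in `folder`, or None."""
    shots = [p for p in folder.iterdir() if p.suffix.lower() in IMAGE_EXTS] \
        if folder.is_dir() else []
    return max(shots, key=lambda p: p.stat().st_mtime, default=None)

if __name__ == "__main__":
    # e.g. C:\Users\<you>\Valley on Windows
    print(newest_screenshot(Path.home() / "Valley"))
```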


----------



## max883

http://www.3dmark.com/3dm/6186493 13,625P

Overclocked my GTX 690 to 1340 MHz GPU, 7000 MHz mem.

Is that score OK??

Unigine Valley v1.0

FPS: 105
Score: 4324
Min FPS: 43.1
Max FPS: 191.7

Extreme HD.


----------



## justanoldman

Max883,
I think your 3DMark link is not right; that is showing something else besides a 690.

Post your Valley screen shot. Because that is not just a good score, it is an unbelievably high score, so I would like to see the screen shot.


----------



## max883

OK, I will put it up. Just give me 20 min.


----------



## mtbiker033

Quote:


> Originally Posted by *max883*
> 
> http://www.3dmark.com/3dm/6186493 13,625P
> 
> Overclocked my GTX 690 to 1340 MHz GPU, 7000 MHz mem.
> 
> Is that score OK??
> 
> Unigine Valley v1.0
> 
> FPS: 105
> Score: 4324
> Min FPS: 43.1
> Max FPS: 191.7
> 
> Extreme HD.


Quote:


> Originally Posted by *justanoldman*
> 
> Max883,
> I think your 3DMark link is not right; that is showing something else besides a 690.
> 
> Post your Valley screen shot. Because that is not just a good score, it is an unbelievably high score, so I would like to see the screen shot.


Amazing if real. 1340 MHz??


----------



## max883

Went down to 1300 MHz GPU and 7000 MHz RAM stable. Modded BIOS.

http://www.3dmark.com/3dm/618647


----------



## justanoldman

Thanks, but we are looking for the F12 screen shot in your Valley folder. It looks like this:


See if you can do a run closer to that 105 FPS; you would have the best 690 the world has ever seen.


----------



## mtbiker033

Quote:


> Originally Posted by *max883*
> 
> 
> 
> Went down to 1300 MHz GPU and 7000 MHz RAM stable
> 
> http://www.3dmark.com/3dm/618647


wow, what were your clock offset settings for 1300 MHz?


----------



## wholeeo

He beat my score ;(

Nice card


----------



## justanoldman

Quote:


> Originally Posted by *wholeeo*
> 
> He beat my score ;(
> 
> Nice card


Not without a valid screenshot. It will be interesting if he can come up with that 105 he claimed. Also, he posted about a modded BIOS; I thought people here had found that no BIOS improves anything on a 690.


----------



## mtbiker033

Quote:


> Originally Posted by *justanoldman*
> 
> Not without a valid screenshot. It will be interesting if he can come up with that 105 he claimed. Also, he posted about a modded BIOS; I thought people here had found that no BIOS improves anything on a 690.


That's what I understood from Mr. TooShort, I'm definitely interested if it's true. I would love to be able to up the volts on mine.


----------



## justanoldman

Max883,
If possible, we would like to know more about the BIOS you used and whether it increased performance. And an F12 screenshot of your best Valley 1.0 run would be great too. I would like to see that monster card of yours listed in the top 30 thread.


----------



## wholeeo

Quote:


> Originally Posted by *justanoldman*
> 
> Not without a valid screenshot. It will be interesting if he can come up with that 105 he claimed. Also, he posted about a modded BIOS; I thought people here had found that no BIOS improves anything on a 690.


I've tried several and tried modifying in various different ways and none do anything for voltage. I always went back to stock bios. My highest score was on the stock bios.


----------



## justanoldman

Quote:


> Originally Posted by *wholeeo*
> 
> I've tried several and tried modifying in various different ways and none do anything for voltage. I always went back to stock bios. My highest score was on the stock bios.


Thanks.
You sell your 690, or is it sitting on a shelf with your two Titan boxes?


----------



## wholeeo

Quote:


> Originally Posted by *justanoldman*
> 
> Thanks.
> You sell your 690, or is it sitting on a shelf with your two Titan boxes?


Returned it for a Titan.







I was planning on getting another 690 but didn't want to take my chances on Quad-SLI.


----------



## max883

OK, will post a screenshot. Used my own modded BIOS, it's risky!!! Still testing! Power target +150, Core +200, Mem +500

Voltage: 1.212 V

Core 1300 MHz
Mem 7000 MHz

MY RIG IS WATER COOLED!

Testing some Battlefield 3 on my 60-inch LG flat screen







Love it


----------



## justanoldman

Max883,
Have you tested the stock BIOS vs. your modified one? How much better are your OC and Valley 1.0 score with one BIOS vs. the other?


----------



## max883

My min and max FPS are lower, maybe because there was a bit of stutter.

Core clock: 1360 MHz, Mem: 7000 MHz









Gonna test 3DMark 13 now!


----------



## wholeeo

That's insane but how did you get Valley to show accurate clocks? It's usually wrong.


----------



## max883

Used NVIDIA Kepler BIOS Tweaker with my modded BIOS to hard-lock my max clocks; set it to 1360.0 in all 4 fields!

Besides, it doesn't show my correct clocks! 1357 MHz shows in Valley!!!


----------



## wholeeo

Quote:


> Originally Posted by *max883*
> 
> Used NVIDIA Kepler BIOS Tweaker with my modded BIOS to hard-lock my max clocks; set it to 1360.0 in all 4 fields!


You need to submit that score here,

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0-fill-the-form-single-and-multi-monitors/


----------



## max883

I could share my BIOS, but you seriously need water cooling! I'll put it up here tomorrow after I have tested it 100%.


----------



## wholeeo

Quote:


> Originally Posted by *max883*
> 
> I could share my BIOS, but you seriously need water cooling! I'll put it up here tomorrow after I have tested it 100%.


How do you confirm that the card is @ 1.21v other than the insane clocks and score? Does Precision actually show it at all?


----------



## justanoldman

Sorry max883, I posted your score in the Valley thread and asked them to review it. Your numbers are so good, no not good, they are great, that I know some will question it. Let me know and I will remove my post if you want. It is just that I am a little excited at the prospect that a bios mod could actually change the 690 performance.


----------



## mtbiker033

I'm very interested in what you got going on here max!







might be a good reason to go water cooling on this beast


----------



## justanoldman

Quote:


> Originally Posted by *mtbiker033*
> 
> I'm very interested in what you got going on here max!
> 
> 
> 
> 
> 
> 
> 
> might be a good reason to go water cooling on this beast


We are all hoping it is real, but so far the Valley screen shot is seen as being fake. I sent him a pm asking for another, and hopefully a screenshot of his desktop with the Precision monitor window open. Hopefully we will find out if he is pulling our leg or not.


----------



## wholeeo

I see he is using driver 313.96 from back in January. The whole thing is just weird.


----------



## mtbiker033

Quote:


> Originally Posted by *justanoldman*
> 
> We are all hoping it is real, but so far the Valley screen shot is seen as being fake. I sent him a pm asking for another, and hopefully a screenshot of his desktop with the Precision monitor window open. Hopefully we will find out if he is pulling our leg or not.


Quote:


> Originally Posted by *wholeeo*
> 
> I see he is using driver 313.96 from back in January. The whole thing is just weird.


agreed guys, need more proof!


----------



## justanoldman

Quote:


> Originally Posted by *max883*
> 
> 
> Went down to 1300 MHz GPU and 7000 MHz RAM stable. Modded BIOS.
> http://www.3dmark.com/3dm/618647


Quote:


> Originally Posted by *max883*
> 
> 
> My min and max FPS are lower, maybe because there was a bit of stutter.
> Core clock: 1360 MHz, Mem: 7000 MHz
> 
> 
> 
> 
> 
> 
> 
> 
> Gona test 3dmark 13 now!


Both of these screenshots have been altered.


----------



## Koniakki

A new owner of an ASUS GTX 690 here. I will provide proof when I finish my Rig.

Also, *max883*, since we both have an ASUS 690 I would be VERY interested to try your modded BIOS.

I would appreciate it if you could send it to me in a PM or post it here please. Thank you.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Koniakki*
> 
> A new owner of an ASUS GTX 690 here. I will provide proof when I finish my Rig.
> 
> Also *max883* since we both have an ASUS 690 I would be VERY interested to try your modded bios.
> 
> I would appreciate it if you could send them to me in PM or post them here please. Thank you.


Hi there.

It's been proven earlier that the user you mentioned posted fake results. Therefore there is no BIOS that gets those outrageous results.

Myself and others were hoping it was true, but it's not. But it's nice to know that you still have a beast on your hands without a magic voltage bios. Congrats btw on your GTX690!


----------



## Koniakki

Dammit. I didn't believe the 1300+ clocks either, since I already read the last 4-5 pages, but I was really hoping for some good modded BIOS; that's why it was the only thing I asked him about.

Well, once again, something that falls into the too-good-to-be-true category. Thanks Shorty for the reply and the peace of mind you gave me.


----------



## RagingCain

Well, better late than never I guess.

Just joined the team green again, and it feels oh so good not to have microstutter.


----------



## KeRo77

Hi Guys,

I'm looking at getting two 690s (probably won't OC too much) and want to make sure my power supply will handle it. I will also be getting the 4770K when it's released and will run the max OC I can on the CPU.

This is my current PSU:

http://www.pccasegear.com/index.php?main_page=product_info&cPath=15_354&products_id=21232

Cheers


----------



## MrTOOSHORT

Quote:


> Originally Posted by *KeRo77*
> 
> Hi Guys,
> 
> I'm looking at getting two 690s (probably won't OC too much) and want to make sure my power supply will handle it. I will also be getting the 4770K when it's released and will run the max OC I can on the CPU.
> 
> This is my current PSU:
> 
> http://www.pccasegear.com/index.php?main_page=product_info&cPath=15_354&products_id=21232
> 
> Cheers


Two GTX 690s with a 3770K will require an 850W PSU. A 3930K with two GTX 690s will require a 1000W unit.

So you're good to go.









Overclock all you want too.
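Those recommendations line up with a back-of-the-envelope estimate. A sketch, using rough board-power figures that are assumptions and not from this thread (GTX 690 ~300 W each, stock i7-3770K ~77 W, ~100 W for the rest of the system, plus a modest 10% headroom):

```python
# Rough PSU sizing: sum the big consumers and add some headroom.
def recommended_psu_watts(gpu_tdps, cpu_tdp, rest=100, headroom=0.10):
    """Estimate a minimum PSU rating in watts from component board powers."""
    load = sum(gpu_tdps) + cpu_tdp + rest
    return round(load * (1 + headroom))

if __name__ == "__main__":
    # Two GTX 690s plus a 3770K: (300 + 300 + 77 + 100) * 1.1, about 855 W,
    # in the same ballpark as the 850 W recommendation above.
    print(recommended_psu_watts([300, 300], 77))
```

A heavy CPU overclock raises `cpu_tdp` well past 77 W, which is one reason to size up rather than down.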


----------



## KeRo77

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Two GTX 690s with a 3770K will require an 850W PSU. A 3930K with two GTX 690s will require a 1000W unit.
> 
> So you're good to go.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Overclock all you want too.


Thank you sir







I thought as much, but wanted to check it out first.

While I'm here, will 2 x 690 perform better than 2 Titans?


----------



## MrTOOSHORT

I think two Titans are better than two 690s. Two-way SLI is better than four-way when it comes to scaling, and also consider the VRAM: 6GB vs 2GB.

Myself, I would only use one GTX 690. Two GTX 690s in SLI is just too much GPU muscle for the VRAM amount.

Just my


----------



## KeRo77

Thanks mate, looks like I might now be leaning towards 2 titans


----------



## Koniakki

LOL, today I tested my 690 with a friend's 3770K, as my rig is still on its way.
I learned first-hand what overkill really means... lolol









But then again, there's no such thing as "overkill" in the tech world...


----------



## King4x4

Go surround and watch that tiger turn into a kitten!


----------



## Koniakki

Quote:


> Originally Posted by *King4x4*
> 
> Go surround and watch that tiger turn into a kitten!


lol! I won't even argue that.. I don't wanna regret buying it.. I will leave it in "Tiger" mode..


----------



## PCModderMike

Just for nostalgic purposes, pulled out some old pics of my 690 to edit. Although the Titan has blasted onto the scene, I still love my 690. To add to the previous comments, I think the 690 is a fantastic card for single monitor solutions, especially at 2560x1440 or 2560x1600. But once you get into surround gaming, the 2GB of VRAM will hold the card back. Trying to run a second 690 for quad SLI would be plenty of power for a surround setup, but still there would only be 2GB of effective video memory and the cards would choke.


----------



## xToFxREAPER

Can't wait to join! I'm (hopefully) getting my 690 tomorrow. If I do, I'll post pictures ASAP and start sharing in the awesomeness with you all. In the meantime, I'll update my sig with my rig as it's running right now.


----------



## Koniakki

Quote:


> Originally Posted by *xToFxREAPER*
> 
> Can't wait to join! I'm (hopefully) getting my 690 tomorrow. If I do, I'll post pictures ASAP and start sharing in the awesomeness with you all. In the meantime, I'll update my sig with my rig as it's running right now.


We hope you get it too.. And if you(finally) do, congrats man...


----------



## justanoldman

Quote:


> Originally Posted by *xToFxREAPER*
> 
> Can't wait to join! I'm (hopefully) getting my 690 tomorrow. If I do, I'll post pictures ASAP and start sharing in the awesomeness with you all. In the meantime, I'll update my sig with my rig as it's running right now.


Welcome to OCN. Don't know your level of expertise, but here is a guide to OCing your chip with an ASUS mobo:
http://www.overclock.net/t/1291703/ivy-bridge-overclocking-guide-asus-motherboards

And here is a guide about OCing a 600 series card:
http://www.overclock.net/t/1265110/the-gtx-670-overclocking-master-guide

Get max OCs on both, then take a run with Valley 1.0 and see how your 690 performs:
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0-fill-the-form-single-and-multi-monitors


----------



## xToFxREAPER

Quote:


> Originally Posted by *justanoldman*
> 
> Welcome to OCN. Don't know your level of expertise, but here is a guide to OCing your chip with an ASUS mobo:
> http://www.overclock.net/t/1291703/ivy-bridge-overclocking-guide-asus-motherboards
> 
> And here is a guide about OCing a 600 series card:
> http://www.overclock.net/t/1265110/the-gtx-670-overclocking-master-guide
> 
> Get max OCs on both, then take a run with Valley 1.0 and see how your 690 performs:
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0-fill-the-form-single-and-multi-monitors


Thank you for the warm welcome. I have a good bit of expertise; right now my 3770K is at 4.7 GHz rock stable.







I'll be getting the 690 in a few hours and will be seeing what I can get out of it; I'll post everything as soon as I can.


----------



## xToFxREAPER

Thanks much. I'm coming from SLI 680s (yes, downgrade, I know); I was having extreme heat issues, so I'm thinking logically about what's going to help my system the most. I'm getting the card in a few hours, then all I gotta do is try and hunt down an EVGA backplate for it.


----------



## justanoldman

Quote:


> Originally Posted by *xToFxREAPER*
> 
> Thanks much. I'm coming from SLI 680s (yes, downgrade, I know); I was having extreme heat issues, so I'm thinking logically about what's going to help my system the most. I'm getting the card in a few hours, then all I gotta do is try and hunt down an EVGA backplate for it.


No problem.
I understand the heat issue, but you might still face it a little with the 690 depending on your particular card and how you choose to oc it. During my benchmark Valley runs I could not keep it under the 70c throttle point without it being cold in my room, the case open, and a floor fan blowing on it. Now under water with the H220 and a Heatkiller block I can do the benchmark at a little over 40c. With normal use and a lower oc though, air can get the job done.


----------



## xToFxREAPER

Quote:


> Originally Posted by *justanoldman*
> 
> No problem.
> I understand the heat issue, but you might still face it a little with the 690 depending on your particular card and how you choose to oc it. During my benchmark Valley runs I could not keep it under the 70c throttle point without it being cold in my room, the case open, and a floor fan blowing on it. Now under water with the H220 and a Heatkiller block I can do the benchmark at a little over 40c. With normal use and a lower oc though, air can get the job done.


Yeah, not as bad; they were like 100 degrees plus, so yeah... I'll be uploading a pic of it soon.


----------



## xToFxREAPER

The 690 is now mine xD It's an EVGA. I'll get pics of it installed later when I'm home. Since my rig is listed as a work in progress, I may or may not post a build log later on to help explain.


----------



## PCModderMike

Congrats


----------



## xToFxREAPER

Quote:


> Originally Posted by *PCModderMike*
> 
> Congrats


Thanks







I can't wait to start testing and overclocking


----------



## PCModderMike

If you're worried about temps...you could always venture into water cooling.


----------



## xToFxREAPER

Quote:


> Originally Posted by *PCModderMike*
> 
> If you're worried about temps...you could always venture into water cooling.


Can't afford it; I'm saving to go back to school. I want to eventually do a watercooled system.


----------



## justanoldman

Quote:


> Originally Posted by *xToFxREAPER*
> 
> Can't afford it; I'm saving to go back to school. I want to eventually do a watercooled system.


Not to push you in that direction, but imo water cooling the gpu is great, and there are people around here looking to unload their 690 water blocks for cheap. The H220 is $139 and can be expanded with another rad to cool the cpu and gpu. Just something to think about.


----------



## xToFxREAPER

Yeah
Quote:


> Originally Posted by *justanoldman*
> 
> Not to push you in that direction, but imo water cooling the gpu is great, and there are people around here looking to unload their 690 water blocks for cheap. The H220 is $139 and can be expanded with another rad to cool the cpu and gpu. Just something to think about.


I've been considering something like that actually; we'll see how funding goes, cause it's something I really wanna do even when I had my 680s... Maybe I can squeeze some extra juice outta my CPU if I do.


----------



## virgis21

Hi all,

I am looking to buy a GTX 690 card and am a little confused about which brand to choose.
Sorry if I missed it in this looong thread








Is there any difference between Zotac, Asus, Gigabyte, or the others?
Thanks for advice!

Virgis


----------



## xToFxREAPER

Quote:


> Originally Posted by *virgis21*
> 
> Hi all,
> 
> I am looking to buy a GTX 690 card and am a little confused about which brand to choose.
> Sorry if I missed it in this looong thread
> 
> 
> 
> 
> 
> 
> 
> 
> Is there any difference between those Zotac, Asus, Gigabyte or others?
> Thanks for advice!
> 
> Virgis


They are all the same; NVIDIA hasn't allowed the vendors to change them in any way. Personally I prefer the EVGA model because it's the only one you can use the LED control app with.


----------



## xToFxREAPER

If anyone in this forum has an EVGA backplate for the 690 for sale with all the required screws, please contact me. I am having a great deal of trouble finding one and would very much like to have one, as I change things in my system a lot and would like the protection. I benchmark and test a lot for friends and family, so it would give me a bit of peace of mind.


----------



## wholeeo

Quote:


> Originally Posted by *xToFxREAPER*
> 
> If anyone in this forum has an EVGA backplate for the 690 for sale with all the required screws, please contact me. I am having a great deal of trouble finding one and would very much like to have one, as I change things in my system a lot and would like the protection. I benchmark and test a lot for friends and family, so it would give me a bit of peace of mind.


Are you in the US? I'll check to see if my brother wants to sell a new one he has never bothered installing. Just an FYI, it doesn't come with screws. You reuse the ones that are on the card itself.


----------



## xToFxREAPER

Quote:


> Originally Posted by *wholeeo*
> 
> Are you in the US? I'll check to see if my brother wants to sell a new one he has never bothered installing. Just an FYI, it doesn't come with screws. You reuse the ones that are on the card itself.


I'm in Canada, but I can pay all shipping and whatnot. Just let me know, cause the only one I've been able to find is on Amazon for over $50 and I know they only cost $25 from EVGA.


----------



## FiShBuRn

I've one for sale too


----------



## xToFxREAPER

Quote:


> Originally Posted by *FiShBuRn*
> 
> I've one for sale too


OK, well I'm in need of one, so I'll buy one from either of you. I'll need a bit before I can, though, since the card on its own broke the bank lol.


----------



## xToFxREAPER

I thought I would mention a little something for the people like me that wanna keep this card cool but can't afford watercooling. Granted, it stays quite cool as it is, but I've done a little tweak to the system. I found a little 80mm turbo fan laying around; on its own it's fairly loud, but I used a Noctua voltage-mod cable so it runs near silent while still moving a good bit of air. I then mounted it on the back of the video card right beside the DVI port, so it's not blocking anything, but it sucks a lot of heat out of the card. It's a good cheap option that I haven't seen anyone mention.







It's kind of ghetto, but you don't notice it and it seems to be helping a lot xD


----------



## Koniakki

I never thought that Blue and Green would look so sexy together...







And here's a pic:










You can find proof here: http://www.overclock.net/t/392179/the-official-cooler-master-haf-x-932-922-912-club/21250#post_19977561


----------



## PCModderMike




----------



## mtbiker033

I will be getting a 120hz monitor next week and I wanted to check in with the club to see how many of you are using your 690 on a 120hz screen and how well it handles it. I really only play BF3 and Skyrim.


----------



## Arizonian

Quote:


> Originally Posted by *mtbiker033*
> 
> I will be getting a 120hz monitor next week and I wanted to check in with the club to see how many of you are using your 690 on a 120hz screen and how well it handles it. I really only play BF3 and Skyrim.


I don't play Skyrim, so I'll let others chime in there. Here is Caspian Border multiplayer on my i7 3770K with a GTX 690. I've video recorded it for OCN rather than use FRAPS to record; FRAPS is in the upper left corner for you to monitor FPS.

Basically it depends on what you're doing, but it's a nice 120+ FPS; anything beyond that is superfluous. Low dips of 78 FPS. I'd guesstimate an average of 98 FPS. Ultra settings. See for yourself.






Quick note on 3D: if you're getting a 120 Hz monitor and it's NVIDIA 3D Vision ready and you have questions, go to the *[Official] Nvidia 3d Vision + 3d Surround Club*.

Running 3D will cut FPS by about 45-50%. So if you're playing a game normally @ 100 FPS, in 3D you're going to get 50-55 FPS. In 3D mode you need to maintain 30 FPS minimum to keep things running smoothly, and you max out @ 60 FPS for the smoothest play; again, anything beyond that is superfluous.
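That 3D arithmetic is easy to sanity-check. A small sketch; the ~50% cut and the 30/60 FPS bounds are just the rules of thumb from this post, not measured values:

```python
# Estimate what a 2D framerate turns into under 3D Vision.
def estimated_3d_fps(fps_2d, cut=0.5):
    """Apply the ~50% 3D cut, capped at the 60 FPS per-eye ceiling."""
    return min(fps_2d * (1 - cut), 60.0)

def playable_in_3d(fps_2d, cut=0.5):
    """True if the estimated 3D framerate clears the 30 FPS floor."""
    return estimated_3d_fps(fps_2d, cut) >= 30.0

if __name__ == "__main__":
    print(estimated_3d_fps(100))  # 50.0, matching the 50-55 FPS estimate above
    print(playable_in_3d(100))    # True
```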


----------



## Shogon

Quote:


> Originally Posted by *mtbiker033*
> 
> I will be getting a 120hz monitor next week and I wanted to check in with the club to see how many of you are using your 690 on a 120hz screen and how well it handles it. I really only play BF3 and Skyrim.


Which monitor did you decide on?

To answer your question though, a GTX 690 will handle it just fine! I really enjoyed playing Close Quarters a lot; the fluidity of the monitor made the close-up firefights an easy win for me. You'd be surprised, it gives quite the advantage.


----------



## mtbiker033

Quote:


> I don't play Skyrim, so I'll let others chime in there. Here is Caspian Border multiplayer on my i7 3770K with a GTX 690. I've video recorded it for OCN rather than use FRAPS to record; FRAPS is in the upper left corner for you to monitor FPS.
> 
> Basically it depends on what you're doing, but it's a nice 120+ FPS; anything beyond that is superfluous. Low dips of 78 FPS. I'd guesstimate an average of 98 FPS. Ultra settings. See for yourself.
> 
> Quick note on 3D: if you're getting a 120 Hz monitor and it's Nvidia 3D Vision ready and you have questions, go to the [Official] Nvidia 3d Vision + 3d Surround Club.
> 
> Running 3D will cut FPS by about 45-50%. So if you're playing a game normally @ 100 FPS, in 3D you're going to get 50-55 FPS. In 3D mode you need to maintain 30 FPS minimum to keep things running smoothly, and you max out @ 60 FPS for the smoothest play; again, anything beyond that is superfluous.


Thanks for the tips and video! I get very similar fps in BF3 to what is in your video. I don't really care too much about 3d.
Quote:


> Which monitor did you decide on?
> 
> To answer your question though, a GTX 690 will handle it just fine! I really enjoyed playing Close Quarters a lot; the fluidity of the monitor made the close-up firefights an easy win for me. You'd be surprised, it gives quite the advantage.


A Samsung S23A700D I bought used from a member here on the marketplace! Can't wait to try it out.

Thanks for the replies guys!


----------



## IBYCFOTA

Quote:


> Originally Posted by *mtbiker033*
> 
> I will be getting a 120hz monitor next week and I wanted to check in with the club to see how many of you are using your 690 on a 120hz screen and how well it handles it. I really only play BF3 and Skyrim.


I don't have my 690 yet but my 7970 can achieve pretty high frames on Ultra in Skyrim with my 120hz BenQ monitor and it's very smooth. Unfortunately Skyrim has some severe complications when running the framerate at over 100 FPS (flickering, audio, physics glitches) so you're going to need to cap the framerate at around 80-90. Doing that introduces some stuttering however, at least it did for me, but I haven't been particularly happy with AMD's performance in various games including Skyrim so I don't know if the 690 will work better with the framerate capped.


----------



## xToFxREAPER

Quote:


> Originally Posted by *IBYCFOTA*
> 
> I don't have my 690 yet but my 7970 can achieve pretty high frames on Ultra in Skyrim with my 120hz BenQ monitor and it's very smooth. Unfortunately Skyrim has some severe complications when running the framerate at over 100 FPS (flickering, audio, physics glitches) so you're going to need to cap the framerate at around 80-90. Doing that introduces some stuttering however, at least it did for me, but I haven't been particularly happy with AMD's performance in various games including Skyrim so I don't know if the 690 will work better with the framerate capped.


I found 80-90 on Skyrim is fairly buggy as well; I tested it on my 690. I'm gonna try capping it some different ways and see if it performs better, and I'll report back, but for now it seems more like it's laggy due to the engine than anything. Also note I have a lot of mods installed that could be making it buggy as well, so experience may vary.
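For what it's worth, the capping logic itself is simple. Limiters like the driver's frame limiter or RTSS do roughly this (a minimal sketch of the technique, not how any particular tool implements it): each frame, sleep off whatever is left of the target frame budget.

```python
import time

def frame_budget(cap_fps):
    """Seconds each frame may take at the given cap, e.g. ~0.0118 s at 85 FPS."""
    return 1.0 / cap_fps

def sleep_needed(frame_time, cap_fps):
    """Time to sleep after a frame that took `frame_time` seconds; 0 if over budget."""
    return max(0.0, frame_budget(cap_fps) - frame_time)

def run_capped(render_frame, cap_fps, frames):
    """Render `frames` frames, never faster than `cap_fps` on average."""
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        time.sleep(sleep_needed(time.perf_counter() - start, cap_fps))
```

Capping this way trades a little latency for consistent pacing, which is why a cap can feel smoother than an uncapped but wildly varying framerate.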

________________________________________________________________________

On a secondary, totally unrelated note... does the members list for this thread get updated often anymore? I'll repost proof if need be.


----------



## virgis21

Here is my sexy








Can confirm, BF3 runs over 100 FPS (i7 3770K overclocked to 4.2 GHz)


----------



## john99teg

I know someone that has a 690 that came with no vendor sticker, so technically there is no warranty. Do you guys think it's worth the risk to get it for 600 if there are no artifacts and it has great temps? He said he got it from a friend that works at NVIDIA, so I am thinking these are one of the beta batches for the 690, which could be a good thing o0. I have a couple of days to decide. Usually I can tell if a card is bad within 10 minutes of benchmarks and stuff. He said if not, he will just sell it to his buddy for 400. He won't let me be his buddy ahaha.


----------



## xToFxREAPER

Quote:


> Originally Posted by *john99teg*
> 
> I know someone that has a 690 that came with no vendor sticker, so technically there is no warranty. Do you guys think it's worth the risk to get it for 600 if there are no artifacts and it has great temps? He said he got it from a friend that works at NVIDIA, so I am thinking these are one of the beta batches for the 690, which could be a good thing o0. I have a couple of days to decide. Usually I can tell if a card is bad within 10 minutes of benchmarks and stuff. He said if not, he will just sell it to his buddy for 400. He won't let me be his buddy ahaha.


Well, I got my card second-hand for about the same price and have had no issues yet. Honestly, if you see no problems with the card, go for it, because the 690 is a monster.


----------



## mtbiker033

Received the 120Hz monitor today and wow, it is amazing. I played a few minutes of BF3 and there definitely is a difference. I need to play/test some more, but the 690 seems to handle it no problem.


----------



## uio77

Do any of you have problems with Steam and the 690?

All the problems started after I upgraded from a 660 Ti. Game audio and video started to lose sync; after that, loading and saving times kept increasing until the game finally crashed. This happens in BioShock Infinite, Tomb Raider, and the Batman series. I also noticed that characters just stop moving and freeze: I can move around like everything is normal, but the support characters just stand still doing nothing (Elizabeth in BioShock and your squad in BF3).
After a game crashes, the Steam client crashes too, in a very bad way. I have to delete all the Steam files except for the game and info folders to make it work again. That forces Steam to download all the client files fresh, but the issues eventually come back.
I cannot play more than 15 minutes without having to do this all over again.
I already sent the card to EVGA for RMA and they sent me another one, but the problem is still the same. I have uninstalled drivers, games, and Steam plenty of times, run stress tests on the card and RAM with no problems found, and run a SMART test on the hard drive; it's in good shape. All the other components were running rock solid for over 6 months.
I am running out of options and the deadline to return the card is the 23rd. What should I do?


----------



## xToFxREAPER

Quote:


> Originally Posted by *uio77*
> 
> Do any of you have problems with Steam and the 690?
> 
> All the problems started after I upgraded from a 660 Ti. Game audio and video started to lose sync; after that, loading and saving times kept increasing until the game finally crashed. This happens in BioShock Infinite, Tomb Raider, and the Batman series. I also noticed that characters just stop moving and freeze: I can move around like everything is normal, but the support characters just stand still doing nothing (Elizabeth in BioShock and your squad in BF3).
> After a game crashes, the Steam client crashes too, in a very bad way. I have to delete all the Steam files except for the game and info folders to make it work again. That forces Steam to download all the client files fresh, but the issues eventually come back.
> I cannot play more than 15 minutes without having to do this all over again.
> I already sent the card to EVGA for RMA and they sent me another one, but the problem is still the same. I have uninstalled drivers, games, and Steam plenty of times, run stress tests on the card and RAM with no problems found, and run a SMART test on the hard drive; it's in good shape. All the other components were running rock solid for over 6 months.
> I am running out of options and the deadline to return the card is the 23rd. What should I do?


I've had no issues since my upgrade from 680s. Uninstall all your drivers, run Driver Sweeper, then reinstall and report back. Sometimes strange things happen when changing GPUs, even if you stay on NVIDIA.


----------



## uio77

By "all drivers", do you mean all the component drivers or NVIDIA only?


----------



## Masta Squidge

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> I think two Titans are better than two 690s. Two-way SLI scales better than four-way, and also consider the VRAM: 6GB vs 2GB.
> 
> Myself, I would only use one GTX 690. Two GTX 690s in SLI is just too much GPU muscle for the VRAM amount.
> 
> Just my


As a Titan owner, I would like to point out that all the benchmarks disagree with you on performance in general, but the vram does get to be an issue at higher resolutions.

The catch is you can't run three 690s xD


----------



## xToFxREAPER

Quote:


> Originally Posted by *uio77*
> 
> With all divers you mean all the components drivers or Nvidia only??


NVIDIA drivers, my bad for the poor explanation. Also check Device Manager and make sure it's not still listing your 660 Ti; if it is, disable/delete it.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Masta Squidge*
> 
> As a Titan owner, I would like to point out that all the benchmarks disagree with you on performance in general, but the vram does get to be an issue at higher resolutions.
> 
> The catch is you can't run three 690s xD


Yes, that's right, but I meant better in terms of the scaling of two high-powered GPUs as opposed to four lower-powered GPUs. A lot of games have no problem with two cards in SLI or CrossFire vs. four, and it should be smoother with two cards as well, imo.

I would think with two GTX 690s in quad SLI, the scaling after the first two GPUs (the first GTX 690) isn't so great. Maybe 50% for the third and 25% for the fourth (the second GTX 690 added)? I could be exaggerating.

But with two Titans, you get 100% and then 90% for the second GPU.

And then there's the VRAM: 6GB vs. 2GB for all that GPU muscle.

Running one GTX 690 with 2GB isn't an issue though, at least at 1200p, when I had it. Faster than my GTX Titan, that's for sure. I just like trying new things, so I sold the 690 and bought the Titan.
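Those hypothetical scaling percentages can be sanity-checked with a few lines of arithmetic. This is just a sketch using the guesses from the post above, not benchmark data:

```python
# Per-GPU scaling factors as guessed above (not measured benchmarks).
titan_sli = [1.00, 0.90]             # two Titans in two-way SLI
quad_690 = [1.00, 0.90, 0.50, 0.25]  # two GTX 690s = four GPUs in quad SLI

def effective_gpus(factors):
    """Sum each GPU's incremental contribution into an 'effective GPU' count."""
    return sum(factors)

print(effective_gpus(titan_sli))  # ~1.9x a single GPU, from two chips
print(effective_gpus(quad_690))   # ~2.65x, from four chips
```

By these (admittedly made-up) numbers, quad SLI buys roughly 40% more throughput than two Titans while doubling the GPU count, which is the scaling complaint in a nutshell.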


----------



## xToFxREAPER

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Yes, that's right, but I meant better in terms of the scaling of two high-powered GPUs as opposed to four lower-powered GPUs. A lot of games have no problem with two cards in SLI or CrossFire vs. four, and it should be smoother with two cards as well, imo.
> 
> I would think with two GTX 690s in quad SLI, the scaling after the first two GPUs (the first GTX 690) isn't so great. Maybe 50% for the third and 25% for the fourth (the second GTX 690 added)? I could be exaggerating.
> 
> But with two Titans, you get 100% and then 90% for the second GPU.
> 
> And then there's the VRAM: 6GB vs. 2GB for all that GPU muscle.
> 
> Running one GTX 690 with 2GB isn't an issue though, at least at 1200p, when I had it. Faster than my GTX Titan, that's for sure. I just like trying new things, so I sold the 690 and bought the Titan.


I must agree. I got to try the Titan as well as two 690s before I settled on what I have now, and for a single-monitor setup, even at 1440p, the 690 is a beast; I'll take it over the Titan any time. As soon as you SLI the 690s, though, they go down the tubes: you actually lose FPS in a lot of games because they just can't scale to 4 GPUs. Maybe future games will make it useful if things are properly coded and optimized, both on the devs' side and on NVIDIA's; guess we just have to wait and see. Right now, though, for those who have the money, two Titans, no questions asked, is the best setup you can run, imo.


----------



## IBYCFOTA

So I finally have the 690 in my possession!















Only one problem, and it's fairly significant. The holes on the mounting bracket don't seem to align with my case.





This is pretty baffling. The card itself fits fine and is roughly the same size as my 7970. My case and mobo (NZXT Phantom 410 and Z77 Extreme 4, respectively) are probably a bit cheaper than some other GTX 690 builds, but I don't see any good reason why it shouldn't fit properly. Am I SOL here unless I want to do something tacky like duct tape it together?


----------



## xToFxREAPER

Quote:


> Originally Posted by *IBYCFOTA*
> 
> So I finally have the 690 in my possession!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Only one problem, and it's fairly significant. The holes on the mounting bracket don't seem to align with my case.
> 
> 
> 
> 
> 
> This is pretty baffling. The card itself fits fine and is roughly the same size as my 7970. My case and mobo (NZXT Phantom 410 and Z77 Extreme 4, respectively) are probably a bit cheaper than some other GTX 690 builds, but I don't see any good reason why it shouldn't fit properly. Am I SOL here unless I want to do something tacky like duct tape it together?


That is actually really weird; I've never seen that happen before in my life. I honestly have no idea, so you might be SOL, but maybe another forum member has an idea up their sleeve.


----------



## justanoldman

I think something is blocking it from seating correctly in the mobo. Maybe the bottom flange is bent a little or some cables on the right side.


----------



## IBYCFOTA

Quote:


> Originally Posted by *justanoldman*
> 
> I think something is blocking it from seating correctly in the mobo. Maybe the bottom flange is bent a little or some cables on the right side.


I'll have to check again, but games and benchmarks seem to be running mostly as expected (the case is on its side atm). Would that still happen if it wasn't seated perfectly?


----------



## Masta Squidge

Quote:


> Originally Posted by *IBYCFOTA*
> 
> So I finally have the 690 in my possession!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is pretty baffling. The card itself fits fine and is roughly the same size as my 7970. My case and mobo (NZXT Phantom 410 and Z77 Extreme 4, respectively) are probably a bit cheaper than some other GTX 690 builds, but I don't see any good reason why it shouldn't fit properly. Am I SOL here unless I want to do something tacky like duct tape it together?


So use one of your fingers, and push it up 1/4" to where it is supposed to be.

I have had this happen on tons of cases, I assure you the PCI slot will be just fine with a couple degrees of what appears to be misalignment. It should be fine.


----------



## justanoldman

Quote:


> Originally Posted by *IBYCFOTA*
> 
> I'll have to check again, but games and benchmarks seem to be running mostly as expected (the case is on its side atm). Would that still happen if it wasn't seated perfectly?


Not sure what "mostly" means, but if everything is working perfectly that would tell you something. The spacing is fine from a vertical point of view, like Masta said you just have to push the card up a fraction. But it looks like it is not all the way to the left of the case. That is why I thought maybe the two metal flanges that slide down there were pushing it to the right or something like that.


----------



## uio77

Thanks for the tip.

I uninstalled and swept the NVIDIA drivers and also reinstalled Steam from scratch. It worked fine for 30 minutes this time, LOL. After that, the same thing: BioShock crashed, and when I tried Tomb Raider, it just stopped loading the environments like before. I will return the 690 and wait for tomorrow's release of the 780. I really wanted to love this card, but for some reason it did not work in my rig.


----------



## xToFxREAPER

Quote:


> Originally Posted by *uio77*
> 
> Thanks for the tip.
> 
> I uninstalled and swept the NVIDIA drivers and also reinstalled Steam from scratch. It worked fine for 30 minutes this time, LOL. After that, the same thing: BioShock crashed, and when I tried Tomb Raider, it just stopped loading the environments like before. I will return the 690 and wait for tomorrow's release of the 780. I really wanted to love this card, but for some reason it did not work in my rig.


Yeah, sometimes certain hardware just conflicts, so I guess I'm not overly surprised. I wish you luck with the 780; I've heard some good things about them, though who knows how much of it is true.


----------



## mtbiker033

dudes GTX690 + 120Hz monitor =









only played BF3 on it so far but DAYUM huge difference


----------



## Arizonian

Quote:


> Originally Posted by *mtbiker033*
> 
> dudes GTX690 + 120Hz monitor =
> 
> 
> 
> 
> 
> 
> 
> 
> 
> only played BF3 on it so far but DAYUM huge difference


Perfect card for that set up, I'll concur.









Edit - Congrats on the monitor btw.


----------



## mtbiker033

Quote:


> Originally Posted by *Arizonian*
> 
> Perfect card for that set up, I'll concur.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit - Congrats on the monitor btw.


why thank you sir! and thanks for your assistance prior to getting it!


----------



## Alex132

http://www.overclock.net/t/1393770/xbox-one-ps4-a-generation-ahead-of-high-end-pcs-says-ea-exec

What does it feel like to be 8-10 times behind a console, guys?









Oh poor EA.


----------



## PCModderMike

Quote:


> Originally Posted by *mtbiker033*
> 
> dudes GTX690 + 120Hz monitor =
> 
> 
> 
> 
> 
> 
> 
> 
> 
> only played BF3 on it so far but DAYUM huge difference


Hmm I'm a little jelly now I wanna try my 690 with 120Hz.








I used to have a 120Hz monitor a couple of years ago, like back when the 560Ti had just come out. I do remember it was butter smooth. But it was that 1st gen 3D monitor from Acer with the orange base....can't remember the model for the life of me now. Anyway, besides the smoothness, overall I didn't really like the monitor so I got rid of it.
What monitor did you get?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *PCModderMike*
> 
> Hmm I'm a little jelly now I wanna try my 690 with 120Hz.
> 
> 
> 
> 
> 
> 
> 
> 
> I used to have a 120Hz monitor a couple of years ago, like back when the 560Ti had just come out. I do remember it was butter smooth. But it was that 1st gen 3D monitor from Acer with the orange base....*can't remember the model for the life of me now*. Anyway, besides the smoothness, overall I didn't really like the monitor so I got rid of it.
> What monitor did you get?


This one?

http://www.newegg.ca/Product/Product.aspx?Item=N82E16824009222CVF










I agree, a GTX 690 would be perfect for a 1080p 120Hz screen.


----------



## PCModderMike

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> This one?
> 
> http://www.newegg.ca/Product/Product.aspx?Item=N82E16824009222CVF
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I agree, a GTX 690 would be perfect for a 1080p 120Hz screen.


That's it!









I currently run a 27 inch 1440p....debating on the switch to 1080p 120Hz.
In addition to the gaming I do, I really enjoy the extra space of working on a 1440p display.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *PCModderMike*
> 
> That's it!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I currently run a 27 inch 1440p....debating on the switch to 1080p 120Hz.
> In addition to the gaming I do, I really enjoy the extra space of working on a 1440p display.


Yeah 1440p vs 120Hz 1080p, that is a tough decision to make!


----------



## PCModderMike

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Yeah 1440p vs 120Hz 1080p, that is a tough decision to make!


New excuse for two rigs









Work/general use hooked up to the 1440p monitor.

Gaming rig w/ 690 hooked up to the 1080p 120Hz monitor.

LOL


----------



## mtbiker033

Quote:


> Originally Posted by *PCModderMike*
> 
> Hmm I'm a little jelly now I wanna try my 690 with 120Hz.
> 
> 
> 
> 
> 
> 
> 
> 
> I used to have a 120Hz monitor a couple of years ago, like back when the 560Ti had just come out. I do remember it was butter smooth. But it was that 1st gen 3D monitor from Acer with the orange base....can't remember the model for the life of me now. Anyway, besides the smoothness, overall I didn't really like the monitor so I got rid of it.
> What monitor did you get?


I bought a Samsung S23A700D from an OCN member (thanks surfbumb!) on the marketplace.

It really is amazing.

Even playing Crysis 3, where I'm closer to an 80-90 FPS average, it's great.


----------



## PCModderMike

Quote:


> Originally Posted by *mtbiker033*
> 
> I bought a Samsung S23A700D from an OCN member (thanks surfbumb!) on the marketplace.
> 
> It really is amazing.
> 
> Even playing Crysis 3, where I'm closer to an 80-90 FPS average, it's great.


Checking it out right now.








I currently own a 27-inch monitor, though... so I'd want to transition to a 27-inch 120Hz.


----------



## pilla99

Quote:


> Originally Posted by *PCModderMike*
> 
> New excuse for two rigs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Work/general use hooked up to the 1440p monitor.
> 
> Gaming rig w/ 690 hooked up to the 1080p 120Hz monitor.
> 
> LOL


Or just get the right Catleap bin and you can OC those to 128Hz. Then you get 1440p + high refresh. Mine will only do about 90Hz, but many go far beyond that.


----------



## IBYCFOTA

Quote:


> Originally Posted by *Masta Squidge*
> 
> So use one of your fingers, and push it up 1/4" to where it is supposed to be.
> 
> I have had this happen on tons of cases, I assure you the PCI slot will be just fine with a couple degrees of what appears to be misalignment. It should be fine.


I dunno. I've tried applying some force to move the card up to where it would align correctly, and it's not budging much. I'm afraid any more pressure will end up destroying the slot or damaging the card itself. Ugh, this sucks. I might simply need to upgrade my mobo/case.
Quote:


> Originally Posted by *justanoldman*
> 
> Not sure what "mostly" means, but if everything is working perfectly that would tell you something. The spacing is fine from a vertical point of view, like Masta said you just have to push the card up a fraction. But it looks like it is not all the way to the left of the case. That is why I thought maybe the two metal flanges that slide down there were pushing it to the right or something like that.


Well I moved away from the 7970 because I was frustrated by how the card performed in a lot of games. As it turns out the 690 exhibits the same problems in those games, so it's not a card or driver issue. I've done my due diligence in terms of troubleshooting the problems, but nothing has really worked. Obviously you don't upgrade from the 7970 to the 690 because of driver concerns alone - I wanted a card that could run all of the new next gen games without having to sacrifice graphics for higher frames. I just figured it would also fix some of the problems I was having on account of moving to Nvidia drivers.

I've been busy this week so I haven't been able to extensively test the 690 in all of my games but I'm still getting horrible frame drops and slowdowns in Crysis 3 even on Medium settings. Metro 2033 skips and lags all over the place as it did before with my 7970. Bioshock Infinite performed a bit better I suppose in terms of less framedrops when the action got intense, but it was still there. GTA4 and Batman AC still run like hot garbage, as does New Vegas. Skyrim is pretty hit or miss. Far Cry 3, Max Payne 3, Tomb Raider, and Borderlands 2 run great though.

I guess I'm a bit disappointed in my PC gaming experience thus far. As a console player coming over to PC I was blown away by the graphics and smoothness of PC gaming, but the stuttering and stability issues I've experienced are a huge turn off, especially considering a lot of these games have been out for over a year and shouldn't have any problems running on a system as powerful as mine.


----------



## PCModderMike

Quote:


> Originally Posted by *pilla99*
> 
> Or just get the right Catleap bin and you can OC those to 128Hz. Then you get 1440p + high refresh. Mine will only do about 90Hz, but many go far beyond that.


To buy and sell, and buy and sell, and maybe even buy and sell some more trying to hit the right one would just be a big pain. My Crossover won't take the slightest bit of overclock. But oh well.....I just like switching things up so if the 1080p 120Hz doesn't last with me very long it's no big deal.


----------



## Alex132

Would love a 120Hz monitor; never really had a problem with 60Hz until I turned V-sync off in D3.

MY GOD THE TEARING.


----------



## Lukas026

Hey guys, just a quick question here:

Can anyone who owns a 690 run the new EVGA OC Scanner properly?

All I can see when I start it is a few buttons with no background or frame. Really strange, because my system is up to date and there shouldn't be any problem.

Thank you.


----------



## xToFxREAPER

Quote:


> Originally Posted by *Lukas026*
> 
> Hey guys, just a quick question here:
> 
> Can anyone who owns a 690 run the new EVGA OC Scanner properly?
> 
> All I can see when I start it is a few buttons with no background or frame. Really strange, because my system is up to date and there shouldn't be any problem.
> 
> Thank you.


I'm having no issues with it.


----------



## Lukas026

Any idea why this is happening?



I don't have any ideas.


----------



## xToFxREAPER

Quote:


> Originally Posted by *Lukas026*
> 
> Any idea why this is happening?
> 
> 
> 
> I don't have any ideas.


I've seen lots of posts about that all over the place for various versions of OC Scanner; I have yet to see anyone mention a fix, though.







My only issue at the moment is that I can't get it to load both GPUs for some reason, which I find a little odd because it worked the other day.


----------



## Lukas026

Anyone else?







I really want to try it!


----------



## IBYCFOTA

Quick update: I did get the card to align correctly after all. Not sure what exactly I did but after reseating it a few times it finally lined up as it was supposed to. That's a pretty big relief as I was getting to the point where I was ready to return the card.









After testing it out more I'm really coming around to this card. There are so many different ways to customize the settings that I've been able to get good performance out of games I hadn't been able to in the past, most notably Metro 2033. The game was completely unplayable with my 7970 and my current 690 until I changed the vsync setting in the Nvidia control panel to On (smooth) and set a framerate limit of 60. Now it runs close to perfect.

I am slightly worried about the 2GB of VRAM on the card. Playing Hitman: Absolution maxed out, it sat around a steady 1.7GB, though I suppose I can always turn the AA down a bit if need be when games begin to approach the 2GB limit. I feel like this card will also hold its value pretty decently for the next few years because of its sheer power.

Also, add me to the club plz.


----------



## justanoldman

^Glad you got it to work by reseating. I have read a lot about the 2GB issue. Most of the people complaining about it don't understand that cards will use what they have: usage climbing to the limit doesn't mean you will see any negative effects. So a 4GB setup may use 2.3GB in a game and people will call that proof that a 2GB card will fail, but that is simply not true. You are correct that turning the graphics settings down a bit helps, too.

You really have to go out of your way to need more than 2GB on a single-monitor setup, and be unwilling to reduce settings at all because you mistakenly think life will come to an end if you can't max out every graphics setting.


----------



## Masta Squidge

Quote:


> Originally Posted by *justanoldman*
> 
> ^Glad you got it to work by reseating. I have read a lot about the 2GB issue. Most of the people complaining about it don't understand that cards will use what they have: usage climbing to the limit doesn't mean you will see any negative effects. So a 4GB setup may use 2.3GB in a game and people will call that proof that a 2GB card will fail, but that is simply not true. You are correct that turning the graphics settings down a bit helps, too.
> 
> You really have to go out of your way to need more than 2GB on a single-monitor setup, and be unwilling to reduce settings at all because you mistakenly think life will come to an end if you can't max out every graphics setting.


There is plenty of proof that hitting a VRAM cap slows performance.

And some of us prefer to be able to play at maximum settings; that's why we spend truckloads of money on our computers. Anyone buying a $1000 graphics card should reasonably be able to expect to set everything to max in most games.

You are more than correct about single-monitor setups, though. You shouldn't be over 2GB at almost any resolution on a single screen, at least for now. Remember a few years back when 768MB was enough? I do.


----------



## Arizonian

Two things I'd like to add to the current discussion:

On a single 120Hz or 1440p monitor, 2GB of VRAM is enough for current games, except when heavily modded.

We now know the new-gen GTX 780 is still a 3GB card, and if AMD comes out with another 3GB card in the Radeon 8970, then expect games to sit tight and not exceed the 3GB VRAM limit for at least another year.

Developers keep this in mind: the only upcoming games that will force the GTX 690 to turn down settings to stay fluid will be the ones that truly start using large amounts of VRAM.

The majority of end users are still on 680s or 7970s or less, which keeps the ball from rolling too fast. Game devs keep the largest target audience in mind for the best success/profits.

I'd say for single-monitor setups we're good for another year with 2GB of VRAM. Obviously not at maximum settings like I can run now, except for Crysis 3, but nothing that turning settings down a tad, with no noticeable visual downgrade, can't fix.


----------



## Rei86

Quote:


> Originally Posted by *Arizonian*
> 
> Two things I'd like to add to the current discussion:
> 
> On a single 120Hz or 1440p monitor, 2GB of VRAM is enough for current games, except when heavily modded.
> 
> We now know the new-gen GTX 780 is still a 3GB card, and if AMD comes out with another 3GB card in the Radeon 8970, then expect games to sit tight and not exceed the 3GB VRAM limit for at least another year.
> 
> Developers keep this in mind: the only upcoming games that will force the GTX 690 to turn down settings to stay fluid will be the ones that truly start using large amounts of VRAM.
> 
> The majority of end users are still on 680s or 7970s or less, which keeps the ball from rolling too fast. Game devs keep the largest target audience in mind for the best success/profits.
> 
> I'd say for single-monitor setups we're good for another year with 2GB of VRAM. Obviously not at maximum settings like I can run now, except for Crysis 3, but nothing that turning settings down a tad, with no noticeable visual downgrade, can't fix.


1. At 1920x1080 and going by benches, a single 690 gets by.
1a. Top-shelf 2013 titles at 1920x1080 can actually use more than 2GB of VRAM.
1b. At 2560x1440/1600, 2GB isn't cutting it in newer titles.
--- The most recent experience for me being BioShock Infinite.
2. AMD's HD 8970 roadmap already states 3GB and 6GB. It won't be till DDR4 (HD 9000 series) that they'll revise the VRAM.


----------



## euphoria4949

Hey guys, Can I please join this elite club???









My 690 is an ASUS, currently running at: [email protected] 1095 - [email protected] 1597 - [email protected] 1200.
Excuse the wiring mess in the case, currently installing a new loop.

Cheers guys


----------



## IBYCFOTA

How in the hell do people get their cards to overclock so well on air? I swear there's either something wrong with my setup or I'm just woefully unlucky in the silicon lottery. I can't even go 50MHz above stock on the core before I start to see artifacts, and I haven't even touched the memory. I've had 3 different 7970s in this system, and the highest I ever got was 1175/1550, and that only after applying 1.3V to keep it steady.

The card is powerful enough that OCing isn't a huge concern, but I'm confused why others seem to get much higher overclocks on air than I do. It's not like my temps are very high: the 7970s stayed around 65C at full load and the 690 hasn't gotten much higher than that. VRM temps are normal too.


----------



## Rei86

Quote:


> Originally Posted by *IBYCFOTA*
> 
> How in the hell do people get their cards to overclock so well on air? I swear there's either something wrong with my setup or I'm just woefully unlucky in the silicon lottery. I can't even go 50MHz above stock on the core before I start to see artifacts, and I haven't even touched the memory. I've had 3 different 7970s in this system, and the highest I ever got was 1175/1550, and that only after applying 1.3V to keep it steady.
> 
> The card is powerful enough that OCing isn't a huge concern, but I'm confused why others seem to get much higher overclocks on air than I do. It's not like my temps are very high: the 7970s stayed around 65C at full load and the 690 hasn't gotten much higher than that. VRM temps are normal too.


Some people just have better luck.









Have you tried OCing them with sync off, doing one GPU at a time? I had to do that for one of my 690s, as GPU1 would hit 125 but GPU2 could barely do 75.


----------



## PCModderMike

Quote:


> Originally Posted by *Rei86*
> 
> 1. At 1920x1080 and going by benches, a single 690 gets by.
> 1a. Top-shelf 2013 titles at 1920x1080 can actually use more than 2GB of VRAM.
> 1b. At 2560x1440/1600, 2GB isn't cutting it in newer titles.
> --- The most recent experience for me being BioShock Infinite.
> 2. AMD's HD 8970 roadmap already states 3GB and 6GB. It won't be till DDR4 (HD 9000 series) that they'll revise the VRAM.


I don't have BioShock Infinite... but I do have some newer stuff like Far Cry 3 and Metro: Last Light, and both of those are completely fine with 2GB at 1440p. Actually, Battlefield 3 will use up more memory than my newer games, but that's just because BF3 will eat up almost whatever is available.


----------



## Rei86

Quote:


> Originally Posted by *PCModderMike*
> 
> I don't have BioShock Infinite... but I do have some newer stuff like Far Cry 3 and Metro: Last Light, and both of those are completely fine with 2GB at 1440p. Actually, Battlefield 3 will use up more memory than my newer games, but that's just because BF3 will eat up almost whatever is available.


BF3 for me only uses up like 1.5GB /shrug.


----------



## PCModderMike




----------



## Alex132

My VRAM (I suspect) has hit its first limit.

Metro: Last Light runs fine, like (guessing) ~70-90 FPS with everything on maximum detail.
As soon as I go to 4xMSAA, the FPS drops to about ~30-40; 2xSMAA yields the ~70-90 FPS.

I thought VRAM issues meant a sudden framerate collapse (like 4 FPS), not losing about half.


----------



## PCModderMike

Have you verified how much memory is being used through something such as EVGA Precision or MSI Afterburner? Also what resolution are you playing at?
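If you'd rather spot-check from outside the game, `nvidia-smi` (installed with the NVIDIA driver) can report per-GPU memory use; the little Python sketch below parses its CSV output. The query flags are standard `nvidia-smi` options, but treat the script itself as an illustration, not a supported tool:

```python
import subprocess

def gpu_memory_used_mib(csv_text=None):
    """Return used VRAM per GPU in MiB, parsed from nvidia-smi CSV output.

    When csv_text is None, query the driver directly via nvidia-smi.
    """
    if csv_text is None:
        csv_text = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            text=True,
        )
    return [int(line.strip()) for line in csv_text.splitlines() if line.strip()]

# Parsing captured output (a GTX 690 shows up as two GPUs):
sample = "1598\n1544\n"
print(gpu_memory_used_mib(sample))  # [1598, 1544]
```

An in-game overlay like Precision or Afterburner is still the better answer for per-frame behavior; this just confirms the overlay's numbers from the desktop.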


----------



## euphoria4949

Quote:


> Originally Posted by *IBYCFOTA*
> 
> How in the hell do people get their cards to overclock so well on air? I swear there's either something wrong with my setup or I'm just woefully unlucky in the silicon lottery. I can't even go 50MHz above stock on the core before I start to see artifacts, and I haven't even touched the memory. I've had 3 different 7970s in this system, and the highest I ever got was 1175/1550, and that only after applying 1.3V to keep it steady.
> 
> The card is powerful enough that OCing isn't a huge concern, but I'm confused why others seem to get much higher overclocks on air than I do. It's not like my temps are very high: the 7970s stayed around 65C at full load and the 690 hasn't gotten much higher than that. VRM temps are normal too.


I will add, though, that the OC I stated above is as yet untested. I had it OC'd slightly lower (1065 - 1575 - 1170, if I remember correctly), stable for a month or two. I upped it to what I said above, but I ran into problems with my new CPU before I could test stability. Once I've installed my new loop and sorted everything out, I'll test that OC.


----------



## Alex132

Quote:


> Originally Posted by *PCModderMike*
> 
> Have you verified how much memory is being used through something such as EVGA Precision or MSI Afterburner? Also what resolution are you playing at?


1920x1080


----------



## PCModderMike

Quote:


> Originally Posted by *Alex132*
> 
> 1920x1080


Interesting...and what about answering the other question?


----------



## Alex132

Quote:


> Originally Posted by *PCModderMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> 1920x1080
> 
> 
> 
> Interesting...and what about answering the other question?
Click to expand...

Nope, just assumed it was that to be honest. I mean, what else would cause the rapid drop off of FPS from ONLY changing the MSAA from 2x to 4x?


----------



## PCModderMike

Turning up the AA certainly does cause a big drop in FPS...but even at 2560x1440 with 4x I'm not getting close to using 2GB of VRAM.
PhysX was on for all of these
Memory usage is shown in the on screen display in the upper left hand corner.


*No AA*





*2x AA*






*4x AA*






At 4x around 1600MB of memory is being used.
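For anyone who wants to watch the same counter without the Precision/Afterburner OSD, the driver exposes per-GPU memory usage through nvidia-smi's CSV query interface. A minimal sketch; the sample values below are hypothetical, not readings from a real 690:

```python
# Sketch: reading per-GPU VRAM usage (what the OSD in the screenshots shows)
# via nvidia-smi's CSV output. The sample string is assumed, not captured.
import subprocess

def parse_used_mib(csv_text):
    """Parse `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits`."""
    return [int(line.strip()) for line in csv_text.strip().splitlines()]

def query_used_mib():
    # Needs an NVIDIA driver installed; both GPUs of a 690 appear as entries.
    out = subprocess.check_output(
        ["nvflash" if False else "nvidia-smi",  # plain nvidia-smi call
         "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True)
    return parse_used_mib(out)

if __name__ == "__main__":
    sample = "1602\n1598\n"        # hypothetical values near the 4x AA reading above
    print(parse_used_mib(sample))  # one value per GPU, in MiB
```

Polling `query_used_mib()` in a loop while alt-tabbed out of the game gives the same "am I near 2GB?" answer as the overlay.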


----------



## Alex132

Quote:


> Originally Posted by *PCModderMike*
> 
> Turning up the AA certainly does cause a big drop in FPS...but even at 2560x1440 with 4x I'm not getting close to using 2GB of VRAM.
> PhysX was on for all of these
> Memory usage is shown in the on screen display in the upper left hand corner.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *No AA*
> 
> 
> 
> 
> 
> *2x AA*
> 
> 
> 
> 
> 
> 
> *4x AA*
> 
> 
> 
> 
> 
> 
> 
> 
> At 4x around 1600MB of memory is being used.


Hmmmm, interesting.

Wonder why 4x is such a massive FPS hit then?


----------



## wRRM

Bought the EK-FC690 and a backplate to my baby 2 weeks ago.

Dropped from 80c+ in load to 45c max at 1188Mhz core and 6650Mhz mem :3


----------



## PCModderMike

Quote:


> Originally Posted by *Alex132*
> 
> Hmmmm, interesting.
> 
> Wonder why 4x is such a massive FPS hit then?


Yea the performance hit at 4x is just way too much for me. 2x seems to be the sweet spot for my setup.
Quote:


> Originally Posted by *wRRM*
> 
> Bought the EK-FC690 and a backplate to my baby 2 weeks ago.
> 
> Dropped from 80c+ in load to 45c max at 1188Mhz core and 6650Mhz mem :3


The wonders of watercooling.


----------



## justanoldman

Quote:


> Originally Posted by *wRRM*
> 
> Bought the EK-FC690 and a backplate to my baby 2 weeks ago.
> 
> Dropped from 80c+ in load to 45c max at 1188Mhz core and 6650Mhz mem :3


Nice.
Post a score over in this thread, we need more 690s with good scores.
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0-fill-the-form


----------



## PCModderMike

Quote:


> Originally Posted by *justanoldman*
> 
> Nice.
> Post a score over in this thread, we need more 690s with good scores.
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0-fill-the-form


Don't think I've ever been in that thread...think I should give it a go too?


----------



## justanoldman

Quote:


> Originally Posted by *PCModderMike*
> 
> Don't think I've ever been in that thread...think I should give it a go too?


Really? Lots of activity over there recently with the debate about using driver options to get better scores. I would like everyone with a 690 to post their best score over there. We are not all that well represented.

It is a pretty quick bench to run, and it focuses on the gpu so you can see how they do. As opposed to 3dmark which has a lot more to do with cpu and ram, and it takes longer to run.


----------



## mtbiker033

Quote:


> Originally Posted by *justanoldman*
> 
> Really? Lots of activity over there recently with the debate about using driver options to get better scores. I would like everyone with a 690 to post their best score over there. We are not all that well represented.
> 
> It is a pretty quick bench to run, and it focuses on the gpu so you can see how they do. As opposed to 3dmark which has a lot more to do with cpu and ram, and it takes longer to run.


it's quite impressive to watch too!


----------



## PCModderMike

Quote:


> Originally Posted by *justanoldman*
> 
> Really? Lots of activity over there recently with the debate about using driver options to get better scores. I would like everyone with a 690 to post their best score over there. We are not all that well represented.
> 
> It is a pretty quick bench to run, and it focuses on the gpu so you can see how they do. As opposed to 3dmark which has a lot more to do with cpu and ram, and it takes longer to run.


Yea, really. I just went in and checked out the last few pages, I see what you're talking about with all of the driver debates.
I'll give it a go and post and see how I can do.


----------



## Koniakki

Guys, I'm on the hunt for an *EK Backplate* for my lovely ASUS 690, obviously. I still can't believe sometimes that I own such a card.

Yes, even for us poor people who sell their kidneys to buy stuff like this, it's a pleasure to own. Same as for anyone else of course, rich, poor, or middle-class.

So guys, any *EK Backplates* like the one below? I already searched the For Sale threads in case anyone suggests it, but this is a good place to ask too.


----------



## xToFxREAPER

Well, I'll give the Valley run a go; I got some good clocks with my card on air.


----------



## justanoldman

To get the highest score with your card in Valley takes a few driver choice changes and a little optimizing. Do this:
-Switch to Classic windows theme
-Hit the enter key to scroll through the scenes before hitting F9
-Only have one monitor plugged into your card(s)
-Shut down any other programs, don't have any monitoring software on
-oc your chip and card as much as possible to just get one run done and a screen shot
-keep both gpus from hitting 70c+ and throttling

Change these in Nvidia control panel, Manage 3d settings:
-Multi-display/mixed-GPU acceleration changed to single display performance mode
-Power management mode changed to prefer maximum performance
-Vertical sync changed to off
-Texture filtering - Quality changed to High performance

Change the slider on the "Adjust image settings with preview" part
-Change "use my preference emphasizing:" to Performance in Image Settings


----------



## IBYCFOTA

Quote:


> Originally Posted by *PCModderMike*
> 
> Yea the performance hit at 4x is just way too much for me. 2x seems to be the sweet spot for my setup.


That's another thing. For me, I have a 120hz monitor and strive to get as close to 120 FPS as I can without making any major graphical sacrifices. Newer games don't come all that close to maxing out the 2gb vram until you start maxing out the AA, and at that point you're taking a pretty significant hit to your performance that probably isn't worth it when a lower AA setting or even FXAA can do a more than adequate job. Far Cry 3 is an extreme example of that. 8x MSAA is just a huge burden to the framerate even on the 690, and the difference between that and 2x MSAA is hardly significant. I honestly can't even tell the difference, and that goes for most games. Tomb Raider looks about as good with FXAA as 8x MSAA (or whatever that game uses) but running the former with everything else maxed out (except no Tress FX) gives me a solid, unwavering 120 FPS. Why would I opt for anything else?

It might be different on a 2560x1440 monitor, but even then I doubt it, because higher AA settings likely aren't necessary at that resolution. With a 120 Hz monitor, there's no reason you should be approaching 2GB on anything but modded Skyrim at this point.
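A back-of-envelope sketch of why that is (my own numbers, not from the thread): assuming 4 bytes of color and 4 of depth/stencil per sample, the MSAA framebuffers alone are nowhere near 2GB even at 8x; it's textures and other assets that actually fill VRAM:

```python
# Rough framebuffer cost per MSAA level (color + depth/stencil only).
# Assumes 4 bytes RGBA color and 4 bytes depth/stencil per sample; real
# engines add more render targets, and textures dominate VRAM anyway.

def framebuffer_mib(width, height, msaa_samples, bytes_per_sample=8):
    return width * height * max(1, msaa_samples) * bytes_per_sample / 2**20

for samples in (0, 2, 4, 8):
    mib = framebuffer_mib(1920, 1080, samples)
    print(f"{samples}x MSAA @ 1080p: ~{mib:.0f} MiB")
```

Even 8x at 1080p is on the order of 100-odd MiB of framebuffer, so the big cost of high AA is shading and bandwidth, not raw VRAM capacity, which matches the ~1600MB readings posted above.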


----------



## drserk

I have a Zotac GTX 690 and I want to change its voltage by making a new BIOS, but I can't change the voltage with the new BIOS. Can the voltage not be changed?


----------



## justanoldman

Quote:


> Originally Posted by *drserk*
> 
> i have a Zotac gtx 690 and i want to change its' voltage with making new bios. but i cant change voltage with new bios. can't the voltage been changed?


I am not aware of anyone ever finding a BIOS that can improve the performance/voltage of the 690.


----------



## xToFxREAPER

Quote:


> Originally Posted by *drserk*
> 
> i have a Zotac gtx 690 and i want to change its' voltage with making new bios. but i cant change voltage with new bios. can't the voltage been changed?


No one's found a way to change it as yet.


----------



## drserk

Is it worth watercooling a GTX 690? I want to increase the voltage and do some OC with the 690, but I am not happy because of this limit. (My parts are a Heatkiller GPU-X3 + backplate.)


----------



## MrTOOSHORT

Quote:


> Originally Posted by *drserk*
> 
> is it worth to make gtx690 watercooled? i wanna increase voltage and some oc with 690 but i am not happy because of this limit..(my parts are heatkiller gpuX3+backplate)


Only worth it for noise and lower temps. It's easy to keep the card under 70°C with a waterblock.

Pretty much you'll get the same overclock with the air cooler vs water block on a GTX 690.


----------



## Reinass

Asus GTX 690


----------



## drserk

thnx for replies


----------



## PinzaC55

Quote:


> Originally Posted by *Reinass*
> 
> Asus GTX 690


Nice shot! Reminds me of the final scene in Close Encounters where Roy enters the mothership


----------



## Alex132

Any tips on removing a stripped screw?


----------



## wholeeo

Quote:


> Originally Posted by *Alex132*
> 
> Any tips on removing a stripped screw?


How bad is it? Can you fit a small flat head in between any two notches?


----------



## Alex132

Quote:


> Originally Posted by *wholeeo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Any tips on removing a stripped screw?
> 
> 
> 
> How bad is it? Can you fit a small flat head in between any two notches?
Click to expand...

The inside of the T6 torx head is basically circular now...


----------



## wholeeo

Quote:


> Originally Posted by *Alex132*
> 
> The entire T6 torx is inside is basically circular...


Fill the hole with a drop of crazy glue or something and make it stick to a torx


----------



## TheMadHerbalist

Could you take my name off the list? I'm going to be selling both my cards pretty soon.


----------



## Alex132

Quote:


> Originally Posted by *wholeeo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> The entire T6 torx is inside is basically circular...
> 
> 
> 
> Fill the hole with a drop of crazy glue or something and make it stick to a torx
Click to expand...

Pretty sure the glue would break, but I can try...


----------



## FiShBuRn

I had the same problem removing the stock cooler; one screw got stripped. I used a drill to remove it... I was a little scared... but in the end it worked out fine!


----------



## Alex132

Quote:


> Originally Posted by *FiShBuRn*
> 
> I had the same problem to remove the stock cooler, one screw got stripped, ive used a drill to remove it...i was a little scared....but in the end it worked out fine!


What size drill bit did you use?


----------



## PCModderMike

Quote:


> Quote:
> 
> 
> 
> Originally Posted by *FiShBuRn*
> 
> I had the same problem to remove the stock cooler, one screw got stripped, ive used a drill to remove it...i was a little scared....but in the end it worked out fine!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> size of the drill bit used?
> 
> Click to expand...
Click to expand...

Hmmm I would be real iffy about trying that one...but maybe 1/16" bit would do it.


----------



## filty

image.jpeg 1912k .jpeg file


----------



## mtbiker033

Quote:


> Originally Posted by *filty*
> 
> image.jpeg 1912k .jpeg file


welcome to OCN!


----------



## filty

THX !!!


----------



## Arizonian

Quote:


> Originally Posted by *filty*
> 
> image.jpeg 1912k .jpeg file


Ditto - congrats on the GTX 690.

Way to first post on OCN - welcome aboard.

Love to see your specs - around here we're all about nerding out like that.

*"How to put your Rig in your Sig"*


----------



## flv1333

Hiho!

I got myself a 690 a few months back; I'm currently running it with a 3570K @ 4.6GHz. Would there be a good increase in graphics performance if I were to switch to a 3770K (i.e. to feed the card more)? I play a lot of mainstream games, but I'm currently stuck on DayZ. Would an upgrade be worth it, seeing as the price has dropped a bit here?

Anyway Ill be posting a pic when i get home! I would like to join up as well


----------



## PCModderMike

For gaming, you're not really going to see much of a performance increase going from a 3570K to a 3770K... I would stick with what ya got. That money could be used for future upgrades.


----------



## zalbard

Quote:


> Originally Posted by *flv1333*
> 
> Hiho!
> 
> I got myself a 690 a few months back; I'm currently running it with a 3570K @ 4.6ghz. Would there be a good increase in graphics performance if I were to switch to a 3770K? (ie... to feed the card more) I play alot of mainstream games, but currently stuck on DayZ. Would an upgrade be worth it? Seeing as the price has dropped a bit here.


You would see little to no increase. Definitely not worth it.


----------



## virgis21

Quote:


> Originally Posted by *flv1333*
> 
> Hiho!
> 
> I got myself a 690 a few months back; I'm currently running it with a 3570K @ 4.6ghz. Would there be a good increase in graphics performance if I were to switch to a 3770K?...


I went from a 3570K (4.8GHz OC) to a 3770K (4.4GHz) and can't see any difference. If I had the two machines side by side, maybe. But it isn't worth selling a 3570K to buy a 3770K; the CPUs are so similar, it's just HT added to the same 4 cores. Everything seems very much the same.


----------



## Lukas026

hey guys

i need your help once again

Today one of the fan blades broke on my Arctic Accelero 690, so I am going to RMA it soon. Now I need to put the stock cooler back on so I can run the card at all. I know the process of putting TIM on the cores, but I am a little bit confused about the pads that were on the memory / VRM / other stuff. I took them off when I was fitting the Arctic cooler and threw them away. Now I am scared to run the card with the stock cooler but without them. Any ideas what I can do? Or is there a shop somewhere where I can buy pads for my 690?

Thank you for your answers. OCN never let me down


----------



## PCModderMike

Quote:


> Originally Posted by *Lukas026*
> 
> hey guys
> 
> i need your help once again
> 
> today one of my fan blades broke on my Arctic Accelero 690 so I am going to RMA it soon. Now I need to put on my stock cooler again so I can run the card at all. I know the process of putting TIM on cores but I am a little bit confused with the pads / things which was on memory / VRM / other stuff. I took them down when i was placing my Arctic cooler and throw them away. Now I am scared to run the card with stock cooler on but without them. Any ideas what I can do with it ? Or is there somewhere a shop where I can buy pads for my 690 ?
> 
> Thank you for your answers. OCN never let me down


You do not want to run the stock cooler without any thermal pads. For the memory, replacement pads would be 0.5mm. For the VRMs you would want to use 1.0mm.
Some options for 1.0mm can be found here - LINK
Some options for 0.5mm can be found here - LINK


----------



## LeandroJVarini

I believe this question has already been answered here, but I couldn't possibly read all 459 pages!
Is there any possibility of the card running 3 screens at 120Hz? I have 3 120Hz screens and would like to use just one GTX 690 to run all 3. I know that to run the 3 screens it would be better to have two cards, but since this is for a new mini-ITX PC it will have to be just one card.

thks!!!!


----------



## justanoldman

Quote:


> Originally Posted by *LeandroJVarini*
> 
> I believe that this question has already been answered here but I could not even read all 459 pages!
> There is a possibility the board run 3 screens in 120hz? I have 3 screens 120hz and would like to use just a GTX690 to run the 3 screens, I know that to run the 3 screens would be better to have two VGAs these but in the case of the new mini itx pc will then have to be just a VGA
> 
> thks!!!!


If you are talking about a single 690 running three screens at 120Hz in general, that is obviously not an issue. However, if you want to do three-monitor surround gaming and get 120 fps, that is not going to happen. In the Valley 1.0 thread no one has even gone above 97 fps with any GPUs, including four Titans and four 7970s at 5760x1080.

I have run three-monitor gaming on my one 690 and it does work, but you have to turn down settings and you won't get a really high fps.


----------



## Lukas026

OK guys, I made a pic of how I would put these pads on my PCB. Is it fine, or is there some problem?

I would rather put down only LINES (shown in the picture) and not cut squares for each memory chip / VRM etc.

Thanks OCN, and also TPU for the reference image of the PCB.


----------



## PCModderMike

nvm


----------



## LeandroJVarini

Quote:


> Originally Posted by *justanoldman*
> 
> If you are talking about a single 690 running three screen at 120hz in general, that is obviously not an issue. However, if you want to do three monitor surround gaming and get 120 fps then that is not going to happen. In the Valley 1.0 thread no one has even gone above 97 fps with any gpus, including four Titans and four 7970 at 5760x1080.
> 
> I have run my one 690 and three monitor gaming and it does work, but you have to turn down settings and you won't get a real high fps.


thks man!!
rep+!


----------



## noob.deagle

Quote:


> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PCModderMike*
> 
> Turning up the AA certainly does cause a big drop in FPS...but even at 2560x1440 with 4x I'm not getting close to using 2GB of VRAM.
> PhysX was on for all of these
> Memory usage is shown in the on screen display in the upper left hand corner.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *No AA*
> [
> 
> 
> 
> 
> 
> 
> Hmmmm, interesting.
> 
> Wonder why 4x is such a massive FPS hit then?
Click to expand...

Quote:


> Originally Posted by *PCModderMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Hmmmm, interesting.
> 
> Wonder why 4x is such a massive FPS hit then?
> 
> 
> 
> Yea the performance hit at 4x is just way too much for me. 2x seems to be the sweet spot for my setup
Click to expand...

REMEMBER, that isn't MSAA, it's supersampling AA (SSAA).

So 2x renders the game at twice the pixel count of your resolution, then downscales it.

So 4x at 1080p means the game is being rendered at roughly four times the pixel count, around 3840x2160. That's why it takes such a huge hit on performance.
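The arithmetic above can be sketched in a few lines, assuming "Nx SSAA" means N times the pixel count (so a sqrt(N) scale per axis, which is what the 2x description implies):

```python
# Internal render resolution under SSAA, assuming "Nx" means N times the
# pixel count (sqrt(N) scale per axis), then downscaled to the display.
import math

def ssaa_resolution(width, height, factor):
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

print(ssaa_resolution(1920, 1080, 4))   # (3840, 2160): 4x the pixels of 1080p
print(ssaa_resolution(1920, 1080, 16))  # (7680, 4320): it takes 16x to reach that
```

So 4x SSAA at 1080p is roughly 3840x2160 internally; 7680x4320 would be a 16x factor. Either way it's quadruple-or-more the shading work, which explains the halved framerate.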

Sent from my Nexus 4 using Tapatalk 4 Beta


----------



## Lukas026

guys please take a look at my picure above and tell if its safe to use thermal pads this way. don't want to damage such an expensive card









post #4586


----------



## justanoldman

Quote:


> Originally Posted by *Lukas026*
> 
> guys please take a look at my picure above and tell if its safe to use thermal pads this way. don't want to damage such an expensive card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> post #4586


What you have is slightly different than these:
EK:
http://www.ekwb.com/shop/EK-IM/EK-IM-3831109856529.pdf
Heatkiller:
http://watercool.de/sites/default/files/downloads/MA_GPU-X3_GTX_690_A5.pdf

But your pic is pretty close. Might want to look over those two pdf documents and compare.
You could also contact Asus and double check what they recommend for putting the stock cooler back.


----------



## Professional

Is it true that someone on another website said that 2 690s will outperform 2 Titan in every single way?


----------



## justanoldman

Quote:


> Originally Posted by *Professional*
> 
> Is it true that someone on another website said that 2 690s will outperform 2 Titan in every single way?


People can say whatever they want. In my opinion one 690 beats one Titan, but because of poor scaling for four gpus, I would say two Titans has the advantage over two 690s.


----------



## Professional

Quote:


> Originally Posted by *justanoldman*
> 
> People can say whatever they want. In my opinion one 690 beats one Titan, but because of poor scaling for four gpus, I would say two Titans has the advantage over two 690s.


Well, that's what I replied to him. I also think the 690 has some strengths over the Titan, but the Titan still has its strengths over the 690, and 2 Titans will have the winning points over 2 690s. I just don't believe it's "in every single way"; maybe true in some ways, but not in every dual-card or SLI way, hehehe


----------



## x2ezx

Hi, I don't know if this is a great deal, but someone on another forum is selling this EVGA GTX 690 for $625. I have an Asus GTX 580; should I buy it, or buy a GTX 780? The card was bought 6 months ago.

Thanks!


----------



## Professional

Quote:


> Originally Posted by *x2ezx*
> 
> hi, I dont know if this is a great deal but someone on another forum sell this EVGA GTX 690 for 625$. I have a Asus GTX 580, should buy it or buy a GTX 780. The card was buy 6 mouth ago.
> 
> Thanks!


GTX 780


----------



## justanoldman

Quote:


> Originally Posted by *x2ezx*
> 
> hi, I dont know if this is a great deal but someone on another forum sell this EVGA GTX 690 for 625$. I have a Asus GTX 580, should buy it or buy a GTX 780. The card was buy 6 mouth ago.
> 
> Thanks!


In the Valley thread the 780 is 28% below my 690, so for that price I would definitely look at the 690. However it would depend on if the warranty transferred and it was a good overclocker.

If you will SLI the 780, and need that much horsepower then that is a consideration also. But that sounds like a good price for the 690.


----------



## Professional

I have a feeling that member is selling his EVGA 690 to get a GTX 780


----------



## x2ezx

He just sold his 690 SLI for an SLI of Titans.
Thanks for the answer.


----------



## wholeeo

I'd jump all over a 690 for $625.


----------



## pwspong

Hi OCN,

anyways

GTX 690 EVGA

When I use two monitors I have to use the top-right and the bottom connector (looking at the back of the computer) for them to work. If I go back to a single monitor I still just leave it in the top-right, because I get no signal if I put it in the top-left. To get this to work (probably a driver error) I would have to go back to 320.18, but I downgraded after I heard that driver was frying people's GPUs.

Is it weird that I can't use the top-left (no signal), and does it matter which of the top connectors you use with a single monitor?


----------



## propeldragon

Does overclocking the memory do anything for the 690? When I ran the Tomb Raider benchmark it only gave me like 2 fps more when I raised it 500MHz, but maybe it helps more in other games? I only use one 1080p screen.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *propeldragon*
> 
> does overclocking the memory do anything for the 690? when i did a benchmark for tomb raider it gave me like 2 fps better when i raised it 500mhz but maybe for other games it helps more? i only use 1 1080p screen.


You answered your own question.


----------



## propeldragon

ok i guess


----------



## PinzaC55

Hi all, I just bought this EK FC690 GTX CSQ waterblock and backplate on eBay, and I was wondering: can anyone here point me to some good photos of GTX 690's cooled by one of these?


----------



## Alex132

Quote:


> Originally Posted by *PinzaC55*
> 
> Hi all just bought this EK FC690 GTX CSQ waterblock and backplate on ebay and I was wondering can anyone here point me to some good photos of GTX 690's cooled by one of these?


>>google


Spoiler: Warning: Spoiler!



In all seriousness though, I couldn't find any really nice ones. That's the problem with having a rare card, and then compounding it with watercooling!

So, give us some awesome pictures!


----------



## PinzaC55

Lol it only cost me £10.50 (new and unused) so I am in no hurry! I just want to get the aesthetics right. I will be using chrome fittings (either Koolance or Bitspower) and plumbing it into my existing 2 radiator loop.


----------



## PCModderMike

Quote:


> Originally Posted by *PinzaC55*
> 
> Hi all just bought this EK FC690 GTX CSQ waterblock and backplate on ebay and I was wondering can anyone here point me to some good photos of GTX 690's cooled by one of these?


Here - Click


----------



## Lukas026

OK guys, I am still not sure about the pads. I found out that some sites even recommend using TIM on the PLX chip.

Here is my latest picture of how I would do it now:

Please, someone who has experience with this, help.

Thanks


----------



## Lukas026

Please guys, help...

I don't want to mess up a card that cost so many bucks...


----------



## xToFxREAPER

Quote:


> Originally Posted by *Lukas026*
> 
> please guys help
> 
> 
> 
> 
> 
> 
> 
> 
> 
> dont want to mess up card for so many bucks...


Follow the instructions that came with the waterblock is my best advice to you... usually they provide a specific set of instructions for a reason lol


----------



## Lukas026

I am not going to use a waterblock lol. I need help reseating the stock cooler and the pads under it. The thickness of the pads on the VRAM and VRMs is my main question...


----------



## xToFxREAPER

Quote:


> Originally Posted by *Lukas026*
> 
> I am not going to use waterblock lol . I need help reseating the stock cooler and pads under it. Thickness of pads on vram and vrms is my main question....


Ohh, my bad lol, I haven't had a chance to read much lately. The only thing I used pads on was the VRAM... everything else I used paste on. As for pad thickness, what that picture states should be correct to the best of my knowledge.


----------



## Lukas026

np, I will try it tomorrow and see if it fits. I hope so.

Any other replies are still appreciated, though...


----------



## justanoldman

Quote:


> Originally Posted by *Lukas026*
> 
> please guys help
> 
> 
> 
> 
> 
> 
> 
> 
> 
> dont want to mess up card for so many bucks...


Your last pic is how the Heatkiller block is setup, as shown in the link I provided previously, so I believe you are fine with that. But as I posted earlier, if for some reason you are not comfortable with that then you need to contact Asus directly and ask.


----------



## Lukas026

In fact they are the opposite: 1mm for the VRAM and 0.5mm for the VRMs and PLX.

I read that info on some EVGA page...


----------



## yknot

Quote:


> Originally Posted by *justanoldman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Lukas026*
> 
> please guys help
> 
> 
> 
> 
> 
> 
> 
> 
> 
> dont want to mess up card for so many bucks...
> 
> 
> 
> Your last pic is how the Heatkiller block is setup, as shown in the link I provided previously, so I believe you are fine with that. But as I posted earlier, if for some reason you are not comfortable with that then you need to contact Asus directly and ask.
Click to expand...

Aren't Asus cards just rebranded Nvidia reference cards? Maybe a request to Nvidia would be better?... maybe









I have refitted the original cooler on a 690 (yes, 2 months on, still working, no problems), but I'm thinking it's difficult to give advice in case "Lukas026" ends up with a very expensive fly swatter.

I'm just guessing, however. No offence meant.


----------



## Lukas026

hey guys

Bad news everyone. After fitting the thermal pads and re-seating the cooler, guess what happened: I switched on the computer and got a black screen the whole time. I tried plugging the card into another PC, same thing. The same problem I had with the MSI GTX 690.

So now I have to RMA it again and see what happens. I really love the 690, but come on, RMAing two $999 cards in 6 months?

And I am sure it's not my fault, because I tinkered with the pads for quite some time and everything fitted perfectly.

So now another question pops up: if they accept the RMA and return my money, is the 690 still the best option for gaming at 1200p? Or should I switch to a Titan? Or maybe something else, like 1x 780 or 2x 770 / 760?

Let me know your opinions, and thanks all for the help with my thermal pad problem...


----------



## xToFxREAPER

Quote:


> Originally Posted by *Lukas026*
> 
> hey guys
> 
> bad news everyone. after using thermal pads and re-seating cooler, guess what happens. I switched on the computer and black screen all the time. I tried to plug the card to another pc and same thing. The same problem as I had with MSI 690 gtx.
> 
> So now I have to again RMA it and see what happens. I realy love the 690 but cmon, RMAing 2 cards for 999 bucks in 6 months ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I am sure it's not my fault, becouse I tinkered with pads for quite some time and all fitted perfectly.
> 
> So now here is another question popping in ? If they accept RMA and return money to me, is 690 still the best option for gaming on 1200p ? Or should I switch to Titan ? Or maybe something else like 1x 780 or 2x 770 / 760 ?
> 
> Let me know your opinions and thanks all for help with my thermal pad problem...


Personally I'm gonna get two 4GB 770s when I sell my 690, then watercool them; they have enough GPU muscle and a good chunk of VRAM.


----------



## Imprezzion

Guys, I'm trying to flash my buddy's GTX 690 (ASUS) so we can get some more volts and the power limit raised.

Only, nvflash keeps kicking up a Hierarchy Mismatch when trying to flash it (GPU0) using nvflash -i0 0.rom.
nvflash --protectoff was done first of course, and the BIOS was saved per GPU with GPU-Z and edited with Kepler BIOS Tweaker.

A little help, guys?
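For reference, the per-GPU sequence described in the post (`--protectoff` once, then `-i<index> <rom>` per GPU) can be laid out as a sketch. This only builds the command lines; the `run_plan` helper is my own illustration, executing is off by default, and actually flashing a dual-GPU card is entirely at your own risk:

```python
# Sketch of the per-GPU nvflash sequence described above; flags are taken
# from the post (--protectoff, then -i<index> <rom>). Commands are only
# printed unless dry_run=False, since real flashing needs nvflash + nerve.
import subprocess

def build_flash_plan(roms):
    """roms: {gpu_index: path_to_edited_rom} -> list of nvflash argv lists."""
    plan = [["nvflash", "--protectoff"]]
    for index, rom in sorted(roms.items()):
        plan.append(["nvflash", f"-i{index}", rom])
    return plan

def run_plan(plan, dry_run=True):
    for argv in plan:
        print(" ".join(argv))
        if not dry_run:
            subprocess.check_call(argv)  # deliberately disabled by default

if __name__ == "__main__":
    run_plan(build_flash_plan({0: "0.rom", 1: "1.rom"}))
```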


----------



## justanoldman

Quote:


> Originally Posted by *Imprezzion*
> 
> Guys, i'm trying to flash my buddy's GTX690 (ASUS) so we can get some more volts and power limit raised.
> 
> Only, nvflash keeps kicking up a Hierarchy Mismatch when trying to flash it (GPU0) using nvflash -i0 0.rom.
> nvflash --protectoff is done ofcourse and BIOS is saved per GPU with GPU-Z and editted with KeplerBiosTweaker.
> 
> Little help guys?


I can't help with flashing problems, but I can say that no one who has ever tried a modded BIOS with a 690 saw an increase in performance.


----------



## Imprezzion

Aight, then we'll probably have to let it go... It's already a great sample, doing well in excess of 1200MHz in 3DMark 11 (it completed the full suite at 1215MHz).


----------



## justanoldman

Quote:


> Originally Posted by *Imprezzion*
> 
> Aight. Then we probably have to let it go.. It's already a great sample doing well in excess of 1200Mhz in 3dmark 11. (completed full suite at 1215Mhz)


Nice.

How far can you OC the memory? I have the highest score in the Valley 1.0 thread so far with a 690, but that bench likes a high memory OC. Post a score over there with your max OC, and read the tweak guide in the OP to get the best score. My 3DMark score was a little over 18k.


----------



## Imprezzion

We were aiming for 18K, but his CPU wasn't fully overclocked yet: a 2600K @ 4.8. It did 172xx with 1202MHz core and 6500MHz VRAM. We didn't try pushing the VRAM yet since it has a habit of bouncing off the power limit in GT1/GT3.

I had/have the highest GTX 670 score in the Valley thread for quite a while, but I sold the card. It was a Gigabyte Windforce 3X that did 1391MHz core on a 1.212v BIOS in Valley.

His CPU is fully watercooled and it can do 5400 4c/8t if it really has to, with enough volts...


----------



## wack3d

Am I just the unluckiest 690 owner on earth? I can't even overclock the core 95MHz????

I started by trying:
EVGA Precision
power target: 135%
GPU clock offset: +100MHz (since everybody else doesn't seem to have a problem running 100+)
mem clock offset: +0
fan set to 90

and Valley would just go to a black screen (the music keeps playing and I can alt-tab to close it) pretty much straight away... temps were around 60+, voltage 1,75, load 90-100.

Then I tried updating the BIOS on my Extreme4 motherboard since it was old, and dropped my CPU overclock to stock (i5 3570).

Ran Valley with +75MHz and it finished a benchmark with no crash; upped it to 85 and ran the benchmark again; upped it to 95, it ran 10 seconds, then black screen.......

Do I just have the worst 690 on earth, or could something else be wrong with my computer?


----------



## justanoldman

^ That would be a little lower than average, but the boost can vary a bit. What is your actual gpu clock as shown in Precision? 1189 is very good, 1202 is extremely good in Valley, and the average card would be well below that. See how well you can oc the memory. In the cards I have seen, usually they can do in the +90 to +150 range on the core, but on memory they can go +600 or higher on some.

Valley is not an easy benchmark, I can run some games and 3dmark11 with a higher oc than Valley allows. The 690 is a powerful card at stock settings, there is no real need to oc it much other than trying for better benchmark scores. Also, water cooling helps the oc a little, and keeping an oced card under the 70c throttle point is not easy on air.


----------



## wack3d

stock when running valley seems to be around: gpu1 1058, gpu2 1071
oc'd with +85 it's 1110-1124 - 1137 - 1150; it swings a lot between the clocks with the oc..... and gpu1 always seems to be around 13mhz lower than gpu2?


----------



## max883

Sorry my GTX 690 owners, but I have to leave you now! Got myself a pair of TITANS!!!


----------



## yknot

Quote:


> Originally Posted by *wack3d*
> 
> am i just the unluckiest 690 owner on earth? i cant even overclock core 95mhz????
> 
> 
> 
> 
> 
> 
> 
> .........................


My 690 would only go to 90-95 as well but water cooling got me to +155 on both cores with vram up to about +800, the other settings like yours..........I know that's not a record but it seems to make a difference in various scenarios.

Also, as "justanoldman" says, modded bioses on a 690 are not much good........................because Nvidia has ensured that 690s cannot be modded like the other 6-series cards, due to the inordinate number of previous-gen 590s that were RMAed for excessive voltage modding.

I'm sure somebody might have got sumfink out of the Kepler bios editor but my mod did not do anything much........just increased the power limit.

http://forums.guru3d.com/showthread.php?t=362667


----------



## xToFxREAPER

Quote:


> Originally Posted by *wack3d*
> 
> am i just the unluckiest 690 owner on earth? i cant even overclock core 95mhz????
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i started by trying:
> evga
> power target:135%
> gpu clock offset +100mhz (since everbody dont seem to have a problem running 100+)
> mem clock offset +0
> fan set to 90
> 
> and valley would just go to black screen(music keep playing and i can alt-tab to close) pretty much straight away... temp were around 60+. voltage 1,75 load 90-100
> 
> then i tried updating my bios on my extreme4 mb since it was old and dropped my cpu overclock to stock (i5 3750)
> 
> ran valley with 75mhz it ran a benchmark with no crash upped to 85 ran benchmark again upped to 95 ran 10 sec then black screen.......
> 
> do i just have the worst 690 on earth or could something else be wrong with my comp?


i must say that you are getting some pretty good boosts all things considered. though it does seem that your card is a poor overclocker, it's not as bad as you might think, because the boost is pretty decent







on mine i can get +150 on the core, but when it boosts it's about 1202, and that's currently on air, so you're not too far behind. in terms of in-game performance that translates to MAYBE 5fps at the very most, so i wouldn't worry too much about how well your card oc's; it really doesn't matter much considering the 690 is already a beast. i actually run my card at stock unless i'm benching, cause that's the only time it matters


----------



## justanoldman

Quote:


> Originally Posted by *yknot*
> 
> My 690 would only go to 90-95 as well but water cooling got me to +155 on both cores with vram up to about +800, the other settings like yours..........I know that's not a record but it seems to make a difference in various scenarios.
> 
> Also, as "justanoldman" says, modded bioses on a 690 are not much good........................because Nvidia has ensured that 690s cannot be modded as other 6 series due to the inordinate number of previous gen 590s that were RMAed due to excessive voltage modding.
> 
> I'm sure somebody might have got sumfink out of the Kepler bios editor but my mod did not do anything much........just increased the power limit.
> 
> http://forums.guru3d.com/showthread.php?t=362667


+155 and +800? That is very good. Can I ask why you have not put in a score for the Valley 1.0 thread here? You would have one of, if not the best, scores if your chip is oced high as well.
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0-fill-the-form


----------



## tin0

Quote:


> Originally Posted by *justanoldman*
> 
> Nice.
> 
> 
> 
> 
> 
> 
> 
> 
> How far can you oc the memory? I have the highest score in the Valley 1.0 thread here so far with a 690, but that bench likes a high memory oc. Post a score over there with your max oc, and read the tweak guide in the op to get the best score. My 3dmark score was a little over 18k.


This is the card in question;

http://www.hwbot.org/submission/2396072_
Asus GTX690 on standard bios @ 1202MHz core, 7200MHz mem (GPU-Z doesn't read the clocks correctly). Now for that freakin' hierarchy mismatch thingy to disappear...

Gonna be cooled by this next week;


Will post a heaven score as soon as the block is on


----------



## yknot

Quote:


> Originally Posted by *justanoldman*
> 
> +155 and +800? That is very good. Can I ask why you have not put in a score for the Valley 1.0 thread here? You would have one of, if not the best, scores if your chip is oced high as well.
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0-fill-the-form


I've been concentrating on my mbd/cpu optimisations atm.............I noticed the low 90-95 scores and thought that the poster might want to try water cooling. I do not want to hog the thread, especially as quite a few ppl are not happy about ocking a card that is expensive and fast enough at stock settings, and doesn't yield enormous FPS gains in games when ocked.

The big deal for me is the substantial drop in temperature and the resultant lack of throttling of the cores.


----------



## justanoldman

Quote:


> Originally Posted by *yknot*
> 
> I've been concentrating on my mbd/cpu optimisations atm.............I noticed the low 90-95 scores and thought that the poster might want to try water cooling. I do not want to hog the thread, especially as quite a few ppl are not happy about ocking a card that is expensive and fast enough at stock settings, and doesn't yield enormous FPS gains in games when ocked.
> 
> The big deal for me is the substantial drop in temperature and the resultant lack of throttling of the cores.


Not sure I understand the issue of ocing an expensive card; there is nothing you can do with a 690 oc that can hurt the card. Other cards that allow modded bios and voltage increases are vulnerable, however.

I understand not ocing the card in normal use, I just think it is interesting to see how far you can take it for a benchmark run. And as for hogging this thread, it would be more interesting if someone did; there are not many of us 690 owners left, and that is one reason this thread is so quiet.


----------



## Alex132

So, does the R320.49 drivers make any improvements that you guys have noticed?


----------



## justanoldman

Quote:


> Originally Posted by *Alex132*
> 
> So, does the R320.49 drivers make any improvements that you guys have noticed?


The move from 314.22 to the 320.x gave me an improvement of a few fps, but then they had all the problems reported with 320.18. The 320.49 is supposed to address the problems with 320.18, but it doesn't give much performance increase over the other 320.x ones.


----------



## PinzaC55

Just curious, I have the EK waterblock for my 690 and the Bitspower fittings should arrive this week. When I swap the fan for the waterblock I will lose the fan's power consumption. How much power does it use?


----------



## yknot

Quote:


> Originally Posted by *justanoldman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *yknot*
> 
> I've been concentrating on my mbd/cpu optimisations atm.............I noticed the low 90-95 scores and thought that the poster might want to try water cooling. I do not want to hog the thread, especially as quite a few ppl are not happy about ocking a card that is expensive and fast enough at stock settings, and doesn't yield enormous FPS gains in games when ocked.
> 
> The big deal for me is the substantial drop in temperature and the resultant lack of throttling of the cores.
> 
> 
> 
> Not sure I understand the issue of ocing and an expensive card, there is nothing you can do with a 690 oc that can hurt the card. Other cards that allow modded bios and voltage increases are vulnerable however.
> 
> I understand not ocing the card in normal use, I just think it is interesting to see how far you can take it for a benchmark run. And as for hogging this thread, it would be more interesting if someone did, there are not many of us 690 owner's left, that is one reason this thread is so quiet.
Click to expand...

I'm very happy to oc my card and try for a benchmark. I was trying not to go overboard with my enthusiasm for water cooling a 690.


----------



## justanoldman

Quote:


> Originally Posted by *yknot*
> 
> I'm very happy to oc my card and try for a benchmark. I was trying not to go overboard with my enthusiasm for water cooling a 690.


+1, one of the best things I have done is put my video card under water. It was pretty comical when I had the case open, the 690 fan maxed, and a floor fan blowing on the card just to keep it under 70c for a benchmark run. With a chip at 5.0 and the 690 max oc, my rig now doesn't get hot or make noise. Can't ask for more than that.


----------



## Koniakki

Quote:


> Originally Posted by *PinzaC55*
> 
> Just curious, I have the EK waterblock for my 690 and the Bitspower fittings should arrive this week. When I swap the fan for the waterblock I will lose the fan's power consumption. How much power does it use?


Well, the Accelero Twin Turbo 690 from AC consumes 5.04 watts with its two fans, so I'm inclined to say that the stock cooler of the GTX 690 probably consumes less than that..


----------



## Koniakki

Quote:


> Originally Posted by *justanoldman*
> 
> +1, one of the best things I have done is put my video card under water. It was pretty comical when I had the case open, the 690 fan maxed, and a floor fan blowing on the card just to keep it under 70c for a benchmark run. With a chip at 5.0 and the 690 max oc, my rig now doesn't get hot or make noise. Can't ask for more than that.


Comical you say?? You should have seen me benching Valley/3DMark/Heaven etc. at 3-4am (because it gets cold then) with the below setup.
I can assure you comical doesn't cut it. It was actually a bit sad tbh! lol!









Freezer + 18" Floor Fan + HAF X open side= Pure Win! or is it Fail?


----------



## Seanimus

Running 690 Hydro + 120HZ. It is awesome.
Everything ULTRA in WoW. 25 Man dungeons in WOW with all the Mage and Blizzard AOE's is pretty fun to watch, without lagging.
Temperatures at around 30C with hours of game time, usually around 90-100 FPS. I think the lowest I saw was 40 FPS, and there was no party/raid/blizzard/aoe activity in game; I was just sitting solo on my mount, don't know what happened there...lol


----------



## Lukas026

just a quick question guys









i am trying to get my hands on an EVGA 690 backplate, but it seems they will not make it anymore. so I am thinking: can I use the backplate from the EK WB with the reference air cooler? has anybody tried it?

here is a link to EK WB:

http://www.ekwb.com/shop/ek-fc690-gtx-backplate-black.html

thank you


----------



## PinzaC55

Quote:


> Originally Posted by *Lukas026*
> 
> just a quick question guys
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i am trying to get my hands on an EVGA 690 backplate, but it seems they will not make it anymore. so I am thinking: can I use the backplate from the EK WB with the reference air cooler? has anybody tried it?
> 
> here is a link to EK WB:
> 
> http://www.ekwb.com/shop/ek-fc690-gtx-backplate-black.html
> 
> thank you


Yes you can use it but you need a supply of six M2.5 x 7 or 8mm countersunk head screws. The screws as supplied by EK won't fit the holes on the reference fan cooler. Apart from that you install it as if it was fixing to the waterblock.


----------



## svpam92

Hello Ocn'ers. I just bought a GTX 690 off a fellow guy here. It runs perfectly at stock, but when I tried to do some mild overclocking (135% power target, +150 core clock, +50 memory clock, settings that were perfectly stable for the previous owner), I got crashes/complete system freezes when playing Tomb Raider 2013. I don't really know how to overclock this card, since I went from AMD to NVIDIA. I went through the OP to see if there are any guides on overclocking, but unfortunately not. If someone could help me with a mild overclock, it would be very much appreciated. By the way, my system specs are:

i7-2600k @4.8 stable
Kingston HyperX blu 8gb 1333
Asus Sabertooth P67
Corsair AX750
Asus GTX690
Cooler Master HAFX(modded)

Here is the picture, if OP can add me to the club, happy to be green!


----------



## justanoldman

Quote:


> Originally Posted by *svpam92*
> 
> Hello Ocn'ers. I just bought a GTX 690 off a fellow guy here. It runs perfectly at stock, but when I tried to do some mild overclocking (135% power target, +150 core clock, +50 memory clock, settings that were perfectly stable for the previous owner), I got crashes/complete system freezes when playing Tomb Raider 2013. I don't really know how to overclock this card, since I went from AMD to NVIDIA. I went through the OP to see if there are any guides on overclocking, but unfortunately not. If someone could help me with a mild overclock, it would be very much appreciated. By the way, my system specs are:
> 
> i7-2600k @4.8 stable
> Kingston HyperX blu 8gb 1333
> Asus Sabertooth P67
> Corsair AX750
> Asus GTX690
> Cooler Master HAFX(modded)
> 
> Here is the picture, if OP can add me to the club, happy to be green!


With a 690, +150 on the core is not mild, it is actually quite huge. It is pretty easy to oc the card.

Keep it at 135%, put the mem oc to +0, and start with +90 on the core. Run Valley 1.0 at Extreme HD settings and write down your score after each run. If you make it through to the score after 18 scenes, then up the core another 10 to +100. Keep doing that until anything crashes or your score goes down. Make sure to restart your computer after any driver or program crash.

When you have found your core oc that is stable, then start with +200 on the memory and do the same procedure as above, except keep the core the same and add another 50 to the mem oc. Keep doing that until you crash something, or your score goes down.

After that you can refine both your core and mem oc to fine tune it. Each program or game will have a max stable oc though. 3dmark11 allows a higher oc than Valley 1.0, and Heaven 4.0 needs an even lower oc usually. BF3, Crysis 3, FC3 are all known to be a little tougher than the average game.

The card will throttle at 70C, and the more you oc it the hotter it will get, which is why going under water helps. An average 690 should do about +90 to +120 on core, and maybe +300 to +500 on the memory.

Don't worry, you can't hurt these cards by ocing this way. A driver or program crash from too high an oc on a 690 is no big deal.
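Just to illustrate the step-up procedure, here is a little Python sketch of the search logic. To be clear, `run_benchmark` is a made-up stand-in for "set the offset in Precision and complete a Valley run"; there is no real GPU API call here, it only shows the loop you would do by hand:

```python
# Sketch of the step-up overclock search described above.
# run_benchmark(offset) is a hypothetical stand-in: it should return
# (finished_ok, score) for a full benchmark pass at that core offset.

def find_stable_offset(run_benchmark, start=90, step=10, limit=300):
    """Raise the offset in steps until a run crashes or the score drops,
    then return the last offset that passed and its score."""
    best_offset, best_score = None, 0.0
    offset = start
    while offset <= limit:
        ok, score = run_benchmark(offset)
        if not ok or score < best_score:
            break  # crash or score regression: keep the previous offset
        best_offset, best_score = offset, score
        offset += step
    return best_offset, best_score

# Example with a fake card that crashes above +120:
def fake_bench(offset):
    return (offset <= 120, float(offset))

print(find_stable_offset(fake_bench))  # -> (120, 120.0)
```

The same loop works for the memory oc: start at +200 and step by 50, keeping the core offset fixed at whatever you found first.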


----------



## svpam92

Quote:


> Originally Posted by *justanoldman*
> 
> With a 690, +150 on the core is not mild, it is actually quite huge. It is pretty easy to oc the card.
> 
> Keep it at 135%, put the mem oc to +0, and start with +90 on the core. Run Valley 1.0 at Extreme HD settings and write down your score after each run. If you make it through to the score after 18 scenes, then up the core another 10 to +100. Keep doing that until anything crashes or your score goes down. Make sure to restart your computer after any driver or program crash.
> 
> When you have found your core oc that is stable, then start with +200 on the memory and do the same procedure as above, except keep the core the same and add another 50 to the mem oc. Keep doing that until you crash something, or your score goes down.
> 
> After that you can refine both your core and mem oc to fine tune it. Each program or game will have a max stable oc though. 3dmark11 allows a higher oc than Valley 1.0, and Heaven 4.0 needs an even lower oc usually. BF3, Crysis 3, FC3 are all known to be a little tougher than the average game.
> 
> The card will throttle at 70C, and the more you oc it the hotter it will get, which is why going under water helps. An average 690 should do about +90 to +120 on core, and maybe +300 to +500 on the memory.
> 
> Don't worry, you can't hurt these cards by ocing this way. A driver or program crash from too high an oc on a 690 is no big deal.


Damn! That is so detailed! Thank you so much, sir. Repped for ya!


----------



## Lukas026

Quote:


> Originally Posted by *PinzaC55*
> 
> Yes you can use it but you need a supply of six M2.5 x 7 or 8mm countersunk head screws. The screws as supplied by EK won't fit the holes on the reference fan cooler. Apart from that you install it as if it was fixing to the waterblock.


oh can you please link me some shop where I can find them ? I am not sure what countersunk exactly means









my english is not so great...

thanks

PS: great guide for OCing from justanoldman

/bow


----------



## Arizonian

Quote:


> Originally Posted by *svpam92*
> 
> Hello Ocn'ers. I just bought a GTX 690 off a fellow guy here. It runs perfectly at stock, but when I tried to do some mild overclocking (135% power target, +150 core clock, +50 memory clock, settings that were perfectly stable for the previous owner), I got crashes/complete system freezes when playing Tomb Raider 2013. I don't really know how to overclock this card, since I went from AMD to NVIDIA. I went through the OP to see if there are any guides on overclocking, but unfortunately not. If someone could help me with a mild overclock, it would be very much appreciated. By the way, my system specs are:
> 
> i7-2600k @4.8 stable
> Kingston HyperX blu 8gb 1333
> Asus Sabertooth P67
> Corsair AX750
> Asus GTX690
> Cooler Master HAFX(modded)
> 
> Here is the picture, if OP can add me to the club, happy to be green!


Very nice work inside that tower cooling and PSU cabling.









How to list your rig in my signature links for OCN.


----------



## PinzaC55

Quote:


> Originally Posted by *Lukas026*
> 
> oh can you please link me some shop where I can find them ? I am not sure what countersunk exactly means
> 
> 
> 
> 
> 
> 
> 
> 
> 
> my english is not so great...
> 
> thanks
> 
> PS: great guide for OCing from justanoldman
> 
> /bow


I've no idea which country you are in, but if you are in the USA it is here: http://www.ebay.co.uk/itm/EK-GeForce-690-GTX-VGA-Liquid-Cooling-RAM-Backplate-Black-CSQ-3831109856567-/380649836946?pt=US_Water_Cooling&hash=item58a07f9192 If you are somewhere else, Google "EK FC690" and you should find it.
This is what a countersunk (CSK) screw looks like: https://www.google.co.uk/search?q=countersunk+head+screw&tbm=isch&tbo=u&source=univ&sa=X&ei=Q1XVUau6BoSBhQeD7IHwCg&sqi=2&ved=0CGUQsAQ&biw=1920&bih=955
The ones I suggested were M2.5 x 7mm or 8mm, with 2.5mm being the diameter of the screw and 7 or 8mm the length of the thread. Obviously they need to be black.


----------



## Lukas026

okey thanks man


----------



## yknot

"justanoldman", I've finally got a benchmark for ya.

Got a 3D11 score with my 690. Both cores to +174 and vram up to +820, using a vintage 3960X C1 with 16GB.

http://www.3dmark.com/3dm11/6532674

Not in the Titan/780 SLI class but not bad.


----------



## justanoldman

Quote:


> Originally Posted by *yknot*
> 
> "justanoldman", I've finally got a benchmark for ya.
> 
> Got a 3D11 score with my 690. Both cores to +174 and vram up to +820.. Using vintage 3960X C1 with 16Gb
> 
> http://www.3dmark.com/3dm11/6532674
> 
> Not in the Titan/780 SLI class but not bad.


That, my friend, is awesome.








Make sure to post a good Valley 1.0 score in that thread here. Follow the important parts of the tweak guide in the op there to get the best score.


----------



## svpam92

Quote:


> Originally Posted by *Arizonian*
> 
> Very nice work inside that tower cooling and PSU cabling.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How to list your rig in my signature links for OCN.


Thank you so much!


----------



## yknot

Quote:


> Originally Posted by *justanoldman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *yknot*
> 
> "justanoldman", I've finally got a benchmark for ya.
> 
> Got a 3D11 score with my 690. Both cores to +174 and vram up to +820.. Using vintage 3960X C1 with 16Gb
> 
> http://www.3dmark.com/3dm11/6532674
> 
> Not in the Titan/780 SLI class but not bad.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That, my friend, is awesome.
> 
> 
> 
> 
> 
> 
> 
> 
> Make sure to post a good Valley 1.0 score in that thread here. Follow the important parts of the tweak guide in the op there to get the best score.
Click to expand...

RIVE has died. I'm trying to get a "Formula" cranked up..........................Anon!


----------



## Koniakki

Quote:


> Originally Posted by *yknot*
> 
> "justanoldman", I've finally got a benchmark for ya.
> 
> Got a 3D11 score with my 690. Both cores to +174 and vram up to +820.. Using vintage 3960X C1 with 16Gb
> 
> http://www.3dmark.com/3dm11/6532674
> 
> Not in the Titan/780 SLI class but not bad.


Wow! Almost 2k above my best score(17818). Damn! Impressed!


----------



## Koniakki

Oops. DP


----------



## PinzaC55

Just been playing with my new EK FC690 trying to get an idea of how I will install it. I have to say that getting Bitspower fittings in the UK is a nightmare


----------



## PinzaC55

Well, I finally took the plunge and watercooled my 690 yesterday. I have a couple more fittings on order so this may look untidy, but it's only temporary! I have to say that it isn't for the faint hearted; the guy I bought the FC690 off on eBay said he "chickened out" of doing it, and it took me 3.5 hours just for the block, then out for a couple of pints, then put the tubes in, fill her up, and take a deep breath. She worked with no leaks, so literally the next thing I did was run the Valley benchmark - and - what can I say? Previously the 690 had run at a max of 80 degrees C and the backplate was hot enough to fry an egg; this time it was normally 44 with a max of 47 and the backplate was just warm. So all in all it was 3.5 hours well spent


----------



## yknot

Quote:


> Originally Posted by *PinzaC55*
> 
> ............................. She worked with no leaks, so literally the next thing I did was run the Valley benchmark - and - what can I say? Previously the 690 had run at a max of 80 degrees C and the backplate was hot enough to fry an egg; this time it was normally 44 with a max of 47 and the backplate was just warm. So all in all it was 3.5 hours well spent


Yep.....







.......it's well worth the effort. My temps dropped from 70+, with associated early throttling, to a max about the same as yours.


----------



## PinzaC55

Quote:


> Originally Posted by *yknot*
> 
> Yep.....
> 
> 
> 
> 
> 
> 
> 
> .......it's well worth the effort. My temps dropped from 70+, with associated early throttling, to a max about the same as yours.


I actually posted here a long time ago that I was getting a poor score ("poor" for a 690, that is!) in Valley, which I think was about 69. My score this time was a more credible 78.8 without OC.


----------



## fat_italian_stallion

I was going through some old 3dmark results from when I had 690s and this little gem showed up. They were truly great cards.


----------



## PinzaC55

Quote:


> Originally Posted by *fat_italian_stallion*
> 
> I was going through some old 3dmark results from when I had 690s and this little gem showed up. They were truly great cards.


Sounds like you have some regrets?


----------



## fat_italian_stallion

Quote:


> Originally Posted by *PinzaC55*
> 
> Sounds like you have some regrets?


Not at all. My current setup beats it by a few thousand points, and the margin opens up at higher resolutions. 3dmark's little goof with the 100% is funny, since it wasn't #1 at the time but more like #5 on the hall of fame boards.


----------



## Marcsrx

Will I realize better performance moving from my 690 to 2 680 4gig cards? I run 2560x1440 @ 100hz. I like to play modded games as well which I know the higher VRAM will help. Any thoughts?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Marcsrx*
> 
> Will I realize better performance moving from my 690 to 2 680 4gig cards? I run 2560x1440 @ 100hz. I like to play modded games as well which I know the higher VRAM will help. Any thoughts?


Your next worthwhile jump is a couple of GTX 780s.


----------



## Marcsrx

Aren't those only 3 gig cards though?


----------



## MrTOOSHORT

Yeah, it's all you need for 1440p.


----------



## noob.deagle

best result so far 3dmark http://www.3dmark.com/3dm/898192?

i updated my asus bios and now my card seems to overclock way better than before..... strange. before i could only get +75 core / +50 mem, and any higher it would artifact like mad

this latest run +100 core, +500 mem and no problems even played Crysis for a bit. will require more testing but so far very promising.


----------



## Lukas026

may I ask which bios exactly you upgraded? (GPU / MB / both)

can you share some links too?


----------



## tin0

Also jumped in for a quick Fire Strike test and came up with the following result;

That's with the card @ 1215mhz for GPU1 and GPU2 with stock bios. Still trying to defeat the 'hierarchy mismatch' so I can up the power limit and max voltage.
Really anxious to find out how it performs with some more volts. Later on going under water.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *tin0*
> 
> Also jumped in for a quick Fire Strike test and came up with the following result;
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> That's with the card @ 1215mhz for GPU1 and GPU2 with stock bios. Still trying to defeat the 'hierarchy mismatch' so I can up the power limit and max voltage.
> Really anxious to find out how it performs with some more volts. Later on going under water.


There isn't another level of max voltage for the GTX 690. And added power limit doesn't do anything for the card.

So don't worry about another bios, what you have now is what you get.


----------



## noob.deagle

Quote:


> Originally Posted by *Lukas026*
> 
> may I ask which bios exactly you upgraded? (GPU / MB / both)
> 
> can you share some links too?


i just updated to the UEFI bios that asus supplies; it's not supposed to change anything except make it UEFI compatible. i can't link it directly, it downloads through a program and flashes automatically.

its here if you want it http://www.asus.com/Graphics_Cards/GTX6904GD5/#support_Download_30

but once you flash to UEFI you can't use the vbios anymore. at least that's how i understand it.


----------



## Lukas026

i know I will look like a total idiot here, but can you explain to me in a few words what UEFI is?

I tried to search for it, and from what I can see it's something like a better (newer) type of BIOS?

also, are there any requirements for using this UEFI on a GTX 690 except the card itself?

thanks


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Lukas026*
> 
> i know I will look like a total idiot here, but can you explain to me in a few words what UEFI is?
> 
> I tried to search for it, and from what I can see it's something like a better (newer) type of BIOS?
> 
> also, are there any requirements for using this UEFI on a GTX 690 except the card itself?
> 
> thanks


Using the UEFI bios on your card along with Windows 8 will allow the fast boot feature when booting up your computer.


----------



## Lukas026

can I use uefi bios for gtx 690 even when i am using win 7 pro ?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Lukas026*
> 
> can I use uefi bios for gtx 690 even when i am using win 7 pro ?


I think so, but it won't do anything for you. Need Windows 8 to enjoy the fast boot up.


----------



## tin0

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> There isn't another level of max voltage for the GTX 690. And added power limit doesn't do anything for the card.
> 
> So don't worry about another bios, what you have now is what you get.


Argh, you are telling me that I can't even get the voltage up to 1.21v, which is the minimum increase possible on all the other 6-series GPUs?
That would suck.. I've been aiming to watercool the card so I can OC further when upping the volts/limits..


----------



## xToFxREAPER

Quote:


> Originally Posted by *tin0*
> 
> Argh, you are telling me that I can't even get the voltage up to 1.21v, which is the minimum increase possible on all the other 6-series GPUs?
> That would suck.. I've been aiming to watercool the card so I can OC further when upping the volts/limits..


They did a really good job of locking it down on the 690; no one's been able to pull it off yet, which is also why I sold my 690 lol


----------



## PinzaC55

GTX 690 tidied up somewhat.


----------



## Valkayria

Quote:


> Originally Posted by *PinzaC55*
> 
> GTX 690 tidied up somewhat.










Respect! Nice looking set up.


----------



## svpam92

Guys, I need help. I recently bought a Koolance GTX 690 waterblock used from a fellow OCNer. However, it doesn't come with thermal pads. I know we need to apply certain thicknesses in certain places, but I really don't know where, or which thermal pads I should buy. Can someone help me out with a diagram or something? Thank you, I would very much appreciate your help.


----------



## anubis1127

Hi guys, I recently picked up an EVGA GTX 690 off a friend of mine, and thought I'd see about joining your club.

Here is a quick cell phone pic of my card right now:



Not the best pic, sorry, terrible lighting atm. I'm planning on switching cases this weekend, so I'll try to take a proper pic later.

Here is a quick 3DMark11 run I just did at +130 on the core clock, no memory OC yet: http://www.3dmark.com/3dm11/6856070



My CPU is also probably not 100% stable, and only running at 4.4ghz.

Any tips on airflow for just using the reference cooler? I'll be moving my hardware into an old Arc Midi I have lying around so I can watercool my CPU, but funds (I already have the stuff for the CPU loop just sitting here), and rad space won't really allow for watercooling the GPU any time soon. The arc midi has a side vent with a 140mm fan, and then two 140mm fans on the intake. I was thinking intake on the side vent to get the reference fan some cool air from outside the case.

Right now the card gets up to ~82C when I don't have a custom fan profile setup, so I'm hoping to get that down slightly.


----------



## PinzaC55

Quote:


> Originally Posted by *svpam92*
> 
> Guys, I need help. I recently bought a Koolance GTX 690 waterblock used from a fellow OCNer. However, it doesn't come with thermal pads. I know we need to apply certain thicknesses in certain places, but I really don't know where, or which thermal pads I should buy. Can someone help me out with a diagram or something? Thank you, I would very much appreciate your help.


According to the manual http://koolance.com/files/products/manuals/manual_vid-nx690_d100eng.pdf some are 0.5mm and some are 1mm, depending on where they go. This is the same as the EK block. I guess you have emailed Koolance to see if they can send you some new ones?
When I removed my reference cooler the original pads were mostly left intact, as they are kind of like plasticine. If you are in the UK you could have them as they are no use to me, but then again they are cut in squares rather than strips like in the manual.


Quote:


> Any tips on airflow for just using the reference cooler? I'll be moving my hardware into an old Arc Midi I have lying around so I can watercool my CPU, but funds (I already have the stuff for the CPU loop just sitting here), and rad space won't really allow for watercooling the GPU any time soon. The arc midi has a side vent with a 140mm fan, and then two 140mm fans on the intake. I was thinking intake on the side vent to get the reference fan some cool air from outside the case.


One of THESE? http://www.frozencpu.com/products/3688/slf-08/AzenX_Blitztorm_System_Slot_Cooler_BT-SC70BBL.html?tl=g40c18s62&id=V3JMrFpb sucks external air in and directs it into the GPU fan.


----------



## anubis1127

Quote:


> Originally Posted by *PinzaC55*
> 
> One of THESE? http://www.frozencpu.com/products/3688/slf-08/AzenX_Blitztorm_System_Slot_Cooler_BT-SC70BBL.html?tl=g40c18s62&id=V3JMrFpb sucks external air in and directs it into the GPU fan.


Hmm, I was thinking more just along the lines of optimum air flow for the 690 cooler. I have seen similar products in the past, but I don't really want to go that route, I'll just try experimenting with fan direction to see if I can lower temps a bit. Thanks though!

I'll probably be forced to stick with Windows for the fan control / OC software anyway, so it's not a huge issue.


----------



## Inglewood78

Quote:


> Originally Posted by *anubis1127*
> 
> Hi guys, I recently picked up an EVGA GTX 690 off a friend of mine, and thought I'd see about joining your club.
> 
> Here is a quick cell phone pic of my card right now:
> 
> 
> 
> 
> Not the best pic, sorry, terrible lighting atm. I'm planning on switching cases this weekend, so I'll try to take a proper pic later.
> 
> Here is a quick 3DMark11 run I just did at +130 on the core clock, no memory OC yet: http://www.3dmark.com/3dm11/6856070
> 
> 
> 
> 
> My CPU is also probably not 100% stable, and only running at 4.4ghz.
> 
> Any tips on airflow for just using the reference cooler? I'll be moving my hardware into an old Arc Midi I have lying around so I can watercool my CPU, but funds (I already have the stuff for the CPU loop just sitting here), and rad space won't really allow for watercooling the GPU any time soon. The arc midi has a side vent with a 140mm fan, and then two 140mm fans on the intake. I was thinking intake on the side vent to get the reference fan some cool air from outside the case.
> 
> Right now the card gets up to ~82C when I don't have a custom fan profile setup, so I'm hoping to get that down slightly.


Replacing the stock TIM helped my temps by ~5-10C under load. I would suggest you do that first and see how it performs.


----------



## PinzaC55

Quote:


> Originally Posted by *anubis1127*
> 
> Hmm, I was thinking more just along the lines of optimum air flow for the 690 cooler. I have seen similar products in the past, but I don't really want to go that route, I'll just try experimenting with fan direction to see if I can lower temps a bit. Thanks though!
> 
> I'll probably be forced to stick with Windows for the fan control / OC software anyway, so it's not a huge issue.


It does optimise air flow - you can't get more optimal than directing external air into the fan instead of using already warm air from inside the case. I think one guy rigged up a vent (like a vacuum cleaner) to suck the hot air that comes out of the 690 vent inside the case but it looked very ugly.


----------



## anubis1127

Quote:


> Originally Posted by *Inglewood78*
> 
> Replacing the stock TIM helped my temps by ~5-10C under load. I would suggest you do that first and see how it performs.


Thanks, I'll try that out.

Quote:


> Originally Posted by *PinzaC55*
> 
> It does optimise air flow - you can't get more optimal than directing external air into the fan instead of using already warm air from inside the case. I think one guy rigged up a vent (like a vacuum cleaner) to suck the hot air that comes out of the 690 vent inside the case but it looked very ugly.


Yeah, I don't doubt it would help some, I'm just not looking to add anything new to it right now. Hopefully having that side vent intake with the 140mm fan will do about the same thing.


----------



## PinzaC55

Quote:


> Originally Posted by *anubis1127*
> 
> Thanks, I'll try that out.
> Yeah, I don't doubt it would help some, I'm just not looking to add anything new to it right now. Hopefully having that side vent intake with the 140mm fan will do about the same thing.


I'd be surprised. When I replaced the standard (200mm fan) side panel of my HAF-X case with a plain window panel, it made no difference to the GPU temps. You might get lucky though.


----------



## justanoldman

Quote:


> Originally Posted by *svpam92*
> 
> Guys, I need help. I recently bought a koolance gtx690 waterblock used from a fellow OCNer. However, it doesn't comes with thermal pads. I know that we need to apply certain thickness at certain place. I really dont know where or which thermal pads should I buy. Can someone help me out by showing diagram or something. Thank you. I would very much appreciate your help


PinzaC55 gave you the diagram via the pdf, but you can also call FrozenCPU and ask for Joe, he can tell you which ones you need. He set me up with all the thermal pads I needed when I had to redo it.


----------



## Fraizer

hello all

I need a UEFI BIOS for my GTX 690 with NVIDIA subvendor "NVIDIA (10DE)".

This card is from a Dell Alienware Aurora R4 (Dell doesn't have this BIOS).

My card's current info:

BIOS 80.04.1E.00.13
DEVICE ID 10DE - 1188
Subvendor NVIDIA (10DE)
2048MB

Can you give me a solution please?







Maybe a solution would be to flash the BIOS of an ASUS card to change the subvendor name (maybe not ASUS, because they just have the 4GB version), and after that I could update with a UEFI BIOS from ASUS? (I tried a UEFI BIOS directly like this and it's not working.)

thank you


----------



## noob.deagle

Quote:


> Originally Posted by *Fraizer*
> 
> hello all
> 
> i need a UEFI bios for my GTX 690 from Nvidia Subvendor ""NVIDIA (10DE)""
> 
> this card its from a dell alienware aurora R4 (dell dont have this bios)
> 
> my acutal info card:
> 
> BIOS 80.04.1E.00.13
> DEVICE ID 10DE - 1188
> Subvendor NVIDIA (10DE)
> 2048MB
> 
> can you give me a solution please ?
> 
> 
> 
> 
> 
> 
> 
> maybe a solution to upload a bios of asus card to change the subvendor name... (maybe not asus because they have just the 4GB) and after i can update withe a uefi bios from asus ? (i try the uefi bios like this and its not working)
> 
> thank you


I don't know who that subvendor is; I think it's just an OEM Nvidia one. But all GTX 690s are the same, so you should be able to force flash a BIOS for, say, EVGA or ASUS and then use their UEFI updater.

ASUS, EVGA and others often list the GTX 690 as 4GB, but it's 2GB per GPU, making a total of 4GB; only 2GB is actually usable because the VRAM is mirrored between the GPUs.
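Since the mirrored-VRAM point trips people up, here is the arithmetic as a tiny sketch (the 2048MB-per-GPU figure is the GTX 690's published spec; the rest is just the mirroring logic described above):

```python
# SLI/AFR mirrors the working set across both GPUs, so the box-art
# "4GB" is the sum of two copies of the same 2GB framebuffer.
PER_GPU_MB = 2048
GPU_COUNT = 2

total_listed_mb = PER_GPU_MB * GPU_COUNT  # what vendors advertise
usable_mb = PER_GPU_MB                    # what a game can actually address

print(total_listed_mb, usable_mb)  # 4096 2048
```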


----------



## noob.deagle

Quote:


> Originally Posted by *Inglewood78*
> 
> Replacing the stock TIM help my temps ~5-10c under load. I would suggest you do that first and see how it performs.


How easy is this to do with the GTX 690? I've read that some people had issues with ripping the memory pads and not being able to find the right-sized replacements.


----------



## Fraizer

Thank you noob.deagle for your answer.









In GPU-Z I see 2GB and not 4GB - is that normal too?

Do you know where I can find the latest ASUS BIOS and how I can force the update safely? Do you know if someone has already done this?

I think if it works, many people will do this when they want a UEFI BIOS.


----------



## noob.deagle

Quote:


> Originally Posted by *Fraizer*
> 
> thank you noob.deagle for your answear
> 
> 
> 
> 
> 
> 
> 
> 
> 
> in gpuz i see 2GB and not 4GB its normal too ?
> 
> do you knwo where i found the last asus bios and how i can force the update safely ? you know some one already did this ?
> 
> i think if its working many people do this when they want a uefi bios


GPU-Z should show two GPUs, each with 2048MB; that is normal.

I don't know of anyone who has done it before, but I would suggest you back up your BIOS before trying.

One thing of note: the information for my ASUS GTX 690 matches the EVGA GTX 690, and I would assume yours does too. To me this means that the only thing each BIOS changes is the vendor IDs. Once your card is seen as either ASUS or EVGA, you can then flash their UEFI on it.

I can't see why this would not work, but again I cannot confirm.

Can you please post a screenshot of GPU-Z?

Here is a BIOS list for you:
http://www.techpowerup.com/vgabios/index.php?architecture=NVIDIA&manufacturer=&model=GTX+690&interface=&memType=&memSize=

Just download the one that you want and then flash it with nvflash.


----------



## Fraizer

yes of course for the screen









I think you are right.

What is the version of your ASUS GTX 690 BIOS, please?

Do you have a guide for doing the BIOS flash safely?









Oh, an important question: I understand I need to flash 2 BIOSes on this card, one per GPU. From the list of BIOSes (your link), do I need 2 different BIOS files, or can I use the same BIOS for both?

PS: when I try to save the first BIOS in my list it's not working (not read...), but the second one in the list is OK.


----------



## Fraizer

I'll wait for an answer before I update the BIOS.


----------



## anubis1127

Has anybody played BL2 with one of these cards? I'm getting garbage performance with the latest WHQL drivers, dips into the 20s during battles with enemies at times.

Hmm, did a quick search through the thread and found a couple guys with issues, but most were last fall when the game was released.


----------



## Fraizer

Please, could some nice person who has an ASUS GTX 690 extract the 2 BIOSes and the PLX ROM with nvflash (it's not working with GPU-Z)?

It's very easy and quick - just the command nvflash --save for each of the 3 (it lets you choose which BIOS you want).

Thank you a lot


----------



## trapjaw72

.........EVGA-GTX-690..............


----------



## anubis1127

Quote:


> Originally Posted by *trapjaw72*
> 
> .........










dat PPD


----------



## mtbiker033

how difficult is changing the TIM on the 690?

Any guides?


----------



## Alex132

Quote:


> Originally Posted by *mtbiker033*
> 
> how difficult is changing the TIM on the 690?
> 
> Any guides?


Easy..... if you can remove the T6 torx screws.

I stripped one or two of mine with little to no pressure.


----------



## trapjaw72

Very easy, just take your time and go slow - the T6 torx screws are small and very easy to strip out, just like Alex132 said. Good luck!


----------



## mtbiker033

Quote:


> Originally Posted by *Alex132*
> 
> Easy..... if you can remove the T6 torx screws.
> 
> I stripped one or two of mine with little to no pressure.


Quote:


> Originally Posted by *trapjaw72*
> 
> very easy just take your time an go slow, the t-6 torx screws are small an very easy to strip out, just like Alex.132 said..........goodluck........


ok cool, thanks for the warning!


----------



## tin0

Currently awaiting the arrival of a Rampage IV Extreme.


----------



## PinzaC55

Quote:


> Originally Posted by *mtbiker033*
> 
> ok cool, thanks for the warning!


I am sure somebody here said that the Torx screws come out easier when the PCB is hot - like if you run it for a while, then whip it out and dismantle it. Take care with the fan connectors; they were a very tight fit on mine.


----------



## Furlans

Hi all guys, I own a GTX 690 but I've run into a terrible issue.
I mounted two fans and a fan controller in my case, but when I booted the PC the GTX 690 ran everything at low fps. If I start a benchmark it runs at 10-20 fps (previously, in Unigine Heaven 4.0, I scored 1800 points), and games either run the same way or crash. GPU usage bounces between 2% and 40-50% and everything is lagging. So I tried resetting the BIOS, reinstalling Windows 8, changing my PSU cables, clearing the CMOS, reinstalling the drivers, trying old drivers, changing the power management options, and activating K-Boost in EVGA Precision X, but nothing changed. I'm worried about this issue because I ordered an Arctic Accelero Twin Turbo 690 and now my 690 (with the stock cooler) is broken. Temps were OK. I'm requesting an RMA and will ask if I can swap my broken GTX 690 (paid 789 euro) for two GTX 770 4GB (2x394 euro). I never flashed the GPU BIOS; I just tried to overclock it, but it's not a lucky sample (ASIC 60%) and I couldn't get more than +129 on the core clock and +105 on the memory clock.







But in daily use I run it at default.

Anyone got an idea?
Here are my specs:
i7 [email protected],3ghz stable, now at default, cooled by a Hyper 412S
GTX 690
2x8GB 1600MHz Corsair Vengeance
Gigabyte X79-UD3
Case: Fractal Design XL R2 with 5 Fractal Silent Series R2 fans and an Akasa FC.Six fan controller
HDD: WD Caviar Blue 7200rpm 1TB
SSD: Crucial M4 64GB, just for the OS
PSU: Corsair TX750M

I don't have any ideas, and excuse my English - I'm Italian.


----------



## Lukas026

OK guys, I just got word my RMA went through and ASUS is sending a replacement card to me. Now my question is: I heard a rumor that replacement cards are some kind of used/repaired ones. Am I getting one of those, or will ASUS send me a whole new card (even with the plastic foil on the transparent part of the cooler)?

thanks


----------



## noob.deagle

Quote:


> Originally Posted by *Lukas026*
> 
> ok guys I just got word my RMA went through and ASUS is sending a replacment card to me. now my question is: i heard a rumor that replacement cards are somekind of used / repaired ones. Am I getting one of these or ASUS will send me whole new card (even with plastic foil on transparent part of the cooler) ?
> 
> thanks


Generally it depends on the terms of the warranty. I know some companies say for the first 6 months you get brand new, then at a year+ they reserve the right to use refurbished units as replacements. Read their terms.

Sent from my Nexus 4 using Tapatalk 4 Beta


----------



## Lukas026

Thing is, I can't find it.









Any links, if you find one, would be appreciated.









My card is an ASUS and I bought it on 28th March 2013.


----------



## Lukas026

So just a quick update: my friend at the company which sold me my GTX 690 told me that ASUS is sending me a USED card as the replacement. Any suggestions?

It could be just fine, but you know the feeling when you are getting something second hand...


----------



## trapjaw72

Quote:


> Originally Posted by *Lukas026*
> 
> so just a quick update. my friend in a company which sold me my gtx 690 told me, that ASUS is sending me USED card for the replacement. any suggestions ?
> 
> It could be just fine but you know the feeling when you are getting something from second hand...


Hello, why a used card?


----------



## Lukas026

Well, as my friend told me, ASUS is sending repaired/refurbished cards as replacements for new RMAs. Because of that I think the card I receive will be one repaired from some previous RMA, and so it was used by someone else before. I am not saying the card won't be fine, it is just about principle - I bought a new card, that card was faulty, so the next step is that I should again get a new card, or a refund.

That's what I think, and if someone can show me that this is even in some warranty policy text by ASUS, I would request a new one. But I still can't find anything useful.


----------



## MrTOOSHORT

The card that was sent in for RMA was used, right? Then getting a working used card shouldn't be an issue. The problem I have with getting a working used product back is when there are all kinds of scratches, cracked housing, or other anomalies that make it look used.

Also, that refurbished sticker that is placed on an RMA replacement bothers me.

I do have a problem when I hear a person just bought something and it was DOA from the get-go, then they RMA it and get a used working/nonworking product back.


----------



## Lukas026

Yeah, I hope the card will be fine and that there will be no visual issues with it...

But still, I would be happier if they would repair my card and tell me what the problem was, rather than sending some other card.









And I agree with you about the DOA thing.


----------



## trapjaw72

Quote:


> Originally Posted by *Lukas026*
> 
> well as my friend told me, ASUS is sending repaired / refurbished cards as replacements for new rma's. becouse of that I think that the card I will recieve will be some repaired one from some another previous RMA. and by that it was used by someone else before. I am not saying that card wont be fine, it is just about principal - I bought a new card, that card was faulty so next thing is that I should get again new card or refund.
> 
> thats what I think and if someone can show it to me that it is even in some warranty policy text by ASUS, I would request a new one. But I still cant find anything useful


Just me, but I would not take a used GPU. For one, you don't know 100 percent how the card looks or works. And while a new card's warranty says it's OK, a used GPU's warranty might not be as good as a new one's. Me, I say no - get my money back or a new GPU, flat out. It would be just like buying a new iPhone and being told that if it breaks they will send you a used one...


----------



## Lukas026

@trapjaw72

Well, the warranty will still be the same as for a new card. Maybe you don't get me right, but it's the ASUS service center that is sending me this card, so it seems I don't have much choice.


----------



## trapjaw72

Well, if it's ASUS I would not worry then. They will take care of you.


----------



## y2kcamaross

Quote:


> Originally Posted by *trapjaw72*
> 
> well if its Asus I would not worry then, They will take care of you.....


Are you joking? ASUS has arguably the worst RMA service of any company out there.


----------



## Lukas026

so what should I do guys ?


----------



## trapjaw72

It was a joke - I told him to get his money back. But he doesn't care, and it's his GPU. You're acting like it's yours - is it?


----------



## trapjaw72

Quote:


> Originally Posted by *y2kcamaross*
> 
> are you joking? Asus had arguably the worst rma service of any company out there


And if you had looked at all the posts, you'd see where I told him to get his money back - but it looks like you jumped in the Kool-Aid and didn't know the flavor...


----------



## Furlans

No one saw my post on the previous page?


----------



## PinzaC55

Anyone had their GPU crash after installing the 326.19 driver? It's happened twice now.


----------



## provost

Quote:


> Originally Posted by *PinzaC55*
> 
> Anyone had their GPU crash after installing the 326.19 driver? Its happened twice now.


What game, and how long did it take before crashing?


----------



## PinzaC55

Quote:


> Originally Posted by *provost*
> 
> What game, and how long did it take before crashing?


Far Cry 2, and it maybe went an hour or so. The screen got horizontal lines, then it went back to Windows. There was a message, something to do with the Nvidia driver, "but it had recovered". I reinstalled the driver and it seems OK now. I will need to check the Nvidia forums.


----------



## Lukas026

Just a hypothetical question:

When I receive my replacement card from the RMA and find scratches etc. on it, is that a fair reason to request another RMA or even a refund?

Even though the card itself might work normally...


----------



## MrTOOSHORT

I got another Samsung 24 inch after the replacement had a cracked bezel. I think scratches won't warrant another RMA - unless they are gouged into the PCB and cut some traces.


----------



## Lukas026

Well, you know what I mean. Cards like the GTX 690 are something like art to me, and if they gave me another one but with scratches on the PCB / aluminium cooler / transparent window parts, it would just be bad.


----------



## MrTOOSHORT

Art to you; just another RMA part going out to them.

There will probably be some scratches, sorry to say. I like my stuff mint myself too.


----------



## ocsi1970

I want to buy a GTX 690, but I do not know if my power supply is enough. My power supply is:
http://www.thermaltake.com/products-model.aspx?id=C_00000788
Rest of my system:
i7 3770K, not overclocked
2x4GB RAM
1 SSD
2x 1TB WD
1 DVD-RW
PCI sound card
and 4 140mm coolers with a fan controller

Thanks, and sorry for my english.


----------



## Lukas026

You will be fine unless you try pushing both a CPU and GPU overclock, which you won't, as I understand it.


----------



## ocsi1970

Thanks very much!!!


----------



## MrTOOSHORT

Quote:


> Originally Posted by *ocsi1970*
> 
> I want to buy a gtx 690,I do not know if the my power suply if enought.My power suply is:
> http://www.thermaltake.com/products-model.aspx?id=C_00000788
> Rest of my System:
> i7 3770k not ovrclocked
> 2x4 gb ram
> 1 ssd
> 2x1 tb wd
> 1 dvdrw
> pci soun card
> and 4 140cm coolers with fan controler
> 
> Thanks and sorry my english.


I used a 3970X near 1.7v and a GTX 690 @ ~1200MHz on an AX650 (54 amps on the 12v rail) and I had no shutdowns or problems when I was benching.

Your PSU has 60A total on the 12v rails. You'll be OK.

See pic, was on my AX650 (650 watts):
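For anyone wanting to do this check themselves, here is a rough sketch of the +12V headroom math. The TDP numbers in it are ballpark assumptions (stock board and CPU power plus a loose allowance for the rest), not measured draws:

```python
# Rough PSU headroom check -- a sketch, not a substitute for real
# measurements. The component figures below are assumed stock TDPs.

def rail_watts(amps_12v: float) -> float:
    """Continuous wattage available on the +12V rail(s)."""
    return amps_12v * 12.0

def headroom(amps_12v: float, draws_w: dict) -> float:
    """Remaining +12V capacity after the listed component draws."""
    return rail_watts(amps_12v) - sum(draws_w.values())

# ocsi1970's Thermaltake 750W: ~60A combined on +12V -> 720W
draws = {
    "GTX 690 (stock board power)": 300,  # assumed TDP
    "i7-3770K (stock)":            77,   # assumed TDP
    "drives/fans/misc":            50,   # rough allowance
}
print(rail_watts(60))       # 720.0
print(headroom(60, draws))  # 293.0
```

Nearly 300W of slack at stock clocks is why a quality 650-750W unit handles this card fine, as long as you aren't pushing a heavy CPU overclock on top.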


----------



## ocsi1970

Thanks man!!! You are all a great help.


----------



## provost

Quote:


> Originally Posted by *PinzaC55*
> 
> Far Cry 2 and it maybe went an hour or so. Screen got horizontal lines then back to Windows. There was a message something to do with Nvidia "but it had recovered". I reinstalled the driver and it seems OK now. I will need to check the Nvidia forums.


I will give it a whirl... I may be able to locate a 690 in my hardware stash somewhere.


----------



## PinzaC55

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Art you to you, just another RMA part going out to them.
> 
> Probably be some scratches, sorry to say. I like my stuff mint myself too.


That raises an interesting question. The fan assembly off my 690 is sitting in a box wrapped in cling film. Somebody on Ebay UK was recently selling them new at £20 each. What have other 690 owners done with them once they have watercooled - kept them or sold them? I can't see myself needing another card in the foreseeable future but even then it wouldn't be worth selling it for £20 in what is after all a niche market. It just occurred to me it might look nice mounted on a stand on my mantlepiece!


----------



## provost

Quote:


> Originally Posted by *PinzaC55*
> 
> That raises an interesting question. The fan assembly off my 690 is sitting in a box wrapped in cling film. Somebody on Ebay UK was recently selling them new at £20 each. What have other 690 owners done with them once they have watercooled - kept them or sold them? I can't see myself needing another card in the foreseeable future but even then it wouldn't be worth selling it for £20 in what is after all a niche market. It just occurred to me it might look nice mounted on a stand on my mantlepiece!


Keep them... you will need it for RMA and resale purposes.
Who knows how the eBay sellers got hold of theirs, but if yours ever gets scratched, or Torx screws break in it, I guess eBay may not be a bad option for replacing the stock cooler.


----------



## PinzaC55

Quote:


> Originally Posted by *provost*
> 
> keep them...u will need it for rma and resale purposes.
> who knows how the ebay sellers got a hold of theirs, but if yours ever gets scratched or due to torx screws breaking in them, i guess ebay may not be a bad option to replace the stock cooler


I would imagine the ones on eBay were surplus stock, since I saw an EVGA 690 in a box that was already watercooled. I couldn't get over how all that metal was that cheap, lol. BTW, somebody did an analysis of how much the parts in a Samsung Galaxy S4 actually cost, and it was only about $130 all told.


----------



## provost

Quote:


> Originally Posted by *PinzaC55*
> 
> I would imagine the ones on Ebay were surplus stock since I saw an EVGA 690 which was in a box already watercooled. I couldn't get over how all that metal was that cheap lol. BTW somebody did an analysis of how much the parts in a Samsung Galaxy S4 actually cost and it was only about $130 dollars all told.


They should have given the same quality build to the Titan shroud. Lol.
The actual cost of production (not markup, marketing, sales, shipping, etc.) of these GPUs is no more than $100-$110.
The rumor is that it cost Nvidia a grand total of $400 million to develop the GeForce generation of cards. Each GPU chip itself costs $58, and this is by far the most expensive part of each graphics card. The PCB costs $13.40, but the most expensive stuff on the PCB is the memory. For example, say Nvidia decides to make a 128-bit memory card; it uses 8Mx16 chips. The total number of chips is then eight, and each chip costs $3.50, so that's $28 for the memory. Adding up $58 + $13.40 + $28 gives a grand total of $99.40 in materials for a graphics card.
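The arithmetic in that rumored bill of materials does add up. A quick sanity check of the quoted figures (these are the rumored numbers from the post above, not audited costs):

```python
# Sanity check of the rumored bill-of-materials figures quoted above.
gpu_die = 58.00          # rumored cost of the GPU chip
pcb = 13.40              # rumored cost of the bare PCB
mem_chips = 8            # 8Mx16 chips on a hypothetical 128-bit bus
mem_chip_cost = 3.50     # rumored per-chip memory cost
memory = mem_chips * mem_chip_cost  # 28.00 total for memory

total = gpu_die + pcb + memory
print(f"${total:.2f}")   # $99.40
```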


----------



## PinzaC55

No more crashes since I reinstalled the driver


----------



## provost

Quote:


> Originally Posted by *PinzaC55*
> 
> No more crashes since I reinstalled the driver











Maybe I ought to do the same thing; I can't get past the first screen.


----------



## ocsi1970

Hi everyone! With a GTX 690, do games like Crysis 3 or Battlefield 3 stutter or lag, or do they just run flawlessly? I tried 2x GTX 560 Ti for several weeks and it felt somewhat jerky. I am using a single 1080p monitor at 60Hz refresh. Thanks.


----------



## Arizonian

Quote:


> Originally Posted by *ocsi1970*
> 
> Hy everyone! Gtx 690 on game like crysis 3 or battlefield 3 is stutter or laggy or just run flawlessly?I tried several weeks 2xgtx560ti and less like jerky.I am using 1080p single monitor with 60hz refresh.Thanks.


No stutter. Proof: GTX 690 playing at 2560x1440 resolution. It's only campaign though, not multiplayer. I run multiplayer smooth as well, though - occasional server hiccup.

*BF3 Ultra Mode*
Antialiasing Deferred - OFF
Antialiasing Post - HIGH
Motion Blur - ON
Anisotropic Filter - 16X
Ambient Occlusion - HBAO






EDIT: Noticed it's only your fourth post in... if no one has told you yet: welcome to OCN.


----------



## provost

Yep, no stutter here either with the 690.


----------



## PinzaC55

Just because most or all GTX 690 owners get no stutter doesn't mean it is impossible. ocsi1970 may have a faulty 690, a fault somewhere in his system, or of course an out-of-date driver. When he posts his system specs there may be an answer.


----------



## ocsi1970

Thanks to everyone. My system is:
i7 3770K, not overclocked
2x4GB 1600MHz RAM
Intel SSD
Thermaltake 750W
The GTX 690 is coming to me this week and I hope I won't have problems with it.
Thanks for the videos, Arizonian!!!
Sorry for my english


----------



## propeldragon

2013-04-02 01.33.19.jpg 1344k .jpg file


evga gtx 690
add me???


----------



## propeldragon

I have a question for you guys: can you get higher overclocks with the 320.49 driver? I recently tried something because I read on a different forum that raising the power target is not a good idea at all (unless you're watercooled? maybe?). BTW, I'm on air, and it's not my temps causing this either. My highest stable overclock with a +135% power target was +135 core and +120 memory. I changed my power target back to 100% and I can still hit my max boost clock, with lower temps. So I figured maybe I can overclock more? I overclocked GPU 1 because it is the stronger one: went to +150 and no crash, so I thought I was on to something, I hope. Then I went to +160 (1215MHz in game) and I haven't crashed yet. I would like someone else to try this out - I don't know if it's the drivers or the stock power target. I hope this helps people out, but I'm off to bed; maybe I can try more tomorrow after class (booo).









thanks


----------



## Stablerage

May I become a member?


----------



## Badass1982

Overclocking my EVGA GTX 690 Hydro Copper

So here's my first attempt at overclocking this monster; I'd appreciate any comments/advice you have on this topic.

My settings were/are as follows:
Power Target: 135%
GPU Core Offset: +170
Mem clock Offset: +250

Voltage on both GPU's: 1025mv

Also, my temps never went above 50 degrees on either GPU, and this was after well over an hour of Fire Strike maxed out on a continuous loop.

I'm also sure I can push this card MUCH further, but I have to say, for a first attempt I am pretty impressed with this card.

Oh, I have one more question. Do any of you ninjas have any idea why EVGA Precision X (the newest version) resets my voltages to default when I reboot my PC, UNLESS I set it to start with Windows? (Which is obviously NOT something I want to do, as it's just ANOTHER program to load on startup.) Is there any way to keep the voltages without having to redo them EVERY time I boot my machine?

Thanks

Martin


----------



## PCModderMike

Quote:


> Originally Posted by *Badass1982*
> 
> Overclocking my EVGA GTX 690 Hydro Copper
> 
> So here's my first attempt at overclocking this monster , I'd appreciate any comments/advice you people have on this topic.....
> 
> My settings were/are as follows:
> Power Target: 135%
> GPU Core Offset: +170
> Mem clock Offset: +250
> 
> Voltage on both GPU's: 1025mv
> 
> also My temps never went above 50 degrees on either GPU and this was after well over an hour of fire strike maxed out on a continuous loop.
> 
> Pretty Impressed, I'm also sure I can push this card MUCH further but I have to say for my first attempt I am pretty impressed with this card.
> 
> Oh I have one more question. Do any of you Ninja's have any idea why EVGA Precision X (the newest version) resets my voltages to the default if I reboot my PC UNLESS I set it to start with windows (which is obviously NOT something that I want to do as it's just ANOTHER program to load on startup in that case...... Is there any way to keep the voltages without having to re do them EVERY time I boot my machine????)
> 
> Thanks
> 
> Martin


The only way to have the OC start with Windows is to have the box that says "Windows Start Up" checked.

Precision takes up barely any resources at all when running... I wouldn't worry so much about it being "another program to load on startup", especially considering the rig you're running, with an i7 970 and all. It's not gonna break a sweat.


----------



## Lukas026

OK, I got my card back from the RMA and so far it seems almost like a new one. I tried it out and it seems like a decent overclocker:

tested with a one-hour Valley loop at the HD preset:

core: +115mhz
memory: +650mhz
pt: 135%

but it reached about 91C, so I decided to remove the cooler and apply Noctua NT-H1, and now it drops to 87C. I know it is still high, but my ambient temp is about 35C. Hot summer in the Czech Republic!










Will keep you posted with some benchmark scores later!


----------



## MrTOOSHORT

Lukas026

+115 core doesn't give a true indication of the boost clocks. One person's GTX 690 at +115 could boost differently than another person's GTX 690.

Use GPU-Z to find the boost clocks during a benchmark run:
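To make the point concrete, here is a small sketch. The 915MHz base and 1019MHz rated-boost figures are the GTX 690's published reference clocks; the extra clock on top of the shifted rated boost is per-sample, which is exactly why the offset alone tells you nothing:

```python
# Why "+115 core" alone is ambiguous: the offset shifts the whole boost
# curve, but the clock you actually see depends on the individual card.
BASE_MHZ = 915          # GTX 690 reference base clock
RATED_BOOST_MHZ = 1019  # GTX 690 reference rated boost clock

def shifted_rated_boost(offset_mhz: int) -> int:
    """Rated boost with an offset applied -- a floor, not the real peak."""
    return RATED_BOOST_MHZ + offset_mhz

# Lukas026's +115 offset gives a shifted rated boost of 1134 MHz, yet
# GPU-Z showed his GPUs peaking at 1163 MHz -- the extra ~29 MHz comes
# from that sample's own boost bins. Read the observed clock, not the
# offset.
print(shifted_rated_boost(115))  # 1134
```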


----------



## Lukas026

Both GPUs max out at 1163MHz.

I guess that's in the normal range for 690s.

BTW, my ASICs are only 67% and 68%.


----------



## Stablerage

Quote:


> Originally Posted by *Stablerage*
> 
> 
> May I become a member?


Forgot to mention it is an Asus.


----------



## Whatupdoe1337

Hello all, I have heard some bad things about the newer Nvidia drivers in regard to gaming performance. I was just wondering which drivers would be best for my 690 for the best gaming experience?


----------



## Stablerage

Quote:


> Originally Posted by *Whatupdoe1337*
> 
> Hello all, I have heard some bad things about the newer nvidia drivers in regards to gaming performance. I was just wondering what drivers would be best for my 690 for the best gaming experience?


The latest beta caused mine to crash, so I went for 320.49, but that has crashed a couple of times too, so maybe 320.18 would be a good choice.


----------



## Lukas026

hey

So after 3 hours playing Crysis 3 my card died on me again. I really don't know what is going on, because this is my third GTX 690.

I was playing, then my screen went black and then no more signal to the display.

When I connect via integrated graphics I can't find my 690 in BIOS / nvflash, so I guess it's dead.

The only reason I can think of is that a bad batch of cards arrived in our country and I keep getting cards from it.









PS: my last card before the 690s was an ASUS ROG Matrix 580 Platinum and I was fine with it for more than 3 years. I wish I had some luck with this gen of cards...


----------



## PinzaC55

Quote:


> Originally Posted by *Lukas026*
> 
> hey
> 
> So after 3 hours playing Crysis 3 my card died on me again. I really don't know what is going on, because this is my third GTX 690.
> 
> I was playing, then my screen went black and then no more signal to the display.
> 
> When I connect via integrated graphics I can't find my 690 in BIOS / nvflash, so I guess it's dead.
> 
> The only reason I can think of is that a bad batch of cards arrived in our country and I keep getting cards from it.
> 
> PS: my last card before the 690s was an ASUS ROG Matrix 580 Platinum and I was fine with it for more than 3 years. I wish I had some luck with this gen of cards...


Earlier you said "tested with one hour valley loop at HD preset:"

The obvious conclusion is that you are putting them through excessive stress tests that they can't handle or that you are damaging them in some other way without realising it. I suggest that when you get the next one you simply lay off the benchmarks and use it for normal purposes and then tell us what happens.


----------



## Lukas026

Oh, don't take me wrong. I have ways to check whether my cards are stable, and I don't think it's extreme at all.

I test my overclock first with one Valley BENCHMARK run.

If it passes, I let it run for 1 hour,

and then I consider it stable.

Of course I let the card rest, especially at night - I completely switch off my PC while I sleep.

And this card crashed after 2h 45min of Crysis 3, so again, not extreme at all.

Temps never went above 85C, so no thermal problem either (at least on the cores).

And to be fair: even if I let it run all day long (for example Unigine Valley), if temps are good the card should handle it.

Well, "should" isn't the right word. It MUST handle it - it's a card for 1000 bucks!

I am not bricking it with FurMark - I am just playing games.


----------



## virgis21

Are you sure you have a good enough PSU to power this kind of GPU? Normally they don't burn out like this..









Virgis


----------



## PinzaC55

Quote:


> Originally Posted by *Lukas026*
> 
> Oh, don't take me wrong. I have ways to check whether my cards are stable, and I don't think it's extreme at all.
> 
> I test my overclock first with one Valley BENCHMARK run.
> 
> If it passes, I let it run for 1 hour,
> 
> and then I consider it stable.
> 
> Of course I let the card rest, especially at night - I completely switch off my PC while I sleep.
> 
> And this card crashed after 2h 45min of Crysis 3, so again, not extreme at all.
> 
> Temps never went above 85C, so no thermal problem either (at least on the cores).
> 
> And to be fair: even if I let it run all day long (for example Unigine Valley), if temps are good the card should handle it.
> 
> Well, "should" isn't the right word. It MUST handle it - it's a card for 1000 bucks!
> 
> I am not bricking it with FurMark - I am just playing games.


But you said "because this is my third GTX 690". I'm not clear whether they repaired the same card or whether it is a new one each time, but if it is a new one, the odds against getting a duff card three times in a row must be huge?


----------



## Atheos

Just picked up two GTX 690s, gonna run quad SLI on a new Z87 board. I plan on running both cards plus a dedicated PhysX card, can't wait!


----------



## propeldragon

overkill


----------



## Lukas026

@atheos: if I can give you advice and you can still return your 690s - do it. These days I think it's way better to go with 2x or 3x 780s, or maybe 2x Titans. Quad SLI doesn't scale so well, and if you play at anything above 1080p you will be bottlenecked in some games by the 2GB of VRAM. (Yes, the box says the GTX 690 has 4GB of VRAM, but it is 2GB per core - I just don't want you to be misled.)

@PinzaC55: one card was MSI and the other one is ASUS, which was sent to RMA, and even though ASUS said they replaced my card with a refurbished one, I still get the feeling it's the same card but repaired.









@Virgis21: I have a Corsair AX750, about 1 year old, and I think it's a great and solid PSU...

but to be honest I too am starting to get the feeling that there is something wrong with my PC, although I can't put my finger on it. Now I am on Intel HD graphics and all is smooth. I checked my PCI-E cables too and all is fine. Is there any software that can do some readings with the VGA plugged in? I mean like power / current / PCI-E slot draw etc.
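For software readings, nvidia-smi can report per-GPU power draw where the driver exposes the sensor - on consumer Kepler cards it often just reports "[Not Supported]", so treat this as a sketch rather than a guaranteed reading, and note that PCI-E slot draw specifically needs hardware instrumentation:

```python
import subprocess

def parse_power_lines(output):
    """Parse 'nvidia-smi --query-gpu=power.draw --format=csv,noheader' output.

    Each line looks like '85.20 W'; GPUs without a readable power sensor
    report '[Not Supported]'. Returns floats in watts, or None where
    the sensor is unsupported.
    """
    readings = []
    for line in output.strip().splitlines():
        line = line.strip()
        if line.endswith(" W"):
            readings.append(float(line[:-2]))
        else:
            readings.append(None)
    return readings

def read_power_draw():
    """Query the driver; requires nvidia-smi on PATH."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader"],
        text=True,
    )
    return parse_power_lines(out)
```

Otherwise GPU-Z's sensor tab (VDDC, power % of TDP) is the usual no-code option.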


----------



## superx51

Sounds like a bad power supply? Buy a new one from Best Buy, and if that isn't the problem, return it. Not having enough power can kill the card.


----------



## Atheos

Quote:


> @atheos: if I can give you advice and you can still return your 690s - do it. These days I think it's way better to go with 2x or 3x 780s, or maybe 2x Titans. Quad SLI doesn't scale so well, and if you play at anything above 1080p you will be bottlenecked in some games by the 2GB of VRAM. (Yes, the box says the GTX 690 has 4GB of VRAM, but it is 2GB per core - I just don't want you to be misled.)


I'd have to agree with you on this. Here's my thing: firstly, I picked up both of these cards new for $1,300, so roughly the same price as 2x 780s or 1.5 Titans. The VRAM is a HUGE issue for me as I'm running at 5760x1080. Initially I weighed all my options - 2x 780s, 3x 770s, 1x Titan - but I ended up with the 2x 690s for two reasons.

The first: because at MY price they outperformed 2x 780s, 3x 770s and 2x Titans,
and according to most benchmarks I've come across they perform better than 2 Titans and 2-3x 780s at 5760x1080.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_SLI/16.html
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_SLI/15.html
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_SLI/6.html

They are only outperformed by 3x Titans in almost every benchmark (for some strange reason), so that is the logic I went with.

Will this be very future proof? Probably not, but then again I plan on upgrading when the 800 or 900 series comes out, whatever they end up being named.


----------



## Atheos

Quote:


> Originally Posted by *propeldragon*
> 
> overkill


Not at 5760 x 1080 resolution.


----------



## Arizonian

If you're a bencher, the [OFFICIAL]--- Top 30 --- Unigine 'Valley' Benchmark 1.0 thread has reopened.


----------



## PinzaC55

Quote:


> Originally Posted by *Atheos*
> 
> Just picked up two GTX 690s, gonna run quad SLI on a new Z87 board. I plan on running both cards plus a dedicated PhysX card, can't wait!


I have often wondered what a GTX 690 + PhysX card would be like. Will you be testing your system with and without the PhysX card?


----------



## Atheos

Quote:


> Originally Posted by *PinzaC55*
> 
> I have often wondered what a GTX 690 + PhysX card would be like. Will you be testing your system with and without the PhysX card?


I will. I am extremely curious whether there is any difference - which, in my experience with PhysX, there will be. I think there will be a major difference in games like Metro: Last Light with and without PhysX. I'll post results as soon as I get them benched. Anything anyone else wants to see benched, I'll be happy to run.


----------



## Atheos

Quote:


> Originally Posted by *Arizonian*
> 
> If your a bencher the [OFFICIAL]--- Top 30 --- Unigine 'Valley' Benchmark 1.0 is reopened.


I'm not a bencher, but I suppose I'll give it a shot


----------



## svpam92

Hello, I'm having a problem with uneven temperatures on my watercooled GTX 690. Here is my thread:
http://www.overclock.net/t/1414243/weird-temperature-on-my-watercooled-gtx690
I decided to reseat the waterblock but am not sure how to do it properly. Some say there is a sequence to the order you tighten the screws in. I just want to make sure I do it right this time; I can't afford the hassle a 3rd time


----------



## PinzaC55

Quote:


> Originally Posted by *Atheos*
> 
> I will. I am extremely curious whether there is any difference - which, in my experience with PhysX, there will be. I think there will be a major difference in games like Metro: Last Light with and without PhysX. I'll post results as soon as I get them benched. Anything anyone else wants to see benched, I'll be happy to run.


Thanks! I have space for another GPU


----------



## virgis21

Quote:


> Originally Posted by *Atheos*
> 
> I will. I am extremely curious whether there is any difference - which, in my experience with PhysX, there will be. I think there will be a major difference in games like Metro: Last Light with and without PhysX. I'll post results as soon as I get them benched. Anything anyone else wants to see benched, I'll be happy to run.


Which benchmark should we use? I could check my setup too and compare here. Gigabyte GTX 690 (default clocks) and i7 3770K @ 4.8GHz.

Virgis


----------



## ocsi1970

Hello. I have a problem with my GTX 690. When I play Crysis 3 the temperature reaches 85 degrees. Is that normal? If I play 2-3 hours, would it reach 90 degrees?


----------



## PinzaC55

With air cooling, running the Unigine Valley benchmark my GTX 690 hit 80 degrees, so yours doesn't sound abnormal. What kind of case have you got, and are you sure it is getting as much external air as possible? BTW my 80 degrees would be with the room at about 20 degrees, so what is the temperature of your living room?


----------



## ocsi1970

I have a Corsair 500R case with 3 intake fans and 3 exhaust fans. And the living room temperature is 30C.


----------



## PinzaC55

Quote:


> Originally Posted by *ocsi1970*
> 
> I have a Corsair 500R case with 3 intake fans and 3 exhaust fans. *And the living room temperature is 30C*.


There you go then. The air going into your case is 30 degrees so 85 is about normal. It would be interesting to see what temps you got running Unigine Valley.


----------



## Arizonian

Quote:


> Originally Posted by *ocsi1970*
> 
> Hello. I have a problem with my GTX 690. When I play Crysis 3 the temperature reaches 85 degrees. Is that normal? If I play 2-3 hours, would it reach 90 degrees?


Quote:


> Originally Posted by *ocsi1970*
> 
> I have a Corsair 500R case with 3 intake fans and 3 exhaust fans. And the living room temperature is 30C.


My house is 77F currently, and GPU #1 reaches 87C and GPU #2 82C at the highest. It's normal. I've got an OC running right now with boost reaching 1144MHz on the core. No memory OC.

So your house is 9F warmer than mine, and that seems relevant - the temps you're seeing are normal on air.
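Since the thread mixes Fahrenheit for room temps and Celsius for GPU temps, the comparison above is just unit conversion - a trivial sketch for anyone following along:

```python
def f_to_c(f):
    """Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

def c_to_f(c):
    """Celsius to Fahrenheit."""
    return c * 9 / 5 + 32

# A 77F room is 25C; a 30C room is 86F, i.e. 9F warmer.
```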


----------



## ocsi1970

Thanks all!!! Unigine Valley ran with a max temp of 80C, but Crysis 3 goes up to 85-87C. I wanted to be sure I wouldn't have problems. Another question: what is the maximum temperature the GTX 690 can take without breaking? Thanks!!!


----------



## MrTOOSHORT

Quote:


> Originally Posted by *ocsi1970*
> 
> Thanks all!!! Unigine Valley ran with a max temp of 80C, but Crysis 3 goes up to 85-87C. I wanted to be sure I wouldn't have problems. Another question: what is the maximum temperature the GTX 690 can take without breaking? Thanks!!!


Quote:


> *Thermal and Power Specs:
> 
> Maximum GPU Temperature: 98 C
> 
> *


*
http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-690/specifications*


----------



## ocsi1970

Thanks man,and thank you all!!!


----------



## PinzaC55

Quote:


> Originally Posted by *ocsi1970*
> 
> Thanks all!!! Unigine Valley ran with a max temp of 80C, but Crysis 3 goes up to 85-87C. I wanted to be sure I wouldn't have problems. Another question: what is the maximum temperature the GTX 690 can take without breaking? Thanks!!!


Possibly not surprising. When my 690 was still on air and I played Far Cry 3, I literally couldn't touch the EK backplate - it was hot enough to fry an egg. Now with H2O the max temp is 45 degrees.


----------



## ocsi1970

But H2O is for the CPU, no? How does this help the video card?


----------



## PinzaC55

Quote:


> Originally Posted by *ocsi1970*
> 
> But H2O is for the CPU, no? How does this help the video card?


Here's my rig with the water cooled GTX 690:





It's a lot of work and you have to lose the "GeForce GTX" logo, but it's worth it.


----------



## ocsi1970

Looks fantastic. I had thought of water cooling the CPU only, but on the video card it also looks amazing and keeps your temperatures great. Thanks for the picture.


----------



## PinzaC55

Quote:


> Originally Posted by *ocsi1970*
> 
> Looks fantastic. I had thought of water cooling the CPU only, but on the video card it also looks amazing and keeps your temperatures great. Thanks for the picture.


I had a look at reviews of your case and you'd be good to go with a 240 rad on the top and a 240 or 200 (like mine) rad in the front - you would see a big benefit.

BTW, anyone who hasn't yet downloaded the Nvidia 326.41 driver should think long and hard about whether you want it. I am on my second crash tonight, the 3rd in two days, and that is AFTER reinstalling. After I finish this post it gets uninstalled.


----------



## PCModderMike

Quote:


> Originally Posted by *ocsi1970*
> 
> Looks fantastic. I had thought of water cooling the CPU only, but on the video card it also looks amazing and keeps your temperatures great. Thanks for the picture.


Video cards respond very well to water cooling. Great temps and great looks...would definitely recommend it.
Here's my current setup.


----------



## Seanimus

Sharing some work with the 690 Hydro.

1) Overclocking the GPU remotely using MSI Afterburner - EVGA Precision did not work for me.
EVGA Precision uses Bluetooth. I could not get it to work on my Samsung Galaxy Tab, and they don't have an iPad/iPhone version.
MSI Afterburner polls your network looking for a broadcast. I could not get it to work on iPhone or iPad (it would crash), but it worked on the Samsung Galaxy Tab.

MSI Afterburner settings while gaming in WoW, adjusting until stable:
http://www.flickr.com/photos/[email protected]/9424294160/

http://www.flickr.com/photos/[email protected]/9424295282/

2) EVGA LED Controller for the 690
The LED on the EVGA 690 glows based on the setting. So if I set it to LED Breathing, it counts up and down.
If set to temperature, the LED on the 690 is kind of dim for me at 29C, at ~30%.
http://www.flickr.com/photos/[email protected]/9226255009/


----------



## hammertime850

Hey, I'm a proud owner of a gtx 690, man it's awesome.


----------



## PCModderMike

Wow tight fit. Nice though!


----------



## Arizonian

Quote:


> Originally Posted by *hammertime850*
> 
> 
> 
> Hey, I'm a proud owner of a gtx 690, man it's awesome.


Congrats on the 690!









Been loving mine since release, still enjoying the performance and killer FPS in games.


----------



## FiShBuRn

Quote:


> Originally Posted by *PCModderMike*
> 
> Video cards respond very well to water cooling. Great temps and great looks...would definitely recommend it.
> Here's my current setup.


What is the name of that bridge on the 690's block?


----------



## virgis21

I just wonder, what is the most demanding game out today? I just want to try how my system performs with it..

Virgis


----------



## PCModderMike

Quote:


> Originally Posted by *FiShBuRn*
> 
> What is the name of that bridge on the 690's block?


Are you talking about this?


It's an EK FC Bridge Single CSQ - Plexi version


----------



## hammertime850

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats on the 690!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Been loving mine since release, still enjoying the performance and killer FPS in games.


It's been great so far; the 2GB of VRAM is plenty even at 1600p. I just hope BF4 runs fine on it.


----------



## PCModderMike

Quote:


> Originally Posted by *hammertime850*
> 
> It's been great so far; the 2GB of VRAM is plenty even at 1600p. I just hope BF4 runs fine on it.


Same. I've also found the 2GB is plenty for all my games at 2560x1440... but yeah, I hope that remains the case when BF4 hits.


----------



## FiShBuRn

Quote:


> Originally Posted by *PCModderMike*
> 
> Are you talking about this?
> 
> 
> It's a EK FC Bridge Single CSQ - Plexi version


Nope, the other one - that one is only for one side, right? It came with the block...


----------



## PCModderMike

Quote:


> Originally Posted by *FiShBuRn*
> 
> Nope, the other one - that one is only for one side, right? It came with the block...


Yes, the other one comes with the block.

But just so we're on the same page...the piece with the red arrow pointing to it was what I linked you the first time and that did not come with the block...the piece with the yellow arrow pointing to it *does* come with the block.


----------



## FiShBuRn

That's what I need then







thanks for the help!


----------



## PCModderMike

No problem.


----------



## lascar

hi guys,

I'm a new one, greetings from France !

I bought my baby in August 2012, and I'm very proud of it. Crappy pics from my cellphone, I apologize:

http://www.hostingpics.net/viewer.php?id=314401IMG0437.jpg

http://img15.hostingpics.net/pics/825803IMG0438.jpg

http://www.hostingpics.net/viewer.php?id=823460IMG0417.jpg

So i have a question to ask ....

I recently bought an aftermarket air cooler, the Arctic Accelero Twin Turbo 690 (DCACO-V780001-BL):

I've unscrewed all the T6 Torx screws and the Phillips ones on the PCB,

but I can't remove the screws of the (back) slot bracket (I really apologize for my poor English... anyway, I'm just a frog )

Could anyone tell me which screwdriver (exact specs) I should use for these screws?

One screw is above the DP (DisplayPort), the other above the third DVI - picture here:

http://www.tweakpc.de/hardware/tests/grafikkarten/nvidia_geforce_gtx_690/i/geforce_gtx_690_16.jpg

Thx a lot people, for your further advice, tips and help!

Cya soon

PS: a new beta driver is out, 326.58, with full OpenGL 4.4 & GLSL 4.40 support

X64
http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/opengl/4.4/326.58_desktop_win8_winvista_win7_64bit_international.exe

x86
http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/opengl/4.4/326.58_desktop_win8_winvista_win7_international.exe


----------



## provost

Quote:


> Originally Posted by *lascar*
> 
> hi guys,
> 
> I'm a new one, greetings from France !
> 
> I bought my baby in August 2012, and I'm very proud of it. Crappy pics from my cellphone, I apologize:
> 
> http://www.hostingpics.net/viewer.php?id=314401IMG0437.jpg
> 
> http://img15.hostingpics.net/pics/825803IMG0438.jpg
> 
> http://www.hostingpics.net/viewer.php?id=823460IMG0417.jpg
> 
> So i have a question to ask ....
> 
> I recently bought an aftermarket air cooler, the Arctic Accelero Twin Turbo 690 (DCACO-V780001-BL):
> 
> I've unscrewed all the T6 Torx screws and the Phillips ones on the PCB,
> 
> but I can't remove the screws of the (back) slot bracket (I really apologize for my poor English... anyway, I'm just a frog )
> 
> Could anyone tell me which screwdriver (exact specs) I should use for these screws?
> 
> One screw is above the DP (DisplayPort), the other above the third DVI - picture here:
> 
> http://www.tweakpc.de/hardware/tests/grafikkarten/nvidia_geforce_gtx_690/i/geforce_gtx_690_16.jpg
> 
> Thx a lot people, for your further advice, tips and help!
> 
> Cya soon
> 
> PS: a new beta driver is out, 326.58, with full OpenGL 4.4 & GLSL 4.40 support
> 
> X64
> http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/opengl/4.4/326.58_desktop_win8_winvista_win7_64bit_international.exe
> 
> x86
> http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/opengl/4.4/326.58_desktop_win8_winvista_win7_international.exe


Any precision Phillips would do for these; it works for mine anyway. Also be careful when taking the stock cooler off, when unhooking the LED and fan connections (white and black). Then take a pic of the PCB with the thermal pads and thermal paste still on, before changing anything - this way you will never forget what goes where. Save all the stock thermal pads by putting them back on the stock cooler, then wrap it up and put it away in the same anti-static bag it came in.
Cheers!


----------



## Lukas026

hey

can anyone send me the BIOS files for GPU 1, GPU 2 and also the PLX chip, please? I am having some kind of trouble and want to flash the card with the latest BIOSes available.

I need NON-UEFI ones, EVGA preferably

My mail is: [email protected]

Thanks


----------



## Koniakki

Guys, I have been away from GTX 690 modded BIOS news for a couple of months now, so please bear with me on this question.

So, anything new regarding modded BIOSes / OC tools for our beloved 690s to increase performance/OC headroom, besides going under water?

I'm on a modded BIOS btw, at 1097MHz/6.8GHz.

Thanks.


----------



## Buzzkill

Quote:


> Originally Posted by *Koniakki*
> 
> Guys, I have been away from GTX 690 modded BIOS news for a couple of months now, so please bear with me on this question.
> 
> So, anything new regarding modded BIOSes / OC tools for our beloved 690s to increase performance/OC headroom, besides going under water?
> 
> I'm on a modded BIOS btw, at 1097MHz/6.8GHz.
> 
> Thanks.


The Hydro Copper 690 BIOS has a higher core clock (+9%) and boost clock (+3%). If you can get that BIOS from someone, you can flash a regular 690 fitted with a water block. You can also just use Precision X or Afterburner to set the clocks, but it's voltage locked. TechPowerUp only has regular 690 BIOSes; no one has uploaded the Hydro Copper BIOS.

GeForce GTX 690 Hydro Copper Signature, EVGA 04G-P4-2699-KR: Core Clock 993MHz, Boost Clock 1045MHz

GeForce GTX 690, EVGA 04G-P4-2690-KR: Core Clock 915MHz, Boost Clock 1018MHz

| Name | GPU Clock | Memory Clock | Other Changes |
|---|---|---|---|
| ASUS GTX 690 | 915 MHz | 1502 MHz | |
| ASUS ROG MARS III | 915 MHz | 1502 MHz | 4096 MB, 12.4 inches |
| EVGA GTX 680 2Win Gemini | 915 MHz | 1502 MHz | Quad-slot |
| EVGA GTX 690 | 915 MHz | 1502 MHz | |
| EVGA GTX 690 Hydro Copper | 993 MHz | 1502 MHz | Boost Clock: 1046 MHz |
| EVGA GTX 690 Hydro Copper Signature | 993 MHz | 1502 MHz | Boost Clock: 1046 MHz |
| EVGA GTX 690 Signature | 915 MHz | 1502 MHz | |
| Gainward GTX 690 | 915 MHz | 1502 MHz | |
| Galaxy GTX 690 | 915 MHz | 1502 MHz | |
| GIGABYTE GTX 690 | 915 MHz | 1502 MHz | |
| MSI GTX 690 | 915 MHz | 1502 MHz | |
| Palit GTX 690 | 915 MHz | 1502 MHz | |
| ZOTAC GTX 690 | 915 MHz | 1502 MHz | |


----------



## lascar

I had the Gainward stock BIOS on my 690: PT = 135% (360/375W), 1058/6000, 8-9 bins (on average load).

Now I use a modded ASUS BIOS (450W TDP): PT up to 150% (>450W), and on the default boost setting up to 1071MHz / 6000 GDDR5 - between 10-11 bins (max 12).

With the stock cooler in an Antec 1200 tower and the modded BIOS:

+120MHz offset on the GPU (1189-1202) and a +700MHz memory offset (+1.4GHz effective - 7.4GHz GDDR5)

Cheers,
lascar.

PS: thx for your advice Provost, but in my case, to put it in a nutshell: no way to unscrew it, not even with force
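The "bins" here are GPU Boost steps; Kepler's commonly cited bin size is 13MHz, so you can roughly translate bins into clocks (a sketch under that assumption - the actual boost a card reaches depends on its power and temperature headroom):

```python
KEPLER_BIN_MHZ = 13  # commonly cited GPU Boost 1.0 step size on Kepler

def boost_clock(base_mhz, offset_mhz, bins):
    """Estimate an observed clock: base clock + user offset + bins climbed."""
    return base_mhz + offset_mhz + bins * KEPLER_BIN_MHZ

# Stock GTX 690: 915MHz base; 8 bins up is exactly the rated 1019MHz boost.
```

This is also why two 690s at the same +115 offset can land on different clocks, as noted earlier in the thread - each card climbs a different number of bins.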


----------



## Lukas026

Would you be kind enough to upload / send me BOTH of your modded ASUS BIOS files and the PLX BIOS as well?

I would be really thankful


----------



## Lukas026

BTW I got my ASUS GTX 690 back from RMA: this one has low ASICs - only 64% and 63%. I was able to clock it like this:

+80 core
+500 mem
PT: 135%

Not a great overclocker, but I think it's decent. And with my Accelero Twin Turbo the card is dead silent


----------



## provost

Quote:


> Originally Posted by *lascar*
> 
> I had the Gainward stock BIOS on my 690: PT = 135% (360/375W), 1058/6000, 8-9 bins (on average load).
> 
> Now I use a modded ASUS BIOS (450W TDP): PT up to 150% (>450W), and on the default boost setting up to 1071MHz / 6000 GDDR5 - between 10-11 bins (max 12).
> 
> With the stock cooler in an Antec 1200 tower and the modded BIOS:
> 
> +120MHz offset on the GPU (1189-1202) and a +700MHz memory offset (+1.4GHz effective - 7.4GHz GDDR5)
> 
> Cheers,
> lascar.
> 
> PS: thx for your advice Provost, but in my case, to put it in a nutshell: no way to unscrew it, not even with force


Since the screws are not on the PCB, you can try using a Dremel to cut a slot across the screw head and then use a flathead if the screws are stripped. You would have to be very careful not to damage other parts of the cooler. Also, try warming up the screws with a hair dryer - you never know.


----------



## provost

Quote:


> Originally Posted by *Lukas026*
> 
> BTW I got my ASUS GTX 690 back from RMA: this one has low ASICs - only 64% and 63%. I was able to clock it like this:
> 
> +80 core
> +500 mem
> PT: 135%
> 
> Not a great overclocker, but I think it's decent. And with my Accelero Twin Turbo the card is dead silent


That seems to be a low ASIC for a 690. The average is north of 74%. Not that it matters.
You can overclock your core more at 135% PT, but go lower on the memory for better bench scores and a higher stable 24/7 overclock. By the way, 690s do like cooler temps, and a water block helps.


----------



## Cylas

Found this on Guru3d Link
Quote:


> *NCP4206* is also used on many modern NVIDIA reference design cards (Titan, 7x0 and some 6x0). MSI don't provide official voltage control via NCP4206 for such cards; however, you may try to unlock it by adding the following strings to the hardware profile (.\Profiles\VEN_10DE......cfg):


[Settings]
VDDC_Generic_Detection = 0
VDDC_NCP4206_Detection = 3:20h


----------



## lascar

My ASIC quality is also low (65 and 67%), but can I run the card without trouble 24/7 at +120MHz / +700MHz?

The latest version of GPU-Z says "BIOS reading not supported on this device", so I don't know - first time I've seen this!

http://www.hostingpics.net/viewer.php?id=469897gtx690.jpg

thanks a lot provost for your help, I really want to thank you - will try your trick with the hairdryer ASAP!

Cheers !


----------



## lascar

Lukas, try this - I follow this thread:

http://forums.guru3d.com/showthread.php?p=4339918

Here's a guide I wrote for modding your BIOS, if you want to. It's easy to follow as long as you follow everything correctly.

Both of my GPUs now go over 1200MHz when overclocked, which is great.

(I've also updated post no. 1 with this guide)
BIOS flashing instructions for any Nvidia Kepler GPU.
YOU DO THIS AT YOUR OWN RISK

I wrote this guide as I couldn't find one anywhere.

First of all, you need to download these files.

GPU-Z (This is used to grab the original BIOS from the GPU, whether multi or single)
http://www.techpowerup.com/downloads/2181/mirrors.php

KGB - Kepler BIOS Editor/Unlocker. (This is used to modify the BIOS that you just grabbed from your GPU using GPU-Z)

KGB supports: GTX690, GTX680, GTX670, GTX660Ti, GTX660OEM and GTX660
https://www.dropbox.com/s/vrunxuq03vj0m5y/kgb_0.5.zip

NVFLASH (This is used to reflash the GPU with the newly modified BIOS)
http://www.softpedia.com/progDownloa...oad-16133.html

Steps 1-9.

1. First, make a USB dongle bootable into DOS. To do this, read this very easy to follow guide:
http://www.bay-wolf.com/usbmemstick.htm

2. Run GPU-Z and click the icon that lets you grab the BIOS from the GPU and save it to your PC.

The pictures below show how.....

Also make sure you keep a backup copy of your original BIOS!!

3. Once you have grabbed your BIOS, put it into C:\Users\My and rename it 1.rom.

For example, the BIOS from GTX 690 GPU 1 is called GK104.rom; I renamed it to 1.rom, and the BIOS from GPU 2 I renamed to 2.rom.

The reason you should put the BIOS in C:\Users\My is that KGB.exe's default path is set there, so if you are not familiar with the command line (CMD) this is the easiest way of doing it.

4. Take KGB.exe and the KGB.cfg file and place both of these in C:\Users\My as well.
You should now have the KGB.exe, KGB.cfg and 1.rom files in C:\Users\My.

If you have more than one GPU, you will have 1.rom and 2.rom respectively in C:\Users\My.

5. Next, open up CMD, which I'm sure you all know how to do.
Once you have CMD open, copy and paste in this command:

kgb 1.rom unlock (press enter)

It will save the new values (i.e. from kgb.cfg) into your BIOS, then print out the new values in the BIOS.

6. Now all we need to do is put the new modified 1.rom back onto the GPU. To do this we use the bootable USB DOS dongle created in step 1.

Once you have extracted the nvflash files, place them onto the dongle; there should be 2 files.

Then place the 1.rom BIOS on the dongle as well.

7. The dongle is now ready to boot from, so reboot the PC and go into the motherboard BIOS.
Set the PC to boot from the dongle; once it does, you will see a flashing cursor telling you that you are in DOS mode.

8. Type in nvflash --list (press enter)
This will tell you what GPU(s) you have installed on your motherboard.

9. Type in nvflash -i1 1.rom (press enter)

It will say it is flashing the GPU and ask you to press Y to confirm, so press Y; the flash takes a few seconds.

If you have a second GPU with 2.rom:

Type in nvflash -i2 2.rom (press enter)

Again press Y to confirm; the flash takes a few seconds.

Once this is done, reboot the PC, go into the motherboard BIOS and reset your boot order so it boots from the hard drive with your Windows OS on it.

Once you get into Windows you can see from MSI Afterburner or EVGA Precision that the power target can be raised from 130 to 150.

When gaming, the boost clock should be higher, depending on your GPU.

I achieved an increase of around 30MHz on the boost clock alone on both GPUs, which is very good.

It will differ for each person and each GPU; you can also try other people's BIOSes as long as it's the same GPU as yours.

You can also edit KGB.cfg and tweak the power settings for even more headroom, but I wouldn't advise this unless you know what you are doing; I would just leave it at stock for the time being.

Everything you do is at your own risk ! ! !
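One extra sanity check worth doing before flashing: a valid video BIOS image, like any PCI expansion ROM, begins with the signature bytes 0x55 0xAA, so you can quickly confirm the .rom you're about to flash at least looks like a VBIOS. A sketch only - it's no substitute for keeping your stock backup:

```python
def looks_like_vbios(rom_path):
    """True if the file starts with the PCI option-ROM signature bytes 55 AA."""
    with open(rom_path, "rb") as f:
        return f.read(2) == b"\x55\xaa"
```

If this returns False on the file GPU-Z gave you, don't flash it - re-dump the BIOS instead.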


----------



## FiShBuRn

Quote:


> Originally Posted by *Micko*
> 
> Good news guys!
> 
> A new beta of MSI Afterburner just came out and it brings improved support (voltage unlocking) for cards which have the NCP4206 voltage regulator. According to Unwinder, that chip is used on many reference GTX 6xx/7xx/Titan cards.
> 
> Link to official thread at Guru3D
> 
> Unwinder's post where he explains how to unlock the voltage
> 
> I tried the tweak and it really works. Out of the box, 1.212V was the limit for my card; after the tweak, the upper limit is 1.300V.
> At 1.212V I could finish Valley with a 1267MHz boost clock, and at 1.3V the card passed at 1333MHz. Not bad. Temps were about 7-8C higher.
> 
> However, the voltage does not drop when the card is idling, so I won't be using this tweak for everyday gaming. Benching is another story though..


Anyone tried this? I'm far away from mine...


----------



## Koniakki

Quote:


> Originally Posted by *FiShBuRn*
> 
> Anyone tried this? I'm far away from mine...


IT WORKS!! IT FREAKING WORKED!

I don't know if the voltage shown is indeed the one the GPU truly uses - without proper testing equipment I cannot verify it. But Afterburner shows it, so I will play along for now.

I'll run some benchmarks to verify it.


----------



## Buzzkill

Quote:


> Originally Posted by *Koniakki*
> 
> IT WORKS!! IT FREAKING WORKED!
> 
> I don't know if the voltage shown is indeed the one the GPU truly uses - without proper testing equipment I cannot verify it. But Afterburner shows it, so I will play along for now.
> 
> I'll run some benchmarks to verify it.


You can use EVGA OC Scanner X to read the card, like GPU-Z. Here's what EVGA NV-Z 0.5.2 GPU monitoring shows.


----------



## lascar

Works flawlessly, OMG: the Nvapi.dll lock is blown out!

Just another day in paradise: MSI Afterburner Beta 14.

Try this in the settings file, ONLY FOR THE GTX 690:

"Huge thanks and props for pointing this out to us!
Btw I used the values below, as the other value resulted in a locked voltage option:

[Settings]
VDDC_Generic_Detection = 0
VDDC_NCP4206_Detection = 3:20h"

Proof of the GTX 690 unlocked edition!

Love Russia, love Unwinder from Guru3D!

http://www.hostingpics.net/viewer.php?id=666764690.jpg


----------



## provost

Quote:


> Originally Posted by *lascar*
> 
> Works flawlessly, OMG: the Nvapi.dll lock is blown out!
> 
> Just another day in paradise: MSI Afterburner Beta 14.
> 
> Try this in the settings file, ONLY FOR THE GTX 690:
> 
> "Huge thanks and props for pointing this out to us!
> Btw I used the values below, as the other value resulted in a locked voltage option:
> 
> [Settings]
> VDDC_Generic_Detection = 0
> VDDC_NCP4206_Detection = 3:20h"
> 
> Proof of the GTX 690 unlocked edition!
> 
> Love Russia, love Unwinder from Guru3D!
> 
> http://www.hostingpics.net/viewer.php?id=666764690.jpg


So does anyone here have any benches using this?


----------



## Lukas026

I tried the voltage hack and it works, though I am on air, so I will stay at stock voltage for now. BTW thanks for the guide; I unlocked my PT to 150%, but I am still only able to run +70 core and +500 memory stable.

I mean game stable, in titles like FC3 / Crysis 3 / Tomb Raider. For benchmarking, I could probably go a little higher...

Anyway, I wanted to ask you guys. When I got my ASUS GTX 690 back from RMA, there was a bad bios on GPU2 and I had to reflash. Can this be used as grounds for another RMA? I mean I got a card that was certainly used before by someone else, even though ASUS claimed they sent me a whole NEW card. Also, as I said earlier, my ASIC values are really low, and I don't know if this card was used for some kind of benching before. There are also scratches on the back side of the PCB.

This has happened to me a second time. Can I demand some kind of refund, or at least a REALLY new card?


----------



## Koniakki

Quote:


> Originally Posted by *provost*
> 
> So does anyone here have any benches using this?


Did a Valley run at 1215-1228 MHz! My card could never go above 1175 MHz and stay stable in any situation, whether benchmark, game, or in-game benchmark. Never!

And that's really fricking amazing!!

Also played Far Cry 3 and Crysis 3 with 2x SMAA, and tried TXAA High as well! No problems. But they were quick tests, like 5-6 min each.

*P.S:* A quick little fun fact: I was doing my routine check (once a month, usually) for bios updates (MB, GPU, etc.), and when I didn't find anything new, I posted my previous post #4808 to see if I had missed any news about overclocking our beloved 690s, since I hadn't followed the thread for a while, and the same day we got *Voltage control*!!

That's a real coincidence!


----------



## FiShBuRn

I've tested and it worked!

Now, is it safe to use this voltage, 1.3 V? Even under water?


----------



## provost

Quote:


> Originally Posted by *Koniakki*
> 
> Did a Valley run at 1215-1228 MHz! My card could never go above 1175 MHz and stay stable in any situation, whether benchmark, game, or in-game benchmark. Never!
> 
> And that's really fricking amazing!!
> 
> Also played Far Cry 3 and Crysis 3 with 2x SMAA, and tried TXAA High as well! No problems. But they were quick tests, like 5-6 min each.
> 
> *P.S:* A quick little fun fact: I was doing my routine check (once a month, usually) for bios updates (MB, GPU, etc.), and when I didn't find anything new, I posted my previous post #4808 to see if I had missed any news about overclocking our beloved 690s, since I hadn't followed the thread for a while, and the same day we got *Voltage control*!!
> 
> That's a real coincidence!


What score did you get in Valley? Can you post it in the Valley thread and here? I have a pretty good idea of how 690s do on water in Valley, as I have pushed mine quite a bit. But I am benching another card right now, and will plug the 690 back in when I get a moment.


----------



## Koniakki

Quote:


> Originally Posted by *provost*
> 
> What score did you get in Valley? Can you post it in the Valley thread and here? I have a pretty good idea of how 690s do on water in Valley, as I have pushed mine quite a bit. But I am benching another card right now, and will plug the 690 back in when I get a moment.


I have scores in most benchmark threads here, so I will definitely update them.

But for now I'm getting some fluctuations on GPU1 when upping the voltage. It's probably my bios. I will try flashing back the stock one, or maybe even the Hydro Copper one.

Btw, would someone be kind enough to share here or PM me *the Hydro Copper 690 BIOS*, please?

Thanks.


----------



## MrTOOSHORT

eVGA Hydro GTX690.zip 111k .zip file


The voltage unlocks make me wish I never sold my GTX 690 and Titan cards.


----------



## Lukas026

Is there any advantage to flashing the HC BIOS on a reference GTX 690, or is it just clocked a little higher?


----------



## FiShBuRn

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> eVGA Hydro GTX690.zip 111k .zip file
> 
> 
> The voltage unlocks make me wish I never sold my GTX 690 and Titan cards.


Do you think the 690's VRM can handle that voltage?


----------



## lascar

On my last card, a Radeon 5970 (40 nm process), I bumped the vcore up to 1.45 V for benches and 1.35 V for 24/7 gaming sessions, without any trouble on air cooling.


----------



## Proxish

Ugh, who do I have to kill for a 690 :/


----------



## lascar

Useless; in my case, flashing a new bios just lets the card at stock settings boost 1 or 2 bins (13 MHz steps) higher than the previous bios did.
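For anyone wondering what a bin is actually worth: since GPU Boost moves clocks in roughly 13 MHz steps, the gain from one or two extra bins is easy to estimate. A minimal sketch; the 13 MHz step is the commonly reported Kepler figure, 1019 MHz is the 690's rated boost clock, and your card's real bins may differ:

```python
BIN_MHZ = 13  # Kepler's GPU Boost moves clocks in ~13 MHz steps ("bins")

def boosted_clock(base_mhz: int, bins: int) -> int:
    """Clock after GPU Boost steps `bins` bins up (or down, if negative)."""
    return base_mhz + bins * BIN_MHZ

# Two extra bins over the GTX 690's rated 1019 MHz boost clock:
print(boosted_clock(1019, 2))  # 1045
```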

1.3 V seems a bit high for the 28 nm process over long periods on air cooling; under WC it should be OK.

Quick bench...

GRID 2, downscaled from 3200x1800, 8x AA, maxed out:

I'm hitting OCP (over-current protection) at 1280 MHz / PT 150% / 1.295 V (+125 mV).

http://imagesup.org

I will try other games, lowering the vcore on both GPUs and upping the memory offset, to find the best ratio of core clock vs memory...

And the worst that can happen in the end... an RMA for your card.









Let's see.

http://imagesup.org
Stay tuned!


----------



## Koniakki

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> eVGA Hydro GTX690.zip 111k .zip file
> 
> 
> The voltage unlocks make me wish I never sold my GTX 690 and Titan cards.


Thanked and Repped...


----------



## PinzaC55

Quote:


> Originally Posted by *Proxish*
> 
> Ugh, who do I have to kill for a 690 :/


One went on eBay recently (not mine, nor did I bid). There was another one a week later as a "Buy It Now" at £450.


----------



## Furlans

Has anyone tried the voltage control with the latest MSI AB? I saw that you can give 1.3 V to the GPUs.


----------



## FiShBuRn

Yes, it works!

I've tested it on mine and it's working fine... although I have a strange situation: in benchmarks like Valley, GPU 1 doesn't reach the max boost speed, while GPU 2 is always at maximum. Temps are OK, below 55 °C.

But if I play games like BF3, Crysis 3 or Metro LL, both GPUs run at max boost speed!

Any ideas?


----------



## Proxish

Quote:


> Originally Posted by *PinzaC55*
> 
> On ebay recently (not mine nor did I bid)  There was another one a week later "Buy It Now" at £450.


That is amazing...
I'll definitely have to keep a lookout for one at around that price.


----------



## Proxish

Thought people might be interested in these over on eBay.
I don't have the money for one, but someone here might be looking for an SLI upgrade.
http://www.ebay.co.uk/itm/EVGA-NVIDIA-GTX-690-4-GB-2-Year-EVGA-Warranty-/321189290067?pt=PCC_Video_TV_Cards&hash=item4ac85fa053


----------



## provost

Quote:


> Originally Posted by *FiShBuRn*
> 
> Yes, it works!
> 
> I've tested it on mine and it's working fine... although I have a strange situation: in benchmarks like Valley, GPU 1 doesn't reach the max boost speed, while GPU 2 is always at maximum. Temps are OK, below 55 °C.
> 
> But if I play games like BF3, Crysis 3 or Metro LL, both GPUs run at max boost speed!
> 
> Any ideas?


Yeah, there is probably a power throttle somewhere, and no one has been able to put a custom bios on a 690 successfully, except for kingpin and a guy named grubio or something.
Take off Precision and KBoost, and don't run them alongside AB. Water helps with cooling.


----------



## FiShBuRn

I'm not using a custom bios, only the softmod via MSI AB Beta 14, and my card is under WC.


----------



## lascar

Each wafer yield is specific, and so is each game/software workload.

My GTX 690 will not perform like yours; maybe mine boosts higher, maybe yours does. Mine needs 1.175 V to run at 1200 MHz, while your better-"yielded" card might only need 1.125 V to maintain stability at 1200 MHz...

I hit the same issue on the heaviest workloads with the GTX 690 when overvolting it:

You have to try different settings to find the best balance between GPU clock vs memory clock vs voltage vs the TDP threshold.

The only way to prevent GPU0 (or GPU1) from throttling back is to lower your voltage. At higher voltage your GPU takes more "juice", it easily blows past your TDP limit, and the power target limit starts to kick in...

Lower the voltage and the power target limit is no longer exceeded: the GPU will not start throttling.

Run an application like Heaven 4.0 or Crysis 3 and find the lowest vcore needed to maintain exactly the same core clock on both GPUs, while keeping a good temperature on both cores and not exceeding the power target limit you already set.

I think you will enjoy better performance with both GPUs running at 1215 MHz without throttling, rather than the first GPU still at 1247 MHz and the second one fluctuating between 1084 MHz (70% of the time) and 1247 MHz (3% of the time), if you know what I mean...

The GTX 690 uses a specific over-current protection (OCP); the GPU Boost feature monitors each GPU, calculating consumption not for the whole GTX 690 but for each GPU independently.

Power states are measured directly at the MOSFETs and VRM circuitry (GPU0/GPU1/MEM1/MEM2/PLX switch).

The next "extreme" step is to totally disable the power target limit, the "virtual threshold" set by NVIDIA (by putting a 300% PT threshold in the bios). But don't worry, the GTX 690 has several safety mechanisms to prevent frying it.
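To put the voltage-vs-throttle trade-off in rough numbers: dynamic power scales roughly with V^2 * f, which is why a small voltage drop buys back so much power-target headroom. A back-of-the-envelope sketch only; the reference voltage and clock below are placeholder figures, not values read from any card:

```python
def relative_power(v: float, f_mhz: float,
                   v_ref: float = 1.175, f_ref: float = 1200.0) -> float:
    """First-order dynamic-power estimate, P ~ V^2 * f, relative to a reference point."""
    return (v / v_ref) ** 2 * (f_mhz / f_ref)

# Dropping 50 mV at the same 1200 MHz cuts the estimated draw by ~8%,
# often the difference between tripping the power target or not:
print(round(relative_power(1.125, 1200.0), 3))  # 0.917
```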









Give it a try!

Greetings from Paris; the sun is hot like the ladies :D

I apologize for my poor English; it was really hard to translate geeky technical stuff from French.


----------



## FiShBuRn

Yeah, the power limit is the problem... too bad then; this AB softmod doesn't bring me more performance...


----------



## lascar

So try to disable the PT limit: set it to +300% in the KGB software, softmod your bios one more time, and retry.

Your hardware is under WC, so I think it's worth it.

My point of view.

Bon courage l'ami.

Cheers.


----------



## Koniakki

Quote:


> Originally Posted by *lascar*
> 
> So try to disable the PT limit: set it to +300% in the KGB software, softmod your bios one more time, and retry.
> 
> Your hardware is under WC, so I think it's worth it.
> 
> My point of view.
> 
> Bon courage l'ami.
> 
> Cheers.


I'm getting serious throttling in Valley too, on GPU1. GPU2 seems perfectly fine. Hmm.


----------



## lascar

MAX ROCK STABLE on my GTX 690, stock cooler, VID 2.225V and 2.275V:

Heaven: +160 MHz, 1233 MHz on both cores without throttling

Valley: +175 MHz, 1247/1274 MHz on both cores without throttling

Crysis 3: +170 MHz, 1233/1247 MHz on both cores without throttling

Metro 2033/LL: +195 MHz, 1253/1280 MHz on both cores without throttling

Unreal Engine games: +200 MHz, 1267/1280 MHz on both cores without throttling

3DMark 11/13: +175 MHz, 1247/1274 MHz on both cores without throttling

Memory offset between +600 and +675 MHz (7.3/7.4 GHz QDR)

This weekend I'll fit the Accelero Xtreme for my lady and tweak the bios to set a 300% threshold for the PT. Once that's done, if the GPU still throttles back, either:

- you have already hit the max settings your card can support and can't go further; that's a fact... wait for the next process node. The GTX 690's 8+2-phase VRM circuitry can't supply enough juice...

- or your PSU is too weak on amps, or your motherboard can't supply enough current, and the over-current protection kicks in.


----------



## lascar

Look at GPU-Z: at the end of the sensor list, check the OCP/OVP protection readings. If you are experiencing throttling issues, you should definitely watch the info reported by GPU-Z, like real voltage, OCP, etc.


----------



## Lukas026

Man, I hope you meant 1.225 V.







(If it really was 2.225 V, then how did you do it?)

Anyway, I am also getting throttling on GPU1 with anything above 1.23 V. I am using a modded bios with PT 150%. Is there any guide on how to make a bios with a 300% PT?

Also, is there any danger in using a 300% bios if temps stay cool? For example, if I played games with settings like this:

300% PT
+150 MHz core @ 1.25 V
+500 MHz memory

and temps never went above 80 °C, could I somehow kill the card?

Thanks


----------



## cpachris

I'd like to have some custom 690 backplates made. Can anybody point me toward someone doing them? I've heard dwood is not doing any right now.


----------



## lascar

It's always risky... indeed, it's your choice: would you jeopardize the health of your card?

Even if the risk is minimal, there is the fact that it will shorten your video card's lifetime...

So is it worth it? I can't answer that question, but you definitely should...

In my case, I'm not concerned that my GTX 690 might run for "only 4 or 5 years" before its first failure, instead of 8 or 10.


----------



## Sypherian

Hello guys, I'm having a bit of a problem with my GTX 690 which I'm hoping you can help me out with.

Right, about 4 months ago I started having really high temps on my card when playing just about any game; even in DOTA 2 it sometimes reaches 80 °C.

I of course immediately got concerned and started looking for the problem, but without any luck. I've tried almost everything I can think of: reinstalling drivers (clean install, of course) and reinstalling MSI Afterburner. I even dismounted all the case fans, including the two on my H80 in push-pull config, and cleaned them all; that gave me a couple of degrees, but not much.

Then today, while playing a game, I noticed my GPU temps were at 89-91 °C (fan set to max in MSI Afterburner), so I started trying to locate some form of inconsistency.

Even though the multi-GPU config is set to "maximize 3D performance", it only used one core, so I shut down the game and disabled multi-GPU mode: same performance and temps in game.

After no solution and a lot of googling, I noticed that in MSI Afterburner my GPU2 memory usage is always at max, even when I'm idle in Windows, while GPU1's is really low. So I once again tried disabling multi-GPU mode, and GPU2's memory usage fell to zero while GPU1's stayed the same as before. I re-enabled multi-GPU, and GPU2's memory usage shot back up to max while GPU1's stayed the same.

Is my card faulty in some way? My idle temp in Windows is 40 °C, which is mind-boggling considering that my fan is set to max in MSI Afterburner.

Hope you guys can help me; I'm at my wits' end here.

Sypherian


----------



## lascar

Driver issues?

Try running Driver Sweeper in troubleshooting (safe) mode under Windows, then do a clean, fresh install of the WHQL drivers, 320.49 or 314.18...

Maybe that will fix your issue.

The only thing I can tell you is that I was also experiencing high temperatures (>95 °C) in games for months (since Feb 2013), and the drivers were at fault (every driver between the 310 and 320.49 versions raised the temperature very high, and I don't know why...).

I solved my problem by installing the latest beta, 320.80.

I hope it helps you.


----------



## FiShBuRn

Quote:


> Originally Posted by *Sypherian*
> 
> Hello guys, I'm having a bit of a problem with my GTX 690 which I'm hoping you can help me out with.
> 
> Right, about 4 months ago I started having really high temps on my card when playing just about any game; even in DOTA 2 it sometimes reaches 80 °C.
> 
> I of course immediately got concerned and started looking for the problem, but without any luck. I've tried almost everything I can think of: reinstalling drivers (clean install, of course) and reinstalling MSI Afterburner. I even dismounted all the case fans, including the two on my H80 in push-pull config, and cleaned them all; that gave me a couple of degrees, but not much.
> 
> Then today, while playing a game, I noticed my GPU temps were at 89-91 °C (fan set to max in MSI Afterburner), so I started trying to locate some form of inconsistency.
> 
> Even though the multi-GPU config is set to "maximize 3D performance", it only used one core, so I shut down the game and disabled multi-GPU mode: same performance and temps in game.
> 
> After no solution and a lot of googling, I noticed that in MSI Afterburner my GPU2 memory usage is always at max, even when I'm idle in Windows, while GPU1's is really low. So I once again tried disabling multi-GPU mode, and GPU2's memory usage fell to zero while GPU1's stayed the same as before. I re-enabled multi-GPU, and GPU2's memory usage shot back up to max while GPU1's stayed the same.
> 
> Is my card faulty in some way? My idle temp in Windows is 40 °C, which is mind-boggling considering that my fan is set to max in MSI Afterburner.
> 
> Hope you guys can help me; I'm at my wits' end here.
> 
> Sypherian


Those temps are really high, not normal I think; you must have a lot of throttling going on for sure... do you OC it?

I would change the TIM for sure...


----------



## tin0

Guys, I'm trying to flash my GTX 690 with a modded bios, just for a higher power target, so I can add more voltage via AB Beta 14.
However, I keep getting a hierarchy mismatch error when using nvflash. I even tried flashing the stock bios; it still comes up with this error.
Also, the PLX chip is not listed when using the list command in nvflash.
What to do? The card is fully watercooled and currently does 1202 MHz at max 48 °C; it could do so much more with some more juice.


----------



## grunion

Quote:


> Originally Posted by *tin0*
> 
> Guys, I'm trying to flash my GTX 690 with a modded bios, just for a higher power target, so I can add more voltage via AB Beta 14.
> However, I keep getting a hierarchy mismatch error when using nvflash. I even tried flashing the stock bios; it still comes up with this error.
> Also, the PLX chip is not listed when using the list command in nvflash.
> What to do? The card is fully watercooled and currently does 1202 MHz at max 48 °C; it could do so much more with some more juice.


Nvflash in Windows?


----------



## tin0

Yes.

Edit: I bricked the card this morning, but fixed it by flashing the modded Hydro Copper bios via DOS. Now I've got voltage control up to 1.3 V.
I've been able to run 1240 MHz on both cores, but the power limit keeps capping it if I go higher. I already created a custom bios with PT 300%, but that doesn't seem to do anything (the slider in AB only goes up to 200).

So, for anyone getting the hierarchy mismatch error when trying to flash their GTX 690 under Windows: the solution is to do it in DOS.

Thanks grunion for putting me on the right track.


----------



## FiShBuRn

If you use the latest version of nvflash for Windows, you will not see that hierarchy error anymore...


----------



## tin0

Tried that, didn't do anything for me.


----------



## FiShBuRn

This one: NVFlash 5.142 for Windows ?


----------



## provost

I guess not a lot of people are asking for a modded 690 bios that removes the power throttle. We may have to ask some experts to look into it. The 690 is tricky with its dual GPUs, which is why we have not seen a successful modded bios that removes all the other kinds of pre-programmed throttling. But I bet someone knows how to do it... make a post in various forums, including this one, and you may be able to get some help. Just an idea.


----------



## killbom

Add me!

Got my barely used 690. AWESOME CARD!


----------



## tin0

Quote:


> Originally Posted by *provost*
> 
> I guess not a lot people asking for modded 690 bios to remove power throttle. May have to ask some experts to look into it. 690 is tricky with dual GPUs and this is why we have not seen a successful modded bios that remove all other kind of pre-programmed throttling. But, I bet someone knows how to do it....make a post in various forums including this one and you may be able to get some help. Just an idea


Yeah, I guess it all boils down to that. I can set 1.3 V on my GTX 690, but the power limit is holding it back (max 150%). I've already made a custom bios with a 500% power limit, but in reality it just doesn't go higher than 150%. I've been trying all day with different custom bioses (different vendors and different settings for max voltage and power limit).
I was able to complete GT2, GT3, and GT4 in 3DMark 11 at 1241 MHz (where I could only do 1215 MHz before), but GT1 throttles like crazy and makes the card downclock all the way to 1050 MHz.
So while it's nice for us GTX 690 owners to finally have voltage control, the power limit is still holding us back.

Maybe Skyn3t or someone else can look into this?
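For a feel of what those percentages mean in watts: the power target is just a multiplier on the board power budget. A rough sketch, assuming the 100% baseline equals the 690's published 300 W board power; vendor bioses may define the baseline differently:

```python
BOARD_POWER_W = 300  # NVIDIA's published board power for the GTX 690

def power_target_watts(pt_percent: float, base_w: float = BOARD_POWER_W) -> float:
    """Watts the card is allowed to draw at a given power-target percentage."""
    return base_w * pt_percent / 100.0

print(power_target_watts(150))  # 450.0 W budget at the 150% cap
```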


----------



## FiShBuRn

If you set the power limit to 200% it will work; I've got mine at 200%!


----------



## tin0

I've actually set it to 500% in my current bios, but it still hits the limit; that shouldn't be possible.


----------



## PinzaC55

Quote:


> Originally Posted by *killbom*
> 
> Add me!
> 
> Got my barely used 690. AWESOME CARD!


Welcome to the Elite


----------



## Koniakki

PT (Power Target) might as well stand for Power Throttle... Dammit!


----------



## egotrippin

I haven't commented in here in a while. Does anybody know a good way to support the weight of my GTX 690 + Koolance waterblock? I bought an EVGA backplate originally, before realizing how flimsy it was, and resold it (I could literally have folded it in half with a few fingers). The weight of my card causes it to sag and warp over time. I'm using a thick piece of stiff tube squeezed under the card to push upwards. The temps of the two GPUs are normally within a couple of degrees of each other, but without support that variance increases by 10-15 °C, which means the warping of the PCB is affecting the GPUs' contact with the cooling block. Does anybody know of any elegant solutions?


----------



## tin0

Lol, I can't finish a 3DMark session at 1215 MHz without throttling like crazy, but I can play Metro LL at a fluid 1228 MHz boost clock... strange stuff. I really hope there is a way to lift the power limit on this thing.


----------



## Proxish

To everyone looking to buy a GTX 690 over in the States: there are a ton of 690s going for $650-$700 in the warehouse deals on Amazon's sale section.
http://www.amazon.com/gp/offer-listing/B007ZRO3U4/ref=sr_1_1_olp?ie=UTF8&qid=1378078449&sr=8-1&keywords=690&condition=used

I was going to order one; unfortunately they won't ship to the UK.


----------



## Renairy

Quote:


> Originally Posted by *Proxish*
> 
> To everyone looking to buy a GTX 690 over in the States: there are a ton of 690s going for $650-$700 in the warehouse deals on Amazon's sale section.
> http://www.amazon.com/gp/offer-listing/B007ZRO3U4/ref=sr_1_1_olp?ie=UTF8&qid=1378078449&sr=8-1&keywords=690&condition=used
> 
> I was going to order one; unfortunately they won't ship to the UK.


You don't want one... 2GB is no longer enough, especially with the new games coming. 3GB is the bare minimum now, imo.


----------



## egotrippin

Quote:


> Originally Posted by *Renairy*
> 
> You don't want one... 2GB is no longer enough, especially with the new games coming. 3GB is the bare minimum now, imo.


It appears that anti-aliasing is what eats up my VRAM; if you're gaming at 2560x1440 and turn anti-aliasing off, it's perfect. As far as I can tell, anti-aliasing doesn't help edges at that resolution; what it does do is make surfaces appear a bit more shiny and artificial, which I don't like. For a price in the area of $700 it blows all the other cards out of the water.

If anybody wants my EVGA GTX 690 Signature Edition + Koolance VID-NX690 waterblock (already attached, with a generous amount of Shin-Etsu MicroSi G751 TIM), let me know. I haven't decided to sell it, but if somebody is willing to buy it I might entertain offers. It overclocks at over +130 MHz on the GPU, and the memory takes a 300-500 MHz overclock, maybe more (I've noticed I get a higher overclock since switching to an X79 platform, but I haven't pushed it to find out how high). Temps have always been kept at or below 40 °C.


----------



## Proxish

Quote:


> Originally Posted by *Renairy*
> 
> You don't want one... 2GB is no longer enough, especially with the new games coming. 3GB is the bare minimum now, imo.


Those are 4GB; 690s don't come smaller than that.


----------



## tin0

And the VRAM argument is getting old. Having owned a GTX 780, I can say my FPS is far better with my GTX 690; the raw GPU power more than makes up for the 1GB difference.
I haven't had a single VRAM problem at 1440p, even with 4x MSAA across different games. It probably helps that we have ~21k graphics-score power on our hands.


----------



## PinzaC55

Quote:


> Originally Posted by *Proxish*
> 
> To everyone looking to buy a GTX 690 over in the states, there are a ton of 690's going for $650-$700 on amazon's sale section under the warehouse sales.
> http://www.amazon.com/gp/offer-listing/B007ZRO3U4/ref=sr_1_1_olp?ie=UTF8&qid=1378078449&sr=8-1&keywords=690&condition=used
> 
> I was going to order one, unfortunately they won't post to the UK.


Assuming you have an Android phone, save a search on eBay for "GTX 690" so it alerts you to new listings. I have seen them go for £410 and £450 (used) recently.


----------



## PCModderMike

Quote:


> Originally Posted by *Renairy*
> 
> You don't want one... 2GB is no longer enough, especially with the new games coming. 3GB is the bare minimum now, imo.


2GB is still plenty for single-monitor gaming. Even at 2560x1440, which is what I use, I don't come close to using 2GB of memory, and that's playing newer titles as well.
Quote:


> Originally Posted by *Proxish*
> 
> Those are 4gb, 690's don't come smaller than that.


Advertising it as having 4GB of memory is misleading. It's 2GB per GPU; even though that's 4GB in total, memory in SLI does not add together, so you still only have an effective 2GB.
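The arithmetic is simple but worth spelling out. A tiny sketch of advertised vs usable VRAM, assuming standard AFR SLI where each GPU mirrors the same working set:

```python
def sli_vram_gb(per_gpu_gb: int, num_gpus: int) -> dict:
    """Advertised vs effectively usable VRAM for an AFR SLI setup."""
    return {
        "advertised": per_gpu_gb * num_gpus,  # what the box advertises
        "usable": per_gpu_gb,                 # each GPU holds a full copy of the data
    }

print(sli_vram_gb(2, 2))  # {'advertised': 4, 'usable': 2}
```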


----------



## w8tdstrgecube

Proof attached.


----------



## flv1333

Quote:


> Originally Posted by *lascar*
> 
> it's always risky .... indeed it 's your choice : would you jeopardize the health of your GFX...
> 
> and if risk is minimal, the fact that it will shorten the lifetime your Videocard...
> 
> So it's worth it ? i can't answer that question but you definitely should ...
> 
> In my case ... i'm not concerned by the fact that my gtx 690 will run for 'only 4 or 5 years" before the first failure... instead of 8 or ten years.


How would I go about changing and flashing my 690's BIOSes? Are there any guides available?


----------



## Koniakki

Quote:


> Originally Posted by *flv1333*
> 
> How would I go about changing and flashing my 690's BIOSes? Are there any guides available?


I can provide you with automated, foolproof commands for the flashing. It's really easy and needs no user interaction besides pressing Enter at the end to close the CMD window.
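Roughly, the sequence is: list the adapters, back up both GPUs, then flash each index. This is a sketch only; the file names are placeholders, and you should double-check the flags against your nvflash version's own usage text before trusting them:

```shell
REM Confirm both GPUs (and the PLX bridge) show up
nvflash --list
REM Back up each GPU's stock bios first
nvflash --index=0 --save gpu0_stock.rom
nvflash --index=1 --save gpu1_stock.rom
REM Flash the modded image to each GPU; -5/-6 override the mismatch checks
nvflash --index=0 -5 -6 modded.rom
nvflash --index=1 -5 -6 modded.rom
pause
```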

Let me know if interested.


----------



## ocsi1970

Hi guys! Does changing the GTX 690's cooler lead to loss of warranty?


----------



## Lukas026

Depends on the manufacturer, but I think not...


----------



## ocsi1970

It's a GIGABYTE.


----------



## Djnardu

I just saw the BF4 system requirements today, and the VRAM requirement kinda bummed me out.

Now I'm contemplating selling my 690.


----------



## iARDAs

Quote:


> Originally Posted by *Djnardu*
> 
> I just saw the BF4 system requirements today, and the VRAM requirement kinda bummed me out.
> 
> Now I'm contemplating selling my 690.


You are gaming at 1440p, right? I am sure you will be fine, though there might be rare occasions where you hit the VRAM wall. But again, I am sure the 690 will be just fine.

That being said, at ultra settings with my previous 590 I used to occasionally hit the VRAM wall on a 120 Hz 1080p TN panel.

All of a sudden the FPS would lock itself to 40, not a tad more.


----------



## Djnardu

Quote:


> Originally Posted by *iARDAs*
> 
> You are gaming at 1440p, right? I am sure you will be fine, though there might be rare occasions where you hit the VRAM wall. But again, I am sure the 690 will be just fine.
> 
> That being said, at ultra settings with my previous 590 I used to occasionally hit the VRAM wall on a 120 Hz 1080p TN panel.
> 
> All of a sudden the FPS would lock itself to 40, not a tad more.


Yeah, I know it helps that I can turn off AA and such, but I'm wondering if I could get around $600 or more for the 690 and buy a single 780 until I need two.

With all the power of this card, if it hits the VRAM wall it pretty much negates the horsepower. It's like a sports car in the snow...


----------



## iARDAs

Quote:


> Originally Posted by *Djnardu*
> 
> Yeah, I know it helps that I can turn off AA and such, but I'm wondering if I could get around $600 or more for the 690 and buy a single 780 until I need two.
> 
> With all the power of this card, if it hits the VRAM wall it pretty much negates the horsepower. It's like a sports car in the snow...


Yeah, I know how you feel. It was the only reason I sold my 590. I used to love that card, but not its VRAM, to be honest.

A 780 will give you less performance than the 690, though. You might not be happy unless you get two of them.

My OCed Titan at ultra settings with 4x AA handles BF3 just fine, for example, but I am sure BF4 will benefit from a 2nd Titan to fully run the game at 1440p.


----------



## Djnardu

Maybe I'll hold off until the 880...


----------



## cpachris

I'm hoping for a 790 announcement.


----------



## iARDAs

Quote:


> Originally Posted by *cpachris*
> 
> I'm hoping for a 790 announcement.


I read somewhere that the 790 might not be happening, and it makes sense.

Nvidia has lately only done one dual-chip GPU per generation:

the 295 was from its own generation (I forget the name),

the 590 was Fermi,

the 690 is Kepler.

So perhaps one dual-GPU card per generation. And since Maxwell is incoming, we can only hope for an 890. I don't think the 790 will be happening, but no one can know for sure except Nvidia.


----------



## virgis21

Yes, VRAM size is becoming an issue with 690s (1600 MB used).

Here is a screenshot of my second monitor while I am playing Battlefield 3 at Ultra.
The 3770K is OCed.
Do you think 2 GTX 780s would be better than one 690? It seems like the same setup, but with 3GB of VRAM?!
The 690 claims 4GB, but it is really 2x2GB, and both pools fill with the same data.


----------



## iARDAs

Quote:


> Originally Posted by *virgis21*
> 
> Yes, VRAM size is becoming an issue with 690s (1600 MB used).
> 
> Here is a screenshot of my second monitor while I am playing Battlefield 3 at Ultra.
> The 3770K is OCed.
> Do you think 2 GTX 780s would be better than one 690? It seems like the same setup, but with 3GB of VRAM?!
> The 690 claims 4GB, but it is really 2x2GB, and both pools fill with the same data.


Oh, 2 780s will definitely be better than a 690.

When I OC my Titan I can get stock-690 performance (of course the 690 can still pull ahead when OCed as well).

Since an OCed 780 is also similar to a stock Titan, it is fair to say that 780 SLI will be faster than a 690.

http://www.guru3d.com/articles_pages/geforce_gtx_780_sli_review,1.html

In the worst-case scenario the difference is minimal, but it still favors 780 SLI.


----------



## virgis21

Hmm.







Don't want to sell my 690; it would be a waste of money..








OK, let's consider another scenario: say the 690 isn't powerful enough for a new game, so I add another one (in SLI). What about VRAM then? Future games will need more and more VRAM, so would upgrading the 690 to SLI be pointless?
Or am I thinking about it the wrong way? Thanks!

virgis

P.S. found this:
http://www.game-debate.com/gpu/index.php?gid=878&gid2=593&compare=geforce-gtx-790-vs-geforce-gtx-690


----------



## iARDAs

Quote:


> Originally Posted by *virgis21*
> 
> Hmm.
> 
> 
> 
> 
> 
> 
> 
> Don't want to sell my 690, will be waste of money..
> 
> 
> 
> 
> 
> 
> 
> 
> Ok, lets think another scenario, lets say 690 isn't powerful enough for new game, then I add another one (in SLI) and what about VRAM? Future games will need more and more VRAM, so 690 will be pointless to upgrade into SLI?
> Or I am thinking wrong way? Thanks!
> 
> virgis
> 
> P.S. found this:
> http://www.game-debate.com/gpu/index.php?gid=878&gid2=593&compare=geforce-gtx-790-vs-geforce-gtx-690


Honestly, for me, quad-SLI 690s are pointless. Way too much power but too little VRAM; you will still have 2 GB of effective VRAM at the end. 2 GB of VRAM is still OK in pretty much all games, but the new generation could be different.

Also, that link seems to be a hoax, as there is no info on the 790 as of now. There are rumors that it will have two GK110 chips, though.


----------



## anubis1127

Quad SLI is kinda crappy in general; it can be a pain, and the scaling isn't that good. I had a buddy with 2 690s, and whenever we would try to game it would take him multiple reboots to get his SLI enabled and functioning.


----------



## lascar

Virgis, your result seems a little bit odd...

I can play it maxed out with downscaling from 1800p (3200x1800 + MSAA x4 + Ultra) with ease...

Average VRAM utilisation is 2018 MB on my GTX 690... at stock, both cores run at 1071 MHz... 1243 MHz when overclocked, without issues or throttling.









Average 60 fps @ 3200x1800 with a poor Penryn Q9950 @ 4.1 GHz and an OCed GTX 690 in Battlefield 3... you have a much more powerful CPU!









Both GPUs are still @ 900 MHz.... unless you're playing on a 15-inch flat screen @ 720p.







... so my question: which screen are you using at home: 1080p / 1440p / surround, etc.?

.... The 690 is the most powerful card ever made, and VRAM issues are very rare: Max Payne 3 @ 2880x1620 + MSAA x8 will for sure blow past the 2x2 GB of GDDR5... or Skyrim with thousands of mods and 8K textures.

I have plenty of games installed on my rig; VRAM was never the issue.... but CPU bottlenecking was







( in my case ... not yours for sure)


----------



## PCModderMike

2GB has been plenty for me on all my games at 2560x1440.....but shew, these new recommendations DICE has released do have me a little worried.
http://www.pcgamer.com/2013/09/10/battlefield-4-system-requirements-released/?utm_source=fb&utm_medium=emp

I really don't feel like trying to sell my 690, especially with the block and all....I was hoping to get a couple of years out of it at least. Still, should I sell now, try to get as much as possible, and go for something like a 780? Or stick with it and see how it does?


----------



## iARDAs

Quote:


> Originally Posted by *PCModderMike*
> 
> 2GB has been plenty for me on all my games at 2560x1440.....but shew, these new recommendations DICE has released does have me a little worried.
> http://www.pcgamer.com/2013/09/10/battlefield-4-system-requirements-released/?utm_source=fb&utm_medium=emp
> 
> I really don't feel like trying to sell my 690, especially with block and all....was hoping to get a couple of years out of it at least. Still, should I sell now and try to get as much as possible and go for something like a 780? Stick with it and see how it does?


A single 780 will be a downgrade from a 690. Grab 2. Or, if you like, grab two 4 GB 770s for the time being and enjoy them until Maxwell?

If you have the cash, though, a 780 SLI system, or at least a single Titan for the time being, would be awesome.


----------



## anubis1127

Quote:


> Originally Posted by *PCModderMike*
> 
> 2GB has been plenty for me on all my games at 2560x1440.....but shew, these new recommendations DICE has released does have me a little worried.
> http://www.pcgamer.com/2013/09/10/battlefield-4-system-requirements-released/?utm_source=fb&utm_medium=emp
> 
> I really don't feel like trying to sell my 690, especially with block and all....was hoping to get a couple of years out of it at least. Still, should I sell now and try to get as much as possible and go for something like a 780? Stick with it and see how it does?


Remember it's an "optimized for AMD" game; that 3 GB recommendation is probably just marketing. Most GPUs with 3 GB of VRAM on the market aren't even fast enough to use 3 GB of VRAM until you get into CF or SLI.

I would definitely wait until you buy the game and play it; then, if you feel you are out of VRAM, make a decision. I wouldn't be surprised if 2 GB is still just fine for BF4 and several upcoming games.


----------



## PCModderMike

Quote:


> Originally Posted by *iARDAs*
> 
> That 780 will be a downgrade from a 690. Grab 2. OR if you like Grab 2 770s for the time being with 4G and enjoy it untill Maxwell?
> 
> If you have the cash though a 780 SLI system or atleast a single Titan for the time being would be awesome.


Yeah, I know that power-wise the 780 would be a downgrade from my 690...just thinking about that memory buffer is all. I wouldn't have the cash for 780 SLI, plus I'm mATX and I use a sound card, so I'm a single-card kind of guy. After reading anubis1127's comment below, it might be best to just see how it does.
Quote:


> Originally Posted by *anubis1127*
> 
> Remember its an "optimized for AMD game", that 3gb recommendation is probably just marketing. Most GPUs with 3gb vram on the market aren't even fast enough to use 3gb vram until you get into CF, or SLI.
> 
> I would definitely wait until you buy the game, play it, then if you feel you are out of vram make a decision. I wouldn't be surprised if 2gb is still just fine for bf4, and several upcoming games.


Good point about the marketing hype. I don't even come close to 2GB currently in BF3, and I can't imagine BF4 will be a huge difference...at least I hope. Guess it will just be best to wait and see how it does.


----------



## iARDAs

Quote:


> Originally Posted by *PCModderMike*
> 
> Yea I know power wise, the 780 would be a downgrade from my 690...just thinking about that memory buffer is all. Wouldn't have the cash for 780 SLI, plus I'm mATX and I use a sound card, so I'm a single card kind of guy. After reading anubis1127's comment below, might be best to just see how it does.
> Good point about the marketing hype. I don't even come close to 2GB currently in BF3, I can't imagine BF4 will be a huge difference...at least I hope. Guess it will just be best to wait and see how it does.


Yeah, definitely. The best way to be sure is testing it yourself.

Even if you hit the VRAM wall (which I doubt), you can probably lower the anti-aliasing a tad and be fine.

The 690 is a great GPU.


----------



## provost

Quote:


> Originally Posted by *PCModderMike*
> 
> 2GB has been plenty for me on all my games at 2560x1440.....but shew, these new recommendations DICE has released does have me a little worried.
> http://www.pcgamer.com/2013/09/10/battlefield-4-system-requirements-released/?utm_source=fb&utm_medium=emp
> 
> I really don't feel like trying to sell my 690, especially with block and all....was hoping to get a couple of years out of it at least. Still, should I sell now and try to get as much as possible and go for something like a 780? Stick with it and see how it does?


You will get a lot of mileage out of the 690, unless you want to go to a multi-monitor setup and crank up anti-aliasing. And if you are planning to go multi-monitor, don't be surprised if you end up with a tri-SLI setup of some kind, which means more $$$$ compared to what you would get out of selling the 690.
The 690 is one of the best price/performance plays out there at the moment, IMHO, with used 690 prices being low due to all the unwarranted negative publicity.


----------



## virgis21

Quote:


> Originally Posted by *lascar*
> 
> Virgis your result seems a little bit odd ...
> 
> I can play it maxed out with downscalling 1800p (3200x1800p + msaa x4 + ultra) with ease ...
> 
> Vram average utilisation is 2018 Mb on my gtx 690 ... on stock both cores running at 1071 mhz ... 1243 mhz when overclocked without issues or throlling.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> average 60 fps @ 3200x1800p with a poor Peryn Q9950 @ 4.1ghz and o/ced gtx 690 on battlefield 3... u have a much powerfull cpu !
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Both gpu's are still @ 900 mhz .... unless your playing with a 15 inch flat screen @ 720P
> 
> 
> 
> 
> 
> 
> 
> ... so my question ? wich screen are you using at you home : 1080p/1440p / surround etc....
> 
> .... 690 is the most powerfull card ever made, Vram issue is very rare : Max payne 3 @ 2880x1620p + msaa x8 will blow for sure the 2x2 gb of gddr5 ram,.... or skyrim with thousand of mods and 8k textures.
> 
> I've plenty of games installed on my rig, vram was never the issue .... but cpu bottlenecking was
> 
> 
> 
> 
> 
> 
> 
> ( in my case ... not yours for sure)


Actually, I am running a 60 Hz monitor and enabled vertical sync (or whatever it's called) to limit FPS to 60. That is why the GPU isn't fully loaded and the frequency is so low. I will give it a try at full power. At the moment I get ~80 FPS in BF3.

The BF4 beta is less than 4 weeks away, so let's wait









Will confirm resolutions tonight


----------



## iARDAs

Quote:


> Originally Posted by *virgis21*
> 
> Actually, i am running 60hz monitor and cheched that vertical sync or what name it is to limit FPS to 60hz. That is why gpu isn"t overloaded and frequency so low. Will give a try on full power. At the moment I have ~80FPS in BF3.
> 
> BF4 beta is less than 4 weeks to wait, so lets wait


Yeah, disable the vsync. Try locking the frames per second via MSI Afterburner or EVGA Precision instead. You might enjoy it a lot. This is how I play my games.

And yeah, since the 690 is an incredibly strong GPU, your cores will not use all their power at 60 Hz in 1080p.


----------



## virgis21

Quote:


> Originally Posted by *iARDAs*
> 
> Yeah disable the vsync. Try to lock the frames per second via MSI afterburner or Evga Precision. You might enjoy it a lot. This is how I play my games.
> 
> And yeah since 690 is an incredible strong GPU, your cores will not utilize all the power at 60hz in 1080p


Does FPS above 60 make sense when the monitor is 60 Hz?


----------



## iARDAs

Quote:


> Originally Posted by *virgis21*
> 
> Does fps make sense then monitor is 60hz?


I don't get the question 

But having owned 2-3 60 Hz monitors, I always hated vsync. I don't like enabling vsync in any game, as it introduces lag, even in crappy games. By locking my frames per second to 62 or 59, I get a fluid ride without any of the disadvantages of vsync, the GPU doesn't work as hard, and I get less screen tearing than with no vsync.
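Since this frame-capping trick comes up a lot in the thread: tools like Afterburner/Precision do it at the driver level, but the idea boils down to sleeping off each frame's leftover time budget. A toy Python sketch of the concept (illustrative only; the function name and numbers are made up for the example):

```python
import time

def run_frame_limited(render_frame, fps_cap=59, frames=60):
    """Call render_frame repeatedly, sleeping off leftover time so the
    average frame rate never exceeds fps_cap."""
    budget = 1.0 / fps_cap                 # per-frame time budget (~16.9 ms at 59 fps)
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        render_frame()                     # stand-in for the real render call
        spent = time.perf_counter() - frame_start
        if spent < budget:
            time.sleep(budget - spent)     # give back the unused budget
    elapsed = time.perf_counter() - start
    return frames / elapsed                # effective fps over the run

fps = run_frame_limited(lambda: None, fps_cap=59, frames=60)
print(round(fps))                          # close to the 59 fps cap on an idle machine
```

Capping to 59 instead of 60 keeps the GPU from ever outrunning the display, which is why it cuts tearing without vsync's input lag.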


----------



## tin0

Still, there's a thing called VRAM utilization, meaning a game can use as much VRAM as it has to work with. If your graphics card has 3 GB of VRAM, some games might use all of that even when it's not really necessary; it's allocated nevertheless.
A GTX 690 (especially with 1 screen) will be fine for a few years


----------



## virgis21

Quote:


> Originally Posted by *tin0*
> 
> GTX690 (especially with 1 screen) will be fine for a few years


I like this statement


----------



## PCModderMike

Quote:


> Originally Posted by *tin0*
> 
> Still theres a thing called vram utilization. Meaning the same game can use as much vram as it has to work with. Like, if your gfx card has 3gb vram, some games might use that while it's not really neccesary but it's allocated never the less.
> GTX690 (especially with 1 screen) will be fine for a few years


I like the whole statement. Makes sense; doesn't BF3 already do that, sort of? It will just eat up whatever amount of available VRAM there is, even if it doesn't really need it.


----------



## virgis21

Here is my screenshot. This PC is not made for games only: there are 2 monitors, one running via HDMI (a 27" LG 3D TV; I do a lot of 3D video editing), and the resolution is only 1080p (DVI to HDMI).







So with this setup I got this








http://imageshack.us/a/img713/8507/1xn8.jpg (click this link for high rez picture)

Virgis


----------



## virgis21

So everyone here is playing Battlefield 3?







Should we make a GTX 690 club Battlefield 3 Tournament?







So only 690 users play?









Virgis

P.S. What temps are the max for 690s? As far as I can see, 80°C is only 30% fan speed..


----------



## flv1333

Hell yeah man I'm up for that


----------



## virgis21

Guys, can you check my Valley benchmark results and compare with yours? It seems not much better than a GTX 780 (other thread):


----------



## Atzenkeeper500

Is your CPU overclocked? If it's not, maybe it's stopping the GPUs from performing to the maximum.


----------



## virgis21

Quote:


> Originally Posted by *Atzenkeeper500*
> 
> Is your CPU overclocked? If its not, maybe its stops the GPUs to performe to the Maximum


Yes, to 4.8 GHz. And it is not overloaded: around 60%. I would like to compare with the other 690 owners here.

Virgis


----------



## mfranco702

Quote:


> Originally Posted by *Atzenkeeper500*
> 
> Is your CPU overclocked? If its not, maybe its stops the GPUs to performe to the Maximum


I don't think Valley registers the real clock speed; it happened to me before while running an SLI of 680 Lightnings. BTW, that's kinda low for a dual GPU; I scored 104.1 FPS.


----------



## PCModderMike

My monitor is 2560x1440....but I can give my system a Valley run later once back at home, and I'll just run at 1920x1080. You just selected the extreme HD preset and didn't change anything else?


----------



## lascar

As you can see, Virgis, you have NO issue with VRAM; look at GPU-Z "memory used": 1680 MB used out of 2048 MB.

Also note that the frequencies of your GPU under the Valley demo were wrong; there is a bug...









PS:

And GPU throttling reasons:

- 60 fps capped game (UE3 engines, etc.)

- 3D workload is too low or GPU scaling is limited (bad GPU SLI scaling or CPU limited).... try to max out your graphics settings; use OGSSAA / SGSSAA / SSAA or other supersampling or downscaling techniques

- V-sync enabled / double/triple buffering

- Adaptive power management enabled in the NVIDIA control panel

- Adaptive V-sync also enabled in the NVIDIA control panel

- CPU bottlenecking?

- VRAM bottlenecking (GDDR5 memory / ring bus fully used)?

- Bad PSU: insufficient or faulty voltage

- TDP threshold exceeded
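A quick way to work through a checklist like this is to watch per-GPU clocks while the game runs. The query flags below are standard `nvidia-smi` CLI options; the sample output is hardcoded here so the sketch runs without a GPU, and the 900 MHz "boost floor" is just an assumed threshold for a 690:

```python
import csv, io

# Sample output of:
#   nvidia-smi --query-gpu=index,clocks.sm,power.draw,temperature.gpu --format=csv,noheader,nounits
# (hardcoded so the sketch runs anywhere; a real 690 shows two entries, one per GK104)
SAMPLE = """0, 914, 140.2, 78
1, 705, 95.0, 81
"""

def find_throttled(csv_text, boost_floor_mhz=900):
    """Flag GPUs whose SM clock sits below the expected boost floor."""
    suspects = []
    for row in csv.reader(io.StringIO(csv_text)):
        idx, sm_clock, power, temp = (field.strip() for field in row)
        if int(sm_clock) < boost_floor_mhz:
            suspects.append((int(idx), int(sm_clock), float(power), int(temp)))
    return suspects

print(find_throttled(SAMPLE))  # GPU 1 at 705 MHz is below the assumed 900 MHz floor
```

If a core sits well below its boost clock under full load, the list above (power limit, vsync, CPU limit, etc.) is where to start looking.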


----------



## virgis21

Quote:


> Originally Posted by *PCModderMike*
> 
> My monitor is 2560x1440....but I can give my system a Valley run later once back at home, and I'll just run at 1920x1080. You just selected the extreme HD preset and didn't change anything else?


Yes, only extreme HD


----------



## ocsi1970

Hi guys! Does anyone have an Arctic Accelero Twin Turbo on their 690, and can you tell me what the temperatures are like compared to stock? Thanks!


----------



## tin0

Any news on how to get by the 150% power limit on our cards?


----------



## fast_fate

I got a few of these....

Driver issues cause only 5 GPUs to show sometimes; I think it might be a GPUGrid hiccup.
So now and then I have to do clean driver installs to get the extra chip back online; at the moment 3 cards are only showing 5 GPUs.

On the day of posting, this was the number 1 machine worldwide on GPUGRID.


Here is the machine stats.

In total I have 2 x Asus and 2 x EVGA GTX 690 cards.
I have blocks for them all; I need to install them now before summer starts to kick in in hot Western Australia.



3DMark Firestrike run done a while back


----------



## mtbiker033

Anyone else have trouble with GPUs not downclocking?

I'm using Afterburner beta 15 (have also tried Precision 4.2.1) and the clocks stay at 915 MHz at all times.

tried changing prefer maximum performance to adaptive, same issue

chrome browser hardware acceleration off

??


----------



## AllGamer

Hey guys, have any of you updated to the latest *327.23* drivers yet?

After I installed the new drivers, things now open super fast, but....



I get these weird extra windows when *Surround View is Enabled*; if I don't enable Surround View, then it's normal.

Also, with the new drivers it now won't let me drag the Start Menu bar to the 3+1 monitor; it can only be placed within the 3 monitors of the Surround View.









Seems like nVidia pulled an ATI here; the new Surround View seems like an emulated screen, unlike before.

At least the good thing is they finally made On/Off switches to quickly swap between a normal screen and Surround View; this feature was waaaaay overdue


----------



## tin0

I use a single U2711 @ 2560x1440 so I can't comment on your surround problem. However, this driver seems stable and gives excellent performance for the games I play (BF3, Metro LL, C&C alpha) even @ constant OC to 1202mhz.


----------



## juanP

So I started my PC this morning and my screen looks like this: vertical lines all across three monitors. It was working fine last night.

Anybody know what the issue might be... hopefully not the 690.


----------



## Arizonian

Are you connected with a DVI-D cable? Have you tried a different DVI-D cable? It could be monitor-related. Start with the simple things.


----------



## superx51

I had a long talk with my friend last night, who is an engineer at Nvidia, and they will be releasing a GTX 790 that uses two full 780s on one PCB, like the 690. It will have 3 GB of RAM x2 and a lower TDP than 780 SLI, like the 690. They tried to develop a 790 using Titans but couldn't fit 6 GB x2 on the board, plus I guess the dual Titans created too much heat and required too much power. I believe Nvidia will be announcing it by the end of the month, but I'm not exactly sure when they plan to unveil it. Sounds pretty cool to me!


----------



## superx51

It will be unveiled at the Montreal Gaming Event on the 16th of October. 9 days to sell my 690. Lol


----------



## iARDAs

Quote:


> Originally Posted by *superx51*
> 
> I had a long talk with my friend last night who is a engineer at Nvidia and they will be releasing a gtx 790 that uses 2 full 780s on one pcb like the 690. It will have 3 gigs of ram x2 and a lower tdp than 780 sli like the 690. They tried to develop a 790 using titans but couldn't fit 6 gb x2 on the board, plus I guess the dual titans created to much heat and required to much power . I believe nvidia will be announcing it by the end of the month, but I'm not exactly sure when they plan to unveil it. Sounds pretty cool to me!


Wow really?

If true that is amazing man. Great info.


----------



## superx51

I don't know how much of an improvement it would be, considering my 690 hits 20,500 in 3DMark 11 overclocked and SLI 780s hit about the same. I use a 2560x1600 HP monitor.


----------



## iARDAs

Quote:


> Originally Posted by *superx51*
> 
> I don't know how much of a improvement it would be considering my. 690 hits 20500 3d mark 11 overclocked and sli 780s hit about the same. I use a 2560x1600 hp monitor ?


I would definitely think the improvement would be nice over the 690. Plus the extra 1 GB of VRAM could come in handy.


----------



## superx51

If I get $650 for it and buy the new one for $1000, that's $350 for a 10% increase?
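The math here is worth making explicit, since it's the right way to frame any upgrade: net out-of-pocket cost divided by performance gained. A tiny sketch using the numbers from the post (the 10% gain is superx51's pessimistic guess, not a benchmark):

```python
def cost_per_percent(resale, new_price, perf_gain_pct):
    """Net out-of-pocket cost per 1% of extra performance."""
    return (new_price - resale) / perf_gain_pct

# Numbers from the post: sell the 690 for $650, buy a 790 for $1000,
# assuming (pessimistically) only a 10% performance increase.
print(cost_per_percent(650, 1000, 10))   # 35.0 -> $35 per 1% of extra performance
```

At $35 per percent, the upgrade only makes sense if the real-world gain turns out meaningfully higher than 10%.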


----------



## iARDAs

Quote:


> Originally Posted by *superx51*
> 
> If I get 650 for it and buy the new one for 1000$ that's 350$ for a 10% increase?


I think it will be more than a 10% increase.

Also, SLI 780s should score more than your OCed 690. Or are those 780s stock?


----------



## Arizonian

The price tag will be crazy, unlike the decently priced 690.

Two 780s could be another 590 fiasco, unlike the 690, which was made from two 256-bit GPUs. I hope they don't have a repeat.

It will be a great card if it works out; there's a niche of consumers it could fill a gap for. Makes you wonder if they are going to drop 780 prices, and whether the 790 will be priced at 2x accordingly rather than cost more.

EDIT TO ADD: I'm going to be hanging on to my 690 at least another two years, as I am handing it down to my second rig this month. Glad I purchased the extra EVGA insurance, which will have me protected under warranty for a total of five years. It's been the best GPU purchase I've ever made when all is said and done. Had it since day-one launch.


----------



## iARDAs

Quote:


> Originally Posted by *Arizonian*
> 
> Price tag will be crazy unlike the decently priced the 690.
> 
> Two 780's could be another 590 fiasco unlike the 690 made from two 256 bit GPU's. Hope they don't have a repeat.
> 
> Will be a great card if it works out. A niche of consumers it could fill gap for. Makes you wonder if they are going to drop the 780 prices and if the 790 be priced x2 accordingly rather than cost more.


I hope they lower the prices of the 780 so I can go SLI sooner.

What will you do in the future? wait for 8xx series or perhaps 9xx series?

That 690 is still amazing.


----------



## superx51

According to my friend, it will cost no more than the 690. But the TDP cap will make it slightly less powerful than 780 SLI, just like the 690 is. So I don't know if $350 is worth 1 GB of RAM and a small increase in performance. I still love my 690, though; I got it the day it came out and it's still the most powerful GPU ever.


----------



## Arizonian

Quote:


> Originally Posted by *iARDAs*
> 
> I hope they lower the prices of the 780 so I can go SLI sooner.
> 
> What will you do in the future? wait for 8xx series or perhaps 9xx series?
> 
> That 690 is still amazing.


My 120 Hz 3D Vision gaming rig is getting the 690 now. I'm selling its 680 and jumping ship to play with an R9 290X; when/if prices go down, I'm looking to Crossfire it. That is my next two-year plan.

A new computer build when Intel releases the next-gen CPU after Haswell is my road map.


----------



## iARDAs

Quote:


> Originally Posted by *Arizonian*
> 
> My 120 Hz 3D vision gaming rig is getting the 690 now. Selling its 680 and I'm jumping ship to play with an R9 290X and when prices / if prices go down looking to crossfire it is my next two year plan.
> 
> New computer build when Intel release next gen CPU after Haswell is my road map.


Solid plan. A 290X for now and a second one later will really make you happy, I am sure of it.

Will you go for the 290X because of Mantle?


----------



## Arizonian

Quote:


> Originally Posted by *iARDAs*
> 
> Solid plan. 290x for now and a 2nd one later will really make you happy I am sure of it.
> 
> Will you go to 290x for mantle?


In all honesty, I don't even know if Mantle is going to be a success or not, so I'm not putting my eggs in that basket. I'm just buying it as a new toy to play with.









If it gives me issues, who knows, I might be back to an Nvidia card in July, when they reportedly announce 20 nm fabrication. If I'm still happy and AMD announces price cuts to compete with the new Nvidia cards, I'll be Crossfiring a second one at that time.


----------



## superx51

I'm not a fan of AMD. The 790 will be the best again for at least a year and a half! But since I'm sooooo close to that, I will keep the 690.


----------



## cpachris

I hope Aquacomputer does a block quickly for the GTX790. I think their blocks are sharp.


----------



## PCModderMike

Broke 16K

http://www.3dmark.com/3dm11/7279145


----------



## superx51

I believe my 690 will beat, or be damn close to, the 290X, and since AMD drivers suck and frame rates are all over the place, I'll be sticking with the 690 until it can't run a game on Ultra at 2560x1600 at 60+ frames. Lol, really, it does it with no problem. Anyone ever seen this? I fell for it for 2 weeks and almost bought a second 690, lol. http://www.techngaming.com/home/news/pr/updated-nvidia-announces-4-card-geforce-gtx-690-sli-bigger-video-memory-r686


----------



## superx51

4 cards, 8 GPUs, and 16 GB of video memory would be crazy! The dual 680 that uses 4 GB is crazy enough!


----------



## superx51

The 690 has dual 680s with 2 GB of video memory each, so it's a 4 GB card. Each GPU uses its memory separately to do half the work. Each GPU fully utilizes its memory, so yeah, it is a 4 GB card. If it were a 2 GB card, it would only have 1 GB for each GPU and it would be slow as balls.


----------



## Arizonian

Quote:


> Originally Posted by *superx51*
> 
> The 690 has dual 680s with 2 gigs of video memory each, so it's a four gig card. Each card uses the memory separately to do half the resolution. Each card fully utilizes it's memory so yea it is a 4 gig card. If it was a. 2 gig card it would only have 1 gig for each gpu and it would be slow as balls


Yes....effectively it's only a 2 GB GPU. Each frame is handled one at a time, back and forth between the two GPUs on the single PCB, just as if they were in SLI.

One huge kudos to the 690 is the 'frame metering' that was implemented at the hardware level. IMO it handles like a single GPU, because I don't see the micro-stutter while gaming that I was expecting from a dual GPU.
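To make the VRAM point concrete: in alternate-frame rendering, each GPU keeps its own full copy of the textures and buffers, so capacity mirrors rather than adds. A toy sketch of that accounting (the function is made up for illustration, not any real API):

```python
def effective_vram(per_gpu_vram_gb, gpu_count, mirrored=True):
    """In alternate-frame rendering each GPU holds a full copy of the
    working set, so usable capacity is one GPU's pool, not the sum."""
    if mirrored:                          # AFR SLI/CrossFire: data duplicated per GPU
        return per_gpu_vram_gb
    return per_gpu_vram_gb * gpu_count    # hypothetical pooled memory, for contrast

print(effective_vram(2, 2))   # 2 -> a GTX 690 marketed as "4 GB" gives 2 GB usable
print(effective_vram(2, 4))   # 2 -> quad-SLI 690s: still only 2 GB usable
```

This is why adding a second 690 raises frame rates but never raises the VRAM ceiling, which was the crux of the earlier BF4 discussion.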


----------



## iARDAs

Quote:


> Originally Posted by *Arizonian*
> 
> Yes....Effectively it's only a 2GB GPU. Each frame is handled one at a time back and forth between the two GPU's on single PCB just like if it was in SLI.
> 
> One huge kudos to the 690 is the 'frame metering' that was implemented on a hardware level. IMO it handles like a single GPU because I don't see the micro-stutter while gaming that I was expecting from a dual GPU.


Yeah, when I had a 590, I never experienced microstuttering either. I love dual-GPU cards.


----------



## PCModderMike

Quote:


> Originally Posted by *Arizonian*
> 
> Yes....Effectively it's only a 2GB GPU. Each frame is handled one at a time back and forth between the two GPU's on single PCB just like if it was in SLI.
> 
> One huge kudos to the 690 is the 'frame metering' that was implemented on a hardware level. IMO it handles like a single GPU because I don't see the micro-stutter while gaming that I was expecting from a dual GPU.


The last dual-GPU card I owned was a 295....a world of difference with the 690. As you said, kudos to the way the frame metering works; it really does feel like I'm gaming on a single super-powerful GPU.


----------



## mtbiker033

Quote:


> Originally Posted by *Arizonian*
> 
> Price tag will be crazy unlike the decently priced the 690.
> 
> Two 780's could be another 590 fiasco unlike the 690 made from two 256 bit GPU's. Hope they don't have a repeat.
> 
> Will be a great card if it works out. A niche of consumers it could fill gap for. Makes you wonder if they are going to drop the 780 prices and if the 790 be priced x2 accordingly rather than cost more.
> 
> EDIT TO ADD: i'm going to be hanging on to my 690 at least another two years as I am handing it down to my second rig this month. Glad I purchased the extra EVGA insurance which will have me protected under warranty for total of five years. It's been his best GPU purchase I've ever made when all said and done. Had it since day one launch.


I'm with you on this; I plan on keeping my 690 for a while, and I also purchased the extended warranty from EVGA!


----------



## virgis21

Guys, how is it going with the Battlefield 4 beta?
I updated to the beta drivers (made specially for the upcoming BF4, too) and got 70+ FPS! Love Ultra









Virgis


----------



## flv1333

Quote:


> Originally Posted by *virgis21*
> 
> Guys, how is going with Battlefield 4 Beta?
> updated drivers to beta (special made for BF4 upcoming too) and got 70+ FPS! Love Ultra
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Virgis


Yep, I get about the same; it really depends on whether there is a lot going on, though. On Conquest after the building falls, the "C" flag seems to be the worst, dipping to the low 60s, but still nice and smooth.







I love my 690


----------



## laz0rbrain

Hi guys

Can anyone tell me if a new CPU for my GTX 690 will be much help in BF4, etc.? Right now I have an i7-960, not overclocked, and I don't plan to OC it again because I had some problems. I play at 1440p and I think my CPU is bottlenecking my card. What do you think? And do you have any recommendations as to what CPU I should get?

Thanks


----------



## iARDAs

Well what is your budget?

I own a 3570k and it works great. Easy to OC to 4.2 GHz, with no bottlenecking whatsoever.

3770k

4770k

3930k

4930k are all perfect CPUs for you.

If you are willing to spend a bit, grab a 3930k or a 4930k. Better for future games perhaps.


----------



## laz0rbrain

I was also thinking about the 3770K, or maybe an i5. But will I feel the difference? Or is it still the card that slows the game down?

Thanks


----------



## virgis21

Quote:


> Originally Posted by *laz0rbrain*
> 
> I was also thinking about the 3770k or maybe an i5. But will I feel the difference? Or is it still the card that slows the game down?
> 
> Thanks


You need to check what the CPU is doing while gaming. If I remember correctly, my 3770K @ 4.8 GHz is loaded around 50-70% on Ultra (1080p).
And sure, take a look at the FPS you get.
Oh, and update the Nvidia drivers to the latest beta (with improvements for BF4) that will come out with the BF4 release.

Virgis


----------



## DreadyDK

Okay, people, this is the deal: I'm planning a new mATX build. I have an Asus GTX 690 that I'm pretty happy with; it runs pretty much everything with sensible frames and settings. I know it sounds pretty crazy, but I want 2 cards in this build, both for performance but mostly for looks.







Should I sell my GTX 690 and buy 2 other cards, or add one more for quad-SLI?

Pros/cons?

Which other cards?


----------



## Furlans

Quote:


> Originally Posted by *DreadyDK*
> 
> Okay ppl this is the deal, im planing a new mAtx build. I have a Asus GTX 690 that im pretty happy with, runs pretty much everything with sensible frams and settings. i know it sounds pretty crazy but i want 2 card in this build, both for prefomence but mostly for looks
> 
> 
> 
> 
> 
> 
> 
> Should i sell my GTX 690 and buy 2 other cards or add one more Q-SLi ?
> 
> Pros/Cons ?
> 
> Withs other cards ?


I think the GTX 690's time has ended...
I sold my GTX 690 and I am going to buy a GTX 780/780 Ti, thinking about a future SLI...


----------



## DreadyDK

Quote:


> Originally Posted by *Furlans*
> 
> I think that GTX 690's time is ended...
> I sold my GTX 690 and i am going to buy a GTX 780/780Ti and thinking about a future SLI...


Yeah, guess you are right; it's the best card I ever had, though. But I've got 2 used EVGA GTX 780 Hydro Coppers on my hands, and I'll get about the same for my 690 as for one of the 780s.







maybe ill do that









Thanks for the feedback mr


----------



## Arizonian

Honestly, on a single monitor, as long as you push more FPS than your refresh rate, it's not over yet.

I guess it depends on the games you play. I kill FPS in BF3, even with dips. Crysis 3 at maxed settings with no AA is the only game where I see dips a little below that, to about 51 FPS, on my 1440p.

I think for about another year the 690 (aka 680 SLI) is going to be fine for those not ready to upgrade. You can always turn down settings and still stay relevant.

The only negative I see is that pricing hasn't come down yet. It will mostly go out of production and fade away rather than get a price reduction, which is what Nvidia tends to do.

Mine is going into a second rig, as I've said, to soon power 3D gaming on the 120 Hz system the kids use. I still enjoy a one-time run-through in 3D Vision on a new title release.


----------



## grassy

I have two 690s and am thinking I should not have bought the second one, as quad SLI with the 690 is not all that great. I am thinking of taking one out and using it in another build. What do you guys think, and what's your opinion on two 690s vs. one?


----------



## Lagpirate

Quote:


> Originally Posted by *grassy*
> 
> I have two 690s and am thinking I should not have bought the second one, as quad-SLI with the 690 is not all that great. I'm thinking of taking one out and using it in another build. What do you guys think? What's your opinion on two 690s vs. one?


Yeah, quad-SLI is really only good for synthetic benches. In real-world use it does terribly: the scaling is pathetic, and to boot, most games don't play nice with quad-SLI. The most I would ever go is tri-SLI.


----------



## anubis1127

Quote:


> Originally Posted by *Lagpirate*
> 
> Yeah, quad-SLI is really only good for synthetic benches. In real-world use it does terribly: the scaling is pathetic, and to boot, most games don't play nice with quad-SLI. The most I would ever go is tri-SLI.


I agree 100% with this statement. @grassy, I would take it out and use it in another build.


----------



## flv1333

Hey guys, I want to unlock my BIOS (for 150% power), but when I try to pull the BIOS with GPU-Z (0.7.3) it will only do so for the second GPU. For the first, I get an error: "BIOS reading not supported for this device." I'm kind of stuck. :-(


----------



## Alex132

Quote:


> Originally Posted by *flv1333*
> 
> Hey guys, I want to unlock my BIOS (for 150% power), but when I try to pull the BIOS with GPU-Z (0.7.3) it will only do so for the second GPU. For the first, I get an error: "BIOS reading not supported for this device." I'm kind of stuck. :-(


How do you extract the vBIOS with GPU-Z?

I can do my first GPU for you (if I can).









EDIT: Found it, and I can only do GPU 2 as well. I guess that's how it's supposed to be?


----------



## flv1333

Yeah, I'm not quite sure either, as I can see the EVGA Hydro Copper has two ROMs (a user uploaded them a few pages back). I'm just scared to use someone else's BIOS; I don't want to brick my card.


----------



## grassy

Thanks for the replies, guys. I will take the other one out and use just the one; I'll start another build for my wife and she can have the other card. I was really expecting a little more from these cards, considering the price I paid for them.


----------



## DreadyDK

Quote:


> Originally Posted by *Arizonian*
> 
> Honestly, on a single monitor, as long as you push more FPS than your refresh rate, it's not over yet.
> 
> I guess it depends on the games you play. I get killer FPS in BF3, even with dips. Crysis 3 at maxed settings with no AA is the only game where I see dips a little below that, to about 51 FPS, at 1440p.
> 
> I think for about another year the 690 (aka 680 SLI) is going to be fine for those not ready to upgrade. You can always turn down settings and stay relevant.
> 
> The only negative I see is that pricing hasn't come down yet. Nvidia tends to let cards go out of production and fade away rather than cut prices.
> 
> Mine is going into a second rig, as I've said, soon to power 3D gaming on the 120 Hz system the kids use. I still enjoy a one-time run through 3D Vision on a new title release.


Yeah, good point, mate. I'm playing on a Dell UltraSharp U2713H at 1440p, and I was beginning to feel the card wasn't good enough anymore, so I just traded my one-year-old ASUS GTX 690 for a two-month-old EVGA GTX 780 Hydro Copper. I have a CaseLabs S5 coming my way, and I'm planning to add one more 780 Hydro Copper for SLI. Then I should be able to play any game at decent settings again for some time (I hope).


----------



## Alex132

Really wish I could undo the screw on my 690.

I really want to water-cool it.

Ugh, I've tried everything so far... well, everything suggested to me.

Rubber band, superglue... trying extra hard.


----------



## propeldragon

Is memory bandwidth a bottleneck (or one of them) for the 690? What kinds of workloads use more memory bandwidth?


----------



## ocsi1970

Hi guys! I have a Gigabyte GTX 690, and I want to install Windows in UEFI mode. Is it safe to flash my GTX 690's BIOS? Can someone walk me through it step by step? Thanks very much.
Sorry for my English.


----------



## FiShBuRn

Quote:


> Originally Posted by *Alex132*
> 
> Really wish I could undo the screw on my 690.
> 
> I really want to water-cool it.
> 
> Ugh, I've tried everything so far... well, everything suggested to me.
> 
> Rubber band, superglue... trying extra hard.


You have to drill the screw out...


----------



## Tinman12

Quote:


> Originally Posted by *Arizonian*
> 
> Honestly, on a single monitor, as long as you push more FPS than your refresh rate, it's not over yet.
> 
> I think for about another year the 690 (aka 680 SLI) is going to be fine for those not ready to upgrade. You can always turn down settings and stay relevant.
> 
> The only negative I see is that pricing hasn't come down yet. Nvidia tends to let cards go out of production and fade away rather than cut prices.
> 
> Mine is going into a second rig, as I've said, soon to power 3D gaming on the 120 Hz system the kids use. I still enjoy a one-time run through 3D Vision on a new title release.


Well, I said eff it last night and flashed my 690 for the 150% power limit. All these 290X and 780 Ti threads are making my wallet itch.







I only noticed two things after doing it: I was able to get another 5 MHz of stability out of the GPU, and during the 1440p Heaven benchmark there is visibly (perceptibly?) less ripple/stutter right where I would always notice it when panning along the ship and spinning around the dragon. The water-cooling temps only went up about 0.5-1 degree during benching. Something is off with the score, though; it's showing about half the score I should be getting, even though I'm seeing 44-82 FPS for the run. I don't know about that.

Just downloaded BF4, and after going mad for an hour over the flashing rainbow-sherbet texture issue, I'll try it out tonight to see what FPS I get on multiplayer maps with everything pegged.

I would _hope_ to get at least another year out of the 690, but when I built the system my baseline was a build that would not dip below 65 FPS at 1440p with maxed graphics settings, while not needing 1 kW of power or an enormous water-cooling loop. I still have doubts that I'll last 3-4 months before I want to upgrade. Looking at the reviews, though, I think I'm stuck until Maxwell to get really good gains for the money over the 690 investment. Unless the 780 Ti turns out to be something magical.


----------



## Arizonian

IMO you're going to be fine until Maxwell comes out. Battlefield 4 only takes about 2.3 GB of VRAM to play, and I don't foresee any other games putting more demand on your GPU than that anyway.

If you purchased your 690 on release, it has been the best bang for your buck, considering it is still a strong graphics card.

From speculation, it seems Maxwell will be coming out around the July-August 2014 timeframe.


----------



## virgis21

I think we are running out of VRAM.







Here is my Battlefield 4 test. I get 100 ±20 FPS all the time on large, action-heavy maps! About the same as, or better than, Battlefield 3 (interesting)... but BF4 eats more VRAM than BF3 did. What do you guys think about this?


----------



## Arizonian

Quote:


> Originally Posted by *virgis21*
> 
> I think we are running out of VRAM.
> 
> Here is my Battlefield 4 test. I get 100 ±20 FPS all the time on large, action-heavy maps! About the same as, or better than, Battlefield 3 (interesting)... but BF4 eats more VRAM than BF3 did. What do you guys think about this?


What do I think? I think you might be right, but I can't confirm my numbers, since Afterburner doesn't officially support the 290X; my readings in AB could be skewed.


----------



## Tinman12

I'm getting game lock-ups that I have to crash out of on the large 64-player Conquest maps. The audio starts shuddering and the screen freezes until I kill it with Task Manager. I now think it might be my overclock causing it, but I was wondering about being over the VRAM limit on those large maps.


----------



## superx51

Well, I can play Battlefield 4 at 2560x1600 with everything maxed and my depth at 100%, and I sit around 80 FPS all the time with my 690 at 1250 MHz. It looks way better than my friend's 3 GB 7950s. The 2 GB doesn't seem to affect it at all, even in 60-player battles.


----------



## Tinman12

Quote:


> Originally Posted by *superx51*
> 
> The 2 GB doesn't seem to affect it at all, even in 60-player battles.


Yeah, I was trying to delude myself that it wasn't my overclock causing the lockups.


----------



## virgis21

Drivers updated?


----------



## Tinman12

Yeah, the latest WHQL driver seems more unstable than the 320.18 driver.


----------



## Proxish

Hey,

I'd like to join the club.
The brand is Asus.


----------



## Proxish

Can someone help me out with my overclock?
I've read people have been getting +800 on their mem and +150 on their GPU.

Right now I've got 135% power, +48 GPU, +180 mem.
Can someone explain this to me? Is my card faulty or something?

Thanks.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Proxish*
> 
> Can someone help me out with my overclock?
> I've read people have been getting +800 on their mem, and +150 on their GPU.
> 
> Right now I've got %135 Power, +48 GPU, +180 Mem.
> Can some explain this to me? Is my card faulty or something?
> 
> Thanks.


The card just doesn't overclock as well as other 690s. You were promised +0/+0 from the factory, and you have +48 and +180; see it as a bonus. The 690 is still a monster card at stock clocks.
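For anyone new to offset overclocking, numbers like +48/+180 are additions to the card's reference clocks. Here is a minimal sketch, assuming GTX 690 reference clocks (1019 MHz typical boost core, 1502 MHz GDDR5 command clock); individual boards vary, and tools like Afterburner/Precision may display the doubled memory rate instead, so treat these figures as illustrative:

```python
# Rough sketch: what an offset overclock means in absolute clocks.
# Reference GTX 690 clocks in MHz (assumptions for illustration).
BOOST_CORE = 1019  # typical reference boost clock
MEM_CLOCK = 1502   # GDDR5 command clock; "effective" rate is 4x this

def effective_clocks(core_offset_mhz, mem_offset_mhz):
    """Return (boosted core, memory, effective memory) clocks in MHz."""
    core = BOOST_CORE + core_offset_mhz
    mem = MEM_CLOCK + mem_offset_mhz
    return core, mem, mem * 4

# The +48 / +180 settings discussed above:
print(effective_clocks(48, 180))  # -> (1067, 1682, 6728)
```

So even a "modest" +48/+180 is already a measurable bump over reference; the lottery just decides how much further a given card will go.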


----------



## fast_fate

Thought I had posted here previously for membership, but obviously I had just subscribed to the thread.








Anyway... 2x Asus and 1x EVGA in this monster cruncher.
Testing is complete on air: top 5 in the world every day for the last 5 days on GPUGrid.
I've got Heatkiller blocks to whack on them; I'm getting throttling at 70 degrees (I think), so I need the water cooling.
The cards run between 60 and 75 degrees at 80-95% load, 24/7 during testing.

Fire Strike score, single GTX 690 card at 1084 MHz core / 6858 MHz effective memory.

I read somewhere that Win 7 can handle a max of 7 GPUs; any truth to this?
I have access to another 690 and might consider it if I knew I could get all 8 GPUs working on the SR-2 (the CPUs are Xeon 5650s).


----------



## Proxish

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> The card just doesn't overclock as well as other 690s. You were promised +0/+0 from the factory, and you have +48 and +180; see it as a bonus. The 690 is still a monster card at stock clocks.


Ah, OK... that's a shame. I hoped it could do more and that I was just doing something wrong.
I thought maybe my CPU, memory, or PSU might be getting in the way.

What's the deal with modifying the BIOS to accept higher power targets, etc.?
I only got my 690 recently and don't know what it can really do.


----------



## fast_fate

Quote:


> Originally Posted by *Proxish*
> 
> Can someone help me out with my overclock?
> I've read people have been getting +800 on their mem, and +150 on their GPU.
> 
> Right now I've got %135 Power, +48 GPU, +180 Mem.
> Can some explain this to me? Is my card faulty or something?
> 
> Thanks.


Are you on air or under water?
Water cooling will most likely be your key to unlocking higher clocks.
My clocks in the last post for the Fire Strike run are on air.

Try lowering the power from 135% to 115% or 120% and see if it helps;
the extra power could be generating more heat and causing throttling.
Adding to that, re: your modded BIOS question, more volts means more heat again.


----------



## Proxish

Quote:


> Originally Posted by *fast_fate*
> 
> Are you on air or under water?
> Water cooling will most likely be your key to unlocking higher clocks.
> My clocks in the last post for the Fire Strike run are on air.
> 
> Try lowering the power from 135% to 115% or 120% and see if it helps;
> the extra power could be generating more heat and causing throttling.
> Adding to that, re: your modded BIOS question, more volts means more heat again.


Sorry for taking so long to get back to you; I wasn't notified of your reply and only opened the thread to check whether the notification had failed.

I'm running on air. I never thought I might be throttling it. I'll try lowering the power and see what happens. I'm not a great overclocker; I'm relatively new to this.

I'm already hitting 82C, so I guess raising the voltage is out of the window.


----------



## fast_fate

Quote:


> Originally Posted by *Proxish*
> 
> Sorry for taking so long to get back to you, I wasn't notified of your reply and opened up to see if it had failed to notify me.
> 
> I'm running on air. I never thought I might be throttling it. I'll try lowering the power and see what happens. I'm not a great overclocker, I'm relatively new to the field.
> 
> I'm already hitting 82c, so I guess modifying the voltage is out of the window.


I got this from the EVGA site about the 690's temperature throttling:
everyone likes to keep the card below 70C, via a custom fan profile or a fixed fan speed in Precision while gaming, to keep it from throttling.
It throttles 13 MHz for every 10C over 70C:
the card reduces its clock speed by 13 MHz at 70C, 80C, and 90C.

On air I run a mild overclock of +29 GPU, +86 mem, and 112% power.
My theory is that even if it tags 80 degrees, it is still (just) faster than the stock boost clock.
On air I could *not* keep mine below 70 degrees at 85-95% load, even with the fans flat out.
I need water cooling!!!
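The throttle ladder quoted above (13 MHz shed at 70C, again at 80C, again at 90C) can be sketched as a toy model; the 1019 MHz reference boost clock is an assumption, since each card boosts differently:

```python
# Toy model of the thermal throttling described above: the card sheds
# 13 MHz at 70C, another 13 MHz at 80C, and another at 90C.
BOOST_CLOCK = 1019  # assumed reference boost clock for a GTX 690, in MHz
STEP_MHZ = 13

def throttled_clock(temp_c, boost=BOOST_CLOCK):
    """Boost clock after thermal throttling, per the 70/80/90C steps."""
    if temp_c < 70:
        return boost
    steps = 1 + (temp_c - 70) // 10  # one step at 70C, another every 10C
    return boost - min(steps, 3) * STEP_MHZ

print(throttled_clock(65))  # 1019: below 70C, no throttle
print(throttled_clock(75))  # 1006: one 13 MHz step
print(throttled_clock(82))  # 993: two steps, hence the push for sub-70C temps
```

The takeaway: a card sitting at 82C has already given back two steps, which is why a fan profile (or water) that holds the card under 70C is worth more than a few MHz of offset.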


----------



## Proxish

Quote:


> Originally Posted by *fast_fate*
> 
> I got this from the EVGA site about the 690's temperature throttling:
> everyone likes to keep the card below 70C, via a custom fan profile or a fixed fan speed in Precision while gaming, to keep it from throttling.
> It throttles 13 MHz for every 10C over 70C:
> the card reduces its clock speed by 13 MHz at 70C, 80C, and 90C.
> 
> On air I run a mild overclock of +29 GPU, +86 mem, and 112% power.
> My theory is that even if it tags 80 degrees, it is still (just) faster than the stock boost clock.
> On air I could *not* keep mine below 70 degrees at 85-95% load, even with the fans flat out.
> I need water cooling!!!


The lowest mine has run is 75C at stock.
But I got a 10 FPS boost from my overclock in Valley at 1080p, so I don't want to remove it.

The GTX 690's time has passed already; I'll be selling up and buying something else in the next six months anyway, so there's no point putting custom cooling on it.

Thank you for all your help, though; at least I know now that my card isn't faulty, even though it runs so hot.
I may contact Nvidia/Asus and see if I can get a replacement by claiming it runs too hot at stock.


----------



## killbom

Has anyone tested the Twin Turbo 690? I'm thinking about installing it in my TJ08-E, but I'm worried about the heat not getting vented out of the case. Thoughts?


----------



## Tinman12

Quote:


> Originally Posted by *Proxish*
> 
> The lowest mine has run, is 75c at stock.
> But I got a 10fps boost with my overclock in Valley at 1080p, so I don't want to remove mine.
> 
> the GTX 690's time has passed already, I'll be selling up and buying something else in the next 6 months anyway. No point going custom cooling on it.
> 
> Thank you for all your help though, at least I know my card isn't faulty now, even though it runs so hot.
> I may contact nVidia/Asus and see if I can get a replacement and claim it runs too hot at stock.


At this point, no, it's probably not worth investing further. The card is still a beast running stock, so long as you're not throttling from heat.

I started by getting the best-performing TIM I could find, which helped by about 1-1.5 degrees and gave a little more stability. From there I went to a water loop: a solid 30-degree drop under load. That's where you get the big memory boosts and more GPU boost, but because the card is two GPU dies married up on a hardware bridge, it quickly suffers from leakage, and you start losing performance the further you push it. Finally, I flashed the BIOS to 150% power; perceptually, all that did was give me more stability under load (i.e. less drastic frame dipping) and a couple more MHz on the GPU for benching, though not stable in gaming.

If you want the smallest investment that might actually give you a good return, buy some good TIM, clean the dies off, and re-paste them. It's like a $5-25 investment.

I went water for the silence, not the power; I was already aware that water cooling doesn't give a good performance return for the investment on the 690. At this point I may go 780 Ti SLI and cannibalize my loop to support it, or wait for next-gen in Q3-Q4 2014.

BF4 runs great (minus the game's own instability issues) at 1440p ultra, though, so I can't really complain.


----------



## Proxish

Quote:


> Originally Posted by *Tinman12*
> 
> At this point no, probably not worth investing further. The card is still a beast running stock, so long as your not throttling from heat.
> 
> I started with getting the best performing TIM goop I could get, and that helped with about 1-1.5 degrees and a little more stability. From there I went water loop. Solid 30 degree heat under load, that's where you will get the big memory boosts and more GPU boost, but because the card is 2 GPU dies married up on a hardware bridge it quickly suffers from leakage and you start losing performance the further you push it. Finally I flashed the BIOS to 150% power, perceptually all I can see that did was give me more stability under load (eg. less drastic frame dipping) and a couple more Mhz on the GPU for benching, but not stable in gaming.
> 
> If you want the least investment that might actually give you a good return, buy some good TIM and clean the dies off and re-goop em. It's like 5-25$ investment.
> 
> I went water for the silence not the power, I was already aware ahead of time water cooling doesn't give a good performance return for the investment on the 690s. At this point I may go with 780ti-SLI and cannibalize my loop to support it. Or wait for next-gen in Q3-4 of 2014.
> 
> BF4 runs great (minus the games own instability issues) at 1440p ultra though so I can't really complain.


Thank you for all the information.
One question I have: how do I know if the GPU is throttling? No one's ever really explained that to me.

I didn't realise the next gen was coming out in only a year.
I was thinking along the same lines as you, 780 Ti SLI, but I'd have to wait a while before I could afford the second one.

EDIT: I already have good TIM kicking about here somewhere, but I'd be nervous replacing it. Doesn't that void the warranty?


----------



## Tinman12

Quote:


> Originally Posted by *Proxish*
> 
> Thank you for all the information.
> One question I have is how do I know if the GPU is throttling? No ones ever really explained that to me.
> 
> I didn't realise that the next gen was coming out in only a year.
> I was thinking along the same lines as yourself, 780ti SLi, but I'd have to wait a while before I could afford the second one.
> 
> EDIT: I already have good TIM kicking about here somewhere, but I'd be nervous replacing it and does that not void warranty?


Mine is EVGA, and they honor the warranty so long as you don't do anything to the card that can't be undone, so a re-TIM should be fine.

On throttling: it's easier to just point you at this thread: GTX 670 Overclock Master Guide. The guy who put that post together put an enormous amount of work into it; he goes into Kepler boosting and throttling. It's a great read.

If the leaked, questionably legit numbers that have been coming out are close to right, a single 780 Ti should overclock on air to about 98% of a 690's performance. If it really ponies up to that, you could get one and be fine long enough to wait for a second 780 Ti to come down in price. If I go the 780 Ti route I won't bother waiting, since I'll put them under water, so I'm in for a higher dollar commitment at that point.

The next-gen stuff is already taped out; it's the K40 "Atlas" cards. They will start out as Teslas to get Nvidia the most R&D return by selling to subsidized markets (oil, gas, pharma, nuclear, etc.) for top dollar, then move into the Quadro line, and eventually come out as the GTX 800 series. Markets are pretty fluid, though; Q3 2014 could become 2015, depending on a lot of moving pieces. That's my speculation.

If I can play a game at 1440p maxed while staying in the 60 FPS wheelhouse, I'll stay on my 690; if I'm dipping into the mid-40s, I'll upgrade. That's pretty much my rule of thumb.

EDIT: re-TIMing your card can be a little daunting; there are some good directions out on the web. Just take your time on all the steps: disassembly, then cleaning all the contact points (I used acetone, i.e. nail polish remover, and Q-tips) very slowly and precisely, without tearing any of the thermal pads, then reapplying the TIM. Less is more; I used the finger-in-plastic-wrap method to paint it onto the GPUs and the bridge chip.


----------



## Proxish

Quote:


> Originally Posted by *Tinman12*
> 
> Mine is EVGA and they cover warranty so long as you don't doing anything that cannot be "un"-done to the card, re-TIM should be fine.
> 
> On throttling; it's just easier for to point you at this thread: GTX 670 Overclock Master Guide The guy who put that post together put an enormous amount of work into it. He goes into kepler boosting and throttling. It's a great read.
> 
> If the leaked, questionably legit stuff that's been coming out is close to right, a single 780ti should OC on air and get about 98% of a 690's performance. If it ponies up to really be that, you could get 1 and be fine with it long enough to wait for a 2nd 780ti to come down in price. If I go the 780ti route I won't bother waiting since I will put them under water, so I'm in for a higher $ commit at that point.
> 
> The next gen stuff is already taped-out, it's the K40 "atlas" cards. They will start out as Tesla to get Nvidia the most R&D return by selling to subsidized markets (oil, gas, pharma, nuclear, etc) for top dollar, then into the Quadro line, and eventually will come out in the GTX 800 series. Markets are pretty fluid though, Q3 2014 could become 2015, depends on a lot of moving pieces out there. That's my speculation.
> 
> If I can play a game at 1440p maxed staying in the 60FPS wheelhouse I'll stay on my 690, if I'm dipping the mid-40's, I'll upgrade; that's pretty much my rule of thumb.
> 
> EDIT: re-TIMing your card can be a litle daunting, there's some good directions out on the web. Just have take your time on all steps, dis-assembly, cleaning all the contact points (I used nail acetone eg. nail polish remover and q-tips) very slowly and precisely, don't tear any of the heat pads, then re-apply TIM. less it more and I did the finger-in-saranwrap method to paint it on the GPUs and bridge chip.


Again, thanks for all the info.

I didn't realise the 780 Ti was coming so close to the GTX 690 in performance. If that's the case, I'll definitely sell the one I have and upgrade to a GTX 780 Ti as soon as they get better coolers installed on them. I learnt my lesson from the cooler that came with the GTX 690.

Let's just hope the GTX 780 Ti comes close to the 290X in price, though I doubt it will.


----------



## superx51

I must say, I haven't loved my 690 at 2560x1600 this much yet! I found its overclocking limit in Battlefield 4 with everything on ultra: at a 1270 MHz core and 3500 MHz memory, it runs everything at 70+ frames. Adaptive V-sync works so well; it's so smooth.


----------



## ktbryant

Hey guys,

So what settings are most of you running in Battlefield 4 at 1440p? I'm trying to find the sweet spot without hitting the VRAM limit. Is there any way someone could post a screenshot of the video settings they find most fluid? Thanks.


----------



## superx51

At 1440p you can run ultra and get more frames than me.


----------



## ktbryant

At 1440p, as soon as I put everything on ultra, it goes over the 2 GB of VRAM and I get stuttering. I was just wondering what the best settings are for visual quality without going over 2 GB. I believe ultra uses around 2.5 GB.
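For a rough sense of why ultra can overrun a 2 GB-per-GPU card at 1440p, you can tally render-target sizes alone; the target counts below are illustrative assumptions, not Battlefield 4's actual pipeline, and textures/geometry consume most of the remaining budget:

```python
# Back-of-envelope VRAM math for render targets at 1440p.
# The buffer counts are illustrative assumptions only.
def target_mb(width, height, bytes_per_pixel=4, samples=1):
    """Size of one render target in MiB."""
    return width * height * bytes_per_pixel * samples / 2**20

w, h = 2560, 1440
single = target_mb(w, h)  # ~14.1 MiB for one 32-bit buffer
# A deferred renderer might keep several G-buffer targets, and MSAA
# multiplies the size of every multisampled one:
gbuffer_4x = 5 * target_mb(w, h, samples=4)
print(round(single, 1), round(gbuffer_4x, 1))
```

Scaled-up AA and a few extra intermediate buffers add up quickly on top of the texture pool, which is why dropping MSAA (or one texture-quality notch) is usually the cheapest way back under the limit.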


----------



## superx51

Is it overclocked? Mine runs at 70+ FPS at 2560x1600, and that's a higher resolution than yours. I don't look at VRAM usage. My friend has two 3 GB 7970s, and my 690 runs way smoother than his setup.


----------



## Proxish

I need a little help here.

I was running my 690 at 135% / +45 / +180 fine for about three days.

I then overclocked my CPU from 4.4 GHz to 4.5 GHz and increased the turbo voltage from 0.004 to 0.008, which seems fine and stable.

Two days later, playing Batman: Arkham City: white squares all over the screen, then a crash. This happened another three times.
Tried Valley; same thing. Reduced the overclock to 130% / +30 / +150; same thing.

Went back to default settings and ran Valley for 6 hours; came back and no crash reported. Played Arkham City for 8 hours; no crash.

Is my PSU limiting my overclock?
I can't think of any other reason the GTX 690 would have worked fine for days and then suddenly start bugging out...
I initially thought it was going faulty and was going to contact Asus for an RMA (I know they are terrible), but after it tested fine for so many hours, I don't know what to do.

I had a 10 FPS increase in Valley with that overclock, and it's really annoying losing it.
Any advice would be appreciated.


----------



## superx51

I run mine at +20 on the voltage, +130 core, +400 mem with no problems at all. I hit a 20145 GPU score in 3DMark 11. It could be a bad card or PSU.


----------



## Proxish

There's my old Fire Strike score; that was when I was overclocked.
Would a bad card or a bad PSU be eligible for a warranty replacement, or is it simply what I'm stuck with?

I would think it might be my PSU; my old 660 SLI setup didn't overclock very well either. I had somewhere around +30 / +200 on it, and any more than that and the thing would crash.


----------



## superx51

What brand is the 690?


----------



## Proxish

It's an Asus.

EDIT: My power supply is 700 W and my CPU is overclocked to 4.5 GHz.
I've got 7 GB of RAM and a 1 GB virtual drive, which I created the other day.


----------



## superx51

Asus has a 3-year warranty, so you can send it back.


----------



## Proxish

With what excuse?
It works fine at stock settings; it has just stopped working with any overclock.


----------



## superx51

Tell them you're having the problem all the time at stock clocks.


----------



## Proxish

And what happens when they test it and see it works fine at stock?


----------



## lascar

Accelero Twin Turbo 690:

Here's a review, the only one on the whole web; it will help you for sure:

http://www.kitguru.net/components/graphic-cards/zardon/arctic-accelero-twin-turbo-690-cooler-review-w-asus-gtx690/

My PoV:

Pros:

- In a good case you should reach really good temps. Mine hits 50C on both cores in Heaven 4.0 with tessellation maxed out at 400%, which for me is the heaviest workload after FurMark.

- If you're after silence, you can throttle the fans down to 750-900 RPM (versus the roughly 3000 RPM stock fan) at stock voltage and stock speed without any loss of performance.

- In my case the fans are capped at 1500 RPM; with overvolting, the GPUs never exceed 65C in my Antec 1200 case, and they run 35-45C at stock speed!

Cons:

- Fitting it to a $1,000 graphics card is nerve-racking; you have to stay focused and take your time, and remember, time is money.

- Read the review first and be sure you want to do it, because it's a very long process if you want to do things right without issues.

- You need a Torx T6 mini screwdriver and Phillips screwdrivers small enough for very tiny screws, like on some smartphones; look carefully at the PCB and the screws!

- The mod makes the card heavier. And don't put a hard drive right behind the graphics card: really bad idea. SSDs would suffer, and HDDs will burn in hell...


----------



## propeldragon

Do you think I should switch my 690 for a 780 ti?


----------



## superx51

It's a downgrade. My 690 gets a 3DMark 11 score of 20270; no single GPU gets close.


----------



## propeldragon

I know it's a little weaker, but it can overclock like no other and has 1 GB more VRAM.


----------



## superx51

Even with an overclock and 3 GB, it doesn't beat a 690. Not even close.


----------



## Proxish

Quote:


> Originally Posted by *propeldragon*
> 
> Do you think I should switch my 690 for a 780 ti?


You're best off getting two R9 290s; they come in not much more expensive, at around £600 for the pair, whereas the 780 Ti comes in at £550 by itself.
Look at the reviews and it's clear that that's the better option. I'd wait for new coolers for them, though, and then get two.

That, or buy a single R9 290X and water-cool it: an extra 1 GB of VRAM over the 780 Ti, and it only underperforms it by about 5 FPS in most benchmarks.


----------



## propeldragon

Quote:


> Originally Posted by *superx51*
> 
> Even with an overclock and 3 GB, it doesn't beat a 690. Not even close.


In BF4 it's like 64 vs. 74.


----------



## propeldragon

Quote:


> Originally Posted by *Proxish*
> 
> You're best off getting two R9 290s; they come in not much more expensive, at around £600 for the pair, whereas the 780 Ti comes in at £550 by itself.
> Look at the reviews and it's clear that that's the better option. I'd wait for new coolers for them, though, and then get two.
> 
> That, or buy a single R9 290X and water-cool it: an extra 1 GB of VRAM over the 780 Ti, and it only underperforms it by about 5 FPS in most benchmarks.


I wouldn't need 4 GB of VRAM, and I wouldn't water-cool my GPU either. The 290 and 290X run really hot. Plus, I'm not a fan of AMD. Such hard choices!!


----------



## Proxish

Quote:


> Originally Posted by *propeldragon*
> 
> I wouldn't need 4 GB of VRAM, and I wouldn't water-cool my GPU either. The 290 and 290X run really hot. Plus, I'm not a fan of AMD. Such hard choices!!


You didn't need 3 GB of VRAM two years ago, and you didn't need 2 GB four years ago.
It's future- and resolution-proofing.

You're better off waiting for the 290 to come out with non-reference coolers and getting two of those.
The 780 Ti, 290X, and 290 individually can't beat the 690, but two 290s come in at little more than £50 above a single 780 Ti and outperform both the 780 Ti and the 690 by about 30-40 FPS.

Fan of AMD or not, it's the smartest choice. I'm not a huge AMD fan, but that's what I'll be doing, just for the sheer amount of power.

The second-best choice is two 780s over a 780 Ti; you're looking at £150-200 more for two 780s than for one 780 Ti.


----------



## grassy

Quote:


> Originally Posted by *Proxish*
> 
> You didn't need 3 GB of VRAM two years ago, and you didn't need 2 GB four years ago.
> It's future- and resolution-proofing.
> 
> You're better off waiting for the 290 to come out with non-reference coolers and getting two of those.
> The 780 Ti, 290X, and 290 individually can't beat the 690, but two 290s come in at little more than £50 above a single 780 Ti and outperform both the 780 Ti and the 690 by about 30-40 FPS.
> 
> Fan of AMD or not, it's the smartest choice. I'm not a huge AMD fan, but that's what I'll be doing, just for the sheer amount of power.
> 
> The second-best choice is two 780s over a 780 Ti; you're looking at £150-200 more for two 780s than for one 780 Ti.


You're probably right, but I've heard the 290X gets pretty hot, and, just my opinion, I'd stay away from it for that reason, for the longer-term benefit, just in case. Any video card that gets up to 90 degrees is a disaster in my book. It probably performs really well, but I wouldn't take the risk. I have a system touching the $9,000 mark, and one of those 290Xs definitely won't see the inside of my case.


----------



## Proxish

Quote:


> Originally Posted by *grassy*
> 
> You're probably right, but I've heard the 290X gets pretty hot, and, just my opinion, I'd stay away from it for that reason, for the longer-term benefit, just in case. Any video card that gets up to 90 degrees is a disaster in my book. It probably performs really well, but I wouldn't take the risk. I have a system touching the $9,000 mark, and one of those 290Xs definitely won't see the inside of my case.


Yeah, AMD are aware, and claim the cards were designed to run at 95c. I don't know why they thought this was OK; I reckon it was more about getting back into the ball game and competing with nVidia (which they have definitely succeeded in doing).
I assume when non-reference coolers come out, they will be running around 80c-85c, which isn't as bad, since my 690 runs at 82c under load.
If the 290 can do 85c Crossfired, then I'll be picking up two without a doubt.


----------



## grassy

Quote:


> Originally Posted by *Proxish*
> 
> Yeah, AMD are aware, and claim the cards were designed to run at 95c. I don't know why they thought this was OK; I reckon it was more about getting back into the ball game and competing with nVidia (which they have definitely succeeded in doing).
> I assume when non-reference coolers come out, they will be running around 80c-85c, which isn't as bad, since my 690 runs at 82c under load.
> If the 290 can do 85c Crossfired, then I'll be picking up two without a doubt.


Yeah, AMD are a great company, and it would be great to see the temps come down on that card in particular. Who knows, but it just leaves a tad of a bitter taste in my mouth that we may have to go to an extreme to keep it cool. It's probably just me being a little scared, I suppose. I have made plenty of mistakes in this hobby, and it has cost me quite a bit of money wasted on making the wrong choices.


----------



## Proxish

Quote:


> Originally Posted by *grassy*
> 
> Yeah, AMD are a great company, and it would be great to see the temps come down on that card in particular. Who knows, but it just leaves a tad of a bitter taste in my mouth that we may have to go to an extreme to keep it cool. It's probably just me being a little scared, I suppose. I have made plenty of mistakes in this hobby, and it has cost me quite a bit of money wasted on making the wrong choices.


I see where you are coming from.
As consumers of a product that costs £300 per GPU, we really shouldn't need to worry about them cooking our whole house.
But those are the decisions we have to make at the moment: cheaper power at high temps, or low temps and a bank statement that makes us cry.


----------



## grassy

Quote:


> Originally Posted by *Proxish*
> 
> I see where you are coming from.
> As consumers of a product that costs £300 per GPU, we really shouldn't need to worry about them cooking our whole house.
> But those are the decisions we have to make at the moment: cheaper power at high temps, or low temps and a bank statement that makes us cry.


(Hehe) Yeah, you're too right, my friend; these companies have us worked out, alright. Good luck with it, and I hope it works out for you and you get the temps you're after.


----------



## Proxish

Quote:


> Originally Posted by *grassy*
> 
> (Hehe) Yeah, you're too right, my friend; these companies have us worked out, alright. Good luck with it, and I hope it works out for you and you get the temps you're after.


Unfortunately, they do... And what can we gamers with OCD graphics settings do but buy into their bull-peddling, lol.
Thanks. Just another month until the non-reference coolers come out; let's hope my 690 still fetches a decent price.


----------



## delacruzpaolo19

Hi everyone, I'm new here.
Just want to ask: is anyone else having an issue in the Battlefield 4 single-player campaign? I'm getting flickering textures in some areas of my gameplay.

I'm using the latest WHQL driver, 331.65.
OS is Windows 7 SP1 64-bit.

Any help would be appreciated.

My specs:
Intel Core i7 2600K (stock for now)
MSI Z77 MPower motherboard
Crucial Ballistix Elite 1866MHz 2x8GB dual-channel
Corsair AX860i power supply
Asus GTX 690
Seagate 2TB hard drive

----------



## superx51

Quote:


> Originally Posted by *propeldragon*
> 
> in bf4 it's like 64 vs 74.


I play at 2560x1600 (about twice the pixels of 1080p, so that's a lot of pixels) and I need all the horsepower I can get; it takes twice the power to run it. With the 690 I get 75+ frames with not one thing turned down. A 780 can't do that. If you're going two cards, do two 780's, the best bang for the buck (no need to buy an extra cooler like with the 290X), plus you can overclock the heck out of them and not get anywhere near the amount of noise and heat two 290X's produce! Also, for me, everything is crammed into a tiny X51 mITX case, so my card has to be quiet and cool, and I can only fit one card. I bought a Titan the day it came out and returned it one week later because it was a total downgrade for me. Then I tried my friend's 780 Classified; still not near the power I need. I will only upgrade if it's a performance upgrade, like a 790 (a 790 Ti would be awesome).
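For a rough sense of the "twice the pixels" claim above, here is a quick back-of-the-envelope sketch in plain Python; nothing here is from the post itself beyond the two resolutions mentioned:

```python
# Compare the per-frame pixel counts behind "2560x1600 is about 2x 1080p".
wqxga = 2560 * 1600   # 4,096,000 pixels
fhd = 1920 * 1080     # 2,073,600 pixels

ratio = wqxga / fhd
print(f"{wqxga} vs {fhd} pixels -> {ratio:.2f}x the work per frame")
```

That ratio works out to roughly 1.98x, which is why a single GPU that is comfortable at 1080p can fall well short at 2560x1600.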


----------



## ocsi1970

delacruzpaolo19:

Hi everyone, I'm new here.
Just want to ask: is anyone else having an issue in the Battlefield 4 single-player campaign? I'm getting flickering textures in some areas of my gameplay.

I'm using the latest WHQL driver, 331.65.
OS is Windows 7 SP1 64-bit.

Any help would be appreciated.

My specs:
Intel Core i7 2600K (stock for now)
MSI Z77 MPower motherboard
Crucial Ballistix Elite 1866MHz 2x8GB dual-channel
Corsair AX860i power supply
Asus GTX 690
Seagate 2TB hard drive

And I have exactly the same problem. Can anyone help?


----------



## skyn3t

Are any of you running a special vBIOS?


----------



## superx51

I don't think so; the general consensus is that it won't help, because the voltage is locked hardware-wise.


----------



## skyn3t

Quote:


> Originally Posted by *superx51*
> 
> I don't think so; the general consensus is that it won't help, because the voltage is locked hardware-wise.


I may not be successful with this, but here is what I've got so far. Anyone willing to help test it?


----------



## delacruzpaolo19

Quote:


> Originally Posted by *ocsi1970*
> 
> delacruzpaolo19:
> 
> Hi everyone, I'm new here.
> Just want to ask: is anyone else having an issue in the Battlefield 4 single-player campaign? I'm getting flickering textures in some areas of my gameplay.
> 
> I'm using the latest WHQL driver, 331.65.
> OS is Windows 7 SP1 64-bit.
> 
> Any help would be appreciated.
> 
> My specs:
> Intel Core i7 2600K (stock for now)
> MSI Z77 MPower motherboard
> Crucial Ballistix Elite 1866MHz 2x8GB dual-channel
> Corsair AX860i power supply
> Asus GTX 690
> Seagate 2TB hard drive
> 
> And I have exactly the same problem. Can anyone help?


*we need help...*


----------



## killbom

Quote:


> Originally Posted by *lascar*
> 
> Accelero Twin Turbo 690:
> 
> A review, the only one on the whole web! It will help you for sure:
> 
> http://www.kitguru.net/components/graphic-cards/zardon/arctic-accelero-twin-turbo-690-cooler-review-w-asus-gtx690/
> 
> My PoV:
> 
> Pros:
> 
> - In a good case you should reach really good temps: mine hits 50C on both cores in HEAVEN 4.0 with tessellation maxed out at 400%, for me the heaviest workload after Furmark
> 
> - If you're after silent operation, you can throttle the fans down to 750/900 RPM (versus the 3000 RPM stock fan) at stock voltage and stock speed with no loss in performance
> 
> - In my case the fans are capped at 1500 RPM when overvolting; the GPUs never exceed 65C in my Antec 1200 case, and run 35-45C at stock speed!
> 
> Cons:
> 
> - Setting the thing up on a $1K graphics card feels like a moon landing... you have to be focused and take your time; remember, time is money
> 
> - You should read the review and be sure you want to do it, because it's a very long process if you want to do things right without issues
> 
> - You need a Torx T6 mini screwdriver and Phillips ones to remove very tiny screws, like on some smartphones: look carefully at the PCB and the screws!
> 
> - The mod will make the card heavier. And DON'T PUT an HDD RIGHT BEHIND THE CARD: really bad idea. An SSD would suffer, and HDDs will burn in hell...


Okay, I think I might go for it. The only problem is my case is only moderately ventilated: a 180 mm fan as intake, with the PSU (140 mm fan) and a 120 mm fan as exhaust. Currently it feels like my GTX 690 is doing some of the exhaust work. Gaming takes the temperature to around 80-85 degrees with fans at 65%, which I feel is my noise limit.

Do you think the Twin Turbo would help? I think it would; the hot air should be vented fast enough.

Let me give you a picture as well (the fan should fit with the sound card, right?):


----------



## ocsi1970

Quote:


> Originally Posted by *killbom*
> 
> Okay, I think I might go for it. The only problem is my case is only moderately ventilated: a 180 mm fan as intake, with the PSU (140 mm fan) and a 120 mm fan as exhaust. Currently it feels like my GTX 690 is doing some of the exhaust work. Gaming takes the temperature to around 80-85 degrees with fans at 65%, which I feel is my noise limit.
> 
> Do you think the Twin Turbo would help? I think it would; the hot air should be vented fast enough.
> 
> Let me give you a picture as well (the fan should fit with the sound card, right?):


I just swapped my Twin Turbo for water cooling, but without question the Twin Turbo is better than the stock cooler. In Crysis 3 I had 60-70 degrees after 2 hours of play.


----------



## fifagi

I've got a GTX 690... and I want to install a modded BIOS...

Where can I find it? And... how can I install it?

THANKS


----------



## superx51

Ask skyn3t for the bios


----------



## killbom

Quote:


> Originally Posted by *ocsi1970*
> 
> I just swapped my Twin Turbo for water cooling, but without question the Twin Turbo is better than the stock cooler. In Crysis 3 I had 60-70 degrees after 2 hours of play.


Does the Twin Turbo fit in three slots (see the picture you replied on) or is it larger than that? It's kinda hard to see in the picture. How far does it extend from the side of the card?


----------



## ocsi1970

Quote:


> Originally Posted by *killbom*
> 
> Does the Twin Turbo fit in three slots (see the picture you replied on) or is it larger than that? It's kinda hard to see in the picture. How far does it extend from the side of the card?


It fits in three slots. Twin Turbo size: 288 x 138 x 50 mm. http://www.pc-max.de/artikel/kuehlung/feature-arctic-accelero-twin-turbo-690-vga-kuehler has pictures of how to assemble it.

Good luck!!!


----------



## lascar

Quote:


> Originally Posted by *killbom*
> 
> Okay, I think I might go for it. The only problem is my case is only moderately ventilated: a 180 mm fan as intake, with the PSU (140 mm fan) and a 120 mm fan as exhaust. Currently it feels like my GTX 690 is doing some of the exhaust work. Gaming takes the temperature to around 80-85 degrees with fans at 65%, which I feel is my noise limit.
> 
> Do you think the Twin Turbo would help? I think it would; the hot air should be vented fast enough.
> 
> Let me give you a picture as well (the fan should fit with the sound card, right?):


Should be OK. It's close to the sound card, but it's fine; I've got the same setup with my old-fashioned Audigy 2 ZS SoundBlaster card, which is also very close to the 690 GTX... no problem at all.

Your temperatures should be better than with the stock cooling, so I think it's worth it... and if temps are good, you can even try lowering the fans a bit (1200/1100 RPM -> 1000/900 RPM) to minimise the acoustic footprint.

That way the boost clock will be higher in most cases, so your performance should improve without overclocking...

But remember: with the stock design, most of the hot exhaust air is blown out of the tower.

With the Accelero, all the hot air (from the VRM and RAM chips) stays inside the PC case...

So the northbridge (NB), southbridge (SB) and HDDs will run hotter; that's a fact in a poorly ventilated case.

Your 180 mm intake fan should obviously help.

I run the fans at max speed (1500 RPM) when overvolting, and on AUTO at stock speed (custom-made Afterburner fan profile).

The two PWM fans on the Accelero (120 mm) are very quiet compared to the stock one:

GTX 690 stock design:
5.3 sone for 1 x 92mm fan @ 3000rpm max - temp 84°C @ stock speed

Accelero:
0.4 sone for 2 x 120 mm fans @ 1500rpm max - temp 72°C @ stock speed

In my case, noise isn't an issue; playing on a 65" plasma TV I could barely hear the Accelero's fans at 1500 RPM (5.1 surround muted)...

Anyway, my PS3 runs louder.


----------



## lascar

Quote:


> Originally Posted by *ocsi1970*
> 
> It fits in three slots. Twin Turbo size: 288 x 138 x 50 mm. http://www.pc-max.de/artikel/kuehlung/feature-arctic-accelero-twin-turbo-690-vga-kuehler has pictures of how to assemble it.
> 
> Good luck!!!


No, I'm sorry, but...

"Der Accelero Twin Turbo 690 misst 288 x 138 x 50 Millimeter und entspricht daher der Dual-Slot-Bauweise. Das Gewicht wird von ARCTIC mit 805 Gramm angegeben..." (i.e. "The Accelero Twin Turbo 690 measures 288 x 138 x 50 millimetres and therefore follows the dual-slot design. ARCTIC puts the weight at 805 grams...")

"entspricht daher der Dual-Slot-Bauweise" stands for dual-slot design... not triple slot!


----------



## Arizonian

Hi everyone,

Since mine was the second post in the club, I thought I'd add skyn3t's GTX 690 BIOS to the front page of the 690 Owners Club, to make it easier for members to see and to point others who might be interested in trying it.

Remember: try this at your own risk. If you have questions you can post them here; skyn3t should be subbed and will respond, so there's no need to PM him. He's curious to see how it works for members wanting to try it out.









Personally, I've got my 690 in a second rig which I rarely access, and for what it's being used for it's more GPU than needed. Since I don't get much time on it, if any at all, I won't be testing it myself.

If only I'd had access to this a couple of months sooner.


----------



## virgis21

Hi All,

I've been away for a week. Then I powered up my rig, found a few updates from nvidia and for Battlefield 4, and now it runs much better. The game seems more optimized now. VRAM usage doesn't get over 1200MB on either GPU, very similar on both. Runs at about 80-100fps (1080p).
I am a very happy 690 user so far! And I'm sure it will be fine until the 890?!









Virgis


----------



## delacruzpaolo19

Guys anyone having this kind of issue?




*ANY FIX?*


----------



## Proxish

Quote:


> Originally Posted by *delacruzpaolo19*
> 
> Guys anyone having this kind of issue?
> 
> 
> 
> 
> *ANY FIX?*


Yeah, I installed the first update for Battlefield 4 last night and had the same problem.
I have a minimal overclock on my card and haven't tried it at stock, but I think it was the update that messed it up.


----------



## virgis21

I updated too, but no issue! It runs better than before!

Virgis


----------



## skyn3t

Quote:


> Originally Posted by *Arizonian*
> 
> Hi everyone,
> 
> Since mine was the second post in the club, I thought I'd add skyn3t's GTX 690 BIOS to the front page of the 690 Owners Club, to make it easier for members to see and to point others who might be interested in trying it.
> 
> Remember: try this at your own risk. If you have questions you can post them here; skyn3t should be subbed and will respond, so there's no need to PM him. He's curious to see how it works for members wanting to try it out.
> 
> Personally, I've got my 690 in a second rig which I rarely access, and for what it's being used for it's more GPU than needed. Since I don't get much time on it, if any at all, I won't be testing it myself.
> 
> If only I'd had access to this a couple of months sooner.


First off, thank you Arizonian for the push!

If you guys have any questions about how to flash, or any concerns, post them here or PM me. I'm subbed here, so it's easy to track.


----------



## virgis21

Guys,

I've read that you can add another card for PhysX to get better performance. Any ideas about adding an extra GTX 560 Ti to my existing GTX 690 setup? The GTX 560 Ti would do the PhysX calculations, while the GTX 690 does the dual-GPU work without being bogged down by PhysX?
https://forums.geforce.com/default/topic/627776/gtx-690-and-phys-x-/?offset=1

Virgis


----------



## CommanderJ

I tried skyn3t's BIOS (rev A2), but EZflash refuses to flash the second card: something about a PCI ID mismatch and a hierarchy mismatch. I noticed that, according to GPU-Z, the two GPU cores actually had different stock BIOSes. Am I missing something? Surely both cores need the new BIOS for the new voltage and power targets?

Edit: Essentially, the REV A ROM updates this card: 80.04.1E.00.14 (P2000-00g2),
but it will not update this card: 80.04.1E.00.13 (P2000-00g1)


----------



## skyn3t

Quote:


> Originally Posted by *CommanderJ*
> 
> I tried skyn3t's BIOS (rev A2), but EZflash refuses to flash the second card: something about a PCI ID mismatch and a hierarchy mismatch. I noticed that, according to GPU-Z, the two GPU cores actually had different stock BIOSes. Am I missing something? Surely both cores need the new BIOS for the new voltage and power targets?
> 
> Edit: Essentially, the REV A ROM updates this card: 80.04.1E.00.14 (P2000-00g2),
> but it will not update this card: 80.04.1E.00.13 (P2000-00g1)


You must use the Ez3flash option #6 to flash the second GPU. Just follow the on-screen prompts; if you've done the first GPU like you said, you have nothing to worry about with the second one.


----------



## CommanderJ

Hi, thanks for replying.

Yes, I did use option 6 (because there is an Intel adapter on the mobo, but it is the first 690 core listed by nvflash that refuses to flash), bypassed all the warnings (typed YES in all caps, etc.), and it still failed to flash the BIOS. The window closes so fast that it's hard to see the full error, but it does mention a hierarchy error and a PCI ID mismatch, even though it is supposed to bypass the ID mismatch. The PCI ID for the card that does flash is 16, and the other one is 8, I think.

Do you have any earthly idea why the two cores would have different BIOS in the first place?

Edit:
The first error window that comes up is this:
WARNING Firmware Image Board ID (E102) does not match adapter Board ID (E103)
WARNING Firmware Image Hierarchy ID (switch port 16) either does not match the Hierarchy ID of the adapter (switch port 8) or does not match the Hierarchy Role of the adapter (switch port 8).

And even when I select all the right things to bypass the errors it still won't go; it says "Error: Hierarchy mismatch" and then closes.

2nd edit: I get the same errors if I try flashing the .14 BIOS onto the card with the .13, so I don't think it's really your BIOS that's the issue.


----------



## skyn3t

Quote:


> Originally Posted by *CommanderJ*
> 
> Hi, thanks for replying.
> 
> Yes I did use option 6, bypassed all the warnings (typed YES all caps etc) and it still failed to flash the BIOS. The window closes so fast that it's hard to see the full error, but it does talk about hierarchy error and PCI ID mismatch, even though it is supposed to bypass the ID mismatch. PCI ID for the card that does flash is 16, and the other one 8 I think.
> 
> Do you have any earthly idea why the two cores would have different BIOS in the first place?


Are both 690s the same brand? Can you tell me which brand yours is, and the BIOS version, so I can mod it?


----------



## CommanderJ

Ah, there is only one 690 - I am talking about the two GPUs inside it, not two different cards; sorry, I see how my first post was worded confusingly. So the two GPUs on the -same card- had different BIOSes, which strikes me as odd (unless one is specifically the slave or something). The brand is Gainward. It lists a different BIOS for each integrated GPU: 80.04.1E.00.14 (P2000-00g2) and 80.04.1E.00.13 (P2000-00g1).


----------



## skyn3t

Quote:


> Originally Posted by *CommanderJ*
> 
> Ah there is only one 690 - I am talking about the two GPUs inside it, not two different cards, sorry I see how my first post was worded confusingly. So the two GPUs on the -same card- had different BIOS, which strikes me as odd. (unless one is specifically slave or something). The brand is Gainward. It lists a different BIOS for each integrated GPU, 80.04.1E.00.14 (P2000-00g2) and 80.04.1E.00.13 (P2000-00g1)


you got PM


----------



## skyn3t

Any 690 owners around?


----------



## superx51

Yes?


----------



## skyn3t

Quote:


> Originally Posted by *superx51*
> 
> Yes?


Sorry, I missed it.


----------



## virgis21

Another here too!


----------



## V3teran

Hi all.
Long time!
I'm thinking of buying a waterblock for my 690. What I want to know is: what is the custom BIOS on the front page of the thread, and what is its max voltage?
Thanks for any advice!

What is the highest you have seen on water and what can i expect if i watercool the 690?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *V3teran*
> 
> Hi all.
> Long time!
> I'm thinking of buying a waterblock for my 690. What I want to know is: what is the custom BIOS on the front page of the thread, and what is its max voltage?
> Thanks for any advice!
> 
> What is the highest you have seen on water and what can i expect if i watercool the 690?


Download the file, and the readme says:
Quote:


> skyn3t vBios GTX690
> Nvidia GTX 690
> Version 80.04.1E.00.91
> Disabled boost
> 3d voltage adjustable
> 1.212v unlocked
> Fan Idle 20%
> Fan bumped to 100%
> Default power target 100% 200W by 130% slide 350W


----------



## CommanderJ

That vBios doesn't currently work, skyn3t is working on something that will work.


----------



## V3teran

Quote:


> Originally Posted by *CommanderJ*
> 
> That vBios doesn't currently work, skyn3t is working on something that will work.


Has it been confirmed that it doesn't work, mate?
I've PMed skyn3t, btw.


----------



## CommanderJ

Quote:


> Originally Posted by *V3teran*
> 
> Has it been confirmed that it doesn't work, mate?
> I've PMed skyn3t, btw.


It didn't work for me, at least. You need two vBIOSes, as each core needs a different one. I've tried 3-4 different variations of the BIOS that skyn3t has PMed me, but so far none of them allowed voltage above normal. I have a Gainward 690, but I'd assume all 690s need two different vBIOSes to work.


----------



## V3teran

Hmm, if that's the case I won't bother watercooling it then. I'll just wait for Maxwell and watercool that.


----------



## propeldragon

How do you only use 1200MB of VRAM? I use 1800-1900MB on Ultra with 2x MSAA (multiplayer).


----------



## Proxish

Can you add me to the member list, please?
Asus GTX 690.


----------



## Arizonian

I'm pretty sure jcde7ago abandoned the club a long time ago. Send him a PM.

I think the club is still relevant, since we have a lot of members with cards out there. Also, we have a new vBIOS that we are experimenting with that might revitalize this card.


----------



## LeandroJVarini

Hello guys! I bought a GTX 690 and I'm loving this very strong and silent card! I wonder if anyone has seen or already tried using two universal GPU blocks on the card, with passive heatsinks on the VRM and memory? I have a water-cooling loop, but I'm not willing to spend $130-150 on a full-cover block, and I already have two universal blocks sitting here from my old 580 SLI.

Does it work without problems?


----------



## Proxish

Quote:


> Originally Posted by *Arizonian*
> 
> I'm pretty sure jcde7ago abandoned the club a long time ago. Send him a PM.
> 
> I think the club is still relevant, since we have a lot of members with cards out there. Also, we have a new vBIOS that we are experimenting with that might revitalize this card.


Thanks for getting back to me.
Ok, thanks, I'll do that.

What exactly will this vBIOS mod do?
As far as I understand, you use a custom BIOS to unlock the voltage, and higher voltage allows a higher overclock on core and memory?
If that's the case, that'd be great, because right now I have Core +50 & Memory +175, and anything past that makes my card completely crash.


----------



## Proxish

Anyone?


----------



## GoneTomorrow

Quote:


> Originally Posted by *Arizonian*
> 
> I'm pretty sure jcde7ago abandoned the club a long time ago. Send him a PM.
> 
> I think the club is still relevant, since we have a lot of members with cards out there. Also, we have a new vBIOS that we are experimenting with that might revitalize this card.


Ooh. Links or info about this vbios?

Quote:


> Originally Posted by *Proxish*
> 
> Thanks for getting back to me.
> Ok, thanks, I'll do that.
> 
> What is it exactly this vbios mod will do?
> As far as I understand, you use a custom bios to unlock the voltage, higher voltage equals higher overclocked core and memory?
> If that's the case, that'd be great, because right now, I have Core +50 & +175 Memory, anything past that and my card completely crashes.


You might be clocking your memory too high. With my memory set to +100 or so, I can get the core to +100. Stable, runs at full load all day.


----------



## GoneTomorrow

dupe


----------



## Cylas

Here is my GOP vBIOS, plus an updated version of the PLX firmware, for the GTX 690.



GK104a.zip 121k .zip file


GK104b.zip 121k .zip file


690_plx.zip 1k .zip file


----------



## GoneTomorrow

Quote:


> Originally Posted by *Cylas*
> 
> Here is my GOP vBios and an updated version from the PLX Firmware, for the GTX 690.
> 
> GK104a.zip 121k .zip file
> 
> 
> GK104b.zip 121k .zip file
> 
> 
> 690_plx.zip 1k .zip file


Thanks. What benefits does it give exactly? I know about the UEFI BIOS and the fast-boot for Windows 8.


----------



## skyn3t

Quote:


> Originally Posted by *Proxish*
> 
> Thanks for getting back to me.
> Ok, thanks, I'll do that.
> 
> What is it exactly this vbios mod will do?
> As far as I understand, you use a custom bios to unlock the voltage, higher voltage equals higher overclocked core and memory?
> If that's the case, that'd be great, because right now, I have Core +50 & +175 Memory, anything past that and my card completely crashes.


Quote:


> Originally Posted by *Proxish*
> 
> Anyone?


Hi Proxish,

You are right about the vBIOS info you mentioned in your post above. The only thing holding me back on the 690 vBIOS is the voltage control: even when I unlock the voltage past 1.175v in the BIOS, Precision X and MSI Afterburner somehow won't let it be raised.
I know a couple of GPU brands have this issue, like Gigabyte and Asus, on the 760 and 770 families and some 600-family GPUs. I have only tested the vBIOS on a Gigabyte card; I have not tested it on any other brand.

On a side note, the 690 has a BIOS tied to each of the two GPUs on the PCB, so for the BIOS to be flashed it needs to be extracted from each GPU, since there is a BIOS 1 and a BIOS 2.

If you are willing to help develop the vBIOS, please save both of your BIOSes and PM them to me along with the GPU brand. I'm hoping I can work with an EVGA card; that brand seems friendlier for debugging everything.
Quote:


> Originally Posted by *GoneTomorrow*
> 
> Ooh. Links or info about this vbios?
> You might be clocking your memory too high. With my memory set to +100 or so, I can get the core to +100. Stable, runs at full load all day.


Can you save both of your stock BIOSes, zip them, and send them to me, please? Add the GPU brand too, because not every company has its own certified BIOS; most use the stock BIOS Nvidia ships to them. They don't even touch the BIOS: they just make the GPU, flash the Nvidia BIOS, and send it to the store to be sold.


----------



## GoneTomorrow

Quote:


> Originally Posted by *skyn3t*
> 
> Can you save both of your stock BIOSes, zip them, and send them to me, please? Add the GPU brand too, because not every company has its own certified BIOS; most use the stock BIOS Nvidia ships to them. They don't even touch the BIOS: they just make the GPU, flash the Nvidia BIOS, and send it to the store to be sold.


I can't get my BIOS to save with GPU-Z or nvflash, presumably because it's not unlocked. So unless you know of another way (without unlocking), I'll have to disappoint you.









EDIT: Weird, I can get the BIOS from GPU 2 to save, but not GPU 1 (it says BIOS reading is not supported). So here's GPU 2's BIOS, if it helps.









GK104.zip 56k .zip file


----------



## Proxish

Quote:


> Originally Posted by *skyn3t*
> 
> Hi Proxish,
> 
> You are right about the vBIOS info you mentioned in your post above. The only thing holding me back on the 690 vBIOS is the voltage control: even when I unlock the voltage past 1.175v in the BIOS, Precision X and MSI Afterburner somehow won't let it be raised.
> I know a couple of GPU brands have this issue, like Gigabyte and Asus, on the 760 and 770 families and some 600-family GPUs. I have only tested the vBIOS on a Gigabyte card; I have not tested it on any other brand.
> 
> On a side note, the 690 has a BIOS tied to each of the two GPUs on the PCB, so for the BIOS to be flashed it needs to be extracted from each GPU, since there is a BIOS 1 and a BIOS 2.
> 
> If you are willing to help develop the vBIOS, please save both of your BIOSes and PM them to me along with the GPU brand. I'm hoping I can work with an EVGA card; that brand seems friendlier for debugging everything.
> Can you save both of your stock BIOSes, zip them, and send them to me, please? Add the GPU brand too, because not every company has its own certified BIOS; most use the stock BIOS Nvidia ships to them. They don't even touch the BIOS: they just make the GPU, flash the Nvidia BIOS, and send it to the store to be sold.


I'd be interested in helping you develop the BIOS mod,
but there are a couple of problems.
I'm on a stock cooler, hitting 82c-83c when gaming, and I have an Asus card which still has two years left on the warranty.
If I BIOS-mod it, does that not invalidate the warranty?

Oh, and thanks GoneTomorrow, I'll try overclocking the card again, GPU boost clock first with a power target of 135, and see how high I can go before the whole thing just dies on me.


----------



## skyn3t

Quote:


> Originally Posted by *GoneTomorrow*
> 
> Can't get my BIOS to save with GPU-Z or nvflash, presumably because it's not unlocked. So unless you know of another way (without unlocking), I'll have to disappoint you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Weird, I can get the BIOS from GPU 2 to save, but not GPU 1 (says BIOS reading not supported). So here's GPU 2's BIOS if it helps.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GK104.zip 56k .zip file


You must disable the EEPROM protection in order to save the BIOS. If you have a PLX chip, it may interfere with the process; see the example below.

Open nvflash and type:

nvflash --display
If you have a PLX chip, the output will look something like this; the example below is a board with 3 Titans and two PLX chips reported.









The GPU count starts from 0, and the PLX chip always shows first ("0"), so with:

nvflash --display
0 - GPU 1
1 - GPU 2
2 - GPU 3
3 - GPU 4
By your sig you have one 690; if you have a PLX chip, you must skip it in the GPU order and find the right number to disable the EEPROM. Type:

nvflash --protectoff
It will ask which chip you want to disable; type any number from the menu that is not the PLX chip. Repeat the process for the second GPU in your 690.
Quote:


> Originally Posted by *Proxish*
> 
> I'd be interested to help you develop the Bios mod.
> But a couple of problems.
> I'm on a stock cooler, hitting 82c-83c when gaming and I have an Asus card, which still has two years left on the warranty.
> If I bios mod it, does that not invalidate the warranty?
> 
> Oh, and thanks GoneTomorrow, I'll try overclocking the card again, GPU Boost Clock first with a Power Target of 135, and see how high I can go before the whole thing just dies on me.


It will void the warranty when you flash the GPU with any modded vBIOS, but after you flash back to the stock BIOS your warranty is effectively restored, so if you need to RMA your GPU one day you're fine. But if you are not familiar with flashing a BIOS, I recommend you don't do anything yet. Read up a bit on how it works before you try to flash; in my sig there's a link to the GTX 780 thread, and in its OP there are some guides on how to flash.


----------



## Proxish

Quote:


> Originally Posted by *skyn3t*
> 
> You must disable the EEPROM security in order to save the BIOS. If you have a PLX chip, it may interfere with the process; see the example below.
> 
> Open nvflash and type:
> 
> nvflash --display
> If you have a PLX chip, the output will look something like this; for example, this is a motherboard with 3 Titans and two PLX chips reported:
> 
> The GPU count starts from 0, so:
> 
> The PLX chip always shows first, as "0":
> nvflash --display
> 0 - GPU 1
> 1 - GPU 2
> 2 - GPU 3
> 3 - GPU 4
> Going by your sig you have one 690; if you have a PLX chip, you must skip it in the GPU order and find the right # to disable the EEPROM.
> Then type:
> 
> nvflash --protectoff
> It will ask which chip you want to disable; type any number from the menu that is not the PLX chip. Repeat the process for the second GPU in your 690.
> Flashing the GPU with any modded vBIOS will void the warranty, but after you flash back to the stock BIOS your warranty comes back, and if you need to RMA your GPU one day you're fine. But if you are not familiar with flashing a BIOS, I recommend you not do anything yet. Read a bit about how it works before you try to flash; in my sig there is a link to the GTX 780 thread, and in its OP there are some guides on how to flash.


I've flashed a 660 Sli setup before, so I would know roughly what I was doing.
What kind of performance boost and temperature change would we be looking at with a new BIOS?

EDIT: Just redid my overclock and got a max of 1257 / 3154 at 83c.


----------



## skyn3t

Quote:


> Originally Posted by *Proxish*
> 
> I've flashed a 660 Sli setup before, so I would know roughly what I was doing.
> What kind of performance boost and temperature change would we be looking at with a new BIOS?
> 
> EDIT: Just redid my overclock and got a max of 1257 / 3154 at 83c.


Only your GPU will tell you; each chip performs differently. But you should be able to clock higher, and scores will increase too, if I can get the workaround for the voltage to 1.212v.


----------



## Proxish

Quote:


> Originally Posted by *skyn3t*
> 
> Only your GPU will tell you; each chip performs differently. But you should be able to clock higher, and scores will increase too, if I can get the workaround for the voltage to 1.212v.


Ok, I'm interested in doing it and willing to try.
If you can provide a decent guide, I can try it tonight or as soon as you get back to me.

Right now, I'm having an issue overclocking and I don't know why.
I'm fine at stock speeds; when I overclock with EVGA Precision, however, my card seems to crash at anything past +20 Core +50 Mem, so whatever I can try would be great.
I had a good overclock with Asus GPU Tweak of +55 +150, but I noticed that it was only overclocking one of the chips, and the other was left at default, and I don't know why.

So yea, I'll try this custom bios if it might get a decent overclock.
What's the chance of killing my card though?

Oh and thanks for all your help dude and providing the custom bios, I appreciate it.


----------



## s74r1

Hmm, my +55 core OC is no longer stable after 17 months. Not sure if new drivers or degradation or what. Anyone else experience something similar?


----------



## franky123

hello,

I will purchase 2 gpu cards for gtx titan and gtx 690.
My question is: can these GPUs be configured in SLI to improve graphics performance for gaming, computing, and video encoding?

thank you


----------



## Alex132

Quote:


> Originally Posted by *franky123*
> 
> hello,
> 
> I will purchase 2 gpu cards for gtx titan and gtx 690.
> My question is: can these GPUs be configured in SLI to improve graphics performance for gaming, computing, and video encoding?
> 
> thank you


www.geforce.com/whats-new/guides/introduction-to-sli-technology-guide#1

Obviously not, it doesn't take more than 5 seconds to google that.


----------



## endergx

wondering if that new nzxt kraken g10 will fit on my 690... doesn't show up on the compatible cards list, but it could probably be modified.


----------



## Proxish

Quote:


> Originally Posted by *s74r1*
> 
> Hmm, my +55 core OC is no longer stable after 17 months. Not sure if new drivers or degradation or what. Anyone else experience something similar?


I'm having the same problem. I had an overclock of 135% +50 +180; then the newer drivers came out, and now with EVGA Precision I can't get more than 135% +20 +50 without all my games crashing.
I suspect it's the drivers, though I don't know why they would do that all of a sudden.

Quote:


> Originally Posted by *endergx*
> 
> wondering if that new nzxt kraken g10 will fit on my 690... doesn't show up on the compatible cards list, but it could probably be modified.


As far as I know, the NZXT Kraken G10 is for single-GPU cards only, and looking at its build, there seems to be no workaround for it.
If you want to improve the cooling on your GTX 690, I suggest going with the cooler suggested on the first page of this thread.


----------



## V3teran

Skyn3t, I have the 2 stock BIOSes from my Gigabyte GTX 690.
I also have 2 modded BIOSes that I made using the KGB unlocker.

I have sent you my stock BIOSes in a PM.


----------



## lascar

Quote:


> Originally Posted by *s74r1*
> 
> Hmm, my +55 core OC is no longer stable after 17 months. Not sure if new drivers or degradation or what. Anyone else experience something similar?


Driver-related issue since 314.22 (in my case, the GTX 690 has had its worst OC behavior since that driver launched...).


----------



## provost

Yep, can't OC 690s with the new driver for gaming at all. Since I don't bench the 690 anymore, I have not bothered putting the waterblock back on to see if it makes a difference...
I suspect it's Boost 2.0 and the new drivers... I have only played a very few games, but AC4 seems to fly on the latest beta drivers at 1440p with the 690 SLI profile.


----------



## skyn3t

Quote:


> Originally Posted by *V3teran*
> 
> Skyn3t, I have the 2 stock BIOSes from my Gigabyte GTX 690.
> I also have 2 modded BIOSes that I made using the KGB unlocker.
> 
> I have sent you my stock BIOSes in a PM.


Hey, I got your PM. Actually, I was waiting on Buzzkill, but he never got back to me; it does make things go slowly.


----------



## V3teran

Quote:


> Originally Posted by *skyn3t*
> 
> Hey, I got your PM. Actually, I was waiting on Buzzkill, but he never got back to me; it does make things go slowly.


Well, hopefully you can do something positive with them; I will keep my eyes on this thread.


----------



## skyn3t

Quote:


> Originally Posted by *V3teran*
> 
> Well, hopefully you can do something positive with them; I will keep my eyes on this thread.


I'm going to work on the Giga BIOS to see how things go, but I would like feedback on the EVGA BIOS as well. I will PM the BIOS when I get it done.


----------



## V3teran

Ok nice one.


----------



## superx51

Sky, any luck with a BIOS mod for the 690? Is it even possible?


----------



## skyn3t

Quote:


> Originally Posted by *superx51*
> 
> Sky any luck with a bios mod for the 690? Is it even possible?


I'm still working on it. I have some EVGA 690 BIOSes done, but I have no feedback yet. I'm still trying to get the voltage unlocked to 1.212v; the PT is increased and boost can be disabled, but voltage is still the issue.


----------



## superx51

Quote:


> Originally Posted by *skyn3t*
> 
> I'm still working on it. I have some EVGA 690 BIOSes done, but I have no feedback yet. I'm still trying to get the voltage unlocked to 1.212v; the PT is increased and boost can be disabled, but voltage is still the issue.


Was the reason the Titan owners got voltage control because of the NCP4206 chip?


----------



## superx51

Using NVIDIA Inspector, my BIOS is 80.04.1e.00.13, sub-vendor NVIDIA. Can I use any of your BIOSes?


----------



## skitz9417

Hi guys, I'm just wondering: could a GTX 690 handle 5760x1080 gaming with no AA or FXAA?


----------



## ceteris

Still got my 690 under water and it's going strong. However, I had to revert the settings back to stock several months back due to instability in some games. It was probably one of the recent drivers, but I don't bother benching anymore since the 690 is about to hit 2 years now.

Probably won't plan on replacing her until 4K monitors become mainstream and/or affordable.


----------



## Cylas

Quote:


> Originally Posted by *skyn3t*
> 
> I'm still working on it. I have some EVGA 690 BIOSes done, but I have no feedback yet. I'm still trying to get the voltage unlocked to 1.212v; the PT is increased and boost can be disabled, but voltage is still the issue.


I think the BIOS is not the only problem. After the GeForce GTX 590 disaster, NVIDIA integrated a voltage protection mechanism into the driver and locked the voltage to 1.175V.
MSI Afterburner overclocks the NCP4206 controller directly and doesn't use the profiles via the driver, so the voltage is not read out correctly by tools like GPU-Z.
So I think there are two possibilities: you modify the driver too, or you get the ASUS MARS 760 BIOS and modify it.


----------



## skitz9417

Hi guys, I'm just wondering: could a GTX 690 handle 5760x1080 gaming with no AA or FXAA? Anyone?


----------



## Proxish

Quote:


> Originally Posted by *skitz9417*
> 
> Hi guys, I'm just wondering: could a GTX 690 handle 5760x1080 gaming with no AA or FXAA? Anyone?


Probably could, but I doubt you'd be looking at any more than 30fps in most games.


----------



## skitz9417

Quote:


> Originally Posted by *Proxish*
> 
> Probably could, but I doubt you'd be looking at any more than 30fps in most games.


Well, I was looking at this benchmark. What do you think?

http://www.guru3d.com/articles_pages/geforce_gtx_780_sli_review,22.html


----------



## Proxish

Quote:


> Originally Posted by *skitz9417*
> 
> Well, I was looking at this benchmark. What do you think?
> 
> http://www.guru3d.com/articles_pages/geforce_gtx_780_sli_review,22.html


Fair enough, I was way off.
If you had that though, why were you asking here?


----------



## skitz9417

Because people tell me to get a Titan or a GTX 780, but when I look at the graph, the GTX 780 gets beaten by the GTX 690.


----------



## virgis21

People keep talking a bunch of nonsense.







Some of them pick a VGA card just by price (higher is better).
Actually, I was looking around for a replacement for my 690 and decided to stay with the 690 until the 8xx series is out, because even the GTX 780 Ti is a little under the 690, and buying two of them (780 Ti) would be an overpriced solution.







So my choice is the same - 690 so far is the best for me.

Virgis


----------



## Proxish

Quote:


> Originally Posted by *skitz9417*
> 
> Because people tell me to get a Titan or a GTX 780, but when I look at the graph, the GTX 780 gets beaten by the GTX 690.


I would say go with an R9 280X Crossfire setup, or go with a single R9 290.
A 280X Crossfire setup would beat out a GTX 690 by around 20fps-30fps in most games.

Or get an R9 290 with a non reference cooler when they come out and Crossfire somewhere down the line.


----------



## RnRollie

Quote:


> Originally Posted by *skitz9417*
> 
> Hi guys, I'm just wondering: could a GTX 690 handle 5760x1080 gaming with no AA or FXAA? Anyone?


Depends... at that resolution the video memory (VID_MEM) comes into play.
There is "only" 2x2GB available on the 690 - NOTE: 2x2GB is NOT the same as 4GB.
Compare that to the 760/770/780/Titan, which have 3 / 4 / 6 GB available.

So, at some point, if the card needs to load/refresh a series of images each the size of 3GB, the 690 *might* struggle, while the 7x0 / Titan does not have that problem, as there is enough RAM available to load the whole image "in one go".

However, speed comes into play too... 2GB refreshed twice at 20ns can seem smoother than 4GB refreshed once at 40ns.

The same rules apply to GDDR5 refresh rates as to "regular" DDR RAM... you get what you pay for... There IS a reason why the 690 & Titan retail around 1000 while a 770/780 is a lot less.

Although the differences between GPUs are a lot smaller than the differences that exist between RAM brands... see also "why a CORSAIR 4GB stick costs more than a no-name 16GB stick".


----------



## V3teran

I was going to sell my 690, but tbh I'm going to keep it; it's still the fastest single-card GPU that you can buy, and it's 20 months old!
The legendary king is still at the top; the only thing worth selling it for is the 790. I'll wait until then.


----------



## Proxish

Quote:


> Originally Posted by *V3teran*
> 
> I was going to sell my 690, but tbh I'm going to keep it; it's still the fastest single-card GPU that you can buy, and it's 20 months old!
> The legendary king is still at the top; the only thing worth selling it for is the 790. I'll wait until then.


I'm doing the same and waiting it out until the 790 comes out. I'm not going for anything less than an 8GB version though.
If nVidia can do a 790 with two 780 Tis inside and an 8GB version for around £700, then I'm going for that.

I'm not sure I'll stick with nVidia when the dual-card 290X comes out at 8GB. Hoping they go nuts and put in 10GB or 12GB though.

Failing all that, I'll be going for dual R9 290s with non-reference coolers, possibly BIOS-modding them to 290Xs and overclocking them. Wouldn't cost more than £600 by February.


----------



## Trissaayne

Unless a GTX 790 is two Titans, you won't see a 790 with over 3GB max; ATM the GTX 780s and Ti versions aren't allowed more than 3GB of RAM.


----------



## Proxish

Quote:


> Originally Posted by *Trissaayne*
> 
> Unless a GTX 790 is two Titans, you won't see a 790 with over 3GB max; ATM the GTX 780s and Ti versions aren't allowed more than 3GB of RAM.


Why aren't they allowed more than 3GB of ram while the 290 / 290x both have 4GB?


----------



## Trissaayne

Quote:


> Originally Posted by *Proxish*
> 
> Why aren't they allowed more than 3GB of RAM while the 290 / 290X both have 4GB?


Nvidia's rules i'm afraid


----------



## Piciato

Hi there guys,

I've been debating this issue for a couple of days now, and I would like your feedback.

Currently, I'm using a reference 780 with an unlocked BIOS to run a 1440p screen, with an i7 3820 and 16GB RAM. Recently, I got a deal from a friend: he is willing to trade his 690 for my 780.

Is it a worthy trade to you guys? Just curious.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Piciato*
> 
> hi there guys,
> 
> ive been debating this issue for a couple of days now, and i would like ur feedback.
> 
> Currently, im using a ref 780, unlocked bios, to run a 1440p screen, i7 3820, 16gb ram. Recently, ive got a deal from a friend that he is willing to trade my 780 to his 690.
> 
> Is it a worthy trade to u guys? Just curious.


It's a worthy trade, but I'd keep the 780. Even though it's a little bit slower, it has 3GB of VRAM and you don't have to deal with SLI issues. Games run smoother with a single card.

I had a 690, then switched to a Titan and like the Titan better.


----------



## Piciato

Thank you, MrTOOSHORT, for the fast response! I can put my mind at ease now. Even though the GTX 690 is the king of performance, after hearing what you said, I guess I will be sticking with my GTX 780 for now!


----------



## provost

I have both the 690 and Titan. For 1080p-1440p, it's hard to beat the 690's gameplay performance with any other single card from Nvidia. You should keep your 780, as long as you plan to go SLI in the future; the extra VRAM of the 780 will only shine when you have the extra horsepower of SLI. I can live with less AA at 1440p (which eats up VRAM), but not fewer fps.








You should also keep your 780 if you are a bencher, as the 780 can be unlocked with a modded BIOS and a volt unlock.
The 690 can be volt-unlocked, but it's of no use, as there isn't a hack yet to mod the BIOS.
I have never used a single card for gaming, except for the one in my sig rig, but that serves a different purpose.
So I can't comment on SLI vs. single-card gaming. I have always enjoyed the raw grunt of multiple cards; perhaps I am less sensitive to SLI issues.


----------



## V3teran

What is the performance of the 690 under water? What kind of OC can I expect?


----------



## V3teran

Quote:


> Originally Posted by *Proxish*
> 
> Why aren't they allowed more than 3GB of ram while the 290 / 290x both have 4GB?


Because it would eat into Titan sales, that's why. They could easily put 6GB on it.


----------



## provost

Quote:


> Originally Posted by *V3teran*
> 
> What is the performance of the 690 under water? What kind of OC can I expect?


Although I have not put the block on the 690 in a while, the 690 does not throttle as much on water with overclocks, and you can keep K-Boost enabled while gaming for constant clocks.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *V3teran*
> 
> What is the performance of the 690 under water? What kind of OC can I expect?


Expect 1150MHz 24/7 stable, and 1200MHz if you have a lucky GPU.

I had two 690s that both did around ~1200MHz stable under an EK block.


----------



## provost

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Expect 1150Mhz 24/7 stable and 1200Mhz if you have a lucky gpu.
> 
> I had two 690s that both did around ~1200MHz stable under an EK block.


Right on! That's also my recollection from my 690 benching days. Perhaps even a bit more than 1200 on benchmark runs


----------



## V3teran

Somebody I know is selling a Koolance waterblock for the 690, but it doesn't come with any thermal pads. Can I buy these separately?


----------



## lascar

Quote:


> Originally Posted by *V3teran*
> 
> What is the performance of the 690 under water? What kind of OC can I expect?


My card's ASIC quality is 68% on both GPUs.
Aftermarket air-cooled solution - Arctic Cooling brand - for the GTX 690.

PT 150%
+120 core
+700 mem
@ 1.175v

With overvolting:
PT 150%
+200
+650
@ 1.275v

On Crysis, Metro LL & FC3 I get throttling because of the TDP: the GPUs fluctuate between 1150MHz and 1274MHz.

Hope it helps tuning your beast up 

GTX 690 OC roundup:

http://www.gpureview.com/gpureviews-gtx-690-overclocking-roundup-article-945.html


----------



## provost

Quote:


> Originally Posted by *V3teran*
> 
> Somebody I know is selling a Koolance waterblock for the 690, but it doesn't come with any thermal pads. Can I buy these separately?


Actually, the Koolance block is really good, and it's the same one that I have. Thermal pads are generic and swappable.
http://www.frozencpu.com/cat/l2/g8/c487/list/p1/Thermal_Interface-Thermal_Pads_Tape.html. I would recommend Fujipoly.
Just grab some 0.5mm-thick pads from FCPU, and look up the manual online for the Koolance block to see where to place them on the PCB. You will have to cut the thermal pad to size, which is no biggie. I did the same for all my Titan blocks.
You will like your water-cooled 690 much better than the stock cooler.

Edit: make sure that you save the thermal pads from the stock cooler. I always put the stock thermal pads back on the stock cooler after taking it off, then wrap it in plastic wrap and put it away. I also take a picture of the PCB with the stock cooler off, before putting the new pads and water block on. This provides a good reference point for how to put the stock thermal pads back, in case you have to do an RMA.


----------



## Proxish

Quote:


> Originally Posted by *lascar*
> 
> My card's ASIC quality is 68% on both GPUs.
> Aftermarket air-cooled solution - Arctic Cooling brand - for the GTX 690.
> 
> PT 150%
> +120 core
> +700 mem
> @ 1.175v
> 
> With overvolting:
> PT 150%
> +200
> +650
> @ 1.275v
> 
> On Crysis, Metro LL & FC3 I get throttling because of the TDP: the GPUs fluctuate between 1150MHz and 1274MHz.
> 
> Hope it helps tuning your beast up
> 
> GTX 690 OC roundup:
> 
> http://www.gpureview.com/gpureviews-gtx-690-overclocking-roundup-article-945.html


Quote:


> Originally Posted by *Piciato*
> 
> Thank you, MrTOOSHORT, for the fast response! I can put my mind at ease now. Even though the GTX 690 is the king of performance, after hearing what you said, I guess I will be sticking with my GTX 780 for now!


Jesus, I'm stuck at +20 +50, while you have +200 +650... Just not fair...
Even before the latest terrible update, I couldn't get more than +40 +180.


----------



## lascar

I know, my friend, it's the so-called "silicon lottery".

Maybe my chip was taken from a good silicon wafer, or was more "cherry-picked" (Gainward brand... I seriously doubt it).

A few tips...

With overvolting:
PT 150%
+200 → this is only true with 314.22; I can't go further: new, up-to-date drivers cause instability
+650
@ 1.275v

Did you try some overvolting using the UNWINDER technique in MSI AB?

Maybe it could help stabilize the GPUs.

Good luck, mate.


----------



## tin0

I currently play like this (the bouncing of the core clock is from minimizing): 150% PT, 1254MHz core @ 1.275v with drops to 1.244v on both GPUs / 6832MHz memory, no throttling though.

The card reaches 48°C max on these volts.


----------



## Cylas

*@tin0*, create a shortcut with this line and add it to your autostart folder:

Code:

"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /sg0 /wi3,20,de,00 /sg1 /wi3,20,de,00

This line disables the VDrop, but it adds 0.025V to your settings.

Settings with +200 Core and +300 Mem.


----------



## V3teran

Quote:


> Originally Posted by *lascar*
> 
> My card's ASIC quality is 68% on both GPUs.
> Aftermarket air-cooled solution - Arctic Cooling brand - for the GTX 690.
> 
> PT 150%
> +120 core
> +700 mem
> @ 1.175v
> 
> With overvolting:
> PT 150%
> +200
> +650
> @ 1.275v
> 
> On Crysis, Metro LL & FC3 I get throttling because of the TDP: the GPUs fluctuate between 1150MHz and 1274MHz.
> 
> Hope it helps tuning your beast up
> 
> GTX 690 OC roundup:
> 
> http://www.gpureview.com/gpureviews-gtx-690-overclocking-roundup-article-945.html


How did you get the voltage that high on the 690?
I've used the KGB unlocker, but that's as far as I've got... a 30MHz increase on the boost clocks.
What's the best driver for a stable overclock?

Quote:


> Originally Posted by *tin0*
> 
> I currently play like this (the bouncing of the core clock is from minimizing): 150% PT, 1254MHz core @ 1.275v with drops to 1.244v on both GPUs / 6832MHz memory, no throttling though.
> 
> The card reaches 48°C max on these volts.


How did you manage to get your voltage that high on the 690?

Quote:


> Originally Posted by *provost*
> 
> Actually, the Koolance block is really good, and it's the same one that I have. Thermal pads are generic and swappable.
> http://www.frozencpu.com/cat/l2/g8/c487/list/p1/Thermal_Interface-Thermal_Pads_Tape.html. I would recommend Fujipoly.
> Just grab some 0.5mm-thick pads from FCPU, and look up the manual online for the Koolance block to see where to place them on the PCB. You will have to cut the thermal pad to size, which is no biggie. I did the same for all my Titan blocks.
> You will like your water-cooled 690 much better than the stock cooler.
> 
> Edit: make sure that you save the thermal pads from the stock cooler. I always put the stock thermal pads back on the stock cooler after taking it off, then wrap it in plastic wrap and put it away. I also take a picture of the PCB with the stock cooler off, before putting the new pads and water block on. This provides a good reference point for how to put the stock thermal pads back, in case you have to do an RMA.


Thanks, that's good advice.


----------



## lascar

An easy answer: "rewind" the thread.

You're lucky, m8, I'm a good man: SEMPER FI.

To sum up, with the Power Target maxed out:

You can raise it higher, e.g. to a 500% value, in BIOS-editing software like KGB (but the value is capped at 169% in my case when monitoring the real value), and of course it will break the PCI Express specs for sure (max 8+8 pins = 375W = 150+150+75).

The new TDP is between 380 and 475W when overvolted: you need a STRONG PSU.

*Try this guide. It's easy to follow, as long as you follow everything correctly.*

http://forums.guru3d.com/showthread.php?t=362667

*Both my GPUs go over 1200MHz when overclocked now, which is great. → TDP tricks!

(I've also updated post no. 1 with this guide.)
Flashing BIOS instructions for any Nvidia Kepler GPU.
YOU DO THIS AT YOUR OWN RISK

I wrote this guide as I couldn't find one anywhere.

First of all, you need to download these files.

GPU-Z (used to grab the original BIOS from the GPU, whether multi- or single-GPU)
http://www.techpowerup.com/downloads/2181/mirrors.php

KGB - Kepler BIOS Editor/Unlocker (used to modify the BIOS that you just grabbed from your GPU using GPU-Z)

KGB supports: GTX690, GTX680, GTX670, GTX660Ti, GTX660OEM and GTX660
https://www.dropbox.com/s/vrunxuq03vj0m5y/kgb_0.5.zip

NVFLASH (used to reflash, i.e. reinstall, the new modified BIOS)
http://www.softpedia.com/progDownloa...oad-16133.html

Steps 1-9.

1. First, make a USB dongle bootable into DOS. To do this, read this very easy-to-follow guide:
http://www.bay-wolf.com/usbmemstick.htm

2. Run GPU-Z and click on the icon that allows you to grab your BIOS from the GPU and save it to your PC.

Also make sure you keep a copy of your stock BIOS!!

3. Once you have grabbed your BIOS, put it into C:\Users\My and rename it 1.rom.

For example, the BIOS from GPU 1 of the GTX 690 is called GK104.rom; I renamed it to 1.rom, and the BIOS from GPU 2 I renamed to 2.rom.

The reason you should put the BIOS in C:\Users\My is that the default path will be set there by KGB.exe, so if you are not familiar with the command line (CMD), this is the easiest way of doing it.

4. Take KGB.exe as well as the KGB.cfg file and also place both of these in C:\Users\My.
You should now have the KGB.exe, KGB.cfg and 1.rom files in C:\Users\My.

If you have more than one GPU, then you will have 1.rom and 2.rom respectively in C:\Users\My.

5. The next thing to do is open up the command line (CMD), which I'm sure you all know how to do.
Once you have CMD open, copy and paste in this command:

kgb 1.rom unlock (press Enter)

It will now save the new values (i.e. from kgb.cfg) in your BIOS, then print out the new values in the BIOS.

6. Now all we need to do is put the new modified 1.rom back onto the GPU. To do this, we use the USB DOS bootable dongle that we created in step 1.

Take the nvflash files once you have extracted them and place them onto the USB DOS bootable dongle; there should be 2 files.

Then place the 1.rom BIOS on the USB DOS bootable dongle as well.

7. Now the dongle is ready to be booted from, so reboot the PC and go into the BIOS.
Once you are in the BIOS, set the PC to boot from the dongle. Once this happens, you will have a flashing cursor telling you that you are in DOS mode.

8. Type in nvflash --list (press Enter).
This will tell you what GPU(s) you have installed on your motherboard.

9. Type in nvflash -i1 1.rom (press Enter).

It will then say "flashing GPU" and ask you to press Y to confirm, so press Y and the flashing will commence, which takes a few seconds.

If you have a second GPU with 2.rom:

Type in nvflash -i2 2.rom (press Enter).

It will again ask you to press Y to confirm, so press Y and the flashing will commence, which takes a few seconds.

Once this is done, reboot the PC, go into the BIOS, and reset your boot order so it boots from your hard drive with the Windows OS on it.

Once you get into Windows, you can see in MSI Afterburner or EVGA Precision that the power target can be raised from 130 to 150.

When gaming, the boost clock should be higher, depending on your GPU.

I achieved an increase of around 30 on the core clock on boost alone on both GPUs, which is very good.

It will differ for each person and each GPU. You can also try other people's BIOSes, as long as it's the same GPU as your own.

You can also modify the KGB.cfg and tweak the power settings for even more results, but I wouldn't advise this unless you know what you are doing; I would just leave it at stock and use that for the time being.

Everything you do is at your own risk!!!*
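The file renaming and copying that the guide describes can be sketched in a few shell commands; a minimal illustration only, with stand-in directory names, since the real KGB/nvflash binaries and the DOS dongle are of course not present here:

```shell
# Stage the GPU-Z BIOS dumps the way the guide describes: rename them
# to 1.rom / 2.rom and copy them onto the (stand-in) DOS dongle.
mkdir -p stage dongle
touch stage/GK104_gpu1.rom stage/GK104_gpu2.rom   # stand-ins for the GPU-Z dumps
mv stage/GK104_gpu1.rom stage/1.rom
mv stage/GK104_gpu2.rom stage/2.rom
cp stage/1.rom stage/2.rom dongle/
ls dongle
```

After that, booting the dongle and running `nvflash -i1 1.rom` / `nvflash -i2 2.rom` proceeds exactly as the guide describes.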

1.3V UNLOCK VOLTAGE : http://forums.guru3d.com/showpost.php?p=4638716&postcount=22

http://forums.guru3d.com/showthread.php?t=380513

*
Quote:
Originally Posted by msi-afterburner View Post
- Added NCP4206 voltage regulator support to provide compatibility with future custom-design MSI graphics cards
The NCP4206 is also used on many modern NVIDIA reference-design cards (Titan, 7x0 and some 6x0). MSI don't provide official voltage control via the NCP4206 for such cards; however, you may try to unlock it by adding the following strings to the hardware profile (.\Profiles\VEN_10DE......cfg):

[Settings]
VDDC_Generic_Detection = 0
VDDC_NCP4206_Detection = 3:20h

You can easily verify whether your GPU is powered by the NCP4206, even without the profile modification, using the following command-line switch:

MSIAfterburner.exe /ri3,20,99

This command reads the NCP4206 ID register; if it displays the value "41", you can unlock direct voltage control via the NCP using the profile modification trick mentioned above.
*
proof :

*
1.3V seems a bit high for a 28nm process with air cooling for long periods of time; under WC it should be OK.

Quick bench...

Grid 2 downscaled from 3200x1800, 8xAA, maxed out

I'm hitting OCP (Over-Current Protection) at 1280MHz / PT 150% / 1.295V (+125mV)*

http://imagesup.org

*

/ / / PS: To put it in a nutshell...

I want to apologize for my poor English: keep in mind I'm just a poor... Frog!

All the work was done by "UNWINDER" and "V3TERAN", members from Guru3D: thanks to them!!!

And thank you, people, for sharing useful information on this thread (and others...) since the beginning: don't give it up!* / / /

Greetings from Paris


----------



## fast_fate

lascar - appreciate the effort mate








+1 from me.
I'll have a crack at my 690 over the holiday.
Got a water block to install on it yet.


----------



## V3teran

I made that guide; the KGB one, that is. It was me.

So if I follow the other bit at the bottom and put those values in, will I be able to raise the voltage on the 690?
Is it possible to do the same with EVGA Precision?
Cheers.


----------



## lascar

Quote:


> Originally Posted by *V3teran*
> 
> So if I follow the other bit at the bottom and put those values in, will I be able to raise the voltage on the 690?
> Cheers.


*
Yeah, try this:

Open x:\programfiles\MSIafterburner\Profiles\*

*If you set a profile and save it, you should normally have the following files in there: "VENXXX.cfg" and other VENXXX*.cfg files.
For example:

VEN_10DE&DEV_1004&SUBSYS_104B10DE&REV_A1&BUS_1&DEV_0&FN_0.cfg
VEN_10DE&DEV_1004&SUBSYS_104B10DE&REV_A1&BUS_2&DEV_0&FN_0.cfg

If not, please set STOCK SETTINGS in MSI AB: STOCK SPEED, POWER TARGET TO DEFAULT, AND FAN SPEED TO DEFAULT ALSO.

Save the settings in profile 1, for instance.

Close ALL MSI AB TABS, and the software itself.
Add the following part at the bottom of each configuration file:

[Settings]
VDDC_Generic_Detection = 0
VDDC_NCP4206_Detection = 3:20h

You have to do this for each *.cfg you have in the Profiles directory.

Now, if everything is cool, the next time you restart MSI AB, a popup window will appear with the sentence "You should restart the computer, otherwise MSI AB will not detect properly the speed and the fan configuration......"

Click "NO" (you don't want to restart the PC; you'd rather force the detection right now!).

MSI AB will pop up, briefly disappear, and pop up again... (look at the voltage tab): AUTO should be shown; moving the slider will display the REAL VALUE, starting from 0.80V up to 1.30V maxed out.

And sadly, there's no way it works under Precision, because MSI AB was built from the original RivaTuner code base (made by UNWINDER); Precision is more like a bootleg version... not the same dev.

Unwinder → NOTES / HELP / TIPS / TRICKS: these will help you in case you get issues setting things up.
*
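The per-file edit described above can also be scripted; a hedged sketch only, where the `Profiles` directory and the profile file name are stand-ins for the real MSI Afterburner install, and Afterburner should be closed while the files are edited:

```shell
# Append the NCP4206 unlock keys from the post above to every MSI AB
# hardware profile. "Profiles" and the VEN_10DE file name here are
# stand-ins for the real install path and profile names.
mkdir -p Profiles
printf '[Startup]\n' > 'Profiles/VEN_10DE_example.cfg'   # stand-in profile

for cfg in Profiles/VEN_*.cfg; do
  printf '\n[Settings]\nVDDC_Generic_Detection = 0\nVDDC_NCP4206_Detection = 3:20h\n' >> "$cfg"
done

grep 'NCP4206' Profiles/VEN_10DE_example.cfg
```

The loop simply appends the same `[Settings]` block to every matching `.cfg`, which is exactly the manual edit the post asks for.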
Quote:


> Guys, I've analyzed NCP4206 voltage control feedback in different forums and noticed that many users treat current implementation as beta and expect "improved" NCP4206 voltage control in future versions of Afterburner and expect to see voltage drop in idle. It won't happen, current NCP4206 voltage control approach is the maximum that you can expect from this chip programmability. NC4206 supports either external voltage control via VID pins, when driver sets desired 2D/3D voltages via GPU Boost or voltage override mode, when fixed voltage is set 24/7. So there won't be any improved version with power management enabled, sorry. "
> Quote:
> 
> > Originally Posted by DStealth
> > 
> > "Thanks Alex,
> > Working perfectly here, hope to implement it in future AB without mods... and I'll try the same mod with PrecisionX, hope it works?"
> 
> It won't work in Precision. Furthermore, Precision won't even display the altered voltage on the graphs, because it is coded to read voltages via the NVIDIA driver only; it doesn't support direct access to the VRM like Afterburner does.
> 
> Comments given there are plain wrong. AUTO is not an AB mode; AB doesn't take any control of the voltage regulator when AUTO is selected, the voltage is driven by the VID pins of the NCP4206 in this mode. And nothing is broken there. The only change from default voltage control (which takes place when no CFG edit is applied) is that you see VRM output rather than the target voltage on the graph.
> BTW, due to such comments we locked direct VRM voltage monitoring via the CHL8228 during the RADEON 7970 launch. Seeing LLC on output voltage graphs is too confusing for many users and results in stupid claims like "AB drops voltage".
> 
> Alexey Nicolaychuk aka Unwinder, RivaTuner creator


----------



## V3teran

Thank you, my friend, for that useful post!
I will try it when my waterblock arrives sometime this week.
Repped!


----------



## lascar

Quote:


> Originally Posted by *V3teran*
> 
> Thankyou my friend for that useful post!
> I will try that when my waterblock comes sometime this week.
> Repped!


You're welcome! Glad it will help you out!

God helps those who help themselves.


----------



## ZephyrBit

Cool club!


----------



## V3teran

Having some problems. I installed the Koolance block, and at idle both PCBs sit at around 35 degrees, but as soon as I run Unigine the temps go over 100 degrees on one of the PCBs. My radiator is an MCR 320 triple rad. I honestly had better performance with the air cooler. The bottom of my radiator gets hot but the rest of it is cold, so I wonder if I have an air bubble trapped in there, though I've tried to eliminate all the bubbles. I may just put the air cooler back on and **** the block back to where it came from. Totally pissed off, if I'm honest.


----------



## RnRollie

One core spiking +100 °C? Probably poor contact between chip and block.

Also, if your rad only gets lukewarm, or "hot" only at the bottom... you might not have circulation. Check the pump, check flow, and yes, check that your rad is not half filled with air.

Anyway, a 360 rad is barely enough for what is basically TWO GPUs, following the rule of thumb of a 120.2 rad per device.

PS: since I'm not seeing it in your sig... you DO have a pump, I presume?


----------



## V3teran

Well, I went back to basics and stripped the whole loop down. I cleaned the CPU block, which had some green gunk in it, and I also cleaned all the tubing, letting it soak in vinegar overnight; I did the same with the radiator. I reinstalled everything today and temps are better. The GPUs on my 690 now idle at around 27 and 25 instead of in the 30s, and my CPU temps are down by about 5 degrees as well.

When I run the Heaven benchmark I'm still getting one GPU hitting 65 degrees, which is better than 100 degrees, but the other GPU is at 35 degrees. So one is at 35 and one at 65; this still doesn't seem right. By the time I get to the end of the Heaven bench, PCB 1 is at 40 degrees and PCB 2 is at 70 degrees. Why isn't PCB 1 higher than 40?

I used some thermal pads rated at 1 mm thick, and with these the block was making contact on one PCB of the 690 but not on the other. I could tell by taking the waterblock back off and looking at the TIM on the PCBs.

So I used another thermal pad I found in the box; it seems the same thickness as the other 1 mm pads, but when I use it it seems to flatten out more, so I'm using a hybrid of these two across the block. I've now ordered some 0.5 mm thermal pads, and when they arrive I'll apply those to the block and see what happens.

I'm thinking either the block is warped or something is wrong with the pads; I'm not exactly sure.

Also, using the 331.93 beta driver made the Heaven bench crash at stock settings! I get better performance with the air cooler, unless it's the driver...?


----------



## V3teran

OK, I got some new thermal pads and used those. The 690 now idles at 23 on both GPUs, and full load is around 45 on GPU 1 and 37 on GPU 2. This is much better than before; is this similar to your PCB temps? Is it possible to get both GPUs to the same temp?

When using the MSI Afterburner hack for increased voltage, is it worth doing this after the KGB mod?
Cheers


----------



## MrTOOSHORT

Depending on the seating of the block, you could have even temps on both GPUs. In practice you won't, though, as the water passes through one GPU and warms up a tick before reaching the other GPU.

So being off by a couple of degrees is fine.


----------



## RnRollie

Yeah, up to 10 °C difference between cores is "normal" because of the waterblock design.

It's testament to how well these blocks transfer the heat from the chips/cores/VRMs to the water: you are looking at a rise of at least 5 °C, which is very, very good; normally a regular (CPU) waterblock only transfers 2 or 3 degrees with each "pass".

This is why a "big" rad and a pump doing +/- 1 GPM are important.
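As a back-of-envelope check on loop behaviour, the steady-state coolant temperature rise across a block follows dT = P / (mass flow x specific heat). A small Python sketch; the ~300 W heat load for a fully loaded GTX 690 is an assumption, not a measured figure:

```python
def coolant_delta_t(power_w, flow_gpm):
    """Steady-state water temperature rise across a block: dT = P / (m_dot * cp)."""
    KG_PER_S_PER_GPM = 3.785 / 60.0  # 1 US gallon = 3.785 L; water is ~1 kg/L
    CP_WATER = 4186.0                # specific heat of water, J/(kg*K)
    mass_flow_kg_s = flow_gpm * KG_PER_S_PER_GPM
    return power_w / (mass_flow_kg_s * CP_WATER)

# Assumed ~300 W heat load from a loaded GTX 690 at 1 GPM of flow:
print(round(coolant_delta_t(300, 1.0), 2))  # ~1.14 degC per pass
```

By this estimate the water itself only rises about a degree per pass at 1 GPM; most of the core-to-water offset reported by the sensors comes from the block's thermal resistance rather than from the coolant heating up, which is also why halving the flow rate only doubles a fairly small number.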


----------



## Proxish

Can someone help me out with my overclock?
I just had another reading and overclocking session, trying to figure out what I'm doing wrong and why I'm not achieving the overclock other people are.

With a PT of more than 100%, everything crashes at an overclock above +50/+50.
I just finished an hour-long overclocking session and ended with PT 100%, CC +120, MC +600.

Is my PSU killing my overclock here?
While using Valley at 1080p, I got 83 fps with an overclock of 135% +50 +0, and 85.8 fps with 100% +120 +600.
How does this make any sense?

I'm using Driver 331.82

Here are my overclocking session results in order of
Test number / FPS / Power Target / Core Clock / Mem Clock
1/78.4 - Stock
2/81.3 - 135%
3/83 - 135% +50
4/Fail - 135 +100
5/81.1 - 100% +100
6/Fail - 100% +150
7/81.7 - 100% +120
8/Fail - 110% +120
9/Fail - 110% +120
10/81.4 - 100% +120
11/Fail - 105% +120
12/79.6 - 100% +0 +200
13/80.2 - 100% +0 +300
14/81 - 100% +0 +500
15/80.9 - 100% +0 +600
16/85.8 - 100% +120 +600
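A result log like the one above is easier to sift once it's machine-readable. A small sketch in Python, with the data transcribed from the list (`None` marks a crashed run); it simply reports the passing run with the highest FPS:

```python
# Each tuple: (test number, fps or None for a crash, power target %, core offset, mem offset)
RESULTS = [
    (1, 78.4, 100, 0, 0), (2, 81.3, 135, 0, 0), (3, 83.0, 135, 50, 0),
    (4, None, 135, 100, 0), (5, 81.1, 100, 100, 0), (6, None, 100, 150, 0),
    (7, 81.7, 100, 120, 0), (8, None, 110, 120, 0), (9, None, 110, 120, 0),
    (10, 81.4, 100, 120, 0), (11, None, 105, 120, 0), (12, 79.6, 100, 0, 200),
    (13, 80.2, 100, 0, 300), (14, 81.0, 100, 0, 500), (15, 80.9, 100, 0, 600),
    (16, 85.8, 100, 120, 600),
]

def best_stable(results):
    """Return the non-crashing run with the highest FPS."""
    passing = [r for r in results if r[1] is not None]
    return max(passing, key=lambda r: r[1])

test, fps, pt, core, mem = best_stable(RESULTS)
print(f"best: test {test} -> {fps} fps at PT {pt}% core +{core} mem +{mem}")
```

On this data the best passing run is test 16 (85.8 fps at PT 100%, +120/+600), which matches the poster's own reading of the log; a single benchmark pass per setting is noisy, though, so "best" here means best observed, not proven stable.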


----------



## V3teran

Before doing the MSI volt mod, have people used KGB or Kepler Tweaker before using the MSI mod?


----------



## lascar

Quote:


> Originally Posted by *V3teran*
> 
> Before doing the MSI volt mod, have people used KGB Or Keplar Tweaker before using the MSI mod?


Yeah mate, KGB + the MSI Unwinder trick: everything is fine for me.

Cheers


----------



## lascar

Quote:


> Originally Posted by *Proxish*
> 
> Can someone help me out with my overclock.
> I just had another reading and overclocking session, trying to figure out what I'm doing wrong and why I'm not achieving the overclock other people are.
> 
> With a PT of more than 100%, everything crashes at an overclock above +50 +50.
> I just finished an hour overclock session and ended with PT 100% CC+120 MC+600.
> 
> Is my PSU killing my overclock here?
> While using Valley at 1080p, I got 83fps with an overclock of 135% +50 +0. And got 85.8fps with 100% +120 +600.
> How does this make any sense?
> 
> I'm using Driver 331.82
> 
> Here are my overclocking session results in order of
> Test number / FPS / Power Target / Core Clock / Mem Clock
> 
> 1/78.4 - Stock
> 2/81.3 - 135%
> 3/83 - 135% +50
> 4/Fail - 135 +100
> 5/81.1 - 100% +100
> 6/Fail - 100% +150
> 7/81.7 - 100% +120
> 8/Fail - 110% +120
> 9/Fail - 110% +120
> 10/81.4 - 100% +120
> 11/Fail - 105% +120
> 12/79.6 - 100% +0 +200
> 13/80.2 - 100% +0 +300
> 14/81 - 100% +0 +500
> 15/80.9 - 100% +0 +600
> 16/85.8 - 100% +120 +600


Power Target (PT): always max it out when overclocking; it helps normalize the TDP headroom and the speed of your GPUs.

The fact is, PT alone should NOT make your GFX crash.

Start by overclocking your GPU core speed, then the GDDR5 memory, for best results.

--> A Corsair GS 700 should be OK (at stock speed).

Bad GFX? Bad PSU, too weak on +12 V? Not enough amps? Could be...

For a system using a single GeForce GTX 690 graphics card, NVIDIA specifies a minimum 650 Watt power supply that has a maximum combined +12 Volt continuous current rating of 46 Amps or greater, and at least two 8-pin PCI Express supplementary power connectors.

For a system using two reference-clocked GeForce GTX 670 graphics cards in 2-way SLI mode, NVIDIA specifies a minimum 700 Watt power supply that has a maximum combined +12 Volt continuous current rating of 45 Amps or greater, and at least four 6-pin PCI Express supplementary power connectors.

Total power supply wattage is NOT the crucial factor in power supply selection! The total combined continuous power/current available on the +12V rail(s), rated at 50°C ambient temperature, is the most important factor.

The Corsair GS700, with its maximum combined +12 Volt continuous current rating of 56 Amps and its four (6+2)-pin PCI Express supplementary power connectors, is more than sufficient to power your system with a single GeForce GTX 690, even with a little OC.

It will also depend on the Vcore of the CPU, and other factors...
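The point about the +12 V rail can be made concrete with a two-line calculation. A sketch comparing rail capacity against the card's requirement; the 46 A figure is NVIDIA's spec quoted above, and the 56 A figure is the GS700's rating from the same post:

```python
def rail_watts(amps_12v):
    """Continuous power available on the +12 V rail(s): P = V * I."""
    return 12.0 * amps_12v

GTX690_MIN_AMPS = 46   # NVIDIA's combined +12 V spec for one GTX 690
GS700_AMPS = 56        # Corsair GS700 combined +12 V continuous rating

headroom = rail_watts(GS700_AMPS) - rail_watts(GTX690_MIN_AMPS)
print(f"+12 V headroom over spec: {headroom:.0f} W")  # prints 120 W
```

So the GS700 has 672 W of continuous +12 V capacity against a 552 W requirement, which is why the amp rating, not the 700 W label, is the number that matters.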


----------



## V3teran

Quote:


> Originally Posted by *lascar*
> 
> yeah mate, KGB + MSI UWINDER trick : everything is fine for me
> 
> Cheers


What driver do you use and what do you find are the best drivers for overclocking and gaming?


----------



## Proxish

Quote:


> Originally Posted by *lascar*
> 
> Power Target PT : always maxed out when overclocking, it will help normalize the TDP and the speed of your gpu's.
> 
> the fact is PT should NOT make your GFX crash
> 
> start overclocking your GPU cores speed and then the memory GDDR5 for best results.
> 
> --> GS 700 corsair should be ok ( at stock speed)
> 
> Bad GFX ? Bad PSU , to weak on +V12 ? not Enough AMPs ? could be ...
> 
> For a system using a single GeForce GTX 690 graphics card NVIDIA specifies a minimum of a 650 Watt or greater power supply that has a maximum combined +12 Volt continuous current rating of 46 Amps or greater and that has at least two 8-pin PCI Express supplementary power connectors.
> 
> For a system using two reference clocked GeForce GTX 670 graphics cards in 2-way SLI mode NVIDIA specifies a minimum of a 700 Watt or greater power supply that has a maximum combined +12 Volt continuous current rating of 45 Amps or greater and that has at least four 6-pin PCI Express supplementary power connectors.
> 
> Total Power Supply Wattage is NOT the crucial factor in power supply selection!!! Total Combined Continuous Power/Current Available on the +12V Rail(s) rated at 50°C ambient temperature, is the most important factor.
> 
> The Corsair GS700 , with its maximum combined +12 Volt continuous current rating of 56 Amps and with four (6+2)-pin PCI Express supplementary power connectors, is way more than sufficient to power your system configuration with a single GeForce GTX 690 graphics card even with a little O/C (???)
> 
> it will also depend on the Vcore of the CPU , and other factors...


Thanks for all the advice.
I've run a few more tests:

16/85.8 - 100% +120 +600
17/82.6 - 135% +50 +0
18/82.7 - 135% +60 +0
19/83.2 - 135% +70 +0
20/83.5 - 135% +80 +0
21/Fail - 135% +90 +0
22/Fail - 135% +0 +200
23/81.6 - 135% +0 +150
24/84.3 - 135% +80 +150

Even though I had a higher FPS at 100% +120 +600, I went with 135% +80 +150.


----------



## V3teran

OK, good news!
I managed to get the core clock up to 1241 MHz, fully stable in Heaven and 3DMark 11.

I tried for 1256 but it was unstable in Heaven; I did not try 3DMark 11.
I used the MSI volt mod, which enabled me to go to 1.3 V. I cannot go any higher on the 690; it would have been nice to try 1.31 V, as the 690 only maxes out at 50-degree temps with this waterblock.

When my ThermalPoly pads come I'll try for higher on the core.
Overall I'm pleased with the result.
PT: 150
Core: +190
Mem: +0

https://imageshack.com/i/eufd7ap


----------



## lascar

good news


----------



## iandroo888

Hi everyone. I just borrowed a GTX 690 from a friend, but when I put it into my system, nothing shows up on screen. The computer still boots into Windows (I can hear the Windows start sound) and I can press the Windows key, right arrow, and Enter to shut down...

The video card's fan spins up on high and the GEFORCE GTX logo is lit. The specs of the computer it's being used in are in my sig.

I've tried updating the mobo BIOS to the latest version, disabling high precision timing, and changing the link to Gen2... still no luck. Any ideas?! I really want to get this working! I haven't played any of the new games on higher quality yet.


----------



## V3teran

Is your DVI cable properly secured to both the card and the monitor?
Try another DVI port on the card to rule out a faulty one.
Try another GPU in your PC, if you can, and see if that works.


----------



## V3teran

Quote:


> Originally Posted by *lascar*
> 
> good news


By leaving the voltage setting on Auto now, will MSI AB still use 1.3 V while on auto voltage control?


----------



## iandroo888

Quote:


> Originally Posted by *V3teran*
> 
> Is your dvi cable properly secured to both card and monitor?
> Try another dvi port on the card to rule out faulty one.
> Try another gpu in your pc if you can and see if that works


Yes, the computer runs perfectly fine with the GTX 260.

I've tried all the ports.


----------



## V3teran

I presume you have a powerful enough PSU?
Try the card in another PCI-E slot.


----------



## iandroo888

Quote:


> Originally Posted by *V3teran*
> 
> I presume you have a powerful enough psu?
> Try card in another pci-e slot.


A Corsair TX850W. Should be enough? I tried a different slot; that doesn't work either.


----------



## Alex132

Quote:


> Originally Posted by *iandroo888*
> 
> Quote:
> 
> 
> 
> Originally Posted by *V3teran*
> 
> I presume you have a powerful enough psu?
> Try card in another pci-e slot.
> 
> 
> 
> TX850W corsair. should be enough? tried different slot. doesnt work either.

more than enough power


----------



## iandroo888

Quote:


> Originally Posted by *Alex132*
> 
> more than enough power


Yeah, that's what I thought. I even tried different PCI-E cables (thinking the ones in use may have been a little abused)...

Still to no avail. *sigh*


----------



## Buzzkill

Quote:


> Originally Posted by *iandroo888*
> 
> hi everyone. so i just recently borrowed a GTX690 from a friend. but when i put it into my system, nothing shows up. computer still boots into windows (i can hear windows start sound) and i can press windows key, right arrow, and press enter to shut down...
> 
> the video card's fan spins up on high and the GEFORCE GTX is lit. comp its being used in specs are in sig.
> 
> ive tried updating mobo bios to latest... ive tried disabling high precision timing and changing the link to GEN2... still no luck... any ideas?!? D: i really wanna get this working ! havent played any of the new games on higher quality >

Did you uninstall the driver while the 260 was in, then reinstall the driver with the 690? That could be your problem.


----------



## iandroo888

Quote:


> Originally Posted by *Buzzkill*
> 
> Did you uninstall driver with the 260 then reinstall the driver with the 690? That could be your problem.


But wouldn't the POST at least show, even if Windows doesn't display correctly? The problem I'm having right now is that NO display is shown at all. No POST, no Windows, nothing; my monitor won't even come out of standby.


----------



## Buzzkill

Quote:


> Originally Posted by *iandroo888*
> 
> but wouldnt POST at least show? if not windows correctly? problem im having at this moment is NO DISPLAY is being shown. no post. no windows. nothing. my monitor wont even get out of standby D:


Do you have your monitor plugged into the right DVI or DVI-D port?


http://www.geforce.com/hardware/technology/3dvision-surround/system-requirements


----------



## iandroo888

Quote:


> Originally Posted by *Buzzkill*
> 
> You have your monitor plugged into the right DVI or DVI-D
> 
> 
> http://www.geforce.com/hardware/technology/3dvision-surround/system-requirements


I'm using only one monitor, and I've tried the other ports.


----------



## MrTOOSHORT

Is the PCI-E switch in the on position? Just had to ask, you never know!


----------



## Buzzkill

Quote:


> Originally Posted by *iandroo888*
> 
> using only 1 monitor and ive tried the other ports


I would uninstall the GeForce driver with the 260 still installed, then install the 690, let Windows install the generic driver, then install the GeForce WHQL or beta driver you want to use.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Buzzkill*
> 
> I would uninstall GeForce Driver with 260 then Install 690 let windows install generic driver then Install the GeForce WHQL or Beta driver you want to use.


But there is no POST on the screen, and that happens before the driver would load.


----------



## Buzzkill

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> But there is no post with the screen, that's before the driver would load.


That's why you have to uninstall the driver while the 260 is in.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Buzzkill*
> 
> That's why you have to uninstall the driver with the 260


What I'm saying is that there is no screen output right after pressing the power button: no BIOS screen. The driver loads right before Windows loads, so it's not the driver that's the issue.


----------



## Buzzkill

When I set up quad SLI I would get a black screen if the monitor was plugged into the wrong card. So I had to install one 690, install the driver, then install the second 690, and sometimes the driver would need to be installed again. I also had the computer boot without the POST screen, RAID or BIOS, but when Windows started the monitor would work. I could not stand not being able to go into the BIOS, so I made sure the POST and BIOS screens show every time my system starts.

If you change/upgrade cards without uninstalling and reinstalling the driver, the system might not POST.

I know he is trying to install just one 690, but I think the system still thinks the 260 is installed: he pulled the 260 (2008) out and stuck the 690 (2012) in, causing the driver to malfunction and show no picture when the system boots. The installation instructions from NVIDIA/EVGA/Asus/Gigabyte/PNY/Palit/MSI/Zotac all say to remove the driver before upgrading.

http://www.nvidia.com/object/IO_13955.html


*Driver files should always be uninstalled before updating to newer drivers or when removing an older NVIDIA card and replacing it with a newer card. To uninstall your current NVIDIA Display Drivers from your system, follow these steps:*


Problem: The computer will only display a black screen after it boots into Windows.

Cause: The integrated video adapter has not been disabled prior to installing the new graphics card, and/or the display drivers from your previous display adapter have not been uninstalled.

Solution: Ensure the integrated video adapter has been disabled through your motherboard's BIOS or the Windows Device Manager prior to installing the new graphics card, and that you have uninstalled the drivers through the Windows Control Panel. After the new card has been installed, connect the monitor cable and, if required, any supplemental power adapters to the new card before booting up the system.


----------



## iandroo888

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> What I'm saying is that there is no screen visual right after the press of the power button, no bios screen. Right before windows loads, the driver loads. So it's not the driver that's the issue.


Correct!


----------



## Cylas

This could be a compatibility issue. ASUS and EVGA released a PLX and BIOS update that resolves many compatibility issues and adds some performance enhancements.
Go to this post and compare your PLX version with the version in the picture (use another PC for this).


----------



## Marcsrx

Hi folks, I've got a question. I'm trying to determine whether the FPS gain from putting the 690 under water is worth the cost. Reading through this thread, I've been trying to see what gains putting the card under water has provided in an OC'd setup. Can anyone chime in and provide a little cost/benefit analysis?


----------



## V3teran

Quote:


> Originally Posted by *Cylas*
> 
> *@tin0*, Create a shortcut with this line .. and add it to your autostart folder
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /sg0 /wi3,20,de,00 /sg1 /wi3,20,de,00
> 
> This line disables the vdroop, but it adds 0.025 V to your settings.
> 
> Settings with +200 Core and +300 Mem.


Hey Cylas, I made a shortcut and placed it in my startup folder with the settings you posted here.
I only get 1.3 V max, but shouldn't I be getting 1.325 V according to this?
It won't go past 1.3 V; is this normal?


----------



## lascar

You won't in any case.

He was talking about vdroop...


----------



## iandroo888

Quote:


> Originally Posted by *Cylas*
> 
> This could be a compatibility issue, ASUS and EVGA release a PLX and Bios update that resolves many compatibility issues as well as some performance enhancements.
> Go to this post and compare your PLX Version with the version on the picture (use another pc for this).


Was that directed at me? Cuz both my mobo and the card are Asus o.o


----------



## V3teran

Quote:


> Originally Posted by *Marcsrx*
> 
> Hi folks, I've got a question. Trying to determine if the FPS gain from putting the 690 under-water is worth the cost. In reading through this thread I have been trying to see what "gains" putting the card under-water has provided in an OC'd setup. Can anyone chime in and provide a little cost/benefit analysis?


Just put mine under water; I bought a custom waterblock second-hand from eBay and it works great. I now play all my games at 1241 MHz on the core; the only game I get instability with at this OC is BF4, so for that I turn it down to 1202 MHz, which is +150 on the core. To me it's well worth it, but only if you can get the block cheap, and also if you're adding to an existing water loop like I was. If you're starting from scratch I wouldn't bother until the new Maxwell or even Haswell-E. The good thing about watercooling is that the radiators, pump and reservoir go with you from build to build, so it's kind of future-proof. You just need to purchase a block, as I did.


----------



## V3teran

Quote:


> Originally Posted by *lascar*
> 
> You won't in any case
> 
> He was talking about Vdropping...


So it's like LLC, where it spikes the voltage to keep the OC stable?
Do you have this set up on yours?


----------



## Marcsrx

Quote:


> Originally Posted by *V3teran*
> 
> Just put mine underwater, bought a custom waterblock second hand from ebay and it works great. I know play all my games at 1241mhz on the core, only game i get instability with this oc is BF4 so i turn it down for that to 1202mhz which is 150+ on the core. To me its well worth it but only if you can get the block cheap and also if you adding to an existing waterloop like i was. If you going to do from scratch i wouldnt bother until new maxwell or even Haswell e. Good thing about watercooling is that the radiators, pump and reservoir go with you from build to build so its kinda future proof. You just need to purchase a block as i have.


So you are talking about a ~20% increase in MHz when under water. What does that translate into in terms of FPS? Also, I have an H100 on the CPU, which is more than enough considering I haven't even OC'd it yet. I'd do a custom loop just for the GPU and then change blocks as I move to the next card a few years down the road.

Again, I'm trying to figure out if it's worth it. If it's a 20% increase in FPS, that will bring a lot more useful life out of the card.

Anyone else have any thoughts?


----------



## V3teran

Quote:


> Originally Posted by *Marcsrx*
> 
> So you are talking about a ~20% increase in mhz when underwater. What does that translate into in terms of FPS? Also I have an H100 on the CPU which is more than enough considering I haven't even OC'd it yet. I'd do a custom loop just for the GPU and then change blocks as I move to the next card a few years down the road.
> 
> Again trying to figure out if its worth it. If its 20% increase in FPS that will bring a lot more useful life out of the card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone else have any thoughts?


Well, I couldn't run at 1150 MHz on air in all games; they got hot and unstable, and sometimes unstable even when they didn't get hot.
I don't know what the FPS increase is, but it's worth it just for the quietness.


----------



## lascar

Back home, sweet home... two o'clock in the morning!

I wish you all a merry Christmas, to begin with!

You asked me about voltage tweaking on the 690.

Just an example:

+200 / +650: core / GDDR5.
I set my voltage to 1.275 V: it will vdroop to 1.260/1.250 in game. So far I can't prevent this, and neither can Unwinder.

If you set it to AUTO, the last voltage setting you applied is kept... Vcore will flap between 1.1xx V and 1.275 V (with vdroop).

One GPU can easily handle 1300 MHz, while the other sits between 1274 MHz and, under very heavy workload, drops to 1110 MHz (with a +50 setting for me).

So if I set a lower resolution (3200x1800 > 2880x1620 > 2560x1440), the second GPU will stay at 1274 instead of lowering its voltage and speed.

So the power target and NVIDIA's OCP are pure ****...

On your last statement:
1202 vs 1241 MHz... +0.38 FPS gained? Keep your money and start playing!

To put it in a nutshell, vdroop forces you to add more voltage (Vcore, basically) to sustain a heavy workload...
Obviously that warms the VRM circuitry more, and your GFX will start to throttle sooner: PT will kick in... and **** comes in!

The Volterra chip is vdrooping, that's a fact.

Maybe if we could manage to force a constant voltage without vdroop, it would help all of us for sure... so, Back to the Future?!

The Unwinder trick via the VRM chip is possible on the GTX 690, but guess what, NVIDIA put a cheap VRM circuit on this GFX: 8+2 phases.

4 + 4 for the two GPUs and 1 + 1 for the RAM.

So basically, when overvolting there is a HUGE load on the VRMs, and they were not designed to support such a load for a long period of time.

"Good OC, bad OC..."
... silicon lottery, as I said earlier... lifespan is the same deal...

All your games must run at roughly the same average speed to get the best performance; if not, lower your graphics settings or your overclock.

Is it worth it or not? There's the cost, and if you do it wrong, with insufficient cooling, it will shorten the lifespan of your GFX for sure... for benching it's OK; for gaming, stay at +150...

The fact is, NVIDIA has put some locks in the driver; Unwinder, via AB, has bypassed NVIDIA's OCP.

NVIDIA claimed it was to prevent too many RMAs... My POV is that they already knew the GK104 was the best gaming GPU the firm had done so far: the best core ratio (3x compared to the previous GTX 580), not a "greedy" GPU, and it can overclock well. So they decided to prevent this... NVIDIA banned voltage modification.

Or it was for other purposes...

And it's a fact that the GTX 690 is still the fastest card, OC'd or not; maybe the 780 Ti Kingpin Edition will be the next king of the hill!

I wish you a good night, dear sir...


----------



## Code187

Never mind, jcde7ago is apparently not monitoring this thread.


----------



## skyn3t

Happy New Year, everyone!

skyn3t wishes the best for you and your family.


----------



## tin0

Best wishes right back atcha!









To confirm Lascar's post: running overvolted anywhere from 1.21 V to 1.3 V, with or without the LLC mod, will result in a lower 3DMark score, possibly due to VRM throttling.
Example: running stock volts with both cores @ 1215 MHz gives me a graphics score in 3DMark11 of ~21,900, which is nice. Touching the voltage and running 1228 MHz to 1300 MHz results in a graphics score of ~16,500 due to heavy throttling (it doesn't matter if the voltage is set to 1.21 V, 1.275 V or even 1.3 V), while the card never goes above 48 degrees under water.
Both runs were on the stock BIOS, but with a 150% power limit.
So I've gone back to just upping the core clock to 1202 MHz, without touching voltages, for 24/7 use under water. Ergo: days of fiddling with voltage and LLC didn't get me anywhere (and I know what I'm doing).

The biggest problem is the crippled power limit, in my opinion. We definitely need a power limit above 150%.


----------



## V3teran

Quote:


> Originally Posted by *lascar*
> 
> back to home sweet home ... 2 O clock in the morning !
> 
> Wish u all a merry Christmas to begin with !
> 
> You ask me about the feature, voltage tweaking for the 690 :
> 
> just an example :
> 
> 200/650 : core / GDDR5
> I set my voltage to 1.275V : it will Vdrop to 1.260/1.250 ingame so far i can't prevent this, either Unwinder do.
> 
> If you set to AUTO : the last voltage setting you set earlier is kept ... vcore will flap between 1.1xxV and 1.275V (with Vdrop)
> 
> one Gpu can easily handle 1300 mhz and the other one between 1274 mhz and going down to 1110 (+50 setting for me) under very strong workload
> 
> so i set a lower resolution ( 3200x1800p > 2880x1620p > 2560x1440p) and the second gpu will stay at 1274 still if not lower the voltage and the speed of the gpu's.
> 
> so The power target and Nvidia OCP is pure ****
> 
> 
> 
> 
> 
> 
> 
> ...
> 
> your last statement :
> 1202 / 1241 ... + 0.38 FPS gained ? keep your money and start playing !
> 
> 
> 
> 
> 
> 
> 
> 
> 
> to put in a nutshell v dropping force you to add more voltage , vcore basically to sustain high workload ...
> obviously it will warm more the vrm circuitry and your gfx will start sooner to throttle : PT will kick in... and **** comes in!
> 
> the volterra chip is vdropping that's a fact
> 
> maybe if we can managed to force constant voltage without adding vdropping it will help for sur all of us ... so Back to the future ?!!
> 
> the unwinder trick via the voltera VRM chip is possible on the gtx 690 but guess what nvdia has putted a cheap vrm circuitry on the gfx : 8+2 phases
> 
> 4 + 4 for the to gpu and 1+1 for the ram
> 
> so basically when overvolting there is a HUGE workload on the VRM and they were
> not designed to support such workload for a long period of time..
> 
> "good OC bad oc ..."
> .... silicon lottery as i said earlier ... lifespan is the same deal ...
> 
> all your game must run at the exact speed on the average to have the best performance : if not lower your graphic settings or your overclocking
> 
> it's worth it ? or not , the costs and if you do it wrong with not sufficient cooling it will shorten the lifespan of your gfx for sure ... for benching it's ok ... for gaming stay at 150+ ...
> 
> the fact is nvidia as put some locks in the driver, unwinder via AB has bypassed nvida OCP
> 
> Nvidia claimed it was to prevent to much RMA, ... My POV is that they already know that the GK104 was the best game gpu the firm have done so far : best core ratio x3 compared to previous 580 gtx , and not "greedy" gpu and it can overclock
> well so they decide to prevent this... Nvidia banned Voltage modification.
> 
> or it was on other purposes ...
> 
> and that a fact the 690 gtx is still the fastest card "occed" or not , maybe the 780 ti kigping editon will be the next king of the Hill !
> 
> Je vous souhaite bonne nuit cher Monsieur ...


Quote:


> Originally Posted by *tin0*
> 
> Best wishes right back atcha!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> To confirm Lascar's post: running overvolted anywhere from 1.21V to 1.3V, with or without the LLC mod, will result in a lower 3DMark score, possibly due to VRM throttling.
> Example: running stock volts with both cores @ 1215MHz gives me a graphics score in 3DMark11 of ~21,900, which is nice. Touching voltages and running 1228MHz to 1300MHz results in a graphics score of ~16,500 due to heavy throttling (it doesn't matter if the voltage is set to 1.21V, 1.275V or even 1.3V), while the card never reaches temps above 48 degrees under water.
> Both runs were on the stock BIOS but with a 150% power limit.
> So I've gone back to just upping the core clock to 1202MHz without changing voltages, for 24/7 use under water. Ergo: days of fiddling with voltage and LLC didn't get me anywhere (and I know what I'm doing).
> 
> The biggest problem is the crippled power limit, in my opinion. We definitely need a power limit higher than 150%.


Thankyou for the explanation guys and a very happy new year too you!


----------



## Infinite Jest

Do you guys think this card is still worth buying if a good deal can be had for it? Has anyone had issues with micro-stuttering or hitting the VRAM ceiling often?


----------



## V3teran

Quote:


> Originally Posted by *Infinite Jest*
> 
> Do you guys think this card is still worth buying if a good deal can be had for it? Has anyone had issues with micro-stuttering or hitting the VRAM ceiling often?


No issues with micro-stuttering: the 690 has hardware-based frame metering technology, so it's actually smoother than 2x 680s in SLI.
http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-690/8/

I do not hit the VRAM limit with the anti-aliasing built into most games, but when I use Nvidia Inspector I can hit that limit easily.
It also depends on what resolution you're playing at. I play at 2560x1440, so I could do with at least 3GB of VRAM, but tbh most games max out at around 1.5GB, so the card is still OK for now.
A lower resolution than this should see you good; otherwise, just get a 780 Ti with 3GB of RAM.
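For anyone who wants to watch how close they're getting to the 2GB ceiling without keeping GPU-Z open, `nvidia-smi` can report per-GPU memory use. A minimal sketch (assuming `nvidia-smi` is on the PATH; the parser is just an illustration of reading its CSV output):

```python
import subprocess

def parse_memory(csv_output):
    """Parse 'used, total' MiB pairs from nvidia-smi CSV output,
    one line per GPU (both GPUs of a 690 show up separately)."""
    gpus = []
    for line in csv_output.strip().splitlines():
        used, total = (int(x) for x in line.split(","))
        gpus.append({"used_mib": used, "total_mib": total,
                     "headroom_mib": total - used})
    return gpus

def query_vram():
    """Ask the driver for current VRAM usage (requires nvidia-smi)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"], text=True)
    return parse_memory(out)

if __name__ == "__main__":
    for i, gpu in enumerate(query_vram()):
        print(f"GPU{i}: {gpu['used_mib']} / {gpu['total_mib']} MiB "
              f"({gpu['headroom_mib']} MiB headroom)")
```

Note that in SLI/AFR most resources are mirrored on both GPUs, so watching either one gives you the effective 2048 MiB ceiling.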


----------



## lascar

RBBY volt mod, give it a try!

http://www.overclock.net/t/1398725/unlock-afterburner-limits-on-lots-of-cards-some-with-llc-ab-b-18

I don't want to lose the battle against NVIDIA: keep looking for the best volt mod without vdroop.

Stay tuned, m8


----------



## noob.deagle

Quote:


> Originally Posted by *Infinite Jest*
> 
> Do you guys think this card is still worth buying if a good deal can be had for it? Has anyone had issues with micro-stuttering or hitting the VRAM ceiling often?


I have never experienced micro stutters, but I have hit the VRAM ceiling pretty often. Crysis 3 will hit the VRAM limit; Battlefield 4 also hits it.

Having said that, dropping some AA settings resolves this completely, and since I've moved to Windows 8.1 most of the stuttering, at least in BF4, has completely disappeared despite usage being locked at 2GB (I assume this has something to do with the DX upgrade in 8.1).

That said, if I could sell my 690 I would, and right now I believe you'd be better off looking at the custom-cooled AMD R9 290 range: performance is close to the 690, the VRAM limit is double, and with Mantle being used for lots of upcoming high-profile games, AMD looks like a winner atm. If you're thinking of buying something, wait for some Mantle benchmarks, then make your call.

BUT even at 1080p the 2GB VRAM limit is an issue, especially on a card this powerful, as you're limited by the VRAM rather than the card's raw power, which is sad.


----------



## lascar

If the card had shipped with 4GB of GDDR5, no one would buy the Titan, the GTX 780, or the "GHz" and "Ti" editions... that's a fact, and NVIDIA already knew it.

The VRAM limitation and the ban on voltage tuning are the keys to their success...

Sad world... consumerist society...


----------



## virgis21

Yes, the VRAM is the sad part.
Actually, when I bought the GTX 690 I somehow just assumed it would have 4GB of VRAM and thought that would be enough for a long time. Once I had it, I realised it is not 4GB, it is 2x2GB.

So, how satisfied are you guys with Battlefield 4 on all Ultra? And does GPU-Z show about 1900MB of VRAM utilisation for you too?
Where can I sell mine?

Virgis


----------



## delacruzpaolo19

Has anyone tried the GeForce 332.21 driver here?
What's your in-game performance?
Is anyone having issues with their GTX 690 on this driver?

Any reply would be much appreciated.
Thanks!


----------



## Cylas

The 332.23 driver has also received a few revisions to the boost behavior; this is not mentioned in the changelog. GPU1 on my GTX 690 is now throttling in games like BF4 or AC4 as well, where previously it only did so in 3DMark! (My monitor is on GPU2.)


----------



## delacruzpaolo19

Quote:


> Originally Posted by *Cylas*
> 
> The 332.23 has also received a few revisions of the boost behavior. This is not mentioned in the changelog. My 690GTX GPU1 is now throttling also in Games like BF4 or AC4! otherwise only in 3DMark! (Monitor is on GPU2)


where did you get the 332.23?


----------



## Marcsrx

Bought the ARCTIC Accelero TT 690 VGA Cooler (Only $49.99 on Newegg) and Windows 8.1

We shall see how DX 11.2 + better cooling affect BF4 FPS. I also bought 3 1 TB drives for a RAID 5 array. Should be fun to button it all up next week!


----------



## virgis21

Do I have to migrate from Win 7 to Win 8.1 to get more performance from my GTX 690? Or is the difference not worth a whole reinstall and redownloading tons of GB?

Virgis


----------



## mtbiker033

Quote:


> Originally Posted by *delacruzpaolo19*
> 
> Has anyone tried the GeForce 332.21 driver here?
> what's your in game performance?
> is there anyone having issues with their GTX 690 using this driver?
> 
> any reply would be much appreciated
> -thanks


I installed it the day it was released and have had no problems with any games I play (RO2, CS:GO, Metro: LL). My 3DMark score was very similar to my previous one with 331.82.


----------



## Marcsrx

Quote:


> Originally Posted by *virgis21*
> 
> Do I have to migrate from win 7 to win 8.1 to get more performance from my GTX 690? or that difference doesn't worth whole reinstall/redownload tons of GB ?
> 
> Virgis


The biggest help would be DX 11.2.

It brings DirectX tiled resources, but from what I gather it will require graphics drivers written to support it. It should help with the VRAM issue, as it can use system memory in addition to VRAM; something of a game changer as far as video memory goes.

Here's an interesting video describing it:

http://channel9.msdn.com/Blogs/Windows-Blog/Tiled-Resources-for-DirectX-in-Windows-81

Even better video:


----------



## delacruzpaolo19

Quote:


> Originally Posted by *mtbiker033*
> 
> I installed it the day it was released and have no problems with any games I play (RO2, CS GO, Metro LL). 3dmark score was very similar to my previous with 331.82


Thanks, reps given, mtbiker033.
Have you tried BF3 and BF4?
Any issues with our overclocked 690s?


----------



## killbom

How is that cooler working out for you?


----------



## lascar

http://www.overclock.net/t/1425102/updated-ab-b18-team-skyn3ts-unlocked-ncp4206-voltage-llc-mod-tool

Team Skyn3t presents: Unlocked NCP4206 Voltage / LLC mod tool


----------



## V3teran

Quote:


> Originally Posted by *lascar*
> 
> http://www.overclock.net/t/1425102/updated-ab-b18-team-skyn3ts-unlocked-ncp4206-voltage-llc-mod-tool
> 
> Team Skyn3t presents: Unlocked NCP4206 Voltage / LLC mod tool


So what is this tool used for? Is it worth using?

Also, this no longer works with Afterburner 2.3.1:
[Settings]
VDDC_Generic_Detection = 0
VDDC_NCP4206_Detection = 3:20h
I cannot get it to work, so I rolled back to a beta.


----------



## lascar

Disabling LLC will prevent vdroop!

When bypassing the 1.175V limit using the Unwinder trick, you normally have to put in more juice to sustain a given speed. Now you can lower the vcore without vdroop: the card will throttle later, and maybe not at all.

I have gained +50MHz and -0.025V (-25mV).
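The vdroop being traded away here follows simple load-line arithmetic: the regulator subtracts an amount proportional to load current from the setpoint, so the GPU sees less voltage under load than you dialed in. A rough illustration (all numbers are made-up placeholders, not measured GTX 690 values):

```python
def v_load(v_set, i_load, r_loadline):
    """Voltage actually seen under load with an active load line:
    the regulator subtracts I * R from the setpoint (vdroop)."""
    return v_set - i_load * r_loadline

# Illustrative, invented numbers: 1.175 V setpoint, 80 A GPU load,
# 0.3 milliohm load-line resistance.
droop    = v_load(1.175, 80, 0.0003)  # ~1.151 V reaches the die
no_droop = v_load(1.175, 80, 0.0)     # load line zeroed: full 1.175 V

# Zeroing the load line recovers the droop, which is why the same
# clock can then be held with a lower dialed-in vcore.
recovered_mv = (no_droop - droop) * 1000  # 24 mV in this example
```

With placeholder values in this ballpark, the recovered margin comes out in the tens of millivolts, consistent with the ~25 mV reported above.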


----------



## propeldragon

I did this with my 690, but only GPU1 is changing voltage; GPU2 stays at 1.175V. Help?


----------



## FiShBuRn

Well, with that mod (the Unlocked NCP4206 Voltage / LLC mod tool) I get the same problem as with a modded BIOS: the PT limit! So the only thing I do is disable LLC.
My stable clocks are 1215MHz (sometimes 1228MHz on GPU2) and 7200MHz on the memory.


----------



## ricardovix

Guys, does the eVGA GTX 690 support 1440p through the Mini DisplayPort connector, or only through dual-link DVI?

Thanks!


----------



## nickcnse

Hey guys, just purchased a gtx 690 from craigslist with an xspc waterblock. It was sold as broken and I'm hoping it ends up working but if not I need to attempt to RMA it. My question comes in two parts: where can I get a stock cooler for the gtx 690? And is there a replacement screw package sold somewhere so that I can remount the stock cooler? Thanks in advance.

If anyone has the screws and stock cooler that would be willing to sell let me know!


----------



## killbom

Quote:


> Originally Posted by *ricardovix*
> 
> People, eVGA GTX 690 supports 1440p resolution through mini displayport connector or only through DVI Dual Link???
> 
> Thanks!


Works perfectly over mini displayport for me


----------



## ericf1z

Hey guys,

I could really use your help. I got a "broken" GTX 690 for free and it seems so close to working, but it always crashes. This is an NVIDIA reference board, so no chance of RMA'ing it. While playing any sort of graphics-intensive game, the card will randomly stop outputting video. You can hear the game still running in the background, but the card has stopped displaying; when this happens, you can also hear the card's fan spin way down. I have never seen it run higher than 84C, which I believe is in spec. I have tried the card on multiple machines and power supplies, ranging from 650W (the minimum for this card) up to 1000W. The card will successfully complete a pass of the 3DMark tests, but then show the problem the next time you run it. I have attempted to flash many different BIOSes from ASUS, EVGA and stock NVIDIA, along with the PLX, and still see the same issues.

My question: is this a known issue with this card that could be fixed with some BIOS tweaks? Would down-clocking the memory, lowering the voltage or improving cooling resolve it? I have tried this kind of thing with MSI Afterburner without luck. I also tried baking it in the oven as a last resort; the card works exactly as well as it did before the bake, still displaying the same issues.

Any help is appreciated guys.


----------



## NotReadyYet

I can get a 690 now for $800...debating if I should just sell my 7970 and pull the trigger on it? I'd be shelling out over $400 once I sell my 7970.

I'd get another 7970, but I don't think my Seasonic 750W will take it; besides, I haven't heard good things about CrossFire... thoughts?


----------



## ericf1z

Quote:


> Originally Posted by *NotReadyYet*
> 
> I can get a 690 now for $800...debating if I should just sell my 7970 and pull the trigger on it? I'd be shelling out over $400 once I sell my 7970.
> 
> I'd get another 7970 but I don't think my 750X Seasonic will take it, besides I havnt heard good thing about x-fire...thoughts?>


Not worth it. The GTX 790 is right around the corner, along with the new Titan Black. Prices will drop big time...


----------



## ericf1z

Quote:


> Originally Posted by *ericf1z*
> 
> Hey guys,
> 
> I could really use your help. I got a "broken" GTX 690 for free and it seems so close to working, but it always crashes. This is an NVIDIA refrence board, so no chance of RMA'ing it. While playing any sort of graphics intensive game, the card will randomly stop outputting video. You can hear the game still running in the background, but just the card stopped displaying. When this happens, you can also hear the card fan spin way down. I have never seen it run higher than 84c, which I believe is in spec. I have tried the card on multiple machines and power supplies ranging from 650w (the min for this card) up to a 1000 watt PSU. The card will successfully complete a pass of 3dmark tests, but then have the problem the next time you try and run it. I have attempted to flash many different BIOS' from ASUS, EVGA and stock nvidia, along with the PLX and still see the same issues.
> 
> My question to you guys, is this a known issue with this card which could be fixed with some BIOS tweaks? Would down-clocking the memory or lowering voltage or improving cooling resolve this problem? I have tried this type of stuff with MSI after burner without luck. I also tried throwing it in the oven as a last resort. The card still works as well as it did before the bake, still displaying the exact same issues.
> 
> Any help is appreciated guys.


bottom of the last page.


----------



## lascar

Try running these:

FurMark, for stability testing and VRM issues:

http://www.ozone3d.net/benchmarks/fur/index.php?lang=1

Stanford's CUDA memtest, to test for faulty GDDR5:

http://www.techsupportforum.com/forums/f256/gpu-memory-test-438710.html

Download: https://simtk.org/home/memtest/

"MemtestG80 and MemtestCL are software-based testers for "soft errors" in GPU memory or logic, for NVIDIA CUDA-enabled or OpenCL-enabled (any manufacturer) GPUs. They use a variety of proven test patterns (some custom and some based on Memtest86) to verify the correct operation of GPU memory and logic. They are useful tools to ensure that given GPUs do not produce "silent errors" which may corrupt the results of a computation without triggering an overt error."
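For the curious, the core of the Memtest86-style patterns these tools use is easy to sketch: write a known pattern across a buffer, verify it, then repeat with the bitwise inverse so every bit is exercised in both states. A toy version in plain Python (ordinary system RAM standing in for GDDR5; the real tools do this on the GPU via CUDA/OpenCL, which this sketch does not):

```python
def moving_inversions(buf_words, pattern=0xAAAAAAAA):
    """Toy Memtest86-style 'moving inversions' pass over a list of
    32-bit words: write pattern, verify, write inverse, verify.
    Returns a list of (index, expected, got) mismatches."""
    errors = []
    for pat in (pattern, pattern ^ 0xFFFFFFFF):
        # Write phase: fill the whole buffer with the pattern.
        for i in range(len(buf_words)):
            buf_words[i] = pat
        # Read phase: any word that no longer matches is a "soft error".
        for i, got in enumerate(buf_words):
            if got != pat:
                errors.append((i, pat, got))
    return errors

# A healthy buffer produces no errors; a stuck or flipped bit in real
# (faulty) memory would show up in the returned list.
mem = [0] * 1024
assert moving_inversions(mem) == []
```

On faulty GDDR5, overclocked too far, these tools report mismatches like the tuples above instead of silently corrupting frames or compute results.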


----------



## Marcsrx

So this Arctic Accelero rules! I hardly hit 70C on the hottest core during BF4; I was previously hitting as high as 78.


----------



## lascar

1268MHz @ 88C, PT 150%, 1.24V.

Arctic rulez indeed


----------



## philsrb

Hello everybody,

I am now officially the owner of a working GTX 690. I purchased one a couple of months ago for a custom rig and only finished the build very late; then I realized the card was not working properly. It is an ASUS GTX 690 with an XSPC waterblock in a custom loop.
Thanks to a very nice user from hardwareluxx, it is now working. It seems the problem was that it had a UEFI BIOS with two different BIOSes on it; he got it working with the standard BIOS from EVGA.
Now I would like to get everything I can out of this card by overclocking it. With Afterburner, I could get 1228MHz stable with no voltage mods whatsoever.
Since I haven't been into gaming for many years, I am a little out of ideas on where to start... I have played a bit with the Kepler BIOS Tweaker and flashing, but for now I am stuck at a max voltage of 1.175V.
Can anyone give me advice, or maybe make a custom BIOS for me? The friend who fixed the card told me to ask if anyone could check the BIOS, since he didn't exactly figure out what the problem was. I could NOT flash skyn3t's custom BIOS files onto it because of an invalid board ID, even with override. One of my GPUs has SP8, the other SP16 (same board, not two 690s).


Update:

I have modded my BIOS to 1.21V and a 175% power limit, and I'm getting stable results at 1202MHz on the core and 3499MHz on the memory.
If I go beyond 1202MHz it will not actually raise my core clock (at least that's what the Afterburner OSD shows). Why is that?


----------



## Mms60r

I have been looking at these 690 cards on ebay for a new build I'm starting. I can probably get one for under $700. Right now I have 7970's crossfire and a single 2560x1440 display. My goal is to get as close as possible to the performance of the crossfire setup using a single card in the itx case (yes it fits). I guess my question is do you think I'd be wiser to go with a 780ti ($700) or R9 290x ($550) The card will be fully water cooled so OC is a definite option. Truthfully I am really leaning toward the GTX. Thanks for any thoughts you can share.


----------



## provost

Here are mine.







I have had these for a while. Love these cards, but I think I may have too many gpus at this point....lol

http://s1364.photobucket.com/user/provostelite/media/image_zpsca7d5128.jpg.html


----------



## FiShBuRn

WOW, that's a lot of power over there! Very nice. They should have made this card with 3GB of VRAM...


----------



## provost

I know people keep harping about the VRAM, but I have been using these while I do a rebuild with my Titans for surround 120Hz/3D. The 690s have been great for all games on my 1440p and ASUS 120/144Hz monitors. I have multiple Titans for surround, and I can tell you that whatever magic NVIDIA did with the 690's dual-GPU SLI, I have been very pleased with it. I have never had an issue with quads either. Maybe this is one of the reasons I have not been motivated to finish my multi-Titan build, other than being busy at work (or maybe it is my excuse for being lazy).


----------



## Lagpirate

Hi guys, I figured this was the best place to ask..
I'm currently debating selling my 690 for some extra funds towards the new 790, or keeping it and watercooling it. I only play at 1080p.

What do you guys think?


----------



## nickcnse

Hey guys, I was hoping to join the GTX 690 club, but the used card I purchased isn't displaying video. I'm hoping to RMA the card once I reinstall the stock cooler, but unfortunately the original screws for the cooler were misplaced by the original owner. Does anyone have an idea of what size and type of screws they are? I heard somewhere that they are Torx T5 screws, but how long are they? For reference, it is an EVGA card. Thanks for the help!


----------



## superx51

http://www.xbitlabs.com/news/graphics/display/20140122231145_Nvidia_Readies_Dual_Chip_Flagship_Graphics_Card_with_Two_GK110_GPUs.html


----------



## PinzaC55

Just joined the Quad SLI club having obtained a second 690 already watercooled on ebay! I am surprised at the high prices being quoted stateside for 690's - one went a couple of weeks ago on Ebay for the opening bid of £399.95. (used of course, and stock).


----------



## Lynkdev

Is the modded BIOS on the first page still the best?

Also, using Precision I have the power target set to 135, the GPU offset at +145, and memory at +200. Do I need to raise the voltage setting to accommodate this, or is it handled automatically? I set it to the max of 1.150V.


----------



## Mms60r

Well, I just bought a brand new ASUS GTX 690 for $620 shipped; looking forward to joining the club. I've read a lot of posts scaring me about the 2GB of VRAM for my 2560x1440 display. Also, how well do these cards OC? I got a full EK waterblock to go with it.


----------



## provost

Here is my take. As far as NVIDIA cards go, I have multiple Titans, a 780 KPE, and these cards. I would be lying if I said I have not really enjoyed the 690s; they are some of the best cards I have owned. They are in my backup rig now, and I have not had any issues with games at 1440p.
A watercooled 690 will be a decent clocker, but these clock just as well on air cooling; 1200MHz is probably average. Again, I'm not sure if NVIDIA showed this card a lot of love when developing it, but they are still very sweet, and probably the best demonstration of the Kepler architecture, developed from the ground up.


----------



## xXx1990

OK, so before I get yelled at: I know my case is a mess... I'm planning on getting a new case soon, so everything will eventually be tucked in and neat, lol.

Anyway, I've had the card for about a year now and haven't even logged into this site in a long time, so I never really got a chance to post it. Here it is; it's an EVGA model.


----------



## avatardiablo

A question about GTX 690 SLI mode: is 1.225V with LLC OK for 24/7 daily use on air?


----------



## provost

Quote:


> Originally Posted by *avatardiablo*
> 
> a question for gtx 690 sli mode
> 1.225v with llc for daily use 24/7 air is ok ?


No. It won't do you any good, as you would temp throttle down. Besides, I don't think that we have working custom bios to truly take advantage of the volt unlock on the 690s. I have not tested custom bios much, but some others may have, so I could be wrong?


----------



## avatardiablo

Quote:


> Originally Posted by *provost*
> 
> No. It won't do you any good, as you would temp throttle down. Besides, I don't think that we have working custom bios to truly take advantage of the volt unlock on the 690s. I have not tested custom bios much, but some others may have, so I could be wrong?


I'm trying the volt mod with MSI Afterburner beta 18.
With the volt mod you can go up to 1.3V (but I don't go that high).

In the BIOS I just changed the power target up to a max of 150%; you cannot change the voltage there.

I use air cooling: two Arctic Accelero 690s.

The maximum temperature under stress at 1.225V, +150 on the cores and +600 on the memory (Heaven 4.0 bench), is 82C.


----------



## provost

Quote:


> Originally Posted by *avatardiablo*
> 
> I'm trying the volt mod with version beta18 msiafterburner.
> with voltmod you can climb up to 1.3v. (but not I go up to them)
> 
> In the bios I just changed the target power up to 150 max.
> you can not change the voltage.
> 
> I use air cooling 2 artic accelero 690
> 
> maximum temperature under stress 1.225v +150 +600 cores and mem (heaven 4.0 bench)
> 
> arrival at 82 ° c max


I know that the volt mod works, but I am not sure if the power target actually works, even when changed in bios, possibly due to the complexity of 690 bios having a dual gpu, and a plx chip. Have you tried the over volt on benches with and without the modded bios? If so, can you share some results? Thanks


----------



## avatardiablo

Quote:


> Originally Posted by *provost*
> 
> I know that the volt mod works, but I am not sure if the power target actually works, even when changed in bios, possibly due to the complexity of 690 bios having a dual gpu, and a plx chip. Have you tried the over volt on benches with and without the modded bios? If so, can you share some results? Thanks


I have not compared the original and modified BIOS; I edited it before testing anyway, so I do not know whether the 150% max power target does anything... I'll watch GPU-Z under stress.

But +150MHz on the core at the default vcore was not possible before the volt mod and the added LLC droop mod.

I will do further testing.


----------



## Alex132

Anyone else having issues with the new driver? My GPU would only go up to 36% power, and get horrible FPS.

And if I change from SLI to non-SLI back to SLI my driver dies, and I have to re-install it.


----------



## killbom

Just a heads up: installing the Twin Turbo 690 killed my GTX 690. Everything was correctly mounted, but after a couple of minutes of stress testing the GPU died. I guess some VRM or similar component didn't get enough cooling.

On the plus side, I now have an awesome bookend for the books in my living room.


----------



## PinzaC55

Quote:


> Originally Posted by *killbom*
> 
> Just a heads up. Installing the Twin Turbo 690 gilled my GTX 690. Everything was correclty mounted but after a couple of minutes of stress testing the GPU died. I guess some kind of VRM or similar didn't get enough cooling.
> 
> On the plus side i have an awesome bookstopper for my books in the living room


Oh no, commiserations.

On the positive side, I put my second GTX 690 in today without a hitch, and here are the twins (or is it quads?) in action.


----------



## killbom

Quote:


> Originally Posted by *PinzaC55*
> 
> Oh no
> 
> 
> 
> 
> 
> 
> 
> commiserations
> 
> On the positive side I put my second GTX 690 in today without a hitch and here are the twins (or is it quads?) in action.


Nice rig. I guess the Twin Turbo I got was faulty in some way; on closer inspection, the insulating plastic seems to be off in some places. On the plus side, the cooler was very quiet while it worked.


----------



## lascar

Sad for you, dude! Bad news; mine has been working for months... you're unlucky.

The famous insulating strip over the VRMs, to prevent... "kaboom". Guess what? I threw that strip straight in the trash bin... no issues at all!

Here's what KitGuru said in their review of this custom cooler:

"ARCTIC recommend you use the little insulation strips to cover circuit components. There is really no need to use these as the card spacing wouldn't touch any of the sensitive components. If you want to be safe, then run the insulation strips across the circuitry close to the VRMS on the board."

Cheers


----------



## PinzaC55

Better than 97% of results.


----------



## killbom

Quote:


> Originally Posted by *lascar*
> 
> Sad for you dude ! bad news got mine working for months .... You 're unlucky
> 
> 
> 
> 
> 
> 
> 
> 
> 
> the famous insulating strip .... over the VRM To prevent... "KaBOom"
> 
> Guess what ? i've thrown this strip directly to the trash bin ....
> 
> No issues at all !
> 
> what kitguru said during the review of this custom cooler :
> 
> "ARCTIC recommend you use the little insulation strips to cover circuit components. There is really no need to use these as the card spacing wouldn't touch any of the sensitive components. If you want to be safe, then run the insulation strips across the circuitry close to the VRMS on the board."
> 
> Cheers


I ran the insulation strips like a boss. Maybe I should have skipped them!

Anyhow, I have an R9 290 now and it seems to be working out OK.


----------



## Buzzkill

Quote:


> Originally Posted by *PinzaC55*
> 
> Better than 97% of results.


Better than 99% of results


----------



## PinzaC55

Quote:


> Originally Posted by *Buzzkill*
> 
> Better than 99% of results


Nice. What settings/overclock did you use?


----------



## Alex132

I thought the max +TDP for the GTX 690 was 135%... but I have gotten to +155% before.

Spoiler: Maybe it's because of this?

Sorry for the shoddy picture.

I just thought, seeing as I had a bunch of spare VRAM heatsinks lying around, I might as well use them.

----------



## mac13

I currently have an EVGA GTX 770 Classified and am looking at upgrading to a 690. I have the option to step up my 770 to a 780 through EVGA, or I can snag a 690 for only slightly more after I sell my 770. I am just wondering if anyone has had issues with the 690 since it is a dual-GPU card; I always hear people saying to avoid dual-GPU cards because of driver issues and weaker support in some games. Would I be better off getting a 780, or maybe even paying a couple hundred more for a 780 Ti? Thanks!

PS - The 690 I have a chance at getting a good deal on is a new reference card someone received from Nvidia. I am slightly concerned about the noise from the reference cooler and am wondering how a warranty would work since it is a reference card directly from Nvidia (no retail box, documentation, etc). I will be placing this in a small Fractal Node case I use as a combo HTPC and gaming rig. With this card having the reference cooling I suppose that may be good since it won't blow all of the hot air back in the case but since it is a HTPC I am concerned about the card being noisy and want it to be as quiet as possible for movies. Any thoughts?


----------



## euphoria4949

Hi guys, I wanted to ask a couple of quick questions if I may.
I did request to join the 690 Club a while back by posting proof of ownership but I never got added =( Never mind, that's not what I wanted to ask.

OK, so my first question: can anyone tell me what metal the stock cooler is made from, specifically the contact patch that sits on the GPUs? I read that it's aluminium fins on a nickel-plated copper base; is that correct?

Second: with the standard cooler installed, what sort of overclocks have you managed without any BIOS mods? I'm a bit unhappy with mine, for overclocking that is. I thought I had a decent "stable" OC; it ran benchmark apps without a hitch, but games kept crashing. So I started dropping my OC to see if that would stop the crashing, and it did, but only when I dropped to +90 on the cores and +110 on the VRAM with the TDP at 135. From what I've read, that seems really low.

Anyway, I would appreciate your input guys.
Cheers


----------



## V3teran

I have my 690 overclocked to 1241MHz with a custom waterblock. I also game at 2560x1440 120Hz using an Overlord Tempest.
I have my monitor set at 100Hz, and in 99% of games I can hit 100fps, fluctuating as I play. The 2GB of VRAM only becomes a problem when maxing out some games with in-game AA: Hitman Absolution and BF4 with 4x or higher AA will take GPU memory above 2GB. With custom AA through Nvidia Inspector, like 4xMSAA+SGSSAA or higher, you can max out pretty much any game; even the original Max Payne with a custom AA flag can easily surpass 2GB, but that would be using extreme amounts of custom AA.

Overall, at 2560x1440 the 690 is a great card with in-game AA fully maxed out. In some games, like Hitman Absolution and BF4, you can drop the AA slightly to stop going over the 2GB VRAM limit. Also, at 1440p you don't need as much AA as you do at 1080p or 1200p. I wouldn't change my 690 for anything except maybe a 790; I would not even change it for a 780 Ti, as they only have 3GB of VRAM. May as well hang on to the 690 and wait for something more worthwhile, because performance-wise it is still at the top of the charts almost two years down the line. Quite impressive.


----------



## mac13

Quote:


> Originally Posted by *V3teran*
> 
> I have my 690 overclocked to 1241mhz with a custom waterblock. I also game at 2560*1440 120hz using and Overlord Tempest.
> I have my monitor set at 100hz and in 99% of games i can hit 100fps which goes up and down when playing. The 2gb of Vram will only become a problem when maxxing out some games using the ingame AA. Hitman absolution and bf4 when using 4x or higher AA will take the gpu memory above 2GB. If using custom AA though Nvidia Inspector like 4xmsaa+sgssaa or higher than you can max out pretty much anygame. Even the original Max Payne with custom AA flag can easily surpass 2GB of ram but that would be using extreme amounts of custom AA.
> 
> Overall at 2560*1440 the 690 is a great card using ingame AA fully maxxed out. There are some games were you can drop the AA slightly like Hitman Absolution and bf4 to stop going over the 2gb vram limit. Also when playing at 1440p you dont need as much AA as you do at 1080 or 1200p. I wouldnt change my 690 for anything except maybe a 790, i would not even change it for a 780ti as they only have 3gb of vram. May as well hang on to the 690 and wait for something more worthwhile because performance wise it is still top of the performance charts almost 2 years down the line, quite impressive.


Have you had any driver issues or lack of support for SLI for any games? I also wonder if you have noticed any microstutter. I got a new GTX 690 at really good price from a friend but am thinking about selling it and getting an overclocked 780 TI since you get about the same performance as a 690 and don't have to worry about SLI issues, microstutter, or lower VRAM.


----------



## V3teran

No problems whatsoever with stuttering or otherwise. The 690 has hardware-based frame metering technology, which actually gives it less chance of stutter than 2x 680s in SLI! The 690 is not like dual cards of the past, like the 590 and the 295; it's new tech, and NVIDIA really pulled out all the stops when designing it.

Here is evidence that the 690 is actually better than 2x 680s in SLI, and 680s don't get any stutter, so I'm told:
http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-690/8/


----------



## euphoria4949

Quote:


> Originally Posted by *mac13*
> 
> Have you had any driver issues or lack of support for SLI for any games? I also wonder if you have noticed any microstutter. I got a new GTX 690 at really good price from a friend but am thinking about selling it and getting an overclocked 780 TI since you get about the same performance as a 690 and don't have to worry about SLI issues, microstutter, or lower VRAM.


I have to say the same: I've never noticed any stuttering with the 690 installed. As for SLI support, Nvidia is pretty good at releasing driver updates for new games, and SLI is very common now, so it's pretty much a given that SLI support will be included in their drivers. In fact, the only game I have found in the last year that didn't have SLI support straight out of the box was DayZ Standalone, but that's an Early Access alpha, so no one expected driver updates for it. And with a couple of clicks you can enable SLI for that game just by changing one setting in Nvidia Inspector. Now DayZ SA utilises all GPUs =)

The 780 Ti will give you 50% extra VRAM, which certainly may help in the few games that can use it. I've only noticed 2, maybe 3 games in the last year that have used all the VRAM on my 690, though it was "allocating" all of it; whether it was actually being used is a different matter. I never got any huge frame-rate drops as a result of not having enough VRAM.
Plus the 780 Ti is a single card that gets within 10% of the performance of the 690, so you have headroom for adding a second at a later date.
But it depends a lot on what resolution and refresh rate you game at. If you game on a 1080p 60Hz monitor, I would say stick with the 690. If you game at a higher res, 1440p/1600p, or you have a 120/144Hz monitor and really want to play all games at 100+fps, then go for the 780 Ti and add a second after Christmas.

Hope this helps


----------



## V3teran

I would say if you already have a 690 there is no point in upgrading at the moment. If you're thinking of buying a 780 or 780 Ti, I would only buy one if I was going to go SLI. The only reason I don't go SLI with the 690 is because the 2GB of VRAM is slowly being maxed out. At the moment at 2560x1440 it's fine, but in maybe another 12 months it won't be. It also depends on the amount of in-game AA that you use and, more importantly, custom AA, which really kills the VRAM. Remember, though, at 1440p not as much AA is needed. Some people don't use AA at this resolution, although I feel a small amount is needed personally.
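Since VRAM keeps coming up in this thread, here's a rough back-of-envelope sketch (purely illustrative Python) of why MSAA multiplies framebuffer cost with resolution. The byte counts and buffer layout are assumptions for the sake of the arithmetic, not what any real driver allocates, and textures and geometry, which usually dominate total VRAM use, are ignored entirely.

```python
# Rough estimate of render-target VRAM vs. resolution and MSAA level.
# Assumed layout: multisampled color + multisampled depth/stencil,
# plus resolved swap-chain buffers. Real drivers allocate differently.

def render_target_mib(width, height, msaa=1, bytes_per_pixel=4, buffers=3):
    """Estimated render-target footprint in MiB.

    buffers counts resolved swap-chain color buffers (triple buffering = 3).
    """
    color = width * height * bytes_per_pixel * msaa       # multisampled color
    depth = width * height * 4 * msaa                     # 32-bit depth/stencil
    resolve = width * height * bytes_per_pixel * buffers  # resolved swap chain
    return (color + depth + resolve) / (1024 ** 2)

for res in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    for msaa in (1, 4, 8):
        mib = render_target_mib(*res, msaa=msaa)
        print(f"{res[0]}x{res[1]} {msaa}x MSAA: ~{mib:.0f} MiB")
```

Even this toy model shows render targets alone growing several-fold between no AA and 8xMSAA; pile game textures on top and the 690's 2GB per GPU gets tight quickly at 1440p and above.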


----------



## Fighting Games

Looking for a smooth driver for my Asus GTX 690 that doesn't introduce micro-stutter. I've heard Nvidia driver 331.40 is still the best. Downloading now.


----------



## RnRollie

Quote:


> Originally Posted by *V3teran*
> 
> I would say if you already have a 690 there is no point in upgrading at the moment. If your thinking of buying a 780 or 780ti, i would only buy if i was going to go sli. Only reason i do not go sli with 690 is because the 2gb of vram is slowly being maxxed out. At the moment at 2560 1440p its fine but maybe another 12 months it wont. It also depends on amount of in game aa that you use and more importantly custom aa which really kills the vram. Remember though at 1440p not as much aa is needed. Some people do not use aa at this resolution although i feel a small amount is needed personally.


Unless one is made out of money... then one can upgrade (pre-order) the Titan Z ....









although.. 3k for a gpu is not exactly bargain-bin money


----------



## nyk20z3

I've been offered a straight trade of a 690 for my 780 Lightning.

I game at 2560x1440, but I don't foresee any VRAM issues going from 3GB to 2GB.

Do you guys think this trade makes sense, or is it absolutely nuts considering the stout engineering behind the Lightning?

I'm currently running a Silverstone FT03 case, so within my space constraints a dual-GPU card does make sense.


----------



## V3teran

Quote:


> Originally Posted by *nyk20z3*
> 
> I've been offered a str8 trade of a 690 for my 780 Lighting.
> 
> I game at 2560X1440 but I don't foresee any VRAM issues going from 3GB to 2GB.
> 
> Do you guys think this trade makes sense or is it absolutely nuts considering the stout engendering behind the lighting ?
> 
> Currently running a Silverstone FT03 case so within my given space constraints a dual GPU card does make sense.


It all depends on whether you're going to add another 780 in SLI later on. A 780 with a custom water setup should easily outdo a 690. I'm sure the 780 can be volt-modded like the 690 can through MSI Afterburner.

So only trade if you're going to push it past the 690 and intend on putting another in SLI; otherwise you're selling yourself short.

A stock 690 beats a 780.
A watercooled and volt-modded 780 beats a watercooled and volt-modded 690 (I think). My 690 is at 1267MHz on the core; I think the 780 could beat this. No point going SLI with the 690 as the 2GB of VRAM will be hit in more games within the next 12 months. I could be wrong on that, however.

Quote:


> Originally Posted by *RnRollie*
> 
> Quote:
> 
> 
> 
> Originally Posted by *V3teran*
> 
> I would say if you already have a 690 there is no point in upgrading at the moment. If your thinking of buying a 780 or 780ti, i would only buy if i was going to go sli. Only reason i do not go sli with 690 is because the 2gb of vram is slowly being maxxed out. At the moment at 2560 1440p its fine but maybe another 12 months it wont. It also depends on amount of in game aa that you use and more importantly custom aa which really kills the vram. Remember though at 1440p not as much aa is needed. Some people do not use aa at this resolution although i feel a small amount is needed personally.
> 
> 
> 
> Unless one is made out of money... then one can upgrade (pre-order) the Titan Z ....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> although.. 3k for a gpu is not exactly bargain-bin money

For 3k they can keep their Titan Z.
For 3k I'd want something even more powerful than what the Titan Z is for that kind of money.

Quote:


> Originally Posted by *Fighting Games*
> 
> Looking for a smooth driver for my Asus gtx 690 that doesn't include micro stuttering. I've heard Nvidia driver 331.40 is still the best. Downloading now


Yes, I use 331.40; it's the smoothest driver I've used at 120Hz.


----------



## superx51

The Titan Z will fit perfectly in my X51 case. Finally a worthy upgrade to my 690.


----------



## PinzaC55

Quote:


> Originally Posted by *superx51*
> 
> The titan z will fit perfectly in my x51 case. Finally a worthy upgrade to my 690


Anybody spending 3K on a GPU would have a case built to fit it.


----------



## PCModderMike

Well, the 690 is starting to fade away into history like everything else does...but I'm still rocking mine for one more rebuild. Still got plenty of power for what I do.









https://flic.kr/p/13367886453

https://flic.kr/p/13367905423


----------



## Alex132

Quote:


> Originally Posted by *Fighting Games*
> 
> Looking for a smooth driver for my Asus gtx 690 that doesn't include micro stuttering. I've heard Nvidia driver 331.40 is still the best. Downloading now


Been running really well on 332.21, when I upgraded to the latest drivers switching between SLI on and SLI off had HUGE issues for me (often needing a remove/reinstall of the driver with driver sweeper etc).


----------



## PinzaC55

Just curious, I have been following this thread http://www.overclock.net/t/1472671/samsung-4k-1ms-monitor-754-it-has-begun/470 and I have 2 x GTX 690s. Does anyone have a similar configuration hooked up to a 4K monitor and is it good or bad?


----------



## Cylas

Nvidia GeForce *337.50* with *GTX 690* benchmarks

3DMark with an 8-core Intel Xeon running at 3.6GHz, versus the older Nvidia driver 331.65. IceStorm was buggy, with the GTX 690 running at idle clock speeds.

*Firestrike Result Comparison:*

R331.65 = 12116P
R337.50 = 12306P +1.5%

*CloudGate Result Comparison:*

R331.65 = 37667P
R337.50 = 37742P +0.19%

*IceStorm Result Comparison:*

R331.65 = 145075P
R337.50 = 139482P -3.8%


----------



## Alex132

Quote:


> Originally Posted by *Cylas*
> 
> Nvidia GeForce *337.50* with *690GTX* Benchmarks
> 
> 3DMark Benchmark with an Intel 8 Core Xeon Processor runs @ 3,6Ghz vs an older Nvidia Driver 331.65. IceStorm was buggy, the 690 GTX running at idle clock speeds.
> 
> *Firestrike Result Comparison:*
> 
> R331.65 = 12116P
> R337.50 = 12306P +1.5%
> 
> *CloudGate Result Comparison:*
> 
> R331.65 = 37667P
> R337.50 = 37742P +0.19%
> 
> *IceStorm Result Comparison:*
> 
> R331.65 = 145075P
> R337.50 = 139482P -3.8%


It's a nice little bump in performance


----------



## nickcnse

Hello everyone,

I'm looking for a little advice regarding the 690. I picked one up on Craigslist not too long ago and it's an amazing card! It even came with a water block, so now I'm putting it under water. I have the option to trade one of my spare R9 290s + $100 for a second GTX 690 (a total investment of $485) and I'm wondering if it would be worthwhile. Any advice would be appreciated.

Thanks everyone.


----------



## euphoria4949

Quote:


> Originally Posted by *nickcnse*
> 
> Hello everyone,
> 
> I'm looking for a little advice in regards to the 690. I picked up one on craigslist not too long ago and it's an amazing card! Even came with a water block so now I'm putting it under water. I have the option to trade one of my spare r9 290's + $100 for a second gtx 690 (a total investment of $485) and I'm wondering if it would be worthwhile. Any advice would be appreciated.
> 
> Thanks everyone.


It really depends on what resolution you game at. What is the rest of your setup?

If you game at 1080p it really wouldn't be worth it. At 1440p, maybe, but the 690's small 2GB of VRAM will hold things back, and above 1440p... hate to say it, but get a new card entirely.

Here is a link, a guy from Tom's Hardware did a "real world" (not just benchmarks) test of 2x 690s in quad SLI. The scaling is awful: maybe an extra 10% from the 3rd GPU, and between -5% and +2% gain or drop from the 4th GPU.


----------



## nickcnse

Quote:


> Originally Posted by *euphoria4949*
> 
> It would really depend on what resolution you game at. What is the rest of your setup???
> 
> If you game at 1080p it really wouldn't be worth it. 1440p maybe but the 690's small 2GB of VRAM will hold things backs, and above 1440p...... Hate to say it, but get a new card entirely.
> 
> Here is a link, a guy from Toms Hardware did a "Real World" not just benchmarks test of 2x 690's in Quad SLI. The scaling is awful, maybe an extra 10% increase from the 3rd GPU and between "minus" -5% and 2% gain or drop in performance from the 4th GPU.


I'm currently gaming on a single 1080p monitor @ 95Hz, but I want to upgrade to a 4K monitor within the next six months. The GTX 690 is currently in my new system with an i7 4770K and an MSI Z87 XPower with 16GB of RAM. From what you're saying, a second GTX 690 would just be adding heat to my system without improving performance. Maybe I'll just consider CrossFiring two R9 290s then, or if I can find someone to trade, acquire an R9 290X Lightning.

Thank you very much euphoria!


----------



## euphoria4949

Quote:


> Originally Posted by *nickcnse*
> 
> I'm currently gaming on a single 1080p monitor @ 95 hertz but I want to upgrade to a 4k monitor within the next six months. The gtx 690 is currently in my new system with an i7 4770k and MSI z87 Xpower with 16gb of ram. From what you're saying, the second gtx 690 would just be adding heat to my system and not improve performance. Maybe I'll just consider crossfiring two r9 290's then. Or if I can find someone to trade, maybe acquire an r9 290x lightning.
> 
> Thank you very much euphoria!


At 1080p the 690 can achieve a solid 60fps on ultra settings in 99.99% of games. I personally run mine with a 1080p 120Hz monitor, as I really feel the benefit from the extra frames in shooters. I debated whether to add a second 690 back when I bought mine, but after seeing reviews and seeing first-hand on my nephew's rig how little improvement it added, I very quickly dropped that idea.

Regarding 4K: even at 1080p I've found a few games that max out the 690's 2GB of VRAM. So at 4K... let's just say the VRAM will be a pretty substantial bottleneck.

And no probs, happy to help if I can.


----------



## grassy

What drivers are you using? I am gaming on a Dell 3011 at 2560x1600 with driver 335.21, and all I am getting is stutter. Especially playing Dead Space 3; it's almost impossible to play with the stutter. I am using a single GTX 690 and have the other in the wife's computer. What drivers are you using for stutter-free gaming?


----------



## RnRollie

Have you tried the 337.50 (beta) driver?

Also, isn't DS3 included in Nvidia's optimal settings experience thingie?


----------



## euphoria4949

Quote:


> Originally Posted by *grassy*
> 
> What drivers are you using as i am gaming on a dell 3011 with 2560x1600 res and i am using drivers 335.21 and all i am getting is stutter. Especially playing dead space 3 its almost impossible to keep up with the stutter. I am using a single gtx690 and have the other in the wifes computer. What drivers are you using for stutter free gaming.


I haven't personally noticed any stuttering with mine, but I remember a few members on here posted that they had problems with it and they managed to resolve the issue by rolling back to old drivers, I think it may have been 332.21. Also, download Nvidia Inspector, customising the DS3 profile may help.


----------



## provost

I have tried some games with two 690s, and with the new driver I do get better performance. How much better I could not tell you, as I don't game much. I did notice a degradation of IQ at 1080p with the new drivers; at 1440p it's not noticeable to me. I added a second 690 simply because I had it, and it does add a lot to performance in terms of additional fps. No more stuttering issues than my multi-Titan build, which is unfortunately still undergoing a never-ending rebuild. I haven't even cracked open my KPE yet... maybe I will just keep it in the box and make it a relic... lol


----------



## Wiz766

I know this isn't the normal way, but I have no camera at the moment.
EVGA


----------



## Wiz766

What is the GTX 690 sig code?


----------



## grassy

Quote:


> Originally Posted by *euphoria4949*
> 
> I haven't personally noticed any stuttering with mine, but I remember a few members on here posted that they had problems with it and they managed to resolve the issue by rolling back to old drivers, I think it may have been 332.21. Also, download Nvidia Inspector, customising the DS3 profile may help.


Yeah, thanks for that. Nvidia Inspector didn't do much for me. After 2 days of troubleshooting I have it sorted: I got rid of my drivers completely, with and without Driver Sweeper, and made sure I had no trace of drivers whatsoever. I also got rid of a folder called "explorer.exe" which was located in my Origin folder where my Dead Space game is, then reinstalled the latest drivers and all is good. No game stutter at all now.


----------



## sausage boy

Count me in new build new card...


----------



## askala2

How do I flash the 690's BIOS?
I tried nvflash -i1 -4 -5 -6 3.rom, but I cannot flash it... I just get an error.


----------



## krug

Quote:


> Originally Posted by *askala2*
> 
> how change 690 bios?
> nvflash -i1 -4 -5 -6 3.rom?ok?
> but i cannot change bios.....error


Run nvflash --protectoff first.
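For anyone following along, a typical flash sequence on a dual-GPU card looks roughly like this. Treat it as a sketch only: flag behaviour varies between nvflash versions (check nvflash's own help), the adapter index depends on your system, and a failed flash can brick the card, so back up both BIOSes first.

```shell
nvflash --list                      # show the adapter index of each GPU
nvflash -i0 --save gpu0_backup.rom  # back up GPU 0's BIOS
nvflash -i1 --save gpu1_backup.rom  # back up GPU 1's BIOS
nvflash --protectoff                # disable the EEPROM write protection
nvflash -i1 -4 -5 -6 3.rom          # flash GPU 1 (-4 -5 -6 override ID-mismatch checks)
```

If it still errors out under Windows, some people have better luck booting to DOS from a USB stick and using the DOS build of nvflash.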


----------



## lascar

BIOS modified via KGB (300W power limit x2; 175W by default)

GTX 690 running stock: 1071MHz base, 1280MHz max boost / 1267MHz average

Maximum with the 1.31V mod:










Toasty VRMs, but rock stable with Heaven 4.0 maxed out.


----------



## workthis

add me please


----------



## Arizonian

Quote:


> Originally Posted by *workthis*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> add me please


Nice - congrats -









jcde7ago stopped updating the member list after Maximilium on 3/14/13. However, you will find a good group of 690 owners whose GPUs are still running strong here, so welcome.









Mine is in my second rig and runs games like a champ still.


----------



## Stablerage

Sadly my 690 had to be RMA'd. I loved the card. I am just waiting for the refund so I can look at a near-free upgrade; I will probably go for two 780 Classifieds. The 690 was my first high-end card, but not the last. Hope the rest of you guys with 690s have them for a long time still.


----------



## s74r1

Quote:


> Originally Posted by *Stablerage*
> 
> Sadly my 690 had to be RMA'd. I loved the card. I am just waiting for the refund so I can look at a near free upgrade. I will probably go for two 780 classifieds. the 690 was my first high-end card but not the last. Hope the rest of you guys with 690's have them for a long time still.


How long did you own it for? And what company offered a refund instead of a replacement?


----------



## Stablerage

About 13 months and it was Dabs.com. I could have had a replacement or refund.


----------



## euphoria4949

Quote:


> Originally Posted by *s74r1*
> 
> how long did you own it for? and what company offered a refund instead of replacement?


By any chance did they offer you another 690 as a replacement, or was it an "equivalent" to the 690? I ask because I also got mine from Dabs, and my 690 has been acting up recently. So I enquired about an RMA, but Dabs said they could only offer a refund, which is a problem for me as I paid via PayPal through an account I shared with my now ex-girlfriend, which she has complete control over, so if they refund me, SHE gets my money! And they wouldn't refund me any other way, so I'm kinda stuck with what to do.

Cheers


----------



## Stablerage

Quote:


> Originally Posted by *euphoria4949*
> 
> By any chance did they offer you another 690 as replacement or was it an "Equivalent" to the 690??? I ask as I also got mine from Dabs, and my 690 has been acting up recently. So I enquired about an RMA, but Dabs said they could only offer a refund, which is a problem for me as I paid via PayPal, an account I shared with my now Ex-girlfriend that she now has complete control over, so if they refund me, SHE gets my money!!! And they wouldn't refund me any other way, so I'm kinda suck with what to do.
> 
> Cheers


All I can say is they at first gave me the option of a replacement or a refund. I asked for the refund, so sorry, I never got far enough to see what they would replace it with.


----------



## Stablerage

But I was led to believe it would be another 690.


----------



## euphoria4949

Quote:


> Originally Posted by *Stablerage*
> 
> But I was led to believe it would be another 690.


Ok cool, thanks man.


----------



## Stablerage

My pleasure and good luck.


----------



## V3teran

Quote:


> Originally Posted by *lascar*
> 
> bios modified via KGB ( 300W power limit x2 - 175W by default)
> 
> gtx 690 running stock 1071 mhz - 1280 mhz boost max / 1267 mhz avg
> 
> Maximum with 1.31V mod :
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> toasted vrm's & Heaven 4.0 maxed out rock stable


I managed to get mine to 1.3V; how did you manage to get to 1.31V?


----------



## VonDutch

Hello all,

I have a GTX 690









what an amazing card. I play Battlefield 3 and 4 etc, and with the new drivers it runs fast, like 140fps on average with everything maxed out,
but... this won't surprise anyone I guess, it runs hot, about 80-85C! (spiked to almost 90C in BF4)

Using EVGA Precision, with only the power target adjusted to 135%, I could easily OC it to almost 1200MHz, and I'm sure I can go higher,
but temps were getting too high then. The temps I talk about now are with only the power target at 135%.

My question is: is there a way to change the thermal paste without voiding the warranty?

I do have A+, that's for hardware, and I am an MCP (Microsoft Certified Professional) with some other certificates.
I was wondering if that's enough to go ahead and take the cooler off,
since most of the time they say it has to be done by "official" repair people.

My story of how I got it:
I bought it for 400 euro (547 US dollars) from someone who had bad luck with it. He sent it in for RMA twice, and both times it came back broken,
then he sold it to me broken. All I had to do was send it in for RMA again, which I did. I had to wait 3-4 weeks, and guess what,
they sent me a brand new one back, so I've had it for about a week now









but I'm a temp freak; that's why I have a delidded 3770K, which never runs hotter than 40C now,
running Prime. About... let me show you:

at 4.8GHz, 63C on the hottest core after 24h of Prime on air, push-pull Scythe Mugen 3 rev. B (I have a Cooler Master Seidon 240M now).

Anyway, the temps of this GTX 690 beast are way too hot for me, hence my question:
what can I do?
I still have about 14 months of warranty left. I want to change the thermal paste to Liquid Pro, the same as I use on my processor;
I'm sure temps will drop a lot, maybe even 15-20C... well, long story.


----------



## shaneduce

I have 2x of them.


----------



## V3teran

I found out what the throttling was caused by: basically it happens when you're asking too much from your GPU with the settings for each game. When I ran BC2 with 4xMSAA+4xSGSSAA (using Nvidia Inspector), GPU 1 was at 1306MHz and 70% load, while GPU 2 was at 950MHz and 97% load.

Hmm, I thought, this doesn't make sense; it should be the other way around.
I disabled 4xMSAA+4xSGSSAA and enabled the in-game AA+AF+AO+High etc.

Instantly both GPUs are now maxed out when I play: GPU 1 at 1306MHz and GPU 2 at 1293MHz. So in a nutshell I found that increasing the AA also has a massive hit on the core, and not only the VRAM, which would explain why GPU 2 was at 950MHz and 97% load.

Anyhow, here are a couple of videos I made today and converted to MP4, showing these clocks in a bit of...

BC2





and

Wolfenstein New order.





Quality on these videos is MP4 so not the best, but you can see the clocks, which is the whole point. I've played 3 rounds of BC2 without one crash, but the card won't go any higher without crashing.

I know this card has more to give; even at 1.31V I could see a substantial increase to maybe around 1350MHz. I reckon the VRMs can handle it as they're well cooled and my temps hardly go over 45 degrees maximum.

Thing is, I do not know if the rest of the circuitry can cope, but I'm willing to try for around 1350MHz; 1.31V or 1.32V should be more than enough.


----------



## VonDutch

So, after some talking to the people where the GTX 690 was bought, there's no way I can change the thermal paste myself.
I am not allowed to do it, and even the store it was bought from can't do it; it has to be an official dealer or something.
Long story short, only Asus can do it where I live... pfft.

So... I'm voiding the warranty, on a new card back from RMA with over 1 year of warranty left.
Yesterday I ordered an ARCTIC Accelero Twin Turbo 690 cooler.

I hope to get the temps under control with this cooler. Is anyone here using one?

Let me play some BF3 and show you how hot it gets after a while playing.

90C after playing 5 minutes or so, with just a minor OC. Not that the OC matters much in temp difference, though;
maybe 3-4C tops with fan speed at 85%.

I took out a 14cm front fan and put it on the case floor under the card, blowing cool air towards it.
It doesn't seem to help a lot, a few degrees, but that's still a lot when temps are that high... lol.
An open case doesn't help much either. I've tried everything to make it cooler;
it seems to me I have bad luck with this card's temps?


1215MHz is easy enough... played 30-45 minutes of BF3 with it just now, np


----------



## shaneduce

Quote:


> Originally Posted by *VonDutch*
> 
> So, after some talking to the people where the gtx 690 was bought, there's no way I can change cooling pasta,
> I am not allowed to do it myself, even the store its bought cant do it, has to be a official dealer or something,
> long story short, only Asus can do it where I live ...pfft..
> 
> So...im voiding warranty, its a new card back from RMA with over 1 year warranty left,
> yesterday I ordered this one, a ARCTIC Accelero Twin Turbo 690 Cooler..
> 
> I hope to get the temps under control with this cooler, anyone here using one ?
> 
> let me play some bf3, and show you how hot it gets after a while playing..
> 
> 90C after playing 5 minutes orso..with just a minor OC, not that it matters much in temp difference tho,
> maybe 3-4C tops with fan speed at 85%..
> 
> I took out a 14cm frontfan, and put it on the case floor, under the card blowing cool air toward it,
> doesn't seem to help a lot, few degrees, but that's still a lot if temps are that high ..lol
> case open, doesn't help a lot, I tried everything to make it cooler,
> seems to me I have bad luck with this cards temps?
> 
> 
> 1215mhz is easy enough..played 30-45 minutes bf3 with it just now..np


That's a load of bull, man. You can put a different cooler on; if I were not able to, I would not have a custom water loop in my system. As long as you return it to its original look, it will be fine; tell no one at ASUS you changed it. I know no one that has the original cooler on their video card or CPU.


----------



## VonDutch

Quote:


> Originally Posted by *shaneduce*
> 
> That a load of bull man. You can put a different cooler on, if I was not able to I would not have a custom water loop in my system. So long as you return it to original look, it will be fine, tell no one at ASUS you changed it. I know no one that has the original cooler on there video card or CPU.


Thanks for your answer







, I will be very careful when I take it apart.
I think the heat will do more damage over time than the OC I'm planning
once I have the new cooler.


----------



## shaneduce

Quote:


> Originally Posted by *VonDutch*
> 
> Thanks for your answer
> 
> 
> 
> 
> 
> 
> 
> , i will be very careful when I take it apart,
> I think the heat will do more damage over time then the OC im planning,
> when I have the new cooler..


There will be thermal pads on the memory and voltage regulators that you will want to replace, and the GPUs and the SLI chip will need new thermal compound too.
After finding out how much easier, cooler, and quieter water cooling is, I never do a build without it.


----------



## virgis21

I've just figured out an interesting thing about drivers.
The latest driver ran BF4 at only about 40-60fps on Ultra. I started thinking why, since I remembered over 100fps being my minimum. So I went to the GeForce driver downloads, looked through the history and the "what's new" notes, and found the driver that BF4 recommends as a minimum. It is now my best-performing and issue-free driver with my 690: 331.82.
Now I am back to ~140fps on all Ultra (1080p).

Virgis

P.S. I think this is why Nvidia "slows" games with new driver updates: so we will run out and buy a new card!


----------



## grunion

You know, I haven't tried BF4 on my 690 yet. Think I'll toss it in and give it a whirl.


----------



## shaneduce

I found out the hard way: unless the driver FIXES a problem you are having, DON'T INSTALL NEW ONES! ! ! ! !
The only other reason would be new drivers that come out just before a new game (now I try the new game for a bit before installing the new driver).

The only problem I have with BF4 is VRAM. With 2GB per GPU and running Surround on 3 monitors, I hit the limit or get close, so I turn down a few settings and there I go.


----------



## grunion

Quote:


> Originally Posted by *virgis21*
> 
> I've just figured out interesting thing about drivers.
> The latest driver run BF4 only about 40-60fps on Ultra. Start thinking why? Remember over 100FPS was minimum..So then I went to GeForce drivers download and looked up the history "what's new" and found the one that BF recomends (minimum). And now it is my best performance and issue free driver with my 690 - 331.82
> Now I am back to ~140fps on all ultra (1080p).
> 
> Virgis
> 
> P.S. I think this is why Nvidia "slows" games with new drivers update, because we will run to buy new card!


I just played a couple of rounds on the .23 WHQL drivers, Ultra with no AA at 1440p; triple-digit average FPS for the most part.

Do we have an unlocked BIOS that we can use?
Power maxes at 135%, fan at 85%.

It really is hard to believe how long in the tooth the 690 has become.
A single 780 Ti or 290X both give clearly better gameplay.
I still love the 690, one of my most favourite cards ever, but it is showing its age.

I'm going to start adding AA to see how much it can handle at 1440p.


----------



## VonDutch

Quote:


> Originally Posted by *shaneduce*
> 
> There will be thermal pads on the memory and voltage regular that you will want to replace and also the gpu's and the sli chip you will want to apply new thermal compound too.
> After funding out how easy cooler and quite water cooling is i never do a build that is not.


Wish I could, but going full water is also expensive, for me at least








Yeah, I was planning on using Liquid Pro on the GPUs; I had great temp drops (25+C) on my processor with it,
but I'm not sure if I can use it elsewhere, like on the SLI chip etc.

I've been using the 337.50 drivers for a few days now. BF3 and 4 really did get a nice fps bump, about 20fps on average,
and I haven't seen any problems running it yet, except maybe one thing: I have a 144Hz monitor, and if I run it at 144Hz I get red dots and/or lines on my screen,
in game but also in Windows. I have no problem running 120Hz. Anyone got a clue what this is?


----------



## grunion

What are the differences between the 3 skynet BIOS versions in the OP?


----------



## virgis21

Quote:


> Originally Posted by *shaneduce*
> 
> I found out the hard way unless the driver FIXES any problems you are having DONT INSTALL NEW ONES! ! ! ! !
> The only other reason would be new drivers that come out just before a new game (Now I try the new game for a bit before installing the new driver.)
> 
> 
> 
> 
> 
> 
> 
> 
> The only problem I have with BF4 is Vram, with 2GB per a GPU I'm running Surround on 3 monitors, I hit it or get close so I turn down what a few setting's and there I go


YES, Surround kills that 2GB of VRAM for sure!


----------



## RnRollie

A word of warning about the Accelero kit: the bolts/screws that come with it are really, really cheap. If you are one of those guys with the tendency to give any bolt that extra quarter turn, you are going to snap a few of them. As soon as you feel resistance on the bolts, proceed with care. And don't think any Phillips or Pozi head will do; these bolts are incredibly flimsy.

The alternative is to replace them with a handful of good ones from any decent hardware/DIY store.

And... get a really good screwdriver tool kit, including Torx T5 & T6









http://resources.tannerbolt.com/articles/what-type-of-screw-is-this/



----------



## VonDutch

Quote:


> Originally Posted by *RnRollie*
> 
> a word of warning of the Accelero kit... the bolts/screws that come with it are really really cheap... if you are one of those guys with the tendency to give any bolt you tighten that extra quarter turn, you are going to snap a few of them. As soon as you have resistance on the bolts, proceed with care. And dont think any Philips or Pozi head will do, these bolts are incredibly flimsy.
> 
> The alternative is to replace them with an handful of good ones from any decent hardware/diy store
> 
> And.. get a really good screwdriver tool kit including a torx T5 & T6
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://resources.tannerbolt.com/articles/what-type-of-screw-is-this/
> 
> .


Thanks for the tips RnRollie, I will keep that in mind;
are you using an Accelero Twin Turbo 690 yourself?
I still need to get Torx screwdrivers for it, but it looks like I have time.

Still no word from Arctic, though. I ordered last Friday;
it still says "Processing". I will get an email with track and trace when they send it.
I hate waiting... lol

Battlefield 3 uses about 1.6GB of VRAM, BF4 about 1.9GB.
What will happen when it hits 2GB? Anyone know or experienced it?


----------



## RnRollie

Yeah, I've got one.
It works rather well, and would probably work even better if I hadn't snapped two bolts








Still need to pick up some replacements from the local DIY store, then take the whole thing apart again and use decent bolts this time around.









The Torx bit is only needed depending on your brand of GPU.
ASUS really likes those Torx bolts, so you'll need a T5 or T6 bit.

When installing, take your time and study the instructions first. I've seen a lot worse instructions, but I've seen way better too.








Pay attention to the thermal pads; it's not always clear which one goes where. Have enough workspace to sort out all the bits and bobs that come with the kit.

All that said, I still think it doesn't look as good as the original housing, though... I keep saying the 690 was the most beautiful card NVIDIA ever released.

As for BF4 hitting the VRAM limits... I don't know; I don't play BF4 or similar anymore. I avoid the competitive pitfalls, the addiction of online gaming... a decade ago, for about a year, I was in America's Army almost 24/7. Man, did I own that bridge.







But it became the main focus of my life, and that came with a cost. Many are called, but few are chosen; how many people do you know who make a decent living out of competitive gaming? So, anyway...

I guess when the VRAM limit is reached anything can happen: slowdowns, tearing, crashing, swapping/reading from disk... all depending on the game/engine. AFAIK at this point we're fine; however, if you're considering a 4K monitor, the 690's 2x2GB of VRAM might no longer be enough. Although I expect there will be no 4K, 60fps, sub-5ms monitors available before the end of the year, so... pffff
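Back-of-the-envelope, the render targets themselves are only a small slice of the 2GB-per-GPU limit; a rough sketch (the buffer count and pixel format below are illustrative assumptions, not measured from any game):

```python
# Rough VRAM estimate for the color buffers only; textures, geometry
# and driver overhead are game-specific and dominate in practice.

def framebuffer_bytes(width, height, bytes_per_pixel=4, buffers=3):
    """Memory for the swap chain alone (RGBA8, triple buffered)."""
    return width * height * bytes_per_pixel * buffers

# 1080p vs 4K, in MiB
print(framebuffer_bytes(1920, 1080) / 2**20)  # ~23.7 MiB
print(framebuffer_bytes(3840, 2160) / 2**20)  # ~94.9 MiB
```

So a jump to 4K roughly quadruples the buffer cost, but it's the texture budget that actually pushes a 2GB card over the edge.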

Although Arioh had other things to say as well, he does mention why he goes for Titans over at Tom's: http://www.tomshardware.co.uk/forum/id-1756990/gtx-780-titan-card-300-difference.html .... Just an example of why, as long as you're not going to 4K screens or 4K texture packs, the 690 will remain in the top 10.


----------



## shaneduce

In BF4 the game just freezes and then you're dead.


----------



## V3teran

Is it possible to take the 690 above 1.3v?


----------



## shaneduce

Yes, with a modified BIOS. It's not wise to do, though, as you can melt the card.


----------



## grunion

Quote:


> Originally Posted by *grunion*
> 
> What are the differences in the 3 skyn3t BIOS versions in the OP?


Anyone?


----------



## diego2k3

Anyone know where I could get an EK backplate for my card? I made the mistake of not buying one along with my block last year, and I'm having no luck so far...


----------



## shaneduce

I use a Heatkiller one myself.
Here is the one I have:
https://sage-shop.com/epages/WatercooleK.sf/seccc8e2eff71/?ObjectPath=/Shops/WatercooleK/Products/16005

Here is what I've found so far for EK:
http://www7.ncix.com/products/?sku=80037

I wish I could find an EK-FC690 GTX Plexi+Nickel somewhere that will ship to the USA.


----------



## diego2k3

Quote:


> Originally Posted by *shaneduce*
> 
> I use a Heatkiller one myself.
> Here is the one I have:
> https://sage-shop.com/epages/WatercooleK.sf/seccc8e2eff71/?ObjectPath=/Shops/WatercooleK/Products/16005
> 
> Here is what I've found so far for EK:
> http://www7.ncix.com/products/?sku=80037
> 
> I wish I could find an EK-FC690 GTX Plexi+Nickel somewhere that will ship to the USA.


Nice! I saw those too, but I was concerned that it wouldn't play nice with the EK block; seems like it might be worth a shot, though. I've found a couple of places that claim to still carry EK stuff for the 690, but you're definitely right, it seems tough to find somewhere that will ship to the US.

Here are the links I came up with:
http://ministryofpc.com/index.php/liquidcooling#/GPU Backplate
http://www.computerlounge.co.nz/components/componentview.asp?partid=17095


----------



## diego2k3

After speaking to EK's support department, I've learnt that they will be restocking their 690 backplate in the official shop soon, most likely today. If any of you are interested, you can place an order at:

http://www.ekwb.com/shop/ek-fc690-gtx-backplate-black.html


----------



## shaneduce

Too bad they are not restocking their blocks for the 690s.


----------



## Jabba1977

Hello, good day...

I bought a GTX 690 (second hand) for my girlfriend's rig.

I have some questions about this card:

- What is the best/most stable/best-performing driver?
- How do I disable boost? Should I put a modified BIOS on both GPUs for this?
- What about updating the PLX firmware and the BIOSes for UEFI?

Please, I need help with this card...

Thanks!!!


----------



## Alex132

Welcome to the club, Jabba1977.

Most stable driver? I'm having no issues at all with the latest one.

You can disable boost through a modified BIOS, but I don't see the point.

There's no need to update the PLX chip's firmware on this card as far as I know; I've never heard of a card needing such firmware updates.


----------



## Jabba1977

Thanks. The Skynet RevA/RevB/RevC modified BIOSes don't seem to match the target... I think they are only for one GPU...

I put a UEFI BIOS on the card and it's OK with Windows 8.1; this way I can enable the Windows 8 features...

I installed the latest drivers, everything seems OK, and the GTX 690 performs well...

*But I don't know how I can disable Boost... the voltage maxes out at 1.175V... (not a problem...)*

This card is great!!! The performance is above an OC'd 780 Ti/TITAN, and I haven't noticed any microstuttering...

Only in the Tomb Raider benchmark do I notice flickering on the hair... why?

Thanks, best regards.
Thanks, best regards.


----------



## V3teran

Have a look at this guide I made over at Guru3D; this is how I achieved 1306MHz on my GTX 690.
http://forums.guru3d.com/showthread.php?t=362667
The flickering on Tomb Raider's hair is probably the driver.


----------



## V3teran

Quote:


> Originally Posted by *shaneduce*
> 
> Yes with a modified bios. Its not wize to do tho as you can melt the card.


I do not think it's possible to go over 1.3V on the 690. Can you show me somebody who has, or a working BIOS that goes over 1.3V?


----------



## itlvk

I bought a GTX 690 on eBay.
I've tested it; this card has only 1 working GPU, and only DVI-1 can be used.

GPU-Z:









Control panel:









Device manager



























Please help me!


----------



## shaneduce

Looks like GPU#1 is dead.
Contact the seller.


----------



## nickcnse

How much did you pay for it?


----------



## itlvk

Quote:


> Originally Posted by *shaneduce*
> 
> Looks like GPU#1 is dead.
> Contact the seller.


I think so too. I contacted the seller, but they don't reply.
Quote:


> Originally Posted by *nickcnse*
> 
> How much did you pay for it?


I paid $490.

http://www.ebay.com/itm/NVIDIA-GTX690-4GB-GDDR5-AMAZING-DEAL-/261469228744?ssPageName=ADME%3AX%3AAAQ%3AUS%3A1123&ViewItem=&item=261469228744&nma=true&si=dKKhsB%252FjaVh8V0e6Ln%252B2ZAyb%252B6w%253D&orig_cvip=true&rt=nc&_trksid=p2047675.l2557


----------



## nickcnse

You should be able to get your money back and return the card. If you get to keep the card as-is (and it's not up for RMA), let me know. I have one with the same problem, and I'm replacing one of the GPUs within the card. I'll keep you informed on how that goes. The new chip should get here in a week and a half, and then I get to try out all of my new tools on it.


----------



## V3teran

I managed to get the 690 above 1.3V; I can run it up to 1.6V using a .bat file and a string of text.
I now have my 690 running at 1.35V.


----------



## shaneduce

Quote:


> Originally Posted by *itlvk*
> 
> I think so too. I contacted the seller, but they don't reply.
> I paid $490.
> 
> http://www.ebay.com/itm/NVIDIA-GTX690-4GB-GDDR5-AMAZING-DEAL-/261469228744?ssPageName=ADME%3AX%3AAAQ%3AUS%3A1123&ViewItem=&item=261469228744&nma=true&si=dKKhsB%252FjaVh8V0e6Ln%252B2ZAyb%252B6w%253D&orig_cvip=true&rt=nc&_trksid=p2047675.l2557


File a complaint with PayPal and eBay, then guest-RMA the card.


----------



## V3teran

*GTX 690 Overclock 1359mhz -Payday 2*


----------



## dynn

Hi there everyone. Two years ago I was asking about what to get for my new rig.

I got it like I said two years ago, but I'm really a noob at this, and I don't have any idea how to set it up for the best performance.

Here is the problem; I hope some of you can help me figure out how to configure it.

I only play 2 games, on a single monitor:
1. Saint Seiya Online (MMORPG)
2. StarCraft 2

The problem is that Saint Seiya feels slow. Of course I play with ultra graphics; at the beginning the game runs very well, but after a few hours it feels very slow, like I'm getting bad performance from my GTX 690.

Unfortunately, in my city there is nobody with much knowledge about computers, so no one can help me out. The guy who built it wasn't sure how to set up the SSD, and he also had a hard time trying to overclock my processor; for now (I'm not sure at all) I don't know if my computer is running at 4.4GHz.

I don't know if someone can help me configure my GTX 690 for the best performance in that MMO game. If you need my computer's specifications, just ask and tell me how to provide them.

Thanks guys.


----------



## V3teran

GTX 690 1.3v - 1.6v on the fly.


----------



## Barefooter

I recently purchased a used pair of GTX 690s. I just bought some XSPC Razor waterblocks for them.

Does anyone know if the HEATKILLER GPU Backplates like these http://shop.watercool.de/HEATKILLER-GPU-Backplate-GTX-690/en will work with the XSPC water blocks?


----------



## shaneduce

Quote:


> Originally Posted by *Barefooter*
> 
> I recently purchased a used pair of GTX 690s. I just bought some XSPC Razor waterblocks for them.
> 
> Does anyone know if the HEATKILLER GPU Backplates like these http://shop.watercool.de/HEATKILLER-GPU-Backplate-GTX-690/en will work with the XSPC water blocks?


Yes they do.


----------



## V3teran

GTX 690 clocked at 1372 and 1359MHz, no throttling in ARMA 3. Been playing for about 3 hours with no problems.
Two short videos merged into one.
Will try for 1398! Should be possible with a slight nudge in voltage to around 1.44-1.45V.
At 1.48V I get a serious throttling issue on GPU1, which tells me the circuitry is hitting its maximum threshold.

There is a reason this card cost me $1400 (UK price): it is made of the finest VRMs and circuitry, given how far I'm able to push it.

I do not know anyone on any forum who has a 690 anywhere near this speed with a custom water setup.


----------



## provost

Yeah, I was able to unlock the voltage a little while back using AB and Zawarudo's tool (Rbby258's is also very similar). I remember both Zawarudo (RIP) and Rbby were working on it concurrently, albeit with slightly different approaches. This all started with Unwinder providing his "unlockable" AB 15 at the time, if I recall correctly. I have used the skyn3t/OccamRazor BIOS for Titans along with Zawarudo's/Rbby's volt unlock.
However, when I did use the volt unlock on the 690, I experienced massive throttling. Part of the throttling was because I didn't have the 690s under water (too lazy, even though I have the blocks, but this is my temporary rig at the moment, or so I keep telling myself while I finish the rebuild of my primary rig), but primarily because no one has been able to come up with a working 690 BIOS that won't throttle due to power, temperature, etc., even under water. The complexity of the dual chip along with the PLX bridge might be why the 690's BIOS has proven hard to crack. It's not simply a matter of using KBT to modify the BIOS. Many have tried in the past, and if you look at the history of this thread you will see it. I do know these wonder BIOSes exist, judging by some HOF 3DMark scores, but no one has ever made them public... lol
My guess is that you are getting the benefit of the volt unlock (and a boost disable through the .bat file by locking the P2 state?), and not necessarily a BIOS hack, with 690s at 1350ish or so. I recall I was able to push my 690s to 1250-1280 (or even more) when I was benching under water. The only reference points I have are synthetic benchmarks from HOF, HWBot, etc. This is why I was wondering if you could run the Futuremark, Heaven and Valley benches, so the scores can be compared across a common data set and we can determine whether there is any throttling, notwithstanding the volt unlock.


----------



## V3teran

I've actually read almost all of this thread. I was one of the first people to get a 690 in the UK and I'm 7th or 8th on the list at the front; I actually had two 690s at release, so I'm well aware of what people have tried regarding overclocking.

The most I ever managed with 1.325V was 1267MHz stable; I could not get it to go any higher, as when gaming all I would get was a blank screen.
I had to go past this voltage to get a nice 1372MHz at 1.406V, and temps hit around 55 degrees with both cores pushing 1372/1354MHz.

I will be pushing for higher when I get the time; I have all my hex codes ready for voltage adjustment all the way up to 1.6V, and I have found the hex codes that step the voltage 500mv at a time. However, I found that my card becomes unstable at 1.48V; it hard-crashes, but anywhere around 1.45V is fine, and I've gamed for many, many hours at that voltage just to see what happens.

Even though the card is well cooled, I do not think the rest of the circuitry can cope at 1.48V, so this is my maximum threshold. I will try for 1398MHz and leave it there for 24/7 use; whether I achieve that or stay at 1372MHz depends on how many volts the extra 26MHz takes. TBH I'm quite happy at 1372MHz, but I know there is a little more I could squeeze out of it.

You say many have tried to modify their BIOS, and by the sound of it they failed. Even skyn3t didn't succeed in going past 1.3V on the 690 because of the buck controller. Well, I have succeeded, and the .bat files with the specific lines of text and hex that I have are very different from the ones I've seen on here, hence their failure.

You need to read the NCP4206/8 spec sheet; most people either didn't know about it or didn't understand it.
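For anyone wondering what the "hex codes" boil down to: voltage controllers like the NCP4206 map an output voltage onto a small integer (VID) code, and the .bat trick amounts to computing and writing that code. A minimal sketch of the mapping; the base voltage and step size below are placeholders, not datasheet values, so check the NCP4206 spec sheet before writing anything to real hardware:

```python
# Sketch of turning a target voltage into an 8-bit VID code.
# VID_BASE and VID_STEP are *placeholder* assumptions: take the real
# values from the NCP4206 datasheet, not from here.

VID_BASE = 0.30000   # hypothetical voltage at VID code 0, in volts
VID_STEP = 0.00625   # hypothetical volts per VID step

def voltage_to_vid(volts):
    code = round((volts - VID_BASE) / VID_STEP)
    if not 0 <= code <= 0xFF:
        raise ValueError("voltage out of range for an 8-bit VID")
    return code

# Under these placeholder constants, 1.30 V maps to code 160 (0xA0).
print(hex(voltage_to_vid(1.30)))
```

The real controller may also use an offset register or PMBus-style encoding, so treat this purely as an illustration of why the hex values step in fixed increments.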

I will get some benchies up when I get a few minutes...

Do you have unofficial overclocking enabled or disabled?

3DMark 11 result with the CPU at stock 3.4GHz:
http://www.3dmark.com/3dm11/8613881


----------



## provost

Well, this is interesting. I saw the video and noticed that your clocks are jumping around, so presumably you don't have boost disabled. The only person, other than the in-house talent (skyn3t/Occam), I know of who can modify/unlock NVIDIA and Intel BIOSes is Slv7. I know of him because I also have a gaming laptop, and that is where he spends most of his time.
In any event, kudos to you if you figured it out!
Post up the BIOS/.bat files in the OP for the other members of this club.








Not sure how many members are still around from the list, but it may help some closet 690 owners...


----------



## V3teran

Which video did you view? Have you seen the ARMA 3 vid? I don't get you; which video do you refer to? I did alter the BIOS to allow a slightly higher boost than stock.


----------



## provost

It's the video you linked, bubba.








My whole point was that boost is enabled, not disabled... lol
I guess you don't have a GK110 card? Most of the modded BIOSes (by skyn3t/Occam) have boost disabled. People prefer it this way since they can lock the clocks and don't see the kind of dips/throttling you might with a boost-enabled BIOS. The initial engineering BIOS (from HWBot, I believe) had boost enabled for the Titan, and so do the 780 KPE BIOSes from the Kingpin site, because Precision can be used to lock memory and core boost on that unlocked card.


----------



## V3teran

The ARMA 3 video has no jumping; I've made some alterations since the PD2 video, and all games are maxed out now on both GPUs.
For the 3rd time I ask: have you enabled *unofficial overclocking*?

You don't need to disable boost; there is more than one way to skin a cat!
No jumping clocks here or in any other game I play. The PD2 was an early video, but I've fixed the jumping clocks since then with a few tweaks.
Check out this vid; I can upload more! It makes no difference on my card whether boost is enabled or disabled.

Saying that, I think I will go make a BIOS with boost disabled, as I would like to see the difference; I have nothing else to do, so it gives me something to tinker with. Thanks for highlighting that.


----------



## provost

Quote:


> Originally Posted by *V3teran*
> 
> The ARMA 3 video has no jumping; I've made some alterations since the PD2 video, and all games are maxed out now on both GPUs.
> For the 3rd time I ask: have you enabled *unofficial overclocking*?
> 
> You don't need to disable boost; there is more than one way to skin a cat!
> No jumping clocks here or in any other game I play. The PD2 was an early video, but I've fixed the jumping clocks since then with a few tweaks.
> Check out this vid; I can upload more! It makes no difference on my card whether boost is enabled or disabled.
> 
> Saying that, I think I will go make a BIOS with boost disabled, as I would like to see the difference; I have nothing else to do, so it gives me something to tinker with. Thanks for highlighting that.


I don't know what you mean by "unofficial overclocking"?

I don't do BIOSes, nor do I have any interest in learning how to, as I have plenty of other things of far more importance (to me) to spend my time on. But I do appreciate the people on this forum who selflessly and graciously help others.


----------



## anubis1127

Unofficial overclocking in Afterburner is an AMD setting; it allows you to extend beyond the AMD vBIOS limits (the setting is "Extend official overclocking limits"). That setting does nothing with NV cards, so if that is the setting @V3teran is referring to, I'm not sure why he keeps mentioning it.


----------



## V3teran

OK, fair enough, provost. Anubis, you're wrong: unofficial overclocking takes the power level on your GPU to maximum. The default is power-saving mode, which WILL cause throttling, because I've experienced it. Saying it only works for AMD cards is the biggest load of tosh I've heard in a long time. If you don't believe me, go ask Alex/Unwinder. I concur when he says people don't know how to use it; you must be another one of those people.

Unofficial overclocking, not "extend official overclocking limits".
Learn to read before you pass judgement.


----------



## anubis1127

Quote:


> Originally Posted by *V3teran*
> 
> OK, fair enough, provost. Anubis, you're wrong: unofficial overclocking takes the power level on your GPU to maximum. The default is power-saving mode, which WILL cause throttling, because I've experienced it. Saying it only works for AMD cards is the biggest load of tosh I've heard in a long time. If you don't believe me, go ask Alex/Unwinder. I concur when he says people don't know how to use it; you must be another one of those people.
> 
> Unofficial overclocking, not "extend official overclocking limits".
> Learn to read before you pass judgement.


I said "if" that is the setting you were referring to, but it wasn't. I now see the setting you were referring to, though; I was just thinking of a different one than you were talking about. My bad.


----------



## V3teran

Quote:


> Originally Posted by *anubis1127*
> 
> I said "if" that is the setting you were referring to, but it wasn't. I now see the setting you were referring to, though; I was just thinking of a different one than you were talking about. My bad.


Ok no worries mate


----------



## provost

Your persistence made me curious enough, as I know through my "real world" dealings that if someone is so persistent (belligerent) with conviction, then there must be something they are saying that might be worth listening to, even if the demeanor is off-putting... lol
I completely forgot about our earlier exchange in this thread http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/5110 as I forget a lot of my posts, since I am usually posting on the go or while traveling, etc.... lol

You know, all you had to do was link this:
http://forums.guru3d.com/showthread.php?t=362667&page=20 and it would have sufficed.








It seems you had linked it before, but like many in my line of work, I too suffer from ADHD.









Anywho, I now understand the context of your comments towards the "other forum" from another thread here... lol

So, it does appear that you have cracked the 690 code. When were you planning to post it in the OP? If you are the sharing type, that is.


----------



## V3teran

Quote:


> Originally Posted by *provost*
> 
> Your persistence made me curious enough, as I know through my "real world" dealings that if someone is so persistent (belligerent) with conviction, then there must be something they are saying that might be worth listening to, even if the demeanor is off-putting... lol
> I completely forgot about our earlier exchange in this thread http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/5110 as I forget a lot of my posts, since I am usually posting on the go or while traveling, etc.... lol
> 
> You know, all you had to do was link this:
> http://forums.guru3d.com/showthread.php?t=362667&page=20 and it would have sufficed.
> 
> It seems you had linked it before, but like many in my line of work, I too suffer from ADHD.
> 
> Anywho, I now understand the context of your comments towards the "other forum" from another thread here... lol
> 
> So, it does appear that you have cracked the 690 code. When were you planning to post it in the OP? If you are the sharing type, that is.


TBH, if it wasn't for Agent-A01 I wouldn't have cracked it, because he did most of the work. I just made a few changes once I had the code, although he was the one who pointed me in the right direction regarding the datasheet information, which I do understand, as understanding binary etc. is a small part of my job. So many kudos to him! I have managed to tweak things a little to allow for constant GPU usage and various voltage changes on the fly, as I've shown in my video. The beauty of this code is that it works, and I think it will work with the 780 Ti and Titan Black once you make a few minor changes. I don't have either of those cards, but TBH I have working code, so it should carry over.

It works with the Titan, 690 and 780, so I see no reason why not, apart from a few changes. TBH I was going to release it, but that thread over at OCUK, where I was ridiculed before I posted up evidence, has really put me off, if I'm honest.


----------



## provost

I am not sure how a grievance against another forum is linked to OCN. If you don't want to post your work on this forum for other members' benefit, that's fine; it is your prerogative. This may sound harsh, but no one cares what you have been able to do with your cards, since no one else can benefit from it.
I can think of many people who have been helpful in the advancement of overclocking the recent NVIDIA cards (whether on this forum or another), starting with Unwinder, Slv7, skyn3t/Occam, Zawarudo (RIP)/Rbby258, and many others who have been very gracious in sharing their work and helping others. I am sure they have all felt slighted along the way in some form or another, but to ridicule any of them for their lack of knowledge (i.e. skyn3t) while not sharing one's own comes across as disingenuous, IMHO.


----------



## V3teran

Quote:


> Originally Posted by *provost*
> 
> I am not sure how a grievance against another forum is linked to OCN. If you don't want to post your work on this forum for other members' benefit, that's fine; it is your prerogative. This may sound harsh, but no one cares what you have been able to do with your cards, since no one else can benefit from it.
> I can think of many people who have been helpful in the advancement of overclocking the recent NVIDIA cards (whether on this forum or another), starting with Unwinder, Slv7, skyn3t/Occam, Zawarudo (RIP)/Rbby258, and many others who have been very gracious in sharing their work and helping others. I am sure they have all felt slighted along the way in some form or another, but to ridicule any of them for their lack of knowledge (i.e. skyn3t) while not sharing one's own comes across as disingenuous, IMHO.


OK, fair enough, but let's get one thing straight here: I only bark at people who bark at me *first*. So don't try to turn this around as if I'm the bad one here, because I ain't. I came with good intentions and was ridiculed for nothing, because it was previously not thought possible. I've been around here for a few years now, and just because I don't post doesn't mean I don't read the forum.
I came here for a CaseLabs M10 long ago, from good old Jim!

If people want the code they can PM me for it; hopefully, fingers crossed, a similar trick can be done with GM200, depending on the architecture.
After reading the datasheet and understanding what's going on, it's not that difficult, so maybe other people should read that datasheet, process the information, put it to use and do it for themselves. There are plenty of "talkers" around, with attitudes to go with it, but not many "walkers" as far as I can see.


----------



## provost

Quote:


> Originally Posted by *V3teran*
> 
> OK, fair enough, but let's get one thing straight here: I only bark at people who bark at me *first*. So don't try to turn this around as if I'm the bad one here, because I ain't. I came with good intentions and was ridiculed for nothing, because it was previously not thought possible. I've been around here for a few years now, and just because I don't post doesn't mean I don't read the forum.
> I came here for a CaseLabs M10 long ago, from good old Jim!
> 
> If people want the code they can PM me for it; hopefully, fingers crossed, a similar trick can be done with GM200, depending on the architecture.
> After reading the datasheet and understanding what's going on, it's not that difficult, so maybe other people should read that datasheet, process the information, put it to use and do it for themselves. There are plenty of "talkers" around, with attitudes to go with it, but not many "walkers" as far as I can see.


Give a man a fish and you feed him for a day; teach a man to fish and you feed him for a lifetime... lol
And to that extent we are in agreement, philosophically speaking.
I thought that might be the case, but wanted to hear you say it.

Carry on. If this is indeed your intent, I too would like to see more "walkers"... lol


----------



## Wiz766

This isn't my exact 690, but I was curious what the grey stuff is in the areas circled in red. I pulled my 690 apart to clean it and reapply thermal paste to the yellow area, of course. Blue is thermal pads. I have no idea what the grey stuff in the red area is. Can someone please help me?


----------



## wrath04

Quote:


> Originally Posted by *Wiz766*
> 
> This isn't my exact 690, but I was curious what the grey stuff is in the areas circled in red. I pulled my 690 apart to clean it and reapply thermal paste to the yellow area, of course. Blue is thermal pads. I have no idea what the grey stuff in the red area is. Can someone please help me?


I used thermal pads on that red-circled area as well, as you can see on this card, while installing the Arctic Accelero Twin Turbo:


----------



## Wiz766

Thanks for the picture!


----------



## V3teran

Quote:


> Originally Posted by *Wiz766*
> 
> This isn't my exact 690, but I was curious what the grey stuff is in the areas circled in red. I pulled my 690 apart to clean it and reapply thermal paste to the yellow area, of course. Blue is thermal pads. I have no idea what the grey stuff in the red area is. Can someone please help me?


They are just thermal pads in the red area. If you do fit a water block, make sure you use 0.5mm thermal pads: anything thicker and the card will sit tapered, and one GPU won't cool as well as the other. I tried 1mm and had this problem, so make sure you get 0.5mm.


----------



## Lightningln

Hi there, I need a little help here. I can't do anything OC-wise with my 690. The problem: I can't get the 690 working with the 1.3V mod (including the LLC hack). If I use it, I get random crashes, even without any OC:

Everything works fine if I don't use the mod. I've tested everything I can: modded vBIOS, stock vBIOS, W7, W8, older drivers, SLI off... nothing works.

The only temporary solution I found is this: with Nvidia Inspector, if I lock the "Performance Level" to P2, it eliminates the crashes permanently. Stable at 1228MHz at 1.3V, plus 650 on the memory, in every game or benchmark. Not a good solution, but it works...

One other thing: +70 in Afterburner is the max stable (a pretty low OC, I think) without touching the voltage. I don't know if this is related, because I get 1228MHz with my trick, but at 1.3V:

NVI.exe -forcepstate:0,2 -setGpuClock:0,2,1230 -setMemoryClockOffset:0,0,650 -setBaseClockOffset:0,0,208
NVI.exe -forcepstate:1,2 -setGpuClock:1,2,1230 -setMemoryClockOffset:1,0,650 -setBaseClockOffset:1,0,208
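The two NVI.exe invocations above could be generated from one small script; a sketch (the flags are copied verbatim from the lines above, and the clock/offset values are the ones used in this post):

```python
# Build the Nvidia Inspector command line for each GPU of the 690,
# so the P2 lock can be applied from a single script.

import subprocess  # only needed if you uncomment the run() call below

def nvi_lock_p2(gpu, core=1230, mem_off=650, base_off=208):
    """Return the NVI.exe argument list that locks one GPU to P2."""
    return [
        "NVI.exe",
        f"-forcepstate:{gpu},2",
        f"-setGpuClock:{gpu},2,{core}",
        f"-setMemoryClockOffset:{gpu},0,{mem_off}",
        f"-setBaseClockOffset:{gpu},0,{base_off}",
    ]

for gpu in (0, 1):
    cmd = nvi_lock_p2(gpu)
    # subprocess.run(cmd, check=True)  # uncomment on the Windows box
    print(" ".join(cmd))
```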

If I reset everything (force P-state 16, voltage, core clock, etc.) after using my trick, I can randomly (though not that quickly) get the same crash I mentioned above, and have to restart the PC to fix it. P-state 8 (idle) doesn't do that.

Configuration (fully watercooled: hapshack.com/images/XGiTP.jpg):

- Corsair 900D
- Intel i7 4770K @ stock
- MSI Z87-GD65 Gaming, stock BIOS (latest)
- NVIDIA GTX 690 (ASUS) @ stock
- 16GB Kingston HyperX Beast @ stock
- be quiet! Pure Power L8 730W

Anybody, please? I don't get it...


----------



## V3teran

Here is a guide that I wrote long ago. There are other ways to handle the BIOS side, like using Kepler Power Tweaker, but this is the easiest way:
http://forums.guru3d.com/showthread.php?t=362667

Or you can use that guide with the skyn3t BIOS from the start of this thread:
http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club


----------



## Wiz766

Would it matter if I am just using the stock heatsink?


----------



## Alex132

Quote:


> Originally Posted by *Wiz766*
> 
> Would it matter if I am just using the stock heatsink?


Yeah...
I'm nervous about the temps even at stock on the stock cooler.

Heck, I placed a whole bunch of heatsinks on the back of the VRMs and they still get bloody hot.


----------



## V3teran

I would advise not going over 1.3V without water. The KGB-modded BIOS got me an extra 30MHz with boost, the 1.3V mod allowed me to go up to around 1241MHz stable, and my own custom volt-mod allowed me to run around 1372MHz fully stable 24/7; I don't know of *anybody anywhere* running a 690 anywhere near this speed on a custom water setup.

The thing is, volt-modding puts massive stress on the MOSFETs/VRMs, so it's not advised unless you're on water. With the 1.3V mod and Zawarudo's tool you can expect an increase to around 1200+MHz fully stable, if your cooling is OK.


----------



## Alex132

Quote:


> Originally Posted by *V3teran*
> 
> I would advise not going over 1.3V without water. The KGB-modded BIOS got me an extra 30MHz with boost, the 1.3V mod allowed me to go up to around 1241MHz stable, and my own custom volt-mod allowed me to run around 1372MHz fully stable 24/7; I don't know of *anybody anywhere* running a 690 anywhere near this speed on a custom water setup.
> 
> The thing is, volt-modding puts massive stress on the MOSFETs/VRMs, so it's not advised unless you're on water. With the 1.3V mod and Zawarudo's tool you can expect an increase to around 1200+MHz fully stable, if your cooling is OK.


I can hit ~1215MHz on the core on air with a stock BIOS, which is kinda close.


----------



## V3teran

Double post, please ignore.


----------



## V3teran

That's great on air; my card on air would hit 1202MHz, any higher would fail.

With your card you would probably get around 1254 or 1267MHz with the 1.3V mod and Zawarudo's tool, which fixes the LLC (load-line calibration).
There is another way to fix the LLC, but Zawarudo's tool is the easiest.

Although I wouldn't chance that on air, as a current surge without proper cooling could pop the VRMs.
Hot VRMs without cooling can also degrade and pop at a later date.


----------



## Alex132

Quote:


> Originally Posted by *V3teran*
> 
> That's great on air; my card on air would hit 1202MHz, any higher would fail.
> 
> With your card you would probably get around 1254 or 1267MHz with the 1.3v mod and the Zaruwardo tool, which is for LLC (Load Line Calibration).
> There is another way to fix the LLC, but the Zaruwardo tool is the easiest.
> 
> Although I wouldn't chance that on air, as a current surge without proper cooling could pop the VRMs. Hot VRMs without cooling could also degrade and pop at a later date.


I haven't actually tried to go higher than 1215MHz; that was just the most I was comfortable with temperature-wise. It would hit the mid-90s.
I actually put extra VRM cooling on it, lol (old pic, but you can see the copper heatsinks on top of the 690).



Ghetto, but effective


----------



## V3teran

Yeah, I see what you mean; as long as it works! A water block is worth getting, if I may add. I also had mine running at 1372MHz on a single RX360 alongside the CPU overclock on my old X58 930, so you don't actually need tons and tons of radiator. You should add your GPU to that loop, as what you have is complete overkill for cooling.


----------



## justinyou

Quote:


> Originally Posted by *Alex132*
> 
> I haven't actually tried to go higher than 1215MHz; that was just the most I was comfortable with temperature-wise. It would hit the mid-90s.
> I actually put extra VRM cooling on it, lol (old pic, but you can see the copper heatsinks on top of the 690).
> 
> 
> 
> Ghetto, but effective


I did the exact same thing by sticking two long pieces of aluminium heatsink (not copper like yours) on the GTX 690, and I think it is actually working well.
I can definitely feel the heatsinks get hot to the touch even before I start running any games.

For those who wonder, I used 3M 8810 thermal adhesive to stick the heatsinks to the VRMs.


----------



## Alex132

Quote:


> Originally Posted by *justinyou*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> I haven't actually tried to go higher than 1215MHz; that was just the most I was comfortable with temperature-wise. It would hit the mid-90s.
> I actually put extra VRM cooling on it, lol (old pic, but you can see the copper heatsinks on top of the 690).
> 
> 
> 
> Ghetto, but effective
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did the exact same thing by sticking two long pieces of aluminium heatsink (not copper like yours) on the GTX 690, and I think it is actually working well.
> I can definitely feel the heatsinks get hot to the touch even before I start running any games.
> 
> For those who wonder, I used 3M 8810 thermal adhesive to stick the heatsinks to the VRMs.

Which makes me really wonder why Nvidia said the GTX 690 doesn't need a backplate for additional VRM cooling.

Not only would a backplate have looked WAY cooler, it would have helped the VRMs stay cooler... I dunno, man; I feel like it's not much to ask for on a $1000 GPU.


----------



## V3teran

This is the .bat file I'm using, which lets me go from 1.3 to 1.6v depending on the hex code I enter for voltage control:
"c:\program files (x86)\msi afterburner\msiafterburner.exe" /sg0 /wi3,20,21,22 /wi3,20,D2,7A /wi3,20,D3,7A /sg1 /wi3,20,21,22 /wi3,20,D2,7A /wi3,20,D3,7A

Here is a video i made few weeks back.





Here are the hex codes I used personally. This should save a lot of time, effort, and potential damage:

1.256v = 3a
1.34v = 2c
1.35v = 2b
1.38v = 26
1.40v = 23
1.409v = 22 (→ ~1354MHz?)
1.413v = 21 (→ ~1372MHz?)
1.42v = 1f (→ ~1372MHz?)
1.438v = 1D
1.444v = 1C
1.456v = 1a
1.475v = 15
1.48v = 14
1.50v = 11
1.51v = 0f
1.60v = 2g

*USE ALL OF THE ABOVE AT YOUR OWN RISK!*
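For clarity, here is my reading of how the .bat line above is assembled: the last field of the first `/wi` write (register 21) appears to carry the voltage hex code from the table, the `D2`/`D3` writes stay fixed at `7A`, and the same writes are repeated for each GPU via `/sg0`/`/sg1`. That reading is an assumption from the table, not something the post states outright, so here is only a hypothetical Python sketch of building the command string for a chosen voltage, nothing more:

```python
# Hypothetical sketch of assembling the Afterburner volt-mod command above.
# ASSUMPTION: the last field of the /wi3,20,21,<code> write is the voltage
# hex code; /sg0 and /sg1 select GPU 0 and GPU 1 of the dual-GPU card.
# Nothing here checks that a code is safe -- a wrong code can kill the card.

AFTERBURNER = r'"c:\program files (x86)\msi afterburner\msiafterburner.exe"'

# A few voltage -> hex pairs copied verbatim from the table above.
VOLT_TO_HEX = {1.35: "2b", 1.38: "26", 1.413: "21", 1.50: "11"}

def build_command(volts: float) -> str:
    """Substitute the hex code for `volts` into the /wi write string."""
    code = VOLT_TO_HEX[volts]
    writes = f"/wi3,20,21,{code} /wi3,20,D2,7A /wi3,20,D3,7A"
    # Repeat the same writes for both GPUs (/sg0 then /sg1)
    return f"{AFTERBURNER} /sg0 {writes} /sg1 {writes}"
```

As the post says: use anything like this at your own risk, and only verify codes against the table above.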


----------



## V3teran

Just to add (which I should have included in the original post): I take no responsibility if you damage your GPU. This is not advised unless you are running good custom water cooling; use it at your *OWN RISK*! If you need help in any way, let me know.


----------



## virgis21

It was a good thread and fun to take part in, but I have to leave you guys. My 690 went in for RMA and was unfixable, so I was offered a Titan Black in place of my GTX 690. I said OK without thinking long.


----------



## JFAmill

Hi guys!

One of these little beasts just dropped into my hands... and I've got some questions!

Can you explain how to use the new nvflash to flash a BIOS to this dual-GPU card? Which BIOS should I use: A, B, or C?

Can someone post a detailed, step-by-step guide to flashing one of the no-boost BIOSes (the card has a PLX chip)? I'll ask the owner of the first post to add it.

Thanks in advance!


----------



## V3teran

Here is a step-by-step guide that I wrote a while ago for flashing the 690, in ten easy-to-follow steps.
It's foolproof if you follow it correctly.
Use NVFlash 5.127, as it says in the guide.
http://forums.guru3d.com/showthread.php?t=362667


----------



## JFAmill

Quote:


> Originally Posted by *V3teran*
> 
> Here is a step-by-step guide that I wrote a while ago for flashing the 690, in ten easy-to-follow steps.
> It's foolproof if you follow it correctly.
> Use NVFlash 5.127, as it says in the guide.
> http://forums.guru3d.com/showthread.php?t=362667


Thanks a lot, bud!
I hope I can buy you a big, cold beer someday!

I will check it tomorrow. Thanks in advance for spending your time on my questions!


----------



## Fraizer

Hello all,

In GPU-Z 0.7.9 I get this screen (attached picture), and in the dropdown list in the bottom-left corner I can only select one entry, NVIDIA GeForce GTX 690; there is no second display device like there used to be.

Also, only one of the two DVI-D connectors works.

Is this normal, or has my second GPU died?

Thank you.


----------



## Alex132

Quote:


> Originally Posted by *Fraizer*
> 
> 
> Hello all,
> 
> In GPU-Z 0.7.9 I get this screen (attached picture), and in the dropdown list in the bottom-left corner I can only select one entry, NVIDIA GeForce GTX 690; there is no second display device like there used to be.
> 
> Also, only one of the two DVI-D connectors works.
> 
> Is this normal, or has my second GPU died?
> 
> Thank you.


It says SLI is enabled with (2) GPUs.

You can check by running a game with MSI Afterburner open and seeing whether there is load on both GPUs.


----------



## Fraizer

Hello,

Now under NVIDIA SLI it says: Disabled.

I used the software you suggested and see just one GPU working.


----------



## treadstone

Hi, I just picked up a Corsair Air 240.

I have a Rampage Gene and an i7-4930K Ivy Bridge-E 6-core.

I also have an EVGA Signature GTX 690. I am debating selling it while the prices are still very good, but I just hate to part with the beauty.

Any comments on my thoughts?

thanks in advance!


----------



## Fraizer

Maybe a stupid question, but which DVI port is best to connect my screen to for the best quality? (There are 2x DVI-D and 1x DVI-I on the GTX 690.)

thank you


----------



## rulezz

Quote:


> Originally Posted by *wrath04*
> 
> I used thermal pads on that red circled area as well, as you can see on this card, while installing the Arctic Accelero twin turbo :


Hello, your photo really helped me. I want to ask: did the Arctic cooler work fine for you? Can you overclock and play BF4 or Titanfall on ultra without any problems? I've ordered one, and I'd like your advice: should I put insulation tape around *and* on top (front face) of the inductors, or is around them enough? I need everyone's help here, because I've read many reviews where people's GTX 690s died with this Arctic cooler. One thing I'm sure of now is that it's better to put the pads on the card itself, not on the cooler. Please confirm, because I'm really worried.


----------



## V3teran

Quote:


> Originally Posted by *rulezz*
> 
> Hello, your photo really helped me. I want to ask: did the Arctic cooler work fine for you? Can you overclock and play BF4 or Titanfall on ultra without any problems? I've ordered one, and I'd like your advice: should I put insulation tape around *and* on top (front face) of the inductors, or is around them enough? I need everyone's help here, because I've read many reviews where people's GTX 690s died with this Arctic cooler. One thing I'm sure of now is that it's better to put the pads on the card itself, not on the cooler. Please confirm, because I'm really worried.


Make sure you use 0.5mm thermal pads on the 690, because if you use 1mm the cooler will sit tapered and one GPU will run around 10 degrees hotter than the other.
Don't be like me; I had to find this out myself, which was very time consuming.


----------



## wrath04

Quote:


> Originally Posted by *rulezz*
> 
> Hello, your photo really helped me. I want to ask: did the Arctic cooler work fine for you? Can you overclock and play BF4 or Titanfall on ultra without any problems? I've ordered one, and I'd like your advice: should I put insulation tape around *and* on top (front face) of the inductors, or is around them enough? I need everyone's help here, because I've read many reviews where people's GTX 690s died with this Arctic cooler. One thing I'm sure of now is that it's better to put the pads on the card itself, not on the cooler. Please confirm, because I'm really worried.


Sorry for the late reply bud, sent you a PM.


----------



## rulezz

Quote:


> Originally Posted by *V3teran*
> 
> Make sure you use 0.5mm thermal pads on the 690, because if you use 1mm the cooler will sit tapered and one GPU will run around 10 degrees hotter than the other.
> Don't be like me; I had to find this out myself, which was very time consuming.


Thanks for your help. I searched very hard for more information and only found one video, which is a very bad guide. About the 0.5mm thermal pads: do they come in the Arctic box, or should I buy them from another store?
Quote:


> Originally Posted by *wrath04*
> 
> Sorry for the late reply bud, sent you a PM.


Yeah, thank you, really; your photos helped me, and I've left some questions for you.


----------



## zancaracing

Hello. Can someone link the modded BIOS for my GTX 690? Thanks!


----------



## wefornes

Hi guys, I am a pretty new member of this forum and I am glad to be part of the 690 owners club! My card is from ASUS, and with an Arctic cooler the 690 runs very quietly: 30°C idle and 60°C at full load (e.g. Crysis 3, Watch Dogs, etc.).


----------



## MaRS-from-MaRS

Hi guys, I'm brand new to the website and forums, I have owned my GTX 690 for a little over a year now and I absolutely love it and have no intention of replacing it until Nvidia makes a GTX 990.

I have tried to overclock it using EVGA Precision and for some reason my card constantly crashes. The last time I did this it crashed and when I restarted my PC I got a blue screen and had to reinstall Windows 7. Major pain in the butt.

So I'd like to get some advice from you guys about what to do.

What is a good, safe starting point for the GPU clock, memory clock, and other settings like fan percentage that absolutely should not crash, so I know whether it's the card or me?

BTW this is what I am working with;

i7-3820
GTX 690
32GB ram

Any help would be much appreciated and I will get back to you guys asap.


----------



## shaneduce

Quote:


> Originally Posted by *MaRS-from-MaRS*
> 
> Hi guys, I'm brand new to the website and forums, I have owned my GTX 690 for a little over a year now and I absolutely love it and have no intention of replacing it until Nvidia makes a GTX 990.
> 
> I have tried to overclock it using EVGA Precision and for some reason my card constantly crashes. The last time I did this it crashed and when I restarted my PC I got a blue screen and had to reinstall Windows 7. Major pain in the butt.
> 
> So I'd like to get some advice from you guys about what to do.
> 
> What is a good, safe starting point for the GPU clock, memory clock, and other settings like fan percentage that absolutely should not crash, so I know whether it's the card or me?
> 
> BTW this is what I am working with;
> 
> i7-3820
> GTX 690
> 32GB ram
> 
> Any help would be much appreciated and I will get back to you guys asap.


I would up the voltage to the max allowed for the card before I started OCing.
Also, are you on water or air?


----------



## MaRS-from-MaRS

I am currently running air.

What is a safe voltage for an air-cooled 690? Also, is it dangerous to raise the voltage too high?

Sorry for the noob questions, but I am very new to this.


----------



## shaneduce

Then your overclock will not amount to much. Sorry to give you the bad news.


----------



## wefornes

What about with an Accelero on the GTX 690? I have 30°C idle and 60-65°C gaming. What overclock increment can I try?


----------



## MaRS-from-MaRS

Quote:


> Originally Posted by *shaneduce*
> 
> Then your overclock will not amount to much. Sorry to give you the bad news.


Well that's okay then, at least I know now and won't be bothering with it anymore. Thanks for the heads up.


----------



## shaneduce

Quote:


> Originally Posted by *wefornes*
> 
> What about with an Accelero on the GTX 690? I have 30°C idle and 60-65°C gaming. What overclock increment can I try?


DISCLAIMER: the advice below could cook your card dead. It's not my fault if you fry it.

Start at 135% power target and 1150mV, and test that it doesn't get too hot or die on you. Then raise the core clock in 10MHz jumps until it blue-screens, back off 20-25MHz, and there you go. That would be an OK OC.
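The raise-until-crash-then-back-off routine above amounts to a simple linear search, which can be sketched as a tiny helper. The `is_stable` callback here is a hypothetical stand-in for a real stress test with temperature monitoring (in practice "unstable" means a crash, so this is an illustration of the strategy, not something to script literally):

```python
def find_daily_clock(start_mhz, is_stable, step=10, backoff=25, ceiling=1500):
    """Raise the core clock in `step` MHz jumps until the card fails,
    then back off `backoff` MHz from the first failing clock."""
    clock = start_mhz
    while clock + step <= ceiling and is_stable(clock + step):
        clock += step
    # clock + step is the first clock that failed (or exceeded the ceiling);
    # back off from it to leave thermal/voltage headroom for daily use
    return clock + step - backoff
```

With a card that (hypothetically) fails above 1215MHz, starting at 1100MHz the loop stops at 1210MHz and returns a 24/7 clock 25MHz below the first failing step.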


----------



## MaRS-from-MaRS

Quote:


> Originally Posted by *shaneduce*
> 
> DISCLAIMER: the advice below could cook your card dead. It's not my fault if you fry it.
> 
> Start at 135% power target and 1150mV, and test that it doesn't get too hot or die on you. Then raise the core clock in 10MHz jumps until it blue-screens, back off 20-25MHz, and there you go. That would be an OK OC.


So I tried to up the voltage in EVGA Precision, but every time I hit "Apply" it would gray itself out and go back to zero.


----------



## shaneduce

Quote:


> Originally Posted by *MaRS-from-MaRS*
> 
> So I tried to up the Voltage on EVGA's Precision but every time I would hit "Apply" it would then gray itself out and go back to zero.


Try Nvidia Inspector.


----------



## timerwin63

A bit late to the game, but you can go ahead and add my name to the list. Proof of ownership:


Pardon the "iffy-at-best" picture quality.


----------



## shaneduce

Nice. I'm selling mine to move on to the 9XX series.


----------



## timerwin63

Quote:


> Originally Posted by *shaneduce*
> 
> Nice i'm selling mine to move on to the 9XX's


When and how much? I'm considering getting a second one, out of simplicity. That, or following the same path as you.


----------



## shaneduce

Quote:


> Originally Posted by *timerwin63*
> 
> When and how much? I'm considering getting a second one, out of simplicity. That, or following the same path as you.


Bro U got PM


----------



## timerwin63

Quote:


> Originally Posted by *shaneduce*
> 
> Bro U got PM


And a response to you.


----------



## shaneduce

Quote:


> Originally Posted by *timerwin63*
> 
> And a response to you.


Back at you with offer.


----------



## Ujaho

Hi fellas, I've got an EVGA GTX 690 Signature card and my BIOS is 80.04.1E.00.91 (the same on both GPUs). I don't know if this is an old BIOS and whether I should be looking at updating it; if so, can someone please give me a link to download the new BIOSes? I picked up that very handy-looking 10-step flashing guide. Great site, and thanks for the help.


----------



## zancaracing

Can someone upload the modified bios?? Thank you!


----------



## V3teran

I will upload mine, gimme an hour or two.


----------



## Arniebomba

This might not be the most objective place to ask this... but then again, maybe it is.
I'm torn between a GTX 980 and the GTX 690.
I can buy a GTX 690 for 250 euros. The GTX 980 costs 580 euros, a difference of 330 euros.
I am running an ITX build, so there is no space for SLI.
The GTX 690 gives about the same performance as the 980, or slightly better.
The GTX 690 has less VRAM. I play BF4 a lot, so 2GB might not be enough (though I play at 1080p).
The GTX 980 will have better/longer driver support.

Any input on this?


----------



## timerwin63

If money's no object, I'd say the 980, 9.8/10 times. It'll OC higher, VRAM is an added bonus that'll keep your card relevant for a longer amount of time, etc, etc.

If it's a factor however, I would say, given the still small performance difference, that the 690 would be a solid card to buy.


----------



## V3teran

Sorry I forgot to upload my bios was busy last night. Will definitely upload the GPU bios tonight.


----------



## zancaracing

Thank you so much!


----------



## Fuzzysham

I'm glad this thread is back! I have some questions regarding my 690.

I saw the XSPC Razor nVidia GTX 690 Full Coverage VGA Block mentioned here earlier. Is that a decent block? It seems to be the only one I can find. An EK one would have been preferable. Would the EK backplate work with the XSPC waterblock?

The Skyn3t BIOSes posted earlier: what is the difference between A, B, and C? Someone asked that quite a while ago and nobody replied.

Does anybody have any stuttering issues when trying to play BF4 with SLI enabled? I get immense stuttering and I don't recall it being that way since release, only since I started playing it again in June/July. I have always played it at 1440p with everything maxed.

I have a GTX 980 on order with Amazon but if I can get my hands on a decent waterblock and sort out the BF4 issues, I will be happy to stick with my 690...although the VRAM is a problem.

Thanks!


----------



## wefornes

Quote:


> Originally Posted by *Arniebomba*
> 
> This might not be the most objective place to ask this... but then again, maybe it is.
> I'm torn between a GTX 980 and the GTX 690.
> I can buy a GTX 690 for 250 euros. The GTX 980 costs 580 euros, a difference of 330 euros.
> I am running an ITX build, so there is no space for SLI.
> The GTX 690 gives about the same performance as the 980, or slightly better.
> The GTX 690 has less VRAM. I play BF4 a lot, so 2GB might not be enough (though I play at 1080p).
> The GTX 980 will have better/longer driver support.
> 
> Any input on this?


I'm thinking like you, but I think I will go for the GTX 970; it's a lot cheaper and performs very nicely. I play games at 2560x1080, so the GTX 970 is more than enough for me.


----------



## Arniebomba

Quote:


> Originally Posted by *wefornes*
> 
> I'm thinking like you, but I think I will go for the GTX 970; it's a lot cheaper and performs very nicely. I play games at 2560x1080, so the GTX 970 is more than enough for me.


Yeah, I mean, at this point the best bang for the buck looks to be the 690: 250 euros against 580 euros with similar performance. Though I'm "worried" about the driver support and 2GB of VRAM.


----------



## provost

Quote:


> Originally Posted by *Fuzzysham*
> 
> I'm glad this thread is back! I have some questions regarding my 690.
> 
> I saw the XSPC Razor nVidia GTX 690 Full Coverage VGA Block mentioned here earlier. Is that a decent block? It seems to be the only one I can find. An EK one would have been preferable. Would the EK backplate work with the XSPC waterblock?
> 
> The Skyn3t BIOS's posted; what is the difference between A, B and C? Someone asked that quite a while ago and nobody replied.
> 
> Does anybody have any stuttering issues when trying to play BF4 with SLI enabled? I get immense stuttering and I don't recall it being that way since release, only since I started playing it again in June/July. I have always played it at 1440p with everything maxed.
> 
> I have a GTX 980 on order with Amazon but if I can get my hands on a decent waterblock and sort out the BF4 issues, I will be happy to stick with my 690...although the VRAM is a problem.
> 
> Thanks!


So, here is the thing: as silly as it may sound, I have been using a quad-690 setup on my Dell 27-inch IPS monitor and also an ASUS 27-inch (alternating between the two), while my Titans patiently wait for me to finish the rebuild that I can't seem to finish. This makes me mad at myself, as I would love to be playing with my Titans at 5760x1080 or 4K, but we all have priorities (work) that rank ahead of gaming or benching, unfortunately... lol. Not to mention my BNIB KPE, which will probably end up in the mausoleum of my unused hardware the way things are going.

Anywho, be that as it may, the 690s have been holding up pretty well in everything I have played (occasionally) so far. I did download that Middle-earth Mordor game recently, but I haven't had a chance to play it, so I can't be sure yet. Will find out soon.

Now, to answer your question: if I were buying today, I would pick two 970s over a 690 or a 980 for SFF, if money weren't an issue. I don't know how much used 690s are going for, but if the delta between a used 690 and a 980 is too big, I would say the 690 still holds up extremely well on a 27-inch 1080p 144Hz panel or a 27-inch IPS for almost all games, including a heavily modded Skyrim. If 690s are cheap, you can buy one and sell it when 980s drop in value, once the next batch of Maxwell is released or the new AMD cards come out. However, even though reviews show the 690 driving higher resolutions, I personally would not use it above the resolutions I noted.

Net-net: if you can afford it, go for a 970/980. If you can't, shop around; you can find pretty good value everywhere in the secondary resale market.

Edit: ermmm, doesn't look like I read your question right (blaming my ADD again... lol); you already have a 690. No issues for me in BF4, but then I have only played about an hour of it, once (not my cup of tea, and I am not a multiplayer gamer either... no time). As for the blocks, I have the Koolance blocks (currently not in use, as the cards are on the stock heatsink without much of an OC), and they have been great, except that they are now out of production. Any backplate should work with another manufacturer's block. I have the EVGA backplates, again not sure if still in production. Sorry I couldn't be more helpful.


----------



## Sugi

Arnie, go for the GTX 690. They actually made 4GB versions of the 690, not split 2GB + 2GB. The cost difference between the two cards is huge, and obviously it's worth it right now. However, it would be smart to sell the 690 shortly afterwards and get a 980 or 990 or whatever is out at that point. That said, if you are not driving a lot of pixels, like 3x1080p or 4K, I would opt for the 970 and be done with it.


----------



## Alex132

Quote:


> Originally Posted by *Sugi*
> 
> Arnie, go for the GTX 690. They actually have 4GB versions of the 690 not split, 2GB & 2GB.


Unless you're talking about the ASUS MARS III, that is a lie.

Nvidia made only reference GTX 690s; they are all the same: a pitiful 2GB of VRAM per GPU, with limited memory bandwidth, for a $1000 card.


----------



## Sugi

The 690 is an amazing card that has held up extremely well. Advising against the card at that price because of the VRAM doesn't make sense. Oh yeah, and that magical VRAM is just another marketing tool to make more money. Raw horsepower will always outweigh cards with a lot of VRAM.

Here are benchmarks for BF4 at high resolution. The 690 still kills it, because it's just a thick, beefy card with raw horsepower and incredible strength. I am not going to argue this point anymore, not because of you, but because it has been beaten into the ground too many times already.
http://www.tomshardware.com/reviews/battlefield-4-graphics-card-performance,3634-9.html
http://www.bit-tech.net/hardware/graphics/2013/11/27/battlefield-4-performance-analysis/5

Arnie, research the 690 and decide for yourself, or just take my word for it and game on!


----------



## Alex132

Quote:


> Originally Posted by *Sugi*
> 
> The 690 is an amazing card that has held up extremely well. Advising against the card at that price because of the VRAM doesn't make sense. Oh yeah, and that magical VRAM is just another marketing tool to make more money. Raw horsepower will always outweigh cards with a lot of VRAM.
> 
> Here are benchmarks for BF4 at high resolution. The 690 still kills it, because it's just a thick, beefy card with raw horsepower and incredible strength. I am not going to argue this point anymore, not because of you, but because it has been beaten into the ground too many times already.
> http://www.tomshardware.com/reviews/battlefield-4-graphics-card-performance,3634-9.html
> http://www.bit-tech.net/hardware/graphics/2013/11/27/battlefield-4-performance-analysis/5
> 
> Arnie, research the 690 and decide for yourself, or just take my word for it and game on!


"Magical VRAM"? What? And in terms of 'horsepower' the GTX 780 Ti is faster, and the GTX 980 even more so.

Personally, I will wait for GM210.

I have owned this card since it launched, and you know what? For the first ~18 months, the 192GB/s of bandwidth and 2GB of VRAM were okay at 1080p.

Now, with 1440p and games demanding more and more power and VRAM, I can barely play games even with low or medium textures and settings.

War Thunder: 40-50fps maxed out (Shouldn't be this low, it's not a visually demanding game)
CoH2: 40-60fps on low (I mean really)
Watch Dogs: Have to play at 1600x900 (ugh)

All of these games are maxing out my VRAM.

If you wanna max out games, and not have to constantly worry about textures, AA, resolution and bandwidth - then wait for GM210.
If you need something now, see if you can grab a 2nd hand 780 Ti or 290X.


----------



## V3teran

Moving from a 690 to a 780 Ti is a sidegrade at best, a waste of time. For performance the 690 is still one of the best cards out there. If you're going to go the 980 route, you need to buy two to make it worth your while; otherwise wait for the next dual GPU or GM200. Any other option is a waste of money and time.


----------



## shaneduce

You will not hit the VRAM cap with the 980 in BF4.


----------



## V3teran

Turn the MSAA down to 2x in BF4 instead of 4x and it's fine at 1440p. I don't play that rubbish anyhow, tbh.


----------



## Fuzzysham

So I put my entire rig under water last night and I'd like to get a decent overclock on the 690. I really hate to be irritating and ask again, but what is the difference between the A, B, and C BIOSes Skyn3t provided?


----------



## V3teran

You're better off asking skyn3t himself; send him a PM.
I made my own BIOS long before skyn3t's BIOS was even out.


----------



## Sugi

Quote:


> Originally Posted by *V3teran*
> 
> You're better off asking skyn3t himself; send him a PM.
> I made my own BIOS long before skyn3t's BIOS was even out.


Please update us after you contact him and get a reply back from him. Thanks!


----------



## PinzaC55

Is anyone running dual 690's with a 4K monitor and are there any issues ?


----------



## V3teran

Quote:


> Originally Posted by *Sugi*
> 
> Please update us after you contact him and get a reply back from him. Thanks!


Why should I contact him? I don't need to contact him; why don't you contact him yourself?
My 690 is running at 1372MHz fully stable in all games, so I'm good.


----------



## Sugi

Quote:


> Originally Posted by *V3teran*
> 
> Why should I contact him? I don't need to contact him; why don't you contact him yourself?
> My 690 is running at 1372MHz fully stable in all games, so I'm good.


Please excuse the confusion as my message wasn't directed towards you in the first place.


----------



## Razor2014

Hi guys, I'm just building a new water-cooled rig with a 690. Has anyone hit the 2GB VRAM limit in any of the newer games?


----------



## Alex132

Quote:


> Originally Posted by *Razor2014*
> 
> Hi guys, I'm just building a new WC'd rig with a 690. Has anyone hit that 2GB vRam limit on any of the newer games?


Yeah...

At 1440p there are a lot.


----------



## Razor2014

Quote:


> Originally Posted by *Alex132*
> 
> Yeah...
> 
> on 1440p there are a lot


What did you do, turn the AA down?


----------



## Alex132

Quote:


> Originally Posted by *Razor2014*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Yeah...
> 
> on 1440p there are a lot
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What did you do, turn the AA down?

For Watch Dogs, I had to play at 1920x1080 or lower.
For CoH2 I have to play on basically minimum settings, no AA.

Most games I turn AA off or down to 2x. Even in GRID 2 I had to turn off the AA.

And I can only see this problem getting worse; there's no way it gets better, lol.

I didn't buy a $1000 graphics card to have to turn down or off settings, that's why I dislike this card.


----------



## zancaracing

Quote:


> Originally Posted by *V3teran*
> 
> Why should i contact him? I dont need to contact him, why dont you contact him yourself.
> My 690 is running at 1372mhz fully stable in all games so im good.


Can you upload your bios?

Thank you!


----------



## Razor2014

Quote:


> Originally Posted by *Alex132*
> 
> For Watch Dogs, I had to play at 1920x1080 or lower.
> For CoH2 I have to play on minimum settings basically, no AA.
> 
> Most games I turn AA off or down to 2x.
> 
> Even GRID2 I had to turn off the AA
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I can only see this problem getting worse, there is no way it would get better lol.
> 
> I didn't buy a $1000 graphics card to have to turn down or off settings, that's why I dislike this card.


It's unfortunate, but a high-end card like the 690 should have had 3 or 4GB of VRAM.


----------



## Alex132

Quote:


> Originally Posted by *Razor2014*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> For Watch Dogs, I had to play at 1920x1080 or lower.
> For CoH2 I have to play on minimum settings basically, no AA.
> 
> Most games I turn AA off or down to 2x.
> 
> Even GRID2 I had to turn off the AA
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I can only see this problem getting worse, there is no way it would get better lol.
> 
> I didn't buy a $1000 graphics card to have to turn down or off settings, that's why I dislike this card.
> 
> 
> 
> It's unfortunate, but a high-end card like the 690 should have had 3 or 4GB of VRAM.

It really should have been 4GB.

Even then, the 256-bit bus is rather restrictive, especially considering my memory can't overclock at all.


----------



## V3teran

Quote:


> Originally Posted by *zancaracing*
> 
> Can you upload your bios?
> 
> Thank you!


http://www.mediafire.com/download/b1b5ihr97cfhec6/690_Bios.rar


----------



## virgis21

Guys, go with older drivers from Nvidia. Before my 690 died, I was using the latest driver and got a very bad frame rate in Battlefield 4: 40-60fps. After I downgraded to the first driver that mentions BF4, I was back to 80-120fps on all ultra!
So check the drivers first. All those new drivers are made for newer cards, with extra features that the 690 doesn't support!


----------



## Alex132

Quote:


> Originally Posted by *virgis21*
> 
> Guys, go with older drivers from Nvidia. Before my 690 died, I was using the latest driver and got a very bad frame rate in Battlefield 4: 40-60fps. After I downgraded to the first driver that mentions BF4, I was back to 80-120fps on all ultra!
> So check the drivers first. All those new drivers are made for newer cards, with extra features that the 690 doesn't support!


[Citation Needed]


----------



## Razor2014

Quote:


> Originally Posted by *virgis21*
> 
> Guys, go with older drivers from Nvidia. Before my 690 melted, I was using the latest driver and getting a very bad frame rate in Battlefield 4: 40-60 fps. After I downgraded to the first driver that mentions BF4, I was back to 80-120 fps on all Ultra!
> So check the drivers first. All those new drivers are made for newer cards, with extra features that the 690 doesn't support!


Hey virgis, do you know which driver that was? That will be the one I use.


----------



## V3teran

I use the 340.65 driver, works great and is very smooth in all games.


----------



## Alex132

Quote:


> Originally Posted by *Razor2014*
> 
> Quote:
> 
> 
> 
> Originally Posted by *virgis21*
> 
> Guys, go with older drivers from Nvidia. Before my 690 melted, I was using the latest driver and getting a very bad frame rate in Battlefield 4: 40-60 fps. After I downgraded to the first driver that mentions BF4, I was back to 80-120 fps on all Ultra!
> So check the drivers first. All those new drivers are made for newer cards, with extra features that the 690 doesn't support!
> 
> 
> 
> Hey virgis, do you know which driver that was? That will be the one I use.
Click to expand...

I have been using the latest drivers and everything has been fine.


----------



## zancaracing

Quote:


> Originally Posted by *V3teran*
> 
> http://www.mediafire.com/download/b1b5ihr97cfhec6/690_Bios.rar


Thank you so much!


----------



## Arizonian

Update: Well, since the day this was released I've been a proud owner of my first dual GPU. See 2nd post.









Today I finally sold her for $450 locally. I used the money to buy two 970's, which cost me $290 in the end. I can't complain: for $550 I got to play with it for two years. I got to turn up settings, and it was exciting.

This card was the best dual GPU to date on many levels IMO: performance, power consumption, and temperature. Great all around, and mostly because it is 256-bit. The GTX 690 changed NVIDIA reference cards as we knew them, with the new shroud styling that is standard today. I'm going to miss this card as much as I was fond of my 580 back in the day and my still-current 780 Ti ACX. The 690 was still easily holding its own on my 2nd rig, driving a 1080p 120 Hz monitor.

Great ride, but my hobby forces me to sell 'em. It helps pay so I can play with new toys. I tried, but couldn't find a way to keep it as a spare.


----------



## V3teran

Oh yes, Arizonian. I too will miss my 690 when I part with it. Mine is still able to keep up with today's modern cards. It really is one of the legendary cards in Nvidia's line-up. When the new Titans arrive I will be parting with it, which is a shame.


----------



## Barefooter

Well I just bought a pair of these last July used for $1200. My first dual GPUs! I think they will hold me over until I do a new build next spring. I'm still on a 27" 1080 monitor now and the two 690s tear it up! I plan to have triple Asus Swifts with the new build next year and I don't think the 690s will have enough memory to handle that. Until then...


----------



## Buzzkill

EVGA UEFI Bios for 690GTX (.exe installer)

2690UEFI.zip 950k .zip file


YOU DO THIS AT YOUR OWN RISK




UEFI BIOS with EVGA HydroCopper 993MHz & 1502MHz
Extracted with nvflash & Set with KeplerBiosTweaker v1.27 / For use with Waterblock



UEFI-EVGA-HydroCopper-9931502-settings.zip 243k .zip file


----------



## 86JR

Will the GTX 690 run GTA V when it's out?

I don't use my 690 at all anymore; total waste of $$$, but I've been keeping it for GTA V, which is out in January...

thx


----------



## indiyet

Hi guys!
I would like to be added to the members list. I recently acquired an EVGA GTX 690 for a good price; this card is so beautiful!
There are very few in my country, so I consider myself a lucky guy.
Soon I will put on a full-cover waterblock; the XSPC Razor is ideal for my taste ... and a custom backplate of course, since the one manufactured by EVGA is no longer available.
Greetings from Argentina.


----------



## timerwin63

Quote:


> Originally Posted by *indiyet*
> 
> Hi guys!
> I would like to be added to the members list. I recently acquired an EVGA GTX 690 for a good price; this card is so beautiful!
> There are very few in my country, so I consider myself a lucky guy.
> Soon I will put on a full-cover waterblock; the XSPC Razor is ideal for my taste ... and a custom backplate of course, since the one manufactured by EVGA is no longer available.
> Greetings from Argentina.
> 
> 
> Spoiler: Pictures


Mind taking a couple pictures of the backplate? I'd love to see what you (or the previous owner) did. Or am I misunderstanding the phrase "custom" right now?


----------



## indiyet

Quote:


> Originally Posted by *timerwin63*
> 
> Mind taking a couple pictures of the backplate? I'd love to see what you (or the previous owner) did. Or am I misunderstanding the phrase "custom" right now?


Hi timerwin63, I don't have the backplate yet; Coldzero is closed until January... so sad. The backplate will probably be something like this:


----------



## V3teran

Quote:


> Originally Posted by *indiyet*
> 
> Hi guys!
> I would like to be added to the members list. I recently acquired an EVGA GTX 690 for a good price; this card is so beautiful!
> There are very few in my country, so I consider myself a lucky guy.
> Soon I will put on a full-cover waterblock; the XSPC Razor is ideal for my taste ... and a custom backplate of course, since the one manufactured by EVGA is no longer available.
> Greetings from Argentina.


The OP stopped updating this thread over 12 months ago.


----------



## indiyet

Quote:


> Originally Posted by *timerwin63*
> 
> Mind taking a couple pictures of the backplate? I'd love to see what you (or the previous owner) did. Or am I misunderstanding the phrase "custom" right now?


OK V3teran, I was not aware of that.
Can you please tell me what improvement I would get if I used this BIOS:
http://www.mediafire.com/download/b1b5ihr97cfhec6/690+Bios.rar
Thank you.


----------



## timerwin63

Quote:


> Originally Posted by *indiyet*
> 
> OK V3teran, I was not aware of that.
> Can you please tell me what improvement I would get if I used this BIOS:
> http://www.mediafire.com/download/b1b5ihr97cfhec6/690+Bios.rar
> Thank you.


You may have quoted the wrong person there, friend. Just thought I'd let you know.







@V3teran, he's talking to you.


----------



## indiyet

Quote:


> Originally Posted by *timerwin63*
> 
> You may have quoted the wrong person there, friend. Just thought I'd let you know.


Haha, yeah, you are right.


----------



## Sugi

Man, Shadow of Mordor is amazing on PC. I haven't found any issues with the game, except that I accidentally climbed inside a wall once, and one other time the camera got stuck in a horrible position while I was fighting an enormous horde. The controls and the performance are good. Triple 1080p produces 60 fps or close to it with Widescreen Fix plus a config edit. I can't recommend this game enough.


----------



## Buzzkill

Quote:


> Originally Posted by *timerwin63*
> 
> You may have quoted the wrong person there, friend. Just thought I'd let you know.
> 
> 
> 
> 
> 
> 
> 
> @V3teran, he's talking to you.


Here is Veteran's BIOS unlocking Guide

690 GTX Club/Overclocking/Benchmarks


----------



## indiyet

Quote:


> Originally Posted by *Buzzkill*
> 
> Here is Veteran's BIOS unlocking Guide
> 
> 690 GTX Club/Overclocking/Benchmarks


----------



## ocsi1970

With an unlocked BIOS on the 690 and an overclock, how long will my card last? 1-2 years?


----------



## V3teran

Thanks guys!
I have been running mine at 1372 MHz for a while now, with a volt mod, on water, etc. Even before this I was running at 1202 MHz on air.
The card should last years and years, tbh.


----------



## ocsi1970

I have custom water cooling, but my power supply is only a 750 W Thermaltake, and my CPU is a stock 3770K. My question is whether my power supply can handle it? Thanks.


----------



## Asus11

Quote:


> Originally Posted by *V3teran*
> 
> Thanks guys!
> I have been running mine at 1372 MHz for a while now, with a volt mod, on water, etc. Even before this I was running at 1202 MHz on air.
> The card should last years and years, tbh.


Just picked up a GTX 690 from Amazon... will be delivered tomorrow.

Can't wait to get modding; it should be great for my mITX build too.

I would be happy with 1202 MHz on air, as waterblocks are hard to find for them now.

I've had 780s, Titans, and recently 970s, but I think the 900 series is cheaply made IMHO, and I sent them back due to too many issues. Thought I'd try a GTX 690 until the new Titan comes out :-D

Hope to see some nice results.


----------



## indiyet

Quote:


> Originally Posted by *ocsi1970*
> 
> I have custom water cooling, but my power supply is only a 750 W Thermaltake, and my CPU is a stock 3770K. My question is whether my power supply can handle it? Thanks.


Your power supply is fine. 850 W would be ideal if you OC, but everything at stock clocks runs just fine.
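To put rough numbers on that, you can sanity-check PSU headroom by adding up worst-case component draw; a sketch with assumed figures (NVIDIA rates the 690's board power at 300 W and Intel rates a stock 3770K at 77 W; the 100 W allowance for the rest of the system is a generous guess):

```python
# Rough PSU headroom check (assumed worst-case figures, not measurements).
GTX_690_TDP_W = 300      # NVIDIA's rated board power for the GTX 690
CPU_3770K_W = 77         # Intel's rated TDP for a stock i7-3770K
REST_OF_SYSTEM_W = 100   # board, RAM, drives, fans, pump: generous allowance

def headroom(psu_watts: int) -> int:
    """Return spare PSU capacity in watts for the system above."""
    return psu_watts - (GTX_690_TDP_W + CPU_3770K_W + REST_OF_SYSTEM_W)

print(headroom(750))  # 273 W spare at stock: a 750 W unit is comfortable
print(headroom(850))  # 373 W spare: extra margin for overclocking
```

With an overclock and overvolt the 690's draw climbs well past its rated TDP, which is why the extra margin of an 850 W unit is nice to have.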


----------



## indiyet

Quote:


> Originally Posted by *Asus11*
> 
> Just picked up a GTX 690 from Amazon... will be delivered tomorrow.
> 
> Can't wait to get modding; it should be great for my mITX build too.
> 
> I would be happy with 1202 MHz on air, as waterblocks are hard to find for them now.
> 
> I've had 780s, Titans, and recently 970s, but I think the 900 series is cheaply made IMHO, and I sent them back due to too many issues. Thought I'd try a GTX 690 until the new Titan comes out :-D
> 
> Hope to see some nice results.


Great! upload some pics when the card arrives.


----------



## DNytAftr

Hey guys









So I've been considering a 690 rather than a 970 as a replacement for my 670; they're both at the same price point... but I'm just looking for some thoughts.

I've always loved the 690, but the better part of me says that the 4 GB of VRAM might be the better buy (although I believe the 690 has performance closer to a 980?). Not really sure which way to go.


----------



## Barefooter

Quote:


> Originally Posted by *DNytAftr*
> 
> Hey guys
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So I've been considering a 690 rather than a 970 as a replacement for my 670; they're both at the same price point... but I'm just looking for some thoughts.
> 
> I've always loved the 690, but the better part of me says that the 4 GB of VRAM might be the better buy (although I believe the 690 has performance closer to a 980?). Not really sure which way to go.


The GTX 690 will be fine for the monitor in your sig. If you plan to upgrade to a higher-res monitor or a 4K monitor in the near future, I'd go with the GTX 970 with the 4 GB of VRAM.


----------



## Asus11

delivered @ 9:30pm.. lol

here it is


----------



## Sugi

Quote:


> Originally Posted by *Asus11*
> 
> delivered @ 9:30pm.. lol
> 
> here it is


Where did you get it from? Used or new? How much? I kind of want to pick up a second one... maybe.


----------



## indiyet

Quote:


> Originally Posted by *Asus11*
> 
> delivered @ 9:30pm.. lol
> 
> here it is


----------



## Asus11

Quote:


> Originally Posted by *Sugi*
> 
> Where did you get it from? Used or new? How much? I kind of want to pick up a second one... maybe.


I got it from Amazon Warehouse Deals. It looks brand new, although it's got a used box; I presume it was sent from MSI as a replacement. Nevertheless, I'm very surprised by how well these perform at stock; I can only imagine what damage they can do when overclocked/overvolted. The only gripe I have is that the 2 GB of VRAM fills up too quickly at 1440p, but changing to High kind of resolves it. Those are my initial thoughts; if this card had 3 GB it would be perfect. I've only tested BF4 so far; I'll test others in the next few days and report back.


----------



## Rei86

Quote:


> Originally Posted by *Arizonian*
> 
> Update: Well, since the day this was released I've been a proud owner of my first dual GPU. See 2nd post.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Today I finally sold her for $450 locally. I used the money to buy two 970's, which cost me $290 in the end. I can't complain: for $550 I got to play with it for two years. I got to turn up settings, and it was exciting.
> 
> This card was the best dual GPU to date on many levels IMO: performance, power consumption, and temperature. Great all around, and mostly because it is 256-bit. The GTX 690 changed NVIDIA reference cards as we knew them, with the new shroud styling that is standard today. I'm going to miss this card as much as I was fond of my 580 back in the day and my still-current 780 Ti ACX. The 690 was still easily holding its own on my 2nd rig, driving a 1080p 120 Hz monitor.
> 
> Great ride, but my hobby forces me to sell 'em. It helps pay so I can play with new toys. I tried, but couldn't find a way to keep it as a spare.


Made me cry too. Damn, was this the first card with the aluminum-built heatsink shroud, and God, was it a thing of beauty. I thought it was a waste to put it in the case and lock it away. Sooner or later I'm going to buy another 690 just for show. This was Nvidia's crowning achievement when it came to dual GPUs on a single PCB; the 295X2 and the Titan Z don't compare. Yes, power-wise they're better, but in terms of the full design the 690 is better.


----------



## Asus11

This card rocks... it's doing 20%+ better than my GTX 780 clocked at 1293 MHz, and the GTX 690 is at stock.

It's got the GPU horsepower, but I'm getting short stutters in games, which I suspect is VRAM starvation.

Having HWiNFO open while gaming, it's definitely hitting the 2 GB of VRAM.

I don't know what to do now...


----------



## Rei86

Quote:


> Originally Posted by *Asus11*
> 
> This card rocks... it's doing 20%+ better than my GTX 780 clocked at 1293 MHz, and the GTX 690 is at stock.
> 
> It's got the GPU horsepower, but I'm getting short stutters in games, which I suspect is VRAM starvation.
> 
> Having HWiNFO open while gaming, it's definitely hitting the 2 GB of VRAM.
> 
> I don't know what to do now...


What resolution are you playing at? Around late 2012 / start of 2013 I had to ditch mine when I went 1440p. Games like Sleeping Dogs with the HD texture pack were taxing on the whole card IMO.

Kind of funny that I went with two 4 GB GTX 680 Classifieds, knowing that at 1440p VRAM was going to be an issue (though not before the GK104 runs out of steam), and then purchased a GTX 980 with the same 4 GB of VRAM. At least this time around the GPU will run out of steam before VRAM becomes a major issue at 1440p.

Might want to think about going with GTX 970s or an R9 290.


----------



## Asus11

Quote:


> Originally Posted by *Rei86*
> 
> What resolution are you playing at? Around late 2012 / start of 2013 I had to ditch mine when I went 1440p. Games like Sleeping Dogs with the HD texture pack were taxing on the whole card IMO.
> 
> Kind of funny that I went with two 4 GB GTX 680 Classifieds, knowing that at 1440p VRAM was going to be an issue (though not before the GK104 runs out of steam), and then purchased a GTX 980 with the same 4 GB of VRAM. At least this time around the GPU will run out of steam before VRAM becomes a major issue at 1440p.
> 
> Might want to think about going with GTX 970s or an R9 290.


I've had 780s, Titans, 970s; I wanted to go back to the good stuff. I thought the VRAM issue was a myth, as I got by just fine on 3 GB 780s, but it seems 2 GB is just not enough for 1440p.
I'm still going to try to sell myself on it by experimenting with what makes it use less VRAM, as I think it's a beast of its kind.


----------



## wrath04

Nice Card! She's beautiful.


----------



## Sugi

For me, running at 1080p x3, I do not have any issues with my card. I do get some stuttering, but it does not hinder my gameplay. However, I do turn down my AA to keep my FPS high. The only big issue right now is Minecraft, but that's present even with one monitor; that's related to how the game was made, not the hardware.


----------



## V3teran

This 690 is a king at 1080p. 1440p with custom AA kills the VRAM. It's the only reason I'm upgrading mine; performance-wise it's still up there at the top. Well, mine is, anyhow.


----------



## Alex132

Quote:


> Originally Posted by *V3teran*
> 
> This 690 is a king at 1080p. 1440p with custom AA kills the VRAM. It's the only reason I'm upgrading mine; performance-wise it's still up there at the top. Well, mine is, anyhow.


It makes me really sad. I wanna trade in my 690 for a GM210 when it comes out, purely because of VRAM.

Nvidia! It was a $1000 card! Why didn't you give us 4 GB of VRAM?


----------



## indiyet

Quote:


> Originally Posted by *Asus11*
> 
> The only gripe I have is that the 2 GB of VRAM fills up too quickly at 1440p.


agree
Quote:


> Originally Posted by *V3teran*
> 
> This 690 is a king at 1080p. 1440p with custom AA kills the VRAM. It's the only reason I'm upgrading mine; performance-wise it's still up there at the top. Well, mine is, anyhow.


Totally agree; I bought this card for just that reason. At 1080p it's more than enough.


----------



## indiyet

Quote:


> Originally Posted by *Alex132*
> 
> It makes me really sad. I wanna trade in my 690 for a GM210 when it comes out, purely because of VRAM.
> 
> Nvidia! It was a $1000 card! Why didn't you give us 4 GB of VRAM?


You are right, but what can we do? Nothing, my friend.


----------



## Alex132

Quote:


> Originally Posted by *indiyet*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> It makes me really sad. I wanna trade in my 690 for a GM210 when it comes out, purely because of VRAM.
> 
> Nvidia! It was a $1000 card! Why didn't you give us 4 GB of VRAM?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You are right, but what can we do? Nothing, my friend.
Click to expand...

Nothing but sell and move on.

Please let the GM210 not be awful!


----------



## Sugi

What's everyone's temps at? I was testing Ethan Carter and I was riding at 71°C at stock. That seems high, no? I would like to shoot for 1202 MHz on air. ;D


----------



## indiyet

Quote:


> Originally Posted by *Alex132*
> 
> Nothing but sell and move on.
> 
> Please let the GM210 not be awful!


Easy for you maybe, not for me.


----------



## indiyet

Quote:


> Originally Posted by *Sugi*
> 
> What's everyone's temps at? I was testing Ethan Carter and I was riding at 71°C at stock. That seems high, no? I would like to shoot for 1202 MHz on air. ;D


31/38°C idle; 84/89°C in BF4 at Ultra settings.


----------



## Alex132

Just noticed you have a 2500K + 690 too! Yay.

But my temps are much worse: ~32/33°C idle, and between 77/78°C and 97/99°C under load (the higher numbers are in games that are very intensive).

The ambient temperature of my room is 23-26°C (yay, South African summer). When it gets up to 35-40°C room temp, I simply don't even bother with gaming, so I wouldn't know the temps, haha.


----------



## indiyet

Quote:


> Originally Posted by *Alex132*
> 
> Just noticed you have a 2500K + 690 too! Yay.
> 
> But my temps are much worse: ~32/33°C idle, and between 77/78°C and 97/99°C under load (the higher numbers are in games that are very intensive).
> 
> The ambient temperature of my room is 23-26°C (yay, South African summer). When it gets up to 35-40°C room temp, I simply don't even bother with gaming, so I wouldn't know the temps, haha.


Haha, yeah. Try changing the thermal paste; the airflow of your case is also "very important"... that helps a lot.
If that doesn't work, just put on a full-cover water block.


----------



## Alex132

Quote:


> Originally Posted by *indiyet*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Just noticed you have a 2500K + 690 too! Yay.
> 
> But my temps are much worse: ~32/33°C idle, and between 77/78°C and 97/99°C under load (the higher numbers are in games that are very intensive).
> 
> The ambient temperature of my room is 23-26°C (yay, South African summer). When it gets up to 35-40°C room temp, I simply don't even bother with gaming, so I wouldn't know the temps, haha.
> 
> 
> 
> Haha, yeah. Try changing the thermal paste; the airflow of your case is also "very important"... that helps a lot.
> If that doesn't work, just put on a full-cover water block.
Click to expand...

Can't do either; some of the screws are stripped and I dare not take a Dremel to them. Not that I have one. And I'm not gonna buy one either...


----------



## indiyet

Quote:


> Originally Posted by *Alex132*
> 
> Can't do either; some of the screws are stripped and I dare not take a Dremel to them. Not that I have one. And I'm not gonna buy one either...


Here you can see how to remove the screws:
http://www.ekwb.com/shop/EK-IM/EK-IM-3831109856543.pdf
See step 2.


----------



## Alex132

Quote:


> Originally Posted by *indiyet*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Can't do either; some of the screws are stripped and I dare not take a Dremel to them. Not that I have one. And I'm not gonna buy one either...
> 
> 
> 
> Here you can see how to remove the screws:
> http://www.ekwb.com/shop/EK-IM/EK-IM-3831109856543.pdf
> See step 2.
Click to expand...

Mmmm, I know; it's just that 1 or 2 of them are stripped on the back and won't come off.


----------



## Sugi

Are both of you, Alex132 & indiyet, at stock, 1202 MHz, or higher or lower? What temps should we be at for the longevity of the card? I would like to finally crank up the speed. ;D


----------



## Alex132

Quote:


> Originally Posted by *Sugi*
> 
> Are both of you, Alex132 & indiyet, at stock, 1202 MHz, or higher or lower? What temps should we be at for the longevity of the card? I would like to finally crank up the speed. ;D


My temps are with a 1250 MHz core overclock.


----------



## indiyet

Quote:


> Originally Posted by *Sugi*
> 
> Are both of you, Alex132 & indiyet, at stock, 1202 MHz, or higher or lower? What temps should we be at for the longevity of the card? I would like to finally crank up the speed. ;D


At stock for now, until I get a full-cover block for her.


----------



## Sugi

Alex, are you on air right now? And indiyet let us know how OCing works for you with those fancy blocks.


----------



## Alex132

Quote:


> Originally Posted by *Sugi*
> 
> Alex, are you on air right now? And indiyet let us know how OCing works for you with those fancy blocks.


Yep, on air, but I keep the side panel off of my case 24/7 because the 800D has awful airflow.

I don't overclock anymore because I don't like those very high temps.


----------



## indiyet

Quote:


> Originally Posted by *Sugi*
> 
> Alex, are you on air right now? And indiyet let us know how OCing works for you with those fancy blocks.


No problem; when he does get it and install it, I'll post the results here.


----------



## Alex132

What are people's general OCs on stock cooling?

I reached my max of +130 core / +65 mem. That seems really poor to me... My temps never exceeded 75°C while testing with 3DMark.

I have seen people overclock their memory way past +65, more like +200... Maybe I just got a dud of a 690.


----------



## xHoLy

hi guys

I just played Wolfenstein: The New Order for the first time and SLI doesn't work. Is there a way to force SLI? I have to turn down settings for stable FPS, but then it looks bad.


----------



## Buzzkill

Quote:


> Originally Posted by *xHoLy*
> 
> hi guys
> 
> I just played Wolfenstein: The New Order for the first time and SLI doesn't work. Is there a way to force SLI? I have to turn down settings for stable FPS, but then it looks bad.


Do you use Nvidia Inspector? If not, try it. You can set the SLI bits, and it has Wolfenstein settings.


----------



## The Real Deal

Dear fellas,
I'm a future owner of a GTX 690. Considering how long ago this product launched, is this model still reliable?

And what could I expect in terms of overclocking?

Thanks a lot.


----------



## Barefooter

I decided to pull the two GTX 690s out of my rig last weekend. I'm going to sell them and put in a couple of GTX 980 Tis when they become available. I'm just going to run a pair of 7950s I had layin' around until the 980 Ti cards arrive.


----------



## Matze GER

Here I am with my Asus GTX 690.

I have one problem: I want to refresh the thermal pads, but I can't find any info on Google about which size I need.
Can anyone help me, please?

Regards from Hamburg, Germany


----------



## MrTOOSHORT

Quote:


> Originally Posted by *Matze GER*
> 
> 
> 
> Here I am with my Asus GTX 690.
> 
> I have one problem: I want to refresh the thermal pads, but I can't find any info on Google about which size I need.
> Can anyone help me, please?
> 
> Regards from Hamburg, Germany


On the stock cooler, 1mm thermal pads for the memory chips and just thermal interface material for VRMs.

*http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/3150*


----------



## ocsi1970

Hi guys! I have a Samsung U28D590D 4K monitor, and almost every time at the Windows login the screen turns black. I reported it to Nvidia (https://forums.geforce.com/default/topic/777412/geforce-drivers/black-screen-on-windows-login-344-16-and-11-on-displayport-with-gtx980-sc-fix-info-posted-11-25-/) but they gave me no answer. Does anyone else have this problem? Thanks!


----------



## 86JR

Anyone play arma 3?

Having FPS issues in cities in Arma 3 Wasteland Chernarus. It normally hovers around 30 fps but drops down to single figures in cities.

i5 2500k 4.8ghz
16gb ddr3
corsair 850
gtx 690
ssd
etc
1080p res

I run a 4 km view distance but have everything else turned down, and it doesn't make much difference. I have V-sync on; wondering if this is the cause?


----------



## jesusnadinosaur

So I just bought my second 690, for quad SLI, plus a waterblock, for under $300... Did I make the right move?


----------



## shaneduce

Quote:


> Originally Posted by *jesusnadinosaur*
> 
> So I just bought my second 690, for quad SLI, plus a waterblock, for under $300... Did I make the right move?


Yes.


----------



## jesusnadinosaur

Quote:


> Originally Posted by *shaneduce*
> 
> Yes.


Care to elaborate, or just "yes"? lol. I should mention that these will be watercooled and I'll be able to overclock the balls off of them!


----------



## brootalperry

Quote:


> Originally Posted by *86JR*
> 
> Anyone play arma 3?
> 
> Having FPS issues in cities in Arma 3 Wasteland Chernarus. It normally hovers around 30 fps but drops down to single figures in cities.
> 
> i5 2500k 4.8ghz
> 16gb ddr3
> corsair 850
> gtx 690
> ssd
> etc
> 1080p res
> 
> I run a 4 km view distance but have everything else turned down, and it doesn't make much difference. I have V-sync on; wondering if this is the cause?


You need to turn down the view distance, not everything else.


----------



## jesusnadinosaur

Today my second matching nickel EK waterblock arrived for my second card. This is what two 690s look like in a Corsair 540!

They almost didn't fit... now I just need to get my custom sleeved cables ordered and I can call this one done for now...


----------



## Mbbx

I sold my two 690s a while back. But I wonder how they will work in SLI with DX12!


----------



## Alex132

Quote:


> Originally Posted by *Mbbx*
> 
> I sold my two 690s a while back. But I wonder how they will work in SLI with DX12!


They won't?


----------



## timerwin63

Quote:


> Originally Posted by *Alex132*
> 
> They won't?


I think he's wondering about 690s in general, not _his_ 690s.


----------



## jesusnadinosaur

Quote:


> Originally Posted by *Mbbx*
> 
> I sold my two 690s a while back. But I wonder how they will work in SLI with DX12!


I too have been wondering this. Two 690s in SLI was a complete bust for me; I wasted over 500 bucks to get worse FPS... hopefully DX12 fixes it, if I don't sell one of the cards first. I might just sell my 970 and put a 690 in each one of my rigs, as the 690 performs a hell of a lot better than the 970 anyway. Still really sad though; I had high hopes for it. It was a pain to install in my 540, and the performance is terrible compared to the single card...


----------



## timerwin63

You have to remember that the option to stack VRAM is just that: an option. It has to be written in by devs, and with how lazy they've been recently, I doubt it will be in a ton of titles if it's any work in the long run.


----------



## jesusnadinosaur

Quote:


> Originally Posted by *timerwin63*
> 
> You have to remember that the option to stack VRAM is just that: an option. It has to be written in by devs, and with how lazy they've been recently, I doubt it will be in a ton of titles if it's any work in the long run.


Yeah, I figured as much. But as long as BF4 and FC4 are on the list I'd be happy, if I even keep both cards that long. My first 690 holds a special place in my heart and I don't think I could ever sell it. I'm sure it will still be a good performer for years to come. I bought it for $300 BNIB a few months back, and it's got 9 years left on its ten-year warranty, so I plan to use it till it dies and then send it in... hoping they'll have a new dual-GPU card by then, and if they don't have any 690s to replace it with, maybe they'll send me a... 990, lol. Wishful thinking... I can dream, right?


----------



## PinzaC55

This problem may have been addressed before, but here goes! One of my 690s has an EK clear acrylic/nickel waterblock, and on removing it I found a buildup of gunge over both GPUs. I tried to remove the clear plate, but as luck would have it the very last screw stuck, and the Allen key hole is now too rounded to get the plate off. Is there any solution I could put into the block to remove the gunge chemically?


----------



## tehort

Is it dangerous to set the default TDP to 250000 (250 W) in the BIOS via KBT?
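For reference, the Kepler BIOS power-target fields are stored in milliwatts (the post above already notes that 250000 = 250 W). A quick sketch of the bookkeeping, comparing against the 690's 300 W rated board power (that figure is NVIDIA's spec; the function name is illustrative):

```python
# Kepler BIOS power-target fields are in milliwatts (250000 mW = 250 W).
STOCK_TDP_W = 300  # GTX 690 rated board power (NVIDIA spec)

def mw_to_w(value_mw: int) -> float:
    """Convert a BIOS power-target value from milliwatts to watts."""
    return value_mw / 1000

new_tdp = mw_to_w(250_000)
print(new_tdp)                # 250.0
print(new_tdp < STOCK_TDP_W)  # True: 250 W is actually below stock TDP
```

So a 250 W default target is, if anything, conservative for this card.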


----------



## V3teran

Quote:


> Originally Posted by *MrTOOSHORT*
> 
> On the stock cooler, 1mm thermal pads for the memory chips and just thermal interface material for VRMs.
> 
> *http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/3150*


It depends on the block. With my 690 I had the Koolance block, and although I used 1 mm pads they were too thick.
The card had good temps on one end and poor temps on the other. I went with 0.5 mm pads and it fixed the issue.

Koolance themselves recommend 0.7 mm.


----------



## V3teran

Quote:


> Originally Posted by *tehort*
> 
> Is it dangerous to set the default TDP to 250000 (250 W) in the BIOS via KBT?


I had my 690 running way over that, at 1372 MHz, fully stable.
Here is a thread and guide I made long ago on how to achieve this:
http://forums.guru3d.com/showthread.php?t=362667

Running Arma 3 at 1372 MHz:


----------



## bobbavet

Are you sure it is a DisplayPort 1.2 cable?

I remember when I got my 1080p 120 Hz Sammy, the DP cable was hopeless and suffered these issues.

I got an Accell DP 1.2 cable and it was fixed.


----------



## xteddy

Hello everyone!

Newbie here, looking for some help, especially with the GTX 690. I've had some issues with it recently. First of all, sorry for the bad English.









1: I have a Dell GTX 690 (I believe it's a reference card). I had a bad flash, and I didn't back up the ROM file using nvflash (yes, I know, my bad).

2: I downloaded .rom files from TechPowerUp and flashed with nvflash, but I seem to be stuck with two different BIOS versions for GPU 1 and GPU 2 in GPU-Z:

*GPU 1 = Version: 80.04.1E.00.13*

*GPU 2 = Version: 80.04.1E.00.14*

3: I am wondering whether GPU 1 and GPU 2 should have identical BIOS versions or not.

4: I cannot re-flash GPU 2 using the .13 BIOS; I get a mismatch error:

WARNING: Firmware image Board ID (E103) does not match adapter Board ID (E102).
WARNING: Firmware image Hierarchy ID (Switch Port 8)
either does not match the Hierarchy ID of the adapter (Switch Port 16)
or does not match the Hierarchy Role of the adapter (Switch Port 16).

(Command used: nvflash -i2 -6 1.rom. There is no problem with GPU 1.)

5: I'd appreciate it if somebody could provide the stock vBIOS for the Dell GTX 690 here. Many thanks for helping and reading!


----------



## Alex132

One will be the master BIOS and one will be the slave BIOS. The master goes on GPU0 and the slave on GPU1 (generally). I don't have my 690 anymore so I cannot check, but if these are to be believed, then .13 is for the first GPU (the first GPU selection in GPU-Z) and .14 is for the 2nd GPU.

Note that it says g1 at the end of the BIOS, indicating it's for the first GPU, and g2 for the 2nd.


Spoiler: Warning: Spoiler!
















*tl;dr*

Use .13 for the first GPU, .14 for the 2nd GPU. Any 690 BIOS will be fine; they're all the same bar some details (vendor name, etc.) which really do not matter.
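If it helps keep the two files straight before flashing, the g1/g2 naming convention described above can be checked mechanically. This is a hypothetical helper, not an nvflash feature, and the filenames are illustrative:

```python
import re

def gpu_number_for(rom_name: str) -> int:
    """Map a 690 BIOS filename's g1/g2 suffix to its GPU number (1 or 2).

    Hypothetical helper: assumes the 'g1' = first GPU (master) and
    'g2' = second GPU (slave) naming convention described above.
    """
    m = re.search(r"g([12])\.rom$", rom_name, re.IGNORECASE)
    if m is None:
        raise ValueError(f"no g1/g2 suffix in {rom_name!r}")
    return int(m.group(1))

# Illustrative filenames; pair each ROM with the matching GPU before flashing.
for rom in ["690-master-g1.rom", "690-slave-g2.rom"]:
    print(rom, "-> GPU", gpu_number_for(rom))
```

Double-check the pairing against what GPU-Z reports before actually flashing; a BIOS on the wrong GPU is exactly the mismatch situation described earlier in the thread.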


----------



## xteddy

Quote:


> Originally Posted by *Alex132*
> 
> One will be the master BIOS and one will be the slave BIOS. Master goes on GPU0 - slave goes on GPU1 (GENERALLY). I don't have my 690 anymore so I cannot check. But if these are to be believed, then the .13 is for the first GPU. (First GPU selection in GPU-Z) and .14 is for the 2nd GPU.
> 
> Note that it says g1 at the end of the BIOS - indicating it's for the first GPU, and g2 for 2nd.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *tl;dr*
> 
> Use .13 for first GPU, .14 for 2nd GPU. Any 690 BIOS will be fine - they're all the same bar some details (vendor name, etc) which really do not matter.


Hi! Thank you for the quick reply. Based on your info, it seems I flashed it the correct way, except that the "BIOS Version" field in GPU-Z just shows *Version: 80.04.1E.00.13* instead of *Version: 80.04.1E.00.13 (P2000-00g1)*.

The weird thing is, the 690 shows artifacts and the NVIDIA driver crashes when I use the render test in GPU-Z. The crash only seems to happen once the card reaches 50-51°C. Could this have something to do with the BIOS on the card?

Thanks!


----------



## Alex132

Try running Driver Sweeper in safe mode to remove any previous AMD/NVIDIA drivers, then use CCleaner to clean up registry leftovers before installing new drivers.

http://www.guru3d.com/content-page/guru3d-driver-sweeper.html


----------



## xteddy

Quote:


> Originally Posted by *Alex132*
> 
> Try run driver sweeper in safe mode to remove any previous AMD/Nvidia drivers before and then use CCleaner to clear registry file mishaps before installing new drivers.
> 
> http://www.guru3d.com/content-page/guru3d-driver-sweeper.html


Thanks for the recommendation! Will try it later. So you think *software* is likely the issue here? Would it be possible for me to flash the BIOS provided on the first page? And what are the best NVIDIA drivers for the GTX 690 under Win 8.1?

Again, many thanks!


----------



## Alex132

Quote:


> Originally Posted by *xteddy*
> 
> Thanks for the recommendation! Will try it later. So you think *software* is likely the issue here? Possible for me to flash the bios provided at the first page?


Sounds like a software issue and/or unstable clocks. Seeing as it should be running at stock, it shouldn't be unstable. I don't see how flashing the BIOS would fix anything, to be honest.

Do the driver crashes happen in game / during a benchmark?
Quote:


> Originally Posted by *xteddy*
> 
> And what is the best nvidia drivers for gtx 690 under win8.1?


The latest drivers are always the best.


----------



## xteddy

Quote:


> Originally Posted by *Alex132*
> 
> Sound like it's some software issue and/or instable clocks. Seeing as how it should be running stock, it shouldn't be instable. I don't see how flashing the BIOS should fix anything to be honest.
> 
> Do the driver crashes happen in game / during a benchmark?


Yup! The NVIDIA driver crashes with some kernel-mode error, then recovers, or I have to reboot my PC. For now, I assume it's software related.

Thanks.


----------



## xteddy

Update :

I've attached the BIOS information stated on the back of the GPU. If anyone here has the stock/default BIOS for it, your help is highly appreciated!

Thanks.


Spoiler: Warning: Spoiler!


----------



## xteddy

Seems there's less chance of me getting it now.


----------



## xteddy

Quote:


> Originally Posted by *xteddy*
> 
> Less chance for me now to get it


*SOLVED!* Found the BIOS (the previous vBIOS shown in the image attached earlier!), re-flashed, and it's stable again! Yahoooooo!!!!


----------



## nickcnse

Anyone know what a BNIB GTX 690 would be worth nowadays?


----------



## wrath04

This one went for over $530 including shipping on eBay on June 6th: Here


----------



## nickcnse

Thank you Wrath04!


----------



## wrath04

Anytime bud, I still love that card.


----------



## nickcnse

To be honest, I really still love it as well. The only thing is, I think after this long that it's finally time for an upgrade.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *nickcnse*
> 
> To be honest, I really still love it as well. The only thing is, I think after this long that it's finally time for an upgrade.


The card's GPUs are still beastly; the usable VRAM of 2GB is just the letdown.


----------



## boxfetish

Quote:


> Originally Posted by *nickcnse*
> 
> Anyone know what a BNIB gtx 690 would be worth now days?


I have been pricing these things on eBay for months. I just got a GTX 690 with an EVGA warranty until next March for $260 including shipping. Perfect card. I have seen them selling used for anywhere from $225 to $325, and the end price seems to depend more on the day and time the auction ends than on the condition of the card or any extras included with it.

I have seen brand new, unopened ones sell for over $400 a couple of times. I think the $530 is probably a one-off. In my opinion it would probably sell for around $400, but sometimes people are willing to pay 2x as much for brand new even when the tech is nearing the end of its life. Put it up for sale, make sure to put things like "RARE!" or "EXCLUSIVE!" in your auction title, and you may get somebody to bite at $500. I wouldn't pay even $350 for a new one myself. Brand new doesn't add much value for me, as this card will likely only be usable for another year or so, and I will replace it once I can get a used 980 Ti for around $400.


----------



## wrath04

Yeah, I haven't seen one go that high since that one on June 6th, but IMO, if it weren't for the 2GB of VRAM, this card could be viable for many more years to come.

Personally, I hope to get one or two more years out of my 690 quad-SLI setup. We'll see if that's possible; so far I remain extremely impressed with it, even with the VRAM dragging it down slightly in some titles.

I just picked up a 980 for under $400 w/ shipping for an unfinished build the other day, it looks to be a beast too.

When I do eventually replace the 690's in my gaming rig, I think the replacement could end up being the 980Ti depending a lot on the deals I find at the time.


----------



## OldGeoGamer

Hi All,

Does the GTX 690 now really use all 4GB of its VRAM in Win 10???

One guy in EVGA forum wrote:

"Hey guys I have GTX 690 and recently I installed WIN 10 (DX12). And DX12 works just perfectly awesome on any DX11 titles. I used EVGA Precision to see the memory usage and (on dx11 win8.1 I used maximum 2 Gb) I have access to whole 4GB. Damn I'm so happy!!! Now my GTX 690 has 4GB, so as your dual gpu will get with new DX12. I repeat its not a myth its true and working perfectly fine with ANY DX11 titles. I played GTA V, BFH, BF4, Dying Light and the memory usage was about 3.5Gb. Good luck!"

http://forums.evga.com/GTX-690-memory-with-DirectX-12-m2300321.aspx

Is that true? Has anyone tested their 690 in DirectX 12? And did GPU-Z show 4GB of VRAM usage?

If you can, post screenshots please.

Thanks.


----------



## wrath04

I have read this thread too, I really hope this is true, this would really extend the life of this card.

Hopefully Win10 will be available in the next few days so I can see this for myself; I'm still waiting for the upgrade download to become available for my region.
Can't wait


----------



## nickcnse

I haven't really noticed a difference in BF4 with the gtx 690 on windows 10. It's always been a beast of a card and ran BF4 just fine.


----------



## ocsi1970

For me it always crashes after 5-6 minutes, but it works better on Windows 10 with Far Cry 4. Has this happened to anyone else? GPU-Z no longer shows only 2GB used; in Far Cry 4 and other games it shows 4GB used. I have AIDA64 installed, and the Logitech G15 LCD also shows it using 4GB. I don't know where the other 2GB comes from; I think from system memory.
Games run in 4K without the stuttering that made them jerky on Win 8.1.


----------



## ocsi1970




----------



## OldGeoGamer

Thanks guys.

But can somebody prove that the 690 uses all 4GB of VRAM in Win10?


----------



## ocsi1970

GPU-Z shows only 2GB in Win 10


----------



## OldGeoGamer

Thank you for your attention.

And can you post screenshots where EVGA Precision X shows memory usage in game, please?


----------



## ocsi1970

Those were made with MSI Afterburner, at 4K resolution, which is why the memory usage readout is so small.

I made them at full HD so you can see it better.


----------



## OldGeoGamer

Thank you so much again.

It's strange that EVGA Precision X shows 4192MB of VRAM usage per GPU while GPU-Z still shows 2GB of VRAM in total....

Maybe some other software will clarify the situation.

Let's wait and see what the other GTX 690 owners say


----------



## ocsi1970

I was surprised too, but in a good way


----------



## f16-r1

Hey, I've got a friend with a GTX 690 that won't work right. He has an old AMD card installed alongside the GTX 690, which is plugged into the second PCI-E slot. I cannot seem to find the latest BIOS for his card; I'm going to try to flash it, so some help would be great. The card he has is the EVGA GeForce GTX 690 (04G-P4-2690-KR).

Thank You.


----------



## OldGeoGamer

So, can anybody else prove that the GTX 690 uses all 4GB of VRAM in Win10?


----------



## zalbard

Quote:


> Originally Posted by *Ujaho*
> 
> It's strange why EVGA PrecisionX shows 4192Mb of vram usage per gpu while GPU-Z still shows 2Gb of vram in total....


There is 2GB of VRAM per GPU, so 4GB in total. SLI duplicates resources in memory so a game can effectively access only 2GB.

Windows 10 changes nothing in this respect.
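A tiny sketch of the arithmetic, for anyone still confused (illustrative only):

```python
# Alternate-frame rendering (AFR) SLI keeps a full copy of every texture and
# buffer on each GPU, so capacity does not add up across GPUs.
def usable_vram_mb(per_gpu_mb: int, num_gpus: int) -> tuple[int, int]:
    physical = per_gpu_mb * num_gpus   # what the spec sheet advertises
    usable = per_gpu_mb                # what a game can actually address
    return physical, usable

physical, usable = usable_vram_mb(2048, 2)  # GTX 690: 2 GB per GPU
print(physical, usable)  # 4096 2048
```

Monitoring tools that sum the per-GPU counters will happily report "4GB used", which is where the Win10 confusion comes from.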


----------



## OldGeoGamer

Quote:


> EVGA PrecisionX shows 4192Mb of vram usage per gpu


----------



## Vit3D

Guys, please help. I need the stock BIOS for the ASUS GTX 690, versions 80.04.4c.00.03 and 80.04.4c.00.06. I would be very grateful if you could post it here or send it to my email.


----------



## ocsi1970

Guys, please help. Where should I put the 0.5mm thermal pads and where the 1mm ones for the waterblock? Or should I just put 0.5mm everywhere?


----------



## RnRollie

Quote:


> Originally Posted by *ocsi1970*
> 
> Guys, please help.Where should I put 0.5 mm and where 1mm thermal pad for waterblock?Or put only 0.5 everywhere?


THAT should be documented and packed with the waterblock (or downloadable from the manufacturer)


----------



## ocsi1970

http://www.aquatuning.de/water-cooling/gpu-water-blocks/gpu-full-cover/14148/watercool-heatkiller-gpu-x-gtx-690-hole-edition?c=495
This is the waterblock, but it doesn't say anywhere which pads you have to use. I don't remember what was in the box, because I changed to 0.5mm.


----------



## RnRollie

Quote:


> Originally Posted by *ocsi1970*
> 
> http://www.aquatuning.de/water-cooling/gpu-water-blocks/gpu-full-cover/14148/watercool-heatkiller-gpu-x-gtx-690-hole-edition?c=495
> It is waterblock but does not say anywhere that you have to use,I do not remember who was in the box because I changed to 0.5mm.


http://watercool.de/sites/default/files/downloads/MA_GPU-X3_GTX_690_A5.pdf

Remember, these are Germans, so manuals do not come as translated from Mandarin to Korean to French to English


----------



## ocsi1970

Thanks very much!!!


----------



## nicholasewood

Can you take me off the NVIDIA GTX 690 Owners Club? I no longer own a 690.


----------



## HyeVltg3

Glad to see there is still some love for this card. Three years later and I'm so close to finally owning a 690; I settled for 670s in SLI, which just wasn't fun, as many games I played didn't even use SLI.

I'm about to purchase a 690 that comes with a waterblock and the original heatsink, but no thermal pads. I was planning to buy some from eBay, but they come in various sizes.
I saw this image posted a ton of pages back:
http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/3150#post_18942658

Can anyone tell me what those black stubs are? Is that a different pad, or is that part of the heatsink?
(I have yet to purchase the 690; the waterblock is installed and my PC doesn't have a watercooling loop, so I'm going to save the block and reinstall the heatsink.)


----------



## shiloh

Hello

Has anyone had success connecting a 1440p 120Hz/144Hz display to a GTX 690 via the mini DisplayPort? Please let me know what your experience was.

I just upgraded from a 1080p/120Hz monitor to a 1440p/144Hz panel and I am getting all sorts of issues; I'm not sure if it's the GTX 690 or the panel. On my previous panel I was using the DVI output; now I have to use the DisplayPort.

Issues I have seen so far:

- Sometimes (about 50% of the time) when I turn on my PC, it will not POST and will turn itself off after 20 seconds. Nothing displays on the screen.

- I don't see the BIOS screen anymore when I use my new display. I only get something displayed once the OS is booting.

- Finally, once in the OS, the highest refresh rate I see is 120Hz, and upon switching the display mode to 120Hz I get nothing, and Windows eventually rolls back to 60Hz.

I don't have another graphics card on hand to troubleshoot and isolate the root cause, so that's why I am reaching out to you. I'm looking for confirmation that the GTX 690 should work well at 1440p/120Hz/144Hz using the mini DisplayPort.

thanks


----------



## RnRollie

You need a high-quality mini-DP to DP cable or converter, none of that cheap eBay stuff.

I had all kinds of issues with a 4K display hooked into the 690 using the miniDP... some of it finally resolved by using a miniDP to DP cable for an APPLE system ........... I know, the shame, the horror..

BIOS not showing has three possible reasons:
- This (low) resolution is not supported by the monitor... nothing you can do about that, except maybe set up different resolution profiles ON the monitor itself, if it supports that.
- The monitor's "wake-up" and resolution-switch time is longer than the BIOS prompt timeout. Set your "Press F2 or DEL to enter BIOS" timeout to much, much longer.
- The MB has no way of hinting to the GPU that it should display the BIOS screen through DP... or the BIOS and GPU are simply not playing well together. This is manufacturer dependent; if setting GPU PCI First and disabling Spread Spectrum in the BIOS does not help, then... well... bad luck.

Note: POST issues can be related to this... it takes too long for the monitor to respond and/or for the GPU to recognize that a monitor is attached. So the MB thinks there is no display attached at all, and shuts down. It's similar behaviour to when there is no keyboard attached. Depending on the MB and BIOS, you might be able to stop it searching for a display, or to change the timeout.

Refresh rate:
The 690 should have no problems at all running 120/144Hz. I had an ASUS "3D" [email protected] hooked to a 690 for a long time without issues, but... drivers!
Use a driver cleaner to make sure you have a really clean system before installing the latest NVIDIA GPU drivers.
Also, check if there are new BIOS updates for your MB and GPU on the manufacturer's site.
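On the bandwidth side, a quick sanity check shows 1440p/144Hz should fit inside DisplayPort 1.2, so a good cable genuinely is the deciding factor. Rough numbers only: this assumes 24-bit colour, roughly 10% blanking overhead, and HBR2's ~17.28 Gbit/s of usable throughput after 8b/10b encoding.

```python
def required_gbps(width, height, refresh_hz, bpp=24, blanking_overhead=1.10):
    # Pixel rate times bits per pixel, padded for blanking intervals.
    return width * height * refresh_hz * bpp * blanking_overhead / 1e9

DP12_HBR2_GBPS = 17.28  # usable data rate of DP 1.2 after 8b/10b encoding

need = required_gbps(2560, 1440, 144)
print(round(need, 2), need < DP12_HBR2_GBPS)  # 14.01 True
```

So the link itself has headroom at 1440p/144; marginal cables and converters are usually what push it over the edge.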


----------



## imago10

Hi! I have a small question: could someone help me and post the original ROMs for the MSI 690 (the BIOS for both GPUs)?


----------



## mrteddy

Hi all!

Quick question: I just bought a second GTX 690 with a waterblock attached, plus the stock cooler.

The stock cooler didn't come with the original screws, though.

Can someone please tell me what size/length/number of screws I'll need to use the stock cooler? I can't find much on the web.

thanks!


----------



## V3teran

This may be of some help to someone; I wrote this guide while volt-modding and overclocking the 690.
http://forums.guru3d.com/showthread.php?t=362667

690 running at 1372MHz in Arma 3, fully stable


----------



## wanderman

Maybe I'm doing something wrong.

Is the Skynet BIOS only for the 2nd GPU? What about the first GPU?

Since the card has 2 GPUs, how do I properly use the Skynet BIOS? I've read a lot of pages and didn't find a proper answer.

Like, with Skynet revA, do I flash only the 2nd GPU, and that's it?

Shouldn't I flash both with the Skynet BIOS?


----------



## AllGamer

Hey guys, it's been a long while.

I still have the two GTX 690s, but they will soon be replaced by a GTX 1080.

That's not why I came back here, though.

I came here to share some odd but interesting findings.

For the longest time I'd been trying to get the most out of the quad-SLI GTX 690s and was constantly getting poor performance.

Now that I'm ready to switch to the new card, I ran some tests with a single GTX 690 and was consistently getting WAAAAAAAAAY better performance than when I was running SLI. That totally blew my mind.









I was running a bunch of the games I currently play to record the FPS I get with my current setup, so I can compare against the new GTX 1080 when it arrives, and that's when I realized I should have gone single GTX 690 many years back to get full performance, instead of dual GTX 690.

I always used the latest drivers,
clean install every time,
and the motherboard has full dual PCIe x16 slots running at x16 (not x8 mode, not x4 mode).

Technically speaking there was nothing preventing the quad-SLI GTX 690s from dominating games, but nope, in reality the SLI setup was doing worse than a single GTX 690... who would have thought?


----------



## reeven

I had a 290 and a 280X (2 pcs). My best friend gave me a GTX 690 for free (I put it in, in place of the 280X).
Well, this baby boots slowly: it takes 40-50 seconds from the moment I click the power button until I see the Win10 desktop.
With my 280X it was about 28 seconds or so.
Samsung SSD.

I updated the 690 BIOS with the UEFI tool; same slow boot.


----------



## Wildey1771

Okay, so this thread doesn't seem to be as active as it used to be. Quite sad. Anyway,

A few weeks ago I bought two 690s on eBay. Yep, that's right: quad SLI.

I've done my playing with them now. Temps are nice, and frames are decent. The OC is quite nice, if I do say so myself.










And it's stable.

http://www.3dmark.com/3dm/13992962

http://www.ozone3d.net/gpudb/score.php?which=291900

I did run Heaven also but I was too impatient to finish it.

And 3DMark 11, set to Extreme, whatever that is:



If anyone wants any more tests from these, let me know, and if anyone wants my BIOSes, let me know too and I'll send them over.

Adios.


----------



## xartic1

Quote:


> Originally Posted by *Wildey1771*
> 
> Okay, So This thread doesnt seem to be as active as it used to be. Quite sad. Anyway,
> 
> Few weeks ago I bought 2 690's on ebay. Yep that's right, Quad SLI.
> 
> I've done my playing with them now. Temps are nice, and frames are decent. The OC is quite nice if I say so myself.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And Its Stable.
> 
> http://www.3dmark.com/3dm/13992962
> 
> http://www.ozone3d.net/gpudb/score.php?which=291900
> 
> I did run Heaven also but I was too impatient to finish it.
> 
> And 3D Mark 11 - Set to extreme whatever that is -
> 
> 
> 
> If anyone wants any more tests from these lemme know, and if anyone wants my bios's lemme know too and ill let em go.
> 
> Adios.


Are you able to test any DirectX 12 titles to show whether memory stacking works and/or is beneficial for the 690?

I have a 1080 for 4K, but grabbing a 690 at a very good price would make a great card for 1080p on an Ivy Bridge build.


----------



## sutty86

Having issues with a GTX 690 using the latest driver on an H81 mITX mobo.


----------



## sutty86

Any help?


----------



## Wildey1771

Quote:


> Originally Posted by *sutty86*
> 
> Any help


You need to force-flash a stock BIOS, one that's actually for your card, and then reinstall the drivers.

That's happened to me a few times.

https://www.techpowerup.com/vgabios/127971/EVGA.GTX690.2048.120430.rom

That is the correct file for your GPU.


----------



## sutty86

But it will still only find one GPU (GPU1). So how do I flash GPU0, the one which isn't shown?

If it helps: during benchmarks the right exhaust is blowing out hot air, while the left exhaust (as you're looking at the fan) is blowing out cold air.


----------



## sutty86

Quote:


> Originally Posted by *Wildey1771*
> 
> You need to flash a stock bios forcefully. One thats actually for your card and then reinstall the drivers.
> 
> Thats happened to me a few times.
> 
> https://www.techpowerup.com/vgabios/127971/EVGA.GTX690.2048.120430.rom
> 
> That is the correct file for your GPU.


It also looks identical to the current ROM, going by GPU-Z.


----------



## Wildey1771

Quote:


> Originally Posted by *sutty86*
> 
> Also looks identical to current rom check gpuz


I will send you a PM


----------



## Buzzkill

EVGA UEFI Bios for 690GTX (.exe installer)

2690UEFI.zip 950k .zip file


Please keep in mind that once the card is flashed it will only work on a fully supported UEFI motherboard. If you have any further questions please contact Tech Support 24/7 888-881-3842

Regards
EVGA





http://www.overclock.net/t/1249960/official-nvidia-gtx-690-owners-club/5420#post_23184852


----------



## sutty86

Quote:


> Originally Posted by *Buzzkill*
> 
> EVGA UEFI Bios for 690GTX (.exe installer)
> 
> 2690UEFI.zip 950k .zip file
> 
> 
> Please keep in mind that once the card is flashed it will only work on a fully supported UEFI motherboard. If you have any further questions please contact Tech Support 24/7 888-881-3842
> 
> Regards
> EVGA


So I flashed my OS to UEFI now, but it will not let me download your link, due to privileges I think. Can you mirror it or PM me? Cheers.


----------



## sutty86

Can you mirror the attachment?


----------



## deadsmiley

I just got my second GTX 690 a couple of days ago. Quad SLI is interesting. It does very well in benchmarks but is spotty in games. WoW sees some performance increase; MechWarrior Online is crushed by 2x 690s.


----------



## ronnyjr

Hello guys, I'm new to the forum and was wondering about fan speeds.
My 690 does not go below 2880 RPM no matter what. If I try to change it manually using SpeedFan, MSI Afterburner, EVGA Precision X or whatever, the percentage changes accordingly, but the RPM stays constant at 2880.

I have been in touch with the previous owners (I bought it used), and both of them never thought about the noise; they actually assumed it was supposed to be that way. It really sounds like you're just outside an airport..

I've also spoken to Gigabyte, and the guy who finally responded to my email said that there is no way to control the fan speed on these cards.
He doesn't know what he's talking about, right? I've seen screenshots of the NVIDIA control panel, and that clearly shows a fan speed of 30% and RPM at 1140.

Oh yes, the guy who originally bought the card is "against having online accounts" (would you imagine!?), so he is having a hard time finding the receipt for the 690, which he bought online... *sigh*

Any thoughts?


----------



## 86JR

People with problems with quad SLI: consider that your PSU may not be up to the job. These cards are power-hungry... I specced an 850W Corsair for just one of them.


----------



## deadsmiley

With an E5-1650 @ 4.2GHz and quad SLI I saw around 400W at the wall with my Kill-A-Watt meter. I am using an EVGA G2 1000W PSU, so I am probably good.
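For anyone else sizing a PSU for this, here's a rough budget sketch. The 300W board power per 690, the 150W for an overclocked hex-core, and the 75W for drives/fans/board are ballpark assumptions, not measurements:

```python
def psu_budget_w(gpu_tdp_w, num_gpus, cpu_w, rest_w=75, headroom=1.25):
    # Worst-case DC draw with everything loaded, plus a sizing margin
    # for transients, capacitor aging, and staying off the PSU's limit.
    load = gpu_tdp_w * num_gpus + cpu_w + rest_w
    return load, load * headroom

load, recommended = psu_budget_w(300, 2, 150)
print(load, recommended)  # 825 1031.25
```

Game loads sit well below that worst case (hence the ~400W at the wall), but synthetic loads and spikes are what the headroom is for, so a quality 1000W unit is in the right ballpark for two 690s.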


----------



## Cogent

May I ask if anyone has the latest BIOS for both GPUs and the PLX chip?

I have compatibility issues with an ASUS Z9PE-D8 motherboard and need a fix for it.

Thank you.


----------



## dautopor

Hello there, 690 fans. I have a question: GPU2's power % stays above 350 all the time (see attachment). Is this normal?
System: Dell Precision T5400, 2x X5460, 16GB RAM

690.jpg 489k .jpg file


----------



## p.e.Haugen

*My computer only detects one of two gpus on my GTX 690*

Hi,

To clarify, I have two computers and both of them have one GTX 690 installed. For simplicity I name them Computer 1 and Computer 2. Computer 1 runs perfectly, while Computer 2 has what I suspect to be a faulty GTX 690 installed.

Some info on Computer 1:
- Here I get the option to enable and disable multi gpu mode in the Nvidia control panel.
- GPU-Z identifies both GPUs of the GTX 690 and says that 2 GPUs are enabled. I also get the option of saving two separate logs here, one for each GPU.
- MSI Afterburner also detects two separate gpus on this computer.

Some info on Computer 2:
- Here I do not get the option to enable/disable multi gpu mode in the Nvidia control panel. 
- GPU-Z only identifies one GPU on the GTX 690 and says that SLI/multi-GPU mode is disabled. I do not get the option of saving two logs on this computer.
- MSI Afterburner does not detect two separate gpus on this computer.

If I take the faulty GTX 690 out of computer 2 and insert it in computer 1 I get the same problems that are listed under "Some info on computer 2", and if I take the fully working GTX 690 out of computer 1 and insert it in computer 2 it works like a charm, so the problem clearly lies with the faulty GTX 690 and not one of the computers.

Is there anything I can do to make the computers detect the second gpu on my faulty GTX 690? How can the card even work with only one working GPU?


----------



## 86JR

Has anyone had any luck bios flashing their 690 to unlock min fan speed to 0% so when idling or browsing the internet the fans are not spinning?

People have done it on 680's etc.

ty


----------



## 86JR

Well, time to leave the owners club. Technically I still have the card, but it is useless now due to noise and power usage... I upgraded to an MSI Gaming X 1060, something I should have done A LONG time ago. Apart from an interesting, but quite calm and soothing, combo of fan vibration and coil whine, the card is silent, even doing 24/7 Folding@Home while I learn how to mine and game on the side. More than double the PPD in Folding@Home compared to the 690, yet only 90W of power... and it's not hitting that 2GB VRAM limit in Arma 3. Seriously guys, make the jump ASAP, especially if you can get the card cheap like I did!


----------



## Rebellion88

I was looking at a 690 to replace my old 6970, are they really that loud?


----------



## GorbazTheDragon

Late join: I picked one up second hand (100 quid) as a placeholder sidegrade from my 670, and to keep as a sort of museum piece.

I've encountered a problem though. I pulled the heatsink to give it a thorough clean, only to discover the VRMs use thermal paste rather than pads as the interface to the backplate. I am abroad at my parents' house, and in Chile, for some reason, it is impossible to buy pads. I tested using some AS Ceramique 2 on the PLX and one of the power stages, but it didn't really manage to fill the gap in any reasonable amount.

Anyone have voodoo magic tricks for getting this to work? Or should I just wait until I'm back in Europe to get some normal pads?


----------



## Richard Hillman

Buzzkill said:


> EVGA UEFI Bios for 690GTX (.exe installer)
> 
> 2690UEFI.zip 950k .zip file
> 
> Please keep in mind that once the card is flashed it will only work on a fully supported UEFI motherboard. If you have any further questions please contact Tech Support 24/7 888-881-3842
> 
> Regards
> EVGA


Is this still the most recent, up-to-date BIOS for the EVGA GTX 690? I just scored one, and the boot screen on the DP port is red until I get into Windows; then all is fine. I'm guessing this is what I need for UEFI/boot-screen access, but I want to triple-check before I just flash my GPU. It's a placeholder until I can score something near MSRP... so it might be a while lol. Thx!


----------



## Richard Hillman

BTW, looks like I'm the newest member of this club lol


----------

